Contents

Here are my other articles on performance optimization; comments and critiques are welcome:

  • Summary of technical difficulties encountered in front-end work and their solutions
  • Performance testing tool research
  • React Performance Optimization
  • ReactNative performance optimization
  • The first screen loading speed of Vue is slow
  • DOM operation caused by page lag problem and solution
  • Webpack performance optimization

Preface

Webpack is a mainstream, powerful module bundler. If you do not pay attention to performance when using it, problems are likely to arise. They fall mainly into three areas: slow build speed during development, repetitive work while developing and debugging, and low-quality output files. Performance optimization is therefore analyzed from these aspects.

Optimize build speed

On startup, Webpack recursively parses dependent files based on the Entry configuration. This work consists of two parts: searching for files, and analyzing and converting the matched files. You can therefore optimize the configuration from these two angles.

Narrow the file search scope

Search process optimization methods include:

  1. The resolve field tells Webpack how to search for files, so configure the resolve field first:
  • Set resolve.modules: [path.resolve(__dirname, 'node_modules')] to avoid layer-by-layer lookup.

resolve.modules tells Webpack which directories to search for third-party modules. The default is ['node_modules'], which searches ./node_modules, ../node_modules, ../../node_modules, and so on upward.

  • Set resolve.mainFields: ['main']. Use as few values as possible to reduce the steps needed to find a package's entry file

mainFields defines which entry field of a third-party module to use. Since most third-party modules use the main field to describe the location of their entry file, setting the single value 'main' reduces the search.

  • Set resolve.alias for large third-party libraries so that Webpack uses the library's pre-built min file directly, avoiding resolution inside the library

Taking React as an example:

```js
resolve: {
    alias: {
        'react': path.resolve(__dirname, './node_modules/react/dist/react.min.js')
    }
}
```

This will affect Tree Shaking. It is best suited to fairly monolithic libraries; for libraries made of scattered pieces, such as Lodash, avoid this approach.

  • Configure resolve.extensions to reduce file lookups

The default is extensions: ['.js', '.json'].

When an import statement does not have a file suffix, Webpack does a file lookup based on the extensions list, so:

A. Keep the list value as small as possible

B. Put the most frequently used file suffix first

C. Write file suffixes in import statements in the source code wherever possible, e.g. require('./data') should be written as require('./data.json')
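The three resolve optimizations above can be combined in one sketch (the paths are illustrative):

```javascript
const path = require('path');

module.exports = {
    // ...
    resolve: {
        // search third-party modules only in the project's own node_modules
        modules: [path.resolve(__dirname, 'node_modules')],
        // use only the main field when locating a package's entry file
        mainFields: ['main'],
        // keep the suffix list short, most frequently used suffix first
        extensions: ['.js', '.json']
    }
};
```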

  1. The module.noParse field tells Webpack which files need not be parsed. It can be used to exclude non-modular library files such as jQuery and ChartJS, and react.min.js if resolve.alias is configured, because react.min.js has already been built to run directly in a browser and is not a modular file.

The noParse value can be a RegExp, [RegExp], or function.

```js
module: {
    noParse: [/jquery|chartjs/, /react\.min\.js$/]
}
```
  1. When configuring the Loader, use test, exclude, and include to narrow the search scope
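A minimal sketch of point 3, narrowing a loader's scope with test, include, and exclude (the src directory is an assumed project layout):

```javascript
const path = require('path');

module.exports = {
    // ...
    module: {
        rules: [{
            test: /\.js$/,                           // match only .js files
            include: path.resolve(__dirname, 'src'), // search only project sources,
                                                     // implicitly excluding node_modules
            use: ['babel-loader?cacheDirectory']
        }]
    }
};
```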

Use the DllPlugin to reduce compilation times for the base module

DllPlugin is the dynamic link library plugin. The idea is to extract the basic modules a page depends on and package them into a DLL file. When a module to be imported already exists in a DLL, it is no longer compiled but retrieved from the DLL. Why does this speed up builds? Because DLLs mostly contain commonly used third-party modules such as react and react-dom; as long as these are not upgraded, they only need to be compiled once. This has a similar effect to configuring resolve.alias and module.noParse.

Usage:

Use DllPlugin in a webpack_dll.config.js to build the DLL files:

```js
// webpack_dll.config.js
const path = require('path');
const DllPlugin = require('webpack/lib/DllPlugin');
module.exports = {
    entry: {
        react: ['react', 'react-dom'],
        polyfill: ['core-js/fn/promise', 'whatwg-fetch']
    },
    output: {
        filename: '[name].dll.js',
        path: path.resolve(__dirname, 'dist'),
        library: '_dll_[name]' // global variable name the DLL is exposed under
    },
    plugins: [
        new DllPlugin({
            name: '_dll_[name]', // must match output.library
            path: path.join(__dirname, 'dist', '[name].manifest.json') // where the manifest describing the DLL is emitted
        })
    ]
};
```

The name value of the DllPlugin must be the same as the output.library value, since the generated manifest file references the output.library value.

The resulting files:

```
├── polyfill.dll.js
├── polyfill.manifest.json
├── react.dll.js
└── react.manifest.json
```

xx.dll.js contains the n packaged modules, stored in an array with the array index as the module ID. These modules are exposed globally through a variable named _dll_xx and can be accessed via window._dll_xx. The xx.manifest.json file describes which modules the DLL file contains, along with each module's path and ID. The manifest is then imported in the project's main config file via the DllReferencePlugin.

Use the DllReferencePlugin in the main config file to import xx.manifest.json:

```js
// webpack.config.js
const path = require('path');
const DllReferencePlugin = require('webpack/lib/DllReferencePlugin');
module.exports = {
    entry: {
        main: './main.js'
    },
    // ...
    plugins: [
        new DllReferencePlugin({
            manifest: require('./dist/react.manifest.json')
        }),
        new DllReferencePlugin({
            manifest: require('./dist/polyfill.manifest.json')
        })
    ]
};
```

The final build generates main.js

Use HappyPack to enable multi-process Loader conversion

In the whole build process, the most time-consuming step is the Loader's file transformations, and Webpack runs on Node.js with a single-threaded model, so it can only process one file at a time rather than several in parallel. HappyPack splits the work across child processes and sends the results back to the main process. Since JS is single-threaded, multi-process execution is the way to get this speedup.

HappyPack uses the following:

npm i -D happypack

```js
// webpack.config.js
const path = require('path');
const HappyPack = require('happypack');

module.exports = {
    // ...
    module: {
        rules: [{
            test: /\.js$/,
            use: ['happypack/loader?id=babel'],
            exclude: path.resolve(__dirname, 'node_modules')
        }, {
            test: /\.css$/,
            use: ['happypack/loader?id=css']
        }]
    },
    plugins: [
        new HappyPack({
            id: 'babel',
            loaders: ['babel-loader?cacheDirectory']
        }),
        new HappyPack({
            id: 'css',
            loaders: ['css-loader']
        })
    ]
};
```

Besides id and loaders, HappyPack also supports three more parameters: threads, verbose, and threadPool, where threadPool represents a shared process pool.

Enable multi-process compressed JS files using ParallelUglifyPlugin

When the UglifyJS plugin compresses JS code, it must first parse the code into an AST (abstract syntax tree) represented by objects, then apply various rules to analyze and transform the AST, so the process is computation-heavy and time-consuming. ParallelUglifyPlugin starts multiple child processes, each compressing code with UglifyJS in parallel, which significantly reduces compression time.

It is also very simple to use: replace the original UglifyJS plugin with ParallelUglifyPlugin, as follows:

npm i -D webpack-parallel-uglify-plugin

```js
// webpack.config.js
const ParallelUglifyPlugin = require('webpack-parallel-uglify-plugin');
// ...
plugins: [
    new ParallelUglifyPlugin({
        uglifyJS: {
            // the same options the UglifyJS plugin accepts go here
        }
    })
]
```

Optimize the development experience

After you modify the source code during development, you need to automatically build and refresh the browser to see the effect. This process can be automated using Webpack, which listens for changes to files, and DevServer, which refreshes the browser.

Use auto refresh

Webpack listens for files

Webpack can enable listening in two ways:

  1. Start Webpack with the --watch argument;
  2. Set watch: true in the configuration file. The following options are also available; configuring watchOptions appropriately optimizes the watching experience.
```js
module.exports = {
    watch: true,
    watchOptions: {
        ignored: /node_modules/,
        aggregateTimeout: 300, // wait this many ms after a change before rebuilding; the larger the better
        poll: 1000 // checks per second when polling; the smaller the better
    }
}
```

ignored: sets directories that are not watched. Ignoring node_modules can significantly reduce Webpack's memory consumption.

aggregateTimeout: how long to wait after a file changes before starting a build. The larger, the better.

poll: whether files have changed is determined by periodically polling the system; poll is the number of checks per second. The smaller, the better.

DevServer refreshes the browser

DevServer refreshes the browser in two ways:

  • Inject proxy client code into the web page and initiate a refresh through the client
  • Load an iframe into the web page and refresh the iframe to achieve the refresh effect

By default, with devServer: { inline: true }, the page is refreshed the first way. In inline mode, DevServer does not know which chunks the page depends on, so it injects client code into every chunk, which slows the build when many chunks are emitted. Since a page needs only one client, disabling inline mode reduces build time, and the improvement grows with the number of chunks. To disable it:

  • Start with webpack-dev-server --inline false
  • Configure devServer: { inline: false }

After disabling inline, the entry URL becomes http://localhost:8080/webpack-dev-server/

In addition, devServer.compress sets whether to use gzip compression; the default is false.

Enable the module hot replacement HMR

Module hot replacement does not refresh the whole page; it recompiles only the changed module and swaps the new module in for the old one. The preview therefore reacts faster and waits are shorter, and since the page is not refreshed, its current state is preserved. The principle is to inject a proxy client into every chunk to connect DevServer and the page. Ways to enable it:

  • webpack-dev-server --hot
  • Use the HotModuleReplacementPlugin, which is more cumbersome

Once enabled, modifying a submodule triggers a partial refresh, but modifying the root JS file refreshes the whole page. The reason: when a submodule is updated, the update event bubbles up layer by layer until a file at some layer accepts the changed module, and its callback runs. If the event bubbles all the way out without any file accepting it, the whole page is refreshed.
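The bubbling can be stopped explicitly with module.hot.accept; a minimal sketch, assuming a child module ./show like the one used later in this article:

```javascript
// main.js: stop the update event from bubbling past this module
if (module.hot) {
    module.hot.accept('./show', () => {
        // runs whenever ./show is hot-updated
        const show = require('./show');
        show('Webpack'); // re-render with the updated module
    });
}
```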

Using the NamedModulesPlugin makes the console print the name of the replaced module instead of its numeric ID. Also, as with Webpack's file watching, ignore files under node_modules to improve performance.

Optimize output quality – compress file volume

Environment differentiation – Reduce the production environment code size

Code runs in one of two environments, development and production, and needs to behave differently depending on which one. Many third-party libraries also contain a large amount of if/else code branching on the environment. Distinguishing environments reduces the size of the production output. In Webpack, the DefinePlugin defines the environment a configuration file applies to:

```js
const DefinePlugin = require('webpack/lib/DefinePlugin');
// ...
plugins: [
    new DefinePlugin({
        'process.env': {
            NODE_ENV: JSON.stringify('production')
        }
    })
]
```

Note that JSON.stringify('production') is needed because the environment variable's value must be a double-quoted string, and the stringified value is '"production"'.
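The effect can be checked in plain Node.js; a minimal illustration, independent of Webpack:

```javascript
// DefinePlugin does a textual find-and-replace, so the replacement value must
// itself contain the quote characters; JSON.stringify supplies them:
const value = JSON.stringify('production');
console.log(value); // "production" (a 12-character string including both quotes)
```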

You can then use the defined environment in the source code:

```js
if (process.env.NODE_ENV === 'production') {
    console.log('You are in the production environment');
    doSth();
} else {
    console.log('You are in the development environment');
    doSthElse();
}
```

When process is used in code, Webpack automatically bundles in the code of the process module, which emulates Node.js's process so that statements like process.env.NODE_ENV === 'production' work in non-Node.js runtimes.

Compress code: JS, ES6, CSS

Compress JS: Webpack's built-in UglifyJS plugin, or ParallelUglifyPlugin

It can analyze the JS syntax tree and understand the code's intent, enabling optimizations such as eliminating dead code, removing logging statements, and shortening variable names. Common configuration options:

```js
const UglifyJSPlugin = require('webpack/lib/optimize/UglifyJsPlugin');
// ...
plugins: [
    new UglifyJSPlugin({
        compress: {
            warnings: false,      // drop unreachable code without warnings
            drop_console: true,   // remove all console statements, also helps with IE
            collapse_vars: true,  // inline variables that are defined but used only once
            reduce_vars: true     // extract static values that are used multiple times into variables
        },
        output: {
            beautify: false, // most compact output, no spaces or tabs
            comments: false  // remove all comments
        }
    })
]
```

Starting Webpack with webpack --optimize-minimize injects the default UglifyJSPlugin configuration.

Compress ES6: the third-party uglify-webpack-plugin

As more and more browsers support running ES6 code directly, you should ship native ES6 where possible: there is less code and it performs better than transpiled ES5. When running ES6 directly you still need to compress it; the third-party uglify-webpack-plugin provides ES6 compression:

npm i -D uglify-webpack-plugin@beta

```js
// webpack.config.js
const UglifyESPlugin = require('uglify-webpack-plugin');
// ...
plugins: [
    new UglifyESPlugin({
        uglifyOptions: { // one more level of nesting than UglifyJS
            compress: {
                warnings: false,
                drop_console: true,
                collapse_vars: true,
                reduce_vars: true
            },
            output: {
                beautify: false,
                comments: false
            }
        }
    })
]
```

Also, to prevent babel-loader from transpiling the ES6 code back to ES5, remove babel-preset-env from .babelrc, since babel-preset-env is what converts ES6 to ES5.

Compress CSS: css-loader?minimize, PurifyCSSPlugin

cssnano is based on PostCSS. It not only removes whitespace but also understands the code's meaning, for example turning color:#ff0000 into color:red. css-loader has cssnano built in, and adding css-loader?minimize enables cssnano compression.
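A minimal sketch of enabling it in a rule (style-loader here is an assumption for a typical non-extracted setup):

```javascript
module.exports = {
    // ...
    module: {
        rules: [{
            test: /\.css$/,
            // the minimize query switches on css-loader's built-in cssnano
            use: ['style-loader', 'css-loader?minimize']
        }]
    }
};
```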

Another way to compress CSS is the PurifyCSSPlugin, which must be used together with extract-text-webpack-plugin. Its main function is to remove unused CSS, like Tree Shaking for JS.

Remove JS dead code using Tree Shaking

Tree Shaking can weed out dead code that is never used. It relies on ES6's static import and export module syntax; it first appeared in Rollup and was introduced in Webpack 2.0. It suits scattered utility files such as lodash and utils.js. It only works when code follows ES6 module syntax, which is static: the paths in import and export statements must be literal strings, and the statements cannot appear inside other code blocks. With ES5-style modules, e.g. module.exports = {...}, require(x + y), or if (x) { require('./util') }, Webpack cannot determine which code can be removed.

Enable Tree Shaking:

  • Modify .babelrc to preserve ES6 module statements:

```js
{
    "presets": [
        ["env", { "modules": false }]
    ]
}
```

Here "modules": false turns off Babel's module conversion and preserves ES6 module syntax.
  • Starting Webpack with --display-used-exports prints hints in the shell about which code can be culled

  • Use UglifyJSPlugin, or start Webpack with --optimize-minimize

  • When using third-party libraries, configure resolve.mainFields: ['jsnext:main', 'main'] so the ES6 module entry is used when resolving third-party code

Optimize output quality – speed up network requests

Use CDN to speed up static resource loading

Principle of CDN acceleration

By deploying resources in data centers around the world, a CDN lets users fetch resources from a nearby server, speeding up access. To use a CDN, upload the page's static resources to the CDN service and reference them via the URLs the CDN provides.

The CDN enables long-term caching of resources. For example, once a user has fetched index.html from the CDN, they will keep using the cached version until it expires, even if index.html is later replaced. Industry practice:

HTML files: store them on your own server with caching disabled, and do not put them on the CDN.

Static resources such as JS, CSS, and images: put them on the CDN with caching enabled, and append a hash computed from the file's content to the file name. When the content changes, the hash and therefore the file name change, so the new file is downloaded regardless of cache lifetime.

In addition, under HTTP/1.x browsers limit the number of concurrent requests to the same domain to around 4 to 8, so static resources can be spread across different CDN domains, for example JS files under js.cdn.com and CSS files under css.cdn.com. This introduces a new problem: extra domain name resolution time, which can be reduced with dns-prefetch. URLs like //xx.com omit the protocol; the advantage is that the browser automatically chooses HTTP or HTTPS based on the current page's scheme.

In summary, the build needs to satisfy the following points:

  • The STATIC resource import URL is changed to the absolute path to the CDN service
  • The file name of a static resource must have a Hash value calculated based on the content
  • Final configuration of CDN for different types of resources in different domain names:
```js
const ExtractTextPlugin = require('extract-text-webpack-plugin');
const {WebPlugin} = require('web-webpack-plugin');
// ...
output: {
    filename: '[name]_[chunkhash:8].js',
    path: path.resolve(__dirname, 'dist'),
    publicPath: '//js.cdn.com/id/' // CDN address for JS files
},
module: {
    rules: [{
        test: /\.css$/,
        use: ExtractTextPlugin.extract({
            use: ['css-loader?minimize'],
            publicPath: '//img.cdn.com/id/' // CDN address for images referenced from CSS files
        })
    }, {
        test: /\.png$/,
        use: ['file-loader?name=[name]_[hash:8].[ext]']
    }]
},
plugins: [
    new WebPlugin({
        template: './template.html',
        filename: 'index.html',
        stylePublicPath: '//css.cdn.com/id/' // CDN address for CSS files
    }),
    new ExtractTextPlugin({
        filename: '[name]_[contenthash:8].css' // add a content hash to emitted CSS files
    })
]
```

Multi-page applications extract common code between pages to take advantage of caching

Principle: large sites usually consist of multiple pages, each an independent single-page application, and the pages depend on the same style files, technology stack, and so on. If these common files are not extracted, every per-page chunk contains the common code, which means the same code is transferred n times. If the common code is extracted into one file, then when a user visits one page that file is loaded once, and on other pages that depend on it the browser cache serves it, so the common file is transferred only once.

Application methods

  1. Extract common code that multiple pages depend on into common.js, where common.js contains the code for the base library
```js
const CommonsChunkPlugin = require('webpack/lib/optimize/CommonsChunkPlugin');
// ...
plugins: [
    new CommonsChunkPlugin({
        chunks: ['a', 'b'], // which chunks to extract common code from
        name: 'common'      // the extracted common part forms a new chunk named common
    })
]
```
  2. Identify the base libraries the pages depend on and write a base.js file, then extract the base-library code in common.js into base.js; common.js drops the base-library code while base.js stays unchanged:
```js
// base.js
import 'react';
import 'react-dom';
import './base.css';
```

```js
// webpack.config.js
entry: {
    base: './base.js'
},
plugins: [
    new CommonsChunkPlugin({
        chunks: ['base', 'common'],
        name: 'base'
        // minChunks: 2 means a module must appear in at least 2 chunks before
        // it is extracted, in case common.js ends up with no code
    })
]
```
  3. You end up with the base-library code in base.js, common.js without the base libraries, and each page's own code in xx.js. Pages reference them in the order base.js --> common.js --> xx.js

Split up the code to load on demand

Principle

One of the problems of single-page applications is that a single page is used to host complex functions, and the file volume to be loaded is large. If not optimized, the first screen will take too long to load, affecting the user experience. Doing on-demand loading can solve this problem. Specific methods are as follows:

  1. Categorize site features into relevant categories
  2. Each category is merged into a Chunk and the corresponding Chunk is loaded on demand
  3. For example, put only the first screen related features in the Chunk where the execution portal is located, so that a small amount of code is loaded first and the rest of the code is loaded when it needs to be used. It is best to anticipate what the user will do next and load the corresponding code in advance so that the user does not feel the network load

Practice

The simplest example: the page first loads main.js and displays a button; when the button is clicked, the split-off show.js is loaded and the function in show.js is executed.

```js
// main.js
document.getElementById('btn').addEventListener('click', function () {
    import(/* webpackChunkName: "show" */ './show').then((show) => {
        show('Webpack');
    });
});
```

```js
// show.js
module.exports = function (content) {
    window.alert('Hello ' + content);
};
```

import(/* webpackChunkName: "show" */ './show').then() is the key to on-demand loading. Webpack has built-in support for the import() statement: it generates a separate chunk with ./show.js as the entry. When the code runs in the browser, show.js is only downloaded once the button is clicked; import() returns a Promise, and the loaded content is available in the then() method. This requires the browser to support the Promise API; for browsers that don't, inject a Promise polyfill.

/* webpackChunkName: "show" */ names the dynamically generated chunk, which defaults to [id].js. For the configured chunk name to be output correctly, Webpack also needs this configuration:

```js
// ...
output: {
    filename: '[name].js',
    chunkFilename: '[name].js' // file name template for dynamically generated chunks
}
```

The book also provides a more complex scenario for asynchronously loading components in a React Router.

Optimize output quality – improve code runtime efficiency

Use Prepack to evaluate ahead of time

How it works: Prepack is a partial evaluator that compiles code ahead of time. It executes the source in an initial phase, obtains the results, and emits them directly into the compiled output instead of evaluating the code at run time, which improves performance. However, Prepack is not yet mature enough for production use.
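What "pre-evaluation" means can be illustrated in plain JS; this is a conceptual sketch, not Prepack's actual API:

```javascript
// Source as the developer wrote it: work happens at run time
function renderGreeting(name) {
    return 'Hello ' + name + '!';
}
const runtimeResult = renderGreeting('Webpack');

// What a prepacked bundle would contain instead: the precomputed literal
const prepackedResult = 'Hello Webpack!';

console.log(runtimeResult === prepackedResult); // true: same output, no runtime work
```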

Usage

```js
const PrepackWebpackPlugin = require('prepack-webpack-plugin').default;
module.exports = {
    plugins: [
        new PrepackWebpackPlugin()
    ]
};
```

Use Scope Hoisting

The principle: Scope Hoisting ("scope promotion") is a feature introduced in Webpack 3. It analyzes the dependencies between modules and merges scattered modules into a single function wherever possible, without introducing code duplication, so only modules that are referenced exactly once can be merged. Because it must analyze inter-module dependencies, the source must use ES6 modules; otherwise Webpack falls back and skips Scope Hoisting.

Usage

```js
const ModuleConcatenationPlugin = require('webpack/lib/optimize/ModuleConcatenationPlugin');
// ...
plugins: [
    new ModuleConcatenationPlugin()
],
resolve: {
    mainFields: ['jsnext:main', 'browser', 'main']
}
```

Running webpack --display-optimization-bailout prints logs indicating which file caused the fallback.

Use the output analysis tool

Starting Webpack with these two parameters generates a JSON file that output analysis tools mostly rely on for analysis:

webpack --profile --json > stats.json, where --profile records build timings, --json outputs the build result as JSON, and > stats.json is a UNIX/Linux shell redirection that writes the output into the stats.json file.

  • Official tool Webpack Analyse

Open the tool's official website http://webpack.github.io/anal… and upload stats.json to obtain the analysis results.

  • webpack-bundle-analyzer

A visual analysis tool, more intuitive than Webpack Analyse. It is also easy to use:

1. Install it globally with npm i -g webpack-bundle-analyzer

2. Generate the stats.json file as described above

3. Run webpack-bundle-analyzer in the project root; the browser automatically opens the analysis page.

Other Tips

  • When configuring babel-loader, use: ['babel-loader?cacheDirectory']; cacheDirectory caches Babel's compilation results to speed up rebuilds. Also exclude the node_modules folder, since those files already use ES5 syntax and need no Babel conversion.
  • Configure externals to exclude code that does not need to be packaged because it has been introduced using script tags, and noParse to exclude code that does not use modular statements.
  • Configuring the performance parameter outputs the performance check configuration of the file.
  • Configure profile: true to capture performance information for Webpack builds to analyze what is causing the builds to perform poorly.
  • Configure cache: true, whether to enable caching to improve build speed.
  • You can use url-loader to convert small images into Base64 and embed them in JS or CSS to reduce loading times.
  • Compress images with imagemin-webpack-plugin and build sprite sheets with webpack-spritesmith.
  • Set devtool to cheap-module-eval-source-map in your development environment because this source map is the fastest to generate and speeds up builds. Set devtool to hidden-source-map in production
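The url-loader tip above can be sketched as follows; the 10 KB limit is an illustrative choice:

```javascript
module.exports = {
    // ...
    module: {
        rules: [{
            test: /\.(png|jpg|gif)$/,
            use: [{
                loader: 'url-loader',
                options: {
                    limit: 10240,           // files under 10 KB become base64 data URIs
                    fallback: 'file-loader' // larger files are emitted as normal files
                }
            }]
        }]
    }
};
```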