Development experience optimization

The goal of optimizing the development experience is to improve development efficiency, which comes down to the following points: when the project is large the build can take a very long time, and the time spent waiting for builds adds up quickly.

1. Optimize loader configuration

Because the file transformations performed by loaders are time-consuming, each loader should process as few files as possible. Use `include` so that a loader only matches the files it actually needs to handle. For example, in a project that uses ES6, babel-loader can be configured as follows:
```
const path = require('path');

module.exports = {
  module: {
    rules: [
      {
        // If the project only has .js files, do not write /\.jsx?$/; keep the test as narrow as possible
        test: /\.js$/,
        // babel-loader supports caching of transformed results; enable it with cacheDirectory
        use: ['babel-loader?cacheDirectory'],
        // Only apply babel-loader to files in the src directory under the project root
        include: path.resolve(__dirname, 'src'),
      },
    ]
  },
};
```
You can also adjust the project's directory structure so that `include` narrows the matched files even further when configuring loaders.
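If your source files are spread across several top-level directories, a minimal alternative sketch (the directory names here are assumptions, not part of the original example) is to blacklist `node_modules` with `exclude` instead of whitelisting `src`:

```
const path = require('path');

module.exports = {
  module: {
    rules: [
      {
        test: /\.js$/,
        use: ['babel-loader?cacheDirectory'],
        // Skip node_modules rather than whitelisting a single directory; whitelisting
        // with include is still the safer and usually faster option when your layout allows it
        exclude: path.resolve(__dirname, 'node_modules'),
      },
    ]
  },
};
```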

2. Optimize the resolve.modules configuration

`resolve.modules` configures the directories in which Webpack looks for third-party modules. The default is `['node_modules']`, which means Webpack first looks for the module in `./node_modules` of the current directory; if it is not found, it moves up to `../node_modules`, then `../../node_modules`, and so on. This is similar to Node.js's module resolution mechanism. If all installed third-party modules are stored in `./node_modules` at the project root, there is no need to search level by level in the default way; you can specify the absolute path where third-party modules live to reduce the search:
```
const path = require('path');

module.exports = {
  resolve: {
    // Use an absolute path so Webpack only looks for third-party modules in the project root's node_modules
    modules: [path.resolve(__dirname, 'node_modules')]
  },
};
```

3. Optimize the resolve.mainFields configuration

`resolve.mainFields` configures which entry file a third-party module uses. Every installed third-party module has a `package.json` file describing its properties, and some of its fields describe where the entry file is located; `resolve.mainFields` configures which of those fields to use as the entry description. Multiple entry description fields can exist because some modules are meant to run in several environments at once and need different code for each runtime. [isomorphic-fetch](https://github.com/matthew-andrews/isomorphic-fetch), for example, is an implementation of the [Fetch API](https://developer.mozilla.org/zh-CN/docs/Web/API/Fetch_API) that works in both the browser and the Node.js environment. It has two entry description fields in its `package.json`:
```
{
  "browser": "fetch-npm-browserify.js",
  "main": "fetch-npm-node.js"
}
```

isomorphic-fetch uses different code in different runtime environments because the underlying implementation differs: in the browser it relies on the native fetch or XMLHttpRequest, while in Node.js it relies on the http module.

The default value of `resolve.mainFields` depends on the current `target` configuration:

  • When `target` is `web` or `webworker`, the value is `["browser", "module", "main"]`.

  • In all other cases, the value is `["module", "main"]`.

For example, when `target` equals `web`, Webpack first uses the `browser` field of a third-party module to find its entry file; if that field does not exist, it falls back to the `module` field, and so on. To minimize the number of search steps, keep the list of entry description fields as short as possible. Since most third-party modules use the `main` field to describe the entry file location, Webpack can be configured as follows:
```
module.exports = {
  resolve: {
    // Only use the main field as the entry file description to reduce search steps
    mainFields: ['main'],
  },
};
```
When applying this optimization, you must account for the entry description fields of every third-party module the runtime depends on; if even one module resolves to the wrong entry, the code may not work properly.

4. Optimize the resolve.alias configuration

The `resolve.alias` configuration maps an original import path to a new one through an alias. For example, the React library installed in `node_modules` has the following directory structure:

```
├── dist
│   ├── react.js
│   └── react.min.js
├── lib
│   ... (dozens of files omitted)
│   ├── LinkedStateMixin.js
│   ├── createClass.js
│   └── React.js
├── package.json
└── react.js
```

As you can see, the React library contains two sets of code:

1) A set of modular code following the CommonJS specification. These files live in the `lib` directory, with the `react.js` entry file specified in `package.json` as the module's entry.

2) A second set that packs all React-related code into a single file, which can be executed directly without any modularization. `dist/react.js` is for the development environment and contains checks and warnings, while `dist/react.min.js` is for production and is minified.

By default Webpack starts from the entry file `./node_modules/react/react.js` and recursively parses and processes dozens of dependent files, which is time-consuming. `resolve.alias` lets Webpack use the single, complete `react.min.js` file when handling the React library, skipping the costly recursive parsing. The Webpack configuration is as follows:
```
const path = require('path');

module.exports = {
  resolve: {
    // Point imports of 'react' directly at the single pre-built, minified file
    alias: {
      'react': path.resolve(__dirname, './node_modules/react/dist/react.min.js'), // react 15
      // 'react': path.resolve(__dirname, './node_modules/react/umd/react.production.min.js'), // react 16
    }
  },
};
```
Besides React, most libraries published to the npm registry also include a bundled, complete file, and you can configure aliases for them in the same way.
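One related detail (not part of the original example, but documented alias behavior in Webpack): an alias key ending in `$` is treated as an exact match, which prevents deep imports such as `react/lib/...` from being rewritten by accident. A minimal sketch:

```
const path = require('path');

module.exports = {
  resolve: {
    alias: {
      // The trailing $ makes this an exact match: only require('react') is rewritten,
      // while deep imports like 'react/lib/React' are left untouched
      'react$': path.resolve(__dirname, './node_modules/react/dist/react.min.js'),
    }
  },
};
```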

5. Optimize the resolve.extensions configuration

When an import path has no file extension, Webpack automatically tries appending extensions to check whether the resulting file exists. The extensions to try are configured via `resolve.extensions`, whose default is `extensions: ['.js', '.json']`. When Webpack encounters an import statement such as `require('./data')`, it first looks for `data.js`; if that is not found, it looks for `data.json`, and if that still does not exist it reports an error. The longer the list, or the later the correct extension appears in it, the more attempts Webpack makes, so `resolve.extensions` also affects build performance. To get the best build performance, follow these rules when configuring `resolve.extensions`:

1) Keep the extension list as short as possible; do not include extensions that cannot occur in the project.

2) Put the most frequently used extensions first, so the search exits as early as possible.

3) When writing import statements in the source code, include the extension whenever possible to avoid the search entirely. For example, if you are certain, write `require('./data')` as `require('./data.json')`.

The Webpack configuration is as follows:
```
module.exports = {
  resolve: {
    // Keep the list of extensions to try as short as possible
    extensions: ['.js'],
  },
};
```

6. Optimize the module.noParse configuration

The `module.noParse` configuration lets Webpack skip recursive parsing of files that are not modularized, which improves build performance. As mentioned above in the `resolve.alias` optimization, the complete `react.min.js` file is not modularized, so we can tell Webpack to skip recursively parsing it by configuring `module.noParse`. The Webpack configuration is as follows:
```
module.exports = {
  module: {
    // The react.min.js file is not modularized, so skip recursive parsing of it
    noParse: [/react\.min\.js$/],
  },
};
```
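Note that files excluded from parsing must not contain calls to import, require, or define. As a small aside beyond the original example, in webpack 3 and later `noParse` can also be a function for more dynamic matching; a minimal sketch (the library names are just examples):

```
module.exports = {
  module: {
    // Return true for module paths that should not be parsed for require/import statements
    noParse: (content) => /jquery|lodash/.test(content),
  },
};
```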

These are the build performance optimizations related to narrowing the file search scope. After adapting them to your project's needs, your build speed should improve noticeably.

7. Use the HappyPack plugin

HappyPack is a Webpack plugin. When a project is built with Webpack there are many read and write operations, and because a project usually contains a large number of files, the build can become very slow. In addition, Webpack runs on Node.js with a single-threaded model, which means the tasks Webpack needs to handle are processed one after another rather than in parallel. To work around this, we can use HappyPack to exploit multi-core CPUs: HappyPack splits the work across multiple child processes that run concurrently, and the child processes then send their results back to the main process.

Here’s how to use HappyPack:

First, install the dependency via npm in the project root directory:

```
npm i -D happypack
```

Here’s how HappyPack is used

```
const path = require('path');
const ExtractTextPlugin = require('extract-text-webpack-plugin');
const HappyPack = require('happypack');

module.exports = {
  entry: {
    main: './main.js',
  },
  output: {
    // Merge all dependent modules into a single output file per entry
    filename: '[name].js',
    path: path.resolve(__dirname, './dist'),
  },
  module: {
    rules: [
      {
        test: /\.js$/,
        // Hand processing of .js files to the HappyPack instance whose id is 'babel'
        use: ['happypack/loader?id=babel'],
        // Exclude files under node_modules so Babel does not process them
        exclude: path.resolve(__dirname, 'node_modules'),
      },
      {
        test: /\.css$/,
        // Hand processing of .css files to the HappyPack instance whose id is 'css'
        use: ExtractTextPlugin.extract({
          use: ['happypack/loader?id=css'],
        }),
      },
    ]
  },
  plugins: [
    new HappyPack({
      // The unique id ties this HappyPack instance to the files it should handle
      id: 'babel',
      // How to handle .js files; same format as loaders in a normal loader configuration
      loaders: ['babel-loader?cacheDirectory'],
    }),
    new HappyPack({
      id: 'css',
      // How to handle .css files
      loaders: ['css-loader'],
    }),
    // Needed by ExtractTextPlugin.extract above to emit the extracted CSS
    new ExtractTextPlugin({
      filename: '[name].css',
    }),
  ],
  devtool: 'source-map',
};
```

Analyze the code above:

  • In the loader configuration, the processing of all files is handed over to happypack/loader, and the query string ?id=babel that follows it tells happypack/loader which HappyPack instance should handle the files.

  • In the plugins configuration, two HappyPack instances are added to tell happypack/loader how to handle .js and .css files respectively. The value of the id option must match the ?id=babel (or ?id=css) query string in the loader configuration, and the loaders option takes the same format as loaders in a normal loader configuration.

In addition to the id and loaders parameters, HappyPack also supports the following parameters (a configuration sketch follows the list):

  • threads: how many child processes are started to handle this type of file. The default is 3, and the value must be an integer.

  • verbose: whether HappyPack is allowed to output logs. The default is true.

  • threadPool: a shared process pool, which lets multiple HappyPack instances use child processes from the same pool to keep resource usage in check.
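To illustrate these extra options, here is a minimal sketch (the pool size of 5 is an arbitrary placeholder, not a recommendation) that shares one process pool between the two instances from the earlier example, using the `HappyPack.ThreadPool` helper the plugin exposes:

```
const HappyPack = require('happypack');
// One shared pool of child processes reused by every HappyPack instance below
const happyThreadPool = HappyPack.ThreadPool({ size: 5 });

module.exports = {
  plugins: [
    new HappyPack({
      id: 'babel',
      loaders: ['babel-loader?cacheDirectory'],
      // Draw child processes from the shared pool instead of spawning a separate set
      threadPool: happyThreadPool,
      // Silence HappyPack's own log output
      verbose: false,
    }),
    new HappyPack({
      id: 'css',
      loaders: ['css-loader'],
      threadPool: happyThreadPool,
    }),
  ],
};
```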

HappyPack principle

Probably the most time-consuming part of the entire Webpack build is the loaders' conversion of files, because there are so many files to convert and these conversions can only be handled one at a time. The core idea of HappyPack is to split this work across multiple processes running in parallel in order to reduce the total build time.

As the usage above shows, every file that needs to go through a loader is first handed to happypack/loader; once HappyPack has taken over the handling of these files, it can dispatch them centrally.

Each instance created with new HappyPack() tells the HappyPack core scheduler how to convert one class of files through a series of loaders, and can specify how child processes are assigned to those conversions.

The core scheduler's logic runs in the main process, that is, the process running Webpack. The scheduler assigns each task to a currently idle child process, and once a child process finishes it sends the result back to the scheduler; the data exchange between them uses the inter-process communication API.

The core scheduler notifies Webpack that the file has been processed when it receives a result from the child process.

8. Use ParallelUglifyPlugin

The most common JavaScript compression tool is UglifyJS, which is also built into Webpack.

Compressing JavaScript requires parsing the code into an abstract syntax tree (AST) represented as objects, and then applying various rules to analyze and transform that AST, which involves a huge amount of computation and is therefore time-consuming.

This is where ParallelUglifyPlugin helps. When Webpack has multiple JavaScript files to emit and compress, it normally uses UglifyJS to compress and emit them one by one. ParallelUglifyPlugin instead starts multiple child processes and distributes the compression of the files among them; each child process still uses UglifyJS to compress the code, but the work runs in parallel, so ParallelUglifyPlugin compresses multiple files much faster.

Let’s look at how to use ParallelUglifyPlugin:

First, install the dependency via npm in the project root directory:

```
npm i -D webpack-parallel-uglify-plugin
```

Then use ParallelUglifyPlugin in the Webpack configuration in place of the built-in UglifyJsPlugin:

```
const ParallelUglifyPlugin = require('webpack-parallel-uglify-plugin');

module.exports = {
  plugins: [
    // Compress the output JavaScript in parallel
    new ParallelUglifyPlugin({
      // Options passed straight through to UglifyJS
      uglifyJS: {
        output: {
          // Most compact output: no formatting whitespace
          beautify: false,
          // Remove all comments
          comments: false,
        },
        compress: {
          // Do not output warnings when UglifyJS removes unused code
          warnings: false,
          // Remove all console statements, which also helps IE compatibility
          drop_console: true,
          // Inline variables that are defined but referenced only once
          collapse_vars: true,
          // Extract static values that appear multiple times into variables
          reduce_vars: true,
        }
      },
    }),
  ],
};
```

When instantiating new ParallelUglifyPlugin(), the following parameters are supported (a sketch using several of them follows the list):

  • **test**: a regex matching which files ParallelUglifyPlugin should compress. The default is /\.js$/, i.e. all .js files are compressed.

  • **include**: regexes matching files that should be compressed by ParallelUglifyPlugin. The default is [].

  • **exclude**: regexes matching files that should not be compressed by ParallelUglifyPlugin. The default is [].

  • **cacheDir**: the directory path where the cache is stored. Caching is disabled by default; set a directory path to enable it.

  • **workerCount**: how many child processes to start for concurrent compression. The default is the number of CPU cores on the current machine minus 1.

  • **sourceMap**: whether to output a Source Map, which slows down compression.

  • **uglifyJS**: the configuration for compressing ES5 code; an Object passed straight through to UglifyJS.

  • **uglifyES**: the configuration for compressing ES6 code; an Object passed straight through to UglifyES.
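As a hedged sketch of how these options fit together (the cache directory and the exclude pattern below are placeholders, not values from the original example):

```
const os = require('os');
const path = require('path');
const ParallelUglifyPlugin = require('webpack-parallel-uglify-plugin');

module.exports = {
  plugins: [
    new ParallelUglifyPlugin({
      // Only compress .js output files (this is also the default)
      test: /\.js$/,
      // Skip files that are already minified
      exclude: [/\.min\.js$/],
      // Cache compression results to speed up repeated builds (placeholder path)
      cacheDir: path.resolve(__dirname, '.uglify-cache'),
      // Number of child processes; this mirrors the plugin's default of CPU cores minus 1
      workerCount: os.cpus().length - 1,
      sourceMap: false,
      uglifyJS: {
        output: { comments: false },
      },
    }),
  ],
};
```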

9. Enable hot module replacement

Hot module replacement makes responsive real-time previews possible without refreshing the entire page. The idea is that when we modify the code, only the changed module is recompiled, and the new module replaces the old one running in the browser. The advantages of hot module replacement include:

  • Live preview is faster and has a shorter wait time.

  • The browser is not refreshed, so the current running state of the page is preserved.

Hot module replacement can be enabled in two ways:

The first way is to inject a proxy client into the page that connects the page to DevServer. DevServer does not enable hot module replacement mode by default; to enable it, start it with the --hot flag, so the full command is webpack-dev-server --hot. We need to add a start script to package.json as follows:

"scripts": { "dev:inline": "webpack-dev-server", "dev:disable_inline": "webpack-dev-server --inline false", "start": "Webpack-dev-server --hot" // add line},Copy the code

We can then start DevServer with hot module replacement enabled by running npm run start.

The second way is to enable the plugin in the Webpack configuration:

```
const HotModuleReplacementPlugin = require('webpack/lib/HotModuleReplacementPlugin');

module.exports = {
  entry: {
    // Inject the proxy client and the HMR runtime alongside the application entry
    main: [
      'webpack-dev-server/client?http://localhost:8080/',
      'webpack/hot/dev-server',
      './src/main.js'
    ],
  },
  plugins: [
    // This plugin implements hot module replacement; starting DevServer with --hot
    // injects it automatically and generates the .hot-update.json files
    new HotModuleReplacementPlugin(),
  ],
  devServer: {
    // Tell DevServer to enable hot module replacement mode
    hot: true,
  }
};
```
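The application code also has to opt in to receiving updates. Here is a minimal sketch of that hook ('./render' and './App' are hypothetical modules, not part of the original example); without an accept handler somewhere up the module tree, Webpack falls back to a full page reload:

```
// main.js: application-side HMR hook
const render = require('./render');
let App = require('./App');

render(App);

if (module.hot) {
  // Accept hot updates for ./App and re-render with the freshly compiled module
  module.hot.accept('./App', () => {
    App = require('./App');
    render(App);
  });
}
```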

The methods above optimize the development experience along two dimensions: build speed and usability. In later articles I will look at optimizations for perceived performance (first-screen load time) and smoothness. Stay tuned…