Preface

Recently, I have been sorting out my webpack knowledge. On the one hand, what I have mastered is fragmentary and not systematic enough, and sometimes I don't know where to start when I run into problems. On the other hand, webpack 5.0 is on the way, which is honestly a bit frustrating.

This prompted me to systematically review all the knowledge points of webpack 4.0: the origins of webpack, the use of the various configuration options, performance optimization, webpack's underlying principles, and the configuration of related scaffolding. The rough table of contents is as follows:

I have put this series of articles in the repository Webpack learning documents; interested readers can take a look.

This article is also the review of the performance-optimization part of those learning documents.

The link to the example code used in this article is at the bottom; help yourself.

 

Why optimize

First, why should we optimize? Of course, if your project is small and builds quickly, you don’t really need to focus on performance.

But as a project grows to cover more pages and more functional and business code, webpack builds take longer and longer, and performance optimization becomes a concern.

Build time is closely tied to our daily development: if starting devServer or building locally takes too long, it greatly reduces our productivity.

Imagine suddenly encountering an urgent bug: it takes three or four minutes just to start the project, and another three or four minutes to build and release it after the fix. Isn't that maddening?

Let's take a look at how to optimize webpack's performance and speed up webpack builds.

 

Analysis tools

Before we optimize, we need quantitative indicators of what is affecting build time: whether a chunk file is too large, or a loader or plugin is taking too long, and so on.

There are tools we can use to analyze the size and speed of the project, and then optimize accordingly.

Size analysis

Basic analysis

The stats.json file can be used to analyze the packaging results; it can be generated quickly with the following command:

webpack --profile --json > stats.json

Then we use the stats.json analysis tool provided on the official webpack website. After uploading the stats.json file, we get an analysis result like the figure below:

This includes the webpack version, the build time and hash, the number of modules, the number of chunks, the emitted static assets, and the numbers of warnings and errors.

We can analyze the content it provides and locate the general problem.

Third-party tools

webpack-bundle-analyzer is a bundle analysis tool. Its interface is very intuitive: it shows the size and dependencies of each packaged file at a glance, which makes analyzing a project much easier.

Use it as follows:

// config/webpack.common.js
const { BundleAnalyzerPlugin } = require('webpack-bundle-analyzer');

const commonConfig = {
  // ...
  plugins: [
    new BundleAnalyzerPlugin({
      analyzerPort: 8889, // Specify a port number
      openAnalyzer: false,
    }),
  ],
  // ...
}

Under the hood, webpack-bundle-analyzer also relies on the stats.json file; the final analysis page is produced by analyzing stats.json.
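Incidentally, webpack-bundle-analyzer also ships a small CLI, so if you already have a stats.json you can open the analysis page without touching the webpack configuration (a minimal sketch; the stats file path and the dist output directory are assumptions based on the setup above):

npx webpack-bundle-analyzer stats.json dist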

Through these analysis tools, we can see which files take a long time and take up a lot of space, and then optimize those problem files.

 

Speed analysis

The speed-measure-webpack-plugin can help us analyze the total time spent on packaging, as well as the time spent in each loader and plugin, so it helps us quickly locate the configuration points that can be optimized.

As shown in the figure above, longer times are marked in red.

Usage

To use this plugin, create a plugin instance and wrap the webpack configuration with it. Let's modify the common webpack configuration file:

// config/webpack.common.js
const SpeedMeasurePlugin = require("speed-measure-webpack-plugin");
const smp = new SpeedMeasurePlugin();
// ...
module.exports = (production) => {
  if (production) {
    const endProdConfig = merge(commonConfig, prodConfig);
    return smp.wrap(endProdConfig);
  } else {
    const endDevConfig = merge(commonConfig, devConfig);
    return smp.wrap(endDevConfig);
  }
};

The configuration files in this article are divided into three parts: the development configuration, the production configuration, and the common configuration shared by both, as follows:

  • webpack.dev.js: Configuration file used by the development environment
  • webpack.prod.js: Configuration file used in the production environment
  • webpack.common.js: public configuration file

After performing the packaging, you can see the following effect:

Note: speed-measure-webpack-plugin has not kept up perfectly with webpack upgrades. It cannot yet coexist with custom plugins that mount on the hooks provided by html-webpack-plugin (add-asset-html-webpack-plugin, for example). Someone has already opened an issue on GitHub, but there seems to be no solution yet.

 

Optimization strategy

After the corresponding size and speed analysis, we can start optimizing.

Use newer versions

This is the panacea of webpack performance optimization: upgrading almost always brings a performance improvement, and a significant one.

Here’s a comparison:

From the figure above, we can see that webpack 4.0 builds much faster than webpack 3.0. The official statement is that after upgrading, build time can be reduced by 60% to 98%.

Each release contains many internal optimizations, and webpack also depends on the Node.js runtime, so upgrading the corresponding versions will certainly improve webpack's speed.

Probably after webpack 5.0 comes out, most of the performance optimization methods we talk about today will be built into webpack itself, requiring only a few simple configuration options.

At the same time, new versions of the package management tools (npm, Yarn) can resolve package dependencies and import them more quickly, which also improves packaging speed.

Optimizations in webpack 4.0

  • V8 engine optimizations (for of instead of forEach, Map and Set instead of Object, includes instead of indexOf)
  • The faster md4 hash algorithm is used by default
  • The webpack AST can be passed directly from the loader to webpack, reducing parsing time
  • Use string methods instead of regular expressions

You can see these performance optimizations in the release notes of the webpack repository on GitHub:

A V8 performance-tuning example:

To compare the speed of includes against indexOf, create a compare-include-indexof.js file, build an array with a length of 10,000,000, and record the time taken by each of the two functions:

const ARR_SIZE = 10000000;
const hugeArr = new Array(ARR_SIZE).fill(1);

// includes
const includesTest = () => {
  const arrCopy = [];
  console.time('includes');
  let i = 0;
  while (i < hugeArr.length) {
    arrCopy.includes(i++);
  }
  console.timeEnd('includes');
}

// indexOf
const indexOfTest = () => {
  const arrCopy = [];
  console.time('indexOf');
  for (let item of hugeArr) {
    arrCopy.indexOf(item);
  }
  console.timeEnd('indexOf');
}

includesTest();
indexOfTest();

Includes is much faster than indexOf:

  • includes: 12.224 ms
  • indexOf: 147.638 ms

So using the newest possible versions of webpack, Node, npm, and Yarn in our projects is the first step to speed up packaging.

 

Size optimization

Webpack is a packaging tool for projects, and a packaged project usually needs to be deployed to a server for users. For the sake of user experience, the project's bundle should be as small as possible, so bundle size is an important part of webpack optimization.

JS compression

webpack 4.0 compresses code by default in production, that is, when mode is set to production.

In fact, webpack 4.0 uses terser-webpack-plugin by default, whereas earlier versions used uglifyjs-webpack-plugin by default; the difference is that the latter does not handle ES6 compression well. We can also turn on the parallel option to compress in multiple processes and speed things up.

// config/webpack.common.js
const TerserPlugin = require('terser-webpack-plugin');
// ...
const commonConfig = {
  // ...
  optimization: {
    minimize: true,
    minimizer: [
      new TerserPlugin({
        parallel: 4, // How many processes to use for compression, os.cpus().length - 1 by default
      }),
    ],
  },
  // ...
}

CSS compression

Compress CSS

We can compress CSS with optimize-css-assets-webpack-plugin; its default compression engine is cssnano. Use it as follows:

// config/webpack.prod.js
const OptimizeCSSAssetsPlugin = require("optimize-css-assets-webpack-plugin");
// ...
const prodConfig = {
  // ...
  optimization: {
    minimizer: [
      new OptimizeCSSAssetsPlugin({
        assetNameRegExp: /\.optimize\.css$/g,
        cssProcessor: require('cssnano'),
        cssProcessorPluginOptions: {
          preset: ['default', { discardComments: { removeAll: true } }],
        },
        canPrint: true,
      }),
    ],
  },
}
Erase useless CSS

PurgeCSS is used to erase unused CSS. It needs to be used together with mini-css-extract-plugin.

// config/webpack.common.js
const glob = require('glob');
const PurgecssPlugin = require('purgecss-webpack-plugin');
// ...
const PATHS = {
  src: path.join(__dirname, './src')
};

const commonConfig = {
  // ...
  plugins: [
    // ...
    new PurgecssPlugin({
      paths: glob.sync(`${PATHS.src}/**/*`, { nodir: true }),
    }),
  ],
  // ...
}

For example, suppose we only use the navContact class and nothing else. Packaging without this plugin, we find that the unused CSS is still bundled:

After introducing the plugin and repackaging, we find that the unused CSS has been erased:

For more information, see the PurgeCSS documentation.

Image compression

Generally speaking, after packaging, some image files are much larger than the JS or CSS files, so optimizing images is one of the first things to do. We can compress images manually with online tools such as TinyPNG.

However, that is rather tedious. In a project we want this to be automated, and image-webpack-loader can do it for us; it is based on the imagemin Node library.

Add image-webpack-loader after file-loader:

// config/webpack.common.js
// ...
module: {
  rules: [
    {
      test: /\.(png|jpg|gif)$/,
      use: [
        {
          loader: 'file-loader',
          options: {
            name: '[name]_[hash].[ext]',
            outputPath: 'images/',
          },
        },
        {
          loader: 'image-webpack-loader',
          options: {
            // Compression settings for JPEG
            mozjpeg: { progressive: true, quality: 65 },
            // Use imagemin-optipng to compress PNG, enabled: false disables it
            optipng: { enabled: false },
            // Use imagemin-pngquant to compress PNG
            pngquant: { quality: '65-90', speed: 4 },
            // Compression settings for GIF
            gifsicle: { interlaced: false },
            // Enable webp to compress JPG and PNG into WebP format
            webp: { quality: 75 },
          },
        },
      ],
    },
  ],
}
// ...

Packaging without this loader, the image size is 2.1MB:

With image-webpack-loader, the image size is 666KB:

The effect of compression is obvious.

Split the code

Sometimes we write modules that are never used but get packaged anyway, which slows down the build and increases the size of the output, so we can use Tree-Shaking to remove such code.

SplitChunksPlugin can also be used to split a large file into several smaller ones, which effectively improves webpack's packaging speed. For a detailed introduction, see the splitChunks documentation; it explains how to configure splitChunks and what each parameter means, so it is not repeated here.
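As a reference point only, here is a minimal splitChunks sketch: it pulls everything imported from node_modules into a separate vendors chunk (the cache-group name and the size threshold are illustrative, not taken from the original project):

// config/webpack.common.js (sketch)
// ...
const commonConfig = {
  // ...
  optimization: {
    splitChunks: {
      chunks: 'all', // Apply splitting to both synchronous and asynchronous chunks
      minSize: 30000, // Only split modules larger than roughly 30 KB
      cacheGroups: {
        vendors: {
          test: /[\\/]node_modules[\\/]/, // Match modules imported from node_modules
          name: 'vendors',
          priority: -10,
        },
      },
    },
  },
  // ...
}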

 

Speed optimization

With size optimization covered, let's look at speed optimization.

Separate the two configurations

Generally speaking, in project development, we distinguish between development and production configurations, each with its own role.

In development: we need webpack-dev-server for fast development and HMR hot updates so the page can update modules without reloading; none of this is needed in production.

In production: we need code compression, directory cleaning, hashing, CSS extraction, and so on.

It's easy to implement; as mentioned earlier, we just create three webpack configuration files:

  • webpack.dev.js: Configuration file of the development environment
  • webpack.prod.js: Configuration file of the production environment
  • webpack.common.js: public configuration file

Then use webpack-merge to merge the common configuration webpack.common.js into the other two; see the source code for details.
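As a rough sketch of how the merge fits together (the production flag passed to the exported function mirrors the speed-measure example earlier; exact file paths depend on your project layout):

// config/webpack.common.js (sketch)
const merge = require('webpack-merge'); // In webpack-merge v5+ use: const { merge } = require('webpack-merge')
const devConfig = require('./webpack.dev');
const prodConfig = require('./webpack.prod');

const commonConfig = {
  // ... configuration shared by both environments
};

module.exports = (production) => {
  // Merge the shared configuration with the environment-specific one
  return production ? merge(commonConfig, prodConfig) : merge(commonConfig, devConfig);
};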

 

Reduce the search process

Configure webpack's resolve options properly; the resolve field tells webpack how to search for files.

Use resolve.extensions sensibly

If an import statement has no file extension, webpack automatically appends extensions and then checks whether the file exists, querying in the order given by resolve.extensions.

For example 🌰: if we configure resolve.extensions = ['.js', '.json'], webpack will look for xxx.js first, and if it is not found, look for xxx.json. So we should put the most commonly used extensions first, or simply include the extension when importing a module.

Although extensions lets webpack try the values in the array in order, we should not throw every possible extension in at once; that causes multiple file lookups and slows down packaging.

Optimize resolve.modules

This property tells webpack which directories to search when resolving a module; both absolute and relative paths can be used. Using an absolute path searches only the given directory, reducing the number of levels searched:

// config/webpack.common.js
// ...

const commonConfig = {
  // ...
  resolve: {
    extensions: ['.js', '.jsx'],
    mainFiles: ['index', 'list'],
    alias: {
      alias: path.resolve(__dirname, '../src/alias'),
    },
    modules: [
      path.resolve(__dirname, 'node_modules'), // Search node_modules in the given directory first
      'node_modules', // Place the default afterwards
    ],
  },
  // ...
}
// ...
Use resolve.alias to reduce the search process

alias creates an alias: it maps an original import path to a new import path.

For example, our project may contain some long relative-path imports, and we can use the alias configuration to reduce the search process.

For another example, we often use the React library; we can point directly to the pre-built react.min.js in its dist directory to skip the time-consuming module resolution.

// config/webpack.common.js
// ...
const commonConfig = {
  // ...
  resolve: {
    // ...
    alias: {
      react: path.resolve(__dirname, './node_modules/react/dist/react.min.js'),
      '@alias': path.resolve(__dirname, '../src/alias'),
    },
  },
  // ...
}
// ...

The author has not tried this in an actual project, but it is an idea worth trying when you get the chance.

 

Narrow the build scope

Exclude modules that webpack does not need to parse; in other words, apply each loader to as few modules as possible.

We can use include and exclude to specify which modules a loader applies to and which modules it skips.

We modify the common configuration file webpack.common.js:

// config/webpack.common.js
// ...
const commonConfig = {
  // ...
  module: {
    rules: [
      {
        test: /\.(js|jsx)$/,
        exclude: /node_modules/,
        include: path.resolve(__dirname, '../src'),
        use: ['babel-loader'],
      },
      // ...
    ],
  },
}
// ...

First, let's run npm run build without exclude and include:

Then we add these two parameters, which mean:

  • exclude: /node_modules/: do not apply the loader to files under node_modules
  • include: path.resolve(__dirname, '../src'): apply the loader only to files under src

Repackage, and the packing time becomes about 1400ms:

 

Use multiple processes to speed up builds

Since webpack runs on Node.js in a single-threaded model, the tasks webpack needs to handle are processed one at a time rather than all at once.

If webpack could handle several tasks at the same time, it would be able to exploit the power of a multi-core CPU.

HappyPack

How it works: every time webpack resolves a module, HappyPack hands it and its dependencies to worker threads; once processing is complete, the processed resources are returned to HappyPack's main process, speeding up packaging.

Using HappyPack with webpack 4.0 requires HappyPack version 5.0.

We introduce HappyPack in the common configuration file: replace the original loader with happypack/loader, and move the replaced loader into the loaders option of the HappyPack plugin. Here we replace babel-loader:

// config/webpack.common.js
const HappyPack = require('happypack');
// ...
const makePlugins = (configs) => {
  const plugins = [
    // ...
    new HappyPack({
      loaders: ['babel-loader'],
    }),
  ];
  // ...
  return plugins;
}
// ...

const commonConfig = {
  entry: {
    main: "./src/index.js",
    entry2: "./src/entry2.js",
    entry3: "./src/entry3.js",
    entry4: "./src/entry4.js",
    entry5: "./src/entry5.js",
    entry6: "./src/entry6.js",
  },
  // ...
  module: {
    rules: [
      {
        test: /\.jsx?$/,
        // exclude: /node_modules/,
        // include: path.resolve(__dirname, '../src'),
        use: [
          'happypack/loader',
          // 'babel-loader'
        ],
      },
    ],
  },
  // ...
}
// ...

To make the effect more obvious, we add a few more entry files to the project. Without HappyPack, one build takes about 8s:

After HappyPack is enabled, we can see from the console that it starts 3 processes by default, and the packaging time comes down to around 6.5 seconds:

Note: the author of HappyPack basically no longer maintains this plugin, as his interest in the project is waning; he recommends using webpack's official thread-loader instead.

More parameters can be found on the HappyPack website.

thread-loader

thread-loader is the multi-process solution officially released by webpack to replace HappyPack.

Similar to HappyPack, every time webpack resolves a module, thread-loader assigns it and its dependencies to workers, achieving multi-process packaging.

We comment out the HappyPack code and use thread-loader instead:

// config/webpack.common.js
// ...
const commonConfig = {
  // ...
  module: {
    rules: [
      {
        test: /\.jsx?$/,
        // exclude: /node_modules/,
        // include: path.resolve(__dirname, '../src'),
        use: [
          {
            loader: 'thread-loader',
            options: {
              workers: 3, // How many workers to start, os.cpus().length - 1 by default
            },
          },
          'babel-loader',
        ],
      },
    ],
  },
  // ...
}
// ...

Let’s run it again, again around 6.5 seconds:

Pre-compiled resource modules (DllPlugin)

When we package a project, the third-party modules generally do not change, so we can package them into a dedicated file during the first build only. On subsequent webpack builds we no longer need to pull those third-party modules out of node_modules; we simply reuse the files produced the first time.

webpack's DllPlugin solves exactly this problem: after the first build it generates a stable copy of those packages for other modules to reference, saving the time of compiling and packaging them again on subsequent development builds.

Add a configuration file

We create a new webpack.dll.js file under config, which packages our third-party packages into a dll folder:

// config/webpack.dll.js
const path = require('path');
const webpack = require('webpack');

module.exports = {
  mode: 'production', // Environment
  entry: {
    vendors: ['lodash'], // Package lodash into vendors.dll.js
    react: ['react', 'react-dom'], // Package react and react-dom into react.dll.js
  },
  output: {
    filename: '[name].dll.js', // The output name
    path: path.resolve(__dirname, '../dll'), // Output file directory
    library: '[name]', // Expose the packaged file as a global variable, accessible under this name in the browser
  },
  plugins: [
    // Analyze the generated library files, generate a mapping between the library files and the business files, and put the result in manifest.json
    new webpack.DllPlugin({
      name: '[name]', // Use the same name as the library output above
      path: path.resolve(__dirname, '../dll/[name].manifest.json'),
    }),
  ],
}
  • The library option above exports the dll file as a global variable for later reference, as shown below:
  • The manifest.json file is a mapping; its purpose is to help webpack use the ***.dll.js files we packaged earlier instead of going back to node_modules to look them up.

After packaging, lodash is bundled into vendors.dll.js, and react and react-dom are bundled into react.dll.js:

Then we need to modify the common configuration file webpack.common.js and include the generated dll files in the HTML. If we don't want to add the dll files to the HTML manually, we can use add-asset-html-webpack-plugin, which, as the name suggests, adds assets to the HTML.

At the same time, we need webpack's DllReferencePlugin to analyze the manifest.json mapping files.

// config/webpack.common.js
const webpack = require('webpack');
const AddAssetHtmlWebpackPlugin = require('add-asset-html-webpack-plugin');

// ...

const commonConfig = {
  // ...
  plugins: [
    // ...
    new AddAssetHtmlWebpackPlugin({
      filepath: path.resolve(__dirname, '../dll/vendors.dll.js'),
    }),
    new AddAssetHtmlWebpackPlugin({
      filepath: path.resolve(__dirname, '../dll/react.dll.js'),
    }),
    new webpack.DllReferencePlugin({
      manifest: require(path.resolve(__dirname, '../dll/vendors.manifest.json')),
    }),
    new webpack.DllReferencePlugin({
      manifest: require(path.resolve(__dirname, '../dll/react.manifest.json')),
    }),
  ],
  // ...
}
// ...

The code here can be optimized further; you can refer to the author's notes in the DLL optimization section.

We run a build and see that packaging takes about 1450ms; we can also see that the library files are packaged into vendor.chunk.js at 1.22MB.

After we comment out the DLL references and repackage, it takes about 1950ms, and vendor.chunk.js grows to 5.28MB.

Caching

We can enable caching for the corresponding loader or plugin to speed up second builds. Generally, we can do this through the following:

  • Enable caching in babel-loader
  • Enable caching in terser-webpack-plugin
  • Use cache-loader or hard-source-webpack-plugin

If caching is enabled in the project, a .cache directory appears under node_modules to store the cache files.
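cache-loader does not get its own subsection below, so here is a minimal sketch of the usual pattern: place it in front of loaders that are expensive to run (babel-loader in this example) so their output is cached on disk between builds.

// config/webpack.common.js (sketch)
// ...
module: {
  rules: [
    {
      test: /\.jsx?$/,
      use: [
        'cache-loader', // Caches the result of the loaders below in node_modules/.cache
        'babel-loader',
      ],
    },
  ],
}
// ...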

babel-loader

For babel-loader, just set cacheDirectory to true:

// config/webpack.common.js
// ...
module: {
  rules: [
    {
      test: /\.jsx?$/,
      // exclude: /node_modules/,
      // include: path.resolve(__dirname, '../src'),
      use: [
        {
          loader: 'babel-loader',
          options: {
            cacheDirectory: true,
          },
        },
      ],
    },
  ],
}
// ...

The first build took about 8.5s. When it finished, we can see that node_modules now has a .cache directory containing Babel's cache files:

We repack and find that the time has changed to about 6s:

TerserPlugin

We can enable caching by setting cache in the TerserPlugin to true:

// config/webpack.common.js
const TerserPlugin = require('terser-webpack-plugin');
// ...
const commonConfig = {
  // ...
  optimization: {
    minimize: true,
    minimizer: [
      new TerserPlugin({
        parallel: 4, // How many processes to use for compression, os.cpus().length - 1 by default
        cache: true,
      }),
    ],
  },
  // ...
}

The terser-webpack-plugin cache is generated in the .cache directory:

We repack and find that the time has changed to about 5s:

HardSourceWebpackPlugin

This plugin is used to provide an intermediate cache for modules.

Usage is as follows; we can add it directly to the plugins array:

// config/webpack.common.js
const HardSourceWebpackPlugin = require('hard-source-webpack-plugin');
// ...
const plugins = [
  // ...
  new HardSourceWebpackPlugin(),
];
// ...

On the first build, HardSourceWebpackPlugin generates a hard-source directory in the .cache directory; the first build takes about 6.6s:

If we repack, we’ll see that the time has changed to about 2.7s:

Use sourceMap wisely

As we mentioned earlier, when packaging generates a sourceMap, the more detailed the information, the slower the build.

So it is important to use the appropriate sourceMap for each environment during packaging.
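As a rough illustration (the specific devtool values are common choices, not requirements from the original article): a fast, rebuild-friendly source map in development, and a slower but more precise one, or none at all, in production.

// config/webpack.dev.js (sketch)
const devConfig = {
  mode: 'development',
  devtool: 'cheap-module-eval-source-map', // Fast rebuilds, line-level mapping that is good enough for debugging
  // ...
};

// config/webpack.prod.js (sketch)
const prodConfig = {
  mode: 'production',
  devtool: 'cheap-module-source-map', // Or false / 'source-map', depending on whether production source maps are needed
  // ...
};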

 

Other

In addition to the common methods we mentioned above, there are some other methods, such as:

  • Use ES6 Modules syntax so that Tree-Shaking can take effect

Tree-Shaking only works on the static imports of ES6 Modules; it does not work on dynamic imports such as CommonJS require.

  • Use polyfills sensibly

If we do nothing about the polyfills we introduce, webpack loads all of them, making the output file too large. It is recommended to use useBuiltIns: 'usage' of @babel/preset-env; this option introduces only the polyfills actually needed, based on browser compatibility. Alternatively, we can use a dynamic polyfill service that serves different polyfills according to the browser's User Agent; see polyfill.io for details.

  • Preload resources with webpackPrefetch

Use webpackPrefetch to mark module resources that may be needed in the future; after the core code has loaded, the required module code is fetched while bandwidth is idle. A small example appears after this list.

  • Merge icon-class images with CSS sprites

If there are many icon images, merge them into a sprite to reduce network requests, or use icon font files.

  • html-webpack-externals-plugin

This plugin can extract common packages and load them from a CDN instead of bundling them, reducing the bundle size and speeding up packaging.

  • Configure chunk hash values sensibly

In production packaging, it's important to hash the output files so the browser can cache them: when our code hasn't changed, the user can simply read the cached file. Generally, JavaScript files use [chunkhash], CSS files use [contenthash], and other resources (such as images and fonts) use [hash].
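For the webpackPrefetch item above, a minimal sketch (the element id and module path are hypothetical): the magic comment makes webpack emit a <link rel="prefetch"> for the chunk, so the browser downloads it during idle time and it is ready when the user needs it.

// Somewhere in the application code (hypothetical example)
document.getElementById('show-login').addEventListener('click', () => {
  // The magic comment tells webpack to prefetch this chunk during idle time
  import(/* webpackPrefetch: true */ './login-modal').then(({ default: showLoginModal }) => {
    showLoginModal();
  });
});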

I won't list more performance optimization methods one by one; there are too many of them, and we can work out optimization plans based on the actual problems we encounter.

 

Summary

This article introduced some optimization approaches for webpack packaging, covering both project size and build speed; you can practice them in your own projects.

Of course, I should also mention that if your project already builds quickly, you don't need to apply the methods in this article; doing so may even be counterproductive.

You can find the corresponding introductions to some of the content in this article in my Webpack learning documents.

Honestly, I want a thumbs up!

 

Links

  • Webpack learning documents
  • Webpack Optimization – Double your build efficiency
  • Unlock the Webpack series in depth
  • Webpack builds a summary of performance optimization strategies
  • Geek time with Webpack
  • Master Webpack performance optimization in 30 minutes
  • Customize the front-end development environment with WebPack
  • Webpack is the official packaging tool

 

The sample code

Sample code can be seen here:

  • Performance optimization sample code