Preface
Webpack build optimization has no fixed recipe. The usual optimizations are things like splitting bundles, chunking, and compression, but none of them applies to every project; for a specific project you need to keep measuring and tuning.
For webpack 4, the advice is to configure it from scratch and rely on the default configuration in the early stages of a project.
This article walks through the techniques that can speed up webpack's packaging, along with the caveats of applying them to a real project. If you have any questions, feel free to contact Mr. Bottle.
One, analyze the packaging speed
The first step in optimizing webpack build speed is knowing where to focus your effort. We can use speed-measure-webpack-plugin to measure how much time each phase of the webpack build takes:
// Analyze the packing time
const SpeedMeasurePlugin = require("speed-measure-webpack-plugin");
const smp = new SpeedMeasurePlugin();
// ...
module.exports = smp.wrap(prodWebpackConfig)
Every project has its own specific build bottlenecks, so let's go through each part of the packaging process and optimize it.
Two, what affects packaging speed
In an earlier article, "How it works: hand-write a JavaScript bundler," we explained that packaging is the process of starting from the entry file and bundling all dependent modules into a single file; along the way there are, of course, various compilation and optimization steps.
What are some of the common things that affect build speed during the packaging process?
1. To start packaging, webpack first has to find all the dependent modules
Searching for every dependency takes a certain amount of time, the search time, so:
The first thing we need to optimize is the search time.
2. Parse all dependent modules (turn them into code the browser can use)
Webpack parses each file with the loaders we configured. In daily development we rely on loaders to transform JS, CSS, images, fonts and other files, and the amount of data to transform is large. Because JS is single-threaded, these transformations cannot run concurrently; files have to be processed one by one.
The second thing we need to optimize is the parsing time.
3. Package all dependent modules into one file
All the parsed code is packaged into a single file, and webpack optimizes the output at this stage so that the browser loads a smaller bundle (shaving a little off the white-screen time).
JS compression is the last step of a production build, and for webpack it is usually the slowest. Compression has to parse the code into an AST, analyze and transform the AST according to complex rules, and finally turn the AST back into JS. This involves a lot of computation, so it is time-consuming and the build can easily seem to hang here.
The third thing we need to optimize is the compression time.
4. Repackaging after changes
When we change one small file in the project, everything is packaged again, and it takes about as long as the initial build even though most of the files, especially the third-party libraries, have not changed.
The fourth thing we need to optimize is the rebuild time.
Three, optimize parsing time: enable multi-process packaging
Webpack runs on Node.js and is single-threaded, which means it can only process files one at a time. When webpack has a large number of files to package, the build time gets quite long.
1. thread-loader (recommended for webpack 4)
Placing this loader in front of other loaders makes the loaders that come after it run in a separate worker pool. Each worker is a separate Node.js process, which has an overhead of roughly 600ms; there is also an overhead for inter-process communication.
Using thread-loader is very simple: put it before the other loaders, and the loaders that follow it will run in the worker pool.
For example:
module.exports = {
  // ...
  module: {
    rules: [
      {
        test: /\.js$/,
        exclude: /node_modules/,
        // Create a js worker pool
        use: [
          'thread-loader',
          'babel-loader'
        ]
      },
      {
        test: /\.s?css$/,
        exclude: /node_modules/,
        // Create a css worker pool
        use: [
          'style-loader',
          'thread-loader',
          {
            loader: 'css-loader',
            options: {
              modules: true,
              localIdentName: '[name]__[local]--[hash:base64:5]',
              importLoaders: 1
            }
          },
          'postcss-loader'
        ]
      }
      // ...
    ]
  }
  // ...
}
Note: thread-loader is placed after style-loader because the loaders that run behind thread-loader (inside the worker pool) cannot emit files or access the webpack options.
The docs note that each worker process costs about 600ms to start, so to avoid a high delay when workers spin up, thread-loader offers a worker-pool optimization: warmup.
// ...
const threadLoader = require('thread-loader');

const jsWorkerPool = {
  // options

  // The number of workers spawned, defaults to (number of CPU cores - 1),
  // or 1 if require('os').cpus() is undefined
  workers: 2,

  // Kill worker processes after they have been idle for this long
  // The default is 500ms
  // Set it to Infinity in watch mode (--watch) to keep workers alive between builds
  poolTimeout: 2000
};

const cssWorkerPool = {
  // The number of jobs a worker processes in parallel
  // The default is 20
  workerParallelJobs: 2,
  poolTimeout: 2000
};

threadLoader.warmup(jsWorkerPool, ['babel-loader']);
threadLoader.warmup(cssWorkerPool, ['css-loader', 'postcss-loader']);

module.exports = {
  // ...
  module: {
    rules: [
      {
        test: /\.js$/,
        exclude: /node_modules/,
        use: [
          {
            loader: 'thread-loader',
            options: jsWorkerPool
          },
          'babel-loader'
        ]
      },
      {
        test: /\.s?css$/,
        exclude: /node_modules/,
        use: [
          'style-loader',
          {
            loader: 'thread-loader',
            options: cssWorkerPool
          },
          {
            loader: 'css-loader',
            options: {
              modules: true,
              localIdentName: '[name]__[local]--[hash:base64:5]',
              importLoaders: 1
            }
          },
          'postcss-loader'
        ]
      }
      // ...
    ]
  }
  // ...
}
Note: use this only on time-consuming loaders.
2. HappyPack
Most of the time in a real webpack build is spent on loader parsing/transformation and on code compression. HappyPack packages files in multiple processes (by default, the number of CPU cores minus one), which lets webpack work on several tasks at the same time and make full use of a multi-core CPU.
HappyPack's approach is to take webpack's loader execution, which normally runs in a single process, and spread it across multiple processes, leaving the rest of the pipeline unchanged. There are some limitations: it is only compatible with the mainstream loaders; see the official compatibility list.
Note: Ahmad Amireh, HappyPack's author, recommends thread-loader instead and has announced that HappyPack will no longer be maintained, so it is not recommended.
const path = require('path');
const webpack = require('webpack');
const HappyPack = require('happypack'); // runs loaders in multiple processes
// Node's built-in os module
const os = require('os');
// Create a shared process pool sized to the number of CPU cores
const happyThreadPool = HappyPack.ThreadPool({ size: os.cpus().length });

const createHappyPlugin = (id, loaders) => new HappyPack({
  // id identifies which class of files this HappyPack instance handles
  id: id,
  // How to handle these files, in the same format as a loader configuration
  loaders: loaders,
  // Other options (optional)
  // threadPool points to a shared process pool: multiple HappyPack instances reuse
  // child processes from the same pool to avoid spawning too many processes
  threadPool: happyThreadPool,
  // Whether HappyPack outputs logs, defaults to true
  verbose: true
  // threads: the number of child processes used for this file type,
  // defaults to 3, must be an integer
});

const clientWebpackConfig = {
  // ...
  module: {
    rules: [
      {
        test: /\.(js|jsx)$/,
        // Hand .js/.jsx files over to the HappyPack instance with id "happy-babel"
        use: ['happypack/loader?id=happy-babel'],
        // Exclude files in node_modules:
        // they are already ES5, so there is no need to run them through Babel
        exclude: /node_modules/,
      }
    ],
  },
  // ...
  plugins: [
    createHappyPlugin('happy-babel', [{
      loader: 'babel-loader',
      options: {
        presets: ['@babel/preset-env', '@babel/preset-react'],
        plugins: [
          ['import', { libraryName: 'antd', style: true }],
          ['@babel/plugin-proposal-class-properties', { loose: true }]
        ],
        cacheDirectory: true,
        // Compress the cache to save disk space when build time matters less
        cacheCompression: true,
        compact: true,
      }
    }]),
    // ...
  ]
}
Note that when the project is small, multi-process packaging can actually make the packaging slower.
Four, use caching wisely (shortens subsequent builds, lengthens the initial build)
There are several ways to use webpack's caches, such as cache-loader, HardSourceWebpackPlugin, or babel-loader's cacheDirectory flag. All of these caching methods have startup overhead: they save a lot of time on re-runs, but the initial (cold) run is actually slower.
If your production build has to be a cold build every time, caching adds to the build time and slows you down. If not, it will significantly reduce the time of your second and later builds.
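Of the three approaches just mentioned, babel-loader's own cache is the simplest to turn on. A minimal sketch (the rule below is illustrative, not taken from this project's configuration):

const path = require('path');

module.exports = {
  module: {
    rules: [
      {
        test: /\.js$/,
        exclude: /node_modules/,
        use: [
          {
            loader: 'babel-loader',
            options: {
              // Cache transpiled output (by default under node_modules/.cache/babel-loader),
              // so unchanged files are not re-transpiled on the next build
              cacheDirectory: true
            }
          }
        ]
      }
    ]
  }
};

Because the cache lives under node_modules/.cache by default, deleting node_modules also clears it.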
1. cache-loader
cache-loader, like thread-loader, is easy to use: just add it in front of loaders that are expensive to run, and it will cache their results to disk, which significantly speeds up the second build.
const path = require('path');

module.exports = {
  module: {
    rules: [
      {
        test: /\.ext$/,
        // Put cache-loader in front of the expensive loaders
        // (`loaders` here stands for your own loader list)
        use: ['cache-loader', ...loaders],
        include: path.resolve('src'),
      },
    ],
  },
};
⚠️ Please note that there is some time overhead to save and read these cached files, so use this loader only for loaders with high performance overhead.
2. HardSourceWebpackPlugin
- The first build will take normal time
- The second build will be significantly faster (approximately 90% faster build).
const path = require('path');
const HardSourceWebpackPlugin = require('hard-source-webpack-plugin');

const clientWebpackConfig = {
  // ...
  plugins: [
    new HardSourceWebpackPlugin({
      // cacheDirectory is where the cache is written. By default it lives under
      // node_modules: 'node_modules/.cache/hard-source/[confighash]'
      cacheDirectory: path.join(__dirname, './lib/.cache/hard-source/[confighash]'),
      // configHash hashes the webpack configuration when a webpack instance starts,
      // so that cacheDirectory can keep separate caches for different configurations
      configHash: function (webpackConfig) {
        // node-object-hash on npm can be used to build this
        return require('node-object-hash')({ sort: false }).hash(webpackConfig);
      },
      // When loaders, plugins, other build-time scripts, or other dynamic dependencies
      // change, hard-source needs to replace the cache to make sure the output is
      // correct. environmentHash is used to detect this: if the hash differs from the
      // previous build, a fresh cache is used.
      environmentHash: {
        root: process.cwd(),
        directories: [],
        files: ['package-lock.json', 'yarn.lock'],
      },
      // An object. Controls the plugin's console output.
      info: {
        // 'none' or 'test'
        mode: 'none',
        // 'debug', 'log', 'info', 'warn', or 'error'
        level: 'debug',
      },
      // Clean up large, old caches automatically.
      cachePrune: {
        // Caches younger than `maxAge` are not considered for deletion. They must
        // be at least this old in milliseconds (default: 2 days).
        maxAge: 2 * 24 * 60 * 60 * 1000,
        // All caches together must be larger than `sizeThreshold` before any
        // caches are deleted. Together they must be at least this big in bytes
        // (default: 50 MB).
        sizeThreshold: 50 * 1024 * 1024
      }
    }),
    new HardSourceWebpackPlugin.ExcludeModulePlugin([
      {
        test: /.*\.DS_Store/
      }
    ]),
  ]
}
Five, optimize compression time
1. webpack3
Start webpack 3 with --optimize-minimize, and webpack automatically injects a UglifyJsPlugin with its default configuration.
Or:
module.exports = {
  optimization: {
    minimize: true,
  }
}
To compress JavaScript, the code has to be parsed into an AST (an object representation of the syntax tree), then analyzed and transformed according to various rules, and finally turned back into code, so the whole process is computation-heavy and time-consuming. UglifyJsPlugin, however, is single-threaded, so we can use ParallelUglifyPlugin instead.
When webpack has multiple JS files to compress, ParallelUglifyPlugin opens several child processes and hands the compression of each file to a separate process. Each child process still compresses the code with UglifyJS, but the files are handled in parallel, so ParallelUglifyPlugin compresses multiple files much faster.
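For reference, a minimal sketch of a ParallelUglifyPlugin setup (the webpack-parallel-uglify-plugin package); the option values here are illustrative, so check the plugin's documentation for the version you install:

const ParallelUglifyPlugin = require('webpack-parallel-uglify-plugin');

module.exports = {
  // ...
  plugins: [
    new ParallelUglifyPlugin({
      // Cache compressed results on disk to speed up the next build (optional)
      cacheDir: '.uglify-cache/',
      // Options passed through to UglifyJS in every child process
      uglifyJS: {
        output: {
          // Do not keep comments in the output
          comments: false
        },
        compress: {
          // Drop console statements in the production bundle
          drop_console: true
        }
      }
    })
  ]
};

Each child process receives the same uglifyJS options and compresses its assigned files independently.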
2. webpack4
In webpack 4, webpack.optimize.UglifyJsPlugin has been deprecated.
ParallelUglifyPlugin is not recommended either: the project is largely unmaintained, with issues left unanswered and PRs unmerged.
By default, webpack 4 uses the built-in terser-webpack-plugin to compress and optimize the code; it uses Terser to minify JavaScript.
What is Terser?
The official definition of Terser is:
A JavaScript parser and mangler/compressor toolkit for ES6+.
Why did webpack choose Terser?
Because uglify-es is no longer maintained, and uglify-js does not support ES6+.
Terser is a fork of uglify-es that largely retains API and CLI compatibility with uglify-es and uglify-js@3.
Running Terser in multiple processes
Use multi-process parallel running to speed up the build. The default number of parallel runs is os.cpus().length - 1.
const TerserPlugin = require('terser-webpack-plugin');

module.exports = {
  optimization: {
    minimizer: [
      new TerserPlugin({
        // Run minification in parallel across multiple processes
        parallel: true,
      }),
    ],
  },
};
This can significantly speed up the build, so enabling parallel mode is highly recommended.
Six, optimize search time: narrow the file search scope to reduce unnecessary compilation
When webpack packages, it starts from the configured entry, parses the entry file's import statements, and then parses recursively. Every time webpack encounters an import statement it does two things:
- Find the file the import statement points to. For example, the file behind require('react') is ./node_modules/react/react.js, and the file behind require('./util') is ./util.js.
- Based on the suffix of the file it found, process the file with the loaders in the configuration. For example, a JavaScript file written with ES6 needs to be processed by babel-loader.
For a single file these two steps are fast, but when the project grows large the number of files becomes huge and the slow build speed shows. We cannot avoid these two steps, but we can reduce how often they happen in order to speed things up.
Here’s how you can optimize them.
1. Optimize loader configuration
When configuring a loader, use the test, include and exclude options to restrict which files the rule applies to.
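For example, a rule that limits babel-loader to the project's src directory (a minimal sketch; the directory name is an assumption):

const path = require('path');

module.exports = {
  module: {
    rules: [
      {
        test: /\.js$/,
        // Only transpile files under src; everything else is skipped entirely
        include: path.resolve(__dirname, 'src'),
        use: ['babel-loader?cacheDirectory']
      }
    ]
  }
};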
2. Optimize the resolve.module configuration
resolve.modules tells webpack which directories to search for third-party modules. Its default value is ['node_modules'], which means webpack looks in ./node_modules first, then ../node_modules, then ../../node_modules, and so on up the directory tree.
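A minimal sketch of this tip, using an absolute path so webpack skips the directory-by-directory upward lookup:

const path = require('path');

module.exports = {
  resolve: {
    // An absolute path stops webpack from walking up through
    // ../node_modules, ../../node_modules, and so on
    modules: [path.resolve(__dirname, 'node_modules')]
  }
};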
3. Optimize the resolve.alias configuration
The resolve.alias configuration item uses aliases to map the original import path to a new one, reducing time-consuming recursive resolution operations.
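For example, pointing an alias at a library's prebuilt single-file bundle lets webpack resolve one file instead of recursively resolving the library's internal modules (a minimal sketch; the exact dist path is an assumption and depends on the installed version):

const path = require('path');

module.exports = {
  resolve: {
    alias: {
      // Point 'react' straight at the prebuilt bundle instead of its source entry
      // (path is illustrative; check where your installed version ships its dist file)
      react: path.resolve(__dirname, 'node_modules/react/umd/react.production.min.js')
    }
  }
};

Be aware that aliasing a package to a prebuilt bundle like this also bypasses tree shaking for that package, so it is a trade-off rather than a free win.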
4. Optimize the resolve.extensions configuration
If an import statement has no file suffix, webpack appends the suffixes listed in resolve.extensions one by one and checks whether the file exists, so keep the following in mind when configuring resolve.extensions:
- Keep the list as small as possible; do not add suffixes that cannot occur in your project.
- Put the most frequently used suffix first, so the search exits as early as possible.
- When writing import statements in source code, include the suffix whenever possible to avoid the search altogether.
5. Optimize the resolve.mainFields configuration
Some third-party modules ship separate builds for different environments, for example one written in ES5 and one in ES6. The locations of the two builds are declared in package.json, like this:
{
  "jsnext:main": "es/index.js", // entry file for the ES6 build
  "main": "lib/index.js" // entry file for the ES5 build
}
Webpack decides which build to prefer based on the mainFields configuration, whose default is:
mainFields: ['browser', 'main']
Webpack looks up the fields in package.json in the order given by the array and uses the first one it finds.
If you want to use the ES6 code first, you can configure it like this:
mainFields: ['jsnext:main', 'browser', 'main']
6. Optimize configuration of module.noParse
The module.noParse configuration allows Webpack to ignore recursive parsing of files that are not modularized, which has the benefit of improving build performance. The reason is that some libraries, such as jQuery and ChartJS, are large and do not adopt modular standards, and it is time-consuming and pointless for Webpack to parse these files.
7. Configuration details
// Basic configuration for compiling the code
module.exports = {
  // ...
  module: {
    // jQuery used in the project does not follow a module standard, so webpack skips parsing it
    noParse: /jquery/,
    rules: [
      {
        // Compile js and jsx
        // Note: if there are no .jsx files in the project source, write /\.js$/ instead of
        // /\.jsx?$/ to keep the regular expression fast
        test: /\.(js|jsx)$/,
        // babel-loader supports caching the transpiled results, enabled via the cacheDirectory option
        use: ['babel-loader?cacheDirectory'],
        // Exclude files in node_modules:
        // they are already ES5, so there is no need to run them through Babel
        exclude: /node_modules/
      }
    ],
  },
  resolve: {
    // Set module lookup rules: import/require will look for files directly in these directories
    // You can specify the absolute path of third-party modules to reduce searching
    modules: [
      path.resolve(`${project}/client/components`),
      path.resolve('h5_commonr/components'),
      'node_modules'
    ],
    // Suffixes that may be omitted from import statements
    // Note: keep the number of suffix attempts as small as possible
    extensions: ['.js', '.jsx', '.react.js', '.css', '.json'],
    // Import aliases, reducing time-consuming recursive resolution
    alias: {
      '@compontents': path.resolve(`${project}/compontents`),
    }
  }
};
Those are all the build-performance optimizations related to narrowing the file search scope. Apply the ones that fit your project and you will definitely see your build speed improve.
Previous posts in the webpack series
Five visual solutions to analyze WebPack performance bottlenecks
Best configuration for Webpack refers to north
How it works: hand-write a JavaScript bundler
It’s time to ditch Postman and try out the VS Code widget 👏👏👏
If you think it’s good, just like it! 👍 👍 👍
For more on this series, go to the GitHub blog home page
Finally
-
❤️ Have fun, keep learning, and always keep coding. 👨‍💻
-
If you have any questions or a different take, please leave a comment or contact Mr. Bottle directly (scan the code to follow the official account and reply 123)! 👀 👇
-
👇 Welcome to follow: Front-end Mr. Bottle, updated daily! 👇