Preface
In day-to-day development, especially on large projects, do you feel like swearing every time you run a build? I certainly did: every build took 20 minutes, and the long wait was agonizing. For some problems (such as testing WeChat sharing) you have to package and deploy before you can even see the effect, and you end up finding a whole day gone on builds; as the saying goes, two minutes changing code, two hours packaging it. So in my spare time I studied webpack's packaging mechanism and, with a few small plugins and some tricks, managed to cut our company project's build time. It wasn't a cliff-like drop, but a minute saved is a minute saved. Let's dig into webpack performance optimization and bundle size optimization!
1. Upgrade the base environment version
If you find during a project that the scaffold builds slowly and the output is bloated, upgrading the underlying environment is the fastest win: upgrade the Node version, upgrade the scaffold (CLI) version, and the improvement is usually noticeable. It is also the cheapest optimization.
2. Use include or exclude to avoid repeated packaging
In daily development, some of the libraries we import are already pre-compiled, so when we compile with Babel there is no need to compile them again; excluding them reduces packaging time. As a bit of background, here is what webpack actually does when we import a package. For example, suppose we import jQuery in the project:
import $ from 'jquery'
When jquery is referenced as an ES module, the lookup starts from node_modules in the current directory; if it is not found there, webpack goes up one level to the parent's node_modules, and so on all the way to the disk root. If it is still not found, an error is thrown. Once the package is found in node_modules, webpack reads its package.json, identifies the entry file declared there, loads that entry file, and finds that it is already a compiled file.
{
  test: /\.js$/,
  // Use include to limit the build to our own source folder
  include: path.resolve(__dirname, '../src'),
  exclude: /node_modules/,
  use: [{ loader: 'babel-loader' }]
}
Of course, you can also use noParse to tell webpack not to parse certain files, which likewise avoids repeated packaging work.
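A minimal sketch of noParse, assuming jquery and lodash are imported as pre-built bundles that contain no require/import statements of their own (webpack then skips dependency parsing for them entirely):

module.exports = {
  module: {
    // Files matching this regex are not parsed for dependencies,
    // so already-bundled libraries are not analyzed again
    noParse: /jquery|lodash/
  }
}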
3. Use caching wisely to reduce packaging time
Take the babel-loader as an example
{
  test: /\.js$/,
  use: [{
    // If the file has not changed, reuse the cached result
    loader: 'babel-loader?cacheDirectory'
  }]
}
So common libraries that rarely change can be packaged this way. Besides this trick there is also cache-loader, which also enables caching. Its usage is very simple: place it in front of loaders that are expensive to run.
{
  test: /\.js$/,
  use: [
    'cache-loader',
    'babel-loader'
  ],
  include: path.resolve('src')
}
4. Use plugin reasonably to reduce packaging time and volume
When we set up webpack scaffolding, we inevitably use plugins to meet our requirements. So how should we choose them?
- First, prefer the officially recommended plugins; they have been tested and their performance is reliable.
- Second, use these plugins sensibly and avoid pulling in useless modules and code.
For example, if we use the moment library, webpack bundles the whole library by default, which makes the output very large. We can use IgnorePlugin to ignore the plugin's useless locale folder, which greatly reduces bundle size. Likewise, for CSS compression with OptimizeCssAssetsPlugin, we only need to compress code in the production environment; leaving the plugin out of the development build effectively shortens build time.
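A minimal sketch of both ideas, assuming a webpack 4 setup and that optimize-css-assets-webpack-plugin is installed (the IgnorePlugin arguments use the webpack 4 regular-expression form; any locale you still need is then imported explicitly in business code):

const webpack = require('webpack')
const OptimizeCssAssetsPlugin = require('optimize-css-assets-webpack-plugin')

const isProd = process.env.NODE_ENV === 'production'

module.exports = {
  plugins: [
    // Skip moment's locale folder so it is not bundled
    new webpack.IgnorePlugin(/^\.\/locale$/, /moment$/),
    // Only compress CSS in production builds
    ...(isProd ? [new OptimizeCssAssetsPlugin()] : [])
  ]
}

// In business code, explicitly import the locales you actually need:
// import moment from 'moment'
// import 'moment/locale/zh-cn'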
5. Configure resolve properly so it does not slow down the build
When we import ES6 modules, we find we can reference them without writing file extensions. This convenience actually comes from webpack's resolve configuration:
resolve: {
  extensions: ['.js', '.jsx']
}
However, because of this convenience, we tend to configure many extensions, such as .jpg, .css, .png and so on. Since no extension is written in the import, webpack has to try every extension in the extensions array in turn before it can report an error, which greatly increases lookup time. So configure it sensibly and keep the list short.
6. Enable multi-process packaging (note the pitfall)
There are three main ways to enable multi-process packaging: HappyPack, thread-loader, and ParallelUglifyPlugin.
happypack
HappyPack spawns multiple Node processes to package in parallel; its usage is a bit cumbersome:
const HappyPack = require('happypack')

// In module.rules, hand the processing of .js files over to the HappyPack instance whose id is "babel"
{
  test: /\.js$/,
  use: ['happypack/loader?id=babel'],
  include: srcPath,
  // exclude: /node_modules/
},

// In plugins, create the HappyPack instance
new HappyPack({
  // A unique id identifies the current HappyPack instance; it is used to handle one class of files
  id: 'babel',
  // How to process .js files, same usage as in a normal loader config
  loaders: ['babel-loader?cacheDirectory']
}),
Of course, since thread-loader is highly recommended by the official webpack 4 documentation and HappyPack is no longer maintained, we prefer thread-loader for multi-process packaging.
thread-loader
thread-loader is also very simple to use: just place it in front of other loaders, and those loaders will then run in a separate worker pool. It also supports configuration options for fine-tuning performance.
// Simplest usage: put thread-loader directly before other loaders
{
  test: /\.js$/,
  exclude: /node_modules/,
  use: [
    'thread-loader',
    'babel-loader'
  ]
},

// With custom options
{
  test: /\.js$/,
  exclude: /node_modules/,
  use: [
    {
      loader: 'thread-loader',
      // Loaders with equal options will share worker pools
      options: {
        // Number of workers, defaults to the number of CPU cores
        workers: 2,
        // Number of parallel jobs per worker, defaults to 20
        workerParallelJobs: 50,
        // Additional node.js arguments for the workers
        workerNodeArgs: ['--max-old-space-size=1024'],
        // Allow re-spawning a dead worker pool;
        // respawning slows down overall compilation, so set it to false in development
        poolRespawn: false,
        // Idle time before workers are killed;
        // in watch mode it can be set to Infinity so the pool stays alive
        poolTimeout: 2000,
        // Number of jobs the pool distributes to a worker, defaults to 200;
        // lower values are less efficient but give a more even job distribution
        poolParallelJobs: 50
      }
    },
    'babel-loader'
  ]
}
ParallelUglifyPlugin
As we know, to compress JS the code must first be parsed into an AST, then the AST is analyzed and transformed according to complex rules, and finally regenerated as JS. This involves a lot of computation and is therefore time-consuming. ParallelUglifyPlugin can start multiple sub-processes and compress the output files in parallel.
const ParallelUglifyPlugin = require('webpack-parallel-uglify-plugin')

plugins: [
  // Compress the output JS in parallel sub-processes
  new ParallelUglifyPlugin({
    // Options passed straight to UglifyJS
    uglifyJS: {
      output: {
        beautify: false, // most compact output
        comments: false  // remove all comments
      },
      compress: {
        drop_console: true,   // remove all `console` statements
        collapse_vars: true,  // inline variables that are defined but used only once
        reduce_vars: true     // extract static values that occur multiple times into variables
      }
    }
  })
]
OK, that covers the basics of multi-process optimization. Now for the pitfall: many people find that after configuring it, build time actually gets slower instead of faster. That is because if the project is small, spawning multiple processes adds extra overhead and slows things down. So if the project is large, we recommend using it (it did make our company's project a bit faster); if the project is small, don't use a cleaver to kill a chicken.
7. Use hot updates in development instead of automatic refreshes
In daily development, the page has to refresh every time we change code. If the project is too large, we can go drink a few sips of water before the refresh finishes, which is quite painful.
We can use hot updates instead of automatic refreshes to improve the development experience. Don't ask me why; I have experienced waiting ten seconds after every code change (this happened in our old Angular projects). So how do we use hot updates? It is also very simple: just introduce a plugin.
const webpack = require('webpack');

// Use the hot-update plugin provided by webpack
plugins: [
  new webpack.HotModuleReplacementPlugin()
],
// Finally, enable it in devServer
devServer: {
  hot: true
}
You can use the @AngularClass/HMR plugin if you are using an Angular project
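Note that with hot: true, CSS handled by style-loader updates in place, but plain JS modules still need to accept updates themselves, otherwise webpack falls back to a full page refresh. A minimal sketch, assuming a hypothetical ./render.js module imported from the entry file:

import { render } from './render.js'

render()

// module.hot only exists once HotModuleReplacementPlugin is enabled
if (module.hot) {
  // Re-run rendering when ./render.js (or its dependencies) change
  module.hot.accept('./render.js', () => {
    render()
  })
}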
8. Use DllPlugin to optimize and improve packaging time
When we use relatively stable community libraries such as React, Vue, or jQuery, you will notice they may not be updated for months, so there is no need to re-package them every time: package them once, and later builds only reference the result. How do we do that? DllPlugin is already integrated in webpack 4, so we only need to configure it.
The first step is to write a separate DLL config file and use it to package the DLL bundle.
const path = require('path')
const DllPlugin = require('webpack/lib/DllPlugin')
const { srcPath, distPath } = require('./paths')
module.exports = {
  mode: 'development',
  // Take the React modules as an example and put them into a separate dynamic link library
  entry: {
    react: ['react', 'react-dom']
  },
  output: {
    // Name of the output dynamic link library file; [name] matches the entry name, e.g. react
    filename: '[name].dll.js',
    // Put the output files in the dist directory
    path: distPath,
    // Global variable name the dynamic link library is exposed under, e.g. _dll_react;
    // the _dll_ prefix prevents global variable conflicts
    library: '_dll_[name]',
  },
  plugins: [
    new DllPlugin({
      // Global variable name of the dynamic link library; must match output.library above.
      // Its value is also the "name" field in the generated manifest file, e.g. react.manifest.json
      name: '_dll_[name]',
      // Path of the manifest.json file that describes the dynamic link library
      path: path.join(distPath, '[name].manifest.json'),
    }),
  ],
}
The second step is to configure the mapping in webpack so the npm packages are not bundled again during packaging:
const DllReferencePlugin = require('webpack/lib/DllReferencePlugin');

plugins: [
  // Tell webpack which dynamic link libraries have already been built
  new DllReferencePlugin({
    // Read the manifest file generated by DllPlugin
    manifest: require(path.join(distPath, 'react.manifest.json')),
  }),
]
The third step is to reference the packaged public module in our HTML. When DllReferencePlugin is configured, webpack actually looks the package up on window at runtime, so we must import the packaged dll file, which hangs a global variable on window. We can import it with the AddAssetHtmlWebpackPlugin plugin, or add the script tag manually.
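A small sketch of the plugin approach, assuming add-asset-html-webpack-plugin is installed and used together with html-webpack-plugin (the dist path follows the earlier config; check the plugin's README for the exact option names):

const AddAssetHtmlPlugin = require('add-asset-html-webpack-plugin')
const HtmlWebpackPlugin = require('html-webpack-plugin')
const path = require('path')

plugins: [
  new HtmlWebpackPlugin({ template: './src/index.html' }),
  // Copy the dll bundle into the output directory and inject a <script> tag for it
  new AddAssetHtmlPlugin({
    filepath: path.resolve(__dirname, './dist/react.dll.js')
  })
]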
The above can greatly reduce packaging time. However, since DllPlugin mainly solves slow builds during development, it is recommended not to use it for the production build; the original unified packaging process can stay for production.
9. Some bundle size and runtime performance optimizations
Use lazy loading to speed up the first screen loading time
Lazy loading is a cliché, but it is an essential performance optimization. When part of the page is large and not critical, we can load it asynchronously via lazy loading, so the page reaches a renderable state sooner. Here is how lazy loading is used:
import('./util.js').then(data => {
  // Loaded asynchronously; data.default holds the module's default export
  console.log(data.default)
})
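If you want the generated async chunk to have a readable file name, webpack also supports a magic comment inside the dynamic import; a small sketch, assuming the same util.js as above:

// The webpackChunkName comment names the output chunk, e.g. util.[hash].js
import(/* webpackChunkName: "util" */ './util.js').then(data => {
  console.log(data.default)
})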
Encode small images as base64 so they need no network requests
When packaging, small images can be converted directly to base64 and inlined, reducing HTTP requests, a classic front-end performance optimization. It is very simple to use:
rules: [
  // Images: consider base64 encoding
  {
    test: /\.(png|jpg|jpeg|gif)$/,
    use: {
      loader: 'url-loader',
      options: {
        // Images smaller than 5KB are inlined as base64;
        // otherwise file-loader is used and a URL is emitted
        limit: 5 * 1024,
        // Emit images to the img1 directory
        outputPath: '/img1/',
        // CDN address for images (can also be set on output, which then applies to all static assets)
        // publicPath: 'http://cdn.abc.com'
      }
    }
  }
]
Use hashes properly when packaging so that unmodified files hit the cache
If a file changes, its hash changes; after going live, files that have not changed keep the same name and hit the browser cache when accessed, achieving the performance goal. Usage is as follows:
output: {
  // Add a content hash suffix to the bundle name when packaging
  filename: 'bundle.[contenthash:8].js',
  path: distPath
}
Extract common code with code splitting
When we extract common code during packaging and implement code splitting, code shared by multiple modules only needs to be packaged once, and the overall bundle gets smaller. How do we use it? Webpack gives us the optimization property to configure:
optimization: {
  splitChunks: {
    // 'async' by default (only async chunks are optimized); can be 'all', 'initial', 'async'
    // or a function(chunk). In the function form the parameter is each entry chunk being
    // iterated; chunk._modules holds all modules the chunk depends on, so you can decide
    // freely based on the chunk name and its dependencies. All public modules of the
    // matching chunks, together with their dependencies (including CSS), are extracted.
    chunks: 'async',
    minSize: 30000,              // minimum module size (before compression) to extract, default 30KB
    minChunks: 1,                // number of references required, default 1
    maxAsyncRequests: 5,         // at most 5 parallel requests for all async chunks
    maxInitialRequests: 3,       // at most 3 parallel requests at initial load
    automaticNameDelimiter: '~', // name delimiter, default '~'
    name: true,                  // name of the generated chunk; true derives it from the chunk names and the delimiter
    // Cache groups extract chunks that match different rules; "common" is used as an example here
    cacheGroups: {
      common: {
        name: 'common',          // name of the extracted chunk
        // test can be a string, a regular expression or a function; matching modules are
        // extracted into the "common" chunk. In the function form the first parameter is each
        // module being traversed and the second is the array of chunks referencing that module.
        // (I found I could not extract CSS this way while experimenting; this still needs verification.)
        test(module, chunks) {
        },
        priority: 10,            // a module may match several cache groups; it goes to the one with the highest priority
        minChunks: 2,            // minimum number of chunks that must reference a module
        reuseExistingChunk: true, // if the chunk contains modules that have already been extracted, reuse the existing chunk
        // If minSize is not set in the cache group, this decides whether to use the upper-layer
        // minSize: true means 0 is used, false means the upper-layer value is used
        enforce: true
      }
    }
  }
}
Use tree-shaking to remove useless code to reduce code size
The purpose of tree-shaking is to remove code that is imported but never used. In webpack 4, if a file references several exports but only uses one of them, the others may still be packaged; webpack 5 is said to resolve a more powerful dependency graph and remove code that is ultimately unused. For example:
// a.js
export const a = 'a';
export const b = 'b';
// b.js
import * as c from "./a";
export { c };
// index.js
import * as module from "./b";
console.log(module.c.a);
In the example above, the b variable is never used, so it ends up being dropped from the final bundle. How do we turn this on?
mode: 'production'
Just one line: setting mode to production is enough.
Conclusion
That basically wraps up the webpack optimizations. This article is a summary of what I practiced on my own company's project after reading articles by more experienced developers, and the strategies here proved effective. Please correct me where I'm wrong!
My blog is also synced to the Tencent Cloud+ Community; everyone is welcome to join: cloud.tencent.com/developer/s…