Webpack build speed and bundle size optimization strategies
Primary analysis: use webpack's built-in stats
- stats: build statistics
- Using stats from package.json
"Scripts ": {"build:stats": "webpack --env production --json > stats. Json"},Copy the code
// Using the Node.js API
const webpack = require('webpack');
const prodConfig = require('../../lib/webpack.prod.js');

webpack(prodConfig, (err, stats) => {
  if (err) {
    console.error(err);
    process.exit(2);
  }
  console.log(stats);
});
Downside: the granularity is too coarse to pinpoint where the problem is
Speed analysis: use speed-measure-webpack-plugin
- Measures the total time of the entire build
- Measures the time spent in each plugin and loader
const SpeedMeasureWebpackPlugin = require('speed-measure-webpack-plugin');
const smp = new SpeedMeasureWebpackPlugin();
module.exports = smp.wrap({
...
})
Bundle size analysis: use webpack-bundle-analyzer
- What it can analyze:
- The file size of third-party dependency modules
- The size of component code in the business
const { BundleAnalyzerPlugin } = require('webpack-bundle-analyzer');

plugins: [
  new BundleAnalyzerPlugin()
]
Use newer versions of webpack and Node.js
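As a minimal sketch (the version ranges below are illustrative assumptions, not recommendations from this article), the toolchain can be pinned in package.json so the whole team builds on the same versions:

{
  "engines": {
    "node": ">=14"
  },
  "devDependencies": {
    "webpack": "^5.0.0",
    "webpack-cli": "^4.0.0"
  }
}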
Multi-process/multi-instance build
- Use thread-loader to parse resources in parallel
{
  test: /\.js$/,
  use: [
    {
      loader: 'thread-loader',
      options: {
        workers: 3
      }
    },
    'babel-loader'
  ]
}
How it works: every time webpack parses a module, thread-loader assigns the module and its dependencies to worker threads
Multi-process parallel code compression
Use terser-webpack-plugin with the parallel option enabled
optimization: {
  minimizer: [
    new TerserPlugin({
      // parallel: true uses os.cpus().length - 1 workers by default
      parallel: true
    })
  ]
}
Pre-bundling: precompile resource modules
Idea: bundle base packages (react, react-dom, etc.) and business base packages into a single file ahead of time
Method: use DllPlugin to pre-bundle, and DllReferencePlugin to reference the generated manifest.json
// webpack.dll.js
const path = require('path');
const webpack = require('webpack');

module.exports = {
  entry: {
    library: ['react', 'react-dom']
  },
  output: {
    filename: '[name]_[chunkhash].dll.js',
    path: path.join(__dirname, 'build/library'),
    library: '[name]'
  },
  plugins: [
    new webpack.DllPlugin({
      name: '[name]', // must match output.library
      path: path.join(__dirname, 'build/library/[name].json')
    })
  ]
};

// webpack.prod.js
plugins: [
  new webpack.DllReferencePlugin({
    manifest: require('./build/library/library.json')
  })
]
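A usage sketch (the script names are assumptions): run the DLL build once, and rerun it only when the base packages change; the production build then only reads the manifest.

"scripts": {
  "dll": "webpack --config webpack.dll.js",
  "build": "webpack --config webpack.prod.js"
}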
Make full use of caching to speed up subsequent builds
- Caching approaches:
- Enable the babel-loader cache
{
  test: /\.js$/,
  use: ['babel-loader?cacheDirectory=true']
}
- Enable the terser-webpack-plugin cache
optimization: {
  minimizer: [
    new TerserPlugin({
      parallel: true,
      cache: true
    })
  ]
}
- Use cache-loader or hard-source-webpack-plugin (a cache-loader sketch follows the example below)
const HardSourceWebpackPlugin = require('hard-source-webpack-plugin');

plugins: [
  new HardSourceWebpackPlugin()
]
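For cache-loader, a minimal sketch (assuming babel-loader is the expensive loader being cached): place cache-loader in front of the loaders whose output you want cached to disk.

{
  test: /\.js$/,
  use: [
    'cache-loader', // caches the result of the loaders below it on disk
    'babel-loader'
  ],
  include: path.resolve('src')
}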
Narrow the build scope
- Goal: build as few modules as possible. For example, stop babel-loader from parsing node_modules
{
  test: /\.js$/,
  include: path.resolve('src'),
  exclude: /node_modules/,
  use: [
    'babel-loader'
  ]
},
- Reduce file search scope
- Optimize the resolve.modules configuration (reduce the module search levels; see the sketch after the alias example below)
- Optimize the resolve.mainFields configuration
- Optimize the resolve.extensions configuration
- Use alias sensibly
resolve: {
  alias: {
    'react': path.resolve(__dirname, './node_modules/react/umd/react.production.min.js'),
    'react-dom': path.resolve(__dirname, './node_modules/react-dom/umd/react-dom.production.min.js')
  },
  extensions: ['.js'],
  mainFields: ['main']
}
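For resolve.modules, a sketch (assuming all dependencies live in the project-root node_modules) that stops webpack from walking up through ancestor directories:

const path = require('path');

resolve: {
  // search only the project-root node_modules instead of every ancestor directory
  modules: [path.resolve(__dirname, 'node_modules')]
}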
Tree shaking: erase unused JavaScript and CSS
- Erase unused JavaScript (see "Webpack from the shallow to the deep" Series 1; a config sketch follows the CSS example below)
- Erase unused CSS
- Use purgecss-webpack-plugin together with mini-css-extract-plugin
const path = require('path');
const glob = require('glob');
const MiniCssExtractPlugin = require('mini-css-extract-plugin');
const PurgecssWebpackPlugin = require('purgecss-webpack-plugin');

const PATHS = {
  src: path.join(__dirname, 'src')
};

plugins: [
  new MiniCssExtractPlugin({
    filename: '[name][contenthash:8].css'
  }),
  new PurgecssWebpackPlugin({
    paths: glob.sync(`${PATHS.src}/**/*`, { nodir: true })
  })
]
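For the JavaScript side, a minimal config sketch (production mode already enables this by default; usedExports is spelled out only for illustration):

// webpack.prod.js
module.exports = {
  mode: 'production',   // enables tree shaking and minification by default
  optimization: {
    usedExports: true   // mark unused exports so the minifier can drop them
  }
};
// In package.json, "sideEffects": false additionally lets webpack drop whole unused modules.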
Image compression
- Requirement: Node-based libraries such as imagemin, or the tinypng API
- Usage: configure image-webpack-loader
rules: [
  {
    test: /\.(gif|png|jpe?g|svg)$/i,
    use: [
      'file-loader',
      {
        loader: 'image-webpack-loader',
        options: {
          mozjpeg: {
            progressive: true
          },
          // optipng.enabled: false disables optipng
          optipng: {
            enabled: false
          },
          pngquant: {
            quality: [0.65, 0.90],
            speed: 4
          },
          gifsicle: {
            interlaced: false
          },
          // the webp option enables WebP output
          webp: {
            quality: 75
          }
        }
      }
    ]
  }
]
Use polyfills dynamically
- Principle: identify the browser's User-Agent and deliver the matching polyfills (see the sketch after this list)
- How to use: a dynamic polyfill service
- The official service provided by polyfill.io
<script src="https://cdn.polyfill.io/v2/polyfill.min.js"></script>
- Or build a self-hosted service based on the official polyfill-service
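A minimal sketch of the principle (a hypothetical Express endpoint, not the official polyfill-service implementation): read the User-Agent header and return only what that browser is missing.

const express = require('express');
const app = express();

app.get('/polyfill.min.js', (req, res) => {
  const ua = req.headers['user-agent'] || '';
  // Hypothetical rule: legacy IE needs Promise and fetch polyfills,
  // modern browsers get an empty bundle.
  const needed = /MSIE|Trident/.test(ua) ? ['promise', 'fetch'] : [];
  res.type('application/javascript');
  // A real service would concatenate the actual polyfill sources here.
  res.send(`/* polyfills served: ${needed.join(', ') || 'none'} */`);
});

app.listen(3000);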
Finally
- Webpack from the shallow to the deep: make the "familiar" on your resume real (Series 1)
- Webpack from the shallow to the deep: make the "familiar" on your resume real (Series 2)
- Webpack from the shallow to the deep: make the "familiar" on your resume real (Series 3)
- Webpack from the shallow to the deep: make the "familiar" on your resume real (Series 4)
- Webpack from the shallow to the deep: make the "familiar" on your resume real (Series 5)