First, the webpack packaging problem
Given a config like the one below, webpack packages in the following order:
```javascript
var path = require('path');
module.exports = {
  entry: {
    one: "./src/one.js",
    two: "./src/two.js"
  },
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: "[name].js"
  }
};
```
1. Find the entry files.
2. Starting from each entry, find every file (JS/CSS) in its dependency graph.
3. Finally, pack all of that CSS/JS into one JS bundle per entry.
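The three steps above can be sketched as a toy bundler (this is not webpack itself, just an illustration of the idea; the module table is made up):

```javascript
// Toy sketch of the three steps: start at an entry, depth-first walk
// the dependency graph, and concatenate every module's code once.
function bundle(entryName, modules) {
  const seen = new Set();
  const parts = [];
  (function walk(name) {
    if (seen.has(name)) return;
    seen.add(name);
    for (const dep of modules[name].deps) walk(dep); // dependencies first
    parts.push(modules[name].code);
  })(entryName);
  return parts.join('\n');
}

// Hypothetical module table: one entry depending on a style file.
const modules = {
  'one.js':      { deps: ['button.less'], code: "console.log('one');" },
  'button.less': { deps: [],              code: '/* .btn { color: #ccc } */' }
};
console.log(bundle('one.js', modules)); // styles first, then the entry code
```

This is exactly why one changed color value drags the whole entry along: the bundle is a single concatenated artifact.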
OK, everything is bundled and the world is at peace. Then the question arrives:
Product: "The button color is wrong. Change it to #ccc for me."
Tech: "OK, I'll change it."
The flow then looks like this:
1. Find entry -> js -> component -> button.less and modify one color value
2. Run the webpack build
And that's where the problems come in:
1. Only one color value changed, yet everything is repackaged from the entry.
2. Business code that did not change gets rebuilt along with it.
3. All the generated JS has to be pushed online, overwriting business JS that had no problems at all, which purely adds risk.
Second, thinking through solutions
The first thought: since only one file changed, can we repackage just that one file?
1. Package the updated file separately?
This scheme was quickly self-disproved, because:
1. The bundle built from the entry has already pulled the old button.less, via its dependencies, into the final output JS.
2. Packaging button.less on its own produces a separate button.js that has to be imported into the HTML manually; once there are many such files they become impossible to maintain.
After much thought: packaging each file individually goes against webpack's design intent, and the pack-from-the-entry process cannot be changed.
I was stuck on this issue for a really long time... a really long time.
2. Can a single entry be packaged by following dependencies?
Since my scenario is a multi-page application with multiple entries, could I use the dependency graph to find just the entries that need updating?
This plan was also considered for a long time, and later rejected, because:
1. Webpack has no plugin for exporting module dependency relationships.
2. The dependencies can be pulled out of webpack's stats output, but the volume is huge; unfiltered, the project currently emits 120,000 lines of JSON that would need further processing to extract the relationships.
3. If a component is referenced by multiple entries, every referencing entry has to be found and repackaged.
The points above, especially the third, are completely at odds with the goal of incremental publishing: change one Button component and you repackage 20 or 30 entries, which makes incremental publishing meaningless.
I stayed tangled in this problem for a long time... a really long time.
Third, a reasonable solution
After the first two dead ends, I realized I had been thinking in the wrong direction, always trying to change how webpack packages, which goes against its philosophy.
If I can't change how webpack packages, can I change what webpack outputs?
In fact, webpack provides powerful plugins for caching, such as:
- CommonsChunkPlugin: extracts common JS code at packaging time
- ExtractTextPlugin: extracts CSS out of JS into a separate file
With these two plugins we can divide the packaging more precisely, putting the commonly referenced parts into separate files.
If the commonly referenced parts become separate files, and we cache them with a hash, then whenever we change something only the affected hashes update, and we can tell exactly which files changed.
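To make the end goal concrete: once output names carry a content hash, an incremental publish is just a diff of two asset manifests (chunk name -> hashed filename). A minimal sketch, with made-up filenames:

```javascript
// Sketch: compare the asset manifest before and after a build and
// return only the chunks whose hashed filename changed. Those are the
// only files that need to be pushed online.
function changedAssets(oldManifest, newManifest) {
  return Object.keys(newManifest).filter(
    name => oldManifest[name] !== newManifest[name]
  );
}

const before = { one: 'one.aaaa11.js', two: 'two.bbbb22.js', common: 'common.cccc33.js' };
const after  = { one: 'one.dddd44.js', two: 'two.bbbb22.js', common: 'common.cccc33.js' };
console.log(changedAssets(before, after)); // [ 'one' ]
```

Everything in the rest of this article is about making webpack produce hashes stable enough for this diff to be trustworthy.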
In this case, let’s explore step by step:
1. First use CommonsChunkPlugin to extract common JS
Now create two test entry files:
src/one.js:
```javascript
import jquery from 'jquery';
console.log('one');
```
src/two.js:
```javascript
import jquery from 'jquery';
console.log('two');
```
webpack.config.js:
```javascript
var path = require('path');
module.exports = {
  entry: {
    one: "./src/one.js",
    two: "./src/two.js"
  },
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: "[name].js"
  }
};
```
Run webpack.
It outputs two files, both 271KB, because one.js and two.js both import jquery, so jquery is bundled twice, once into each file.
That is clearly unfriendly; a library like jquery should be extracted and cached. Modify webpack.config.js:
```javascript
var webpack = require("webpack");
var path = require('path');
module.exports = {
  entry: {
    one: "./src/one.js",
    two: "./src/two.js"
  },
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: "[name].js"
  },
  plugins: [
    new webpack.optimize.CommonsChunkPlugin({
      name: "common"
    })
  ]
};
```
We have added the CommonsChunkPlugin to extract the common JS. Run webpack again.
Now one.js and two.js are under 1K each, while common.js is 274K: jquery has been packed into common.js.
2. Add a hash to the filenames
```javascript
var webpack = require("webpack");
var path = require('path');
module.exports = {
  entry: {
    one: "./src/one.js",
    two: "./src/two.js"
  },
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: "[name].[hash:6].js"
  },
  plugins: [
    new webpack.optimize.CommonsChunkPlugin({
      name: "common"
    })
  ]
};
```
The output filename is now [name].[hash:6].js.
Run webpack:
All three packaged files carry a hash, but note: the hash is the same for every file.
Run webpack again:
The output of both builds is identical, which is good: you don't want the hash to change when no file was modified.
Next, modify one.js:
```javascript
import jquery from 'jquery';
console.log('change one');
```
Rebuild; the output shows every file's hash has changed:
Only one file was modified, but the hash of every output file changed with it. That is a serious problem and obviously not what we want.
3. Use chunkhash instead of hash
Webpack offers several hash placeholders for caching; chunkhash is one of them.
Chunkhash, simply put, hashes each chunk based on its own content, so as long as a file is unchanged no new hash is generated for it.
```javascript
var webpack = require("webpack");
var path = require('path');
module.exports = {
  entry: {
    one: "./src/one.js",
    two: "./src/two.js"
  },
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: "[name].[chunkhash:8].js"
  },
  plugins: [
    new webpack.optimize.CommonsChunkPlugin({
      name: "common"
    })
  ]
};
```
As shown above, filename becomes "[name].[chunkhash:8].js".
Run webpack.
The build hash this time is 4897...,
but each output file's hash is no longer that 4897...; every chunk has its own.
Good. Run webpack one more time:
The hashes do not change between builds.
Now modify one.js and run webpack:
```javascript
import jquery from 'jquery';
console.log('Modify one after using chunkhash');
```
The hash of one.js changed as expected, but the hash of common.js changed as well...
4. Extract the manifest
After extracting with the CommonsChunkPlugin, the common code has been separated out, but a mapping between the chunks must still live somewhere.
The reason the common.js hash changes is that modifying one.js gives it a new hash, and the jquery-to-one.js mapping, which lives in common.js, is updated to point at the new one.js; so common.js changes too.
The manifest can be understood simply as this collection of module mappings, and by default the manifest is packaged into the extracted common chunk!
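A simplified picture of what the manifest holds (this is an illustration, not webpack's actual runtime; the filenames are made up):

```javascript
// The manifest is essentially a chunk-id -> hashed-filename table plus
// the loading logic. Whichever output file carries this table must
// change whenever ANY chunk's hash changes, which is why common.js was
// changing above even though its own code was untouched.
const manifest = {
  chunkFiles: { 0: 'one.a1b2c3.js', 1: 'two.d4e5f6.js', 2: 'common.778899.js' },
  scriptFor(id) {
    return '/dist/' + this.chunkFiles[id];
  }
};
console.log(manifest.scriptFor(0)); // '/dist/one.a1b2c3.js'
```

Moving this table into its own tiny file is what "extracting the manifest" means.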
So let's separate out the manifest:
```javascript
var webpack = require("webpack");
var path = require('path');
module.exports = {
  entry: {
    one: "./src/one.js",
    two: "./src/two.js"
  },
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: "[name].[chunkhash:8].js"
  },
  plugins: [
    new webpack.optimize.CommonsChunkPlugin({
      name: "common"
    }),
    new webpack.optimize.CommonsChunkPlugin({
      name: 'manifest' // extract the manifest/runtime
    })
  ]
};
```
The CommonsChunkPlugin extracts by chunk name; when given a name that matches no existing entry, webpack moves the runtime (the manifest) into that chunk, so naming it 'manifest' does the job.
Now run webpack:
An additional manifest.js is output.
Next, modify one.js:
```javascript
import jquery from 'jquery';
console.log('Split manifest and modify one');
```
Only the one.js and manifest.js hashes changed; common.js has been successfully cached.
Comparing the two manifests with a diff tool shows that the hash recorded for the changed chunk has indeed changed.
5. Use the webpack-md5-hash plugin
Extracting manifest.js works, but it leaves an extra file that must be handled separately. Another option is the webpack-md5-hash plugin:
```javascript
var webpack = require("webpack");
var WebpackMd5Hash = require('webpack-md5-hash');
var path = require('path');
module.exports = {
  entry: {
    one: "./src/one.js",
    two: "./src/two.js"
  },
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: "[name].[chunkhash:8].js"
  },
  plugins: [
    new WebpackMd5Hash(),
    new webpack.optimize.CommonsChunkPlugin({
      name: "common"
    })
  ]
};
```
Run the build:
No manifest is output this time. Now modify one.js:
```javascript
import jquery from 'jquery';
console.log('Modify one using WebpackMd5Hash');
```
Build again:
This time only the one.js hash changed.
webpack-md5-hash solves our problem, but it also turns the packaged module relationships into a black box, with unknown risks that need careful testing to evaluate.
6. Package libraries that change very infrequently
The common code has been extracted, but a question remains: if lodash is now introduced, will the common hash change?
Modify one.js:
```javascript
import jquery from 'jquery';
import lodash from 'lodash';
console.log('Introducing lodash to modify one');
```
Modify two.js:
```javascript
import jquery from 'jquery';
import lodash from 'lodash';
console.log('Introduce lodash to modify two');
```
This time the hash of every file changed, and more significantly, common grew in size.
Lodash was added to common, but that is a mistake: lodash and jquery essentially never change, so they should be split out and packaged separately.
Modify webpack.config.js:
```javascript
var webpack = require("webpack");
var WebpackMd5Hash = require('webpack-md5-hash');
var path = require('path');
module.exports = {
  entry: {
    two: "./src/two.js",
    one: "./src/one.js",
    common: ['jquery', 'lodash']
  },
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: "[name].[chunkhash:8].js"
  },
  plugins: [
    new WebpackMd5Hash(),
    new webpack.optimize.CommonsChunkPlugin({
      name: "common"
    })
  ]
};
```
We've added a common entry pointing directly at jquery and lodash. Run the build again.
The output shows no significant change: the same 3 files, the same sizes, and the hashes are fine.
common is 817K.
What happens if another package is pulled in, for example react?
Modify one.js:
```javascript
import jquery from 'jquery';
import lodash from 'lodash';
import react from 'react';
console.log('React');
```
Modify two.js:
```javascript
import jquery from 'jquery';
import lodash from 'lodash';
import react from 'react';
console.log('React');
```
Run webpack.
The problem: common has grown again; react has clearly been packed into it. But what if we only want jquery and lodash cached permanently?
Modify webpack.config.js:
```javascript
var webpack = require("webpack");
var WebpackMd5Hash = require('webpack-md5-hash');
var path = require('path');
module.exports = {
  entry: {
    two: "./src/two.js",
    one: "./src/one.js",
    common: ['jquery', 'lodash']
  },
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: "[name].[chunkhash:8].js"
  },
  plugins: [
    new WebpackMd5Hash(),
    new webpack.optimize.CommonsChunkPlugin({
      name: 'common',
      minChunks: Infinity
    })
  ]
};
```
The only addition is minChunks: Infinity.
minChunks can also be a number: a value of 2 means any module referenced by at least 2 chunks is extracted. Infinity means no extra modules are ever pulled in; the chunk contains only what its entry lists.
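For contrast, the numeric form would look like this (a fragment sketch, assuming the same `webpack` require as in the config above; with only two entries every shared module would qualify):

```javascript
// Sketch: numeric minChunks extracts any module referenced by at least
// that many chunks, rather than only the modules listed in the entry.
new webpack.optimize.CommonsChunkPlugin({
  name: 'common',
  minChunks: 2 // modules shared by 2+ entry chunks get pulled into common
})
```

With Infinity, by contrast, react stays in the entries and common is frozen to exactly jquery + lodash.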
Run the webpack build again.
common is back to 816K, and react stays inside the two entry files. Next, give react its own cached chunk as well:
```javascript
var webpack = require("webpack");
var WebpackMd5Hash = require('webpack-md5-hash');
var path = require('path');
module.exports = {
  entry: {
    two: "./src/two.js",
    one: "./src/one.js",
    common: ['jquery', 'lodash'],
    react: ['react', 'react-redux']
  },
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: "[name].[chunkhash:8].js"
  },
  plugins: [
    new webpack.optimize.CommonsChunkPlugin({
      name: ['react', 'common'], // the last chunk listed also receives the manifest
      minChunks: Infinity
    }),
    new WebpackMd5Hash()
  ]
};
```
With the build above, the rarely-changing libraries are packaged separately and keep stable hashes.
7. Use HashedModuleIdsPlugin to fix the module ids
It looks perfect, but now change the order of the entries:
```javascript
entry: {
  react: ['react', 'react-redux'],
  two: "./src/two.js",
  one: "./src/one.js",
  common: ['jquery', 'lodash']
}
```
The hashes of the common and react chunks change again. That is because module ids are assigned incrementally in the order webpack resolves modules; change the resolution order and the ids change with it.
Hence HashedModuleIdsPlugin, which derives the module identifier from the module's relative path: if the module does not change, its identifier does not change.
```javascript
var webpack = require("webpack");
var WebpackMd5Hash = require('webpack-md5-hash');
var path = require('path');
module.exports = {
  entry: {
    common: ['jquery', 'lodash'],
    react: ['react', 'react-redux'],
    two: "./src/two.js",
    one: "./src/one.js"
  },
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: "[name].[chunkhash:8].js"
  },
  plugins: [
    new webpack.optimize.CommonsChunkPlugin({
      name: ['react', 'common'], // also extracts the manifest
      minChunks: Infinity
    }),
    new webpack.HashedModuleIdsPlugin(),
    new WebpackMd5Hash()
  ]
};
```
Now when packaged, each module's identifier is no longer a sequential id but a short four-character hash, which pins the module ids in place.
8. Use extract-text-webpack-plugin to extract CSS files
Create one.css under src:
```css
body {
  color: blue;
}
```
two.css:
```css
h1 {
  font-size: 24px;
}
```
Modify one.js and two.js to import the CSS:
```javascript
import jquery from 'jquery';
import lodash from 'lodash';
import react from 'react';
import './one.css';
console.log('Introducing CSS to modify one');
```
Modify webpack.config.js:
```javascript
var webpack = require("webpack");
var WebpackMd5Hash = require('webpack-md5-hash');
var path = require('path');
var ExtractTextPlugin = require("extract-text-webpack-plugin");
module.exports = {
  entry: {
    common: ['jquery', 'lodash'],
    react: ['react', 'react-redux'],
    two: "./src/two.js",
    one: "./src/one.js"
  },
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: "[name].[chunkhash:8].js"
  },
  module: {
    rules: [
      {
        test: /\.css$/,
        use: ExtractTextPlugin.extract({
          fallback: "style-loader",
          use: "css-loader"
        })
      }
    ]
  },
  plugins: [
    new webpack.optimize.CommonsChunkPlugin({
      name: ['react', 'common'], // also extracts the manifest
      minChunks: Infinity
    }),
    new ExtractTextPlugin("[name].[chunkhash:8].css"),
    new webpack.HashedModuleIdsPlugin(),
    new WebpackMd5Hash()
  ]
};
```
Run webpack:
Both JS and CSS are output successfully, but there is a catch: one.css and one.js carry the same hash. So what happens if only one.css is changed?
Modify one.css and build again:
The hash of the CSS file does not change.
Then modify one.js and build again:
This time the hashes of one.js and one.css change together.
9. Use contenthash to fix the hash of the CSS
- When using the ExtractTextWebpackPlugin, use [contenthash] to obtain a hash of the extracted file (neither [hash] nor [chunkhash] work).
That is from the webpack Output documentation: when CSS is extracted, its hash should come from contenthash.
```javascript
var webpack = require("webpack");
var WebpackMd5Hash = require('webpack-md5-hash');
var path = require('path');
var ExtractTextPlugin = require("extract-text-webpack-plugin");
module.exports = {
  entry: {
    common: ['jquery', 'lodash'],
    react: ['react', 'react-redux'],
    two: "./src/two.js",
    one: "./src/one.js"
  },
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: "[name].[chunkhash:8].js"
  },
  module: {
    rules: [
      {
        test: /\.css$/,
        use: ExtractTextPlugin.extract({
          fallback: "style-loader",
          use: "css-loader"
        })
      }
    ]
  },
  plugins: [
    new webpack.optimize.CommonsChunkPlugin({
      name: ['react', 'common'], // also extracts the manifest
      minChunks: Infinity
    }),
    new ExtractTextPlugin("[name].[contenthash:8].css"),
    new webpack.HashedModuleIdsPlugin(),
    new WebpackMd5Hash()
  ]
};
```
The only change is in the output name: contenthash is a hash of the extracted file's own content, i.e. of the style file alone.
Run webpack:
one.js and one.css now have different hashes.
Next, modify one.css:
```css
body {
  color: white;
}
```
Run webpack again:
This time only the one.css hash changes. That's pretty much the end of the preparation.
Fourth, optimizing multi-page build time and stabilizing the hash
1. Constrain the entries
Because this is a multi-page application, entries are found by scanning: the rule is that plain .js files are entry files, and resources referenced from .jsx components are not treated as entries.
Analysis with the BundleAnalyzerPlugin showed that some components were being packed as entries. After combing these out and repackaging, build time dropped by 2/3; of course, this was really just paying off an earlier pit.
Production build time: 74578ms.
The compressed (production) build also takes roughly 3x the uncompressed (development) build:
Development build time: 24780ms.
These are the two numbers we will optimize.
2. Use UglifyjsWebpackPlugin for multi-threaded compression
The first task should really be stabilizing the hash, but the production build is so slow that we optimize build speed first. Webpack's compression is single-threaded by default:
```javascript
const UglifyJSPlugin = require('uglifyjs-webpack-plugin');
module.exports = {
  plugins: [
    new UglifyJSPlugin({
      parallel: true
    })
  ]
};
```
This plugin works with webpack 3; earlier webpack versions need more careful handling, but the effect is obvious.
Production build time is now 51690ms, about 1/3 faster than before.
3. Use HappyPack to run loaders in multiple threads
```javascript
var HappyPack = require('happypack');
var os = require('os');
var happyThreadPool = HappyPack.ThreadPool({ size: os.cpus().length });
// ...
module: {
  rules: [
    {
      test: /\.js[x]?$/,
      exclude: /(node_modules|bower_components)/,
      loader: 'happypack/loader?id=happybabel',
      include: path.join(__dirname, 'static/assets/js')
    }
  ]
},
plugins: [
  new HappyPack({
    id: 'happybabel',
    loaders: ['babel-loader?cacheDirectory=true'],
    threadPool: happyThreadPool,
    cache: true,
    verbose: true
  })
]
```
The loader in the module rules above used to be babel-loader directly; now it points at a happypack/loader task whose id matches the HappyPack instance in plugins.
This turns on multi-threading for babel-loader.
Production build time is now 43855ms. And this is only babel-loader; we can do the same for the other loaders.
Next, handle the less, css, style and other loaders; these can be combined in one pass:
```javascript
module: {
  rules: [{
    test: require.resolve('zepto'),
    loader: 'exports-loader?window.Zepto!script-loader'
  }, {
    test: /\.js[x]?$/,
    exclude: /(node_modules|bower_components)/,
    loader: 'happypack/loader?id=happybabel',
    include: path.join(__dirname, 'static/assets/js')
  }, {
    test: /\.less$/,
    use: extractTextPlugin.extract({
      fallback: "style-loader",
      // use: ["css-loader" + (ENV ? '?minimize' : ''), "less-loader", "postcss-loader"],
      use: ["happypack/loader?id=postcss"]
    })
  }]
},
plugins: [
  new HappyPack({
    id: 'happybabel',
    loaders: ['babel-loader?cacheDirectory=true'],
    threadPool: happyThreadPool,
    // cache: true,
    verbose: true
  }),
  new HappyPack({
    id: 'postcss',
    loaders: ["css-loader" + (ENV ? '?minimize' : ''), "less-loader", 'postcss-loader'],
    threadPool: happyThreadPool,
    // cache: true,
    verbose: true
  })
]
```
We have now wrapped the babel, css, less and postcss loaders.
In the build log, each happy[id] task shows that multi-threading is on, and the effect is remarkable.
Production build time is now 35130ms, roughly twice as fast as before any optimization.
4. Use DLLs to split the build
After the steps above, it should be clear that purely static libraries and components need to be separated from the regular build, and that is what the DLL technique is for.
DLL packaging means taking content that changes rarely (or not at all) but is referenced many times, and packaging it once, separately.
Because the number of config files grows sharply with a DLL setup, the directory structure needed reorganizing.
As shown in the directory figure above, each webpack config is split into separate files. Taking webpack.dev.js:
```javascript
var base = require('./webpack.base.js');
var config = {
  entry: require('./dev/entry.js'),
  output: require('./dev/output.js'),
  plugins: require('./dev/plugins.js'),
  devtool: 'eval-source-map'
};
// Expose the merged configuration
module.exports = Object.assign(base, config);
```
With the base webpack split complete, create a webpack.dll.libs.js listing the libraries to package:
```javascript
module.exports = {
  libs: [
    'react',
    'react-dom',
    'react-motion',
    'react-redux',
    'redux',
    'axios',
    'prop-types',
    'classnames'
  ]
};
```
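For completeness, a minimal webpack.dll.config.js consuming that list might look like the following (a sketch only; the output paths and library name here are assumptions, not the project's actual values):

```javascript
var webpack = require('webpack');
var path = require('path');

module.exports = {
  entry: require('./webpack.dll.libs.js'), // { libs: [...] } from above
  output: {
    path: path.resolve(__dirname, 'static/dll'),
    filename: '[name].dll.js',
    library: '[name]_library' // must match the DllPlugin name below
  },
  plugins: [
    new webpack.DllPlugin({
      name: '[name]_library',
      path: path.join(__dirname, 'static/dll', '[name]-manifest.json')
    })
  ]
};
```

The key constraint is that `output.library` and `DllPlugin.name` agree, otherwise the generated manifest cannot be matched back to the dll bundle at runtime.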
Modify the DLL plugins:
```javascript
var webpack = require('webpack');
var dirVars = require('../common/dir.js');
var path = require('path');
var UglifyJsPlugin = require('uglifyjs-webpack-plugin');
var getDefaultPlugins = require('../common/plugins.js').getDefaultPlugins;
var AssetsPlugin = require('assets-webpack-plugin');

var plugins = [
  new webpack.DllPlugin({
    path: dirVars.dllLibsManifest
  }),
  new UglifyJsPlugin({
    parallel: true,
    cache: true
  }),
  new AssetsPlugin({
    filename: 'static/dll/libs-rev-manifest.json'
  })
];
module.exports = plugins.concat(getDefaultPlugins());
```
Now run webpack.
Packing all the libraries takes only about 1s. Next, modify webpack.prod.js, adding this to plugins:
```javascript
new webpack.DllReferencePlugin({
  manifest: 'static/dll/libs-rev-manifest.json'
}),
```
From now on, when webpack.prod.js runs and the build encounters content already packaged in libs, it is not packaged again.
5. Continue to constrain the hash
The build has been restructured quite invasively, so the hash behavior needs systematic verification.
Case 1: a JS change.
Modify one piece of business JS, add a comment, and repackage.
The file's hash changed, but unfortunately vendor changed too.
Solution: add the webpack-md5-hash plugin. After using it, verification shows the vendor JS hash no longer changes.
Case 2: a less change.
Only one CSS hash changed. No problem.
Case 3: modify a self-written public method used inside one entry.
I changed a tools helper used across that entry; only that entry's hash changed. No problem.
Case 4: modify a public component's JS (a component referenced by multiple entries).
Only the separately packaged components chunk's hash changed.
Case 5: modify a public component's less.
Only one hash changed.
Case 6: add a common component.
Only the components chunk's hash changed.
Build time before optimization: 180-200s.
Optimizations:
1. Constrain the entries, strictly filtering entry files. Production build: 74578ms; development build: 24780ms.
2. Enable multi-threaded compression. Production build: 51690ms.
3. Enable multi-threaded compilation. Production build: 35130ms; development build: 15031ms.
4. Split the build with DLLs: libraries ~4s, components ~4s, business code ~20s, ~30s overall.
As a result, the process is manageable, the packaging is customized, and the hashes are preserved.