Source files are modified constantly during development and inevitably have to be rebuilt and repackaged before the changes can be seen. This is where build speed becomes particularly important: optimizing it can greatly improve development efficiency and experience. This article describes some common ways to optimize Webpack build speed.
Optimize babel-loader
babel-loader transpiles ES6+ syntax into the ES5 syntax that browsers can understand, and this work is very slow. The optimization points are:
- Limit the scope of transpilation so that as few files as possible are transpiled: use an explicit include or exclude to tell Webpack which files babel-loader should process.
- Use the cache.
Enable cacheDirectory for babel-loader. If set, the specified directory is used to cache the loader's results, and subsequent Webpack builds will try to read from that cache to avoid the potentially expensive Babel recompilation that would otherwise run every time. If an empty value is given (loader: 'babel-loader?cacheDirectory') or true (loader: 'babel-loader?cacheDirectory=true'), the loader uses the default cache directory node_modules/.cache/babel-loader; if no node_modules directory is found in any root directory, it falls back to the operating system's default temporary file directory.
```js
module: {
  rules: [
    {
      test: /\.js$/,
      use: ['babel-loader?cacheDirectory'],
      include: srcPath,
      // exclude: /node_modules/
    },
  ],
},
```
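For reference, the same scoping and cache setting can also be written with babel-loader's options object rather than the query string. A minimal sketch, assuming srcPath and @babel/preset-env match the project's setup:

```js
// Minimal sketch of the options-object form (srcPath and the preset are assumptions).
module: {
  rules: [
    {
      test: /\.js$/,
      include: srcPath, // only transpile files under the source directory
      use: {
        loader: 'babel-loader',
        options: {
          presets: ['@babel/preset-env'],
          cacheDirectory: true, // cache to node_modules/.cache/babel-loader by default
        },
      },
    },
  ],
},
```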
Happypack multi-process packaging
When building a project with Webpack, a large number of files are parsed and processed, and the build gets slower as the number of files grows. Because Webpack runs on Node.js with a single-threaded model, the tasks it needs to handle are processed one by one. HappyPack splits file processing into multiple child processes that run concurrently; each child process handles its tasks and sends the results back to the main process, which can greatly increase build speed. Note that HappyPack only works at the loader level: it uses multiple processes to run loaders on files in parallel.
1. Introduce the happypack plugin
```js
const HappyPack = require('happypack')

module.exports = {
  // ...
  plugins: [
    // HappyPack enables multi-process packaging
    new HappyPack({
      // A unique id marks this HappyPack instance as the one that handles a specific class of files
      id: 'babel',
      // How to process .js files; usage is the same as the loader configuration
      loaders: ['babel-loader?cacheDirectory'],
    }),
  ],
}
```
2. Replace the original loader with happypack/loader
```js
module.exports = {
  // ...
  module: {
    rules: [
      {
        test: /\.js$/,
        // Hand processing of .js files over to the HappyPack instance with id 'babel'
        use: ['happypack/loader?id=babel'],
        include: srcPath,
        // exclude: /node_modules/
      },
    ],
  },
  // ...
};
```
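When several file types are handed off to HappyPack, the number of worker processes can be bounded with a shared thread pool (HappyPack also accepts a plain threads count). A minimal sketch, assuming a pool sized to the machine's CPU count:

```js
const HappyPack = require('happypack')
const os = require('os')

// One pool shared by all HappyPack instances keeps the total worker count bounded.
const happyThreadPool = HappyPack.ThreadPool({ size: os.cpus().length })

module.exports = {
  // ...
  plugins: [
    new HappyPack({
      id: 'babel',
      threadPool: happyThreadPool, // reuse the shared pool instead of per-instance workers
      loaders: ['babel-loader?cacheDirectory'],
    }),
  ],
}
```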
ParallelUglifyPlugin: multi-process JS compression
Similar to HappyPack multi-process packaging, Webpack's compression step can also run in multiple processes: webpack-parallel-uglify-plugin is designed to compress JS code in parallel.
```js
const ParallelUglifyPlugin = require('webpack-parallel-uglify-plugin')

module.exports = {
  // ...
  plugins: [
    // ParallelUglifyPlugin compresses the output JS code in parallel
    new ParallelUglifyPlugin({
      // The options passed to UglifyJS
      // (UglifyJS still does the actual compression; the plugin just runs it in multiple processes)
      uglifyJS: {
        output: {
          beautify: false, // most compact output
          comments: false, // remove all comments
        },
        compress: {
          drop_console: true,  // remove all `console` statements (also IE-compatible)
          collapse_vars: true, // inline variables that are defined but used only once
          reduce_vars: true,   // extract static values that occur multiple times into variables
        },
      },
    }),
  ],
}
```
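The plugin also takes options for caching results and controlling how many worker processes it spawns. The option names below (cacheDir, workerCount) are recalled from its README and should be treated as assumptions to verify:

```js
// Sketch only; cacheDir and workerCount are assumed option names, verify against the plugin docs.
new ParallelUglifyPlugin({
  cacheDir: '.cache/',  // cache compression results so unchanged files are not re-minified
  workerCount: 4,       // number of worker processes (commonly defaults to CPU count - 1)
  uglifyJS: {
    output: { beautify: false, comments: false },
    compress: { drop_console: true, collapse_vars: true, reduce_vars: true },
  },
})
```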
About starting multiple processes
In the sections above, both HappyPack and ParallelUglifyPlugin enable multiple processes. Multi-process packaging splits the packaging work into sub-tasks that run concurrently, but it does not always improve build speed; whether it helps depends on the project.
- If the project is large and packaging is slow on its own, enabling multi-process packaging can indeed improve build speed.
- Small projects already package quickly, and enabling multi-process packaging can actually slow the build down because of the overhead of spawning the extra processes.
Conclusion: don't enable multi-process packaging blindly; decide based on the characteristics of your own project.