Dev Server
Ideal development environment
- First, it should run over an HTTP service rather than being previewed as a local file. This is closer to the production environment, and our project may need APIs such as AJAX that run into problems under file access.
- Second, Webpack should automatically rebuild after we modify the code, and the browser should display the latest result instantly. This removes a lot of repetitive work from the development process and lets us focus and stay productive.
- Finally, it needs to provide Source Map support, so that errors that occur at runtime can be located quickly in the source code rather than in the packaged output, making debugging much easier.
The development cycle of “write source code → run Webpack → run the application → view it in the browser” is primitive.
Webpack compiles automatically
Webpack CLI provides the Watch working mode
In this mode, after the initial build, Webpack monitors the project's source files and automatically reruns the packaging task whenever they change. To use it, add the --watch CLI parameter when starting Webpack. Instead of exiting right after packaging, the CLI then waits for file changes and rebuilds, until we either terminate it manually or an unrecoverable exception occurs.
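If you prefer configuration over CLI flags, the same monitoring behavior can be switched on in the config file. A minimal sketch (the watch option is a standard Webpack option, equivalent to --watch):

```javascript
// ./webpack.config.js — a minimal sketch; equivalent to the --watch CLI flag
module.exports = {
  // ...
  watch: true
}
```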
Our development experience at this point was: modify code → Webpack automatically packaged → manually refresh the browser → preview the results.
P.S. The static file server I’m using here is an npm module called serve.
Refreshing the browser automatically after Webpack finishes packaging
If you’re already familiar with a tool called BrowserSync, you know that BrowserSync allows you to automatically refresh your browser after a file changes.
So, we can use BrowserSync to replace the serve tool and start the HTTP service. Here, we also need to monitor the changes of files in the dist directory. The specific command is as follows:
You can install browser-sync globally through NPM and then use this module
npm install browser-sync --global
browser-sync dist --watch
Or you can use NPX directly to use the remote module
npx browser-sync dist --watch
The principle is: Webpack watches the source code and automatically packages it into dist, while BrowserSync watches the files in dist and refreshes the browser when they change. The whole process relies on two different tools, each doing its own monitoring.
This watch mode + BrowserSync does what we need, but it has a number of drawbacks:
- It is cumbersome: we have to run two tools at the same time, and there is more to understand, so the learning cost rises considerably;
- It is inefficient: during the process Webpack writes files to disk and BrowserSync reads them back, so a large number of disk reads and writes are involved, which inevitably hurts efficiency.
Webpack Dev Server
webpack-dev-server is an official Webpack development tool. It provides a development server and integrates a series of development-friendly features such as automatic compilation and automatic browser refresh.
webpack-dev-server is also a standalone npm module, so we install it as a development dependency of the project via npm. After installation, the module provides a CLI program named webpack-dev-server, which we can run directly through npx or define in npm scripts.
# Install webpack-dev-server as a development dependency
npm install webpack-dev-server --save-dev

# Run webpack-dev-server
npx webpack-dev-server
When we run webpack-dev-server, it starts an HTTP server internally that serves the packaged results as static files. It automatically packages our application with Webpack, listens for changes to the source code, and immediately repackages whenever files change. The general process is as follows:
To speed things up, webpack-dev-server does not write the packing results to disk, but stores them temporarily in memory, from which the internal HTTP server reads the files. This reduces unnecessary disk reads and writes and greatly improves the overall build efficiency.
We can also pass an --open parameter to the webpack-dev-server command so that the browser automatically opens our application. If you have two screens, you can put the browser on the second screen and experience real-time preview as you code.
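The same behavior can also be set in the configuration file rather than on the command line; a sketch (the open option is part of the standard devServer options):

```javascript
// ./webpack.config.js — a sketch; equivalent to passing --open on the CLI
module.exports = {
  // ...
  devServer: {
    open: true // open the browser automatically after the server starts
  }
}
```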
Webpack Dev Server configuration
The Webpack configuration object can have a property called devServer that provides configuration for webpack-dev-server as follows:
// ./webpack.config.js
const path = require('path')

module.exports = {
  // ...
  devServer: {
    contentBase: path.join(__dirname, 'dist'),
    compress: true,
    port: 9000
    // ...
    // Full options: https://webpack.js.org/configuration/dev-server/
  }
}
Webpack Dev Server static resource access
By default, webpack-dev-server treats the build output as the development server's resource files, meaning any file emitted by the Webpack build can be accessed directly. However, if some static files that are not part of the build output also need to be served, you have to “tell” webpack-dev-server about them through additional configuration.
You can do this in the webpack-dev-server configuration object. Back in the configuration file, find the devServer property, which is an object. Its contentBase property specifies additional static resource paths and accepts a string or an array, so you can configure one or more paths. The configuration is as follows:
// ./webpack.config.js
module.exports = {
  // ...
  devServer: {
    contentBase: 'public'
  }
}
The proxy
webpack-dev-server is a local development server, so communicating with back-end services runs into cross-origin restrictions.
webpack-dev-server supports adding a proxy service through its configuration.
The best solution to cross-origin requests during development is to configure a back-end API proxy in the development server, that is, to proxy the back-end interface service to the local development server address.
The pathRewrite property rewrites the proxied path; here the rule replaces the /api prefix with an empty string. pathRewrite keys are ultimately applied to the request path as regular expressions.
If the changeOrigin attribute of a proxy rule is set to true, the host name of the actual proxy target is used as the Host header of the request.
// ./webpack.config.js
module.exports = {
  // ...
  devServer: {
    proxy: {
      '/api': {
        target: 'https://api.github.com',
        pathRewrite: {
          '^/api': '' // Strip the /api prefix from the proxied path
        },
        // Use api.github.com as the Host header of the proxied request
        changeOrigin: true
      }
    }
  }
}
At this point, requesting http://localhost:8080/api/users is equivalent to requesting https://api.github.com/users.
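To make the rewriting rule concrete, here is a small sketch of the substitution the proxy performs on the path before forwarding. The rewritePath helper is hypothetical, written purely for illustration; it is not part of webpack-dev-server's API:

```javascript
// A hypothetical helper illustrating how pathRewrite rules are applied:
// each key is treated as a regular expression and replaced in the path.
function rewritePath(path, rules) {
  let result = path
  for (const [pattern, replacement] of Object.entries(rules)) {
    result = result.replace(new RegExp(pattern), replacement)
  }
  return result
}

console.log(rewritePath('/api/users', { '^/api': '' })) // → '/users'
```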
Source Map
Through operations like building and compiling, the source code we write during development is converted into code that runs in production, which means there is a big difference between the code that actually runs and the code we actually wrote.
In this case, debugging the application or tracking down an unexpected runtime error becomes difficult: both debugging and error reports are based on the built code, so we can only see where an error occurs in the built output, and it is hard to map that location back to the source code.
Introducing Source Maps
The best solution to this problem is the Source Map which, as its name suggests, maps the relationship between the transformed code and the source code. With the Source Map file generated during the transformation, the built code can be reverse-parsed back to the corresponding source code.
Many third-party libraries ship a Source Map file with the .map suffix, jQuery for example. We can open its Source Map file and take a look, as shown below: it is a JSON file that records the mapping between the converted and pre-conversion code, with the following main attributes:
- version specifies the Source Map standard version in use;
- sources records the names of the source files before conversion; since multiple files may be packed into one, this is an array;
- names lists the member names used in the source code. As we all know, minified code replaces the meaningful variable names we wrote during development with short characters;
- mappings, the most critical attribute, is a Base64 VLQ-encoded string that records the mapping between the characters in the converted code and the characters in the source code, as shown in the following figure:
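To make these attributes concrete, here is a hand-written sketch of what such a .map file contains. All the values are illustrative, not taken from any real build, and the mappings string is truncated:

```javascript
// A hand-written, illustrative Source Map object (not from a real build)
const sourceMap = {
  version: 3,                 // Source Map standard version
  sources: ['src/index.js'],  // source file names (an array: many files may pack into one)
  names: ['createEditor'],    // member names used in the source code
  mappings: 'AAAA,SAASA'      // Base64 VLQ segments (truncated, purely illustrative)
}

console.log(sourceMap.sources.length) // → 1
```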
The converted code typically references its Source Map file through a trailing line comment, such as //# sourceMappingURL=jquery-3.4.1.min.map. The specific effect is as follows:
So if you open developer tools in Chrome, it will automatically request this file and then reverse parse the source code based on the contents of this file for easy debugging. And because of the mapping, errors in the code can be automatically located in the source code.
Open the Developer tools and go to the Source panel, where we can see the jQuery Source code before the transformation, as shown below:
Configure the Source Map in Webpack
Back in the configuration file, the property we use here is devtool. It configures development aids, which here means the Source Map functionality. We can set it to source-map as follows:
// ./webpack.config.js
module.exports = {
  devtool: 'source-map' // Enable Source Map generation
}
Then run the Webpack build from the terminal. After it completes, the dist directory contains a Source Map file for bundle.js, and bundle.js references it through a trailing comment, as shown in the following figure:
The devtool configuration in Webpack supports many options besides source-map. For details, see the mode comparison table in the documentation.
The Eval mode
eval is a JavaScript function that runs JavaScript code given as a string. For example, the string console.log("foo~") can be executed as a piece of JavaScript code:
const code = 'console.log("foo~")'
eval(code) // Execute the string in code as JS code
By default, this code runs in a temporary virtual machine environment, which we can see in the console:
If we append a comment of the form //# sourceURL=./path/to/file.js to the string executed by the eval function, devtools will attribute the evaluated code to that file path. In other words, eval lets us specify, via sourceURL, which file the code belongs to.
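A tiny sketch of this mechanism: the evaluated string still runs normally, and the trailing comment only tells devtools which "file" the code appears under. The path used here is made up for illustration:

```javascript
// The sourceURL comment does not change the result of evaluation;
// it only names the "file" the code is listed under in devtools.
const result = eval('1 + 1 //# sourceURL=./foo/bar.js')
console.log(result) // → 2
```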
In the Webpack configuration file, set the devtool property to eval as follows:
// ./webpack.config.js
module.exports = {
  devtool: 'eval'
}
In the bundle.js file, you’ll find that each module’s code is wrapped in an eval function, and each module’s code ends with a sourceURL declaration of the module’s source file path, as follows:
If we go back to the browser and run bundle.js, the console can locate which module an error occurred in. But when you open the file by clicking its name in the console, you see the packaged module code rather than our actual source code, as follows:
To sum up, in eval mode Webpack executes each module's converted code inside an eval function and declares the corresponding file path via sourceURL, so the browser knows which source file a given line of code came from.
Because no Source Map file is generated in eval mode, it is the fastest to build, but the drawback is equally obvious: it can only locate the source file path, not specific row and column information.
eval-source-map
The eval-source-map mode can locate specific row and column information in addition to the file. Compared with eval mode, it generates Source Map files that can be mapped back to the source code.
cheap-eval-source-map
This is a stripped-down version of eval-source-map: it also generates Source Map files, but they can only locate rows, not columns. The result is slightly less precise, but the build is much faster.
cheap-module-eval-source-map
The source code located by the cheap-module-eval-source-map mode is exactly the same as the source code we wrote, while the source code located by cheap-eval-source-map is the result after the ES6 transformation. In general, modes with module in the name map back to the source before Loader processing, and modes without module map to the code after Loader processing. This means that if we want to restore the exact source code we wrote, we need the cheap-module-eval-source-map mode.
cheap-source-map
The absence of eval in this mode's name means it does not execute code with eval; the absence of module means the Source Map maps to the code after Loader processing; and cheap means only the line number of the source can be located.
inline-source-map
It has the same effect as the regular source-map mode, except that the Source Map is not a separate physical file: it is inlined into the code as a data URL. The eval-source-map mode we saw earlier also inlines its Source Map this way.
hidden-source-map
In this mode we cannot see the Source Map take effect in the development tools, but a Source Map file is indeed generated, just as with jQuery: the file exists, but the code does not reference it. Developers can choose to use it manually when needed.
nosources-source-map
In this mode, we can see where an error occurred (including the column position), but clicking through does not reveal the source code. This protects the source from being exposed in the production environment.
Choosing a mode in practice
The development environment
First, during development (the development environment), I choose cheap-module-eval-source-map, for three reasons:
- Take React and Vue.js as examples: JSX and Vue single-file components differ greatly from their Loader-transformed output, and I want to debug the source before the Loader transformation.
- I typically write no more than 80 characters per line, so being able to locate the row is enough for me, and omitting column information also speeds up the build.
- Although the initial build in this mode is slow, most of the time webpack-dev-server is rebuilding in watch mode, and those rebuilds are very fast.
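Putting that choice into the configuration, a sketch for the development build:

```javascript
// ./webpack.config.js — a sketch for the development build
module.exports = {
  mode: 'development',
  devtool: 'cheap-module-eval-source-map'
}
```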
The production environment
For pre-release packaging, that is, the production build, I generate no Source Map at all. The reasons are simple:
- First, a Source Map exposes my source code to the production environment. Without strict access control over the .map files, anyone with a bit of skill can easily recover the vast majority of the project's source code; this is insecure, and I think many people overlook it.
- Second, debugging belongs in development: you should find as many problems and pitfalls as possible while developing, not in production. If you really lack confidence in your code, I recommend the nosources-source-map mode, so errors can still be located in the source without exposing its contents.
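A corresponding sketch for the production build (setting devtool to false disables Source Map generation entirely; swap in 'nosources-source-map' under the fallback described above):

```javascript
// ./webpack.config.js — a sketch for the production build
module.exports = {
  mode: 'production',
  devtool: false // or 'nosources-source-map' if error locations are still needed
}
```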
HMR mechanism
Auto-refresh issues
Every time we change the code, Webpack detects the change, automatically repackages, and notifies the browser to refresh. But once the page refreshes as a whole, any operation state on the page is lost.
Module hot replacement (HMR)
HMR stands for Hot Module Replacement, which is translated as “Module Hot Replacement” or “Module Hot update”.
Module hot replacement in Webpack means that we can replace a module in the application in real time while the application is running without changing the running state of the application.
Enabling HMR
The easiest way to use this feature is to enable it with the --hot argument when running the webpack-dev-server command.
Alternatively, you can enable it in the configuration file, where two things need to be configured:
- First, set the hot property of the devServer object to true;
- Then, load the HotModuleReplacementPlugin plug-in. It is built into Webpack, so we first import the webpack module to access it.
// ./webpack.config.js
const webpack = require('webpack')
module.exports = {
// ...
devServer: {
// Enable HMR. If the resource does not support HMR, it will fallback to live reloading
hot: true
// Only use HMR, no fallback to live reloading
// hotOnly: true
},
plugins: [
// ...
// A plug-in for the HMR feature
new webpack.HotModuleReplacementPlugin()
]
}
HMR questions
HMR in Webpack requires us to specify manually, in code, how an updated module should be swapped into the page once it updates.
Q1: You may ask why, with HMR enabled, changes to style files are hot-updated even though we never manually handle style module updates.
A1: That is because style files go through Loaders, and style-loader already handles the hot update of styles automatically, so no extra manual handling is needed.
Q2: Then why can styles be handled automatically while our scripts must be handled manually?
A2: The reason is simple: when a style module updates, the new CSS just needs to be swapped into the page, where it overwrites the previous style; that alone completes the update.
The JavaScript modules we write, however, are arbitrary: a module may export an object, a string, or a function, and each is used differently. Faced with such irregular JS modules, Webpack cannot know how an updated module should be applied, so it cannot offer a universal replacement solution.
That is why style files can be hot-updated out of the box, while JS changes fall back to an automatic full refresh.
Q3: Those who normally use scaffolding tools such as Vue CLI or create-react-app may say, “My project doesn't do this by hand; JavaScript code still hot-replaces, and it isn't as troublesome as you describe.”
A3: That is because you are using a framework. Under a framework, every file in the project is regular: with React, for example, every module must export a function or a class, which makes a generic replacement strategy possible. These scaffolding tools already implement that generic replacement for you, so naturally no manual handling is needed.
HMR APIs
The question remains: when a JavaScript module is updated, how do we swap the updated module into the page?
HotModuleReplacementPlugin provides a set of JavaScript APIs for handling HMR. We need to use these APIs in our own code to swap updated modules into the running page.
// ./src/main.js
import createEditor from './editor'
import logo from './icon.png'
import './global.css'
const img = new Image()
img.src = logo
document.body.appendChild(img)
const editor = createEditor()
document.body.appendChild(editor)
This is the Webpack entry file, which loads several other modules. Because main.js uses these modules, once any of them is updated we have to re-apply it in main.js.
We therefore add some extra code to this file to handle hot replacement of the modules it depends on.
In environments with HMR enabled, we can access the hot member of the global module object, which is the core object of the HMR API. It provides an accept method for registering a handler to run when a module updates: the first argument is the path of the dependency module to watch, and the second is the handler for the updated module.
Here we pass the path of the editor module as the first argument, and as the second a function that simply prints a message.
// ./main.js
// ... the original business code

module.hot.accept('./editor', () => {
  // This function runs automatically when ./editor.js is updated
  console.log('editor module updated ~~')
})
If we modify the editor module and save, the message added above is printed in the browser console, and the browser does not refresh.
Once a module's updates are handled manually, no automatic refresh is triggered; conversely, if they are not handled manually, hot replacement falls back to an automatic refresh.
JS module hot replacement
The editor module exports a createEditor function. We print it to the console in the normal flow, and then print it again in the module's update handler, as follows:
// ./main.js
import createEditor from './editor'

// ... the original business code
console.log(createEditor)

module.hot.accept('./editor', () => {
  console.log(createEditor)
})
If you modify the editor module again and save, you will find that once the module updates, the createEditor function obtained here is already the latest version, as shown in the following figure: it updates automatically when the module file updates. Since createEditor is used to create an element, that element needs to be re-created whenever the module updates. So we remove the original element, then call createEditor to create a new element and append it to the page, as follows:
// ./main.js
import createEditor from './editor'

const editor = createEditor()
document.body.appendChild(editor)

// ... the original business code

// HMR -----------------------------------
module.hot.accept('./editor', () => {
  document.body.removeChild(editor) // Remove the previously created element
  const newEditor = createEditor() // Create a new element with the new module
  document.body.appendChild(newEditor)
})
But this alone only works once: the second time the handler runs, the element that editor points to was already removed by the previous run. We should therefore record the element created by each hot replacement so the next replacement can find it, as follows:
// ./main.js
import createEditor from './editor'

const editor = createEditor()
document.body.appendChild(editor)

// ... the original business code

// HMR -----------------------------------
let lastEditor = editor
module.hot.accept('./editor', () => {
  document.body.removeChild(lastEditor) // Remove the previously created element
  lastEditor = createEditor() // Create a new element with the new module
  document.body.appendChild(lastEditor)
})
Hot replacement now works correctly, and repeatedly.
Preserving state across hot replacement
A hot replacement operation must preserve the page's state through the swap.
Here we grab the editor's contents before the replacement and put them back afterwards. Because an editable element is used rather than a textbox, we read the edited content via innerHTML and set it on the newly created element. Here's how:
// ./main.js
import createEditor from './editor'

const editor = createEditor()
document.body.appendChild(editor)

// ... the original business code

// HMR --------------------------------
let lastEditor = editor
module.hot.accept('./editor', () => {
  // This function runs automatically when editor.js is updated
  // Temporarily record the editor contents before the update
  const value = lastEditor.innerHTML
  // Remove the pre-update element
  document.body.removeChild(lastEditor)
  // Create a new editor
  // createEditor is now the updated function
  lastEditor = createEditor()
  // Restore the editor contents
  lastEditor.innerHTML = value
  // Append to the page
  document.body.appendChild(lastEditor)
})
This also shows why Webpack leaves hot updates of JS modules to us: different modules call for different handling. Here we have a text editor, so we need to preserve its state; a module of another type might not need that. This is why Webpack cannot provide a universal JS module replacement.
Image hot module replacement
We likewise register a hot replacement handler for the image module with module.hot.accept. In the handler we simply set the updated image path as the image element's src. Because the image's file name changes after modification and we obtain the updated path here, resetting src is enough to hot-replace the image. The code is as follows:
// ./src/main.js
import logo from './icon.png'

// ... other code

module.hot.accept('./icon.png', () => {
  // Runs when icon.png is updated
  // Resetting src triggers the image element to reload, updating the image in place
  img.src = logo
})
Q&A
1. There is an error in the hot replacement code
If the handler function that performs hot replacement itself throws an error, the result is also an automatic refresh. For example, let's deliberately introduce a runtime error in the handler:
// ./src/main.js
// ... other code

module.hot.accept('./editor', () => {
  // Deliberately cause an exception
  undefined.foo()
})
This happens because an error during the HMR process makes HMR fail, and after HMR fails Webpack automatically falls back to a full refresh. The refresh clears the error message from the console, so a subtle mistake becomes hard to find.
In this case we can use hotOnly: while hot mode falls back to a full refresh when hot replacement fails, hotOnly mode never does.
Back in the configuration file, we change hot: true in devServer to hotOnly: true, as follows:
// ./webpack.config.js
const webpack = require('webpack')
module.exports = {
// ...
devServer: {
// Only use HMR, no fallback to live reloading
hotOnly: true
},
plugins: [
// ...
// A plug-in for the HMR feature
new webpack.HotModuleReplacementPlugin()
]
}
2. The HMR API is used, but HMR is not enabled
For code that uses the HMR API, if we run Webpack without HMR enabled, the runtime throws the error Cannot read property 'accept' of undefined:
The reason is that module.hot is provided by the HMR plug-in; without the plug-in enabled, the object simply does not exist.
The solution is as simple as checking API compatibility in business code. We first check whether the object exists and then use it. The code is as follows:
// HMR -----------------------------------
if (module.hot) { // Make sure the HMR API object exists
  module.hot.accept('./editor', () => {
    // ...
  })
}
Going back to the configuration file, we disable the hot replacement feature and remove the HotModuleReplacementPlugin plug-in, then run a normal Webpack build from the terminal. In the generated bundle.js, find the module corresponding to main.js; the result is shown as follows: you'll notice that all the hot-replacement handling code we wrote has been removed, leaving only an empty if (false) branch. Minification strips that branch entirely, so it has no impact at all on the production environment.
HMR in frameworks
As for the framework HMR, since it is used out of the box in most cases, I will not introduce it here. For details, please refer to:
React HMR Vue HMR
Tree-shaking
Introducing Tree-shaking
Tree Shaking literally means “shaking a tree”: as the tree shakes, its dead branches and leaves fall off.
Tree-shaking in a build works the same way, except what gets shaken off is the unused code, formally known as dead code.
Tree-shaking was originally a feature in Rollup and has been supported in Webpack since 2.0.
When packaging in Webpack production mode, this optimization is enabled automatically: unreferenced code is detected and removed. Consider the following module:
// ./src/components.js
export const Button = () => {
  return document.createElement('button')
  console.log('dead-code')
}

export const Link = () => {
  return document.createElement('a')
}

export const Heading = level => {
  return document.createElement('h' + level)
}
The Button component has a console.log() statement after the return, which clearly never executes, so that console.log() is dead code.
// ./src/main.js
import { Button } from './components'
document.body.appendChild(Button())
Here, when importing the components module, we only extract its Button member, so large parts of the components module go unused and become redundant. The redundant parts are marked below:
// ./src/components.js
export const Button = () => {
  return document.createElement('button')
  // Unreachable code
  console.log('dead-code')
}

// Unreferenced code
export const Link = () => {
  return document.createElement('a')
}

// Unreferenced code
export const Heading = level => {
  return document.createElement('h' + level)
}
Removing redundant code is an important part of production optimization, and Webpack’s tree-shaking feature is a great example of this.
We open the command line terminal, here we try to run packaging in production mode, the specific command is as follows:
npx webpack --mode=production
The tree-shaking feature of Webpack is automatically enabled in production mode. Once the build completes, we open the bundle.js output, as follows: search for the redundant code from the components module and you'll find it is nowhere to be found. That is the result of tree-shaking.
Imagine if we introduced Lodash as a tool library in a project. Most of the time, we would only use some of the tool functions, and the rest would be redundant code. Tree-shaking can greatly reduce the size of the final packaged bundles.
It is important to note that tree-shaking is not a configuration option in Webpack, but rather a combination of functions that are automatically enabled in production mode, so packaging in production mode gives you the tree-shaking effect.
Enabling tree-shaking manually
Using the same example, we run the Webpack build again, but this time with mode set to none instead of production, that is, without enabling any built-in optimizations or plug-ins:
npx webpack --mode=none
Copy the code
Once the build completes, we find the bundle.js output again, as shown below: notice that nothing outside the components module uses the Link and Heading functions, yet the module still exports them. These exports are pointless.
Knowing the current state of the build output, we open the Webpack configuration file and add an optimization property to the configuration object. This property centralizes Webpack's built-in optimization features, and its value is also an object.
In this optimization object we can enable the usedExports option, so that modules only export the members that are actually used externally.
```js
// ./webpack.config.js
module.exports = {
  // ... other configuration items
  optimization: {
    // Export only the members that are actually used
    usedExports: true
  }
}
```
After this configuration, repackage and look at the bundle.js output again. At this point you'll notice that the components module no longer exports the Link and Heading functions, and their code becomes unreferenced. If you're using VS Code, you'll also notice that it dims those function names to indicate that they are never referenced.
For this type of unreferenced code, if we turn on the compression function, we can automatically compress the unused code.
We can go back to the configuration file and try to enable minimize in the optimization configuration, which is as follows:
```js
// ./webpack.config.js
module.exports = {
  // ... other configuration items
  optimization: {
    // Export only the members that are actually used
    usedExports: true,
    // Compress the output
    minimize: true
  }
}
```
Then go back to the command line and re-run the package, as shown below:
If you look closely at the packaging results, you’ll see that unreferenced code like Link and Heading has been removed automatically.
This is how tree-shaking is implemented, using two Webpack optimizations:
- usedExports: export only the members that are used externally in the packaged result.
- minimize: compress the packaged result.
If you think of our code as a big tree, you can think of it this way:
- usedExports marks the dead branches and leaves on the tree.
- minimize shakes those dead branches and leaves off.
Merge modules (extensions)
In addition to the usedExports option, we can use a concatenateModules option to continue optimizing the output.
Normally, packaging ends up wrapping each module in its own function, and if we have a lot of modules, that means a lot of module functions in the output.
The role of the concatenateModules configuration is to combine all modules as much as possible into a single function, which increases efficiency and reduces the size of the code.
Back in the configuration file, we turn on concatenateModules in the optimization property. To see the effect more clearly, we also turn off minimize for now. The specific configuration is as follows:
```js
// ./webpack.config.js
module.exports = {
  // ... other configuration items
  optimization: {
    // Export only the members that are actually used
    usedExports: true,
    // Merge every module into a single function where possible
    concatenateModules: true,
    // Do not compress the output, so the effect is easier to see
    minimize: false
  }
}
```
Then go back to the command line terminal and run the packaging again. In bundle.js, each module is no longer a separate function; all modules are merged into a single function. This feature, also known as Scope Hoisting, was added in Webpack 3.0.
If you combine it with the minimize option, the size of the package will be much smaller.
Tree-shaking and babel-loader
Because early Webpack moved so fast and changed so much, the material you find online doesn't necessarily apply to the version you are using. This is especially true for tree-shaking: many sources claim that "configuring babel-loader for JS modules causes tree-shaking to fail".
The premise of a tree-shaking implementation is ES Modules, that is, the code that is ultimately packaged by Webpack must be modularized using ES Modules.
We all know that Webpack, before packaging all module code, first sends modules to different Loaders for processing according to configuration, and finally packages the results of Loader processing together.
Many times we choose babel-loader to convert new ECMAScript features in our source code for better compatibility. When Babel converts the JS code, it is likely that the ES Modules parts are converted to CommonJS, as shown in the following figure:
Whether or not Babel handles ES Modules code depends on whether or not we have configured a plug-in for it that converts ES Modules.
Most of the time we configure Babel with a preset (a collection of plug-ins) rather than individual plug-ins. For example, the most widely used preset, @babel/preset-env, includes a plug-in that converts ES Modules. So if that plug-in runs, the ES Modules parts of the code are converted to CommonJS, Webpack then packages code organized as CommonJS, and tree-shaking does not work.
In order to easily identify the results, we only enable usedExports. The complete configuration is as follows:
```js
// ./webpack.config.js
module.exports = {
  mode: 'none',
  entry: './src/main.js',
  output: {
    filename: 'bundle.js'
  },
  module: {
    rules: [
      {
        test: /\.js$/,
        use: {
          loader: 'babel-loader',
          options: {
            presets: [['@babel/preset-env']]
          }
        }
      }
    ]
  },
  optimization: {
    usedExports: true
  }
}
```
After the configuration is complete, we open the command line terminal, run the Webpack command, and inspect bundle.js. If you look closely, you will see that the result is not what the claim above suggests: usedExports still works, and if we compress the code, the unreferenced code is still removed. Tree-shaking is not broken at all.
So what’s going on? Why does a lot of literature say that babel-loader causes tree-shaking to fail, but when we actually try it, it doesn’t?
In recent versions of babel-loader (8.x), the plug-in for ES Modules conversion is effectively turned off automatically. If you look at the source code of the corresponding babel-loader version, you can see that its injectCaller function already signals that the current environment supports ES Modules.
Then find the source of the @babel/preset-env module; part of the core code follows:
In this module, the ES Modules conversion plug-in is automatically disabled according to that environment flag. So by default, the code processed by babel-loader is still in ES Modules form, the final Webpack package is ES Modules, and tree-shaking works.
To verify this, you can force the ES Modules conversion plug-in on in the babel-loader configuration, as follows:
```js
// ./webpack.config.js
module.exports = {
  mode: 'none',
  entry: './src/main.js',
  output: {
    filename: 'bundle.js'
  },
  module: {
    rules: [
      {
        test: /\.js$/,
        use: {
          loader: 'babel-loader',
          options: {
            presets: [['@babel/preset-env', { modules: 'commonjs' }]]
          }
        }
      }
    ]
  },
  optimization: {
    usedExports: true
  }
}
```
Note the special way configuration is added to a Babel preset; many people get this wrong. The member of the presets array must itself be an array: its first element is the name of the preset, and its second element is the configuration object for that preset.
Here we set the modules property to 'commonjs' in that object. By default this property is 'auto'. Setting it to 'commonjs' forces Babel's ES Modules plug-in to convert the ES Modules in our code to CommonJS.
Once done, we open the command line terminal again, run the Webpack packaging, and inspect bundle.js. This time you will find that usedExports no longer takes effect, and tree-shaking fails even if we turn on compression.
To sum up, this experiment shows that recent versions of babel-loader do not break tree-shaking. If you are not sure whether the babel-loader you are using causes this problem, the easiest fix is to explicitly set @babel/preset-env's modules property to false in the configuration, ensuring that ES Modules are not converted and the premise of tree-shaking holds.
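For reference, the relevant fragment of such a configuration might look like the sketch below; only the presets line differs from the earlier examples:

```js
// babel-loader options fragment: keep ES Modules untouched
// so that Webpack can perform tree-shaking on them
options: {
  presets: [
    ['@babel/preset-env', { modules: false }]
  ]
}
```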
SideEffects
Introduction of SideEffects
Webpack 4 also added a new sideEffects feature, which lets us declare whether our code has side effects, giving the optimizer more room to remove code.
TIPS: Module side effects refer to whether the module does anything other than export members when it executes.
This feature is usually only used when developing an NPM module. Note that sideEffects is not tree-shaking itself, although many people assume the two are causally related because they often appear together.
Let’s take a look at the sideEffects feature itself, and you’ll understand why it’s not tree-shaking.
```js
// ./src/components/index.js
export { default as Button } from './button'
export { default as Link } from './link'
export { default as Heading } from './heading'
```
In addition, in each component, we added a console operation (side effect code) as follows:
```js
// ./src/components/button.js
console.log('Button component~') // side effect code

export default () => {
  return document.createElement('button')
}
```

We then load the Button member of components in the package entry file (main.js), as follows:

```js
// ./src/main.js
import { Button } from './components'

document.body.appendChild(Button())
```
The problem with this is that although we only want to load the Button module here, we actually load components/index.js, which in index.js loads all the component modules in this directory, which causes all the component modules to be loaded and executed.
We open the command line terminal and run the packaging. Looking at the output, all of the component modules are included. If we then turn on tree-shaking (setting usedExports only), the export members that are never used can eventually be removed. However, because the modules those members belong to contain side-effect code, the modules themselves are not removed completely by tree-shaking.
You might think that this code should be saved, but the fact is that the side effects in these modules are usually for the module itself, such as the console.log I added here to indicate that the module is currently loaded. But in the end the entire module is useless, so there is no need to leave this side effect code behind.
Tree-shaking can only remove unused code members, but to remove unused modules completely, you need to enable sideEffects.
The role of sideEffects
Open the Webpack configuration file and enable sideEffects in Optimization as follows:
```js
// ./webpack.config.js
module.exports = {
  mode: 'none',
  entry: './src/main.js',
  output: {
    filename: 'bundle.js'
  },
  optimization: {
    sideEffects: true
  }
}
```
TIPS: Note that this feature is also automatically enabled in Production mode.
Before packaging a module, Webpack checks the sideEffects identifier in package.json to see if the module has sideEffects. If there are no sideEffects, unused modules will not be packaged. In other words, even if these unused modules have sideEffects code, we can enforce that there are no sideEffects through sideEffects in package.json.
Then open package.json, add a sideEffects field, and set it to false, as follows:
```json
{
  "name": "09-side-effects",
  "version": "0.1.0",
  "author": "zce <[email protected]> (https://zce.me)",
  "license": "MIT",
  "scripts": {
    "build": "webpack"
  },
  "devDependencies": {
    "webpack": "^4.43.0",
    "webpack-cli": "^3.3.11"
  },
  "sideEffects": false
}
```
This declares that none of the code in our project has side effects, giving Webpack a free hand. Once done, we run the packaging again and inspect the bundle.js output: the unused modules are not packaged at all. That is what sideEffects does.
Note that sideEffects is set in two places:
- sideEffects in webpack.config.js enables the feature;
- sideEffects in package.json declares that our code has no side effects.
SideEffects caveats
Side effects that reach the global scope must be kept, while side effects confined to the module itself can safely be removed along with the module.
Make sure your code really has no side effects, or that its side effects are not global, before making this declaration; otherwise meaningful side-effect code will be removed by mistake during packaging.
For example, the extend.js module I prepared here:
```js
// ./src/extend.js
// Add an extension method to Number's prototype
Number.prototype.pad = function (size) {
  const leadingZeros = Array(size + 1).join('0')
  return leadingZeros + this
}
```
There are no members exported in this module, just a pad method mounted on the prototype of Number to add leading zeros to the Number, a common prototype-based extension method from a long time ago.
Return to main.js and import the extend module as follows:
```js
// ./src/main.js
import './extend' // contains a global side effect

console.log((8).pad(3)) // => '0008'
```
Since this module does not export any members, there is nothing to destructure on import; simply importing the file gives Number its extension method.
Extending the Number type is a global side effect of the extend module. If we still use package.json to declare that our code has no side effects, we run into a problem when packaging again: the extension to Number is not included in the output at all, and without it our code fails at runtime. This global side effect must be preserved.
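For comparison, the same formatting can be achieved without touching the prototype at all. The helper below is a hypothetical side-effect-free alternative (not part of the case project); because it is just a plain function that touches no global state, a bundler could safely drop it whenever it goes unused:

```javascript
// A hypothetical side-effect-free alternative to the prototype extension.
// It prepends `size` zeros to the number, matching the pad method above.
const pad = (num, size) => '0'.repeat(size) + num

console.log(pad(8, 3)) // => '0008'
```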
This kind of prototype-based extension appears in many polyfill libraries; es6-promise is a typical side-effect module. CSS files that we import directly in JS are side-effect modules too, and face the same problem. So not all side effects should be removed; the necessary ones must be preserved.
The best way to handle this is to list, in the sideEffects field of package.json, the paths of the modules whose side effects should be kept (wildcards are allowed), as follows:
```json
{
  "name": "09-side-effects",
  "version": "0.1.0",
  "author": "zce <[email protected]> (https://zce.me)",
  "license": "MIT",
  "scripts": {
    "build": "webpack"
  },
  "devDependencies": {
    "webpack": "^4.43.0",
    "webpack-cli": "^3.3.11"
  },
  "sideEffects": [
    "./src/extend.js",
    "*.css"
  ]
}
```
In short, both tree-shaking and sideEffects are intended to complement JavaScript’s early modular system design. With the development of technologies like Webpack, the modularity of JavaScript is definitely getting better and better.
Code Splitting
Disadvantages of All in One
While modularizing a front-end project as a whole through Webpack has obvious advantages, it also has a drawback: it packages all of our code together. Imagine an application that is complex and has many modules; this all-in-one approach would produce an oversized bundle, possibly larger than 4 or 5 MB.
In the vast majority of cases, not all modules are required when the application starts to work. If all of these modules are packaged together, the bundle.js must be loaded as a whole even if the application only needs one or two modules to work. In addition, the front-end application usually runs on the browser side, which means that the response time of the application is affected, and a large amount of traffic and bandwidth is wasted.
Therefore, such an All in One approach is not reasonable. A more reasonable solution is to separate the packaged results into multiple bundles according to certain rules, and then load them according to the running needs of the application. This can reduce start-up costs and improve response times.
Webpack makes loading more efficient by packing together scattered modules in a project, so why separate them here? Isn’t that a contradiction?
Resources in Web applications are limited by the environment. Too large is not good, and too fragmented is not good. Because the granularity of modules in our development process is generally very fine, most of the time a module just provides a small tool function, and can not form a complete functional unit.
If we do not package these resource modules and load them directly according to the granularity of the modules in the development process, then running a small feature requires loading a lot of resource modules.
HTTP/1.1 also has inherent flaws:
- Parallel requests under the same domain name are limited;
- Each request has a delay of its own;
- In addition to transmitting content, each request has additional headers, which can also waste traffic and bandwidth in the case of a large number of requests.
Code Splitting
To solve the problem of oversized packaging results, Webpack provides a splitting feature: Code Splitting.
Code Splitting reduces startup costs and improves response times by packaging resource modules in projects into different bundles according to the rules we design.
Webpack implements code splitting in two main ways:
- Configure multiple packaging entries for different parts of the business, producing multiple bundles.
- Load Modules on demand in combination with the Dynamic Imports feature of ES Modules.
Multiple entry packing
Multi-entry packaging generally applies to traditional multi-page applications. The most common rule is one packaging entry per page, with the parts shared between pages extracted into a common bundle.
The sample source code
GitHub: github.com/zce/webpack… · CodeSandbox: codesandbox.io/s/…
```
.
├── dist
├── src
│   ├── common
│   │   ├── fetch.js
│   │   └── global.css
│   ├── album.css
│   ├── album.html
│   ├── album.js
│   ├── index.css
│   ├── index.html
│   └── index.js
├── package.json
└── webpack.config.js
```
This example has two pages, index and album. The logic of code organization is also simple:
- index.js implements the functional logic of the index page;
- album.js implements the functional logic of the album page;
- global.css is a common style file;
- fetch.js is a common module responsible for API requests.
Configure multi-entry packaging for this case as follows:
```js
// ./webpack.config.js
const HtmlWebpackPlugin = require('html-webpack-plugin')

module.exports = {
  entry: {
    index: './src/index.js',
    album: './src/album.js'
  },
  output: {
    filename: '[name].bundle.js' // [name] is replaced by the entry name
  },
  // ... other configuration
  plugins: [
    new HtmlWebpackPlugin({
      title: 'Multi Entry',
      template: './src/index.html',
      filename: 'index.html',
      chunks: ['index'] // use index.bundle.js
    }),
    new HtmlWebpackPlugin({
      title: 'Multi Entry',
      template: './src/album.html',
      filename: 'album.html',
      chunks: ['album'] // use album.bundle.js
    })
  ]
}
```
Generally, only one entry package is configured in the Entry property. If we need to configure multiple entries, we can define entry as an object.
Note that entry is defined here as an object, not an array; if it were an array, the files would be bundled together into a single entry.
In this object, an attribute is an entry, the attribute name is the name of the entry, and the value is the file path corresponding to the entry. What we have configured here is the JS file path corresponding to the index and album pages.
Once our entry is configured to be multi-entry, the output file name also needs to be changed, because two entries have two packaged results, not both of which can be called bundle.js. We can use a placeholder like [name] here to output the dynamic filename, which will eventually be replaced by the name of the entry.
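Besides [name], Webpack's output.filename supports other substitution patterns; for example, a content hash is often added for long-term caching. A sketch (pattern names per Webpack's output.filename documentation):

```js
output: {
  // [contenthash] changes only when a file's content changes,
  // so browsers can cache bundles aggressively
  filename: '[name].[contenthash:8].bundle.js'
}
```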
In addition, the configuration generates a corresponding HTML file for each of the index and album pages through html-webpack-plugin.
By default, this plugin will automatically inject all the bundle results. If you need to specify the bundle to use, you can do this through the chunks property of the HtmlWebpackPlugin. We configured the two pages to use different chunks so that each packaging entry forms a separate chunk.
Once configured, we can open the command line terminal and run the Webpack package, which will have two entries. After packaging, we find the output directory, and here we can see the packaging results of the two entry files, as shown below:
Extract common modules
We also need to extract these common modules into a separate bundle. It is very easy to extract common modules in Webpack. We only need to enable splitChunks in the optimization configuration as follows:
```js
// ./webpack.config.js
module.exports = {
  entry: {
    index: './src/index.js',
    album: './src/album.js'
  },
  output: {
    filename: '[name].bundle.js' // [name] is replaced by the entry name
  },
  optimization: {
    splitChunks: {
      // Automatically extract all common modules into a separate bundle
      chunks: 'all'
    }
  }
  // ... other configuration
}
```
Here we add the splitChunks property to the optimization property, and the value of this property is an object that needs to be configured with a Chunks property, which we set to all to indicate that all common modules can be extracted.
Other uses of splitChunks
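splitChunks accepts much finer-grained rules than chunks: 'all'. For example, a cacheGroups sketch (the group name here is arbitrary, and option names follow Webpack 4's optimization.splitChunks documentation) can force everything from node_modules into its own vendor bundle:

```js
// ./webpack.config.js (fragment)
optimization: {
  splitChunks: {
    chunks: 'all',
    cacheGroups: {
      vendor: {
        // everything imported from node_modules goes into vendor.bundle.js
        test: /[\\/]node_modules[\\/]/,
        name: 'vendor'
      }
    }
  }
}
```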
Dynamic import
Code Splitting is more commonly implemented in combination with the dynamic import feature of ES Modules to achieve on-demand loading.
Webpack supports dynamic imports for loading modules on demand, and every dynamically imported module is automatically extracted into a separate bundle, achieving the splitting.
Dynamic imports are more flexible than the multi-entry approach, because logic in our code can control whether and when a module is loaded. And one of the most important purposes of splitting is precisely to let modules load on demand, improving the application's response speed.
```
.
├── src
│   ├── album
│   │   ├── album.css
│   │   └── album.js
│   ├── common
│   │   ├── fetch.js
│   │   └── global.css
│   ├── posts
│   │   ├── posts.css
│   │   └── posts.js
│   ├── index.html
│   └── index.js
├── package.json
└── webpack.config.js
```
The article list corresponds to the Posts component here, and the album list corresponds to the album component. I imported both modules in the package entry (index.js), and then decided which component to display based on the change of the page anchor point. The core code is as follows:
```js
// ./src/index.js
import posts from './posts/posts'
import album from './album/album'

const update = () => {
  const hash = window.location.hash || '#posts'
  const mainElement = document.querySelector('.main')
  mainElement.innerHTML = ''
  if (hash === '#posts') {
    mainElement.appendChild(posts())
  } else if (hash === '#album') {
    mainElement.appendChild(album())
  }
}

window.addEventListener('hashchange', update)
update()
```
To import modules dynamically, you can call the import keyword as a function. When used in this way, the import function returns a Promise object. This is called Dynamic Imports in the ES Modules standard.
Dynamic Imports
With dynamic imports, there is no waste problem because all components are lazily loaded and only loaded when needed. The specific implementation code is as follows:
```js
// ./src/index.js
// import posts from './posts/posts'
// import album from './album/album'

const update = () => {
  const hash = window.location.hash || '#posts'
  const mainElement = document.querySelector('.main')
  mainElement.innerHTML = ''
  if (hash === '#posts') {
    // mainElement.appendChild(posts())
    import('./posts/posts').then(({ default: posts }) => {
      mainElement.appendChild(posts())
    })
  } else if (hash === '#album') {
    // mainElement.appendChild(album())
    import('./album/album').then(({ default: album }) => {
      mainElement.appendChild(album())
    })
  }
}

window.addEventListener('hashchange', update)
update()
```
Use the import function to import the specified path where the component is needed, and this method returns a Promise. In the then method of this Promise we can get the module object. Since our posts and album modules are exported as default members, we need to deconstruct the default in the module object, get the exported member first, and then use the exported member normally.
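The shape of the object that import() resolves with can be simulated with a plain Promise. The sketch below (fakeImport is a made-up stand-in, not a Webpack API) shows why the default export is read via destructuring:

```javascript
// import() resolves with a module namespace object; the module's default
// export lives on its `default` property. Simulating that shape:
const fakeImport = () => Promise.resolve({ default: () => 'posts component' })

fakeImport().then(({ default: posts }) => {
  console.log(posts()) // => 'posts component'
})
```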
Looking at the packaging results, there are now three additional JS files in the dist directory: two of them are the dynamically imported modules, and the third is a module shared by both. All three are produced automatically by dynamic-import splitting.
That is all it takes to use dynamic imports in Webpack. We don't need any extra configuration; just import modules using the ES Modules dynamic import syntax, and Webpack automatically handles the splitting and on-demand loading.
If you are using an SPA framework such as Vue.js, the components mapped to routes in your project can be loaded on demand with this dynamic import technique, splitting each route into its own bundle.
Magic comments
By default, a dynamically imported bundle's file name is just a sequential number, which is fine, because in production we usually don't care about resource file names at all.
But if you still need to name these bundles, you can do so using magic comments that are Webpack specific. Specific methods are as follows:
```js
// magic comment
import(/* webpackChunkName: 'posts' */ './posts/posts')
  .then(({ default: posts }) => {
    mainElement.appendChild(posts())
  })
```
A magic comment is an inline comment added inside the import function call. It has a specific format, webpackChunkName: '...', which names the chunk produced for that split.
Once done, we open the command line terminal again and run the Webpack packaging; the generated bundle uses the name given in the comment. Magic comments have another useful behavior: chunks that share the same webpackChunkName are packaged together. For example, if we set both chunk names to 'components' and run the packaging again, both modules end up in a single file.
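webpackChunkName is not the only magic comment Webpack understands; for instance, a prefetch hint can ask the browser to fetch a chunk during idle time. A sketch (comment names per Webpack's magic-comments documentation):

```js
// Prefetch: the browser downloads the chunk when idle, so it is already
// cached by the time the user navigates to it
import(/* webpackChunkName: 'posts', webpackPrefetch: true */ './posts/posts')
```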
Build speed versus package results
There is a big difference between a production environment and a development environment. In a production environment, we emphasize the need to accomplish business functions with less and more efficient code. In a development environment, all we care about is efficiency.
Webpack 4 introduced the use of mode, which gives us some preset configurations for different modes, including many optimizations for production mode.
Configurations in different environments
Create different Webpack configurations for different work environments. There are two main ways to create different environment configurations:
- Add criteria to the configuration file and export different configurations based on the environment.
- Add a configuration file for each environment. Each environment corresponds to one configuration file.
Back in the configuration file, the Webpack configuration file also supports exporting a function and then returning the desired configuration object from that function. This function can take two arguments, the first is env, which is the environment name argument we passed through the CLI, and the second is argv, which is all the arguments in running the CLI. The specific code is as follows:
```js
// ./webpack.config.js
module.exports = (env, argv) => {
  return {
    // ... Webpack configuration
  }
}
```
We can then use this feature to create different configurations for development and production environments. I first define the common configurations in different modes as a Config object, and then add special configurations for different environments to the Config object based on judgment. The specific code is as follows:
```js
// ./webpack.config.js
module.exports = (env, argv) => {
  const config = {
    // ... configuration common to all modes
  }

  if (env === 'development') {
    // special configuration for development mode
    config.mode = 'development'
    config.devtool = 'cheap-module-eval-source-map'
  } else if (env === 'production') {
    // special configuration for production mode
    config.mode = 'production'
    config.devtool = 'nosources-source-map'
  }

  return config
}
```
Configuration files for different environments
The method of returning different configuration objects by judging the environment name parameter is only suitable for small and medium-sized projects, because as the project gets complicated, so does our configuration. Therefore, for large projects, it is recommended to use different configuration files for different environments.
In this way, there are usually at least three WebPack configuration files in the project. Two of these are for development and production environments respectively, and one is a common configuration. Because the configuration of the development and production environments is not completely different, a common file is required to abstract the same configuration from both. The configuration file structure is as follows:
```
.
├── webpack.common.js ·········· common configuration
├── webpack.dev.js ············· development mode configuration
└── webpack.prod.js ············ production mode configuration
```
We can first try using the Object.assign method to copy the common configuration into each specific configuration object and override parts of it, as follows:
```js
// ./webpack.common.js
module.exports = {
  // ... common configuration
}

// ./webpack.prod.js
const common = require('./webpack.common')

module.exports = Object.assign(common, {
  // production mode configuration
})

// ./webpack.dev.js
const common = require('./webpack.common')

module.exports = Object.assign(common, {
  // development mode configuration
})
```
The problem is that Object.assign completely overwrites same-named properties. That is fine for simple value-type options, but for options like plugins, where we only want to append a few plug-ins to the common list, Object.assign cannot do the job.
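A quick illustration of the problem, using plugin names that are placeholder strings rather than real plug-ins:

```javascript
// Object.assign replaces same-named properties wholesale, so the
// common `plugins` array is lost instead of being extended.
const common = { mode: 'none', plugins: ['common-plugin'] }
const prod = Object.assign({}, common, { plugins: ['prod-plugin'] })

console.log(prod.plugins) // => ['prod-plugin'] (the common plugin is gone)
```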
This could be done with a deep-merge utility such as lodash's merge function, but the community provides a more specialized module, webpack-merge, designed precisely for merging Webpack configurations.
```
npm i webpack-merge --save-dev
# or yarn add webpack-merge --dev
```
What this module exports is a merge function, which we use to merge the configuration here with the common configuration. The specific code is as follows:
```js
// ./webpack.common.js
module.exports = {
  // ... common configuration
}

// ./webpack.prod.js
const merge = require('webpack-merge')
const common = require('./webpack.common')

module.exports = merge(common, {
  // production mode configuration
})

// ./webpack.dev.js
const merge = require('webpack-merge')
const common = require('./webpack.common')

module.exports = merge(common, {
  // development mode configuration
})
```
After using WebPack-Merge, configuration objects here can be configured as they would be in a normal WebPack configuration, with merge logic automatically handled internally.
Once the configurations are split, we go back to the command line terminal and try to run the Webpack packaging. However, since there is no default configuration file anymore, we need to specify the configuration file to use with the --config parameter. For example:
```
webpack --config webpack.prod.js
```
You can also define this build command into NPM scripts for easy use.
Optimization plug-ins in production mode
Production mode, new in Webpack 4, automatically enables many general-purpose optimizations internally. This out-of-the-box behavior is convenient for users, but for learners it hides things worth knowing, leaving us with no idea where to start when something goes wrong.
Let’s first learn about the main optimization features in Production mode and how Webpack optimizes the packaging results.
DefinePlugin
DefinePlugin injects global members into our code. In production mode, Webpack uses it to inject process.env.NODE_ENV by default; many third-party modules check this member to determine the running environment and decide whether to, for example, print logs.
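As a sketch of how third-party code typically consumes this flag (the warning text is illustrative):

```javascript
// Library code commonly guards development-only logging behind
// process.env.NODE_ENV; with DefinePlugin the expression becomes a
// constant at build time, and the dead branch can be minified away.
const isProduction = process.env.NODE_ENV === 'production'

if (!isProduction) {
  console.warn('running a development build')
}
```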
DefinePlugin is a built-in plug-in, so we first import the webpack module and then add the plug-in to plugins. Its constructor takes an object whose members are all injected into the code. The specific code is as follows:
// ./webpack.config.js
const webpack = require('webpack')
module.exports = {
  // ... other configuration
  plugins: [
    new webpack.DefinePlugin({
      API_BASE_URL: 'https://api.example.com'
    })
  ]
}
Here we define an API_BASE_URL member through DefinePlugin to inject the API service address into our code; its value is a string.
// ./src/main.js
console.log(API_BASE_URL)
Once done, we open the console and run the Webpack build. After packaging completes, we find the module corresponding to main.js in the output. Looking at the result, we find that DefinePlugin simply substitutes the configured value into the code verbatim. Since the substituted content is https://api.example.com without quotes, the replacement produces syntactically invalid code.
The correct approach is to pass in a snippet of code that evaluates to a string literal. The concrete implementation is as follows:
// ./webpack.config.js
const webpack = require('webpack')
module.exports = {
  // ... other configuration
  plugins: [
    new webpack.DefinePlugin({
      // the value needs to be a snippet of code
      API_BASE_URL: '"https://api.example.com"'
    })
  ]
}
If we need to inject a value, we can use JSON.stringify to get the literal representation of that value, which is less error-prone. The concrete implementation is as follows:
// ./webpack.config.js
const webpack = require('webpack')
module.exports = {
  // ... other configuration
  plugins: [
    new webpack.DefinePlugin({
      // the value needs to be a snippet of code
      API_BASE_URL: JSON.stringify('https://api.example.com')
    })
  ]
}
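A quick way to see why this works: JSON.stringify returns a string that already contains the surrounding quotes, i.e. a valid JavaScript expression, whereas the plain string does not:

```javascript
// JSON.stringify produces the literal representation, quotes included
const plain = 'https://api.example.com'
const literal = JSON.stringify(plain)

console.log(plain)   // https://api.example.com  (not valid as a code snippet)
console.log(literal) // "https://api.example.com" (a valid string literal)
```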
Mini CSS Extract Plugin
For CSS file packaging, we usually use style-loader, and the final result of that processing is that the CSS code ends up embedded in the JS code.
The mini-css-extract-plugin is a plugin that extracts CSS code out of the bundle. It is very simple to use and also needs to be installed via npm. The specific command is as follows:
npm i mini-css-extract-plugin --save-dev
Then add it to the Webpack configuration file. The configuration is as follows:
// ./webpack.config.js
const MiniCssExtractPlugin = require('mini-css-extract-plugin')
module.exports = {
  mode: 'none',
  entry: {
    main: './src/index.js'
  },
  output: {
    filename: '[name].bundle.js'
  },
  module: {
    rules: [
      {
        test: /\.css$/,
        use: [
          // 'style-loader', // inject the style through the style tag
          MiniCssExtractPlugin.loader,
          'css-loader'
        ]
      }
    ]
  },
  plugins: [
    new MiniCssExtractPlugin()
  ]
}
- First, the plugin itself is added to the plugins array of the configuration object, so that mini-css-extract-plugin can automatically extract CSS from the code while it works.
- Second, mini-css-extract-plugin also requires us to use the loader it provides, MiniCssExtractPlugin.loader, in place of style-loader to capture all the styles.
Once packaged, styles are stored in separate files and introduced to the page directly through link tags. Note that if your CSS is not very large, extracting it into a separate file can be counterproductive, since separate files require separate requests. My personal experience is that CSS is only worth extracting into its own file once it exceeds roughly 200 KB.
Optimize CSS Assets Webpack Plugin
After using mini-css-extract-plugin, styles are extracted into a separate CSS file. However, if we open the files generated by a production build, we will see that only the JS output is minified; the CSS files are not. This is because Webpack's built-in minification plugin only handles JS files, and compressing other resource files requires additional plugins.
optimize-css-assets-webpack-plugin is such a plugin; we can use it to compress our style files. Install it first:
npm i optimize-css-assets-webpack-plugin --save-dev
The specific configuration code is as follows:
// ./webpack.config.js
const MiniCssExtractPlugin = require('mini-css-extract-plugin')
const OptimizeCssAssetsWebpackPlugin = require('optimize-css-assets-webpack-plugin')
module.exports = {
  mode: 'none',
  entry: {
    main: './src/index.js'
  },
  output: {
    filename: '[name].bundle.js'
  },
  module: {
    rules: [
      {
        test: /\.css$/,
        use: [
          MiniCssExtractPlugin.loader,
          'css-loader'
        ]
      }
    ]
  },
  plugins: [
    new MiniCssExtractPlugin(),
    new OptimizeCssAssetsWebpackPlugin()
  ]
}
However, the official documentation for this plugin shows that it should not be configured in the plugins array, but added to the minimizer property of the optimization object instead. Details are as follows:
// ./webpack.config.js
const MiniCssExtractPlugin = require('mini-css-extract-plugin')
const OptimizeCssAssetsWebpackPlugin = require('optimize-css-assets-webpack-plugin')
module.exports = {
  mode: 'none',
  entry: {
    main: './src/index.js'
  },
  output: {
    filename: '[name].bundle.js'
  },
  optimization: {
    minimizer: [
      new OptimizeCssAssetsWebpackPlugin()
    ]
  },
  module: {
    rules: [
      {
        test: /\.css$/,
        use: [
          MiniCssExtractPlugin.loader,
          'css-loader'
        ]
      }
    ]
  },
  plugins: [
    new MiniCssExtractPlugin()
  ]
}
If optimize-css-assets-webpack-plugin were configured in the plugins array, it would run on every build regardless of other settings. Placed in minimizer, it only runs when minification is enabled. That is why Webpack suggests that compression plugins like this one be configured into minimizer, so they can all be controlled uniformly through the minimize option.
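As a sketch (not from the original text), the minimize option is the single switch that controls everything listed in minimizer:

```javascript
// ./webpack.config.js (sketch)
module.exports = {
  // ...
  optimization: {
    // minimize toggles all plugins listed in minimizer at once;
    // production mode sets it to true automatically
    minimize: process.env.NODE_ENV === 'production',
    minimizer: [
      // compression plugins go here
    ]
  }
}
```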
But this configuration has a drawback. If we run the production build again and look at the output JS file, we will find it is no longer minified. This is because once we set minimizer, Webpack assumes we want full custom control over the compression plugins, so the internal JS minimizer is overridden. We have to add it back manually.
The built-in JS compression plugin is called terser-webpack-plugin. Install it first:
npm i terser-webpack-plugin --save-dev
// ./webpack.config.js
const MiniCssExtractPlugin = require('mini-css-extract-plugin')
const OptimizeCssAssetsWebpackPlugin = require('optimize-css-assets-webpack-plugin')
const TerserWebpackPlugin = require('terser-webpack-plugin')
module.exports = {
  mode: 'none',
  entry: {
    main: './src/index.js'
  },
  output: {
    filename: '[name].bundle.js'
  },
  optimization: {
    minimizer: [
      new TerserWebpackPlugin(),
      new OptimizeCssAssetsWebpackPlugin()
    ]
  },
  module: {
    rules: [
      {
        test: /\.css$/,
        use: [
          MiniCssExtractPlugin.loader,
          'css-loader'
        ]
      }
    ]
  },
  plugins: [
    new MiniCssExtractPlugin()
  ]
}