0. Background

With the continuous improvement of build tooling and of the build experience, Webpack has gradually become the dominant player in front-end build systems. For real-world front-end engineering projects, Webpack is usually the default choice when we pick a build tool, and common scaffolding tools such as create-react-app and Vue CLI are built on top of it. But as the business code grows and the project deepens, so does our build time, and sooner or later someone will conclude that Webpack builds are too slow and too “heavy.” Take the recent optimization I did for my team’s project as an example: as shown in the figure below, as the project accumulated code and picked up some incorrect references, a single build had come to take around 40s, which makes for a very bad experience whenever an engineer needs to restart devServer or run a build.

After optimization, a second startup takes close to 8s. So what kind of magic produces that effect? Take your time and follow along step by step; if you follow my steps, you may be able to further optimize your team’s build system in just one evening.

Before the main body begins, one thing needs to be made clear: the build-efficiency improvements introduced in this article are based on Webpack 4. For projects on older versions, the process of upgrading to Webpack 4 will not be covered in much detail. Most of the basics, such as webpack-dev-server-related configuration and the common plugins, will not be covered either, because there are plenty of good articles about them on Nuggets and other forums. Those of you who keep an eye on Webpack releases will know that Webpack 5 is already on the way; the figure below shows the official milestone progress. Although it currently sits at only 64%, I’d like to get this article out first. Some of these optimizations may well end up integrated into Webpack itself, since it keeps getting better every day 😊.

Over the past year, busy with graduation and work, my role on Nuggets gradually changed from writer to reader. Now that work and life have basically settled down, I hope to get back to sharing what I learn and accumulate at work, as I did before. As the start of my return to Nuggets, I hope that after reading this article you will have a new appreciation of, and the courage to experiment more boldly with, the build system that today’s front end cannot do without. Of course, if you run into problems along the way, you are very welcome to reach out so we can learn from each other and improve together.

Without further ado, let’s get to the main topic of this time.

In this article, I will use problems I actually solved in practice as the thread, and walk you through the whole process of build-system optimization: finding the problem -> analyzing the problem -> solving the problem.

1. Build bottlenecks

To optimize, we first have to know where to optimize. So what exactly slows us down during a build, and how can we measure it?

To answer these two questions we need a tool: speed-measure-webpack-plugin, which measures the execution time of every loader and plugin during your build. The official screenshot looks like this:

It’s just as easy to use, as shown in the example below: wrap your exported Webpack configuration in smp.wrap, run the build, and you’ll see the execution time of each kind of module in the console panel, just like the demo.

const SpeedMeasurePlugin = require("speed-measure-webpack-plugin");

const smp = new SpeedMeasurePlugin();

module.exports = smp.wrap(YourWebpackConfig);

Tip: speed-measure-webpack-plugin has not fully kept up with Webpack upgrades, and at the time of writing it still has a bug: it cannot cope with plugins that are mounted on the hooks provided by html-webpack-plugin (such as add-asset-html-webpack-plugin, or your own custom plugin doing the same). So before you do anything, if such a plugin exists in your config, remove it first, otherwise you will run into the problems described in my issue.

It is safe to say that most of the execution time will be spent on the loaders that compile JS and CSS and on the plugins that compress those two kinds of code. If your results match what I just said, please give this article a thumbs up 😁.

Why is that? Because the compiler (Webpack, in this case) needs to convert the string code we wrote into an AST, which is a tree object like the following:
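To make that concrete, here is a heavily simplified, Babel-flavored sketch (not Webpack’s exact internal representation) of the tree a parser produces for a single statement such as const a = 1:

// Heavily simplified, Babel-flavored sketch of the AST for: const a = 1
const ast = {
  type: 'VariableDeclaration',
  kind: 'const',
  declarations: [
    {
      type: 'VariableDeclarator',
      id: { type: 'Identifier', name: 'a' },       // the variable being declared
      init: { type: 'NumericLiteral', value: 1 },  // the value it is initialized to
    },
  ],
};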

Obviously, the compiler cannot implement such a complex compilation process by crudely replacing strings with regular expressions. Instead, it walks the tree, finds the right nodes, and replaces them with the compiled values, as shown below:

I covered this in detail in my earlier article “Webpack Debunking – The Path to the Advanced Front End”, if you’re interested.

Remember being beaten up a thousand times by tree algorithms while studying data structures and algorithms or interviewing for a job, and recall the ideas of depth-first and breadth-first traversal. As you can imagine, build time concentrates in compiling and compressing code precisely because both have to traverse the tree to replace characters or transform syntax, that is, go through “parse to AST -> traverse the tree -> generate code again,” and that, as you know, takes a long time.

2. Optimization strategy

Now that we’ve found the culprits slowing the build down, let’s tackle them one by one. I’ll get straight to the point: since we now know why the build takes so long, we can form a strategy. We will work along four broad directions: caching, multi-core, extraction, and splitting. Having seen these four words, some familiar ideas are probably already popping into your head, which should help you follow what comes next a little faster.

2.1. Caching

When we change something in a project, we never rewrite everything, yet every build recompiles everything from scratch. Can this repeated work be cached, just like the browser caches the resources it loads? Most loaders provide a cache option. In babel-loader, for example, you can enable caching by setting cacheDirectory, and babel-loader will then write the result of each compilation to disk (by default into node_modules/.cache/babel-loader under the project root, though you can customize the location).
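As a quick illustration (the rule below is a typical setup rather than any particular project’s exact config), enabling the cache is essentially a one-line change:

module.exports = {
  module: {
    rules: [
      {
        test: /\.(js|jsx)$/,
        exclude: /node_modules/,
        use: {
          loader: 'babel-loader',
          options: {
            // Write each compilation result to node_modules/.cache/babel-loader,
            // so unchanged modules are read back from disk on the next build
            cacheDirectory: true,
          },
        },
      },
    ],
  },
};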

But what if a loader doesn’t support caching itself? We still have a way: cache-loader. Its job is exactly what its name says: it caches the results of the loaders that run after it to disk, much like babel-loader’s built-in cache. The way to use it is simple, as shown in the official demo; just mount it in front of the expensive loaders:

const path = require('path');

module.exports = {
  module: {
    rules: [
      {
        test: /\.ext$/,
        // Put cache-loader in front of your existing (expensive) loaders
        use: ['cache-loader', ...loaders],
        include: path.resolve('src'),
      },
    ],
  },
};

Tip: by default, cache-loader writes its cache to a .cache-loader directory in the project root. You can use its cacheDirectory option to move it under node_modules/.cache so that it sits alongside the caches of babel-loader and other loaders and plugins.

Similarly, most of the efficiency bottlenecks in the code-compression stage of the build can also be eased by caching. uglifyjs-webpack-plugin, the plugin we most commonly use for this, provides the following configuration:

const UglifyJsPlugin = require('uglifyjs-webpack-plugin');

module.exports = {
  optimization: {
    minimizer: [
      new UglifyJsPlugin({
        cache: true,     // cache minification results
        parallel: true,  // minify with multiple processes
      }),
    ],
  },
};

We enable caching through the cache option, and we can also turn on multi-core compression through parallel, which we will come back to in the next section. As for optimize-css-assets-webpack-plugin, the plugin we use to compress CSS, I have not yet found support for caching or multi-core compression in it; if you know this area better, please share in the comments.

Tip: for now I do not recommend integrating the cache logic into your CI pipeline, because there is still an issue where a stale cache is hit after dependencies are updated, which is clearly a bug. On a development machine we can simply delete the cache by hand, but on a build machine that is much more troublesome. To keep every CI result clean, it is better not to enable caching during CI.

2.2. Multi-core

You have probably already guessed the optimization here: HappyPack, of course. This seems like a well-worn topic. Ever since the Webpack 3 era, HappyPack has been the go-to choice for adding multi-core compilation to Webpack projects, and almost everyone mentions it whenever Webpack efficiency optimization comes up. In today’s thriving front-end community there have been plenty of excellent articles about it, so I won’t introduce HappyPack in much detail here; assuming you are already familiar with it, I’ll just take you through a brief review of how it is used.

const HappyPack = require('happypack')
const os = require('os')
// Create a shared process pool:
// take the number of CPU cores in the system, and HappyPack fills them all with compile work
const happyThreadPool = HappyPack.ThreadPool({ size: os.cpus().length })

module.exports = {
  module: {
    rules: [
      {
        test: /\.(js|jsx)$/,
        exclude: /node_modules/,
        use: 'happypack/loader?id=js',
      },
    ],
  },
  plugins: [
    new HappyPack({
      id: 'js',
      threadPool: happyThreadPool,
      loaders: [
        {
          loader: 'babel-loader',
        },
      ],
    }),
  ],
}

Using it is just as simple: in module.rules, replace the original loaders with happypack/loader plus an id query, then declare those same loaders inside a HappyPack plugin instance with the matching id. But don’t wrap everything blindly: each HappyPack instance starts its own worker processes and inter-process communication is not free, so only use HappyPack to add multi-core compilation to the loaders that you’ve measured to be expensive.

For Webpack plugins whose compilation work is expensive, a parallel option is usually provided so you can turn on multi-core compilation yourself; check the plugin’s official documentation and you may be pleasantly surprised ~

PS: there is one pit you can easily run into in production mode that deserves a mention here, courtesy of a special character named mini-css-extract-plugin. Where is the pit? There are two points (which I had not resolved at the time of writing):

  1. MiniCssExtractPlugin cannot coexist with HappyPack. If you use HappyPack to wrap MiniCssExtractPlugin, you trigger this problem: github.com/amireh/happ… .
  2. MiniCssExtractPlugin.loader must run after cache-loader, otherwise it does not take effect: github.com/webpack-con… .

So in the end, the CSS Rule configuration in Production mode looks like this:

module.exports = {
  // ...
  module: {
    rules: [
      // ...
      {
        test: /\.css$/,
        exclude: /node_modules/,
        use: [
          _mode === 'development' ? 'style-loader' : MiniCssExtractPlugin.loader,
          'happypack/loader?id=css',
        ],
      },
    ],
  },
  plugins: [
    new HappyPack({
      id: 'css',
      threadPool: happyThreadPool,
      loaders: [
        'cache-loader',
        'css-loader',
        'postcss-loader',
      ],
    }),
  ],
}

2.3. Extraction

Some static dependencies rarely change, for example the React family bucket that is common in our projects, or utility libraries such as lodash. We don’t want these dependencies to be recompiled in every build, because they change so rarely that their build input and output should be identical every time. So we try to pull these static dependencies out of the per-build logic to improve build efficiency. There are two common solutions. One is webpack-dll-plugin, which packages the static dependencies separately on the first build and only references the prebuilt output afterwards, something like “precompilation.” The other is the Externals approach that is common in the industry: remove these static resources from the build altogether and reference them via a CDN.

So how do we choose between these two approaches?

2.3.1. Choosing between webpack-dll-plugin and Externals

The team’s early project scaffolding used webpack-dll-plugin for static-resource extraction. The reason is that Externals had actually been used before, but the company’s CDN service was not mature at the time; the project relied on public open-source CDNs whose instability caused problems for the team, so in one iteration it was swapped for webpack-dll-plugin. Later the company built a mature CDN service, but for various reasons the scaffolding was never updated to use it.

I am firmly on the side of Externals, and not without reason. First, let’s count the three original sins of webpack-dll-plugin:

  1. You need to configure, and precompile into a single JS file on the first build (referred to as the lib file below), the static dependencies that are excluded from each build. This list must be maintained by hand every time dependencies change, and whenever a dependency is added, removed, or bumped to a new version and someone forgets to update the lib file, you get errors or version mismatches.

  2. It cannot take advantage of the new browser feature script type="module", so you cannot ship the native ES Module builds that some libraries now provide (such as the newer Vue builds) to serve leaner code to modern browsers while keeping fallbacks for older ones, and you miss out on that performance optimization.

  3. Precompiling all these resources into a single file and explicitly injecting that file into the project’s HTML template was favored in the HTTP/1 era because it reduced the number of requests, but in the HTTP/2 era, splitting them into multiple CDN links lets you take full advantage of HTTP/2 multiplexing. Don’t take my word for it; the comparison below verifies the conclusion:

    • Using the lib file generated by webpack-dll-plugin, everything loads as a single file in a bit over 400ms
    • Using Externals plus HTTP/2, all of the resources load in parallel, with a total load time under 100ms

That’s why I chose Externals.

But what if your company does not have a mature CDN service, yet you still want to decouple the static dependencies from your project to improve build efficiency? Then choose webpack-dll-plugin. And if you still feel that maintaining a lib file on every dependency update is too much trouble and go with Externals anyway, let me remind you that choosing a reliable CDN is extremely important: dependencies like React are the skeleton of your site, and without them the site cannot run at all.
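For reference, here is a minimal sketch of the webpack-dll-plugin (DllPlugin) setup described above; the file name webpack.dll.config.js, the dll output directory, and the dependency list are all illustrative:

// webpack.dll.config.js: run once, and again whenever the static dependencies change
const path = require('path');
const webpack = require('webpack');

module.exports = {
  mode: 'production',
  entry: {
    // The "lib" bundle of rarely-changing dependencies
    lib: ['react', 'react-dom', 'redux', 'react-router-dom'],
  },
  output: {
    path: path.resolve(__dirname, 'dll'),
    filename: '[name].dll.js',
    library: '[name]_dll',
  },
  plugins: [
    new webpack.DllPlugin({
      name: '[name]_dll',  // must match output.library
      path: path.resolve(__dirname, 'dll/[name].manifest.json'),
    }),
  ],
};

// In the main webpack config, reference the prebuilt bundle instead of recompiling it:
//   new webpack.DllReferencePlugin({ manifest: require('./dll/lib.manifest.json') }),
// and remember to inject dll/lib.dll.js into the HTML template yourself.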

2.3.2. How to write Externals more elegantly

As we all know, when using Externals we also have to update the CDN links in the HTML at the same time, and sometimes we forget that step and end up with errors. So, as front-end engineers who like to push things to the extreme, can we automate this process with the resources we already have?

Let’s step back and analyze which parts we actually have to configure when setting up Externals.

First, in the webpack.config.js configuration file, we need to add the externals item to the Webpack configuration:

module.exports = {
  // ...
  externals: {
    // Key is the package name we import, value is the global variable name exposed by the CDN build,
    // so when our code requires 'react', Webpack resolves it to window.React at runtime
    "react": "React",
    "react-dom": "ReactDOM",
    "redux": "Redux",
    "react-router-dom": "ReactRouterDOM"
  }
}

At the same time, we need to update our CDN script tag synchronously in the template HTML file. Generally, a common CDN Link looks like this:

https://cdn.bootcss.com/react/16.9.0/umd/react.production.min.js

Here I use the static-resource CDN provided by BootCDN as an example (which is not to say I recommend BootCDN; its last domain change really made me step into quite a few holes). We can see that a CDN link mainly consists of four parts: the CDN service host, the package name, the version number, and the path within the package. The same applies to other CDN services. Taking the link above as an example, the four parts are:

  • CDN service host: cdn.bootcss.com
  • Package name: react
  • Version number: 16.9.0
  • Package path: umd/react.production.min.js

I think you’ve already figured it out by this point: we can write a Webpack plugin that automatically generates the CDN link script tags and mounts itself on the event hooks provided by html-webpack-plugin, so the injection into HTML happens automatically. Of the four parts of a CDN link that we need: the CDN host is fixed once you pick a service; the package names can be read from compiler.options.externals; the version numbers only need to be read from the project’s package.json; and the package path is usually a fixed value per library.
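To make the idea concrete, here is a minimal sketch of such a plugin (not the team’s actual implementation). The class name CdnLinkPlugin and its options are hypothetical, and the hook used below assumes html-webpack-plugin v3, which exposes its hooks on the compilation object:

const path = require('path');

class CdnLinkPlugin {
  constructor({ cdnHost, packagePaths = {} }) {
    this.cdnHost = cdnHost;           // e.g. 'https://cdn.bootcss.com'
    this.packagePaths = packagePaths; // e.g. { react: 'umd/react.production.min.js' }
  }

  apply(compiler) {
    // 1. Package names come from the externals configuration
    const packages = Object.keys(compiler.options.externals || {});

    // 2. Version numbers come from the project's package.json
    const pkg = require(path.resolve(compiler.context, 'package.json'));
    const deps = { ...pkg.dependencies, ...pkg.devDependencies };

    // 3. Assemble host / package / version / path into full CDN links
    const links = packages.map((name) => {
      const version = (deps[name] || '').replace(/^[\^~]/, '');
      return `${this.cdnHost}/${name}/${version}/${this.packagePaths[name]}`;
    });

    compiler.hooks.compilation.tap('CdnLinkPlugin', (compilation) => {
      // html-webpack-plugin v3 exposes this hook on the compilation object
      compilation.hooks.htmlWebpackPluginAlterAssetTags.tapAsync(
        'CdnLinkPlugin',
        (data, cb) => {
          const scripts = links.map((src) => ({
            tagName: 'script',
            closeTag: true,
            attributes: { src },
          }));
          // Prepend the CDN scripts so they load before the bundled code
          data.body = [...scripts, ...data.body];
          cb(null, data);
        }
      );
    });
  }
}

module.exports = CdnLinkPlugin;

It would be registered in plugins alongside html-webpack-plugin, with the CDN host and per-package paths passed in.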

I won’t walk through the specific implementation in detail here. During the iteration of the project scaffolding, I customized a Webpack plugin along these lines for the CDN service our company provides, so engineers on the team no longer need to worry about keeping the script tags in sync; everything is handled automatically inside the plugin. Of course, if you are interested in the source of that plugin, let me know in the comments and I will consider open-sourcing it as one of the team’s projects and contributing it to the community.

2.4. Splitting

Although SPAs have become mainstream in the big-front-end era, some projects still need to be built as MPAs (multi-page applications). Thanks to Webpack’s multi-entry support, we can manage and maintain multiple pages in one repo. But as the project iterates, the amount of code inevitably grows; sometimes we only change files under one entry, yet the build still runs for all of the entries. So here I’d like to introduce a concept: cluster compilation.

What is cluster compilation? By cluster I don’t mean actual physical machines, of course, but Docker containers. The idea is to split each entry out into its own build, run each build in a container, and copy the generated files into a specific directory when the build finishes. Why can we do this? Because Webpack treats each entry as a chunk and, when generating files, emits a separate file for each chunk.
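As a reminder of why the split is possible, here is a small sketch (entry names and paths are made up) of a multi-entry MPA config in which each entry becomes its own chunk and its own output file, so each one can in principle be built in isolation:

const path = require('path');

module.exports = {
  entry: {
    // Each key becomes its own chunk and its own output file
    home: './src/pages/home/index.js',
    admin: './src/pages/admin/index.js',
  },
  output: {
    filename: '[name].[chunkhash].js', // e.g. home.xxxx.js, admin.xxxx.js
    path: path.resolve(__dirname, 'dist'),
  },
};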

Because the team is currently practicing micro-frontends, each sub-module is already split into its own repo, so our project is born with the genes for cluster compilation. But if those sub-modules live as entries inside a single repo, which is also a very common situation, then you need to split them up, and that is where cluster compilation shines. Since the team has not needed to put this into practice, I won’t go into detail here; it is just a direction for you. If you have questions, feel free to discuss them with me in the comments.

3. Improve the experience

Here are a few Webpack plugins that will help you improve your build experience. They won’t do much for you in terms of efficiency, but they will make you more comfortable while you wait for your build to complete.

3.1. progress-bar-webpack-plugin

This is a Plugin that shows your build progress. It is used like any ordinary Plugin and needs no configuration. Below is what your terminal looks like after adding it: a progress bar appears at the bottom of the terminal so you can see how far along the build is:

3.2. webpack-build-notifier

This one pops up a system notification, like a message from WeChat or Lark, when your build finishes. That means that once you start a build you can hide the console panel and focus on other things; when the “red dot” arrives, it calls you back, and it looks like this:

3.3. webpack-dashboard

Of course, if you are not satisfied with Webpack’s native build output, you can also use this Plugin to spruce up your output interface. It looks like this:
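For completeness, here is a sketch of how the three plugins above might be wired together; option values such as the notification title are just placeholders, and webpack-dashboard additionally expects you to start the build through its own CLI (e.g. webpack-dashboard -- webpack-dev-server):

const ProgressBarPlugin = require('progress-bar-webpack-plugin');
const WebpackBuildNotifierPlugin = require('webpack-build-notifier');
const DashboardPlugin = require('webpack-dashboard/plugin');

module.exports = {
  // ...
  plugins: [
    new ProgressBarPlugin(),                                 // progress bar at the bottom of the terminal
    new WebpackBuildNotifierPlugin({ title: 'My Project' }), // system notification when the build finishes
    new DashboardPlugin(),                                   // nicer terminal dashboard (used with its CLI)
  ],
};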

4. Summary

To sum up, our optimizations for Webpack build efficiency essentially fall into two broad directions: caching and multi-core. Caching exists so that a second build does not have to redo work that has already been done; multi-core exists to take full advantage of the hardware itself (surely everyone’s machine today has at least two cores; even the low-spec MacBook my company issued me is dual-core) so that the heavy work fully utilizes the CPU. These two directions carry most of the practice, and the two trump cards introduced earlier are cache-loader and HappyPack; know them and use them well, and you’ll do well at build-optimization work. So don’t just read: go and get hands-on with your own project, and let your optimized team project shine in front of your leader!

But remember: none of this is guaranteed to help. These tools must be used where they matter most, that is, on the loaders and plugins that the measurement in part 1 showed to be expensive. We know that local caching needs to read and write files on disk, and system I/O takes time; enabling multi-core also requires IPC, which takes time. In other words, for modules whose build was not slow to begin with, adding caching or multi-core may cost more than it saves, so these optimizations also need to be allocated and applied sensibly.

Webpack itself is also constantly iterating and optimizing. It is no longer the rising star whose slowness and heaviness we joked about two or three years ago. The reason Webpack became mainstream is precisely that its team has done a great deal for us in terms of efficiency and experience, leaving less for us to do, and I’m sure there will be even less in the future.

Finally, a shameless plug: if you are interested in joining us to explore the mysteries of the front end together, click the picture below 👇👇👇 to open my internal referral link ~

If you want to learn more about our team, you can also check out this link: 👉 : ByteDance’s most profitable front end team is hiring.