1. Introduction

Webpack is a great module bundler for the front end. Its ecosystem has flourished and has driven the development of front-end engineering, so mastering this tool is very useful for project development: it solves many problems, such as compatibility between modular specifications and automated workflows. With the rise of no-bundler tools such as Vite and Snowpack, people may be looking for a better solution, but in my view every tool exists to solve a particular kind of problem; there is no silver bullet in technology. Technologies like jQuery and Grunt may seem to have little value for us now, because the scenarios in which we still reach for them are rare, but it is still worth knowing their background and the problems they solved (without necessarily digging into the technical details). This is very helpful for our future development. Therefore, this article starts from the evolution of modularization and automation and slowly "leads" us to Webpack.

2. The evolution of modularity and automated build tools

2.1 Development history of modularity

Modularity is a way of organizing projects, which can improve development efficiency and reduce costs.

2.1.1 Stage 1: File partitioning

In the early days, people wrote code directly inside script tags. This works for very simple functionality, but as the code grows longer the following problems appear:

  • Unclear division of functionality
  • Scope pollution (naming conflicts)
  • Unmaintainable code

So we separate the modules with different functions into files:

// js/module-a.js
const sum = (a, b) => a + b

// js/module-b.js
const multi = (a, b) => a * b
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="UTF-8" />
    <meta http-equiv="X-UA-Compatible" content="IE=edge" />
    <meta name="viewport" content="width=device-width, initial-scale=1.0" />
    <title>Document</title>
  </head>
  <body>
    <script src="./js/module-a.js"></script>
    <script src="./js/module-b.js"></script>
    <script>
      console.log(sum(1, 2))
      console.log(multi(1, 2))
    </script>
  </body>
</html>

Although the modules have been split into separate files, this still does not solve the problems above. If module-b declares a variable with the same name as one in module-a, there is still a conflict, and the dependencies between modules remain unclear.

2.1.2 Stage 2: Namespaces

To solve the naming conflict problem above, we simply export a global object to which all module members are mounted. This approach is called “namespace”.

// js/module-a.js
window.a = {
  data: 'hello a',
  sum(a, b) {
    return a + b
  },
}

// js/module-b.js
window.b = {
  data: 'hello b',
  multi(a, b) {
    return a * b
  },
}

This approach resolves naming conflicts nicely (global variables stay unique), but the other problems remain, which is what led to the IIFE.

2.1.3 Stage 3: IIFE (Immediately Invoked Function Expressions)

IIFE is not a new technology; it takes full advantage of the closure feature in JavaScript to provide private space for modules.

// js/module-a.js
;(function () {
  const data = 'hello a'

  window.a = {
    data,
    sum(a, b) {
      return a + b
    },
  }
})()

// js/module-b.js
// Pass in parameters to declare dependencies on other modules
;(function (moduleA) {
  const data = 'hello b'
  // Use the other module
  console.log(moduleA.sum(1, 2))

  window.b = {
    data,
    multi(a, b) {
      return a * b
    },
  }
})(window.a)

This nicely solves scope pollution, and dependencies on other modules can be declared by passing them in as parameters. However, every module still has to be introduced with its own script tag, which creates maintenance problems: each new module requires another tag, and the script order has to be kept correct by hand. If only a single entry file had to be maintained, things would be much easier, and this is what slowly evolved into the modern modular solutions.

2.1.4 Stage 4: Modern modular solutions

(1) CommonJS modular specification

With the birth of Node.js in 2009, the CommonJS modular specification gradually emerged. It mainly includes the following:

  • require to import modules
  • exports / module.exports to export module members
  • A file is a module
  • Modules are loaded synchronously

Note: The CommonJS specification is not only a modularity specification. Modularity is just one of the many things it covers, alongside buffers, binary data, I/O streams, and so on.

The CommonJS modularity specification is implemented on top of the IIFE pattern. The require and exports/module.exports variables mentioned above, along with __filename and __dirname, are passed in through an IIFE-style wrapper:

(function (exports, require, module, __filename, __dirname) {})()
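
For context, here is a minimal sketch of what using these members looks like in practice (the file names are hypothetical):

// math.js: export members with module.exports
const sum = (a, b) => a + b
module.exports = { sum }

// main.js: require loads the module synchronously
const math = require('./math')
console.log(math.sum(1, 2)) // 3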

(2) AMD/CMD specification

The CommonJS modular specification is not suitable for use in browsers because synchronous module loading causes serious performance issues, hence the AMD/CMD modular specifications from the community, implemented by the libraries require.js and sea.js. Here is a simple way to use require.js:

Import require.js first, then add the main entry:

<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="UTF-8" />
    <meta http-equiv="X-UA-Compatible" content="IE=edge" />
    <meta name="viewport" content="width=device-width, initial-scale=1.0" />
    <title>Document</title>
  </head>
  <body>
    <script src="./js/require.js" data-main="js/main"></script>
  </body>
</html>

Define module A:

// js/module-a.js
// Do not rely on other modules
define(function () {
  const data = 'hello a'

  return {
    data,
    sum(a, b) {
      return a + b
    },
  }
})

Define module B:

// js/module-b.js
// Depends on module A

define(['./module-a'], function (moduleA) {
  // Use code from the other module
  const data = moduleA.sum(1, 2)

  return {
    data,
    multi(a, b) {
      return a * b
    },
  }
})

Main entry:

// js/main.js
// Main entry

require(['module-a', 'module-b'], function (moduleA, moduleB) {
  console.log(moduleA.sum(1, 2))
  console.log(moduleB.data)
})

CMD is similar to AMD but written differently, and the CMD modular specification has since become compatible with AMD, so it is not covered here.

As we can see, writing modular code this way carries the overhead of introducing additional libraries, which is why a native modular solution was added to the ES specification.

(3) ES Module

ES Module was introduced in the ES6 specification and has the following features:

  • Has private scope
  • Strict mode
  • Defer script execution (defer by default)
  • Request external JS modules through CORS

This is the approach we are most familiar with today, and it is very simple to use.

<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="UTF-8" />
    <meta http-equiv="X-UA-Compatible" content="IE=edge" />
    <meta name="viewport" content="width=device-width, initial-scale=1.0" />
    <title>Document</title>
    <!-- type="module" defers script execution, equivalent to adding defer -->
    <script src="./js/main.js" type="module"></script>
  </head>
</html>

Export module:

// js/module-a.js
const data = 'hello a'

export default {
  data,
  sum(a, b) {
    return a + b
  },
}

// js/module-b.js
import moduleA from './module-a.js'

const data = moduleA.sum(1, 2)

export default {
  data,
  multi(a, b) {
    return a * b
  },
}

Main entry:

// js/main.js
// Main entry

import moduleA from './module-a.js'
import moduleB from './module-b.js'

console.log(moduleA.sum(1, 2))
console.log(moduleB.data)

Export imported members:

// a.js
const b = 2
export const a = 1
export default b

// b.js
export const b = 2

// index.js
export { a, default as c } from './a.js'
export { b } from './b.js'

For browsers that are not ESM compatible:

<script nomodule src="https://unpkg.com/[email protected]/dist/polyfill.min.js"></script>
<script nomodule src="https://unpkg.com/[email protected]/dist/babel-browser-build.js"></script>
<script nomodule src="https://unpkg.com/[email protected]/dist/browser-es-module-loader.js"></script>
<script type="module">
  // ...
</script>

Node.js handles ES6 modules:

  • Use the .mjs extension to mark a file as an ES module, or set the type field in package.json to module. (type defaults to commonjs; if you change it to module, CommonJS modules must instead be marked with the .cjs extension.)
  • require cannot load ES6 modules.
  • A .mjs file cannot use require. A CommonJS module loaded through import is exposed only as a default export; you cannot use import {} from './demo.js' to pick out individual members, or an error is thrown.
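
As a small illustration of these interop rules (hypothetical file names, with no type field set in package.json):

// demo.cjs: a CommonJS module
module.exports = { sum: (a, b) => a + b }

// main.mjs: an ES module, so require is not available here
// The whole exports object of demo.cjs becomes the default import
import demo from './demo.cjs'
console.log(demo.sum(1, 2)) // 3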

Things to note:

  • export {} does not export an object; it is syntax. export default {}, on the other hand, does export an object.

  • Likewise, import {} is not object destructuring; it is also syntax.

  • Members brought in with import are read-only bindings that reference the very same variables exported with export.

  • In native ESM, an import path must include the full file name (including the extension).

  • import statements must sit in the top-level scope; the import() function is used to load modules asynchronously.
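
A small sketch of the last three points, assuming a hypothetical counter.js module:

// counter.js
export let count = 0
export const increment = () => { count++ }

// main.js: imports sit in the top-level scope and include the file extension
import { count, increment } from './counter.js'

increment()
console.log(count) // 1, the imported binding tracks the exported variable
// count = 2 // TypeError: the imported binding is read-only

// import() loads a module asynchronously and returns a Promise
import('./counter.js').then((mod) => console.log(mod.count))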

ESM is now implemented in all major browsers and is also supported in Node.js alongside CommonJS (v13.2.0+), so it is the recommended approach.

That is the end of the modularity evolution section. So, what does Webpack have to do with all this? Let's go back to Webpack's introduction: a modular packaging tool. Webpack does not define the modularity described above; rather, it implements its own module mechanism in order to be compatible with the main modular specifications above. It can handle multiple module formats at the same time, which is part of its power: we can use require and import in the same file. Not only that, but for Webpack everything is a module, not just JS files.

Next is an introduction to the automated build tools. We don't need to go into all the details; the aim is to see what they have in common.

2.2 Development history of automated build tools

An automated build hands the process of converting source code into production code over to a program instead of doing it manually.

2.2.1 Manual period

Long ago, before the rise of automation tools, after developing a project people would compress static resources such as scripts, styles, and images by hand, finding third-party tools to help with the compression and then uploading the output to a server for deployment. Then the whole process would start over again for the next release. With Node.js things got better, because that is when Npm came along.

2.2.2 Npm Script

With Npm Scripts we can configure automation scripts (in the scripts field of package.json) and combine them with scaffolding tools to take care of repetitive work. This approach is still going strong today, usually in conjunction with automated build tools, and it brings a qualitative improvement to the development experience. The downside is that it is too simple and lacks many features, so scenarios where Npm Scripts are used on their own are limited.
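
As a rough example, a hypothetical package.json scripts configuration might look like this (assuming sass and browser-sync are installed as devDependencies):

{
  "scripts": {
    "build:css": "sass src/scss/main.scss dist/css/main.css",
    "serve": "browser-sync start --server --files \"dist/**\"",
    "start": "npm run build:css && npm run serve"
  }
}

Each entry is just a shell command, so npm run start chains them together; anything beyond simple chaining quickly calls for a dedicated build tool.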

2.2.3 Grunt

Grunt introduced the concept of “tasks,” where we can think of everything we want to do as tasks, such as:

  • Compile sass
  • Use Babel to handle JS compatibility
  • Hot update
  • Unit testing

Gruntfile.js can be configured in a project to automate the execution of tasks.

const loadGruntTasks = require('load-grunt-tasks')

module.exports = (grunt) => {
  grunt.initConfig({
    // A plug-in to clean files
    clean: {
      temp: 'temp/**',
    },
    // Compile Sass
    sass: {
      options: {
        sourceMap: true,
        implementation: require('sass'),
      },
      main: {
        files: {
          'dist/css/main.css': 'src/scss/main.scss',
        },
      },
    },
    babel: {
      options: {
        sourceMap: true,
        presets: ['@babel/preset-env'],
      },
      main: {
        files: {
          'dist/js/app.js': 'src/js/app.js',
        },
      },
    },
    watch: {
      js: {
        files: ['src/js/*.js'],
        tasks: ['babel'],
      },
      css: {
        files: ['src/scss/*.scss'],
        tasks: ['sass'],
      },
    },
  })

  // Automatically load plug-ins
  loadGruntTasks(grunt)

  grunt.registerTask('default', ['sass', 'babel', 'watch'])
}

Its plug-in ecosystem is also very rich, and the various plug-ins cover most needs; asynchronous tasks are supported as well. But there are some disadvantages:

  • Many plug-ins are not well integrated and do not work out of the box.
  • It is based on reading and writing to disk, which is inefficient.

It made up for the limitations of Npm Scripts, but because of the shortcomings above, new automated build tools emerged in the community.

2.2.4 Gulp

Gulp is also a great automated build tool. It introduces the concept of "streams" (based on Node.js streams): everything is done in memory and resources are only written out to disk at the end, which addresses Grunt's shortcomings. You can think of a "stream" as the process of passing a source file through a series of pipes to a destination file, where the pipes are usually provided by plug-ins. Its parallel (parallel execution) and series (sequential execution) APIs let you compose tasks however you like.
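
A minimal gulpfile sketch along these lines (assuming gulp, gulp-sass, gulp-babel, and their peer dependencies are installed) might look like this:

// gulpfile.js
const { src, dest, series, parallel, watch } = require('gulp')
const sass = require('gulp-sass')(require('sass'))
const babel = require('gulp-babel')

// Each task is a stream: read sources, transform in memory, write to disk at the end
const style = () => src('src/scss/*.scss').pipe(sass()).pipe(dest('dist/css'))

const script = () =>
  src('src/js/*.js')
    .pipe(babel({ presets: ['@babel/preset-env'] }))
    .pipe(dest('dist/js'))

// Re-run tasks when source files change
const dev = () => {
  watch('src/scss/*.scss', style)
  watch('src/js/*.js', script)
}

// Compose tasks: style and script run in parallel, then watching starts
exports.build = parallel(style, script)
exports.default = series(exports.build, dev)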

I have wrapped a CLI tool based on Gulp, pages-cli (not published), that automatically runs the relevant build steps. Due to space constraints I won't go deeper into Gulp here; the related source code has been uploaded to GitHub, see gulp-case.

It has been three years since the last release of Gulp, but that doesn't mean it is "dead": as its Npm weekly downloads (1 million+) show, there are still plenty of projects using it.

Gulp has the advantage of being flexible, but it also has the disadvantage of not being very integrated out of the box.

2.2.5 FIS 3

FIS 3 is an automation tool from Baidu. It works out of the box with a lot of built-in plug-ins, making it an "enhanced version" of Gulp and Grunt. We just need to write a configuration file:

// Specify the output directory
fis.match('*.{js,scss,png}', {
  release: '/assets/$0',
})

// Compile Sass
fis.match('**/*.scss', {
  rExt: '.css',
  parser: fis.plugin('node-sass'),
  optimizer: fis.plugin('clean-css'),
})

// Compile JS
fis.match('**/*.js', {
  parser: fis.plugin('babel-6.x'),
  optimizer: fis.plugin('uglify-js'),
})

Unfortunately, FIS 3 is no longer maintained and the number of Npm downloads per week is low, but that doesn’t mean it’s not a great tool.

To summarize: these tools may look numerous and varied, but each appeared to solve particular problems. We don't need to master every one of them completely, but we should understand their application scenarios and the background that produced them. One day we may run into a pain point that no tool on the market solves yet, and that is where this accumulated knowledge comes in handy; maybe we can even build a tool that works really well ourselves. On the other hand, it's not hard to see how powerful Node.js is; most of these tools owe their existence to its development, so you know what to learn 😏.

2.3 Modern build tools

The automated build tools above cover most of our business scenarios, but we also need unified management of modules, which led to the emergence of modern build tools, most of which support modular packaging. Here is a brief overview of each tool; Webpack is examined in depth afterwards.

  • Webpack: a module bundler; resources are treated and processed as modules, and it has a very large ecosystem.
  • Parcel: a zero-configuration module bundler that supports most static resources out of the box, addressing Webpack's tedious configuration problem; Parcel uses worker processes to enable multi-core compilation.
  • Rollup: a module bundler for ES Modules; it was the first to propose tree-shaking, which reduces the size of output files and improves performance, and it is often used for packaging libraries.
  • Snowpack: the idea is to take advantage of the browser's native ES Module support to reduce or avoid bundling altogether; it provides a no-bundler dev server, and files are cached once they are built.
  • Vite: another no-bundler build tool (at development time only), partly inspired by Snowpack, with instant cold starts and an extremely fast hot module replacement experience; in production it still uses Rollup for bundling.

3. Why learn Webpack

We tend to chase a technology as soon as it emerges, and there is nothing wrong with that, but we also need to understand why we should learn it, which is the point this article keeps emphasizing. Tools like Vite will surely become mainstream in the future and are certainly worth learning, but we cannot completely abandon Webpack: it already has a thriving ecosystem and a huge number of projects use it, so mastering it will help us solve real problems. The following articles will cover these directions:

  • How to build a Vue/React project from 0 to 1.
  • The loader/plugin mechanism and how to write your own loader/plugin.
  • Advanced topics: the source code of Tapable (the library at the core of Webpack), analysis of the bundled output, a source-level walkthrough of the core build process, and implementing a simple bundler.

4. Summary

Starting from the development of modularity and automated build tools, this article has told the story of how a variety of tools emerged to solve problems encountered during development. Its ultimate purpose is to help everyone understand what value such tools bring us and to prepare for learning Webpack. I hope it has been rewarding.

5. References

  • Ruan Yifeng: JavaScript modular programming (iii): require.js usage
  • Hook Education: Webpack Principles and Practice