As we all know, modern front-end development is basically inseparable from engineering tooling: sooner or later we need a bundler such as Webpack or Rollup to process our code. Most of the time, though, we are just Webpack/Rollup configuration engineers, and the bundling process itself is a black box to us.

So why don't we do something about it?

Sorting out the process

Before we start building, let's go through the bundling process.

  1. Our bundler needs a packing entry.

  2. The entry file imports other modules, so we need a graph object to store the module graph (a rough sketch of one graph entry follows this list). Each entry in the graph has three core pieces of information:

    • filePath: the module's path, for example './src/message.js'
    • dependencies: the dependencies used by the module
    • code: the code contained in the module
  3. The graph stores every module, so we need to recursively traverse each dependent module.

  4. Build runnable code from each module's code, following the dependency paths recorded in the graph.

  5. Write the code to the output file, and the bundle is complete.
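To give a feel for the target, here is a rough sketch of the record we will store for one module. This is only an illustration; the exact shape is built step by step in the rest of the article, and the compiled code string is abbreviated.

// Rough sketch of one analysed module (shape is refined below)
{
    filePath: './src/message.js',
    dependencies: { './word.js': './src/word.js' },   // module-relative path -> root-relative path
    code: '/* compiled code of message.js */'
}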

Architecture

First, let's write the code we want to bundle.

// ./src/index.js
import message from './message.js';
import {word} from './word.js';

console.log(message);
console.log(word);
// ./src/message.js
import {word} from './word.js';
const message = `hello ${word}`;

export default message;
// ./src/word.js
export const word = 'paraslee';

Then create bundler.js in the root directory and sketch out the functions we will need.

// ./bundler.js
// Module analysis capability: Analyze a single module and return analysis results
function moduleAnalyser(filepath) {
    return {};
}

// Graph building capability: recursively walk the modules, call moduleAnalyser on each one to get its information, and combine the results into a module graph
function moduleGraph(entry) {
    const moduleMuster = moduleAnalyser(entry);
    
    const graph = {};
    return graph;
}

// Code generation capability: Generate executable code from the resulting module map
function generateCode(entry) {
    const graph = moduleGraph(entry);
    const code = ' ';
    
    return code;
}

// bundleCode performs the bundling: it obtains the executable code and then writes that code out to a file
function bundleCode() {
    const code = generateCode('./src/index.js');
}

bundleCode();

Let's start coding, from the bottom up!

Module analysis

The first is the lowest level function: moduleAnalyser

Since the first module to be analysed is always the entry file, we can set the rest of the code aside while fleshing out moduleAnalyser and work on this part separately.

function moduleAnalyser(filePath) {}
moduleAnalyser('./src/index.js');

First, we need to read the file. Node's fs module lets us read the file's contents.

const fs = require('fs');
function moduleAnalyser(filePath) {
    const content = fs.readFileSync(filePath, 'utf-8');
}

Printing content shows the raw source of ./src/index.js as one string.

Second, we need to get all the dependencies of this module, namely ./message.js and ./word.js. There are two ways to get dependencies: 1. hand-written string-matching rules; 2. the Babel toolchain.

The first approach is thankless: it is error-prone and inefficient, so I'll use the second one.

Babel provides a parser that converts JS code into an AST. By traversing the AST, we can directly pick out the import statements.

npm i @babel/parser @babel/traverse
// ...
const parser = require('@babel/parser');
const traverse = require("@babel/traverse").default;

const moduleAnalyser = (filePath) => {
    const content = fs.readFileSync(filePath, 'utf-8');
    // @babel/parser converts the JS code into an AST
    const AST = parser.parse(content, {
        sourceType: 'module' // needed because the code uses ES modules
    });

    // @babel/traverse can walk the AST
    traverse(AST, {
        // Match nodes of type ImportDeclaration (import statements)
        ImportDeclaration: function(nodePath) {
            // Get the module path
            const relativePath = nodePath.node.source.value;
        }
    });
}

If we print relativePath to the console, we get ./message.js and ./word.js.

The AST is too long to paste here. If you are interested, you can print it out and see what it looks like.
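For reference, a single ImportDeclaration node in the parsed AST looks roughly like this (most fields and all location info omitted):

// Trimmed sketch of one ImportDeclaration node from @babel/parser
{
    type: 'ImportDeclaration',
    specifiers: [
        { type: 'ImportDefaultSpecifier', local: { type: 'Identifier', name: 'message' } }
    ],
    source: { type: 'StringLiteral', value: './message.js' }   // this is nodePath.node.source.value
}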


Third, once we have the dependency information, store it together with the module's code.

Here is the complete code for moduleAnalyser. If anything is puzzling after reading it, the numbered notes below explain each point.

npm i @babel/core
// ...
const path = require('path'); // needed for path.dirname / path.join below
const babel = require('@babel/core');

const moduleAnalyser = (filePath) => {
    const content = fs.readFileSync(filePath, 'utf-8');
    const AST = parser.parse(content, {
        sourceType: 'module'
    });

    // Save the directory of the file #1
    const dirname = path.dirname(filePath);
    // Store dependency information
    const dependencies = {};

    traverse(AST, {
        ImportDeclaration: function(nodePath) {
            const relativePath = nodePath.node.source.value;
            // Turn the module-relative path into a root-relative path #2
            const absolutePath = path.join(dirname, relativePath);
            // The replace fixes path separators on Windows; see #3 for the storage format
            dependencies[relativePath] = './' + absolutePath.replace(/\\/g, '/');
        }
    });

    // Compile the AST into runnable code with Babel #4
    const {code} = babel.transformFromAst(AST, null, {
        presets: ["@babel/preset-env"]
    });

    return {
        filePath,
        dependencies,
        code
    };
}

#1 Why get dirname?

./src is the root directory of our code: all dependencies and module files live under ./src (forget node_modules for now), so we need to keep this directory information; here dirname === './src'.

#2 Why convert the module-relative path to a root-relative path?

./src/index.js contains import message from './message.js', so relativePath holds the value ./message.js. That path is inconvenient when we later want to analyse message.js, but converting it to ./src/message.js lets us read the file directly with fs.
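To make this concrete, here is roughly what the path juggling in the code above does for the entry file (the Windows variant is shown in the comment):

// Assuming filePath === './src/index.js' and relativePath === './message.js':
path.dirname('./src/index.js');                   // './src'
path.join('./src', './message.js');               // 'src/message.js' ('src\\message.js' on Windows)
'./' + 'src\\message.js'.replace(/\\/g, '/');     // './src/message.js'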

#3 Why store dependency information like this

By storing key-value pairs, we keep both the module-relative path (the key) and the root-relative path (the value).

#4 Why compile code

Compilation converts ES modules to CommonJS, so that when we build the final code we can implement our own require() method to handle modularization.
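For example, @babel/preset-env turns the import in ./src/message.js into something along these lines (simplified; the full compiled output appears later in the article):

// before (ES module syntax)
import { word } from './word.js';

// after (CommonJS-style output, simplified)
var _word = require('./word.js');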


OK, now that you understand the moduleAnalyser method, let’s see what the output looks like
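Roughly, moduleAnalyser('./src/index.js') returns something like this (the compiled code string is abbreviated):

{
    filePath: './src/index.js',
    dependencies: {
        './message.js': './src/message.js',
        './word.js': './src/word.js'
    },
    code: '"use strict";\n\nvar _message = ... /* compiled CommonJS code of index.js */'
}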

Graph building

Now that we have implemented the module analysis capability, we need to recurse through all imported modules and store each module's analysis result in the graph.

// ...
const moduleAnalyser = (filePath) => {...}

const moduleGraph = (entry) => {
    // moduleMuster stores the modules that have been analysed;
    // the analysis result of the entry file is added straight away
    const moduleMuster = [moduleAnalyser(entry)];
    // cache records which modules have already been analysed, so no module is analysed twice
    const cache = {
        [moduleMuster[0].filePath]: 1
    };
    // Stores the real graph information
    const graph = {};

    // Walk all modules; newly discovered dependencies are pushed onto moduleMuster as we go
    for (let i = 0; i < moduleMuster.length; i++) {
        const {filePath} = moduleMuster[i];

        if (!graph[filePath]) {
            const {dependencies, code} = moduleMuster[i];
            graph[filePath] = {
                dependencies,
                code
            };

            for (let key in dependencies) {
                if (!cache[dependencies[key]]) {
                    moduleMuster.push(moduleAnalyser(dependencies[key]));
                    cache[dependencies[key]] = 1;
                }
            }
        }
    }
    return graph;
}

// Pass in the entry path to obtain the graph information
moduleGraph('./src/index.js');

The moduleGraph method is not hard to understand; the key part is how the traversal recurses: every module's dependencies get pushed onto moduleMuster, so the loop keeps going until every module has been analysed.

Take a look at the resulting graph
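For the three modules above, the graph comes out roughly like this (compiled code strings abbreviated):

{
    './src/index.js': {
        dependencies: { './message.js': './src/message.js', './word.js': './src/word.js' },
        code: '/* compiled index.js */'
    },
    './src/message.js': {
        dependencies: { './word.js': './src/word.js' },
        code: '/* compiled message.js */'
    },
    './src/word.js': {
        dependencies: {},
        code: '/* compiled word.js */'
    }
}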

Building code

Now for the most important part of all: generating executable code from the graph.

// ...
const moduleAnalyser = (filePath) => {...}
const moduleGraph = (entry) => {...}

const generateCode = (entry) => {
    // The bundled file is ultimately just a string of code (the browser parses that string
    // into an AST before executing it), so the graph has to be stringified before use
    const graph = JSON.stringify(moduleGraph(entry));
    return `(function(graph) {
        // The browser has no require method, so we implement our own
        function require(module) {
            // e.g. var _word = require('./word.js') in the compiled code
            // actually needs require('./src/word.js'),
            // so requireInEval is passed into the closure to do the conversion
            function requireInEval(relativePath) {
                return require(graph[module].dependencies[relativePath]);
            }
            // A submodule puts its contents on exports, so an empty object has to be created for it
            var exports = {};
            (function(code, require, exports) {
                eval(code);
            })(graph[module].code, requireInEval, exports);
            // Return the module's exports to the module that required it
            return exports;
        }
        // Manually require the entry module
        require('${entry}');
    })(${graph})`;
}

generateCode('./src/index.js');

At this point you are probably thinking: what on earth is this function? It looks like gibberish.

Don't panic, I'll explain it step by step.

Start by passing the stringified graph into the closure (the IIFE) so it can be used inside.

Then the entry module has to be required manually with require('${entry}'); note that ${entry} is wrapped in quotes so it is interpolated as a string.

So our require function starts out looking like this:

function require(module = './src/index.js') {}

Inside it, the module's code can be looked up via graph['./src/index.js'] and executed with eval:

function require(module = './src/index.js') {
    (function(code) {
        // Execute the code through eval
        eval(code);
    })(graph[module].code)
}

Now let's look at the kind of code eval executes; for example, here is the compiled code of message.js (a good example because it both imports and exports):

"use strict";
Object.defineProperty(exports, "__esModule", {
  value: true
});
exports["default"] = void 0;
var _word = require("./word.js");
var message = "hello ".concat(_word.word);
var _default = message;
exports["default"] = _default;

The compiled code contains var _word = require("./word.js"). Because of the scope chain, this call would reach the outer require method we wrote, but our require expects a path relative to the root directory, so we need a way to convert the path.

// require
function require(module = './src/index.js') {
    function requireInEval(relativePath = './word.js') {
        return require(graph[module].dependencies[relativePath]);
    }
    var exports = {};
    (function(code, require, exports) {
        eval(code);
    })(graph[module].code, requireInEval, exports)
    return exports;
}

requireInEval performs the path conversion and is passed into the closure as require; thanks to function scope, the require that the eval'd code calls is actually requireInEval.

When the eval'd code runs, the module writes its exported values onto an exports object, so we create an empty exports object for it to use.

Finally, require returns exports.


After that, the same steps repeat for each dependency, as sketched below.
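Assuming the three modules above, the calls unfold roughly like this when the bundle runs (note that this toy require has no module cache, so word.js is evaluated twice):

// require('./src/index.js')
//   -> eval(compiled index.js)
//      -> requireInEval('./message.js') -> require('./src/message.js')
//           -> eval(compiled message.js)
//              -> requireInEval('./word.js') -> require('./src/word.js')
//                   -> eval(compiled word.js)   // exports = { word: 'paraslee' }
//      -> requireInEval('./word.js') -> require('./src/word.js')   // evaluated again, no cache
//      -> console.log('hello paraslee')
//      -> console.log('paraslee')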

Generate the file

Now that the packaging process is almost complete, the code returned by the generateCode method is ready to run directly in the browser.

But to count as a bundler, it must at least write the bundled result to an output file.

const os = require('os'); // To read system information

// ...
const moduleAnalyser = (filePath) => {...}
const moduleGraph = (entry) => {...}
const generateCode = (entry) => {...}

function bundleCode(entry, output) {
    // Get the absolute path of the output directory
    const outPath = path.resolve(__dirname, output);
    const isWin = os.platform() === 'win32';  // Whether we are on Windows
    const isMac = os.platform() === 'darwin'; // Whether we are on macOS
    const code = generateCode(entry);

    // Read the output folder
    fs.readdir(outPath, function(err, files) {
        let hasDir = true;
        if (err) {
            // Create the folder if it does not exist yet (synchronously, so the write below cannot run before the directory exists)
            if (
                (isWin && err.errno === -4058)
                || (isMac && err.errno === -2)
            ) {
                fs.mkdirSync(outPath, {recursive: true});
                hasDir = false;
            } else {
                throw err;
            }
        }

        // Clear the contents of the folder
        if (hasDir) {
            files = fs.readdirSync(outPath);
            files.forEach((file) => {
                let curPath = outPath + (isWin ? '\\' : '/') + file;
                fs.unlinkSync(curPath);
            });
        }

        // Write the code to the file and report completion
        fs.writeFile(`${outPath}/main.js`, code, 'utf8', function(error) {
            if (error) {
                throw error;
            }

            console.log('Packing done!');
        });
    });
}

bundleCode('./src/index.js', 'dist');

Run node bundler.js and check out the result!

The end

At this point, a basic packaging tool is complete!

You can add your own bundler.config.js, put configuration options in it, and read them from bundler.js to make this feel more like a complete bundling tool; a hypothetical sketch follows.
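As a rough illustration only (the file name and fields here are hypothetical; nothing in the code above reads them yet):

// bundler.config.js (hypothetical)
module.exports = {
    entry: './src/index.js',
    output: 'dist'
};

// bundler.js could then start with something like:
// const config = require('./bundler.config.js');
// bundleCode(config.entry, config.output);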

This bundler is very simple and very basic. Webpack's and Rollup's internals are many times more complex because of the huge number of features and optimizations involved, but the core idea of bundling is basically the same.

Put the full code here: Github

If there are any mistakes, gaps, or places that could be improved or optimized in this article, please point them out in the comments and I will address them as soon as I see them.

Since you are here, why not leave a like 👍 and a GitHub star ⭐? It is the biggest support for my continued writing ❤️️

Please, this is really important to me