
Snowpack’s code is very lightweight. This article introduces Snowpack’s features from an implementation perspective, and along the way looks at how a young build tool built around native JavaScript modules implements some of the capabilities offered by the “older” build tools.

1. Meet Snowpack

Recently, the popularity of Vite has grown alongside Vue 3’s various public appearances, and the popularity of Snowpack, a similar tool, has gradually increased as well. Snowpack currently has nearly 10,000 stars on GitHub.

Back in the first half of 2019, I came across the article A Future Without Webpack one day. It introduced me to the pika/snowpack project (known as pika/web at the time).

The key points of the article are as follows:

Today (in 2019), it’s entirely possible to ditch bundlers and use the browser’s native JavaScript module support directly. This claim rests on three considerations:

  1. Compatibility is acceptable: most major browser versions now support JavaScript modules natively (except, of course, IE, as always).
  2. Performance is less of an issue: one of the main reasons for bundling was HTTP/1.1, under which we merged requests to optimize performance; now that HTTP/2 is widespread, this performance concern is not as significant as it used to be.
  3. The necessity of bundling: the main purposes of bundlers are to handle modularity and to merge requests, and the two points above largely address both problems; meanwhile, bundlers have grown more and more complex, so the author naturally questions whether they are still necessary.

I have always felt that bundlers like webpack growing from their “starting point” into full build tools is not the optimal solution; it is really an accidental, transitional outcome. So I agreed with the views expressed in the project at the time, the most memorable of which is the following:

In 2019, you should use a bundler because you want to, not because you need to.

Meanwhile, I also believe that bundler ≠ build tool ≠ front-end engineering.

2. A look at the early Snowpack

After reading that article (around June or July 2019), I immediately went to check out the project on GitHub out of curiosity. At the time, the project was around version 0.4.x, with very simple source code and functionality.

The core goal of the initial version of Snowpack was to move away from bundling business code and to use the browser’s native JavaScript module capability directly.

Therefore, in terms of its processing pipeline, business-code modules basically only need to be copied (as ESM) to the output directory, with their import paths rewritten from source paths to output paths.

node_modules is handled differently: the dependencies in package.json are bundled at package granularity. Snowpack iterates over the dependency list, takes each package’s entry in node_modules as a bundle entry, uses Rollup to generate a corresponding ESM file, places it in the web_modules directory, and finally rewrites the import paths in the source code. Packages from node_modules can then be loaded through native JavaScript modules:

- import { createElement, Component } from "preact";
- import htm from "htm";
+ import { createElement, Component } from "/web_modules/preact.js";
+ import htm from "/web_modules/htm.js";
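
To make the node_modules handling above concrete, here is a rough sketch of bundling a single dependency into an ESM file under web_modules with Rollup’s JS API. This is my own illustration rather than Snowpack’s actual code, and it assumes rollup and rollup-plugin-node-resolve are installed; the package name 'preact' is just an example:

const rollup = require('rollup');
const resolve = require('rollup-plugin-node-resolve'); // resolves bare imports from node_modules

async function bundleDependency(pkgName) {
  // Use the package's entry in node_modules as the bundle entry.
  const bundle = await rollup.rollup({
    input: pkgName,
    plugins: [resolve()],
  });
  // Emit a single native ES module into web_modules/.
  await bundle.write({
    file: `web_modules/${pkgName}.js`,
    format: 'esm',
  });
}

bundleDependency('preact');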

Looking at the v0.4.0 source, its initial functionality was indeed very simple, even a little crude: it lacked many features required by modern front-end development and was clearly not something you could use in production.

Intuitively, it lacked the following abilities:

  1. Importing CSS/images/…: thanks to webpack’s everything-is-a-module concept combined with component-based development, the “import anything” style has become ingrained in developers’ minds. Lacking the ability to depend on and load content like CSS would be its Achilles heel.
  2. Syntax transpilation: Snowpack (called pika/web at the time), which aimed to be a build tool, couldn’t compile TypeScript, JSX, or other non-standard syntax. You could of course use a completely separate tool to handle syntax, but isn’t integrating that exactly what build tools are supposed to do?
  3. HMR: it may not be strictly necessary, but as the saying goes, “it’s easy to go from frugality to extravagance, and hard to go back.”
  4. Performance: although it claims that with HTTP/2 loading JavaScript modules directly doesn’t perform worse, it is doubtful that this had been verified in practice.
  5. Environment variables: a small feature, but one used in most projects I’ve worked on. It helps with things like automatically removing debugging tools from production code and reporting analytics events to different services depending on the environment. You really do need something like this.

3. The Evolution of Snowpack

Back in the first half of 2020, with Vue 3 constantly in the spotlight, another related project, Vite, was gaining traction, and Snowpack, mentioned in its introduction, suddenly attracted much more attention and discussion. I was only casually familiar with Pika at that point, and when I curiously opened the Snowpack project homepage, I found that the project I had first learned about a year earlier (pika/web) had been upgraded to pika/snowpack v2. The source code was no longer the single, simple index.ts of before: besides the core code, it also contained many official plugins.

My first instinct when reading the README was that the problems I had thought of probably already had solutions.

With a learner’s mindset I went through it again and found that this was indeed the case. Curiosity drove me to explore how it solves these problems.

This article was written on 2020-06-18; the source code discussed is the Snowpack v2 release current at that time.

3.1. Importing CSS

The problem of importing CSS generalizes to loading non-JavaScript resources, including images, JSON files, text, and so on.

Let’s start with CSS.

import './index.css';

Browsers do not currently support this syntax. So Snowpack takes an approach similar to webpack’s: it turns the CSS file into a JS module that injects the styles. If you are familiar with webpack, you know that if you only process CSS in a loader, without emitting a separate CSS file (that’s why mini-css-extract-plugin exists), what you load is a JS module, which then inserts the CSS text into the page as the content of a style tag via the DOM API.

To do this, Snowpack has a simple template method of its own that generates JS modules which inject CSS styles into the page. The following code performs the style injection:

const code = '.test { height: 100px }';
const styleEl = document.createElement("style");
const codeEl = document.createTextNode(code);
styleEl.type = 'text/css';
styleEl.appendChild(codeEl);
document.head.appendChild(styleEl);

As you can see, everything is constant except the right-hand value on the first line, so it is easy to generate a JS module that meets the requirement:

const jsContent = `
  const code = ${JSON.stringify(code)};
  const styleEl = document.createElement("style");
  const codeEl = document.createTextNode(code);
  styleEl.type = 'text/css';
  styleEl.appendChild(codeEl);
  document.head.appendChild(styleEl);
`;

fs.writeFileSync(filename, jsContent);

Snowpack’s actual implementation has a bit more code than what we wrote above, but the extra parts have nothing to do with style injection, so I’ll come back to them later.

This is how CSS is loaded with the power of native JavaScript modules: the content of the CSS file is saved into a JS variable, and JS then calls the DOM API to inject that CSS into the page. The CSS import in the source code is replaced with index.css.proxy.js:

- import './index.css';
+ import './index.css.proxy.js';

The term proxy will come up many times: Snowpack calls these generated intermediate JavaScript modules proxies, because they make it possible to import non-JS resources in a modular way. The approach is almost identical to webpack’s.

3.2. Importing images

Another very typical resource in today’s front-end development scenario is images.

import avatar from './avatar.png';

function render() {
    return (
        <div class="user">
            <img src={avatar} />
        </div>
    );
}

The code above is written in a way that is common to many projects. So what about Snowpack?

There is nothing new under the sun. Snowpack, like webpack, ultimately makes the imported avatar variable resolve to the URL of that static resource.

Take the official React template provided by Snowpack as an example to see how to import image resources.

npx create-snowpack-app snowpack-test --template @snowpack/app-template-react

After the initialization template runs, you can see the difference between the source code and the built code as follows:

- import React, { useState } from 'react';
- import logo from './logo.svg';
- import './App.css';

+ import React, { useState } from '/web_modules/react.js';
+ import logo from './logo.svg.proxy.js';
+ import './App.css.proxy.js';

Similar to CSS, a JS module logo.svg.proxy.js is generated for SVG. The contents of the module are:

// logo.svg.proxy.js
export default "/_dist_/logo.svg";

The pattern is the same as webpack’s. Using the build command as an example, let’s look at how Snowpack handles this.

The first step is to copy the static file (logo.svg) from the source code to the distribution directory:

allFiles = glob.sync(`**/*`, {...});
const allBuildNeededFiles: string[] = [];
await Promise.all(
    allFiles.map(async (f) => {
        f = path.resolve(f); // this is necessary since glob.sync() returns paths with / on windows. path.resolve() will switch them to the native path separator.
        return fs.copyFile(f, outPath);
    }),
);

Then we come to a key method call in Snowpack: transformEsmImports. This method transforms the module paths imported in the source JS, for example replacing imports from node_modules with web_modules paths. Here the import path of the SVG file also gets .proxy.js appended:

code = await transformEsmImports(code, (spec) => {
    ……
    if (spec.startsWith('./') || spec.startsWith('/') || spec.startsWith('../')) {
        const ext = path.extname(spec).substr(1);
        if (!ext) {
            ……
        }
        const extToReplace = srcFileExtensionMapping[ext];
        if (extToReplace) {
            ……
        }
        if (spec.endsWith('.module.css')) {
            ……
        } else if (!isBundled && (extToReplace || ext) !== 'js') {
            const resolvedUrl = path.resolve(path.dirname(outPath), spec);
            allProxiedFiles.add(resolvedUrl);
            spec = spec + '.proxy.js';
        }
        return spec;
    }
    ……
});

At this point we have the SVG file and the rewritten import in the source code (import logo from './logo.svg.proxy.js'); all that is left is to generate the proxy file itself. That is also very simple:

for (const proxiedFileLoc of allProxiedFiles) {
    const proxiedCode = await fs.readFile(proxiedFileLoc, {encoding: 'utf8'});
    const proxiedExt = path.extname(proxiedFileLoc);
    const proxiedUrl = proxiedFileLoc.substr(buildDirectoryLoc.length);
    const proxyCode = wrapEsmProxyResponse({
      url: proxiedUrl,
      code: proxiedCode,
      ext: proxiedExt,
      config,
    });
    const proxyFileLoc = proxiedFileLoc + '.proxy.js';
    await fs.writeFile(proxyFileLoc, proxyCode, {encoding: 'utf8'});
 }

wrapEsmProxyResponse is the method that generates these proxy modules. Currently it only gives special handling to a few types such as CSS and JSON; for the other types (including images) the handling is trivial: it simply exports the URL:

return `export default ${JSON.stringify(url)};`;

So both CSS and images are converted into JS modules, because the browser module system does not support them natively. The Snowpack and webpack implementations are very similar here.

3.3. HMR (Hot Update)

If you looked closely at the wrapEsmProxyResponse method just now, you’ll notice that the CSS “module” has the following lines in addition to the style-injection code:

import * as __SNOWPACK_HMR_API__ from '/${buildOptions.metaDir}/hmr.js';
import.meta.hot = __SNOWPACK_HMR_API__.createHotContext(import.meta.url);
import.meta.hot.accept();
import.meta.hot.dispose(() => {
  document.head.removeChild(styleEl);
});

This code implements hot updates, also known as HMR (Hot Module Replacement). It allows the application to replace an updated module on the front end automatically, without reloading the entire page. This is very friendly for developing single-page applications that rely on accumulated state.

import.meta is an object containing a module’s meta information, such as the module’s own URL. HMR doesn’t have much to do with import.meta per se; Snowpack simply uses it as a place to hang its HMR-related objects, so don’t get too hung up on it.
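
For instance, in a plain native module you could log it yourself (just an illustrative line, with a made-up dev-server URL):

console.log(import.meta.url); // e.g. "http://localhost:8080/_dist_/index.js"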

Look closely at the HMR code above. Doesn’t the API feel familiar? Compare it with the following snippet:

  import _ from 'lodash';
  import printMe from './print.js';

  function component() {
    const element = document.createElement('div');
    const btn = document.createElement('button');

    element.innerHTML = _.join(['Hello', 'webpack'], ' ');
    btn.innerHTML = 'Click me and check the console!';
    btn.onclick = printMe;

    element.appendChild(btn);

    return element;
  }

  document.body.appendChild(component());
+
+ if (module.hot) {
+   module.hot.accept('./print.js', function() {
+     console.log('Accepting the updated printMe module!');
+     printMe();
+   })
+ }

The code above is taken from the HMR guide on webpack’s official website. You can see that Snowpack stands on the shoulders of a “giant”: it follows the webpack API, and the underlying principle is very similar. There is plenty of webpack HMR documentation online, so I won’t go into detail here. The basic implementation works as follows (a simplified client-side sketch follows the list):

  • Snowpack builds the source code and watches it for changes;
  • a WebSocket connection is established between the Snowpack dev server and the front-end application;
  • when the source changes, Snowpack rebuilds and, once finished, pushes the module information (ID/URL) to the front-end application over the WebSocket;
  • the front-end application listens for this message and loads the new module based on that module information;
  • at the same time, the callbacks registered earlier by the module are triggered: the accept and dispose methods passed in the code above.
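
To make this flow more tangible, here is a heavily simplified sketch of a client-side HMR runtime following the steps above. It is not Snowpack’s actual hmr.js; the WebSocket address, the message shape, and the registry structure are all assumptions for illustration:

const registry = new Map(); // module URL (pathname) -> registered accept/dispose callbacks

export function createHotContext(fullUrl) {
  const handlers = { accept: null, dispose: null };
  registry.set(new URL(fullUrl).pathname, handlers);
  return {
    accept(callback = () => {}) { handlers.accept = callback; },
    dispose(callback) { handlers.dispose = callback; },
  };
}

const socket = new WebSocket('ws://localhost:8080'); // assumed dev-server address
socket.addEventListener('message', async ({ data }) => {
  const { url } = JSON.parse(data); // assumed message shape: { url: '/_dist_/index.css.proxy.js' }
  const handlers = registry.get(url);
  if (!handlers || !handlers.accept) {
    window.location.reload(); // module not hot-accepted: fall back to a full reload
    return;
  }
  if (handlers.dispose) handlers.dispose(); // let the old module clean up (e.g. remove its style tag)
  // Re-import with a cache-busting query so the browser fetches the new version.
  const newModule = await import(url + `?mtime=${Date.now()}`);
  handlers.accept({ module: newModule });
});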

So this piece of code, generated in wrapEsmProxyResponse,

import.meta.hot.dispose(() => {
  document.head.removeChild(styleEl);
});

means that when the CSS is updated and replaced, the previously injected style needs to be removed. The order of execution is: fetch the new module, finish loading, run the old module’s dispose callback, then run the accept callback.

The core front-end HMR code in Snowpack lives in assets/hmr.js. The code is quite brief, and it’s worth noting that unlike webpack, which adds script tags to the page to load new modules, Snowpack loads new modules directly with native dynamic import():

const [module, ...depModules] = await Promise.all([
  import(id + `?mtime=${updateID}`),
  ...deps.map((d) => import(d + `?mtime=${updateID}`)),
]);

It also follows the philosophy of using the browser’s native JavaScript Modules.


Let’s pause for a breath. From what we’ve seen so far, these technical solutions are very similar to webpack’s implementation. Snowpack borrowed these best practices from front-end development, and its idea was clear from the start: to give front-end developers a build tool that doesn’t require a bundler.

A large part of what webpack does is optimization, of both build speed and build output. One aspect is how to split chunks: webpack v3 had the CommonsChunkPlugin, and v4 replaced it with the splitChunks configuration. The declarative configuration is more “intelligent” than splitting chunks by hand. Merging and splitting are meant to reduce duplicated code while improving cache utilization. But if there is no bundling in the first place, these two problems naturally no longer exist. And if you load ESM directly, the problems that tree-shaking solves are also mitigated to some extent (though not completely cured).
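
For reference, the declarative configuration mentioned above looks roughly like this in a webpack 4+ project (a typical minimal example, unrelated to Snowpack itself):

// webpack.config.js
module.exports = {
  // ...
  optimization: {
    splitChunks: {
      chunks: 'all', // let webpack decide how shared modules are merged and split
    },
  },
};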

Combined with the performance and compatibility points mentioned earlier, if those two issues really are resolved, why would we use a tool with a complex internal process and tens of thousands of lines of code to solve a problem that no longer exists?

Ok, let’s get back to the implementation of the other snowpack features.


3.4. Environment variables

It is a very common requirement to use the environment to determine whether to turn off debugging.

if (process.env.NODE_ENV === 'production') {
  disableDebug();
}

Environment variables are also implemented in Snowpack. According to the usage documentation, you can access them in a module via import.meta.env, like this:

if (import.meta.env.NODE_ENV === 'production') {
  disableDebug();
}

So how are environment variables injected?

Take the build flow as an example. In the code-generation stage, new code is produced by calling the wrapImportMeta method:

code = wrapImportMeta({code, env: true, hmr: false, config});

So what’s different about code after it has been processed by wrapImportMeta? The answer can be found in the source code:

export function wrapImportMeta({
  code,
  hmr,
  env,
  config: {buildOptions},
}: {
  code: string;
  hmr: boolean;
  env: boolean;
  config: SnowpackConfig;
}) {
  if (!code.includes('import.meta')) {
    return code;
  }
  return (
    (hmr
      ? `import * as __SNOWPACK_HMR__ from '/${buildOptions.metaDir}/hmr.js';\nimport.meta.hot = __SNOWPACK_HMR__.createHotContext(import.meta.url);\n`
      : ``) +
    (env
      ? `import __SNOWPACK_ENV__ from '/${buildOptions.metaDir}/env.js';\nimport.meta.env = __SNOWPACK_ENV__;\n`
      : ``) +
    '\n' +
    code
  );
}

For code that contains import.meta, Snowpack injects an import of the env.js module and assigns the imported value to import.meta.env. The built code therefore looks like this:

+ import __SNOWPACK_ENV__ from '/__snowpack__/env.js';
+ import.meta.env = __SNOWPACK_ENV__;

if (import.meta.env.NODE_ENV === 'production') {
    disableDebug();
}

In the development environment, HMR code is also added to env.js. The content of env.js itself is very simple: it takes the environment key/value pairs as an object and exports it with export default.

By default, env.js contains only MODE and NODE_ENV. You can use @snowpack/plugin-dotenv to read .env files directly.
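
So a generated env.js might look something like this (an illustrative sketch rather than Snowpack’s exact output; the values shown are assumptions for a dev build):

// /__snowpack__/env.js
export default {
  MODE: 'development',
  NODE_ENV: 'development',
};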

3.5. CSS Modules support

Modularity of CSS has always been a challenge, and one of its important purposes is to isolate CSS styles. Common solutions include:

  • use a naming convention such as BEM
  • use the CSS Modules functionality provided by webpack
  • use a CSS-in-JS solution such as styled-components
  • use Shadow DOM

My previous articles have covered these kinds of solutions in detail. Snowpack also provides CSS Modules functionality similar to webpack’s.

import styles from './index.module.css' 

function render() {
    return <div className={styles.main}>Hello world!</div>;
}

To enable CSS Modules in Snowpack, the filename must end with .module.css. Only then is the file treated specially:

if (spec.endsWith('.module.css')) {
    const resolvedUrl = path.resolve(path.dirname(outPath), spec);
    allCssModules.add(resolvedUrl);
    spec = spec.replace('.module.css', '.css.module.js');
}

All CSS Modules are wrapped with the wrapCssModuleResponse method, which writes the generated token map (original class names mapped to unique class names) into the module and exports it as the default export:

_cssModuleLoader = _cssModuleLoader || new (require('css-modules-loader-core'))();
const {injectableSource, exportTokens} = await _cssModuleLoader.load(code, url, undefined, () => {
    throw new Error('Imports in CSS Modules are not yet supported.');
});
return `……
    export let code = ${JSON.stringify(injectableSource)};
    let json = ${JSON.stringify(exportTokens)};
    export default json;
……`;

I’ve left out the HMR and style-injection code here and kept only the CSS Modules part. You can see that it actually uses css-modules-loader-core to provide the core token-generation capability of CSS Modules.

Using the React template as an example again, rename App.css to App.module.css and the generated module gains the following:

+ let json = {"App":"_dist_App_module__App","App-logo":"_dist_App_module__App-logo","App-logo-spin":"_dist_App_module__App-logo-spin" ,"App-header":"_dist_App_module__App-header","App-link":"_dist_App_module__App-link"};
+ export default json;

In the exported default object, the keys are the class names from the CSS source, and the values are the actual class names after the build.

3.6. Performance issues

Remember Yahoo’s 35 rules for performance optimization? They mention merging files to reduce the number of requests. This is due both to TCP slow start and to the browser’s limit on concurrent connections. As demand for rich front-end applications grew, front-end pages were no longer a handful of hand-included script tags. The lack of native JS modularity in browsers added fuel to the fire, and then, with the help of npm, bundling tools emerged. Webpack is a product of that era.

With the popularity of HTTP/2 in recent years, the arrival of 5G, and the continued progress of native JS modules in browsers, the accepted “truth” of merging requests may be worth a second look. Last year, Philip Walton’s post Using Native JavaScript Modules in Production Today recommended trying native JavaScript modules in production.

That article mentions that in earlier tests, unbundled code performed far worse than bundled code; however, those experiments had some bias, and with recent optimizations the unbundled performance has improved significantly. The recommended practice is basically the same as Snowpack’s handling of node_modules, while keeping the total number of modules loaded under roughly 100 and the dependency depth within about 5 levels.

Meanwhile, because of the shape of our business, the product line I work on has been through a build-tool migration, and it uses a similar strategy for module handling: business-code modules are not merged, only modules in node_modules are bundled, and everything is served over HTTP/2. We don’t use native module loading, but the module granularity is similar to Snowpack’s and to what the article above describes. According to performance data after launch, performance did not degrade. Of course, since dependencies are not loaded through native modules, it is not an exact comparison, but it is a useful reference.

3.7. JSX/TypeScript/Vue/Less…

For non-standard JavaScript and CSS, we generally use tools such as Babel and Less, together with the corresponding webpack loaders. The original version of Snowpack couldn’t handle these syntaxes at all; instead, it recommended compiling the code before handing it to Snowpack.

In the new version, Snowpack has built-in handling for JSX and TypeScript files. For TypeScript, compilation is done with tsc, the official TypeScript compiler.

JSX is compiled via @snowpack/plugin-babel, which is really just a thin wrapper around @babel/core, so JSX can be compiled using the local Babel configuration.

const babel = require("@babel/core");

module.exports = function plugin(config, options) {
  return {
    defaultBuildScript: "build:js,jsx,ts,tsx",
    async build({ contents, filePath, fileContents }) {
      const result = await babel.transformAsync(contents || fileContents, {
        filename: filePath,
        cwd: process.cwd(),
        ast: false,
      });
      return { result: result.code };
    },
  };
};

As you can see, the core is the call to babel.transformAsync. Projects generated with the @snowpack/app-template-react-typescript template depend on a package called @snowpack/app-scripts-react, which uses @snowpack/plugin-babel together with the following babel.config.json:

{
  "presets": [["@babel/preset-react"], "@babel/preset-typescript"],
  "plugins": ["@babel/plugin-syntax-import-meta"]
}

For Vue projects, Snowpack also provides a corresponding plugin, @snowpack/plugin-vue, to cover the build process. If you look inside the plugin, its core is to compile Vue components with @vue/compiler-sfc.
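
As a rough sketch of that idea (not the plugin’s real code; the real one also handles styles, scoped CSS, and HMR), compiling a single-file component with @vue/compiler-sfc looks roughly like this:

const { parse, compileTemplate } = require('@vue/compiler-sfc');

function compileSFC(source, filename) {
  // Split the .vue file into its <template>/<script>/<style> blocks.
  const { descriptor } = parse(source, { filename });
  const script = descriptor.script ? descriptor.script.content : 'export default {}';
  // Compile the template block into a render function.
  const { code: renderCode } = compileTemplate({
    source: descriptor.template.content,
    filename,
    id: filename, // used for scoping; simplified here
  });
  // A real plugin would attach the render function to the component options;
  // here the pieces are just concatenated to show the shape of the output.
  return `${script}\n${renderCode}`;
}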

In addition, for Sass (similar to Less), Snowpack recommends adding the corresponding script command:

"scripts": {
  "run:sass": "sass src/css:public/css --no-source-map"."run:sass::watch": "$1 --watch"
}

So Sass compilation actually just runs the sass command directly; Snowpack simply executes the instruction according to its convention-based script syntax. Much like Gulp/Grunt, what you define in scripts is a simple “workflow”.

Summing up the handling of TS, JSX, Vue, and Sass syntax: Snowpack does not implement much itself. It mainly relies on “bridging” existing tools and integrating them into its own system. Webpack’s loaders follow the same idea; babel-loader, for example, is the bridge between webpack and Babel. In the end it is a question of where the boundaries of responsibility lie. If the goal is to be a build tool for front-end development, you don’t have to reimplement these sub-build steps that already exist, but you do need to incorporate them into your system.

Thanks to the boom in front-end build tools in recent years, Snowpack can find a variety of existing tools to leverage and implement its build process in a lightweight way.

4. One last chat

Snowpack is fast: full builds are fast, and incremental builds are fast. Because it doesn’t bundle, it doesn’t need to build a huge dependency graph like webpack and run all sorts of merge-and-split calculations over it. Snowpack’s incremental build is basically “change a file, process that file”, with modules loosely coupled.

Another big pain point with webpack is the handling of “external” dependencies, meaning:

  • module A depends on module B at run time;
  • but you don’t want to build B during A’s build phase.

At this point B is an “external” dependency. The typical solution used to be externals, and of course UMD/AMD packages could be loaded with a front-end loader; or, going a step further, webpack 5’s Module Federation. A typical scenario for this requirement is micro-frontends: if every front-end micro-app has to be built together, builds inevitably get slower as the project grows, hence the need for independent builds with dynamic loading at run time.

For a bundler, import 'b.js' means that module B gets bundled by default, which is why we need a lot of “reverse” configuration to disable this default behavior and provide the expected runtime scheme. With native modules, import '/dist/b.js' doesn’t need to resolve module B at build time; the coupling only happens at run time. It is inherently build-independent and runtime-dependent. Of course, Snowpack still throws errors at build time for genuinely missing dependency modules, but the above is essentially achievable, with much less difficulty than with a bundler, and more intuitively for users.
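
The “reverse” configuration mentioned above is, for example, webpack’s externals (a minimal illustration, not taken from any specific project):

// webpack.config.js
module.exports = {
  // ...
  externals: {
    // `import b from 'b'` is left out of the bundle and resolved at run time
    // (here against a global variable named B) instead of being packaged in.
    b: 'B',
  },
};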

So is Snowpack bundleless? We can look at it from a few angles:

  • its handling of business code is bundleless;
  • node_modules is currently still bundled;
  • it still provides plugins such as @snowpack/plugin-webpack and @snowpack/plugin-parcel so you can bundle for production; combined with the module/nomodule technique, this gives much better resilience to compatibility issues, somewhat in the spirit of progressive enhancement (a minimal config sketch follows this list).
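
The config sketch referenced in the last bullet might look like this; the option names are from memory of the Snowpack v2 docs, so treat the exact shape as an assumption:

// snowpack.config.js
module.exports = {
  plugins: ['@snowpack/plugin-webpack'], // bundle the final build output with webpack
};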

Is Snowpack the next generation of build tool?

In 2019, you should use a bundler because you want to, not because you need to.