General operating process

The overall flow is illustrated with a PPT-style diagram. The code for each phase is explained and linked below, so read through to the end.

The build process

Dependency pre-bundling

Steps:

  1. Import path rewriting: replace bare module imports with paths the browser can load
Why: the browser cannot resolve bare ES module specifiers (they only work locally because of Node.js module resolution)
  2. CommonJS / UMD modules are converted into ES modules, using esbuild
  3. ES libraries made up of many internal files, such as lodash-es, are merged into a single module; otherwise hundreds of files would be requested at once
// before
import React from 'react'

// after
import __vite__cjsImport0_react from "/node_modules/.vite/react.js?v=432aac16"

Dependency pre-bundling – reading the source

Pre-bundling entry function

Github links: Build entry

// Vite 1.0 was built on a Koa architecture; Vite 2.0 uses a plain httpServer instead,
// making full use of existing tooling.
if (!middlewareMode && httpServer) {
  const listen = httpServer.listen.bind(httpServer)
  // Rewrite listen to make sure pre-bundling runs before the server starts
  httpServer.listen = (async (port, ...args) => {
    try {
      // Vite plugin container: invoke the buildStart hooks
      await container.buildStart({})
      await runOptimize() // dependency pre-bundling
    } catch (e) {}
    return listen(port, ...args)
  })
} else {
  // ...
}

Pre-bundling core function

Github links: precompiled entry

Steps:

  1. Compare against the cache to decide whether a rebuild is needed
  2. Scan the entry files to build a dependency map, e.g. deps: { "lodash-es": "node_modules/lodash-es" }
  3. Flatten the dependency map using es-module-lexer
  4. Call esbuild to bundle the deps and write the cache info to _metadata.json
// Pre-bundling main entry
export async function optimizeDeps(config, force = config.server.force, asCommand = false, newDeps?) {
  // ...
  /** Step 1: compare against the previous pre-bundle info (_metadata.json)
      and decide whether a rebuild is needed */
  if (prevData && prevData.hash === data.hash) {
    return prevData
  }
  // ...
  /** Step 2: scan the source code, or use the newDeps parameter, to collect dependencies */
  // newDeps carries dependency info discovered after the server has already started
  let deps
  if (!newDeps) {
    // scan the source code for dependencies with esbuild
    ;({ deps, missing } = await scanImports(config))
  } else {
    deps = newDeps
    missing = {}
  }
  // ...
  /** Step 3: flatten the nested source dependencies using es-module-lexer */
  await init
  for (const id in deps) {
    flatIdDeps[flatId] = deps[id]
    const entryContent = fs.readFileSync(deps[id], 'utf-8')
    const exportsData = parse(entryContent) as ExportsData
    // ...
  }
  // ...
  /** Step 4: merge the user's dependency optimization options, call esbuild,
      and write the result into cacheDir */
  // merge the user-specified esbuildOptions
  const { plugins = [], ...esbuildOptions } = config.optimizeDeps?.esbuildOptions ?? {}
  // call esbuild.build to bundle the deps
  const result = await build({
    entryPoints: Object.keys(flatIdDeps), // entries
    format: 'esm',                        // output as ESM
    external: config.optimizeDeps?.exclude, // excluded deps
    outdir: cacheDir,                     // write into the cache directory
    // ...
  })
  for (const id in deps) {
    const entry = deps[id]
    data.optimized[id] = {
      file: normalizePath(path.resolve(cacheDir, flattenId(id) + '.js')),
      src: entry,
    }
  }
  writeFile(dataPath, JSON.stringify(data, null, 2))
  return data
}

Static resource processing

The browser supports ES modules, but webpack has taught us to import anything we want; the browser, however, does not know how to import CSS or images.

HTML template parsing

Github links: entry

  1. HTML requests are handled by the createDevHtmlTransformFn function
  2. The built-in vite:css-post plugin turns CSS files into JS modules (see the CSS section below)
/** createServer */
server.transformIndexHtml = createDevHtmlTransformFn(server)

/** createDevHtmlTransformFn */
const [preHooks, postHooks] = resolveHtmlTransforms(server.config.plugins)

CSS

Official description: importing a .css file injects its content into the page via a <style> tag, with HMR support. The processed CSS can also be retrieved as a string via the module's default export.

Steps:

  1. The server receives the resource request URL
  2. Check whether it is a CSS file; if so, the built-in vite:css plugin transforms the style file
  3. Return the modified file to the browser
// What the browser receives for an imported style file:
import { createHotContext as __vite__createHotContext } from "/@vite/client";
import.meta.hot = __vite__createHotContext("/src/App.scss");
// HMR helpers
import { updateStyle, removeStyle } from "/@vite/client"
const id = ".../src/App.scss"
const css = "xxxx"
updateStyle(id, css)
export default css
import.meta.hot.prune(() => removeStyle(id))

/** The key helper: updateStyle injects the styles into document.head */
function updateStyle(id, content) {
  // ...
  // (the real client also has a constructed CSSStyleSheet branch; shown here is the <style> tag path)
  let style = document.createElement('style')
  style.setAttribute('type', 'text/css')
  style.innerHTML = content // content has already been processed by esbuild
  document.head.appendChild(style)
  // ...
}

Style files such as SCSS, less, and stylus

Quick jump to see source: portal

Compared with plain CSS there is one extra step: the syntax is first compiled by the corresponding third-party library (sass, less, etc.). The rest of the steps are exactly the same as CSS processing; the official docs even say:

There is no need to install Vite-specific plugins for them, but the corresponding pre-processor itself must be installed:

# .scss and .sass
npm install -D sass

JSX, TS, TSX

TS files are transpiled to JS with esbuild. Ever wonder why compilation stays so fast even though it happens on every resource request? What exactly is esbuild? (A minimal per-request transform sketch follows the list below.)

  1. esbuild is written in Go, a compiled language whose output the machine executes directly; JS is an interpreted language
  2. Go has painless concurrency and its threads can share memory; JS worker threads can only pass data by serializing and deserializing it

According to the esbuild author's tests, garbage collection seems to cut the parallel processing power of JavaScript worker threads in half, probably because one half of your CPU cores is busy collecting garbage for the other half.

  1. Compared with tsc, esbuild makes only 3 passes over the AST, reducing the memory footprint.
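As promised above, here is a minimal sketch (not Vite's actual plugin code) of the kind of per-request transform esbuild makes possible; the snippet and its content are made up, but the transform API and its options are esbuild's real ones:

// Minimal sketch: transpile a TS snippet to JS with esbuild's transform API.
// Note that esbuild only strips type annotations; it does not type-check.
const { transform } = require('esbuild')

async function transpileTs() {
  const source = `
    interface User { name: string }
    const greet = (u: User): string => 'hello ' + u.name
    export default greet
  `
  const result = await transform(source, {
    loader: 'ts',      // 'tsx' / 'jsx' are handled the same way
    format: 'esm',
    target: 'es2020',
    sourcemap: true,
  })
  console.log(result.code) // plain JS with the types stripped
}

transpileTs()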

Other resources

For example, images and JSON files can be imported directly (images resolve to URLs).
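A quick sketch of what this looks like from application code (the file names are made up; the behavior is as described in the Vite docs):

// Importing an image yields its resolved URL; importing JSON yields the parsed object.
import imgUrl from './assets/logo.png'
import pkg from '../package.json'
// Appending ?raw imports the file content as a string instead.
import shaderSource from './shader.glsl?raw'

document.querySelector('#logo').src = imgUrl
console.log(pkg.version, shaderSource.length)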

Plug-in system design

Vite can be extended with plug-ins, thanks to Rollup’s excellent plug-in interface design and some additional options unique to Vite. This means that Vite users can take advantage of the powerful ecosystem of Rollup plug-ins, while also being able to extend the development server and SSR capabilities as needed.

Note: in the diagram, blue marks the Vite-specific hooks (5 of them) and red marks the Rollup built-in hooks (7); they execute in order from left to right.

Personal take: Vite reuses Rollup's plugin system to enrich its own ecosystem. The official explanation is that Rollup is better suited to ES modules.

  1. Serving raw ES modules is not practical for a production build and may perform worse than a bundle, so a mainstream build tool such as Rollup or webpack is still needed.
  2. Rollup has simpler configuration and smaller output, and Vite's goal is to replace webpack, so choosing Rollup is no surprise.

Plugin system pseudocode

  1. resolveConfig: resolve the config, sort the plugins, and initialize the PluginContainer management module

  2. resolvePlugins: add the built-in Vite plugins, such as vite:css

  3. createPluginContainer: a unified entry point for the plugins; when a hook is triggered, the corresponding hook function of every (pre-sorted) plugin is executed in sequence

async function resolveConfig(inlineConfig, ...args) {
  let config = inlineConfig // the config object
  // ...
  // Plugins may be nested arrays, e.g. [[pluginA, pluginB], pluginC], so flatten them,
  // then filter by the `apply` field (serve | build)
  const rawUserPlugins = (config.plugins || []).flat().filter((p) => {
    return p && (!p.apply || p.apply === command)
  })
  // sortUserPlugins orders plugins by the `enforce` field (pre / normal / post)
  const [prePlugins, normalPlugins, postPlugins] = sortUserPlugins(rawUserPlugins)

  // Run each plugin's `config` hook so it can adjust the configuration
  const userPlugins = [...prePlugins, ...normalPlugins, ...postPlugins]
  for (const p of userPlugins) {
    if (p.config) {
      const res = await p.config(config, configEnv)
      if (res) {
        config = mergeConfig(config, res)
      }
    }
  }

  // createResolver builds an internal resolver for special scenarios,
  // such as resolving css @import, via createPluginContainer
  const createResolver = (options) => {
    return async (id, importer, aliasOnly) => {
      let container = await createPluginContainer(...args)
      return (await container.resolveId(id, importer))?.id
    }
  }

  // The final resolved config object
  const resolved = {
    // ... other fields
    plugins: userPlugins,
    createResolver,
  }

  // resolvePlugins adds the internal Vite plugins that provide the out-of-the-box features.
  // https://github.com/vitejs/vite/blob/5745a2e8072cb92d647662dc387e7f12b2841cab/packages/vite/src/node/plugins/index.ts#L18
  resolved.plugins = await resolvePlugins(resolved, prePlugins, normalPlugins, postPlugins)

  // Run each plugin's `configResolved` hook
  await Promise.all(userPlugins.map((p) => p.configResolved?.(resolved)))
  // The rest is omitted
}

Plugin container

export async function createPluginContainer({ plugins, rollupOptions }: ResolvedConfig) {
  const container = {
    // Note that `options` is an IIFE: a plugin's options hook can either be
    // the configuration object directly or a function that returns one
    options: await (async () => {
      for (const plugin of plugins) {
        if (!plugin.options) continue
        options = (await plugin.options(...args)) || options
      }
    })(),

    async buildStart() {
      await Promise.all(
        plugins.map((plugin) => {
          if (plugin.buildStart) {
            return plugin.buildStart.call(new Context(plugin), ...args)
          }
        })
      )
    },

    // resolveId
    async resolveId(...args) {
      const ctx = new Context()
      for (const plugin of plugins) {
        if (!plugin.resolveId) continue
        const result = await plugin.resolveId.call(ctx, ...args)
        if (!result) continue
      }
    },

    // load
    async load(...args) {
      for (const plugin of plugins) {
        if (!plugin.load) continue
        const result = await plugin.load.call(ctx, ...args)
      }
    },

    // transform
    async transform(...args) {
      const ctx = new TransformContext(...args)
      for (const plugin of plugins) {
        result = await plugin.transform.call(ctx, ...args)
      }
    },

    watchChange(...args) {
      const ctx = new Context()
      for (const plugin of plugins) {
        if (!plugin.watchChange) continue
        plugin.watchChange.call(ctx, ...args)
      }
    },

    // buildEnd && closeBundle
    async close() {
      if (closed) return
      const ctx = new Context()
      await Promise.all(plugins.map((p) => p.buildEnd && p.buildEnd.call(ctx)))
      await Promise.all(plugins.map((p) => p.closeBundle && p.closeBundle.call(ctx)))
      closed = true
    }
  }
  return container
}
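To connect the container back to everyday usage, here is a minimal sketch of a user plugin (the plugin and virtual module names are hypothetical) that exercises the hooks the container dispatches, plus the Vite-specific enforce/apply fields handled in resolveConfig:

// vite-plugin-banner.js — an illustrative plugin, not an official one
export default function bannerPlugin() {
  const virtualId = 'virtual:banner'
  const resolvedId = '\0' + virtualId // Rollup convention for virtual modules

  return {
    name: 'banner',
    enforce: 'pre',  // consumed by sortUserPlugins (pre / normal / post)
    apply: 'serve',  // filtered in resolveConfig against the current command

    // `config` hook: the return value is merged in via mergeConfig
    config() {
      return { define: { __BUILD_TIME__: JSON.stringify(Date.now()) } }
    },

    // `resolveId` / `load`: serve a virtual module through the container
    resolveId(id) {
      if (id === virtualId) return resolvedId
    },
    load(id) {
      if (id === resolvedId) return `export const banner = 'hello from a plugin'`
    },

    // `transform`: invoked by PluginContainer.transform for every module
    transform(code, id) {
      if (id.endsWith('.js')) {
        return { code: '/* banner */\n' + code, map: null }
      }
    },
  }
}

Registering it is just a matter of adding bannerPlugin() to the plugins array in vite.config.js; resolveConfig then flattens, filters, and sorts it together with the built-in plugins as shown above.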

HMR

Definitions

  1. Hot reload: listen for file changes, recompile the files, and refresh the entire page
  2. Hot module replacement (HMR): only the changed module is updated, without a full page reload

HMR process

  • Create a WebSocket server. (createWebSocketServer function)
  • Serve a WS client script (/@vite/client) and inject it into the HTML. (devHtmlHook)

In the served HTML you can see the injected /@vite/client script.

  • The server watches for file changes (via chokidar) and sends a WebSocket message telling the client the change type, the changed file, and so on.
  • The client receives the message and, based on its content, decides whether to refresh the whole page or reload the changed file, executing the registered HMR hook functions along the way. (See updateStyle in the CSS section above.)
/** createServer */
if (!middlewareMode || middlewareMode === 'html') {
  // transform index.html
  middlewares.use(indexHtmlMiddleware(server))
  // ...
}

/** File location: src/node/server/middlewares/indexHtml.ts */
const devHtmlHook: IndexHtmlTransformHook = async (html, { path: htmlPath, server, originalUrl }) => {
  // handle some cache flag information ...
  return {
    html,
    tags: [
      {
        tag: 'script',
        attrs: {
          type: 'module',
          // CLIENT_PUBLIC_PATH is /@vite/client
          src: path.posix.join(base, CLIENT_PUBLIC_PATH)
        },
        injectTo: 'head-prepend'
      }
    ]
  }
}

Client response logic

Source link

  1. Based on the message type, decide whether to refresh the page or update the file
  2. For asset files such as CSS loaded via <link>, update the link's href
  3. For files such as JS, run the hooks first, including the old module's dispose callback (disposer), then load the new file via dynamic import
// @vite/client
socket.addEventListener('message', async ({ data }) => {
  handleMessage(JSON.parse(data))
})

// CSS loaded via <link>: bump the href with a timestamp query
if (css) {
  const el = ([].slice.call(document.querySelectorAll(`link`))).find((e) =>
    e.href.includes(path)
  )
  if (el) {
    const newPath = `${path}${path.includes('?') ? '&' : '?'}t=${timestamp}`
    el.href = new URL(newPath, el.href).href
  }
}

// JS modules: run the disposers, then re-import the changed module
await Promise.all(
  Array.from(modulesToUpdate).map(async (dep) => {
    const disposer = disposeMap.get(dep)
    // undo the effects of the old module
    if (disposer) await disposer(dataMap.get(dep))
    const [path, query] = dep.split(`?`)
    try {
      const newMod = await import(
        /* @vite-ignore */
        base + path.slice(1) + `?import&t=${timestamp}${query ? `&${query}` : ''}`
      )
      moduleMap.set(dep, newMod)
    } catch (e) {
      warnFailedFetch(e, dep)
    }
  })
)
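For completeness, this is roughly how application code registers those hooks on the other side of the socket; a minimal sketch using Vite's client HMR API (the module itself is made up):

// counter.js — an illustrative module using Vite's client HMR API
export let count = import.meta.hot?.data.count ?? 0

export function increment() {
  count++
}

if (import.meta.hot) {
  // Runs after the changed module has been re-imported (see moduleMap above)
  import.meta.hot.accept((newModule) => {
    console.log('updated, count is now', newModule.count)
  })

  // Registered in the disposeMap seen above: undo the old module's side effects
  // before the new version is loaded, and stash state for it if needed
  import.meta.hot.dispose((data) => {
    data.count = count
  })
}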

HMR compared with webpack

  1. Building the dependency graph: webpack records it in the browser at runtime, Vite records it on the server at compile time. This follows from the build strategy: bundled resources are loaded in full, while Vite's are loaded on demand.
  2. Module update: webpack replaces the locally cached module directly (i.e. removes it), whereas Vite simply requests the new module content and uses the new module.
  3. Webpack compiles ahead of requests; Vite defers compilation and compiles on demand.
  4. Webpack uses JSONP to fetch newly compiled modules; Vite dynamically loads the changed module with a native ESM import and reads its exports.

Source summary

  • Limited manpower goes where it matters most.
    • For example, Vite successfully bridges various existing front-end tools (sass, esbuild, es-module-lexer).
    • Vite 2.0 dropped Koa in favor of Node's http server, and lets the browser handle caching of resources such as pre-bundled dependencies by setting max-age=31536000, immutable (see the sketch after this list).
  • Things like static resource processing are handled essentially the same way as in webpack; good ideas are absorbed wholesale.
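As mentioned in the list above, the caching point boils down to a single response header; a rough sketch with a connect-style server (illustrative only, not Vite's actual middleware):

// Sketch: pre-bundled deps are served with a long-lived cache header.
// The ?v=hash query in the rewritten import URL changes whenever deps are
// re-bundled, which is what makes "forever" caching safe.
const connect = require('connect')
const http = require('http')

const app = connect()

app.use((req, res, next) => {
  if (req.url.startsWith('/node_modules/.vite/')) {
    res.setHeader('Cache-Control', 'max-age=31536000,immutable')
  }
  next()
})

// ...static file serving middleware would follow here...

http.createServer(app).listen(3000)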

Current state

Performance

Historically, merging requests was an important performance optimization because of TCP slow start and the browser's limited concurrency; hence bundlers such as webpack. With HTTP/2, is browser-native modularity now solved once and for all?

Data source: Performance recommendations

Setup: 300 modules, comparing load time before and after compression.

Although the native ESM is now widely supported, it is still inefficient to publish an unpackaged ESM in a production environment (even with HTTP/2) because of the additional network round-trips that nested imports cause. In order to get the best load performance in a production environment, it is best to do tree-shaking, lazy loading, and chunk splitting (for better caching). – Quoted from Vite website

Local test

Importing 50 separate TS files (only 20 lines each) takes 91% longer than starting the scaffolding demo (over HTTP/1.1).

Can HTTP/2 speed things up?

Is the local HTTP/1.1 transport slow because of the browser's limit of 6 TCP connections per origin? (The topic of this article is Vite, so the HTTP history is omitted and only the conclusion is kept.)

Bottom line: local development does get faster. However, the author's current project is a large multi-SPA application whose first screen needs files numbering in the thousands, so even though each request is fast, the actual first-screen resource loading still takes close to a minute, which is a little unbearable for developers.

So what is the effect in actual production? I have not used Vite in a production environment, so I can only generalize from other articles.

  1. Current browser and server implementations do not handle highly modular web apps well; it is difficult to transfer only the files the client has not already cached.

Web servers' and browsers' implementations are not currently optimized towards highly modularized web app use cases. It's hard to only push the resources that the user doesn't already have cached. — JavaScript modules over HTTP/2

  2. Because of TCP congestion control, it is hard for single-connection HTTP/2 to be much faster than HTTP/1.1. At low speeds and high packet-loss rates (don't worry, usually around 0.01% on wired networks) it can even perform worse than HTTP/1.1 with its six connections, which came as a bit of a surprise. HTTP/2 as currently deployed in browsers and servers is generally as fast or only slightly faster than HTTP/1.1 in most cases.

HTTP/2 as it is currently deployed in browsers and servers is typically as fast or slightly faster than HTTP/1.1 in most conditions. This is in my opinion partly because websites got better at optimizing for HTTP/2 – HOL blocking in HTTP/2 over TCP

Other tools

Snowpack

  • The idea is basically the same as Vite's, and Vite's pre-bundling was itself inspired by Snowpack v1
  • The website lists more differences, but for me the main one is that Vite integrates Rollup as its production bundler, with everything working out of the box, whereas Snowpack leaves the production build for developers to assemble themselves, which feels a bit fragmented (this impression is from Snowpack v2, October 2020, so it may be dated)

Prepack

  • A tool for making JavaScript code run faster.
  • It has a full JS interpreter built in and executes the code in a separate environment (prepack JS environment rather than the current Node environment).
  • On top of that environment, it models the interaction between the host environment and the Prepack execution environment

@web/dev-server

  • Not covered in detail here; if you're interested, follow the link above

Final words

  1. Will Vite completely replace webpack?

On this question, given the current pace of infrastructure improvements, the author is not too optimistic. Small and medium-sized projects can certainly be developed quickly with Vite. But for large applications served as native JS modules, the network round-trip time caused by the huge number of first-screen resource requests is something developers cannot accept (measured: a huge project started on Vite needs to request 3500+ files, and even with HTTP/2 the first screen still takes about 80 seconds to load). And once the development-server speed advantage is gone for large applications, what other advantages does Vite have?

  1. Vite's dev startup speed for anything short of a megalithic application is undeniable; riding on esbuild and native ES modules, it leaves webpack behind. For production builds, however, which are bundled with Rollup, the gap is not that large.
  2. HTTP/2 and even HTTP/3 are actively solving transport efficiency at the protocol level, bandwidth keeps growing at the hardware level, and ES modules will become more and more mainstream. I hope these new technologies bring even more possibilities.

Reference documentation

es-module-lexer

Vite2.0 is officially released, so why should WebPack?

Vite official Chinese document

prepack

A Future Without Webpack

Performance recommendations

snowPack

prepack-gentel-intro-1

Head-of-Line Blocking in QUIC and HTTP/3: The Details

Where is ESBuild fast