— ByteFE: Yang Jian
background
Lynx (a cross-platform framework developed in-house) has a compilation toolchain quite different from the traditional Web toolchain: dynamic styles and dynamic scripts are not supported, bundleless development and code splitting are essentially off the table, the module system is based on JSON instead of JS, and there is no browser environment. On top of that there are requirements such as real-time compilation for the Web (the build system), dynamic compilation on the Web (WebIDE), real-time compilation on the server (server-side compile-and-deliver), and multi-version switching. We therefore needed to build a Bundler that works both locally and in the browser and can be flexibly customized per business: a Universal Bundler. While developing it we ran into a number of problems; in the end we built a new Universal Bundler on top of esbuild that solved most of them.
What is a bundler
A Bundler's job is to package a set of modular code into one or more files. Common Bundlers include webpack, rollup, esbuild, etc. The module organization here usually refers to JS-based module systems, but there are other ways of organizing modules (WASM, the JSON usingComponents of mini-programs, CSS and HTML imports, etc.). The output may also be more than one file (code splitting produces multiple JS files, or separate JS, CSS, and HTML files, etc.). Most Bundlers share similar core principles but emphasize different features:
- webpack: emphasizes support for Web application development, especially the built-in HMR support; its plugin system is powerful and it has the best compatibility with the various module systems (AMD, CJS, UMD, ESM, etc., which is arguably too much compatibility, cuts both ways, and leads to webpack-specific programming); it also has a rich ecosystem. The downsides are that the output is not clean, ESM output is not supported, plugin development is difficult, and it is not well suited to library development.
- rollup: the output is very clean and a variety of output formats are supported, making it well suited to library development; the plugin API is friendly, but application development relies heavily on plugins.
- esbuild: emphasizes performance, with built-in support for CSS, images, React, TypeScript, etc. It compiles fast (roughly 100x faster than webpack and rollup). The downsides are that the plugin system is relatively simple and the ecosystem is not as mature as webpack's or rollup's.
How does bundler work
A Bundler's implementation resembles that of most compilers in that it uses a three-phase design. For comparison:
- LLVM: Each language is compiled to LLVM IR through the compiler front end, then various optimizations are made based on LLVM IR, and then different CPU instruction set codes are generated based on the optimized LLVM IR according to different processor architectures.
- bundler: compile each module to build a Module Graph, perform tree shaking, code splitting, and minification on the Module Graph, and finally generate JS code in different formats from the optimized Module Graph according to the specified output format.
LLVM vs. Bundler
This also means that many traditional LLVM compiler optimization strategies are in fact applicable to a Bundler, and esbuild is an example of taking this to the extreme. Because rollup has the most minimal functionality and architecture, let's use it as an example of how a Bundler works. rollup's bundling process is divided into two steps, rollup and generate, corresponding to the Bundler front end and Bundler back end respectively.
- src/index.js

```js
import lib from './lib';
console.log('lib:', lib);
```

- src/lib.js

```js
const answer = 42;
export default answer;
```
Start by generating the Module Graph
```js
const rollup = require('rollup');
const util = require('util');

async function main() {
  const bundle = await rollup.rollup({
    input: ['./src/index.js'],
  });
  console.log(util.inspect(bundle.cache.modules, { colors: true, depth: null }));
}

main();
```
The output is as follows
```js
[
  {
    code: 'const answer = 42;\nexport default answer;\n',
    ast: xxx,
    dependencies: [],
    id: '/Users/admin/github/neo/examples/rollup-demo/src/lib.js',
    ...
  },
  {
    ast: xxx,
    code: "import lib from './lib';\n\nconsole.log('lib:', lib);\n",
    dependencies: [ '/Users/admin/github/neo/examples/rollup-demo/src/lib.js' ],
    id: '/Users/admin/github/neo/examples/rollup-demo/src/index.js',
    ...
  }
]
```
The result already contains each module's parsed AST as well as the dependencies between modules. Once the Module Graph is built, rollup can generate artifacts from it according to the user's configuration.
```js
const result = await bundle.generate({
  format: 'cjs',
});
console.log('result:', result);
```
The generated content is as follows
```js
{
  exports: [],
  facadeModuleId: '/Users/admin/github/neo/examples/rollup-demo/src/index.js',
  isDynamicEntry: false,
  isEntry: true,
  type: 'chunk',
  code: "'use strict';\n\nconst answer = 42;\n\nconsole.log('lib:', answer);\n",
  dynamicImports: [],
  fileName: 'index.js',
  ...
}
```
So the basic flow of a JavaScript Bundler is not complicated, but a powerful plugin system is indispensable if it is to be used in production and support complex, diverse business requirements.
Plug-in system
Most Bundlers offer a plugin system that lets users customize the Bundler's logic. rollup plugins, for example, are divided into input plugins and output plugins: input plugins hook into the process of generating the Module Graph from the input, while output plugins hook into the process of generating artifacts from the Module Graph. We will focus on input plugins, which are the core of a Bundler's plugin system, using esbuild's plugin system as the example. The core flow of the input phase is to generate the module graph, and one of its core tasks is to determine the source content of each module; input plugins provide a way to customize how modules resolve and load their source. Most input plugin systems provide two core hooks:
- onResolve (resolveId in rollup, the module factory's resolver hooks in webpack): determines the actual module address from a module id.
- onLoad (load in rollup, loader in webpack): loads the module content from the module address.
esbuild only provides the load hook, and you can do the transform inside load. rollup additionally provides a transform hook, so transform and load are explicitly distinguished (though nothing stops you from transforming in load), while webpack delegates the transform to loaders. These two hooks look small, but together they are very powerful. One of the best features of the esbuild plugin system compared to the rollup and webpack plugin systems is its support for virtual modules. Let's look briefly at a few examples of what plugins can do.
loader
One of the most common requirements in webpack is to use various loaders to handle non-JS resources such as images and CSS. Let's look at how to implement a simple less loader as an esbuild plugin.
```ts
import fs from 'fs';
import { Plugin } from 'esbuild';
import { render } from 'less';

export const less = (): Plugin => {
  return {
    name: 'less',
    setup(build) {
      build.onLoad({ filter: /\.less$/ }, async (args) => {
        const content = await fs.promises.readFile(args.path);
        const result = await render(content.toString());
        return {
          contents: result.css,
          loader: 'css',
        };
      });
    },
  };
};
```
We only need to filter the file types we want to handle in onLoad, read the file contents, perform our custom transform, and hand the result to esbuild's built-in CSS loader. Most loader functionality can be implemented through the onLoad hook.
sourcemap && cache && error handle
The example above is relatively simple. A more mature plugin needs to consider sourcemap mapping after the transform, a custom cache to avoid repeated load overhead, and error handling. Let's use the svelte example to see how to handle sourcemaps, caching, and errors.
```js
let sveltePlugin = {
  name: 'svelte',
  setup(build) {
    let svelte = require('svelte/compiler')
    let path = require('path')
    let fs = require('fs')
    // Use an LRU cache (any implementation, e.g. the lru-cache package)
    // to keep memory from growing endlessly during watch mode
    let cache = new LRUCache()

    build.onLoad({ filter: /\.svelte$/ }, async (args) => {
      // Load the file from the file system
      let source = await fs.promises.readFile(args.path, 'utf8')
      let filename = path.relative(process.cwd(), args.path)

      // Cache hit: the input has not changed, skip the whole transform
      let cached = cache.get(args.path)
      if (cached && cached.input === source) {
        return cached.output
      }

      // Convert a message from Svelte's format to esbuild's format
      let convertMessage = ({ message, start, end }) => {
        let location
        if (start && end) {
          let lineText = source.split(/\r\n|\r|\n/g)[start.line - 1]
          let lineEnd = start.line === end.line ? end.column : lineText.length
          location = {
            file: filename,
            line: start.line,
            column: start.column,
            length: lineEnd - start.column,
            lineText,
          }
        }
        return { text: message, location }
      }

      // Convert Svelte syntax to JavaScript
      try {
        let { js, warnings } = svelte.compile(source, { filename })
        // Inline the sourcemap so esbuild can merge it into the final map
        let contents = js.code + '//# sourceMappingURL=' + js.map.toUrl()
        // Report warnings and errors to esbuild, which surfaces them to the user
        let output = { contents, warnings: warnings.map(convertMessage) }
        cache.set(args.path, { input: source, output })
        return output
      } catch (e) {
        return { errors: [convertMessage(e)] }
      }
    })
  },
}

require('esbuild').build({
  entryPoints: ['app.js'],
  bundle: true,
  outfile: 'out.js',
  plugins: [sveltePlugin],
}).catch(() => process.exit(1))
```
With this we have a fairly complete svelte loader.
virtual module
One of esbuild's biggest improvements over rollup is its support for virtual modules. A Bundler handles two kinds of modules: those whose path corresponds to a real file on disk, and those whose path does not correspond to an actual file and whose content must instead be generated from the path itself, i.e. virtual modules. Virtual modules have rich application scenarios.
glob import
A common scenario: when developing a REPL like rollupjs.org/repl/, you often need to load example code into memfs and build it in the browser. If an example contains many files, importing them one by one is cumbersome, so we can support glob imports for a directory such as:

```
examples/
  index.html
  index.tsx
  index.css
```
```js
import examples from 'glob:./examples/**/*';
import { vol } from 'memfs';

// Mount the local examples directory into memfs
vol.fromJSON(examples, '/');
```
Similar functionality can be implemented via Vite or babel-plugin-macros. Let's see how to implement it with esbuild. It is actually very simple; we only need to:
- Resolve the custom path in onResolve and pass metadata to onLoad via pluginData and path. We also assign a namespace (the namespace prevents the normal file-loading logic from picking up the returned path and lets us filter subsequent loads).
- In onLoad, filter on that namespace for the metadata returned by onResolve, generate the module content from the metadata with our custom logic, and hand the generated content to esbuild's built-in loaders for processing.
```ts
import fs from 'fs';
import path from 'path';
import glob from 'glob';
import { Plugin } from 'esbuild';

const globReg = /^glob:/;

export const pluginGlob = (): Plugin => {
  return {
    name: 'glob',
    setup(build) {
      build.onResolve({ filter: globReg }, (args) => {
        return {
          path: path.resolve(args.resolveDir, args.path.replace(globReg, '')),
          namespace: 'glob',
          pluginData: {
            resolveDir: args.resolveDir,
          },
        };
      });
      build.onLoad({ filter: /.*/, namespace: 'glob' }, async (args) => {
        const matchPath: string[] = await new Promise((resolve, reject) => {
          glob(
            args.path,
            {
              cwd: args.pluginData.resolveDir,
            },
            (err, data) => {
              if (err) {
                reject(err);
              } else {
                resolve(data);
              }
            }
          );
        });
        const result: Record<string, string> = {};
        await Promise.all(
          matchPath.map(async (x) => {
            const contents = await fs.promises.readFile(x);
            result[path.basename(x)] = contents.toString();
          })
        );
        return {
          contents: JSON.stringify(result),
          loader: 'json',
        };
      });
    },
  };
};
```
For performance reasons esbuild filters on filter and namespace itself: the filter regex is a Go regex and the namespace is a string, so esbuild can do all the filtering on the Go side without ever calling into JS, minimizing the overhead of Go-to-JS calls. You can still set the filter to /.*/ and do all the filtering in JS; in practice the extra overhead is acceptable.
A virtual module can not only fetch content from disk, it can also compute content directly in memory, and even treat module imports as function calls.
memory virtual module
The env module here is calculated entirely from environment variables
```js
let envPlugin = {
  name: 'env',
  setup(build) {
    // Intercept import paths called "env" so esbuild doesn't attempt
    // to map them to a file system location. Tag them with the "env-ns"
    // namespace to reserve them for this plugin.
    build.onResolve({ filter: /^env$/ }, args => ({
      path: args.path,
      namespace: 'env-ns',
    }))

    // Load paths tagged with the "env-ns" namespace and behave as if
    // they point to a JSON file containing the environment variables.
    build.onLoad({ filter: /.*/, namespace: 'env-ns' }, () => ({
      contents: JSON.stringify(process.env),
      loader: 'json',
    }))
  },
}

// Usage: `env` is a virtual module computed entirely from environment variables
// import { NODE_ENV } from 'env'
```
function virtual module
Module names can even be used as function calls, enabling compile-time computation and even recursive "function" calls.
```js
build.onResolve({ filter: /^fib\((\d+)\)/ }, args => {
  return { path: args.path, namespace: 'fib' }
})

build.onLoad({ filter: /^fib\((\d+)\)/, namespace: 'fib' }, args => {
  let match = /^fib\((\d+)\)/.exec(args.path)
  let n = +match[1]
  let contents =
    n < 2
      ? `export default ${n}`
      : `import n1 from 'fib(${n - 1}) ${args.path}'
         import n2 from 'fib(${n - 2}) ${args.path}'
         export default n1 + n2`
  return { contents }
})

// Usage: the compiler computes the result of fib(5) at build time
// import fib5 from 'fib(5)'
```
stream import
With streaming imports, `npm run dev` can work without downloading node_modules at all.
```ts
import { Plugin } from 'esbuild';
import { fetchPkg } from './http';

export const UnpkgNamespace = 'unpkg';
export const UnpkgHost = 'https://unpkg.com/';

export const pluginUnpkg = (): Plugin => {
  const cache: Record<string, { url: string; content: string }> = {};
  return {
    name: 'unpkg',
    setup(build) {
      build.onLoad({ namespace: UnpkgNamespace, filter: /.*/ }, async (args) => {
        const pathUrl = new URL(args.path, args.pluginData.parentUrl).toString();
        let value = cache[pathUrl];
        if (!value) {
          value = await fetchPkg(pathUrl);
        }
        cache[pathUrl] = value;
        return {
          contents: value.content,
          pluginData: {
            parentUrl: value.url,
          },
        };
      });
      build.onResolve({ namespace: UnpkgNamespace, filter: /.*/ }, async (args) => {
        return {
          namespace: UnpkgNamespace,
          path: args.path,
          pluginData: args.pluginData,
        };
      });
    },
  };
};

// Usage: `import react from 'react'` is automatically turned into
// `import react from 'https://unpkg.com/react'`
```
These examples show that esbuild's virtual modules are very flexible and powerful. Once we adopt virtual modules, the way modules are loaded is no longer tied to the local file system, and we can choose different namespaces for different scenarios:
- Local development: load all files through the local file namespace.
- Streaming node_modules: similar to the streaming imports of deno and Snowpack, business files go through the file namespace while node_modules go through the unpkg namespace; this suits large monorepo projects that would otherwise require installing all of node_modules.
- Real-time compilation on the Web (performance- and network-sensitive): third-party libraries are fixed while business code changes, so both local files and node_modules live in memfs.
- Dynamic compilation on the Web: in intranet WebIDE scenarios neither third-party libraries nor business code is fixed, so memfs is used for local files and unpkg for node_modules.
We found that a Universal Bundler built on virtual modules is very flexible and can adapt to all these business scenarios without adding cost to any single one of them.
universal bundler
Most Bundlers run locally on Node by default, so the biggest challenge in building a universal Bundler is getting it to run in the browser. Unlike a local Bundler, a browser Bundler faces a number of limitations, so let's look at the issues that need to be addressed when porting a Bundler to the browser.
rollup
First we need to choose a suitable Bundler to build on. rollup is an excellent Bundler with many desirable properties:
- Tree shaking support is very good, and tree shaking for CJS is supported too
- Rich plug-in hooks with very flexible customization capabilities
- Supports running in a browser
- Supports multiple output formats (ESM, CJS, UMD, SystemJS)
Precisely because of these properties, many of the newest bundler/bundleless tools are either based on rollup or compatible with the rollup plugin system; Vite and WMR are typical examples. It has to be said that writing a rollup plugin is much more pleasant than writing a webpack plugin. Our early Universal Bundler was in fact based on rollup, but we ran into a number of problems with it, summarized below.
Compatibility issues with CommonJS
rollup itself only supports ESM, so handling CommonJS requires rollup-plugin-commonjs to first convert CJS to ESM. Unfortunately, CommonJS/ES Module interop is one of the thorniest problems around (see the related Babel/rollup/TypeScript issues and sokra.github. ): there is an inherent semantic gap between the two systems. Converting ESM to CommonJS is not a big problem (apart from being careful with default exports), but converting CJS to ESM has far more issues. rollup-plugin-commonjs has done a lot of work on CJS-to-ESM, yet many edge cases remain, and rollup is in fact rewriting the core of that plugin (github.com/rollup/plug… ). Some typical problems follow.
Circular reference problem
Because CommonJS exports are not live bindings, converting CommonJS modules that reference each other circularly to ESM is problematic.
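A minimal sketch of the gap, using two hypothetical modules (not from our codebase):

```js
// a.js (CommonJS)
exports.value = 1;
const b = require('./b');
exports.value = 2;
console.log('b captured:', b.snapshot); // 1, because b copied the value before it was updated

// b.js (CommonJS)
const a = require('./a'); // circular: a is only partially evaluated at this point
exports.snapshot = a.value; // a plain copy, not a live binding

// With real ESM, `import { value } from './a.js'` is a live binding that always
// reflects the current value, so a naive CJS-to-ESM conversion can change the
// observable behavior of circular dependencies.
```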
Hoist problem for dynamic require
A synchronous dynamic require can hardly be converted to ESM at all. If it is converted to a top-level import, import semantics force the Bundler to hoist the required module to the top of the file, which violates the lazy, conditional semantics of a synchronous require. Dynamic require is therefore also difficult to handle.
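A small illustration of the problem (hypothetical modules, a sketch rather than a real case):

```js
// CommonJS: the require only runs if the branch is taken,
// and only after setup() has already executed.
setup();
let impl;
if (process.env.USE_NATIVE === '1') {
  impl = require('./native');
} else {
  impl = require('./js-fallback');
}

// A naive ESM conversion must hoist both imports:
//   import implNative from './native';
//   import implFallback from './js-fallback';
// Now both modules load unconditionally and *before* setup() runs,
// changing evaluation order and possibly pulling in a module that cannot
// even be evaluated in the current environment.
```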
Hybrid CJS and ESM
Because there is no standard for mixing ESM and CJS semantics in a single module, webpack supports mixing CJS and ESM in one module (by down-leveling everything to the webpack runtime), while rollup does not support this behavior (the latest version can enable it conditionally; I haven't tried it).
Performance issues
Because CJS-to-ESM conversion is so complex, the conversion algorithm is very heavy, and rollup's compile performance drops sharply once the business contains many CJS modules. That may not matter much when compiling a library, but the compile speed becomes unacceptable when developing a large application.
Switch from CJS to ESM in the browser
On the other hand, rollup itself is easy to port to memfs, but rollup-plugin-commonjs is hard to port to the Web. So in the early days, when we built a Web Bundler on rollup, we could only rely on online CJS-to-ESM services such as Skypack to do the conversion; but most of those services are themselves implemented with rollup-plugin-commonjs, so the fundamental problems remain, there is extra network overhead, and CJS modules that are not in node_modules are hard to handle. Fortunately esbuild takes a different approach from rollup: it introduces a very small runtime and achieves CJS compatibility with a Node-like module wrapper (similar in spirit to webpack's approach, whose runtime however is not as concise). By giving up CJS tree shaking entirely, it is far more compatible with CJS, and it lets a Web Bundler support CJS directly without any extra plugins.
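A greatly simplified sketch of the module-wrapper idea; this is illustrative only, not esbuild's actual runtime code:

```js
// Each CJS module is wrapped in a factory that gets its own module/exports
// objects and is evaluated lazily, the first time it is required.
var __commonJS = (fn) => {
  let mod;
  return () => {
    if (!mod) {
      mod = { exports: {} };
      fn(mod.exports, mod);
    }
    return mod.exports;
  };
};

// lib.js (CommonJS) becomes:
var require_lib = __commonJS((exports, module) => {
  module.exports = { answer: 42 };
});

// index.js that does `const lib = require('./lib')` becomes:
var lib = require_lib();
console.log(lib.answer);
```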
Support for Virtual Modules
rollup's virtual module support is a hack that relies on prefixing paths with '\0', which is intrusive to paths and unfriendly to FFI scenarios (C++ strings treat '\0' as a terminator). esbuild's namespace mechanism handles this much more cleanly.
filesystem
A local Bundler accesses the local file system, but the browser has no local file system. The way to handle file access is to make the Bundler independent of any concrete fs and route all file access through a configurable fs; rollupjs.org/repl/ works this way. In the browser we then simply swap the module loading logic from fs to memfs, using onLoad hooks to replace the file loading logic.
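A minimal sketch of the idea with esbuild and memfs; the in-memory project paths are hypothetical, and the toy resolver ignores the real resolution rules discussed in the next section:

```js
const esbuild = require('esbuild');
const { fs: memfs, vol } = require('memfs');
const path = require('path');

// A tiny in-memory "project" (illustrative paths only)
vol.fromJSON({
  '/src/index.js': "import lib from './lib';\nconsole.log('lib:', lib);",
  '/src/lib.js': 'export default 42;',
});

const memfsPlugin = {
  name: 'memfs',
  setup(build) {
    // Route every path into the "memfs" namespace so esbuild's default
    // file-system loader never touches the real disk.
    build.onResolve({ filter: /.*/ }, (args) => {
      const resolved = args.path.startsWith('.')
        ? path.posix.join(path.posix.dirname(args.importer), args.path)
        : args.path;
      // Greatly simplified: real code would also handle extensions, index
      // files, directories, node_modules, etc.
      const withExt = resolved.endsWith('.js') ? resolved : resolved + '.js';
      return { path: withExt, namespace: 'memfs' };
    });
    // Load file contents from memfs instead of fs
    build.onLoad({ filter: /.*/, namespace: 'memfs' }, (args) => ({
      contents: memfs.readFileSync(args.path, 'utf8'),
      loader: 'js',
      resolveDir: path.posix.dirname(args.path),
    }));
  },
};

esbuild.build({
  entryPoints: ['/src/index.js'],
  bundle: true,
  outfile: 'out.js',
  write: false,
  plugins: [memfsPlugin],
}).then((result) => console.log(result.outputFiles[0].text));
```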
node module resolution
When we switch file access to memfs, an immediate problem is how to map the ids used in require and import to actual file paths. The algorithm Node uses to map an id to a real file is module resolution. Implementing it is complicated and needs to consider the following cases (see the detailed algorithm: tech.bytedance.net/articles/69… ):
- file, index, and directory cases
- multiple file extensions: .js, .json, addons
- differences between the ESM and CJS loaders
- main field handling
- conditional exports handling
- exports subpaths
- NODE_PATH handling
- recursive lookup through parent directories
- symlink handling
Besides the complexity of Node module resolution itself, we may also need to support the extra features that webpack added and the community has come to rely on, such as main/module field fallback, alias support, TS and other extensions, and compatibility with package managers such as yarn, pnpm, and npm. Implementing this algorithm from scratch is costly, and Node's module resolution algorithm keeps evolving. webpack's enhanced-resolve module implements essentially all of the above and supports a custom fs, so it can easily be ported to memfs.
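A rough sketch of wiring enhanced-resolve to memfs; the options and paths shown are illustrative, assuming an in-memory volume already populated with the project and its node_modules:

```js
const { ResolverFactory, CachedInputFileSystem } = require('enhanced-resolve');
const { fs: memfs } = require('memfs');

// A resolver that walks memfs instead of the real disk
const resolver = ResolverFactory.createResolver({
  fileSystem: new CachedInputFileSystem(memfs, 4000),
  extensions: ['.ts', '.tsx', '.js', '.json'],
  mainFields: ['browser', 'module', 'main'],
});

// Resolve the id 'lodash/debounce' as if it were imported from /app/src
resolver.resolve({}, '/app/src', 'lodash/debounce', {}, (err, result) => {
  if (err) throw err;
  console.log(result); // e.g. /app/node_modules/lodash/debounce.js
});
```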
Personally I think Node's resolution algorithm is over-engineered and inefficient (all the fallback logic implies a lot of IO), which is part of why hoisting, the root of much evil, became so popular. Perhaps bare imports with import maps, or the deno/golang approach, points to a better path.
main field
The main field is also a relatively messy issue, mainly because there is no unified specification and community libraries do not fully follow the conventions that do exist; it mostly concerns how packages are distributed. Only the main field is officially supported by Node.js; module, browser, esnext and the other fields are merely conventions between bundlers and third-party libraries. The open questions include (a sketch of such a package.json follows this list):

- How to configure entries for CJS vs ESM, ESNext vs ES5, Node vs browser, dev vs prod
- Whether the code behind module/main should be ES5 or ESNext (this determines whether node_modules has to go through the transformer)
- Whether the code behind module should point to the browser implementation or the Node implementation (this determines the relative priority of main and module in a Node bundler versus a browser bundler)
- How to distribute different code for Node and for the browser, etc.
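For illustration, a hypothetical package.json that tries to serve all of these consumers at once; the field names are common community conventions, not an official standard:

```json
{
  "name": "some-lib",
  "main": "dist/index.cjs.js",
  "module": "dist/index.esm.js",
  "browser": "dist/index.browser.js",
  "exports": {
    ".": {
      "import": "./dist/index.esm.js",
      "require": "./dist/index.cjs.js"
    }
  }
}
```

Only main and exports mean anything to Node itself; module and browser are bundler conventions, and different bundlers rank them differently, which is exactly where the ambiguity in the list above comes from.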
unpkg
Next we need to deal with node_modules. There are two approaches: one is to fully mount node_modules into memfs and use enhanced-resolve to resolve modules inside memfs; the other is to rewrite node_modules ids into unpkg requests. The first suits a fixed set of third-party modules (otherwise memfs cannot hold an unbounded node_modules), and memfs access is much faster than network requests, so it fits build-system scenarios well. The second suits an open-ended set of third-party modules with no hard real-time compilation requirement, which is a better match for CodeSandbox-style WebIDE scenarios where each business freely picks its own npm modules.
Shim and polyfill
Another problem for a Web Bundler is that most community modules are built around Node: they rely heavily on Node native APIs that browsers do not support, so running them directly in the browser breaks. There are two cases:

- The module does not genuinely depend on the Node runtime, and the APIs it uses can be simulated in the browser. browserify provides a large number of browser polyfills for Node APIs, such as path-browserify, stream-browserify, etc.
- The Node-only code genuinely does not need to run in the browser, and bundling the Node implementation would only bloat the browser bundle or cause errors. In that case the Node-related modules need to be externalized.

As a tip: configuring externals in some Bundlers is troublesome, or you may not be able to modify the Bundler configuration at all. Simply wrapping require in eval makes most Bundlers skip bundling that module, e.g. eval('require')('os').
Polyfill vs. environment sniffing: spear and shield
On the one hand, polyfills try to smooth out the differences between Node and the browser as much as possible; on the other hand, environment sniffing tries to tell the browser and Node apart as much as possible. Using both at the same time inevitably requires various hacks.
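A tiny illustration of the conflict (a sketch, not code from any particular library):

```js
// A common (naive) environment sniff found in many libraries:
const isNode = typeof process !== 'undefined';

// Once the bundler injects a `process` polyfill so other Node-oriented code
// can run in the browser, this check starts reporting "Node" inside a browser
// bundle and the library takes the wrong code path, so either the sniff or
// the polyfill has to be hacked around.
if (isNode) {
  // e.g. tries to require('fs') here and fails in the browser
}
```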
webassembly
Some modules in our business depend on C++. Locally the C++ can be compiled into a static library and called through FFI, but in the browser it has to be compiled to WebAssembly. Most wasm binaries are not small: esbuild's wasm is about 8MB, and the wasm compiled from our own static library is about 3MB, which has a big impact on overall bundle size. We can therefore borrow the idea of code splitting and split the wasm: separate the hot code that is likely to be needed on first access from the cold code that is unlikely to be needed, to reduce the size downloaded on first load.
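A sketch of what that split can look like at the loading layer; the /wasm/hot.wasm and /wasm/cold.wasm paths are hypothetical:

```js
// Load the small "hot" module eagerly; it covers the common first-visit paths.
const hot = await WebAssembly.instantiateStreaming(fetch('/wasm/hot.wasm'));

// Load the large "cold" module lazily, only when a rarely used feature needs it.
let coldExports;
async function ensureCold() {
  if (!coldExports) {
    const cold = await WebAssembly.instantiateStreaming(fetch('/wasm/cold.wasm'));
    coldExports = cold.instance.exports;
  }
  return coldExports;
}
```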
Where can we use Esbuild
esbuild has three orthogonal capabilities that can be used in combination or completely independently:
- minifier
- transformer
- bundler
More efficient Register and Minify tools
Using esbuild's transform capability, esbuild-register can replace ts-node/register in unit-testing frameworks to greatly improve speed (see github.com/aelbore/esb… ). ts-node now also supports custom compilers, so it can be pointed at esbuild as well. esbuild's minify performance is also on the order of 100x faster than terser.
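Both jobs can be done directly through esbuild's transform API; a small sketch (the exact minified output varies by version):

```js
const { transformSync } = require('esbuild');

// Strip TypeScript types (no type checking) and down-level to ES2015
const ts = transformSync('const x: number = 1;', {
  loader: 'ts',
  target: 'es2015',
});
console.log(ts.code); // const x = 1;

// Minify with esbuild instead of terser
const min = transformSync('function add(first, second) { return first + second; }', {
  minify: true,
});
console.log(min.code); // something like: function add(n,d){return n+d};
```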
More efficient Prebundle tools
Even when the business code itself stays bundleless, third-party libraries still need prebundling to prevent request waterfalls and CJS compatibility problems, and esbuild is a better prebundle tool than rollup. In fact the latest version of Vite has switched its prebundle step from rollup to esbuild.
Better online CJS2ESM service
An ESM CDN service can also be built on esbuild; esm.sh is one example.
node bundler
In contrast to the front-end community, the Node community seems to use bundling much less, partly because Node services tend to use bundle-unfriendly things such as fs access and native addons, and partly because most Bundlers are designed for the front end and need extra configuration for Node. Bundling Node applications or services nonetheless has significant benefits:
- It reduces the user's node_modules size and speeds up installation. Installing a single bundled file is dramatically smaller and faster than installing an application's entire dependency tree into the business's node_modules. pnpm and yarn are positive examples: they use esbuild to bundle themselves into zero-dependency packages (twitter.com/pnpmjs/stat… ).
- It improves cold-start speed: tree shaking reduces the amount of JS that actually has to be parsed (JS parse overhead is a significant part of cold start for large applications, especially ones sensitive to cold-start time), and bundling avoids file IO. Both dramatically cut application cold-start time, which makes bundling ideal for cold-start-sensitive scenarios such as Serverless.
- Semver is a community convention, but it demands a lot of discipline; with many third-party libraries it is hard to guarantee that upstream dependencies never break semver semantics. Bundling the code pins it completely and avoids application bugs caused by upstream dependency changes, which is critical for stability-critical applications such as compilers.
For these reasons I strongly encourage bundling Node applications, and esbuild supports bundling for Node out of the box.
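A minimal build script for a Node bundle; the entry path and externals here are hypothetical placeholders:

```js
require('esbuild').build({
  entryPoints: ['src/server.ts'],
  bundle: true,
  platform: 'node',       // treat Node built-ins (fs, path, ...) as external
  target: 'node14',
  outfile: 'dist/server.js',
  external: ['fsevents'], // native addons must stay on disk, keep them external
}).catch(() => process.exit(1));
```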
TSC Transformer alternatives
tsc has poor performance even with incremental compilation, so we can compile TS with esbuild instead of tsc (esbuild does not support the TS type checker and has no plan to). If the dev phase does not strongly rely on type checking, esbuild can replace tsc there; if type checking is strongly required, take a look at swc, which is rewriting the tsc type checker in Rust (github.com/swc-project… ).
Monorepo and monotools
esbuild is one of the few tools with good support for both library and app development (webpack's library support is poor, rollup's app-development support is poor), which means you can unify a project's build tooling on esbuild. esbuild supports React development out of the box and bundles extremely fast: without any special optimization, a full bundle of our project (including react, mobx, etc.) takes only about 80ms.
Another benefit is that a monorepo's shared packages compile almost for free. You only need to set esbuild's mainFields to ['source', 'module', 'main'] and point the source field of each shared library at its source entry; esbuild then compiles the shared library from source first, and compiling shared libraries barely affects your overall bundle speed. I can only say that tsc is not a good candidate for this job: too slow and too complex.
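A sketch of what that looks like in the app's build script (the entry path is a placeholder):

```js
require('esbuild').build({
  entryPoints: ['src/index.tsx'],
  bundle: true,
  outfile: 'dist/index.js',
  // Prefer the raw "source" entry of workspace packages so they are
  // compiled on the fly, then fall back to module/main for everything else.
  mainFields: ['source', 'module', 'main'],
}).catch(() => process.exit(1));
```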
Some problems with Esbuild
Debugging trouble
esbuild's core is written in Go; what users get is a compiled binary plus a pile of JS glue code. The binary can hardly be breakpoint-debugged (only with lldb/gdb), so every time you want to debug esbuild itself you have to pull the source and build it yourself, which raises the bar for debugging considerably.
Only targets of ES6 and above are supported
esbuild's transformer currently only supports targeting ES6 and above. That matters little during dev, but much of the domestic market still has to consider ES5, so esbuild's output cannot be treated as the final artifact; you usually need babel, tsc, or swc to down-level ES6 to ES5.
Go-compiled wasm loses more performance than native, and the wasm binary is large
Currently the performance of Go-compiled wasm is not great (a 3-5x slowdown compared to native), and the wasm binary compiled from Go is large (8MB+), which makes it unsuitable for some size-sensitive scenarios.
The plug-in API is minimal
Compared with webpack and rollup, esbuild only provides onLoad and onResolve. They can accomplish a great deal, but gaps remain; for example, post-processing chunks after code splitting is not supported.