
Bytecode cache

What is bytecode caching?

Bytecode caching is an important part of browsers' performance optimization machinery: it reduces a website's startup time by caching the results of parsing and compilation. All mainstream browsers implement some form of bytecode caching today.

Taking Chrome as an example: early on, V8 compiled JavaScript source directly into machine code:

JS source => AST (abstract syntax tree) => machine code

The generated machine code was also cached (in memory and on disk) to save recompilation time.

However, caching machine code had problems: 1. high code complexity; 2. because of lazy compilation, only the outermost layer of code was ever cached.

Later, V8 introduced the bytecode architecture:

JS source => AST (abstract syntax tree) => bytecode => machine code

The advantages of introducing bytecode: 1. fast compilation and startup; 2. reduced code complexity.
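If you are curious what this bytecode looks like, you can ask V8 to print it. A minimal sketch, assuming a recent Node.js (which embeds V8 and passes V8 flags through; the exact output format varies between V8 versions):

```js
// add.js
// Run with:  node --print-bytecode --print-bytecode-filter=add add.js
// (--print-bytecode and --print-bytecode-filter are V8 flags exposed through Node.)
function add(a, b) {
  return a + b;
}

console.log(add(1, 2)); // the Ignition bytecode for `add` is printed before 3 is logged
```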

Besides Google, Mozilla has also implemented bytecode caching; see Mozilla's post "JavaScript Startup Bytecode Cache" in the references.

JSBC (The JavaScript Startup Bytecode Cache) is enabled on major websites.

Two-level caching strategy

Now for the main event: the two-level caching strategy!

In fact, Chrome uses a two-level caching strategy for the bytecode V8 compiles:

  1. an in-memory Isolate cache;
  2. a fully serialized disk cache.

1. Isolate memory cache:

The Isolate cache lives within a single process (the same page) and tries to reuse already-available compiled data as quickly and as cheaply as possible.

Its disadvantages are: a. a relatively low hit rate (80%); b. it cannot be shared across processes.

When V8 compiles a script, it stores the compiled result in a hashtable (on V8's heap) keyed by the source code. When Chrome asks V8 to compile another script, V8 first checks whether that script's source matches an entry in the hashtable; if it does, the existing bytecode is returned directly.
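Conceptually (this is only an illustrative sketch, not V8's real data structure or API), the Isolate cache behaves like a map from source text to compiled bytecode:

```js
// Illustrative sketch only; compileToBytecode is a stub standing in for the
// engine's internal compiler, and a real engine keys on more than the raw source.
const compileToBytecode = (source) => ({ compiledFrom: source });

const isolateCache = new Map(); // keyed by source text; lives on the V8 heap in reality

function getOrCompile(source) {
  if (isolateCache.has(source)) {
    return isolateCache.get(source); // hit: return the bytecode that already exists
  }
  const bytecode = compileToBytecode(source); // miss: compile it
  isolateCache.set(source, bytecode);         // and remember it for next time
  return bytecode;
}

getOrCompile('console.log(1)'); // first call compiles
getOrCompile('console.log(1)'); // second call is served from the in-memory cache
```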

2. Fully serialized hard disk cache:

The disk cache is managed by Chrome (Blink, to be exact) and fills the gap the Isolate cache leaves: it lets code caches be shared across processes and across Chrome sessions.

It piggybacks on the existing HTTP resource cache to manage and expire the data received from the web, and works in three stages:

① Cold run: the first time a JS file is requested, Chrome downloads it and hands it to V8 to compile, and also stores the file in the browser's disk cache.

② Warm run: on the second request for the JS file, Chrome retrieves it from the browser cache and again hands it to V8 to compile. This time, however, the compiled code is serialized and attached to the cached script file as metadata.

③ Hot run: on the third request, Chrome retrieves both the file and its metadata from the cache and hands them to V8. V8 deserializes the metadata and can skip compilation entirely.

In other words, the warm run uses the in-memory cache, and the hot run uses the disk cache.
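To make the three stages concrete, here is a rough, purely illustrative sketch of that flow; every helper below is a trivial stub I made up, not a real Chrome or V8 API:

```js
// Purely illustrative sketch of the cold/warm/hot stages.
const fetchFromNetwork = async (url) => `/* source of ${url} */`;
const compile = (source) => ({ bytecodeFor: source });
const serialize = (compiled) => JSON.stringify(compiled);
const deserialize = (metadata) => JSON.parse(metadata);

const httpCache = new Map(); // url -> { source, codeCacheMetadata }

async function loadScript(url) {
  let entry = httpCache.get(url);

  if (!entry) {
    // ① cold run: download, compile, and store the file in the disk cache
    entry = { source: await fetchFromNetwork(url), codeCacheMetadata: null };
    httpCache.set(url, entry);
    return compile(entry.source);
  }

  if (!entry.codeCacheMetadata) {
    // ② warm run: compile again, but serialize the result and attach it as metadata
    const compiled = compile(entry.source);
    entry.codeCacheMetadata = serialize(compiled);
    return compiled;
  }

  // ③ hot run: deserialize the metadata and skip compilation
  return deserialize(entry.codeCacheMetadata);
}

// Three consecutive loads of the same URL walk through cold, warm, then hot:
// await loadScript('/main.js'); await loadScript('/main.js'); await loadScript('/main.js');
```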

Now look at HTTP caching

Know yourself, know your enemy, and win a hundred battles.

With bytecode caching and the two-level cache in mind, we now have a much clearer picture of how to use the browser's caching mechanisms to improve a site's loading performance!

Remember the classic flow chart of strong caching versus negotiated caching?

From the bytecode caching perspective, we can now give the interviewer a few extra details!

When the server returns 304 Not Modified, the cached resource is reused and our bytecode cache stays in the warm-run or hot-run state; when it returns 200 OK, the cached resource is replaced, the bytecode cache is cleared, and we drop back to a cold run. (Wow, this upgrades our mental model of the HTTP cache: from a "file cache" to a "file plus compiled bytecode cache"!)
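As a small server-side illustration (a minimal sketch assuming an Express app; the directory, port, and max-age are arbitrary): express.static sends validators such as ETag/Last-Modified, so an unchanged file revalidates as 304 and keeps its warm/hot bytecode cache, while a changed file comes back as 200 with a new body and resets to a cold run.

```js
// Minimal sketch, not a production config.
const express = require('express');

const app = express();

// Serve built JS from ./dist with a short freshness window; once it expires,
// the browser revalidates and gets 304 (bytecode cache kept) or 200 (cache reset).
app.use(express.static('dist', { maxAge: '1h', etag: true, lastModified: true }));

app.listen(3000);
```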

Therefore, reducing code churn is still the most basic and most effective rule for making good use of these caches.

For the same reason, do not change a resource's URL arbitrarily: the bytecode cache is associated with the script's URL, so changing the URL creates a brand-new entry in the browser's resource cache, and with it a cold cache entry.

Another detail: when we run A/B tests, different code may end up compiled and cached on different runs, so keeping execution deterministic is also a precondition for the corresponding cache actually being reused:

```js
if (Math.random() > 0.5) { A(); } else { B(); }
```
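One possible way to make that branch deterministic (just a sketch; getUserId is a hypothetical helper returning whatever stable identifier your app has) is to bucket on a stable id instead of Math.random():

```js
// Sketch only: getUserId() is a hypothetical helper returning a stable string id.
function bucketOf(id) {
  let h = 0;
  for (const ch of id) h = (h * 31 + ch.charCodeAt(0)) >>> 0;
  return h % 2; // the same id always lands in the same bucket
}

if (bucketOf(getUserId()) === 0) { A(); } else { B(); }
```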

There is also the well-worn HTTP caching strategy of splitting stable third-party libraries into separate files.
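With webpack, for example, this is commonly done via splitChunks (a minimal sketch; the chunk name and output pattern are just conventions to adapt to your own build):

```js
// webpack.config.js (sketch)
module.exports = {
  output: {
    // content-hashed names only change when the file's content changes,
    // so stable vendor code keeps its URL and therefore its bytecode cache
    filename: '[name].[contenthash].js',
  },
  optimization: {
    splitChunks: {
      cacheGroups: {
        vendors: {
          test: /[\\/]node_modules[\\/]/, // everything from node_modules
          name: 'vendors',
          chunks: 'all',
        },
      },
    },
  },
};
```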

In general, it is not recommended to merge all JS into one huge bundle; splitting it into smaller scripts is usually better for reasons beyond bytecode caching as well (parallel network requests, streaming compilation, page interactivity, and so on).

Note: the bytecode cache has a minimum file size of 1 KB; scripts smaller than that are not cached.

Forcing compilation

But consider this carefully:

Only code that has been compiled by the time the script finishes executing is added to the bytecode cache. That means many kinds of functions are never cached even though they do run later: event handlers (even onload), promise chains, unused library functions, and anything else that is lazily compiled because it is not called before the script finishes stays uncompiled and never lands in the bytecode cache.
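For instance (an illustrative sketch; none of these callbacks run before the script finishes, so by the rule above none of them would end up in the bytecode cache):

```js
// None of these functions run during the initial script execution:
window.addEventListener('load', function onLoad() {
  // compiled lazily, only when the load event actually fires
});

fetch('/api/data')             // the request starts now...
  .then(function handle(res) {
    // ...but this continuation is compiled lazily when the response arrives
    return res.json();
  });

function neverCalledHelper() {
  // an unused library-style function: never called, never compiled
}
```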

So what can we do?

Well, we can force them to be compiled by turning them into Immediately Invoked Function Expressions (IIFEs)!

```js
(function foo() {
  // ...
})();
```

Because IIFEs are invoked immediately, most JavaScript engines use heuristics to detect them and compile them eagerly and fully right away.

And because that detection is based on syntax rather than actual execution, a function marked this way is compiled eagerly even if it is never really invoked immediately, as below:

```js
const foo = function() {
  // Lazily skipped
};
const bar = (function() {
  // Eagerly compiled
});
```

(Clever!)

The service worker cache

Service workers also have bytecode caching mechanisms;

We know that a service worker lets you build a local resource cache and serve resources from it when requests come in. This is especially useful for building offline-capable applications, such as PWAs.

Code examples:

  • The service worker adds handlers for the install event (to build the resource cache) and the fetch event (to serve resources from the cache when possible).
```js
// sw.js
const cacheName = 'v1'; // name of the cache used by both handlers below

self.addEventListener('install', (event) => {
  async function buildCache() {
    const cache = await caches.open(cacheName);
    // Pre-cache the app's core resources
    return cache.addAll([
      '/main.css',
      '/main.mjs',
      '/offline.html',
    ]);
  }
  event.waitUntil(buildCache());
});

self.addEventListener('fetch', (event) => {
  async function cachedFetch(event) {
    const cache = await caches.open(cacheName);
    // Serve from the cache if we have a match...
    let response = await cache.match(event.request);
    if (response) return response;
    // ...otherwise go to the network and keep a copy of the response
    response = await fetch(event.request);
    cache.put(event.request, response.clone());
    return response;
  }
  event.respondWith(cachedFetch(event));
});
```
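For completeness, the page registers this worker itself; a minimal sketch, assuming the file above is served at /sw.js:

```js
// Somewhere in the page's main script:
if ('serviceWorker' in navigator) {
  navigator.serviceWorker
    .register('/sw.js')
    .then(() => console.log('service worker registered'))
    .catch((err) => console.error('service worker registration failed', err));
}
```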

These cached resources include JS files. However, since the service worker cache is mainly used for PWAs, its mechanism differs slightly from Chrome's "automatic" caching described above.

In a service worker, when a JS resource is added to the cache, a bytecode cache is created for it immediately, which means the bytecode cache is already available on the second load (instead of only on the third load, as with the two-level cache described above).

In addition, the service worker generates a "full" bytecode cache for these scripts: nothing is lazily compiled; everything is compiled up front and placed in the cache. This gives fast and predictable performance with no dependence on execution order, at the cost of higher memory usage.

Note that this policy applies only to service worker caches and not to other uses of the Cache API. In fact, the current Cache API does not perform bytecode caching.

Stage summary

To recap: this article first explained what bytecode caching is, then covered the two-level caching strategy for bytecode, then revisited HTTP caching from the bytecode-cache point of view and drew the corresponding recommendations and a fresh angle of understanding (the core part); it then looked at one more detail, forcing eager compilation, and finally at how bytecode caching differs inside the service worker cache.

To some developers, caching is just caching and loading is just loading. But a closer look at the browser's internal machinery reveals that we see this far only because we are standing on the shoulders of giants!

Ok, call it a day!


I am Anthony Nuggets (public account of the same name). Output forces input, and technology gives insight into life. Goodbye ~

References

  • JavaScript Startup Bytecode Cache
  • Code caching for JavaScript developers
  • Bug 19278 – JSC Cache bytecode to disk
  • V8 engine details