
Hello, everyone. I am Karsong.

The React source code uses a variety of algorithms and data structures internally to implement different modules (for example, the scheduler uses a min heap).

Today we are going to talk about the LRU algorithm for data caching. The content includes four aspects:

  • An introduction to the relevant React feature

  • The relationship between this feature and the LRU algorithm

  • How the LRU algorithm works

  • React's LRU implementation

This post goes all the way from the basics to the implementation, so there is a lot to cover; consider bookmarking it and digesting it slowly.

Suspense for everything

Suspense and React.lazy were introduced in React 16.6 to enable component code splitting.

For the following code:

```jsx
import A from './A';
import B from './B';

function App() {
  return (
    <div>
      <A/>
      <B/>
    </div>
  );
}
```

After bundling, the build tool generates:

  • chunk.js (containing the A, B, and App component code)

If the B component is not needed for the first-screen render, its code can be split out. Only the following change is required:

```jsx
// before
import B from './B';
// after
const B = React.lazy(() => import('./B'));
```

After bundling, the build tool generates:

  • chunk.js (containing the A and App component code)

  • b.js (containing the B component code)

This way, the B component's code is requested in JSONP fashion during the first-screen render and rendered once the request returns.

To show placeholders before B requests return, use Suspense:

```jsx
// before (rest of the component omitted)
return (
  <div>
    <A/>
    <B/>
  </div>
)

// after (rest of the component omitted)
return (
  <div>
    <A/>
    <Suspense fallback={<div>loading...</div>}>
      <B/>
    </Suspense>
  </div>
)
```

Before the request returns, loading... is rendered as a placeholder.

Suspense can be seen as:

A placeholder (the fallback prop) is displayed until the asynchronous content returns; once it does, the content itself is displayed.

Take a look at the JSX structure returned after using Suspense components and you’ll notice a remarkable detail:

```jsx
return (
  <div>
    <A/>
    <Suspense fallback={<div>loading...</div>}>
      <B/>
    </Suspense>
  </div>
)
```

It is not clear from this JSX that component B is rendered asynchronously!

The difference between synchronous and asynchronous is:

  • Synchronous: start -> result

  • Asynchronous: start -> intermediate state -> result

Suspense converges the intermediate-state logic of its wrapped child components into itself (namely, its fallback prop), so child components no longer need to distinguish between synchronous and asynchronous.

So, can we extend this capability of Suspense from React.lazy (asynchronously requesting component code) to all asynchronous operations?

The answer is yes.

The power of Resource

The React repository is a monorepo containing multiple packages (such as react and react-dom). Among them is a package called react-cache that combines data fetching with Suspense.

Suppose we have a method called fetchUser that requests user data:

```js
const fetchUser = (id) => {
  return fetch(`xxx/user/${id}`).then(
    res => res.json()
  );
};
```

Wrapped by react-cache's createResource method, it becomes a resource:

```js
import {unstable_createResource as createResource} from 'react-cache';

const userResource = createResource(fetchUser);
```

Coordinating the resource with Suspense makes it possible to write asynchronous data-fetching logic in a synchronous style:

```jsx
function User({ userID }) {
  const data = userResource.read(userID);

  return (
    <div>
      <p>name: {data.name}</p>
      <p>age: {data.age}</p>
    </div>
  );
}
```

As you can see, userResource.read is written in a completely synchronous style, yet it calls fetchUser internally.

The logic behind this is:

  1. The first call to userResource.read creates a promise (the return value of fetchUser)

  2. The promise is thrown

  3. React catches the promise, and the Suspense component renders the fallback

  4. After the promise resolves, the User component re-renders

  5. Calling userResource.read now returns the resolved result (the data requested by fetchUser), which is used to continue rendering
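The five steps above can be sketched in code. The following is a simplified illustration of the throw-a-promise protocol, not react-cache's actual implementation; createSimpleResource and the entry shape are invented for the example:

```javascript
// Simplified sketch of the throw-a-promise protocol (illustrative only).
function createSimpleResource(fetcher) {
  const cache = new Map(); // key -> { status, value }
  return {
    read(key) {
      let entry = cache.get(key);
      if (entry === undefined) {
        // Step 1: the first call creates the promise and caches it.
        entry = { status: 'pending', value: null };
        entry.value = fetcher(key).then(data => {
          entry.status = 'resolved';
          entry.value = data;
        });
        cache.set(key, entry);
      }
      if (entry.status === 'pending') {
        // Step 2: throw the promise; React catches it and the nearest
        // Suspense boundary renders its fallback (step 3).
        throw entry.value;
      }
      // Step 5: after the promise resolves, return data synchronously.
      return entry.value;
    },
  };
}
```

Note that on the first call the consumer never receives a return value at all; the thrown promise is the signal that the data is still pending.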

As steps 1 and 5 show, userResource.read may be called twice for a single request:

  • The first call sends the request and returns a promise

  • The second call returns the requested data

So userResource needs to cache the promise internally, keyed by userID:

```js
const data = userResource.read(userID);
```

Because userID comes from the User component's props, when the User component receives different userIDs, userResource needs to cache a separate promise for each of them.

If you switch between 100 userIDs, 100 promises get cached. Obviously we need a cache eviction algorithm, or the cache will keep growing until it overflows.

react-cache uses the LRU (Least Recently Used) algorithm to clean up the cache.

LRU principle

The core idea of LRU algorithm is as follows:

If data has been accessed recently, it is more likely to be accessed again in the future

Therefore, the more recently data has been used, the higher its weight. When data needs to be evicted, the least recently used data is always evicted first.
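As a standalone illustration of this idea (unrelated to react-cache's internals), a minimal LRU cache can be built on a JavaScript Map, whose keys preserve insertion order:

```javascript
// Minimal LRU cache sketch: a Map keeps insertion order, so the first
// key is always the least recently used one.
class LRUCache {
  constructor(limit) {
    this.limit = limit;
    this.map = new Map();
  }
  get(key) {
    if (!this.map.has(key)) return undefined;
    // Re-insert to mark this key as most recently used.
    const value = this.map.get(key);
    this.map.delete(key);
    this.map.set(key, value);
    return value;
  }
  set(key, value) {
    if (this.map.has(key)) this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.limit) {
      // Evict the least recently used entry (the oldest key).
      this.map.delete(this.map.keys().next().value);
    }
  }
}

const cache = new LRUCache(2);
cache.set('a', 1);
cache.set('b', 2);
cache.get('a');    // touch 'a', so 'b' becomes least recently used
cache.set('c', 3); // exceeds the limit: 'b' is evicted
```

react-cache itself uses a linked-list structure instead (covered below), but the eviction policy is the same.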

React-cache LRU implementation

The react-cache implementation consists of two parts:

  • Data access

  • LRU algorithm implementation

Data access

Each resource created through createResource has a map, where:

  • The map's key is the key passed in when resource.read(key) is executed

  • The map's value is the promise created when resource.read(key) executes

In our userResource example, the map is created when createResource executes:

```js
const userResource = createResource(fetchUser);
```

When userResource.read executes for the first time, it stores an entry in the map, with userID as the key and the promise as the value:

```js
const data = userResource.read(userID);
```

To locate an entry, two things are needed:

  • The entry's key

  • The resource the entry belongs to
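A rough sketch of this two-level lookup might look like the following. This is a hedged simplification; accessEntry and the plain-object entry shape are invented for illustration, not react-cache's actual names:

```javascript
// Sketch of two-level entry storage: an outer map from resource to that
// resource's own key -> entry map. (Illustrative names only.)
const entries = new Map();

function accessEntry(resource, key, createValue) {
  let resourceMap = entries.get(resource);
  if (resourceMap === undefined) {
    resourceMap = new Map();
    entries.set(resource, resourceMap);
  }
  let entry = resourceMap.get(key);
  if (entry === undefined) {
    // First access for this key: create and cache the entry.
    entry = { key, value: createValue(key) };
    resourceMap.set(key, entry);
  }
  return entry;
}
```

With this shape, a second read with the same key hits the cached entry instead of invoking the fetcher again.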

LRU algorithm implementation

react-cache implements the LRU algorithm with a doubly linked circular list. The implementation involves three operations: insert, update, and delete.

Insert operation

The first time userResource.read(userID) is executed, entry0 (n0 for short) forms a circular list with itself:

At this point, first (the pointer to the highest-weight node) points to n0.

After the userID prop changes, executing userResource.read(userID) produces entry1 (n1 for short):

n0 and n1 now form a circular list, and first points to n1.

If n2 is inserted, it looks like this:

As you can see, whenever a new entry is added, first always points to it, embodying the LRU idea that the newest entry has the highest weight.
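The insert operation can be sketched as follows (a simplified model; add and the node shape are invented for illustration, not react-cache's actual code):

```javascript
// Sketch of inserting into a circular doubly linked list where `first`
// always points at the most recently added (highest-weight) node.
let first = null;

function add(key) {
  const node = { key, previous: null, next: null };
  if (first === null) {
    // The first entry forms a ring with itself.
    node.previous = node;
    node.next = node;
  } else {
    // Splice the new node in between the current first (highest weight)
    // and its previous node (lowest weight).
    const last = first.previous;
    node.next = first;
    node.previous = last;
    last.next = node;
    first.previous = node;
  }
  first = node; // the newest node always has the highest weight
  return node;
}
```

Because the list is circular, first.previous is always the lowest-weight node, which makes eviction cheap later on.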

Update operation

Each time an entry is accessed, it has just been used, so its weight is raised to the highest.

For the following n0, n1, n2, where n2 has the highest weight (first points to it):

When n1 is accessed again, the following call runs:

```js
// the userID corresponding to n1
userResource.read(userID);
```

n1 is then given the highest weight:
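The update step can be sketched like this (again a simplified model with invented names, not react-cache's source): accessing a node unlinks it from its current position and re-links it at the head of the ring.

```javascript
// Sketch: accessing a node lifts it to the highest weight.
let first = null;

function add(key) {
  const node = { key, previous: null, next: null };
  if (first === null) {
    node.previous = node;
    node.next = node;
  } else {
    const last = first.previous;
    node.next = first;
    node.previous = last;
    last.next = node;
    first.previous = node;
  }
  first = node;
  return node;
}

function access(node) {
  if (node !== first) {
    // Unlink the node from its current position in the ring.
    node.previous.next = node.next;
    node.next.previous = node.previous;
    // Re-link it just before the current first, then promote it.
    const last = first.previous;
    node.next = first;
    node.previous = last;
    last.next = node;
    first.previous = node;
    first = node;
  }
  return node;
}
```

After access(n1), the ring order becomes n1 -> n2 -> n0, with first pointing at n1.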

Delete operation

When the number of cached entries exceeds the upper limit, react-cache evicts the low-weight entries.

For the following n0, n1, n2, where n2 has the highest weight (first points to it):

If the cache limit is 1 (that is, only one entry may be cached), the list is traversed starting from first.previous, evicting entries until only one remains.

That is, n0 is evicted first:

Next, clear n1:

After each eviction, the corresponding entry in the map is deleted as well.
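The delete step can be sketched as follows (a simplified model with invented names, not react-cache's source): repeatedly unlink first.previous, the lowest-weight node, until the cache size is within the limit, removing the map entry at the same time.

```javascript
// Sketch of LRU eviction from a circular doubly linked list.
let first = null;
let size = 0;
const map = new Map(); // key -> node, mirroring the resource map

function add(key) {
  const node = { key, previous: null, next: null };
  if (first === null) {
    node.previous = node;
    node.next = node;
  } else {
    const last = first.previous;
    node.next = first;
    node.previous = last;
    last.next = node;
    first.previous = node;
  }
  first = node;
  map.set(key, node);
  size++;
  return node;
}

function cleanUp(limit) {
  while (size > limit && first !== null) {
    const last = first.previous; // the lowest-weight node
    if (last === first) {
      first = null; // the ring had a single node
    } else {
      last.previous.next = first;
      first.previous = last.previous;
    }
    map.delete(last.key); // drop the cached entry as well
    size--;
  }
}
```

With entries n0, n1, n2 and a limit of 1, cleanUp evicts n0, then n1, leaving only n2 in both the ring and the map.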

For the complete implementation, see the react-cache LRU source code.

Conclusion

Beyond React.lazy and react-cache, any asynchronous process can be converged into Suspense with a bit of imagination, such as React Server Components and streaming SSR.

As React 18 stabilizes toward the end of the year, I believe this synchronous style of writing asynchronous logic will become more and more mainstream.

No matter how many new gadgets React develops in the future, the underlying algorithms and data structures will always be there.

Truly plain and unexciting...