This article was first published on the public account “Xiao Li’s Front-end Cabin”. You are welcome to follow it ~

Preface

The last question in a front-end interview is often a hand-written coding problem. The topics range from basic API implementations and algorithm questions to scenario-based application questions.

Today I would like to share a hand-written problem I ran into while interviewing with a big tech company: implement a simple SWR mechanism in JS.

What is SWR

Many of you have probably never heard of SWR, let alone implemented it in code.

The name SWR comes from stale-while-revalidate, an HTTP cache invalidation strategy popularized by RFC 5861. Like max-age, it is a Cache-Control directive that controls caching. The English word “stale” means old or no longer fresh; in the HTTP caching world, stale describes a cached response that has expired.

A common caching policy is: when you want to use a resource whose cache has expired, you revalidate it first, and while the revalidation runs the client has to wait for that request to complete.

This traditional, synchronous way of refreshing the cache is considered a performance problem in performance-sensitive scenarios.

The SWR strategy says that while a revalidation request is in flight, the client can keep using the expired cache without waiting. Once the revalidation completes, the cache is updated and the fresh copy is used from the next access onward.

In plain terms, SWR gives us “background cache refresh” and “asynchronous cache update”.

SWR is usually used together with max-age, for example Cache-Control: max-age=1, stale-while-revalidate=59. With this header, the cache is fresh within 1s. Between 1s and 60s the cache is stale but can still be used directly while an asynchronous revalidation runs at the same time. After 60s the cache is completely expired and a traditional synchronous revalidation is required.
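To make the header concrete, here is a minimal sketch of a server response carrying this directive (my own illustration, assuming a plain Node.js http server; the port and payload are made up):

const http = require('http');

http.createServer((req, res) => {
  // Fresh for 1s; for the following 59s a stale copy may be served
  // while the client revalidates in the background
  res.setHeader('Cache-Control', 'max-age=1, stale-while-revalidate=59');
  res.end(JSON.stringify({ now: Date.now() }));
}).listen(3000);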

SWR fits scenarios such as an API for current weather conditions, or headlines written within the past hour.

Code implementation

Now that you know what SWR is, let’s look at how to implement it.

Before implementing it, let’s break down the goal:

1. When requested, data is first read from the cache and immediately returned to the caller

2. If the data has expired, the fetch request is initiated to obtain the latest data

We need a “container” to cache the complex data we request back, and Object is the first thing that comes to mind in JS.

There is nothing wrong with using Object, but it is essentially a string-to-value mapping that only supports strings (and Symbols) as key names. The Map introduced in ES6 provides a more complete value-to-value hash structure, which is better suited to key-value data.
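A quick illustration of the difference (my own example, not from the interview question):

// Object coerces every key to a string
const obj = {};
obj[{ id: 1 }] = 'a';
console.log(Object.keys(obj)); // ['[object Object]']

// Map keeps keys as-is, so any value (object, function, ...) can be a key
const map = new Map();
const keyFn = () => 'news';
map.set(keyFn, 'b');
console.log(map.get(keyFn)); // 'b'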

In an interview we should take every chance to show the interviewer our breadth of knowledge, so Map is the better choice here.

To have a baseline to compare the implementation against, here is how data would be requested without any caching:

const data = await fetcher();
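Here fetcher can be any function that returns a Promise of data, for example a hypothetical fetcher built on fetch (the URL is made up for illustration):

const fetcher = () => fetch('/api/headlines').then((res) => res.json());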

Data caching support

To give fetcher data caching capabilities, we need to wrap it in a layer of encapsulation.

Before wrapping it, we need to decide what to cache. So, what data needs to be cached?

Obviously, that’s what the request returns.

At the same time, you should also consider that if the function is called repeatedly in a short time, it is better not to send multiple identical requests.

So the cached data should contain:

  • The data returned by the request

  • The currently in-flight request (if any), to avoid duplicate requests

const cache = new Map(); // Cache data

async function swr(cacheKey, fetcher) {
  // Read the entry from the cache first
  let data = cache.get(cacheKey) || { value: null, promise: null };
  // Write it back to the cache
  cache.set(cacheKey, data);

  // No data and no request in flight: a request needs to be sent
  if (!data.value && !data.promise) {
    // Save the promise of the current request
    data.promise = fetcher()
      .then((val) => {
        data.value = val; // The request succeeded, save the data
      })
      .catch((err) => {
        console.log(err);
      })
      .finally(() => {
        data.promise = null; // The request has settled, stop holding the promise
      });
  }

  // No data yet, but a request is in flight: reuse the saved promise
  if (data.promise && !data.value) await data.promise;
  // Return the data
  return data.value;
}

With this, we have implemented basic data caching.
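A quick sketch to see the request deduplication in action (fakeFetcher is a stand-in I made up; it is not part of the original code):

let calls = 0;
const fakeFetcher = () =>
  new Promise((resolve) => {
    calls += 1;
    setTimeout(() => resolve('payload'), 100);
  });

// Two concurrent calls with the same cacheKey share one in-flight request
Promise.all([swr('demo', fakeFetcher), swr('demo', fakeFetcher)]).then((results) => {
  console.log(results); // ['payload', 'payload']
  console.log(calls);   // 1 -- the second call reused the saved promise
});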

Supporting a cache expiration time

It is easy to support cacheTime on top of existing caching capabilities.

Just check the expiration date before making a new request:

const isStaled = Date.now() - timeWhenDataWasRetrieved > cacheTime

Therefore, we also need to store the time when we get the data in the cache:

const cache = new Map();

// Add the cacheTime parameter
async function swr(cacheKey, fetcher, cacheTime) {
  let data = cache.get(cacheKey) || { value: null, time: 0, promise: null };
  cache.set(cacheKey, data);

  // Has the cache expired?
  const isStaled = Date.now() - data.time > cacheTime;
  // The cache has expired and no request is in flight
  if (isStaled && !data.promise) {
    data.promise = fetcher()
      .then((val) => {
        data.value = val;
        data.time = Date.now(); // Save the time when the data was obtained
      })
      .catch((err) => {
        console.log(err);
      })
      .finally(() => {
        data.promise = null;
      });
  }

  if (data.promise && !data.value) await data.promise;
  return data.value;
}

With the above encapsulation, the calling method is changed to:

// before
const data = await fetcher();

// after
const data = await swr('cache-key', fetcher, 3000);

On the first call, the data is requested over the network. Subsequent calls immediately return the cached data. If more than 3s have passed since the data was fetched, the stale cached data is still returned right away, while a request is sent in the background to get the latest data.
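A small sketch of this behaviour, assuming the fetcher from earlier (delay is a helper I added just for the demo):

const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

(async () => {
  console.log(await swr('news', fetcher, 3000)); // 1st call: waits for the network
  console.log(await swr('news', fetcher, 3000)); // within 3s: fresh cache, no request sent
  await delay(4000);
  console.log(await swr('news', fetcher, 3000)); // stale: old data returned immediately, refreshed in the background
})();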

And you’re done! We implemented a simple SWR mechanism in about 20 lines of code.

The code above only reaches a “passing” level. We should make full use of our skills to impress the interviewer and make them remember us.

Conditional requests

In the current code, although we use a Map, the cacheKey is still a plain string, so Map’s real strengths are not being used. Beyond the basic capabilities, consider allowing a function to be passed in as the cacheKey, to implement a conditional request feature.

The function’s return value is used as the cacheKey: if it returns a truthy value, the caching logic above runs; if not, the request is made without caching.

const shouldCache = function () { /* ... */ };

// cacheKey supports passing in a function
const data = await swr(shouldCache, fetcher, 3000);

async function swr(cacheKey, fetcher, cacheTime) {
  // If it is a function, use its return value as the cacheKey
  const cKey = typeof cacheKey === 'function' ? cacheKey() : cacheKey;

  // Caching is enabled only if there is a cacheKey
  if (cKey) {
    let data = cache.get(cKey) || { value: null, time: 0, promise: null };
    cache.set(cKey, data);
    // ... same logic as before
  } else {
    return await fetcher();
  }
}

LRU cache eviction algorithm

Let’s continue to leverage Map’s capabilities.

We know that a Map is iterated in insertion order. Combined with its key-value structure, it is natural to build an LRU cache eviction strategy on top of this property.

The LRU (Least Recently Used) algorithm evicts data based on historical access records. Its core idea is: “if data has been accessed recently, it has a higher chance of being accessed again in the future”.

The whole process is roughly as follows:

  1. Newly added data is placed at the “most recently used” end
  2. Whenever there is a cache hit (that is, cached data is accessed), that entry is moved to the “most recently used” end
  3. When the cache is full, the entry at the “least recently used” end is discarded (in the Map implementation below, the most recently used entry sits at the tail and the least recently used at the head)

Given the limited interview time, I don’t recommend actually writing this part out during the interview; it can backfire. However, you can proactively describe the idea to the interviewer and keep scoring points by adding “this algorithm is used in Vue’s keep-alive component”, which indirectly signals that you know how Vue works internally.

In fact, the LRU algorithm often appears as a stand-alone interview question, so let’s write it out today as well to consolidate it:

We make a few modifications around const cache = new Map():

  • A maximum cache capacity, capacity, is specified
  • The exposed get and set APIs keep the same usage
class LRUCache {
  constructor(capacity) {
    this.cache = new Map();
    this.capacity = capacity; // Maximum cache capacity
  }

  get(key) {
    // On a hit, refresh the key's position (delete, then re-add)
    if (this.cache.has(key)) {
      const temp = this.cache.get(key);
      this.cache.delete(key);
      this.cache.set(key, temp);

      return temp;
    }
    return undefined;
  }

  set(key, value) {
    if (this.cache.has(key)) {
      // The key exists: refresh its position (delete, then re-add)
      this.cache.delete(key);
    } else if (this.cache.size >= this.capacity) {
      // The key does not exist and the cache is full:
      // remove the first key in the Map, i.e. the least recently used one
      // map.keys() returns an Iterator
      this.cache.delete(this.cache.keys().next().value);
    }
    this.cache.set(key, value);
  }
}

// before
const cache = new Map();

// after
const cache = new LRUCache(50); // The maximum cache capacity is 50
// The rest of the swr code stays unchanged
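A quick check of the eviction behaviour (illustrative only):

const lru = new LRUCache(2);
lru.set('a', 1);
lru.set('b', 2);
lru.get('a');              // touching 'a' makes it the most recently used
lru.set('c', 3);           // capacity exceeded -> 'b' (least recently used) is evicted
console.log(lru.get('b')); // undefined
console.log(lru.get('a')); // 1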

The time complexity of implementing LRU with Map is O(1) for both get and set.

Conclusion

As you can see, a small hand-written coding question can hide quite a few deep knowledge points. The interviewer is evaluating many facets of your ability, and if you write the code above and describe the follow-up features you didn’t have time to implement, it will demonstrate many aspects of your ability:

  • Understand HTTP caching policies
  • Understand the differences between Object and Map and the application scenarios of Map
  • Understand Promise and async functions and can actually use them
  • Write code with performance optimization in mind
  • Know how to check data types
  • Understand how Vue is implemented internally
  • Capable of API abstraction and encapsulation
  • Able to think rigorously and comprehensively

If I were the interviewer, I would have been impressed.

Finally, I wish you all a satisfying offer during the “Golden March, Silver April” hiring season!

❤️ Support

If this article has been helpful to you, please support me by clicking 👍. Your “likes” are the motivation for my creation.

As for me, I am currently a front-line developer at ByteDance with four and a half years of experience. I use React at work and enjoy Vue in my spare time.

From time to time I share and summarize in-depth thinking and practice from front-end work on the public account “Xiao Li’s Front-end Cabin”. Thank you for your attention ~