The desired behavior is as follows:

  • If the first request succeeds, the remaining requests are not sent, and the successful result is returned to all of the remaining callers
  • If the first request fails, request number 2 is issued. If it succeeds, the remaining requests are not sent, and the successful result is returned to the remaining callers
  • If the second request also fails, request number 3 is issued. If it succeeds, the remaining requests are not sent, and the successful result is returned to the remaining callers
  • This recurses until, in the worst case, the last request is sent

Concurrency: the same request is sent again within a short period of time while an earlier identical request is still pending.

async function fetchData(a) {
    const data = await fetch('//127.0.0.1:3000/test')
    const d = await data.json();
    console.log(d);
    return d;
}

fetchData(2) // No. 1
fetchData(2) // No. 2
fetchData(2) // No. 3
fetchData(2) // No. 4
fetchData(2) // No. 5
fetchData(2) // No. 6
fetchData(2) // No. 7
fetchData(2) // No. 8

The old version: cachedAsync

I have previously borrowed Vue's caching helper to cache successful requests. The implementation looks like this: the cachedAsync below only caches successful requests, and re-issues the request if it fails. However, in the concurrent scenario above the cache is never hit, because it is only populated after the first request resolves, so the same request is still sent three times in a row. This version therefore cannot handle the concurrent scenario.

const cachedAsync = function(fn) {
    const cache = Object.create(null);
    return async str => {
        const hit = cache[str];
        if (hit) {
            return hit;
        }
        // Only successful promises are cached, and failures are rerequested
        return (cache[str] = await fn(str));
    };
};
const fetch2 = cachedAsync(fetchData)
fetch2(2);
fetch2(2);
fetch2(2);
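
To make the failure mode concrete, here is a minimal sketch (hypothetical timings and names, not from the original) showing why concurrent calls slip past cachedAsync: cache[str] is only assigned after the awaited promise resolves, so every call that arrives while the first is still pending misses the cache and fires its own request.

// Hypothetical demo: cache[str] is written only after the await resolves,
// so three concurrent calls all miss the cache and send three requests.
const slow = cachedAsync(
  (id) =>
    new Promise((resolve) => {
      console.log("request sent for", id);
      setTimeout(() => resolve(id * 2), 100);
    })
);
slow(1); // logs "request sent for 1"
slow(1); // logs "request sent for 1" -- cache still empty
slow(1); // logs "request sent for 1" -- three requests in flight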

The advanced version

First of all, caching is a must, so what remains is how to control concurrency. One way to think about it is this:

  • Each call returns a new Promise, and its executor's resolve/reject pair is stored in a queue (exector in the code below).
  • When the queue length is 1, the actual request is issued. If it succeeds, traverse the queue of executors and resolve each of them with the result.
  • If the request fails, the first Promise is rejected and shifted off the queue, and then next is called recursively to retry.
  • This continues until the executor queue is empty.

const cacheAsync = (promiseGenerator, symbol) => {
  const cache = new Map();
  return async (params) => {
    return new Promise((resolve, reject) => {
      symbol = symbol || params;
      let cacheCfg = cache.get(symbol);
      if (!cacheCfg) {
        cacheCfg = {
          res: null,
          exector: [{ resolve, reject }],
        };
        cache.set(symbol, cacheCfg);
      } else {
        // Hit the cache: resolve immediately with the stored result
        if (cacheCfg.res) {
          return resolve(cacheCfg.res);
        }
        // Still pending: queue this caller's executor
        cacheCfg.exector.push({ resolve, reject });
      }

      const { exector } = cacheCfg;

      // Handle concurrency: identical calls made while a request is still
      // pending are queued instead of being sent.
      // Only the first caller issues the actual request.
      if (exector.length === 1) {
        const next = async () => {
          try {
            if (!exector.length) return;
            const reqData = await promiseGenerator(params);
            // On success, resolve all remaining queued requests
            while (exector.length) {
              exector.shift().resolve(reqData);
            }
            cacheCfg.res = reqData;
          } catch (error) {
            // On failure, reject only this caller's promise,
            // then retry for the remaining queued callers
            const { reject } = exector.shift();
            reject(error);
            next();
          }
        };
        next();
      }
    });
  };
};

Testing

Scenarios to test

  • The request interface randomly succeeds or fails
  • On a success, the remaining queued requests should not be sent, and should all receive the successful result
  • On a failure, the next request should be issued

Set up a server quickly:

const koa = require("koa");
const app = new koa();

function sleep(seconds) {
 return new Promise((resolve, reject) = > {
   setTimeout(resolve, seconds);
 });
}

app.use(async (ctx, next) => {
 if (ctx.url === "/test") {
   await sleep(200);

   const n = Math.random();
   // Randomly disconnects the interface
   if (n > 0.8) {
       ctx.body = n;
   } else {
       ctx.status = 404
       ctx.body = ' '} next(); }}); app.listen(3000."127.0.0.1".() = >
 console.log("start listening on port 127.0.0.1:3000"));Copy the code

The client


  var fetch2 = cacheAsync(fetchData, "test2");

  async function fetchData(a) {
    const data = await fetch("/ / 127.0.0.1:3000 / test");
    const d = await data.json();
    console.log(d);
    return d;
  }

  console.log(fetch2(2));
  console.log(fetch2(2));
  console.log(fetch2(2));
  console.log(fetch2(2));
  console.log(fetch2(2));
  console.log(fetch2(2));
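
One detail worth spelling out (my reading of the code, not stated in the original): fetch resolves even for a 404 response, so what actually makes fetchData reject on the failure path is data.json() throwing while parsing the empty body. An explicit status check would make this clearer:

// A hedged variant with an explicit status check (fetchDataStrict is a
// hypothetical name). fetch() only rejects on network errors, not on 404;
// without the check it is data.json() that throws on the empty 404 body.
async function fetchDataStrict(a) {
  const data = await fetch("//127.0.0.1:3000/test");
  if (!data.ok) throw new Error(`HTTP ${data.status}`);
  return data.json();
}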

Take a look at the test results, refreshing the page between runs.

On the first run we get lucky: the very first request succeeds, so only one request is sent.

The second run is unlucky: only the last request succeeds, which is the worst-case scenario.

On the third run, the third request succeeds.

The test results are correct: the behavior matches both the concurrency and the caching scenarios.

Note

The cache lives in a closure, so refreshing the page invalidates it. I think that is as it should be: in most scenarios, refreshing the page is meant to reset state. If you want the cache to persist, you might as well save it to local storage.
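
If persistence were wanted, a rough sketch (my own, not part of the original code) could wrap cacheAsync and back successful results with localStorage:

// Hypothetical wrapper: successful results survive a page refresh.
// Only the resolved value is persisted; pending executor queues still
// live in memory and reset on reload, as before.
const persistentCacheAsync = (promiseGenerator, symbol) => {
  const inner = cacheAsync(promiseGenerator, symbol);
  return async (params) => {
    const key = `cache:${symbol || params}`;
    const saved = localStorage.getItem(key);
    if (saved !== null) return JSON.parse(saved);
    const res = await inner(params);
    localStorage.setItem(key, JSON.stringify(res));
    return res;
  };
};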

GitHub address