preface

First, a BGM to set the mood, then straight to the point.

Handling concurrent requests is pretty much a mandatory topic, and there are plenty of answers for it online, but I want to keep a small note of my own, so let's get right to it.

Please implement a function that requests data in batches. All of the URL addresses are in the urls parameter, and the number of concurrent requests is limited by the max parameter. When all of the requests have completed, the callback function must be executed. You may use fetch as the request function (no handling of failed requests is required).
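
To make the expected usage concrete, a call might look roughly like this (the URLs and counts here are placeholders I made up, not part of the question):

// Hypothetical call: request six addresses, at most three at a time,
// then run the callback once everything has finished.
requestData(
  ['https://example.com/1', 'https://example.com/2', 'https://example.com/3',
   'https://example.com/4', 'https://example.com/5', 'https://example.com/6'],
  3,
  () => console.log('all requests finished')
)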

Me, thinking to myself: no idea where to start, I can only stare blankly at the sky.

Sorting out the approach

Simplify the problem

Interviewer: Judging by that blank look, let's simplify the question for you: ignore the max value for now.

function requestData(urls = [], callback = () => {}) {
  let requestArr = []

  for (let i = 0, length = urls.length; i < length; i++) {
    // fetch can be used directly; see MDN for details
    requestArr.push(fetch(urls[i]))
  }

  Promise.all(requestArr).then(() => {
    callback();
  }).catch(err => {
    console.log(err)
  })
}


Interviewer: Well, that's about the crudest way to do it. Now put the max limit back, rethink it, and see how you would handle it. Here's a hint: you can use recursion.

Quick-witted me: Oh, I've got it.

The specific process

  • Create a requestArr array (capped at max) to hold the requests currently being executed.

  • Push the request promises into it one by one (implemented recursively), and have each one remove itself when it finishes (see the small sketch after this list).

  • When requestArr reaches the upper limit, wait for a request to finish and free up a slot before continuing to add.

  • Once all of the urls have been added, run Promise.all on requestArr and execute the callback in its then (we only need the last batch of requests, because finished requests have already removed themselves from the array).
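
Before writing the full function, here is a minimal sketch of just that "self-removal plus Promise.race" trick, with fake requests (setTimeout delays I made up) standing in for fetch:

// Three fake "requests" that settle after different delays.
const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

const inFlight = [];
for (const ms of [300, 100, 200]) {
  const p = delay(ms);
  // Each promise removes itself from the array once it settles.
  p.then(() => inFlight.splice(inFlight.indexOf(p), 1));
  inFlight.push(p);
}

// Promise.race settles as soon as the fastest promise (100 ms) does,
// which is exactly the moment a "slot" frees up.
Promise.race(inFlight).then(() => console.log(inFlight.length)); // 2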

In the code

Put on your glasses and watch closely.

function requestData(urls = [], max = 1, callback = () => {}) {
    let requestArr = [],
        i = 0;

    function toFetch() {
        if (i === urls.length) {
            // All urls have been added, so just return a resolved promise
            return Promise.resolve();
        }

        // Create a request promise with the fetch method
        let fetchItem = fetch(urls[i++]);
        requestArr.push(fetchItem);
        // Each item removes itself from the array once it settles (a microtask)
        fetchItem.then(() => { requestArr.splice(requestArr.indexOf(fetchItem), 1) });

        let result = Promise.resolve();
        if (requestArr.length === max) {
            // When the array reaches its limit, use Promise.race to wait for an "empty slot"
            result = Promise.race(requestArr);
        }
        return result.then(() => toFetch());
    }

    toFetch().then(() => Promise.all(requestArr)).then(() => {
        callback();
    })
}
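
To check that the limit actually holds, one rough way to test it is to swap fetch for a fake request that just waits a while and logs how many requests are in flight (the URL names and delays below are made up):

// Rough sanity check: replace the global fetch with a fake request.
let inFlight = 0;
globalThis.fetch = (url) => {
  inFlight++;
  console.log(`start ${url}, in flight: ${inFlight}`); // should never exceed max
  return new Promise((resolve) =>
    setTimeout(() => { inFlight--; resolve(`response of ${url}`); }, Math.random() * 500)
  );
};

requestData(['a', 'b', 'c', 'd', 'e', 'f'], 2, () => console.log('all done'));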


Upgrading it

In fact, the main idea of this problem is to stop when our task queue is full. With plain Promise objects and .then, that has to be done recursively. But don't we also have await? With await we can pause right at the point where the task queue is full. On to the code.

async function requestData(urls = [], max = 1, callback = () => {}) {
  // Array that collects all of the fetch requests
  const fetchArr = []
  // Array of the requests currently in flight
  const requestArr = []

  for (const item of urls) {
    const p = fetch(item)
    fetchArr.push(p)

    // We only need to throttle when max does not exceed the number of urls
    if (max <= urls.length) {
      const e = p.then(() => {
        // Remove this entry from the in-flight array once the request settles
        requestArr.splice(requestArr.indexOf(e), 1)
      })
      requestArr.push(e)

      if (requestArr.length >= max) {
        // Pause the loop until one of the in-flight requests finishes
        await Promise.race(requestArr);
      }
    }
  }

  Promise.all(fetchArr).then(() => {
    callback()
  })
}




And with that, the problem is solved a little more elegantly.
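
The part that makes this version read more naturally is that await pauses the for...of loop itself. Stripped of fetch, the pattern looks roughly like this (the delays and the max of 2 are made-up values for illustration):

// Isolated sketch of the "pause the loop when it's full" pattern.
async function demo(max = 2) {
  const inFlight = [];
  for (const ms of [500, 400, 300, 200, 100]) {
    const p = new Promise((resolve) => setTimeout(resolve, ms)).then(() => {
      // Remove ourselves once the fake request settles.
      inFlight.splice(inFlight.indexOf(p), 1);
    });
    inFlight.push(p);
    if (inFlight.length >= max) {
      // The loop body stops here until one in-flight task finishes.
      await Promise.race(inFlight);
    }
  }
  await Promise.all(inFlight);
  console.log('all fake requests done');
}
demo();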

conclusion

In fact, I personally feel that this kind of scenario is fairly rare in real applications (though maybe I just haven't seen enough), but it does come up a lot in interviews, and the point seems mainly to be checking your practical grasp of Promise. The cold wind howls and I'm stuck at the office; the overtime doesn't hurt, but the taxi fare does. Fellow working folks, charge!!