Asynchronous jitter
We can think of each asynchronous operation (a Promise or a setTimeout) as a single asynchronous thread. In everyday code these asynchronous threads are mostly left to fend for themselves, which can cause problems.
For example, suppose component A calls a back-end location-resolution service. When component A is used inside a list, the list's for loop fires n concurrent requests. We can think of this as jitter caused by asynchronous behavior.
Asynchronous debouncing
The following pseudocode captures the essentials of an everyday debounce:
const debounce = (func, delay = 500) => {
  let timeout = 0;
  return (...args) => {
    // If not currently blocked
    if (!timeout) {
      // Start blocking
      timeout = setTimeout(() => {
        // Lift the block after the delay
        timeout = 0;
      }, delay);
      // Execute immediately
      func(...args);
    } else {
      // Do nothing
    }
  };
};
As we can see, the main idea of this kind of debounce is to block repeated calls during the delay window. We can think of this as behavior debouncing, often used in event-binding scenarios such as mouse clicks and mouse movements.
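To make the blocking behavior concrete, here is a self-contained sketch (the 100 ms delay and the counter are illustrative, not part of the original): a burst of calls triggers the wrapped function exactly once.

```javascript
// Leading-edge debounce: the first call runs immediately;
// calls arriving within the delay window are dropped.
const debounce = (func, delay = 500) => {
  let timeout = 0;
  return (...args) => {
    if (!timeout) {
      timeout = setTimeout(() => { timeout = 0; }, delay);
      func(...args);
    }
  };
};

let calls = 0;
const track = debounce(() => { calls += 1; }, 100);
track();
track();
track(); // the whole burst counts as a single invocation
```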
However, in an asynchronous operation such as case 1 above, if the asynchronous request made by the component in the for loop is simply blocked, then the data each component needs never arrives, and the follow-up logic can never be triggered by resolve or reject. In this case, we need data debouncing instead.
The process of data debouncing can be roughly divided into the following steps:
- Suspend all subsequent requests after the first asynchronous request;
- Share the data with all pending requests once the first asynchronous request returns;
- Reset the status and data after the pending queue has been drained.
To put it simply, a group of asynchronous requests is handed over to the first one, and the rest simply wait for its result. This is somewhat similar to the relationship between processes and threads. In addition, different HTTP requests have different URLs and return different data, so the requests must be grouped by URL and the shared data must be isolated per URL.
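The three steps above can also be sketched by sharing the in-flight Promise itself, keyed per URL. This is a hypothetical simplification, not the implementation below: `factory` stands for any request function, and `Promise.prototype.finally` requires ES2018 / Node 10+.

```javascript
// One pending Promise per key: the first caller issues the request,
// later callers during its lifetime receive the same Promise.
const shareByKey = (factory) => {
  const inflight = new Map(); // key -> pending Promise
  return (key) => {
    if (!inflight.has(key)) {
      const p = factory(key)
        // Step 3: reset once the request settles
        .finally(() => inflight.delete(key));
      // Step 1: later requests are suspended on this Promise
      inflight.set(key, p);
    }
    // Step 2: every caller shares the same result
    return inflight.get(key);
  };
};
```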
Simple implementation
First simulate an asynchronous request:
let somePromise = (key) => new Promise((resolve, reject) => {
  setTimeout(() => {
    resolve([key, Math.random()]);
  }, 500 + 500 * Math.random());
});
We can group httpGet requests by their URL parameter. Here is a Promise-based debounce built on that idea:
const debouncePromise = (factory, keyIndex = 0, delay = 50) => {
  // Shared data space
  const cache = {};
  return (...args) => new Promise((resolve, reject) => {
    // Look up the cache group for this key
    const key = args[keyIndex];
    let state = cache[key];
    if (!state) {
      state = { status: 0, taskCount: 0 };
      cache[key] = state;
    }
    // The first request acts as the "main thread"
    if (state.status === 0) {
      // Lock the state so later requests are suspended
      state.status = 1;
      factory(...args).then(result => {
        // Settle this request's own Promise
        resolve(result);
        // Share the data
        state.result = result;
        // Unlock the state to notify the waiting requests
        state.status = 2;
        // Nobody is waiting: reset immediately
        if (state.taskCount <= 0) delete cache[key];
      }, err => {
        reject(err);
        state.result = err;
        // Unlock the state to notify the waiting requests
        state.status = 3;
        if (state.taskCount <= 0) delete cache[key];
      });
    }
    // Later requests act as "secondary threads" that only wait for the main thread's result
    else if (state.status === 1) {
      // Task count +1
      state.taskCount += 1;
      const waitingHandle = setInterval(() => {
        // Unlocked state: 2 (resolved) or 3 (rejected)
        if (state.status > 1) {
          // Stop the waiting loop
          clearInterval(waitingHandle);
          // Deliver the shared result
          (state.status === 2 ? resolve : reject)(state.result);
          // Task count -1
          state.taskCount -= 1;
          // The last waiting thread resets the status and data
          if (state.taskCount <= 0) {
            delete cache[key];
          }
        }
      }, delay);
    }
    // Already settled but the queue has not drained yet: share the result directly
    else {
      (state.status === 2 ? resolve : reject)(state.result);
    }
  });
};
Test code:
somePromise('aaaaa').then(res => console.log(res));
somePromise('aaaaa').then(res => console.log(res));
somePromise('aaaaa').then(res => console.log(res));
// Debounce it
somePromise = debouncePromise(somePromise);
somePromise('bbb').then(res => console.log(res));
somePromise('bbb').then(res => console.log(res));
somePromise('bbb').then(res => console.log(res));
somePromise('cc').then(res => console.log(res));
somePromise('cc').then(res => console.log(res));
somePromise('cc').then(res => console.log(res));
// Queued again while the first 'bbb' request is still pending:
// the results should match the 'bbb' group above.
setTimeout(() => {
  somePromise('bbb').then(res => console.log(res));
  somePromise('bbb').then(res => console.log(res));
  somePromise('bbb').then(res => console.log(res));
}, 10);
// Issued after the previous 'bbb' group has settled and been reset:
// the results should differ from the 'bbb' group above.
setTimeout(() => {
  somePromise('bbb').then(res => console.log(res));
  somePromise('bbb').then(res => console.log(res));
  somePromise('bbb').then(res => console.log(res));
}, 1000);
Execution Result:
// Before debouncing, each request returns a different result.
["aaaaa", 0.6301757853487]
["aaaaa", 0.2816070377500479]
["aaaaa", 0.009064307010989259]
// Debounced requests within the delay window share one result.
["bbb", 0.43005402041935437]
["bbb", 0.43005402041935437]
["bbb", 0.43005402041935437]
["cc", 0.956314414078062]
["cc", 0.956314414078062]
["cc", 0.956314414078062]
// Re-queued while still in flight: the shared data is kept.
["bbb", 0.43005402041935437]
["bbb", 0.43005402041935437]
["bbb", 0.43005402041935437]
// Issued after the reset: treated as a new request.
["bbb", 0.7923457809536392]
["bbb", 0.7923457809536392]
["bbb", 0.7923457809536392]
The execution results match expectations.
Example code: codepen.io/marvin_2019…