Introduction

Queues matter in any language: serializing IO, controlling the concurrency of requests, and so on. In JavaScript, which is single-threaded, asynchronous programming is especially important. Yesterday, prompted by an interview question, I used the ideas behind Express middleware to build an asynchronous queue in JS, then extended them to a co implementation based on generators, and finally to asyncToGenerator.
PS: there are no pictures in this article, only a lot of code. The example code is here; you can clone it and try it out.
Asynchronous queue
A common interview question is how to make asynchronous functions run one after another. Callbacks, promises, observers, generators, async/await: any of the JS tools for handling asynchrony can meet this serial requirement. The trouble is that it takes a lot of manual wiring; you have to keep chaining from one task to the next yourself. With promises, for example:
a.then(() => b.then(() => c.then(...)));
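(As an aside, not from the original post: the nesting can also be flattened by reducing over an array of promise-returning functions. A minimal sketch, assuming each task is a function that returns a promise; it still forces every task to speak promises, whereas the queue below only asks for a next callback.)

// hypothetical stand-ins for a, b, c: each returns a promise when called
const makeTask = (x) => () =>
  new Promise((resolve) => setTimeout(() => { console.log(x); resolve(); }, 1000));

const tasks = [makeTask('a'), makeTask('b'), makeTask('c')];

// reduce chains prev.then(() => fn()) without nesting in the source
const runInSeries = (fns) =>
  fns.reduce((prev, fn) => prev.then(() => fn()), Promise.resolve());

runInSeries(tasks); // logs a, b, c, one per second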
Either way, this manual chaining gets unwieldy quickly. It would be much nicer to have a queue: just add asynchronous tasks to it and let the queue drive the execution. So let's settle on the API first: calling queue() gives us a queue whose task list is maintained internally, and asynchronous tasks are added through q.add.

Just as in the Express middleware implementation from my earlier post, each async function is passed a next method, and the queue only moves on when next is called. next is the crucial piece: it advances the queue by one position and runs the following async function. So we need a list to hold the async functions and a cursor to control the order.
Here’s my simple implementation:
const queue = () => {
  const list = [];  // the queue
  let index = 0;    // the cursor

  // the next method
  const next = () => {
    if (index >= list.length - 1) return;
    // cursor + 1
    const cur = list[++index];
    cur(next);
  };

  // add tasks
  const add = (...fn) => {
    list.push(...fn);
  };

  // start execution
  const run = (...args) => {
    const cur = list[index];
    typeof cur === 'function' && cur(next);
  };

  // return an object
  return {
    add,
    run,
  };
};

// generate asynchronous tasks
const async = (x) => {
  return (next) => {   // receives the next function
    setTimeout(() => {
      console.log(x);
      next();          // called when the async task completes
    }, 1000);
  };
};

const q = queue();
const funs = '123456'.split('').map(x => async(x));
q.add(...funs);
q.run();
// prints 1, 2, 3, 4, 5, 6, one per second
Instead of writing a class, I rely on closures. The queue function returns an object containing add, which appends asynchronous tasks to the queue, and run, which starts execution. Inside queue we keep a few variables: list holds the tasks, index records which function the queue is currently on, and, most importantly, the next method moves the cursor forward.

Once run is called the queue starts: we execute the first async function in the list and pass it the next function. That async function then decides when to call next, i.e. when to start the following task. From the outside we cannot know when an asynchronous task finishes; the only way is to agree on a convention by which the task tells the queue it is done, and that convention is the next function we pass in. What async returns is a function known as a thunk, which we'll get to next.
Thunk
Thunks were invented to implement "call by name". Suppose I want to pass the expression x + 1 into function A as an argument, but I don't know when, or even whether, A will actually use it. So instead of evaluating it up front, I wrap the expression in a temporary function, the thunk, and pass that into A to be called when needed.
const x = 1; // some value from the surrounding scope

const thunk = () => {
  return x + 1; // only evaluated when the thunk is called
};

const A = thunk => {
  return thunk() * 2;
};

A(thunk); // 4
Well… yes, it's essentially just a callback function…
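As a side note on why thunks matter for async control flow: any Node-style callback API can be mechanically wrapped into a thunk-returning function. A minimal thunkify sketch of my own (an illustration of the idea, not the real thunkify package):

// turns fn(...args, callback) into fn(...args)(callback)
const thunkify = (fn) => (...args) => (callback) => fn(...args, callback);

// hypothetical usage with a Node-style API:
// const readFileThunk = thunkify(require('fs').readFile);
// readFileThunk('package.json', 'utf8')((err, data) => console.log(err, data));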
Pausing
In fact, as long as a task does not call next, the queue simply stops advancing. For example, we can add a condition inside the async task (in practice this would be asynchronous IO or fault-tolerant handling of a request):
// the queue stays the same
// async gains a condition
const async = (x) => {
  return (next) => {
    setTimeout(() => {
      if (x > 3) {
        console.log(x);
        q.run();   // try the current task again
        return;
      }
      console.log(x);
      next();
    }, 1000);
  };
};

const q = queue();
const funs = '123456'.split('').map(x => async(x));
q.add(...funs);
q.run();
// prints: 1, 2, 3, 4, 4, 4, ... it stays on 4
When we reach the fourth task, x is 4, so we don't continue: we simply return without calling next. It's also possible that the task failed and we want to try again; in that case we can call q.run again, because the cursor still holds the index of the current async task.
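If you go with the q.run() retry, it is worth capping the number of attempts so a task that keeps failing cannot loop forever. A small sketch of that variation (my own addition, assuming the same q and queue as above; the x > 3 check stands in for a failing request):

// async task with a capped retry count (illustrative variation)
const asyncWithRetry = (x, maxRetries = 3) => {
  let attempts = 0; // lives in the closure, so it survives across q.run() retries
  return (next) => {
    setTimeout(() => {
      const failed = x > 3; // stand-in for a failed request / IO error
      if (failed && attempts < maxRetries) {
        attempts += 1;
        q.run();  // re-run the task at the current cursor
        return;
      }
      console.log(x);
      next();     // either it succeeded, or we give up and move on
    }, 1000);
  };
};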
Another option is to add a stop method. The approach above works fine, but the advantage of stop is that you can pause the queue actively instead of baking a condition into the async task. After a pause there are two ways to resume: retry, which re-runs the task the queue was paused on, and goOn, which ignores how the previous task ended and simply moves on to the next one. The code:
const queue = () => {
  const list = [];
  let index = 0;
  let isStop = false;

  const next = () => {
    // the extra isStop check
    if (index >= list.length - 1 || isStop) return;
    const cur = list[++index];
    cur(next);
  };

  const add = (...fn) => {
    list.push(...fn);
  };

  const run = (...args) => {
    const cur = list[index];
    typeof cur === 'function' && cur(next);
  };

  const stop = () => {
    isStop = true;
  };

  const retry = () => {
    isStop = false;
    run();
  };

  const goOn = () => {
    isStop = false;
    next();
  };

  return {
    add,
    run,
    stop,
    retry,
    goOn,
  };
};

const async = (x) => {
  return (next) => {
    setTimeout(() => {
      console.log(x);
      next();
    }, 1000);
  };
};

const q = queue();
const funs = '123456'.split('').map(x => async(x));
q.add(...funs);
q.run();

setTimeout(() => {
  q.stop();
}, 3000);

setTimeout(() => {
  q.goOn();
}, 5000);
Admittedly, this is still interception; it has just moved from the async task into next, using the isStop flag to toggle between true and false and pause the queue. I added two timers: one pauses the queue after 3 seconds, the other resumes it after 5 (please forgive the usual timer imprecision). In theory the queue runs for three seconds, i.e. pauses once the third task has finished, and resumes two seconds later. So it prints up to 3, stops, and two seconds later prints 4, 5, 6.
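For completeness, retry can be driven the same way as goOn; a small usage sketch against the queue above (a separate instance q2, with the same timer imprecision):

// pause after ~3 seconds, then retry the task the cursor is on
const q2 = queue();
q2.add(...'123456'.split('').map(x => async(x)));
q2.run();

setTimeout(() => { q2.stop(); }, 3000);  // roughly after "3" prints
setTimeout(() => { q2.retry(); }, 5000); // re-runs that task, so "3" may print again, then 4, 5, 6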
Those are two approaches; pick whichever fits your scenario.
Parallel execution
What about running tasks in parallel? That's easy: just fire off everything in the queue at once.
// retry and goOn are removed to keep the code short
const queue = () => {
  const list = [];
  let index = 0;
  let isStop = false;
  let isParallel = false;

  const next = () => {
    if (index >= list.length - 1 || isStop || isParallel) return;
    const cur = list[++index];
    cur(next);
  };

  const add = (...fn) => {
    list.push(...fn);
  };

  const run = (...args) => {
    const cur = list[index];
    typeof cur === 'function' && cur(next);
  };

  const parallelRun = () => {
    isParallel = true;
    for (const fn of list) {
      fn(next);
    }
  };

  const stop = () => {
    isStop = true;
  };

  return {
    add,
    run,
    stop,
    parallelRun,
  };
};

const async = (x) => {
  return (next) => {
    setTimeout(() => {
      console.log(x);
      next();
    }, 1000);
  };
};

const q = queue();
const funs = '123456'.split('').map(x => async(x));
q.add(...funs);
q.parallelRun();
// all of 1, 2, 3, 4, 5, 6 print after about one second
I added a separate parallelRun method for parallel execution; I'd rather not fold it into run, keeping each unit of abstraction as small as possible. There is also a new isParallel flag, false by default. Since the tasks will still call next, that flag is used to intercept those calls so the cursor logic doesn't get tangled up.
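One thing parallelRun cannot tell you is when all the tasks have finished. A rough way to add that (my own extension, not part of the queue above) is to count completions and fire a callback once every task has reported in:

// standalone helper: run thunks in parallel and call onDone when they have all finished
const parallelRunAll = (thunks, onDone) => {
  let finished = 0;
  const done = () => {
    finished += 1;
    if (finished === thunks.length && typeof onDone === 'function') onDone();
  };
  for (const fn of thunks) {
    fn(done); // each task reports completion instead of advancing a cursor
  }
};

// usage with the same thunk shape as above
parallelRunAll('123'.split('').map(x => async(x)), () => console.log('all done'));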
That's an asynchronous queue controller built with nothing but thunk functions and a next callback. If you need compatibility you can convert the ES6 code to ES5. It is of course deliberately simple and not suited to complex production scenarios 😆; it's meant to illustrate the idea.
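For reference, a rough ES5 sketch of the basic version (function expressions instead of arrows, arguments instead of rest/spread); a mechanical translation, not battle-tested:

// rough ES5 equivalent of the basic queue (add/run/next only)
function queueEs5() {
  var list = [];
  var index = 0;
  function next() {
    if (index >= list.length - 1) return;
    var cur = list[++index];
    cur(next);
  }
  return {
    add: function () {
      list.push.apply(list, arguments);
    },
    run: function () {
      var cur = list[index];
      if (typeof cur === 'function') cur(next);
    }
  };
}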
Generators and co
The reason for bringing generators in is that they are also used to handle asynchronous callbacks, and they work in a similar way: execution is suspended by default and resumes when next is called, and a yield plays roughly the role of q.add above, adding a task to the queue. So I'm covering them together to broaden the picture. It pays to connect similar ideas like this; one day it suddenly clicks, "oh, so that's what this is, XXX borrowed the idea from YYY", and off you go to study YYY.
A brief generator refresher

A quick review, since some readers don't use generators often and may have forgotten the details.
// a simple example to introduce the usage
function* gen(x) {
  const y = yield x + 1;
  console.log(y, 'here'); // 12 'here'
  return y;
}

const g = gen(1);
const value = g.next().value;    // g.next() returns { value: 2, done: false }
console.log(value);              // 2
console.log(g.next(value + 10)); // { value: 12, done: true }
First of all, a generator function defines an iterative algorithm in its body and, when called, returns an iterator object (see my other article on iterators). So calling gen does not produce a result; it produces the object g. Like any iterator, g advances its cursor each time next is called and returns an object containing value (the value of the expression after yield) and done (whether the iterator has finished). In addition, the value that a yield expression itself evaluates to, such as y in the code above, is whatever argument is passed to the following next call; here that is value + 10, so y is 12. This is useful because the algorithm inside the generator can pick up the result (or processed result) of the previous step.
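Since the returned object also implements the iterable protocol, for...of and the spread operator can consume it by calling next() repeatedly until done is true (note that they ignore the final return value). A quick sketch:

function* counter() {
  yield 1;
  yield 2;
  yield 3;
}

console.log([...counter()]); // [1, 2, 3]
for (const n of counter()) {
  console.log(n);            // 1, 2, 3
}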
The drawback of generators, however, is that they do not run by themselves. TJ Holowaychuk wrote co to execute a generator automatically, i.e. to keep calling next for you. It requires that whatever follows a yield be either a thunk function or a promise, because only then can the tasks run serially, in exactly the same way we implemented queue at the start. There are two ways to implement co, one with thunks and one with promises; let's try both.
Thunk implementation
Remember how queue was implemented: an internal next function keeps moving the cursor forward, and each async task receives next so it can trigger the following one. A generator doesn't expose an index, but it does expose g.next, so in principle we just hand g.next to the async task; the async task here is the thunk produced by the yield expression, i.e. the value returned by g.next. But we can't pass g.next in directly. Why? Because the next thunk has to be pulled out of the return value of g.next, so we still need to define a next function of our own to wrap it.

The code:
const coThunk = function(gen, ...params) {
  const g = gen(...params);
  const next = (...args) => {    // args receives the async result
    const ret = g.next(...args); // args is passed to g.next and becomes the value of the paused yield
    if (!ret.done) {             // check whether the iterator is done
      ret.value(next);           // ret.value is the next thunk function
    }
  };
  next(); // kick things off
};

// asyncFn returns a thunk
const asyncFn = (x) => {
  return (next) => {             // receives next
    const data = x + 1;
    setTimeout(() => {
      next && next(data);
    }, 1000);
  };
};

const gen = function* (x) {
  const a = yield asyncFn(x);
  console.log(a);
  const b = yield asyncFn(a);
  console.log(b);
  const c = yield asyncFn(b);
  console.log(c);
  const d = yield asyncFn(c);
  console.log(d);
  console.log('done');
};

coThunk(gen, 1);
// 2, 3, 4, 5, done
The gen defined here is simple: it takes the initial parameter 1 and then runs asyncFn over and over, i.e. several asynchronous operations in sequence, each depending on the return value of the previous one.
Promise implementation
The idea is exactly the same, except that the call to next moves inside co. Since each yield expression produces a promise object, co can grab it from g.next().value and call next inside its then callback.
// define co
const coPromise = function(gen) {
  // return a promise so the caller can chain then on the result
  return new Promise((resolve, reject) => {
    const g = gen();
    const next = (data) => {     // data is just passed along
      const ret = g.next(data);
      if (ret.done) {
        resolve(data);           // resolve with the result of the last yield
        return;
      }
      ret.value.then((data) => { // ret.value is a promise; data is its resolved value
        next(data);              // call the next step with it
      });
    };
    next();
  });
};

const asyncPromise = (x) => {
  return new Promise((resolve) => {
    setTimeout(() => {
      resolve(x + 1);
    }, 1000);
  });
};

const genP = function* () {
  const data1 = yield asyncPromise(1);
  console.log(data1);
  const data2 = yield asyncPromise(data1);
  console.log(data2);
  const data3 = yield asyncPromise(data2);
  console.log(data3);
};

coPromise(genP).then((data) => {
  setTimeout(() => {
    console.log(data + 1); // 5
  }, 1000);
});
// likewise prints 2, 3, 4, 5
The actual co source is built on exactly these two ideas; it just does more, catching and forwarding errors and letting you yield arrays and objects by converting everything to promises. A yielded thunk function is likewise converted to a promise. If you're interested, go read co's source; it should look very familiar now.
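To make that concrete, here is a simplified sketch (not co's actual source) of two of those extras grafted onto the coPromise above: a rejected promise is thrown back into the generator with g.throw so try/catch works there, and a yielded array is wrapped in Promise.all:

const coPlus = (gen) =>
  new Promise((resolve, reject) => {
    const g = gen();
    const handle = (ret) => {
      if (ret.done) return resolve(ret.value); // resolve with the generator's return value
      // normalize the yielded value: arrays go through Promise.all, everything else through Promise.resolve
      const p = Array.isArray(ret.value) ? Promise.all(ret.value) : Promise.resolve(ret.value);
      p.then(
        (data) => { try { handle(g.next(data)); } catch (e) { reject(e); } },
        (err) => {
          // throw the rejection back into the generator so a try/catch around the yield can see it
          try { handle(g.throw(err)); } catch (e) { reject(e); }
        }
      );
    };
    try { handle(g.next()); } catch (e) { reject(e); }
  });

// usage: an array yield resolves to an array of results; a rejection can be caught inside the generator
coPlus(function* () {
  const [a, b] = yield [Promise.resolve(1), Promise.resolve(2)];
  console.log(a + b); // 3
  try {
    yield Promise.reject(new Error('boom'));
  } catch (e) {
    console.log(e.message); // boom
  }
});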
async/await
async/await is also implemented on top of generators; it just comes pre-wrapped. To convert async/await into generator/yield, change each await into a yield, put the body into a generator function, and replace the call to the async function with coPromise(generator).
const asyncPromise = (x) => {
  return new Promise((resolve) => {
    setTimeout(() => {
      resolve(x + 1);
    }, 1000);
  });
};

async function fn() {
  const data = await asyncPromise(1);
  console.log(data);
}
fn();

// the generator equivalent; coPromise is the implementation above
function* gen() {
  const data = yield asyncPromise(1);
  console.log(data);
}
coPromise(gen);
That is essentially how asyncToGenerator works; in fact, it is what Babel does when it transpiles async functions.
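A rough sketch of what such a helper can look like when built on the coPromise above (a simplification of the idea, not Babel's actual asyncToGenerator output):

// wrap a generator function so that calling it behaves like calling an async function
const asyncToGenerator = (genFn) =>
  (...args) =>
    coPromise(function* () {
      return yield* genFn(...args); // delegate, so every yielded promise flows through coPromise
    });

// usage: fn2 behaves like the async fn above
const fn2 = asyncToGenerator(function* () {
  const data = yield asyncPromise(1);
  console.log(data); // 2
  return data;
});
fn2();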
Wrapping up
Borrowing the middleware ideas from Express, I first implemented queue, an asynchronous queue that JS code needs all the time, then moved on to a simple coThunk, and finally swapped the thunks for promises. Asynchronous control flow is central to JS; even when you reach for a ready-made solution, thinking through how it is implemented underneath will help you keep learning.
You're welcome to star my personal blog: github.com/sunyongjian… 😜