A history of asynchronous solutions in JavaScript
1. Callback function
Callbacks are the original asynchronous solution: you pass in a function, and it is executed once the asynchronous operation has finished.
const fs = require('fs')

fs.readFile('./1.txt', 'utf-8', (err, data) => {
  if (err) return console.error(err)
  console.log('here is data', data)
})
There are several disadvantages to this:
- Too many nested callbacks produce "callback hell", and the code becomes hard to read (a short sketch of this nesting follows this list)
- Error handling is difficult: every callback has to check its own err argument
- Nested callbacks run serially, one after another, which is inefficient
- The asynchronous result cannot be handed back with return; it is only available inside the callback
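To make the first two points concrete, here is a minimal sketch of the nesting that quickly turns into callback hell; it extends the 1.txt example above, and the third file name is made up for illustration:

const fs = require('fs')

// each read can only start inside the previous callback,
// so the reads run one after another and the nesting keeps growing
fs.readFile('./1.txt', 'utf-8', (err, data1) => {
  if (err) return console.error(err)
  fs.readFile('./2.txt', 'utf-8', (err, data2) => {
    if (err) return console.error(err)
    fs.readFile('./3.txt', 'utf-8', (err, data3) => {
      if (err) return console.error(err)
      console.log(data1, data2, data3)
    })
  })
})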
2. Callback function + publish subscribe
In addition to nesting callbacks, we can use a publish-subscribe model. The idea is to fix the serial inefficiency by letting multiple asynchronous tasks run at the same time and being notified once they have all finished.
class eventBus {
  constructor() {
    this.bus = {}
  }
  on(eventName, callback) { // listen for an event
    this.bus[eventName] = callback
  }
  emit(eventName, data) { // trigger an event
    this.bus[eventName](data)
  }
}

const event = new eventBus()
event.on('getData', (function () {
  const collection = [] // collects the data returned asynchronously
  return function (data) {
    collection.push(data)
    if (collection.length === 2) {
      console.log('here is all data! ', collection) // all tasks completed
    }
  }
}()))
Simply emit the getData event in the callback of each asynchronous task.
Each emit of the getData event makes the eventBus store the data through the callback bound with on.
When the number of stored results reaches the expected count, all of the asynchronous tasks have finished.
fs.readFile('./1.txt', 'utf-8', (err, data) => {
  if (err) return console.error(err)
  event.emit('getData', data)
})
fs.readFile('./2.txt', 'utf-8', (err, data) => {
  if (err) return console.error(err)
  event.emit('getData', data)
})
Writing this way does allow for asynchronous concurrency, but the code is a bit more complicated.
3. Generator
With the arrival of generators, a function no longer has to run to completion in one go: it can be split into several execution segments and hand over control in between.
Let’s start with the syntax of generator.
function* generator() {
  console.log(1)
  let b = yield 'a'
  console.log('b', b)
  let c = yield b
  console.log('c', c)
  return c
}
Calling the function does not run its body. Instead it returns an iterator object, and each call to the iterator's next method runs the function until the next yield (or until the function ends).
let go = generator()
console.log(go.next())          // logs 1 from inside the generator, then { value: 'a', done: false }
console.log(go.next('b_value')) // logs 'b b_value', then { value: 'b_value', done: false }
console.log(go.next('c_value')) // logs 'c c_value', then { value: 'c_value', done: true }
The overall process is as follows:
The first time next is called, the function runs until yield 'a' and pauses; that call returns an object of the form { value: 'a', done: false }. The argument passed to the following next call becomes the value that yield 'a' evaluates to, i.e. the value assigned to b. The same pattern repeats for c, and the final return sets done to true.
Because the generator hands back control like this, we can write asynchronous code that reads almost like synchronous code:
function* generator() {
  let b = yield setTimeout(() => { console.log(1); go.next() }, 100)
  let c = yield setTimeout(() => { console.log(2); go.next() }, 0)
  return c
}
let go = generator()
go.next()
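Another way to bridge generators and asynchronous results is to yield promises and feed each resolved value back in through next. The sketch below does this by hand; it assumes Node's fs.promises API, since the examples above use the callback-style fs module:

const fs = require('fs').promises

function* readFiles() {
  const a = yield fs.readFile('./1.txt', 'utf-8')
  const b = yield fs.readFile('./2.txt', 'utf-8')
  return [a, b]
}

// drive the iterator by hand: wait for each yielded promise,
// then feed its resolved value back in through next
const it = readFiles()
it.next().value
  .then(a => it.next(a).value)
  .then(b => console.log(it.next(b).value)) // [contents of 1.txt, contents of 2.txt]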
In practice we do not keep calling next by hand like this; instead the co library is used together with a generator to drive the execution automatically.
Here we implement a simple co library ourselves.
The expected usage of co is as follows:
co(generator())
  .then(
    data => console.log(data),
    err => console.log(err)
  )
We pass the result of calling the generator function (that is, its iterator) to co. The co library then calls next for us automatically, treats every yielded value as a promise, and passes each resolved value back into the following call to next. co itself returns a Promise; the value returned by the generator function (undefined by default if there is no return) becomes that promise's resolved value and is delivered to .then.
Now let's write our co library:
function co(it) {
  return new Promise((resolve, reject) => {
    function next(input) {
      const { value, done } = it.next(input)
      if (done) {
        return resolve(value)
      }
      // treat each yielded value as a promise
      Promise.resolve(value).then(data => {
        try {
          next(data) // pass the resolved value to the next call to next
        } catch (e) {
          reject(e)
        }
      }, reject) // if a yielded promise rejects, reject the outer promise
    }
    try {
      next()
    } catch (e) {
      reject(e)
    }
  })
}
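With that in place, here is a quick usage sketch, under the assumption that Node's fs.promises API is available and that ./1.txt and ./2.txt exist as in the earlier examples:

const fs = require('fs').promises

function* readBoth() {
  const a = yield fs.readFile('./1.txt', 'utf-8')
  const b = yield fs.readFile('./2.txt', 'utf-8')
  return [a, b] // this becomes the resolved value of the promise co returns
}

co(readBoth())
  .then(
    data => console.log('here is all data!', data),
    err => console.error(err)
  )

The flat generator body replaces both the nested callbacks and the manual .then chaining shown earlier.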
4. async await
With the co + generator groundwork above, async/await is easy to understand.
async/await is syntactic sugar for generator + co, and is touted as the ultimate solution to asynchrony.
async function asyncFn() {
  const a = await new Promise((resolve, reject) => {
    console.log('a')
    resolve('a')
  })
  console.log('haha')
  return 1
}
asyncFn()
  .then(d => console.log(d))
The print result is as follows:
'a'
'haha'
1
Let's get back to the principle of async/await: an async function is essentially syntactic sugar for a generator function whose next() is called for us automatically, and await is syntactic sugar for yield. The value of the expression after await is wrapped in a promise, and the following call to next happens inside that promise's then callback. That is why 'a' is printed before 'haha' above.
In short, whether or not the expression after await is asynchronous, it is evaluated immediately; its value is wrapped in a promise, and next runs inside that promise's .then.
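To make the equivalence concrete, here is a sketch of what asyncFn above roughly corresponds to when written with the co function from earlier (a conceptual approximation, not what the engine literally generates):

function* asyncFnGen() {
  const a = yield new Promise((resolve, reject) => {
    console.log('a')
    resolve('a')
  })
  console.log('haha')
  return 1
}

// roughly what asyncFn does under the hood: drive the generator with co
co(asyncFnGen())
  .then(d => console.log(d)) // prints 'a', 'haha', then 1, just like asyncFn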
That's all.
Thanks for reading this far. More posts can be found on my personal blog.