Why asynchrony?
- Parts of a program run now, and parts run in the future.
- The relationship between the parts running now and the parts running later is at the heart of asynchronous programming.
- Common scenarios in JS: timers, mouse clicks, Ajax requests — code is written now but executed in the future through an asynchronous mechanism.
Event loop
// eventLoop is an array that acts as a queue (first-in, first-out)
var eventLoop = [];
var event;
while (true) {
  if (eventLoop.length > 0) {
    event = eventLoop.shift();
    try {
      event();
    } catch (err) {
      reportError(err);
    }
  }
}
- Each round of the loop is called a tick. On each tick, if there is an event waiting in the queue, it is removed from the queue and executed. That event is your callback function.
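A quick illustration of ticks, using only standard setTimeout behavior: a callback scheduled with setTimeout runs on a later tick, after all synchronous code on the current tick has finished.

```javascript
const order = [];

order.push("sync 1");

// This callback goes onto the event loop queue and runs on a later tick
setTimeout(() => {
  order.push("callback");
  console.log(order); // ["sync 1", "sync 2", "callback"]
}, 0);

// Synchronous code on the current tick always finishes first
order.push("sync 2");
```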
Concurrent cooperation
- Process one batch asynchronously, then yield back to the event queue
- The goal is to take a long-running "process" and break it into steps or batches, so that other concurrent "processes" have a chance to interleave their own operations on the event loop queue.
var res = [];
// response(..) gets the array returned from the Ajax call
function response(data) {
  res = res.concat(
    data.map(function (val) {
      return val * 2;
    })
  );
}
// ajax(..) is an Ajax function provided by some library
ajax("http://some.url.1", response);
ajax("http://some.url.2", response);
- Problem: if the interface returns a huge dataset, say tens of millions of records (a real interface would be batched and paginated), the synchronous map blocks the event loop
- Solution: build a cooperative concurrency system that does not hog the event loop queue: process the results asynchronously in batches, returning to the event queue after each batch so other waiting events get a chance to run
var res = [];
function response(data) {
  // Take 1000 entries at a time from data
  let chunk = data.splice(0, 1000);
  res = res.concat(
    chunk.map(function (val) {
      return val * 2;
    })
  );
  // If there is still data left after taking this batch of 1000
  if (data.length > 0) {
    setTimeout(() => {
      // Process the remaining data asynchronously and recursively until data is empty
      response(data);
    }, 0);
  }
}
ajax("http://some.url.1", response);
ajax("http://some.url.2", response);
- setTimeout(() => {}, 0) performs asynchronous scheduling: it puts the task at the end of the current event loop queue
Concurrent interface requests
- Problem: some interfaces return data in batches according to the parameters passed in; how do we request such an interface asynchronously, batch by batch?
- Solution: following the idea above, break the large task up through asynchronous recursion
// Given a list of ids, the interface returns the detailed info for those ids
let list = [1, 2, 4, 6, 5, 9 /* ... */];
let result = [];
function task(list) {
  // Process 50 ids at a time
  let chunk = list.splice(0, 50);
  ajax("http://id_detail", { ids: chunk }, (res) => {
    // The result of each request is pushed into result
    result.push(res);
  });
  if (list.length > 0) {
    // Recursively process the remaining ids
    task(list);
  }
}
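A runnable sketch of the same pattern; fakeAjax is a hypothetical stand-in for the real batch interface, and the names and the 50-per-batch size are assumptions for illustration.

```javascript
let result = [];

// fakeAjax is a stand-in for the real batch interface:
// it "returns" detail records for the requested ids on a later tick
function fakeAjax(url, params, cb) {
  setTimeout(() => cb(params.ids.map((id) => "detail-" + id)), 0);
}

function task(list) {
  // Process 50 ids per request
  let chunk = list.splice(0, 50);
  fakeAjax("http://id_detail", { ids: chunk }, (res) => {
    result.push(res);
  });
  if (list.length > 0) {
    // Recursively issue the next batch
    task(list);
  }
}

task(Array.from({ length: 120 }, (_, i) => i));

// Once the fake responses arrive: three batches of 50, 50, and 20 ids
setTimeout(() => {
  console.log(result.map((batch) => batch.length)); // [50, 50, 20]
}, 50);
```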
What are the disadvantages of the callback function?
What is callback hell
Example 1:
listen("click".function handler() {
setTimeout(() = > {
ajax("http://some.url.1".function response(test){
if(test == "hello") {
handler();
} else{ request(); }})},500)})Copy the code
The example above produces a chain of three nested asynchronous calls, each function representing one step in an asynchronous sequence (a task, or "process"). Code like this is called callback hell.
The brain works in a sequential way
doA(function () {
  doB();
  doC(function () {
    doD();
  });
  doE();
});
doF();
The human brain works sequentially, requiring rapid context switching when performing multiple asynchronous tasks
Let's rewrite example 1:
listen("click",handler);
function handler() {
setTimeout(request,500);
}
function request() {
ajax("http://some.url.1",response);
}
function response(text) {
if(text == 'hello') { handler(); }esle { request(); }}Copy the code
Disadvantages of callbacks
- You have to jump around the code to follow the flow
- To chain steps 2, 3, and 4 so they execute in order using only callbacks, the only way the cost stays acceptable is to hardcode step 2 into step 1, and step 3 into step 2
- Hardcoding makes the code fragile: if step 2 fails, execution never reaches step 3
- Once the code has to spell out every possible event and path, it becomes very complex
Trust issues
- Trust problems arise when you pass your own code into a third-party tool — for example, passing your own callback to a third-party payment/analytics platform:
analytics.trackPurchase(purchaseData, function () {
  chargeCreditCard();
  displayThankyouPage();
});
- Problem 1: under some unusual circumstance, the callback passed to the third-party analytics library's trackPurchase may be executed multiple times
- Solution: to prevent the third party from invoking your callback more than once, create a latch that guards against repeated calls
var tracked = false;
analytics.trackPurchase(purchaseData, function () {
  if (!tracked) {
    tracked = true;
    chargeCreditCard();
    displayThankyouPage();
  }
});
- Problem 2: the callback is called too early, before tracking completes
- Problem 3: the callback is called too late, or never called at all
- The callback is called too few or too many times (as above!)
- The required environment/parameters are not passed to your callback
- Possible errors or exceptions are swallowed
Now you have a better idea of what callback hell is.
- How do we solve problem 3 — what if the callback is never called? You can set a timeout that cancels the event:
function timeoutify(fn, delay) {
  var intv = setTimeout(function () {
    // The timer fired: report a timeout error and clear the handle
    fn(new Error("Timeout"));
    intv = null;
  }, delay);
  return function () {
    // Only run fn if the timeout has not fired yet
    if (intv) {
      clearTimeout(intv);
      fn.apply(this, arguments);
    }
  };
}
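A runnable sketch of timeoutify in use; fakeAjax here is a hypothetical stand-in that invokes its callback after a given latency.

```javascript
function timeoutify(fn, delay) {
  var intv = setTimeout(function () {
    fn(new Error("Timeout"));
    intv = null;
  }, delay);
  return function () {
    if (intv) {
      clearTimeout(intv);
      fn.apply(this, arguments);
    }
  };
}

const calls = [];
function done(err, data) {
  calls.push(err ? "timeout" : data);
}

// fakeAjax is a stand-in: it invokes the callback after `latency` ms
function fakeAjax(latency, cb) {
  setTimeout(() => cb(null, "ok"), latency);
}

// Responds in 10 ms, inside the 100 ms budget: the normal result wins
fakeAjax(10, timeoutify(done, 100));
// Responds in 300 ms, past the 100 ms budget: the Timeout error wins
fakeAjax(300, timeoutify(done, 100));

setTimeout(() => {
  console.log(calls); // ["ok", "timeout"]
}, 500);
```

Note that the late response at 300 ms still calls the wrapped function, but `intv` is already null, so it is silently ignored.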
What is a Promise
- A Promise is an asynchronous solution to the unreadable, unmaintainable code produced by callbacks and nested callbacks
- The advantage of promises is that you can write asynchronous code in the same style as synchronous code
- The downside is that, internally, promises are still implemented with nested asynchronous callbacks
What are the features of promises
- The executor function passed to a Promise runs immediately
- A Promise has three states: Pending, Fulfilled, and Rejected. Once the state changes, it cannot change again
- resolve is called on success and reject on failure; the value of the previous Promise is passed on to the next one
- Promise has a then method that takes two callbacks: the first for success, the second for failure
- Each then returns a new Promise
- If the value passed to resolve in the previous Promise is itself a Promise, it is unwrapped first; if it is a normal value, it is passed directly to the success callback
- Value penetration occurs when then's callback returns a normal value (not a Promise, not a throw): that value passes straight through to the next then's success callback
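The rules above can be seen in one short chain, using only standard Promise behavior:

```javascript
const log = [];

new Promise((resolve) => {
  // The executor runs immediately; resolving fixes the state at Fulfilled
  resolve(1);
})
  .then((v) => {
    log.push(v); // 1
    // Resolving with a Promise: it is unwrapped before the next then
    return Promise.resolve(v + 1);
  })
  .then((v) => {
    log.push(v); // 2
    // A normal return value is passed straight to the next then
    return v + 1;
  })
  .then((v) => {
    log.push(v); // 3
    console.log(log); // [1, 2, 3]
  });
```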
How promises are implemented
juejin.cn/post/690233…
The use of the generator
- Generator is an asynchronous programming solution provided by ES6
A basic iterator implementation
const iterable = { 0: "a", 1: "b", 2: "c", length: 3 };
iterable[Symbol.iterator] = function () {
  let index = 0;
  return {
    next: () => {
      return { value: this[index], done: index++ === this.length };
    },
  };
};
console.log([...iterable]); // ["a", "b", "c"]
// If the object's [Symbol.iterator] property is not set, an error is thrown:
// Uncaught TypeError: iterable is not iterable
- By overriding the object's Symbol.iterator property, we give it a next method that returns an object with value and done properties; when done is true, iteration is complete
Implementation with generator yield
- A generator function is declared with an asterisk * after the function keyword
How can such an array-like object be spread with the spread syntax?
// You can make the object iterable by defining its Symbol.iterator as a generator
let iterable = {
  0: "a",
  1: "b",
  2: "c",
  [Symbol.iterator]: function* () {
    yield "a";
    yield "b";
    yield "c";
  },
};
console.log([...iterable]); // ["a", "b", "c"]
This can also be done with a loop:
const iterable = { 0: "a", 1: "b", 2: "c", length: 3 };
iterable[Symbol.iterator] = function* () {
  let index = 0;
  while (index !== this.length) {
    yield this[index++];
  }
};
console.log([...iterable]); // ["a", "b", "c"]
Pass the results of asynchronous A’s execution to asynchronous B
const fs = require("fs").promises;
function* myReadFile() {
  const a = yield fs.readFile("a.txt", "utf8");
  // The contents of a.txt are used as the path for the second read
  const b = yield fs.readFile(a, "utf8");
  console.log(b);
}
// Drive the yields manually
let it = myReadFile();
let { value, done } = it.next();
value.then((data) => {
  // Pass the value just read into the next step
  let { value, done } = it.next(data);
  value.then((data) => {
    let { value, done } = it.next(data);
    console.log("Result obtained", value);
  });
});
A simple implementation of the co library
// With two yields as above, next has to be called manually and the nesting deepens with every extra yield
// Can we pass in the iterator and get the final result automatically?
// Pass each result on to the next then
function co(it) {
  return new Promise((resolve, reject) => {
    // For asynchronous iteration, define a next function and call it manually the first time
    function next(data) {
      let { value, done } = it.next(data);
      // As long as this is not done, next needs to be called again recursively
      if (!done) {
        // Wrap the value with Promise.resolve() in case the yielded value is a plain value
        Promise.resolve(value).then((data) => next(data), reject);
      } else {
        // done is true: iteration is complete, resolve with the final value
        resolve(value);
      }
    }
    // The first call passes undefined by default
    next();
  });
}
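A runnable sketch of co driving a generator; delay is a hypothetical promise-returning helper standing in for any async call.

```javascript
function co(it) {
  return new Promise((resolve, reject) => {
    function next(data) {
      let { value, done } = it.next(data);
      if (!done) {
        // Wrap in Promise.resolve in case a plain value was yielded
        Promise.resolve(value).then((d) => next(d), reject);
      } else {
        resolve(value);
      }
    }
    next();
  });
}

// delay stands in for any promise-returning async call
const delay = (ms, val) => new Promise((res) => setTimeout(() => res(val), ms));

function* steps() {
  const a = yield delay(10, 1); // a === 1
  const b = yield delay(10, a + 1); // b === 2
  return a + b; // final value resolved by co
}

co(steps()).then((result) => {
  console.log(result); // 3
});
```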
Implementing a generator yourself
- How is a generator driven? By calling next
- Each call to next passes the result of the previous step back into the generator
function mygenerator() {
  let context = {
    prevPointer: 0,
    nextPointer: 0,
    done: false, // whether iteration has finished
    myVal: 10,
    stop() {
      this.done = true; // calling stop sets done to true
    },
  };
  return {
    next() {
      return {
        value: $gen(context),
        done: context.done,
      };
    },
  };
}

// Each call moves the pointer down one case; when no case matches, stop() ends the iteration
function $gen(context) {
  switch ((context.prevPointer = context.nextPointer)) {
    case 0:
      context.nextPointer = 1; // move the pointer down one each time
      return (context.myVal += 1);
    case 1:
      context.nextPointer = 2;
      return (context.myVal += 2);
    case 2:
      context.nextPointer = 3;
      return (context.myVal += 3);
    case 3:
      context.nextPointer = 4;
      return (context.myVal += 4);
    default:
      context.stop();
  }
}

// Usage: each next() advances one case, then done flips to true
const it = mygenerator();
it.next(); // { value: 11, done: false }
it.next(); // { value: 13, done: false }
it.next(); // { value: 16, done: false }
it.next(); // { value: 20, done: false }
it.next(); // { value: undefined, done: true }
What does Babel ultimately translate yield into?
function *read(){
const a = yield 1;
console.log(a);
const b = yield 2;
console.log(b);
const c = yield 3;
console.log(c);
}
Translating it with Babel (www.babeljs.cn) gives:
"use strict";
var _marked = /*#__PURE__*/regeneratorRuntime.mark(read);
function read() {
var a, b, c;
return regeneratorRuntime.wrap(function read$(_context) {
while (1) {
switch (_context.prev = _context.next) {
case 0:
_context.next = 2;
return 1;
case 2:
a = _context.sent;
console.log(a);
_context.next = 6;
return 2;
case 6:
b = _context.sent;
console.log(b);
_context.next = 10;
return 3;
case 10:
c = _context.sent;
console.log(c);
case 12:
case "end":
return _context.stop();
}
}
}, _marked);
}
The output boils down to regeneratorRuntime's wrap function. Here is a simple first version of it:
const regeneratorRuntime = {
  mark(fn) {
    return fn;
  },
  wrap(iteratorFn) {
    const context = {
      next: 0, // index of the next case
      done: false,
      stop() {
        context.done = true;
      },
      sent: null,
    };
    let it = {};
    it.next = function (value) {
      context.sent = value;
      let v = iteratorFn(context);
      return {
        value: v,
        done: context.done,
      };
    };
    return it;
  },
};
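To see the simple version work, we can pair it with a hand-written state machine in the same shape Babel produces (the case numbers and read$ name below are chosen for illustration):

```javascript
const regeneratorRuntime = {
  mark(fn) {
    return fn;
  },
  wrap(iteratorFn) {
    const context = {
      next: 0, // index of the next case
      done: false,
      stop() {
        context.done = true;
      },
      sent: null,
    };
    let it = {};
    it.next = function (value) {
      context.sent = value;
      let v = iteratorFn(context);
      return { value: v, done: context.done };
    };
    return it;
  },
};

// A hand-written state machine in the shape Babel emits for:
//   function* read() { yield 1; yield 2; }
function read$(_context) {
  switch ((_context.prev = _context.next)) {
    case 0:
      _context.next = 2;
      return 1;
    case 2:
      _context.next = 4;
      return 2;
    case 4:
    case "end":
      return _context.stop();
  }
}

const it = regeneratorRuntime.wrap(regeneratorRuntime.mark(read$));
console.log(it.next()); // { value: 1, done: false }
console.log(it.next("a")); // { value: 2, done: false }
console.log(it.next("b")); // { value: undefined, done: true }
```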
The ultimate async + await solution
What does Babel eventually convert async + await into?
async function first() {
  const a = await setTimeout(() => {
    console.log("i am first");
  }, 1000);
  console.log("i am later");
}
After the transformation
"use strict";
// The above function calls first, which actually calls _first
function first() {
return _first.apply(this.arguments);
}
function asyncGeneratorStep(gen, resolve, reject, _next, _throw, key, arg) {
try {
var info = gen[key](arg); // Gen ['next'](arg)
var value = info.value;
} catch (error) {
reject(error); return;
}
// Resolve if done in the context is true
if (info.done) {
resolve(value);
} else {
// If not done, wrap the current result as Promies calls the then method
Promise.resolve(value).then(_next, _throw); }}function _asyncToGenerator(fn) {
return function () {
var self = this, args = arguments;
return new Promise(function (resolve, reject) {
var gen = fn.apply(self, args); function _next(value) {
asyncGeneratorStep(gen, resolve, reject, _next, _throw, "next", value);
}
function _throw(err) {
asyncGeneratorStep(gen, resolve, reject, _next, _throw, "throw", err);
}
_next(undefined);
});
};
}
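These two helpers are enough to run a real generator as an async function. A runnable sketch (the helpers are repeated so it is self-contained; delay is a hypothetical promise-returning helper):

```javascript
function asyncGeneratorStep(gen, resolve, reject, _next, _throw, key, arg) {
  try {
    var info = gen[key](arg);
    var value = info.value;
  } catch (error) {
    reject(error);
    return;
  }
  if (info.done) {
    resolve(value);
  } else {
    Promise.resolve(value).then(_next, _throw);
  }
}

function _asyncToGenerator(fn) {
  return function () {
    var self = this,
      args = arguments;
    return new Promise(function (resolve, reject) {
      var gen = fn.apply(self, args);
      function _next(value) {
        asyncGeneratorStep(gen, resolve, reject, _next, _throw, "next", value);
      }
      function _throw(err) {
        asyncGeneratorStep(gen, resolve, reject, _next, _throw, "throw", err);
      }
      _next(undefined);
    });
  };
}

const delay = (ms, v) => new Promise((r) => setTimeout(() => r(v), ms));

// Equivalent to: async function run() { const a = await delay(10, 1); ... }
const run = _asyncToGenerator(function* () {
  const a = yield delay(10, 1);
  const b = yield delay(10, a + 1);
  return a + b;
});

run().then((result) => {
  console.log(result); // 3
});
```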
// _first wraps an iterator; each step moves the context pointer down one case
function _first() {
_first = _asyncToGenerator( /*#__PURE__*/regeneratorRuntime.mark(function _callee() {
var a;
return regeneratorRuntime.wrap(function _callee$(_context) {
while (1) {
// Iteration ends when execution reaches the "end" case and _context.stop() runs
switch (_context.prev = _context.next) {
case 0:
_context.next = 2;
return setTimeout(function () {
console.log("i am first");
}, 1000);
case 2:
a = _context.sent;
console.log("i am later");
case 4:
case "end":
return _context.stop();
}
}
}, _callee);
}));
return _first.apply(this, arguments);
}
You can strengthen the fundamentals by working through the questions here: juejin.cn/post/684490…