It has been a while since my last post. A new business project has been eating into my writing time lately.

This piece on how Async is implemented has been sitting in my drafts, on and off, for a long time. I finally wrapped it up on a dark and windy Saturday night.

Introduction

Most front-end developers treat Async/Await as the go-to solution for asynchronous tasks, whether in an interview or in everyday business development.

However, many of us only know that Async functions are syntactic sugar built on Promise and Generator functions, without knowing how they are actually implemented.

When it comes to the internal implementation of Async functions in JavaScript, most developers have never looked into the process, or even thought about how this so-called syntactic sugar is actually wired into JavaScript.

Don’t worry: I’ll walk you through the polyfill that implements Async syntax in older browsers, and show how the so-called Async syntax is built on Promises and Generators.

We will work through the principles behind Async functions from the following angles:

  • 🌟 A quick review of Promises, from the basics to the source code, to get you comfortable with how Promise is used.

  • 🌟 What Generator functions are, and their basic features.

  • 🌟 How Generator is implemented, and how Babel polyfills Generator functions in older browsers.

  • 🌟 How Generator functions serve as a general-purpose solution to asynchrony.

  • 🌟 The basic principle behind the open-source co library.

  • 🌟 Why the Async/Await function is called syntactic sugar, and how it is implemented.

I believe that after reading this article, you will truly “know what it is and why it is so” when it comes to Async/Await.

Promise

As mentioned above, Async/Await syntax is essentially syntactic sugar based on Promise and Generator functions.

I won’t go too deep into the basics and principles of Promise in this article; there are already excellent articles on the web that cover them. If you want to dig into Promise, check out the following:

  • 🌟 JavaScript Promise MDN Docs

MDN provides detailed instructions and examples for basic Promise usage. If you are unfamiliar with Promise, I strongly recommend consolidating the basics there first.

  • 🌟 Promise A+ specification

The Promises/A+ specification is the implementation standard. Different browsers and environments have their own Promise implementations, but they all follow this same specification.

I have implemented a full version of the Promise specification at ➡️; those interested can review the code themselves to reinforce how Promise works.

  • 🌟 V8 Promise source code comprehensive interpretation

For the various edge cases of Promise and a deeper look at how Promise is implemented, I strongly suggest that readers interested in going further read this article.

Generator function

Generator functions and iterators may not be as common as Promise in most developers’ everyday applications.

So I’m going to start with the basics of generators.

The Generator concept

Introduction to Generator Basics

A Generator function is an ES6 implementation of a coroutine, with the most important feature being the ability to hand over execution of the function (i.e. the effect of suspending the execution of the function).

function* gen() {
  yield 1;
  yield 2;
  yield 3;
}

let g = gen();

g.next(); // { value: 1, done: false }
g.next(); // { value: 2, done: false }
g.next(); // { value: 3, done: false }
g.next(); // { value: undefined, done: true }
g.next(); // { value: undefined, done: true }

The function above is an example of a Generator function. We create a Generator function named gen by adding an * after the function keyword.

Calling the created Generator function returns a Generator {} instance object.

Those of you who are new to generator functions may be a little confused by the examples above. What is a Generator instance object, what does the yield keyword in a function do, and how should we use it?

Don’t worry, let’s take the puzzle one step at a time.

You can simply think of the returned generator object g as having a structure like this:

{
    next: function () {
        return {
            done: Boolean, // done indicates whether the generator function has finished executing
            value: VALUE,  // value represents the value returned by this call to the generator function
        };
    }
}
  • First, we call the generator function with let g = gen(), which creates a generator object g; g now has the next method with the structure above.

From here on, we will call the returned generator object g and the generator function gen; the generator object g is what calling the generator function gen returns.

  • After that, each call to the generator object’s next method returns an object of the form { value: VALUE, done: Boolean }.

Each call to the next method of the generator object returns an object of the above type:

done indicates whether the generator function has finished, and value is the value yielded by the generator function.

For those of you who are not familiar with this process, let’s take a look at the execution process of the above functions in detail:

  • First, the generator function gen() is called, returning the generator object g.

  • Second, the returned generator object g has a next method.

  • Whenever we call the g.next() method, the generator function resumes from where it last stopped and runs until it hits the yield keyword.

    • The yield keyword pauses execution of the function, and the yielded value becomes the value of that next() call’s result.

    • Also, if this call to g.next() causes the generator function to run to completion, done becomes true; otherwise it stays false.

For example, when we call let g = gen(), we get back a generator object that has a next method.

Then, the first time the g.next() method is called, the generator function gen starts executing. It runs until it hits the yield keyword and pauses at the yield 1 statement, so 1 becomes the value of the returned object.

And since the generator function gen is not finished, done should be false at this point. So the first call to g.next() returns { value: 1, done: false }.

After that, when we call the g.next() method a second time, the function resumes from where it was interrupted; that is, it continues toward the yield 2 statement.

When yield 2 is reached, execution pauses again. The second g.next() returns { value: 2, done: false }, because the function is still not finished and the yield is immediately followed by 2.

The same goes for yield 3; the logic is identical to the previous two calls.

It is important to note that when we call g.next() the fourth time, the generator function resumes after the last yield and runs to the end of its body. So done becomes true because the function has finished, and since the function does not return a value, value is undefined.

That is the basic behavior of a Generator function, and it is quite simple in nature:

Calling a generator function returns a generator object; each call to the generator object’s next method runs the function until the next yield keyword pauses execution, and returns a { value: VALUE, done: Boolean } object.

I’ve spent a bit of space describing this simple process. If the demo above still doesn’t make sense to you, please pause here and make sure you understand it before moving on.

Return value of the Generator function

Now that you’ve mastered basic Generator functions and the yield keyword, let’s strike while the iron is hot and tackle some more advanced Generator syntax.

As usual, let’s start with this code:

function* gen() {
  const a = yield 1;
  console.log(a, 'this is a');
  const b = yield 2;
  console.log(b, 'this is b');
  const c = yield 3;
  console.log(c, 'this is c');
}

let g = gen();

g.next(); // { value: 1, done: false }
g.next('param-a'); // { value: 2, done: false }
g.next('param-b'); // { value: 3, done: false }
g.next('param-c'); // { value: undefined, done: true }

// The console will print:
// param-a this is a
// param-b this is b
// param-c this is c

Here we focus on what happens when the next method of a Generator object is called with an argument. Understanding the arguments of the next() method is a key implementation idea for subsequent Generator solutions to asynchrony.

As we mentioned earlier, the yield keyword in a generator function suspends the function. In simple terms, the first time we call the g.next() method, the function executes up to yield 1 and is then suspended.

When the g.next() method is called a second time, the generator function continues from the statement where it last paused. There is one caveat: when the generator resumes, the previous run had stopped on the right-hand side of const a = yield 1, so nothing had yet been assigned to a.

So what value gets assigned to a? Careful readers may have spotted it: the param-a argument we passed in g.next('param-a') is used as the return value of the previous yield statement when the generator function resumes.

In simple terms, calling g.next('param-a') makes const a = yield 1; in the generator function resume as if it were const a = 'param-a';.

The second call, g.next('param-a'), therefore prints param-a this is a.

Similarly, when we call g.next('param-b') the third time, the argument is used as the result of yield 2 and assigned to b, so param-b this is b is printed.

In the same way, g.next('param-c') prints param-c this is c.

In a nutshell, when we pass a value to next, that value is treated as the return value of the yield expression at which the generator function last paused.

Naturally, passing an argument to the first g.next() call is meaningless: before that first call the generator function has not executed at all, so there is no pending yield expression to receive it.

Now let’s look at the return value of the generator function itself:

function* gen() {
  const a = yield 1;
  console.log(a, 'this is a');
  const b = yield 2;
  console.log(b, 'this is b');
  const c = yield 3;
  console.log(c, 'this is c');
  return 'resultValue'
}

let g = gen();

g.next(); // { value: 1, done: false }
g.next('param-a'); // { value: 2, done: false }
g.next('param-b') // { value: 3, done: false }
g.next() // { value: 'resultValue', done: true }
g.next() // { value: undefined, done: true }

This time the generator function has a return value. When we call g.next() the fourth time to resume execution, the generator function continues and runs to completion.

At this point done naturally becomes true, indicating that the generator function has finished, and value becomes 'resultValue' because the function returns that value.

In other words, when the generator function completes, the next() call that finishes it returns { done: true, value: 'resultValue' } instead of { done: true, value: undefined }.

That concludes the basic usage of Generator functions; now let’s see how they are implemented in JavaScript.

Implementation of Generator principle

Strictly speaking, how Generator functions are implemented is not closely related to the asynchronous topics that follow, but in the spirit of “knowing what it is and why it is so”, it is worth covering.

I hope you’ll be patient with this section; its internal mechanism is really not that complex. I myself was once asked how to implement a Generator polyfill in an interview with a large e-commerce company.

First, you can open the link to see the ➡️ Babel Generator Demo that I have edited.

At first glance, many of you may be a little confused, but that’s okay. This code isn’t hard; it’s just fear of the unknown.

This is the polyfill implementation of Generator functions that Babel produces for older browsers.

On the left is the ES6 generator syntax, and on the right is the compiled implementation that is compatible with older browsers.

First, the gen generator function on the left is converted to a simple normal function on the right, which we will ignore for the moment.

In the code on the right, the plain gen function is wrapped in a layer of regeneratorRuntime.mark(gen). In the real source, this step exists to inherit from GeneratorFunctionPrototype, so that the object returned by calling the plain gen() becomes a Generator instance object.

This step is not that important for understanding the Generator, so we can simply rewrite regeneratorRuntime.mark to look like this:

// Define your own regeneratorRuntime object
const regeneratorRuntime = {
    // There is a mark method that takes the fn passed in and returns it unchanged
    mark(fn) {
        return fn
    }
}

We define the regeneratorRuntime object ourselves and give it a mark method. It takes a function as input and simply returns it, and that’s it.

Next, look at the gen function. In the source code on the left, calling gen() returns an iterator (it has a next method, and every call to next returns { value: VALUE, done: Boolean }).

So the compiled plain gen() function on the right should return the same kind of thing, which means the so-called regeneratorRuntime.wrap() method must also return an iterator object with a next method.

The regeneratorRuntime.wrap() method takes two arguments. The first argument is a function that accepts the context passed in, and the second argument is the _marked object we handled earlier.

The second argument to the wrap() method is again only used for the inheritance setup of compiled generator functions and does not affect the core logic, so let’s ignore it for now.

At this point, we know that the regeneratorRuntime object should have a wrap method, and that wrap should return an object satisfying the iterator protocol.

// Define our own regeneratorRuntime object
const regeneratorRuntime = {
  // There is a mark method that takes the fn passed in and returns it unchanged
  mark(fn) {
    return fn;
  },
  wrap(fn) {
    // ...
    return {
      next() {
        return {
          done: undefined,  // to be filled in: whether the generator has finished
          value: undefined, // to be filled in: the value yielded this time
        };
      },
    };
  },
};

Now let’s look at the concrete function passed into regeneratorRuntime.wrap:

function gen() {
  var a, b, c;
  return regeneratorRuntime.wrap(function gen$(_context) {
    // Combined with the return statements below, while (1) has no real looping effect here
    // while (1) is usually used to indicate that the loop body may be executed many times
    while (1) {
      switch ((_context.prev = _context.next)) {
        case 0:
          _context.next = 2;
          return 1;

        case 2:
          a = _context.sent;
          console.log(a, 'this is a');
          _context.next = 6;
          return 2;

        case 6:
          b = _context.sent;
          console.log(b, 'this is b');
          _context.next = 10;
          return 3;

        case 10:
          c = _context.sent;
          console.log(c, 'this is c');

        case 12:
        case 'end':
          return _context.stop();
      }
    }
  }, _marked);
}

The function passed to regeneratorRuntime.wrap uses a while (1) loop, which has no real looping effect here because every branch inside the loop returns.

Normally, while (1) signals that the body may be executed many times, and indeed this body is executed once for every call to the next method.

First let’s see what attributes exist for the _context argument passed in:

  • _context.prev represents the position of the pointer when the generator function is executed.

  • _context.next represents the pointer position for the next generator function call.

  • _context.sent represents the param argument passed when g.next(param) is called.

  • _context.stop is the method called when the generator function finishes executing.

Even after listing what these attributes on the _context object mean, they may still feel abstract. Let’s look at a simplified implementation first:

const regeneratorRuntime = {
  // There is a mark method that takes the fn passed in and returns it unchanged
  mark(fn) {
    return fn;
  },
  wrap(fn) {
    const _context = {
      next: 0,     // the case label in the state machine switch to run on the next call
      sent: '',    // the value passed in on the next call, used as the previous yield's return value
      done: false, // whether the generator function has finished
      // Called when the generator function finishes
      stop() {
        this.done = true;
      },
    };
    return {
      next(param) {
        // 1. Record the param as the return value of the previous yield
        _context.sent = param;
        // 2. Run the state machine function to obtain the yielded value
        const value = fn(_context);
        // 3. Return the iterator result
        return {
          done: _context.done,
          value,
        };
      },
    };
  },
};

The complete regeneratorRuntime object as implemented above looks pretty simple, right?

In the wrap function, we accept a state machine function. Each time the next(param) method returned by wrap() is called, the argument passed to next(param) is stored in the _context.sent maintained inside wrap, simulating the return value of the previous yield.

You can think of the fn passed to wrap(fn) as a little state machine: every call to the next() method runs the state machine function fn once.

However, because the value of _context.next (and therefore _context.prev) is different each time, each call to the next function executes a different branch of the state machine.

This continues until the switch statement in the state machine fn reaches its final case and returns _context.stop(), at which point the _context.done property is set to true and the finished result object is returned.

The core of the compiled Generator is essentially a state machine function fn wrapped by the regeneratorRuntime.wrap function.

The wrap function maintains a _context object internally, so that each time the next method of the returned generator object is called, the wrapped state machine function matches the corresponding state from the properties of _context and executes the corresponding logic.

This is the core principle of Generator. Those who are interested in seeing the full code should check out Facebook/Regenerator.
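To see all the pieces working together, here is a quick sanity check. It assumes the simplified regeneratorRuntime and the compiled gen function above are both in scope (a rough sketch, not Babel’s actual runtime):

// _marked must exist because the compiled gen references it; our mark() just returns gen
const _marked = regeneratorRuntime.mark(gen);

const g = gen();

g.next();          // { value: 1, done: false }
g.next('param-a'); // { value: 2, done: false }   logs: param-a this is a
g.next('param-b'); // { value: 3, done: false }   logs: param-b this is b
g.next('param-c'); // { value: undefined, done: true }   logs: param-c this is c

The output matches the native generator from the earlier section, which is exactly what the polyfill is after.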

Generator Asynchronous solution

After covering the basic concepts of Generator and the polyfill principle, let’s step into asynchrony and see how it can be applied to asynchronous programming.

In most cases, we’ll use Promises directly to handle asynchronous problems. Promise saved us from the dreaded “callback hell”.

However, Promises still require chain after chain of .then() calls, and when there is a lot of asynchronous nesting, those chained then calls are not very intuitive.

Wherever there is a problem, a solution follows: with this shortcoming left by Promises, the Generator soon appeared on the horizon as a solution.

function promise1() {
  return new Promise((resolve) => {
    setTimeout(() => {
      resolve('1');
    }, 1000);
  });
}

function promise2(value) {
  return new Promise((resolve) => {
    setTimeout(() => {
      resolve('value:' + value);
    }, 1000);
  });
}

function* readFile() {
  const value = yield promise1();
  const result = yield promise2(value);
  return result;
}

Take a look at the code above: doesn’t the readFile function already look a little bit async?

What if I want the readFile() method to behave like an async function, returning a Promise as its result, while keeping the code written as above?

Here you can think a little bit about how to use the features of Generator functions to do this.


As mentioned above, generator functions can be paused and return a generator object when called. Each time the next method of a generator object is called, the generator function will continue to execute until the next yield statement is reached, and each time the next(param) method of a generator object is called, we can pass in an argument as the return value of the previous yield statement.

Using the above features, we can write the following code:

function promise1() {
  return new Promise((resolve) => {
    setTimeout(() => {
      resolve('1');
    }, 1000);
  });
}

function promise2(value) {
  return new Promise((resolve) => {
    setTimeout(() => {
      resolve('value:' + value);
    }, 1000);
  });
}

function* readFile() {
  const value = yield promise1();
  const result = yield promise2(value);
  return result;
}

function asyncGen(fn) {
  const g = fn(); // Call the passed generator function to return the generator object
  // Expect to return a Promise
  return new Promise((resolve) => {
    // The first call to the g.next() method executes the generator function until the first yield keyword is reached
    // i.e. yield promise1(), and returns the iterator result object
    const { value, done } = g.next();
    // Since value is a Promise, we can wait for it to settle and then call g.next(res) to resume the generator function
    value.then((res) => {
      // The second call to g.next() happens inside the .then of the Promise returned last time,
      // so we can read the previous Promise's value, which was '1'
      // Passing it as g.next('1') makes it the result of the previous yield, equivalent to const value = '1'
      const { value, done } = g.next(res);
      // Continue with the same process
      value.then(resolve);
    });
  });
}

asyncGen(readFile).then((res) => console.log(res)); // value:1

We wrap the readFile generator function with an asyncGen function, taking advantage of the fact that generator functions can be paused at the yield keyword. Combined with the Promise.prototype.then method, this gives us an asynchronous style of writing very similar to an async function.

It looks a lot like Async, right? But the current code has a fatal problem:

asyncGen is not general-purpose. In the example above, the readFile function contains exactly two yielded Promises, and we hard-code exactly two g.next() calls inside the asyncGen function.

If a generator yields three Promises, I would have to rewrite the asyncGen logic. And if readFile contains a statement such as yield '1', where the yielded value is not a Promise, treating it as a Promise will throw an error, as shown below.
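For example, this hypothetical variant (readFileBroken is made up purely for illustration) causes the Promise returned by asyncGen to reject, because asyncGen calls value.then() unconditionally:

function* readFileBroken() {
  const value = yield '1';              // a plain string, not a Promise
  const result = yield promise2(value);
  return result;
}

// Rejects with a TypeError along the lines of "value.then is not a function"
asyncGen(readFileBroken).catch((err) => console.log(err.message));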

This approach doesn’t have any generality, so no one would organize asynchronous code this way in a real project. But with this example I believe you can get a glimpse of how Generator functions can be combined with Promise as an asynchronous solution.

tj/co

Above, we used Generator as an asynchronous solution by writing a wrapper function called asyncGen, but it lacks any generality.

Let’s think about how to make this method more general to better solve asynchronous problems in a variety of scenarios:

Also, I want my readFile method to be written as intuitively as before:

function promise1() {
  return new Promise((resolve) => {
    setTimeout(() => {
      resolve('1');
    }, 1000);
  });
}

function promise2(value) {
  return new Promise((resolve) => {
    setTimeout(() => {
      resolve('value:' + value);
    }, 1000);
  });
}

function* readFile() {
  const value = yield promise1();
  const result = yield promise2(value);
  return result;
}

Previously, we used Generator features to handle Promise-based asynchrony by clumsily nesting one layer of logic for each yield keyword.

As some of you may have realized, this seemingly endless nested call logic has a clear termination condition, so when we want to encapsulate a general-purpose function, wouldn’t recursion be a better fit?

Maybe with that in mind, you can try to encapsulate it yourself.


Without further suspense, let’s look at a general-purpose and more elegant Generator asynchronous solution:

function co(gen) {
  return new Promise((resolve, reject) => {
    const g = gen();
    function next(param) {
      const { done, value } = g.next(param);
      if (!done) {
        // Not finished yet: keep recursing
        Promise.resolve(value).then((res) => next(res));
      } else {
        // Finished: resolve the outer Promise with the return value
        resolve(value);
      }
    }
    next();
  });
}

co(readFile).then((res) => console.log(res));

We define a co function to wrap the passed generator function.

Inside the co function, we return a Promise as the wrapped function’s return value, and we call gen() once at the start to get the corresponding generator object.

Then we define the next function; inside it, as long as the iterator is not done, we recursively call next again in the then callback of value.

In most cases, asynchronous iteration can be handled in a recursive manner similar to the one in this function.

There are three things to note about this function:

  • First we can see that the next function takes a param argument passed in.

This is because, when we use Generator to handle asynchrony, we want the Promise’s resolved value to be assigned via const a = yield somePromise, so we need to pass the res received in each then callback to next(res) as the return value of the previous yield.

  • Second, careful readers will notice this line of code: Promise.resolve(value).then((res) => next(res));.

We wrap value in Promise.resolve because, when a yield in the generator function is not followed by a Promise, we still need to treat the yielded value as a Promise so that we can call .then on it uniformly.

  • Finally, the first time we call the next() method, we pass no param argument.

This isn’t hard to understand: when no param is passed, we simply call g.next() with no argument. As mentioned above, a parameter passed to the generator object’s next method is treated as the return value of the previous yield statement.

Since there is no previous yield inside the generator function when g.next() is called for the first time, passing an argument there would make no sense.

It doesn’t look difficult, but this little piece of code lets our Generator-based asynchronous code be written and called in a synchronous style.

This little piece of code is also the core principle of the co library. Of course, there is much more to co, but it is enough to show how Generator served as the go-to asynchronous solution before Async/Await arrived.
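As a taste of what the real co library adds beyond this, here is a sketch of the same function with basic error propagation (my own illustration, not the library’s actual code); it foreshadows the _throw branch we will meet in Babel’s output below:

function coWithErrors(gen) {
  return new Promise((resolve, reject) => {
    const g = gen();
    function step(advance) {
      let result;
      try {
        result = advance(); // either g.next(value) or g.throw(error)
      } catch (err) {
        return reject(err); // an uncaught throw inside the generator rejects the outer Promise
      }
      const { done, value } = result;
      if (done) return resolve(value);
      Promise.resolve(value).then(
        (res) => step(() => g.next(res)),  // resolved: feed the value back in
        (err) => step(() => g.throw(err))  // rejected: throw the error back into the generator
      );
    }
    step(() => g.next());
  });
}

coWithErrors(readFile).then((res) => console.log(res)); // value:1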

Async/Await

At long last, we arrive at the ultimate solution to asynchrony in JavaScript.

Earlier we covered the basic usage of Generator and how Babel compiles Generator down to ECMAScript 5.

We also talked in passing about how Generator was combined with Promise to handle asynchrony before Async existed.

Although the previous asynchronous invocation looks very similar to the async syntax:

function* readFile() {
  const value = yield promise1();
  const result = yield promise2(value);
  return result;
}

However, when using readFile, we still need an extra layer, the co function, to drive the Generator:

co(readFile).then((res) => console.log(res));

This is exactly the problem async was introduced to solve. As usual, let’s take a look at how Babel handles it in older browsers that do not support async syntax:
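For reference, the async source being compiled presumably looks something like the sketch below; hello matches the name referenced in the compiled output later, but the demo’s exact body may differ:

// A hypothetical async source, assumed purely for illustration
async function hello() {
  const value = await promise1();
  const result = await promise2(value);
  return result;
}

hello().then((res) => console.log(res)); // value:1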

It may seem like a lot of information at first, but don’t worry. This is all code that we’ve analyzed and even implemented before.

Before I dive into this code, I’ll tell you how the Async syntax is implemented:

Previously, when we addressed asynchrony with Generator and Promise, we needed to wrap the Generator function in an extra layer of co to get synchronous-looking asynchronous calls.

So if we want to implement Async syntax in older browsers, shouldn’t we just combine the Generator polyfill with a co-style function?

Async is essentially a co-like function wrapped around a Generator, which is why it is called the syntactic sugar of Generator and Promise.
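To make that concrete, here is a rough sketch of what an async function effectively desugars to, reusing the co function from the previous section (not Babel’s actual output):

// async function hello() { ... } roughly behaves like:
function helloDesugared() {
  return co(function* () {
    const value = yield promise1();
    const result = yield promise2(value);
    return result;
  });
}

helloDesugared().then((res) => console.log(res)); // value:1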

Let’s take a look at the polyfill implementation on the right.

It should look familiar; we already walked through how most of it works.

The only difference is that the Generator implementation is wrapped in one additional layer: the _asyncToGenerator function.

function _asyncToGenerator(fn) {
  return function () {
    var self = this,
      args = arguments;
    return new Promise(function (resolve, reject) {
      var gen = fn.apply(self, args);
      function _next(value) {
        asyncGeneratorStep(gen, resolve, reject, _next, _throw, 'next', value);
      }
      function _throw(err) {
        asyncGeneratorStep(gen, resolve, reject, _next, _throw, 'throw', err);
      }
      _next(undefined);
    });
  };
}

Take a look at the so-called _asyncToGenerator function. It accepts an fn argument, and this fn is the compiled Generator function.

When hello() is called, it essentially ends up invoking the function returned by _asyncToGenerator, which returns a Promise.

return new Promise(function (resolve, reject) {
      var gen = fn.apply(self, args);
      function _next(value) {
        asyncGeneratorStep(gen, resolve, reject, _next, _throw, 'next', value);
      }
      function _throw(err) {
        asyncGeneratorStep(gen, resolve, reject, _next, _throw, 'throw', err);
      }
      _next(undefined);
    });

Inside the Promise, the first thing that happens is this:

var gen = fn.apply(self, args);

It calls fn (the generator function) we passed in to return the generator object, after which it defines two methods:

function _next(value) {
  asyncGeneratorStep(gen, resolve, reject, _next, _throw, 'next', value);
}
function _throw(err) {
  asyncGeneratorStep(gen, resolve, reject, _next, _throw, 'throw', err);
}

Both methods internally make function calls based on asyncGeneratorStep:

function asyncGeneratorStep(gen, resolve, reject, _next, _throw, key, arg) {
  try {
    var info = gen[key](arg);
    var value = info.value;
  } catch (error) {
    reject(error);
    return;
  }
  if (info.done) {
    resolve(value);
  } else {
    Promise.resolve(value).then(_next, _throw);
  }
}

In fact, asyncGeneratorStep can be regarded as the same principle as the co-style asynchronous solution we implemented earlier; the underlying idea and implementation are the same. The only difference is that Babel’s compiled implementation handles the error case, whereas our simple version did not.

Async is essentially syntactic sugar that takes advantage of the pausing feature inside Generator functions and drives them with recursive calls inside Promise.prototype.then.
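To tie it together, here is a minimal usage sketch. It assumes promise1 and promise2 from earlier; in real Babel output the inner function would itself be the compiled regeneratorRuntime state machine rather than a native generator:

// Drive _asyncToGenerator with a native generator standing in for the compiled one
var helloCompiled = _asyncToGenerator(function* () {
  const value = yield promise1();
  const result = yield promise2(value);
  return result;
});

helloCompiled().then((res) => console.log(res)); // value:1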

By now, I believe the implementation principle of Async/Await feels familiar, given the knowledge points introduced earlier.

It is called syntactic sugar because it adds nothing fundamentally new beyond Generator functions and Promises themselves; it simply combines these two syntactic features into a more elegant and concise asynchronous solution.

Wrapping up

To wrap up, first of all, thank you to every reader who has made it this far.

We walked through how asynchronous solutions evolved from Generator functions to Async/Await, and how they are polyfilled in older browsers.

Much of the code in this article is a simplified version of the real implementation. If any step still confuses you, feel free to leave a comment for discussion, or check the corresponding source links at the end of each chapter.