The previous article introduced the common UI thread architecture: each UI thread has a message queue, all events waiting to be executed are added to the message queue, and the UI thread loops over the events in the message queue and executes them according to certain rules. JavaScript originally ran on the UI thread; in other words, the JavaScript language was designed with this common UI thread architecture in mind.

On top of this basic UI framework, JavaScript has built many new technologies, among which the most widely used are macro tasks and microtasks.

A macro task is simply an event in a message queue waiting to be executed by the main thread. V8 recreates the stack as each macro task is executed, changes the stack as function calls are made in the macro task, and finally, when the macro task is finished, the stack is emptied again and the main thread continues to execute the next macro task.

A microtask is a little more complicated; you can actually think of a microtask as a function that needs to be executed asynchronously, after the main function has finished but before the current macro task has finished.

The main reason microtasks were introduced in JavaScript is that the time granularity of macro tasks in the message queue is too coarse for the main thread, which cannot handle scenarios requiring high precision and real-time performance. Microtasks therefore make an effective tradeoff between real-time performance and efficiency. In addition, microtasks change the asynchronous programming model so that we can write asynchronous calls in a synchronous style.
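For example (a minimal sketch, not from the original article; `fetchNumber` is a hypothetical async source used only for illustration): promises and async/await, which are built on microtasks, let asynchronous calls read like synchronous code.

```javascript
// Hypothetical async source: resolving the promise schedules the
// awaiting code as a microtask.
function fetchNumber() {
  return Promise.resolve(42)
}

async function main() {
  // Reads like synchronous code, but the continuation after `await`
  // actually runs later, as a microtask.
  const n = await fetchNumber()
  return n + 1
}

main().then((result) => console.log(result)) // prints 43
```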

As important as microtasks are, they are not always easy to understand. Let’s first take a look at the knowledge stack related to microtasks, as shown in the figure below:

As you can see, microtasks are based on message queues, event loops, the UI main thread, and the call stack, which in turn extend to coroutines, Promises, generators, async/await, and other technologies often used in modern front ends. That is, if you don't have a deep understanding of message queues, the main thread, and the call stack, you may get confused when working with microtasks.

Today, we will first work through the underlying technologies behind microtasks, understand how the message queue, the main thread, and the call stack relate to one another, and then analyze the implementation mechanism of microtasks.

Main thread, call stack, message queue

Let’s start with the main thread and call stack. As we know, the call stack is a data structure that manages the call relationships of functions executed on the main thread. Next, we examine how the call stack manages function calls on the main thread by executing the following code.

```javascript
function bar() {
}

function foo(fun) {
  fun()
}

foo(bar)
```

When V8 is ready to execute this code, it pushes the global execution context onto the call stack, as shown below:

V8 then starts executing foo on the main thread. First it creates an execution context for foo and pushes it onto the stack, so the relationship between the stack and the main thread looks like this:

Ready to execute

Foo then calls bar, so when V8 executes bar, it also creates an execution context for bar and pushes it onto the stack, resulting in something like this:

Call bar function

When the bar function is executed, V8 will pop the execution context of the bar function from the stack, as shown below:

The bar function is complete

Finally, foo completes execution, and V8 pops the execution context of foo from the stack, as shown below:

The foo function is finished

This is how the call stack manages function calls on the main thread, but there is a problem with this approach: stack overflow. Take this code for example:

```javascript
function foo() {
  foo()
}

foo()
```

Because the foo function calls itself recursively, each call pushes a new execution context and the stack keeps growing upward. Since stack space in memory is contiguous, the size of the call stack is usually limited; when function nesting is too deep, the accumulation of too many execution contexts leads to a stack overflow. The final picture is as follows:

Stack overflow

The stack overflow problem can be solved with setTimeout. The essence of setTimeout is to change a synchronous function call into an asynchronous one: foo is encapsulated as an event and added to the message queue, and the main thread reads the next task from the message queue in its regular loop. After using setTimeout, the code is as follows:

```javascript
function foo() {
  setTimeout(foo, 0)
}

foo()
```

Now we can analyze the execution flow of this code in terms of the call stack, main thread, and message queue.

First, the main thread fetches the macro task that needs to be executed from the message queue, assuming that the currently fetched task is the code that needs to be executed, then the main thread enters the code execution state. The relationship between the main thread, message queue and call stack is as follows:

V8 then executes function foo, and when it executes function foo, it creates the execution context for function foo and pushes it onto the stack, resulting in something like this:

When V8 executes setTimeout in foo, setTimeout encapsulates foo as a new macro task and adds it to the message queue, as shown in the following state diagram when V8 executes setTimeout:

When foo completes, V8 will end the current macro task and the call stack will be cleared, as shown below:

When a macro task is finished, the main thread does not go idle; it repeats the process of fetching and executing macro tasks. The callback macro task that was encapsulated by setTimeout will also be picked up and executed by the main thread at some point, and that is when foo is called. The specific schematic diagram is as follows:

Because foo is no longer called inside its parent function, but is instead encapsulated as a macro task and thrown into the message queue, the main thread must fetch the task from the message queue before executing the callback function foo, and this solves the stack overflow problem.
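To see this in practice, here is a bounded variant of the recursion (the counter and the limit of 5 are additions for illustration, not part of the original example): each call runs as its own macro task on a freshly emptied stack, so the stack never grows.

```javascript
let count = 0

function foo() {
  count += 1
  if (count < 5) {
    // Re-queue foo as a new macro task; the current stack unwinds
    // completely before the next call runs.
    setTimeout(foo, 0)
  } else {
    console.log('done after', count, 'calls') // prints: done after 5 calls
  }
}

foo()
```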

Microtasks solve the uncontrollable execution timing of macro tasks

However, although we can solve the stack overflow problem by encapsulating a function as a macro task, every macro task has to be put into the message queue. If some macro task ahead of it takes too long to execute, it delays the macro tasks behind it in the message queue, and the effect is not controllable: you have no way of knowing how long the preceding macro tasks will take to finish.

JavaScript introduced microtasks, which are executed near the end of the current task. With microtasks, you can control the timing of your callback function with some precision.

In layman’s terms, V8 maintains a microtask queue for each macro task. When V8 executes a piece of JavaScript, it creates an environment object for that code, and the microtask queue is stored in that environment object. When you generate a microtask with Promise.resolve, V8 automatically adds it to the microtask queue. The environment object is destroyed at the end of the code’s execution, but before it is destroyed, V8 processes the microtasks in the microtask queue.

There are only two things you need to keep in mind to understand the timing of microtasks:

First, if a microtask is created in the current task (both Promise.resolve() and Promise.reject() will trigger microtasks), the triggered microtask will not be executed in the current function, so executing microtasks does not lead to unbounded stack growth.

Second, unlike asynchronous calls, microtasks are still executed prior to the end of the current task, meaning that no other task in the message queue can be executed until the end of the current microtask.

So a microtask fired inside a function must take precedence over a macro task fired inside a function. To test this idea, let’s examine a piece of code:

```javascript
function bar() {
  console.log('bar')
  Promise.resolve().then(
    (str) => console.log('micro-bar')
  )
  setTimeout((str) => console.log('macro-bar'), 0)
}

function foo() {
  console.log('foo')
  Promise.resolve().then(
    (str) => console.log('micro-foo')
  )
  setTimeout((str) => console.log('macro-foo'), 0)
  bar()
}

foo()
console.log('global')
Promise.resolve().then(
  (str) => console.log('micro-global')
)
setTimeout((str) => console.log('macro-global'), 0)
```

In this code, which contains macro tasks set with setTimeout and microtasks created through Promise.resolve, what order do you think the output will eventually be printed in?

After executing this code, we find that the final printed order is:

```
foo
bar
global
micro-foo
micro-bar
micro-global
macro-foo
macro-bar
macro-global
```

We can clearly see that microtasks are performed before macro tasks. Let’s take a closer look at how V8 executes this JavaScript code.

First, when V8 executes this code, it pushes the global execution context onto the call stack and creates an empty queue of microtasks in the execution context. Then at this point:

The call stack contains the global execution context;

The microtask queue is empty.

The state diagram of message queue, main thread and call stack is as follows:

A call to function foo is then made, and V8 creates the execution context for function foo and pushes it onto the stack. Promise.resolve is then executed, which triggers a micro-foo microtask that V8 adds to the microtask queue. The setTimeout method is then executed; it triggers a macro-foo macro task, which V8 adds to the message queue. Then at this point:

The call stack contains the global execution context, the execution context of the foo function;

The microtask queue has a microtask, micro-foo;

The message queue holds a macro task, macro-foo, set with setTimeout.

The state diagram of the message queue, main thread and call stack is as follows:

Next, foo calls bar, and V8 needs to create an execution context for bar and push it onto the stack. Promise.resolve is then executed, which triggers a micro-bar microtask that is added to the microtask queue. The setTimeout method is then executed, which triggers a macro-bar macro task that is also added to the message queue. Then at this point:

The call stack contains the global execution context, the execution context of foo, and the execution context of bar.

The microtask queue contains micro-foo and micro-bar;

The message queue contains the macro tasks macro-foo and macro-bar.

The state diagram of the message queue, main thread and call stack is as follows:

Next, the bar function completes and exits, and the execution context of the bar function is popped from the stack, followed by the completion and exit of the foo function, and the execution context of the foo function is popped from the stack. Then at this point:

The call stack contains the global execution context, because the bar and foo functions are finished, so their execution context is removed from the call stack;

The microtask queue still contains micro-foo and micro-bar;

The message queue still contains macro-foo and macro-bar.

The state diagram of the message queue, main thread and call stack is as follows:

The main thread completes foo and then executes the Promise.resolve code in the global environment, which triggers a micro-global microtask that V8 adds to the microtask queue. It then executes the setTimeout method, which triggers a macro-global macro task that V8 adds to the message queue. Then at this point:

The call stack contains the global execution context;

The microtask queue now contains micro-foo, micro-bar, and micro-global;

The message queue now contains macro-foo, macro-bar, and macro-global.

The state diagram of the message queue, main thread and call stack is as follows:

When this code finishes executing, V8 destroys the code’s environment object, and the environment object’s destructor is called (note that the destructor is a C++ concept). This is where V8 performs a microtask checkpoint: V8 checks the microtask queue, and if there are microtasks in it, V8 pulls them out one by one and executes them in order. Since the tasks in the microtask queue are micro-foo, micro-bar, and micro-global, that is the order in which they execute.

The state diagram of the message queue, main thread and call stack is as follows:

After all the microtasks in the microtask queue are executed, the current macro task is finished, and the main thread continues to repeat the process of fetching and executing tasks. macro-foo, macro-bar, and macro-global are all printed out in first-in, first-out order.

After all tasks are completed, the state diagram of message queue, main thread and call stack is as follows:

That is the full analysis of the execution process. By now you should see that microtasks and macro tasks have different execution timing: a microtask is executed before the end of the current macro task, while a macro task is a task in the message queue; after the main thread finishes one macro task, it takes the next macro task out of the message queue and executes it.
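The process just described can be modeled with a toy event loop (a simplified sketch for illustration only; the queues and functions below are hypothetical and not V8's actual implementation): run one macro task to completion, then hit a microtask checkpoint that drains the whole microtask queue before moving to the next macro task.

```javascript
// Hypothetical model of the two queues described above.
const macroTasks = []
const microTasks = []

function queueMacro(fn) { macroTasks.push(fn) }
function queueMicro(fn) { microTasks.push(fn) }

const log = []
queueMacro(() => {
  log.push('macro-1')
  queueMicro(() => log.push('micro-1')) // queued during macro-1
})
queueMacro(() => log.push('macro-2'))

// The loop: one macro task, then a microtask checkpoint that drains
// the microtask queue (including microtasks queued by microtasks).
while (macroTasks.length > 0) {
  const task = macroTasks.shift()
  task()
  while (microTasks.length > 0) {
    microTasks.shift()()
  }
}

console.log(log.join(' ')) // prints: macro-1 micro-1 macro-2
```

Note how micro-1 runs before macro-2 even though macro-2 was queued first: the checkpoint sits inside the macro task loop.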

Can you trigger new microtasks in a loop within a microtask?

Since macro tasks and microtasks are both called asynchronously but have different execution timing, can we adapt the setTimeout solution to the stack overflow problem by triggering a microtask instead of a macro task?

For example, we changed the code to:

```javascript
function foo() {
  return Promise.resolve().then(foo)
}

foo()
```

When foo is executed, a microtask is triggered by the call to Promise.resolve(); V8 adds the microtask to the microtask queue and exits the current execution of foo.

V8 then checks the microtask queue before exiting the current macro task, finds a microtask in the microtask queue, and executes the microtask first. Since the microtask is calling function foo itself, function foo needs to be called during the microtask, which triggers the same microtask during the execution of function foo.

The loop continues forever and the current macro task cannot exit, which means other macro tasks in the message queue, such as mouse and keyboard events, cannot be executed. These events remain in the message queue, the page cannot respond to them, and this manifests as the page freezing.

However, since V8 exits the stack of the current foo function each time it performs a microtask, this code does not cause stack overflow.
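A capped version of the loop (the iteration limit is an addition for illustration so the chain actually terminates) shows both effects: the microtask chain never deepens the call stack, but nothing else in the message queue runs until the chain ends.

```javascript
let iterations = 0

function foo() {
  if (iterations >= 10000) {
    return Promise.resolve() // stop the chain at the cap
  }
  iterations += 1
  // Each .then callback starts on a fresh, empty stack, so 10,000
  // "recursive" steps never overflow the call stack.
  return Promise.resolve().then(foo)
}

foo().then(() => console.log('finished after', iterations, 'microtasks'))
```

All 10,000 microtasks drain before any setTimeout callback in the message queue gets a chance to run, which is exactly why the uncapped version freezes the page.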

Conclusion

In this article, we mainly analyze microtasks from the perspective of call stack, main thread and message queue.

A call stack is a data structure that manages the call relationships of functions executed on the main thread. During task execution, if function calls are nested too deeply, a stack overflow error may occur. We can use setTimeout to solve the stack overflow problem.

The essence of setTimeout is to change a synchronous function call to an asynchronous function call, where the callback function is wrapped as a macro task and added to the message queue, and the main thread reads the next macro task from the message queue in a regular loop.

Events in message queues are also known as macro tasks. However, the granularity of macro tasks is too coarse to be suitable for some scenarios requiring high precision and real-time performance, while micro tasks can make effective trade-offs between real-time performance and efficiency.

This effect depends on the timing of the execution of the microtask. The microtask is a function that needs to be executed asynchronously, after the execution of the main function and before the completion of the current macro task.

Because the microtask is still executed within the current task, if a new microtask is triggered in a loop within the microtask, no other tasks in the message queue will have a chance to be executed.