If you take too much, you lose everything. — Biography of Napoleon

Preface

In an earlier article, JS (Event Loop) and Call Stack, we briefly described the event loop and the call stack and analyzed, from a macroscopic point of view, how the event loop works in the browser.

Theory is theory, after all; knowing the principles matters only if it helps us solve real problems. As the old line goes, what we learn on paper always feels shallow; to truly understand something, we must practice it. So today, with the help of a visualization tool, we will walk through the running details of the JS event loop.

A picture in place of a foreword

In a nutshell

  1. The event loop is responsible for processing macro tasks and microtasks. It pushes them onto the call stack, ensuring that only one task runs at a time (the single-threaded nature of JS). It also controls when the page is rendered.
  2. The call stack (Call Stack) is used for tracking function calls. It is a LIFO (last in, first out) stack structure, and each stack frame represents one function call.
  3. The macro task queue is a FIFO (first in, first out) queue structure. The macro tasks stored in it are polled by the event loop, and each task is synchronous and blocking: while one task runs, the others wait in line.
  4. The microtask queue was added in ES6 specifically for handling Promise callbacks. It is similar to the macro task queue; the main difference is that it is dedicated to microtask-related logic.

Outline

  1. The Four King Kong
  2. Code analysis

1. The Four King Kong

1. Event Loop

An event loop is a loop that continuously fetches tasks from the macro/micro task queues. You can think of it as a perpetual motion machine that never stops: it pulls tasks from the macro/micro task queues and pushes them onto the call stack to be executed.

The event loop consists of four important steps:

  1. Execute the script: execute the code in the script synchronously until the call stack is empty.

    In fact, at this stage the JS engine also performs some pre-compilation work (e.g., variable hoisting). That falls under V8 internals and is beyond the scope of this article; if you are interested, see How does V8 handle JS.
  2. Execute a macro task: pick the oldest task from the macro task queue and push it onto the call stack, running it until the call stack is empty.
  3. Execute all microtasks: pick the oldest task from the microtask queue and push it onto the call stack, running it until the call stack is empty. But, but, but (here comes the twist): keep picking the oldest task from the microtask queue and executing it until the microtask queue is empty.
  4. UI rendering: render the UI, then jump back to step 2 and continue selecting tasks from the macro task queue. (This step applies only to the browser environment, not to Node.)

The following pseudocode describes how the event loop operates:

// Step 1 (script execution) has finished; now the perpetual motion machine starts running.
while (EventLoop.waitForTask()) {
  // Step 2: Select the oldest task from the macro task queue
  const taskQueue = EventLoop.selectTaskQueue();
  if (taskQueue.hasNextTask()) {
    taskQueue.processNextTask();
  }
  // Step 3: drain the microtask queue, oldest task first, until it is empty
  const microtaskQueue = EventLoop.microTaskQueue;
  while (microtaskQueue.hasNextMicrotask()) {
    microtaskQueue.processNextMicrotask();
  }
  // Step 4: UI render
  rerender();
}

The event loop handles macro tasks and microtasks, pushing them onto the call stack so that only one task runs at a time (the single-threaded nature of JS). It also controls when the page is rendered.

2. Call Stack

The Call Stack is a fundamental part of JS. It is a record-keeping structure that lets us perform function calls. In the call stack, each function call is represented by a data structure called a stack frame. This structure helps the JS engine (V8) keep track of the order of, and relationships between, function calls: after a function finishes, execution can resume using the information stored in the caller's stack frame. In other words, it gives a JS program its memory of where it is.

When JS code first starts executing, the call stack is empty. Only when the first function is called is the frame corresponding to that function pushed onto the top of the stack. When the function finishes executing (reaches its return), the corresponding stack frame is popped off the call stack.
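A minimal sketch of this push/pop behavior (the function names here are made up for illustration): logging on entry and exit makes the LIFO order of the stack frames visible.

function inner() {
  console.log('inner frame pushed');
  console.log('inner frame about to pop'); // inner returns first (last in, first out)
}

function outer() {
  console.log('outer frame pushed');
  inner();
  console.log('outer frame about to pop'); // outer pops last
}

outer();
// Output: outer frame pushed, inner frame pushed, inner frame about to pop, outer frame about to pop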

The Call Stack is used to track function calls. It is a LIFO (last in, first out) stack structure, and each stack frame represents a function call.

3. Macro Task Queue

It can also be called a Callback queue.

The call stack is the mechanism for tracking functions that are being executed, and the macro task queue is the mechanism for tracking functions that are about to be executed.

A macro task queue is a FIFO (first in, first out) queue structure. The macro tasks stored in it are polled by the event loop, and each task is synchronous and blocking. You can think of each task as a function object.

The event loop runs tirelessly, pulling tasks from the macro task queue according to certain rules (described below). A single iteration of the event loop is called a tick.
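A minimal toy sketch of the idea (this is an illustrative model, not the engine's real implementation): the queue is just a first-in, first-out list, and one tick takes the oldest task and runs it to completion.

// Toy macro task queue: tasks are plain callback functions.
const macroTaskQueue = [];

macroTaskQueue.push(function task1() { console.log('task 1'); });
macroTaskQueue.push(function task2() { console.log('task 2'); });

// One tick per task: shift() removes the oldest task (FIFO), and it runs to completion.
while (macroTaskQueue.length > 0) {
  const oldestTask = macroTaskQueue.shift();
  oldestTask();
}
// Output: task 1, task 2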

Does Vue.nextTick(callback) come to mind when you see the word tick? It defers a callback until after the next DOM update cycle ends. Use it immediately after modifying data to get the updated DOM.

Vue.nextTick(callback): Vue performs DOM updates asynchronously (the update callback is pushed into a task queue). As soon as a data change is observed, Vue opens a queue and pushes into it the watchers that observed data changes within the same event loop.

If a watcher is triggered multiple times, it is only pushed into the queue once. This buffering effectively eliminates unnecessary computation and DOM manipulation caused by duplicate data changes. On the next event loop tick, Vue flushes the queue and performs the necessary DOM updates.

When you set vm.someData = ‘new value’, the DOM is not updated immediately; the necessary DOM updates are performed when the asynchronous queue is flushed at the start of the next event loop tick. If you try to do something based on the updated DOM state before then, you will have a problem. To wait until Vue has finished updating the DOM after a data change, use Vue.nextTick(callback) immediately after the data changes; the callback is invoked once the DOM update is complete. The sketch below illustrates the idea.
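A minimal sketch adapted from the Vue 2 documentation's nextTick example (the #app element, the message field, and the updateMessage method are assumed names used only for illustration):

new Vue({
  el: '#app',                 // assumes <div id="app">{{ message }}</div> exists
  data: { message: '123' },
  methods: {
    updateMessage() {
      this.message = 'new message';        // data changes, but the DOM is not updated yet
      console.log(this.$el.textContent);   // still '123'
      this.$nextTick(() => {
        console.log(this.$el.textContent); // 'new message', after the DOM update
      });
    }
  }
});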

To process a macro task, the event loop calls the function corresponding to that macro task. While a macro task is executing, it occupies the entire call stack exclusively; in other words, the macro task (function) currently being executed has the highest priority. The event loop picks the oldest task from the macro task queue and runs it until the task is complete and the call stack is empty.

"Oldest task" has two meanings: 1. If macro tasks are scheduled to run at the same time (see the second argument of setTimeout), the oldest task is the one enqueued first; the earlier it is enqueued, the earlier it runs. 2. If the scheduled times differ, selection is based on the scheduled execution time: the shorter the delay, the earlier the task runs.

If a new macro task is triggered while a task (function) is executing, the new task is submitted to the macro task queue and placed according to the queue order. There are many ways to trigger a new macro task; the simplest is to call setTimeout(newTaskFn, 0) in your code, as shown in the sketch below. The functions that trigger macro tasks are the Web APIs described in JS (Event Loop) and Call Stack; here I will simply list them.
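A minimal sketch of a running macro task enqueueing another macro task (newTaskFn is just the hypothetical name used above):

setTimeout(function taskA() {
  console.log('taskA start');
  // Triggering a new macro task from inside a running task: it goes to the
  // back of the macro task queue and does not interrupt the current task.
  setTimeout(function newTaskFn() {
    console.log('newTaskFn');
  }, 0);
  console.log('taskA end'); // the current task always runs to completion first
}, 0);
// Output: taskA start, taskA end, newTaskFn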

Web APIs

  1. setTimeout/setInterval
  2. AJAX requests
  3. DOM manipulation
  4. Local storage access
  5. Web Workers

A macro task queue is a FIFO (first in, first out) queue structure. The macro tasks stored in it are polled by the event loop, and each task is synchronous and blocking: while one task runs, the other tasks wait (queued in order).

4. Microtask Queue

The microtask queue is also a FIFO (first in, first out) queue structure, and the microtasks stored in it are likewise polled by the event loop. The microtask queue is similar to the macro task queue; it was added to the JS execution model as part of ES6 to handle Promise callbacks.

Microtasks are also very similar to macro tasks: they are synchronous, blocking code that occupies the call stack while running. As with macro tasks, new microtasks can be triggered while a microtask is running; the new tasks are submitted to the microtask queue and placed according to the queue order.
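A minimal sketch: a microtask enqueues another microtask, and both still finish before the waiting macro task gets a turn.

setTimeout(function macroTask() { console.log('macro task'); }, 0);

Promise.resolve().then(function microtask1() {
  console.log('microtask 1');
  // A new microtask created while the queue is being drained is appended to
  // the same queue and runs before control returns to the event loop.
  Promise.resolve().then(function microtask2() { console.log('microtask 2'); });
});
// Output: microtask 1, microtask 2, macro task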

As the saying goes, no two leaves are exactly alike. Although macro tasks and microtasks are similar in many ways, they differ in where they are stored and how they are processed.

  • Macro tasks are stored in the macro task queue, and microtasks are stored in the microtask queue (which, admittedly, is a bit of a tautology).
  • Macro tasks are executed in the loop, with UI rendering interspersed between macro tasks; microtasks fire after the current macro task completes and before the UI renders (see the sketch after this list).
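A minimal browser-only sketch of that ordering (requestAnimationFrame callbacks run as part of the rendering step, so the microtask always logs first):

Promise.resolve().then(function micro() {
  console.log('microtask: after the current macro task, before rendering');
});

requestAnimationFrame(function beforePaint() {
  console.log('render step: runs just before the next paint');
});
// Output: the microtask line, then the render-step line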

The microtask queue is a data structure added in ES6 specifically for handling Promise callbacks. It is similar to the macro task queue; the main difference is that it is dedicated to microtask-related logic.


2. Code analysis

Suppose we are the V8 engine: we receive a piece of JS code and follow a fixed procedure to produce the result the user wants.

Call Stack operation mechanism


function seventh() {}

function sixth() { seventh() }

function fifth() { sixth() }

function fourth() { fifth() }

function third() { fourth() }

function second() { third() }

function first() { second() }

first();

To be a proper JS engine, you need to know which tools (the call stack, task queues, etc.) are needed to execute this code.

From the code's point of view, these are just some functions being defined and called, with no side effects, so the ordinary call stack is enough to handle them. As mentioned above, the call stack stores a stack frame for each function call and remembers the calling relationships between functions, so after a function returns, execution can resume in the previous function using the information stored in its stack frame.

Task Queue operation mechanism

Scheduled to run at the same time

setTimeout(function a() {}, 0); // a's callback is enqueued first

setTimeout(function b() {}, 0); // then b's

setTimeout(function c() {}, 0); // then c's

function d() {}

d(); // synchronous call: runs immediately on the call stack
// Execution order: d => a => b => c

As the code executes, the JS engine parses it from top to bottom. The first thing it encounters is the three setTimeout calls, which are executed one by one, and their callback functions are enqueued in the order they were registered (a => b => c).

The synchronous code d() is then executed, and since it is synchronous code, it is pushed onto the call stack to execute the corresponding code logic.

After the call stack is empty (d() completes execution), the event loop extracts the required tasks from the macro task queue. Since all three macro tasks have the same scheduled running time, they are pushed onto the call stack in the order in which they are enqueued.

Scheduled to run at different times

This code is a bit different from the example above, but the logic for executing the synchronous code is the same.

The code is executed from top to bottom, with the corresponding setTimeout callbacks enqueued (a => b => c). The synchronous function d() is then executed.

But, but, but (here we go again): after d(), the event loop extracts from the macro task queue the callbacks that are due. Because the scheduled execution times differ, the three callbacks are not fetched in enqueue order; instead, they are extracted according to their scheduled run times. As mentioned above, the smaller the delay, the earlier the callback runs.

setTimeout(function a() {}, 1000); // longest delay: runs last

setTimeout(function b() {}, 500);  // runs after c

setTimeout(function c() {}, 0);    // shortest delay: first of the timer callbacks

function d() {}

d(); // synchronous call: runs before any timer callback
// Execution order: d => c => b => a

Microtask Queue operation mechanism

There are a few preliminary points that need a brief explanation. Promises have a property known as non-reentrancy.

Non-reentrancy: when a promise enters a settled (resolved/rejected) state, the handler associated with that state is not executed immediately; the synchronous code following the handler's registration runs before it.

Calling then() on a resolved promise pushes the onResolved handler onto the message queue (the microtask queue).

// Create an already-resolved promise
let p = Promise.resolve();
// Add a resolution handler (enqueued as a microtask, not run immediately)
p.then(() => console.log('New handler'));
// Synchronous output
console.log('Sync code');

// Actual output:
// Sync code
// New handler
fetch('https://www.google.com')
  .then(function a() {});

Promise.resolve()
  .then(function b() {});

Promise.reject()
  .catch(function c() {});

This code is intentional. All three statements produce promises. As we know, promises generate microtasks.

Parsing the code from top to bottom: fetch() issues an asynchronous request, and until the request settles (success/failure) the promise's state is pending, so the corresponding callback is not triggered. fetch() therefore fires first, but nothing enters the microtask queue or the call stack yet.

Promise.resolve()/Promise.reject(), on the other hand, return promises that go straight into a settled (resolved/rejected) state. So, as they are evaluated, the resulting microtasks are enqueued in code order.

When the synchronous code finishes executing, the event loop retrieves the tasks to process from the macro/microtask queues. The macro task queue is empty, so tasks are extracted from the microtask queue and pushed onto the call stack for execution.

After all the tasks currently in the microtask queue are processed, the fetch() request will eventually settle and the corresponding then handler will be triggered. At that point a new microtask is generated and enqueued, and the steps above repeat.
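A minimal sketch of the same code with logging added (network timing varies, so exactly when a runs depends on when the response arrives, but it is always after the synchronous code and the already-queued microtasks):

fetch('https://www.google.com')
  .then(function a() { console.log('a: fetch settled'); });

Promise.resolve()
  .then(function b() { console.log('b'); });

Promise.reject()
  .catch(function c() { console.log('c'); });

console.log('sync code done');
// Typical output: sync code done, b, c, then a once the request settles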

Task vs Microtask

setTimeout(function a() {}, 0);          // macro task: runs second

Promise.resolve().then(function b() {}); // microtask: runs first

Continue analyzing the code.

Execute the code from top to bottom. setTimeout is executed first, and its callback is enqueued in the macro task queue. (Believe me, this step is easy by now.)

Next, Promise.resolve() is executed; it returns a promise that is already in the resolved state. Its handler is treated differently and is placed in the microtask queue.

Now comes the slightly puzzling part. As mentioned earlier, the macro task queue and the microtask queue are similar, but the event loop does not treat them equally: the microtask queue is newer and has a higher priority than the macro task queue. Youth is capital, after all.

Actually, there’s an important point here:

A microtask fired inside a function takes precedence over a macro task fired inside the same function.

This might be confusing, since the code here does not appear to be inside a function. In fact, when a script is parsed, it runs inside an anonymous function in the global scope. If you open the debugger and look at the bottom of the Call Stack panel, there is an (anonymous) frame that represents the current script.

All right, that was a bit of a detour. All of this is really just to say one thing: within the same function scope, microtasks are executed before macro tasks.

One more thing: in JS, macro tasks and microtasks have a 1-to-N relationship.

V8 maintains a microtask queue for each macro task

As for timing: when a macro task finishes executing, V8 is about to destroy the global execution context object for that piece of code, and the object's destructor (a C++ concept) is invoked. At that point, V8 checks the microtask queue; if it contains microtasks, V8 takes them out one by one and executes them in order.
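A minimal sketch of that checkpoint behavior (using queueMicrotask, which enqueues a microtask directly): microtasks queued during a macro task are drained as soon as that task's synchronous code finishes, before the next macro task starts.

setTimeout(function firstTask() {
  console.log('first task: start');
  queueMicrotask(function micro() {
    console.log('microtask: drained at the end of the first task');
  });
  console.log('first task: end');
}, 0);

setTimeout(function secondTask() {
  console.log('second task');
}, 0);
// Output: first task: start, first task: end, the microtask line, second task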

Promise.all() vs Promise.race()

The following code analysis will be brief; the knowledge points involved really concern how promises work. A separate article dedicated to Promise is planned for later.

const GOOGLE = 'https://www.google.com';
const NEWS = 'https://www.news.google.com';

Promise.all([
  fetch(NEWS).then(function b() {}),
  fetch(GOOGLE).then(function c() {}),
]).then(function after() {});


Let’s get right to the conclusion: all of the internal promises inside Promise.all must reach a settled (resolved/rejected) state before the subsequent handler runs.

Is the result of the above code b => c => after, or c => b => after? It depends on which of b/c settles first. The sketch below shows the same behavior with deterministic timing.
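A minimal sketch of the same behavior without the network, using a hypothetical delay helper so the settle order is predictable:

const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

Promise.all([
  delay(20).then(function b() { console.log('b'); }),
  delay(10).then(function c() { console.log('c'); }),
]).then(function after() { console.log('after'); });
// Output: c, b, after -- `after` waits for every promise in the array to settle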

const GOOGLE = 'https://www.google.com';
const NEWS = 'https://www.news.google.com';

Promise.race([
   fetch(NEWS).then(function b() {}),
   fetch(GOOGLE).then(function c() {}),
]).then(function after() {});

Promise.race(), on the other hand, runs the subsequent code (its then handler) as soon as the first internal task settles. The remaining tasks in the race are still processed after that then callback.

The result of the above code is c => after => b; here, too, the relative order of b/c depends on which one settles first.
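The same hypothetical delay helper makes the contrast with Promise.all clear:

const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

Promise.race([
  delay(20).then(function b() { console.log('b'); }),
  delay(10).then(function c() { console.log('c'); }),
]).then(function after() { console.log('after'); });
// Output: c, after, b -- `after` fires as soon as the first promise settles,
// and the remaining task still completes afterwards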

Errors generated in promises

Promise.resolve()
  .then(function a() {
    Promise.resolve().then(function d() {})
    Promise.resolve().then(function e() {})
    throw new Error('Error')
    Promise.resolve().then(function f() {}) // never reached: code after the throw is skipped
  })
  .catch(function b() {}) // handles the error thrown in a
  .then(function c() {})
// Execution order: a => d => e => b => c (f never runs)

To be brief: throwing an error inside a promise handler truncates the subsequent code, meaning the code after the throw; Promise.resolve().then(function f() {}) will never be executed.

When an error is thrown in a promise, the error is actually surfaced asynchronously from the message queue, so it does not prevent the runtime from continuing to execute synchronous instructions.

Promise.reject(Error('foo')); 
console.log('bar'); 

// bar 
// Uncaught (in promise) Error: foo

Promise.prototype.catch(onRejected) simply calls Promise.prototype.then(null, onRejected) and returns a new promise instance.
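A minimal sketch showing the equivalence described above:

// These two rejection handlers behave the same way; each call returns a new promise.
Promise.reject(new Error('foo'))
  .then(null, (err) => console.log('handled via then(null, onRejected):', err.message));

Promise.reject(new Error('foo'))
  .catch((err) => console.log('handled via catch:', err.message));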

Promise chain operation

Promise.resolve()
  .then(function a() {
    Promise.resolve().then(function c() {}); // enqueued while a runs
  })
  .then(function b() {
    Promise.resolve().then(function d() {}); // enqueued while b runs
  });
// Execution order: a => c => b => d

If a new microtask is generated while a microtask is executing, it is appended to the microtask queue, and subsequent operations continue only after that new microtask completes.

This behavior, however, can hide a subtle pitfall:

function foo() {
  return Promise.resolve().then(foo)
}
foo()

When foo executes, Promise.resolve().then(foo) is called inside it, which creates a microtask. V8 adds the microtask to the microtask queue and exits the current execution of foo.

Before V8 exits the current macro task, it checks the microtask queue. The queue contains a microtask, so that microtask is executed first. Since the microtask is a call to foo itself, executing it calls foo again, and that call triggers the same kind of microtask all over again.

As a result, the current macro task can never exit, and the other macro tasks in the message queue, such as mouse and keyboard events, never get executed. Those events stay stuck in the queue, the page cannot respond to them, and it ends up frozen.
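By contrast, a minimal sketch that schedules the next call as a macro task instead: each call yields back to the event loop, so rendering and input events can still be processed between calls (it still loops forever, but it no longer freezes the page).

function foo() {
  // A macro task: the callback goes to the macro task queue, and other tasks
  // (and UI rendering) get a chance to run before the next foo().
  setTimeout(foo, 0);
}
foo();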

Afterword

Sharing is an attitude. The code analysis in this article was done with a visualization tool; if you want to practice, you can use it to verify the results yourself.

References:

  1. Vue.nextTick
  2. Professional JavaScript for Web Developers, 4th Edition
  3. Google V8

If you have read this far, please take a moment to give this article a like, a bookmark, and a share.