React source code 16. Concurrent mode

Video course (learn efficiently): go to the course

Course Contents:

1. Introduction and questions
2. React design philosophy
3. React source code architecture
4. Source directory structure and debugging
5. JSX & core API
6. Legacy and Concurrent mode entry functions
7. Fiber architecture
8. Render phase
9. The diff algorithm
10. Commit phase
11. Lifecycle
12. State update process
13. Hooks source code
14. Handwritten hooks
15. Scheduler & Lane
16. Concurrent mode
17. Context
18. Event system
19. Handwritten mini React
20. Summary & answers to the interview questions from Chapter 1
21. Demo

Concurrent mode

React 17 supports Concurrent mode, a set of new features built on Fiber, Scheduler, and Lane that lets an application adapt its responsiveness to the user's hardware performance and network conditions. The core idea is asynchronous, interruptible updates. Concurrent mode is also the direction of major React iterations going forward. It addresses two kinds of bottlenecks:

  • CPU: allows the time-consuming reconcile process to yield JS execution to higher-priority tasks, such as user input.
  • IO: relies on Suspense.

Fiber

Fiber has been introduced before; here we look at what Fiber means for Concurrent mode. In React 15 and earlier, reconciliation is executed synchronously and recursively, so a large component tree can occupy the main thread long enough to drop frames. To solve this problem, a set of asynchronous, interruptible updates is needed, so that time-consuming computation can yield JS execution to higher-priority tasks and be resumed when the browser is free. This in turn requires a data structure that describes the real DOM and the pending update information, and that can be reconciled in memory at an appropriate time. That data structure is Fiber.
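To make this concrete, here is a heavily simplified sketch of what a fiber node looks like. The field names mirror the React source, but this is only an illustration of the linked-list structure, not the real definition:

// A simplified sketch of a fiber node (illustrative, not the real React definition)
function FiberNode(tag, pendingProps, key) {
  // Identity
  this.tag = tag;            // kind of fiber: function component, class component, host component...
  this.key = key;
  this.type = null;          // e.g. 'div' or the component function/class
  this.stateNode = null;     // the real DOM node or class instance

  // Tree structure: a linked list instead of recursion, so work can be interrupted and resumed
  this.return = null;        // parent fiber
  this.child = null;         // first child
  this.sibling = null;       // next sibling

  // Props/state for this unit of work
  this.pendingProps = pendingProps;
  this.memoizedProps = null;
  this.memoizedState = null;
  this.updateQueue = null;

  // Scheduling and side effects
  this.lanes = 0;            // priority of pending updates on this fiber
  this.flags = 0;            // side effects to apply in the commit phase

  // Double buffering: points to the corresponding fiber in the other (current/workInProgress) tree
  this.alternate = null;
}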

Scheduler

Scheduler is independent of React itself and is published as a separate package. Its purpose is this: when the amount of CPU work is large, the time budget of one frame is derived from the device's FPS, and CPU work is executed within that budget. When a task is about to exceed one frame, it is paused to give the browser time to reflow and repaint, and the task is resumed at an appropriate time later.
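As a rough illustration of that idea (the 5ms slice and the function names below are assumptions, not the real Scheduler API), a work loop can check a frame deadline and yield when the slice is used up:

// Illustrative time slicing (assumed values and names, not the real Scheduler)
let frameDeadline = 0;
const yieldInterval = 5; // ms of JS work per slice (assumption)

function shouldYield() {
  // Pause once the slice is used up so the browser can reflow and repaint
  return performance.now() >= frameDeadline;
}

function workLoop(tasks) {
  frameDeadline = performance.now() + yieldInterval;
  while (tasks.length > 0 && !shouldYield()) {
    const task = tasks.shift();
    task();
  }
  // true means there is still work left; it will continue in a later slice
  return tasks.length > 0;
}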

In JS, a generator can also pause and resume tasks, but we also need to prioritize tasks, which generators cannot do on their own. In Scheduler, time slices are implemented with MessageChannel, and tasks are prioritized with a min heap, which together enable asynchronous, interruptible updates.
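The following sketch shows how a MessageChannel macrotask plus a priority-ordered queue can resume paused work. It is a simplification: the real Scheduler uses an actual min heap and more careful bookkeeping, and the names here are illustrative:

// Simplified sketch: resume paused work in a macrotask via MessageChannel
// taskQueue stands in for Scheduler's min heap, ordered by expiration time
const taskQueue = []; // entries look like { expirationTime, callback }

const channel = new MessageChannel();
channel.port1.onmessage = () => {
  // Take the most urgent task (smallest expirationTime) and run one slice of it
  taskQueue.sort((a, b) => a.expirationTime - b.expirationTime); // stand-in for a min heap
  const task = taskQueue[0];
  if (!task) return;
  const hasMoreWork = task.callback(); // the callback runs until it should yield
  if (hasMoreWork === true) {
    // Not finished yet: schedule another macrotask so the browser can paint first
    channel.port2.postMessage(null);
  } else {
    taskQueue.shift();
    if (taskQueue.length > 0) channel.port2.postMessage(null);
  }
};

function scheduleTask(task) {
  taskQueue.push(task);
  channel.port2.postMessage(null);
}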

Scheduler uses expiration time to represent priority.

The higher the priority, the shorter the timeout, so the expiration time is closer to the current time, meaning the task will be executed sooner.

The lower the priority, the longer the timeout, so the expiration time is further from the current time, meaning the task can wait longer before being executed.
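A small sketch of the mapping from priority to expiration time (the timeout values roughly follow Scheduler's, but treat them as assumptions rather than exact constants):

// Illustrative mapping from priority to expiration time
// (values roughly follow Scheduler's timeouts, treat them as assumptions)
const ImmediatePriority = 1;
const UserBlockingPriority = 2;
const NormalPriority = 3;
const LowPriority = 4;

function timeoutForPriority(priorityLevel) {
  switch (priorityLevel) {
    case ImmediatePriority:    return -1;    // already expired: run as soon as possible
    case UserBlockingPriority: return 250;   // e.g. user input
    case LowPriority:          return 10000;
    case NormalPriority:
    default:                   return 5000;
  }
}

function expirationTimeFor(priorityLevel, currentTime) {
  // Higher priority => smaller timeout => expiration closer to now => executed sooner
  return currentTime + timeoutForPriority(priorityLevel);
}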

Lane

Lane uses binary bits to represent task priority, which makes priority calculations convenient. Different priorities occupy 'tracks' at different bit positions, and there is a concept of batching: the lower the priority, the more 'tracks' it occupies. How a high priority interrupts a low priority, and what priority a new task should be assigned, are the problems Lane needs to solve.
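The bit-level idea can be sketched like this (the lane constants below are simplified; the real ReactFiberLane model uses 31-bit masks with many more lanes):

// Simplified lane model: each bit is a 'track'; lower bits mean higher priority
const NoLanes = 0b0000;
const SyncLane = 0b0001; // highest priority, a single track
const InputLane = 0b0010;
const DefaultLane = 0b0100;
const TransitionLanes = 0b1000; // in the real model, low priorities get many bits (a batch of tracks)

// Merging, testing, and picking priorities is plain bitwise math
function mergeLanes(a, b) {
  return a | b;
}
function includesLane(set, lane) {
  return (set & lane) !== 0;
}
// The lowest set bit is the highest-priority pending lane
function getHighestPriorityLane(lanes) {
  return lanes & -lanes;
}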

batchedUpdates

In simple terms, when multiple updates are triggered in the same context, they are merged into one update. For example:

onClick() {
  this.setState({ count: this.state.count + 1 });
  this.setState({ count: this.state.count + 1 });
}

In previous versions of React (legacy mode), if you put multiple setState calls inside a setTimeout, they would not be merged, because they run outside the original context. Multiple setState calls inside the same React event handler run with BatchedContext set on executionContext and are therefore merged; when executionContext equals NoContext, the SyncCallbackQueue is flushed synchronously, so the setState calls inside setTimeout are not merged and each one is executed synchronously.

onClick() {
  setTimeout(() => {
    this.setState({ count: this.state.count + 1 });
    this.setState({ count: this.state.count + 1 });
  });
}
export function batchedUpdates<A, R>(fn: A => R, a: A): R {
  const prevExecutionContext = executionContext;
  executionContext |= BatchedContext;
  try {
    return fn(a);
  } finally {
    executionContext = prevExecutionContext;
    if (executionContext === NoContext) {
      resetRenderTimer();
      // Flush the tasks in SyncCallbackQueue synchronously when executionContext is NoContext
      flushSyncCallbackQueue();
    }
  }
}
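For intuition, this is roughly how legacy-mode React ends up with BatchedContext set while your onClick runs: the event system invokes the handler through batchedUpdates. The snippet below is a simplified illustration with an assumed function name, not the actual event-plugin code:

// Illustrative only: how an event handler ends up inside batchedUpdates in legacy mode.
// dispatchEventForLegacyMode is an assumed name for this sketch.
function dispatchEventForLegacyMode(nativeEvent, handler) {
  // While the handler runs, executionContext contains BatchedContext,
  // so every setState inside it is merged into one update.
  batchedUpdates(() => handler(nativeEvent));
  // A setTimeout scheduled inside the handler fires later, after executionContext
  // has been restored to NoContext, so its setState calls are flushed synchronously
  // one by one instead of being merged.
}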

In Concurrent mode, the example above is also merged into one update. The root cause is in the simplified source below: when setState is called multiple times, the priorities of the resulting callbacks are compared, and if they are equal the function returns early, without scheduling another render phase.

function ensureRootIsScheduled(root: FiberRoot, currentTime: number) {
  const existingCallbackNode = root.callbackNode; // the callback scheduled by a previous setState
  // ...
  if (existingCallbackNode !== null) {
    const existingCallbackPriority = root.callbackPriority;
    // The new setState callback has the same priority as the previous one: enter the batchedUpdate logic
    if (existingCallbackPriority === newCallbackPriority) {
      return;
    }
    cancelCallback(existingCallbackNode);
  }
  // Schedule the start of the render phase
  newCallbackNode = scheduleCallback(
    schedulerPriorityLevel,
    performConcurrentWorkOnRoot.bind(null, root),
  );
  // ...
}

In Concurrent mode, when setState is called multiple times inside a setTimeout, each call goes through requestUpdateLane with the same priority. Because currentEventWipLanes === NoLanes on the first call, it is set to workInProgressRootIncludedLanes, so the currentEventWipLanes argument is the same for every call. In findUpdateLane the schedulerLanePriority argument is also the same (the scheduling priority is the same), so the same lane is returned for each update.

export function requestUpdateLane(fiber: Fiber): Lane {
  // ...
  if (currentEventWipLanes === NoLanes) {
    // currentEventWipLanes === NoLanes the first time through
    currentEventWipLanes = workInProgressRootIncludedLanes;
  }
  // ...
  // schedulerLanePriority and currentEventWipLanes are the same for each setState in the setTimeout, so the returned lane is the same
  lane = findUpdateLane(schedulerLanePriority, currentEventWipLanes);
  // ...

  return lane;
}

Suspense

Suspense can display a pending (fallback) state while data is being requested and then display the data once the request succeeds. The reason is that the component inside Suspense is rendered at a low priority, while rendering the fallback has a high priority. After the promise resolves, the render phase is scheduled again; this happens in updateSuspenseComponent. See the Suspense video for details.
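A minimal usage sketch of the throw-a-promise mechanism that Suspense relies on (wrapPromise and fetchUser are assumed helpers written for this illustration, not React APIs):

// Illustrative sketch of how Suspense shows a fallback while data loads.
// wrapPromise and fetchUser are assumed helpers, not React APIs.
function wrapPromise(promise) {
  let status = 'pending';
  let result;
  const suspender = promise.then(
    (value) => { status = 'success'; result = value; },
    (error) => { status = 'error'; result = error; }
  );
  return {
    read() {
      if (status === 'pending') throw suspender; // Suspense catches this and renders the fallback
      if (status === 'error') throw result;
      return result; // after resolve, React re-renders and read() returns the data
    },
  };
}

const userResource = wrapPromise(fetchUser()); // fetchUser: some data request

function User() {
  const user = userResource.read(); // throws while pending
  return <div>{user.name}</div>;
}

function App() {
  return (
    <React.Suspense fallback={<div>loading...</div>}>
      <User />
    </React.Suspense>
  );
}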

Conclusion

Fiber provides data-level support for the Concurrent architecture.

Scheduler guarantees time-slice scheduling for Concurrent mode.

The Lane model provides the update strategy for Concurrent mode.

On top of these, features such as batchedUpdates and Suspense are implemented.