An overview of update processing
As soon as a user interaction produces an update, an Update object is created to hold the new state. Multiple updates are connected into a circular linked list: the updateQueue, which is mounted on the Fiber and then traversed during the fiber's beginWork phase to process the updates in turn. This is the general flow of update processing, and it is the essence of how a component's new state is computed. In React, class components and the root component use one kind of Update object, while function components use another, but both follow a similar processing mechanism. For now, let's focus on the Update object of class components.
Relevant concepts
How do updates come about? In a class component, an update can be generated by calling setState:
```js
this.setState({val: 6});
```
setState actually calls enqueueSetState, which creates an Update object and calls enqueueUpdate to put it into the updateQueue:
```js
const classComponentUpdater = {
  enqueueSetState(inst, payload, callback) {
    ...
    // Create the update's priority based on the event priority
    const lane = requestUpdateLane(fiber, suspenseConfig);
    const update = createUpdate(eventTime, lane, suspenseConfig);
    update.payload = payload;
    enqueueUpdate(fiber, update);
    // Start scheduling
    scheduleUpdateOnFiber(fiber, lane, eventTime);
    ...
  },
};
```
Assuming that node B generates an update, B's updateQueue will eventually look like this:
```
        A
       /
      B ------ updateQueue.shared.pending = update
     /                                     |   ^
    C -----> D                             |___|
```
The update is stored in updateQueue.shared.pending. Let's walk through the structures of update and updateQueue below.
The structure of the update
Update objects, as the carriers of updates, must store the update's information:
```js
const update: Update<*> = {
  eventTime,
  lane,
  suspenseConfig,
  tag: UpdateState,
  payload: null,
  callback: null,
  next: null,
};
```
- eventTime: the time the update occurred. If an update is not executed because its priority is insufficient, it will eventually time out and be executed immediately
- lane: the priority of the update
- suspenseConfig: related to task suspension
- tag: the type of the update (UpdateState, ReplaceState, ForceUpdate, CaptureUpdate)
- payload: the state carried by the update.
- In a class component there are two possibilities: an object ({}) or a function ((prevState, nextProps) => newState)
- In the root component it is the React element passed as the first argument to ReactDOM.render
- callback: can be understood as the callback of setState
- next: pointer to the next update
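To make the payload field concrete, here is a minimal sketch (not React's actual code; getPartialState and the sample states are made up for illustration) of how a class component's payload, whether object or function, yields the partial state that gets merged:

```js
// Hypothetical helper illustrating how an UpdateState payload is resolved.
// React does something similar inside getStateFromUpdate.
function getPartialState(prevState, nextProps, payload) {
  // A function payload receives the previous state and next props
  return typeof payload === 'function'
    ? payload(prevState, nextProps)
    : payload;
}

const prev = { val: 1, other: 'x' };
// Function form: (prevState, nextProps) => newPartialState
const partial = getPartialState(prev, {}, (prevState) => ({ val: prevState.val + 5 }));
// The partial state is shallowly merged into the previous state
const next = Object.assign({}, prev, partial); // { val: 6, other: 'x' }
```

The object form ({val: 6}) resolves the same way, just without the function call.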
The structure of updateQueue
A component may generate multiple updates, so the Fiber needs a linked list to store them. This list is the updateQueue, structured as follows:
```js
const queue: UpdateQueue<State> = {
  baseState: fiber.memoizedState,
  firstBaseUpdate: null,
  lastBaseUpdate: null,
  shared: {
    pending: null,
  },
  effects: null,
};
```
Assume an update has just been generated. Taking this round of update processing as the reference point, the fields mean the following:
- baseState: the state computed by the previous round of processing, or more precisely, the state computed from the updates that came before the first skipped update. It is the basis for computing state this time.
- firstBaseUpdate: the first update skipped in the previous round of processing.
- lastBaseUpdate: the last update of the queue that was truncated, in the previous round, from the first skipped update to the end of the updateQueue.
- shared.pending: the queue storing this round's updates; this is the real update queue. "shared" means the current node and the workInProgress node share the same queue.
- effects: an array holding the updates whose callback !== null.
A few things need to be explained:
- Multiple Update objects are generated simply by calling setState multiple times:
```js
this.setState({val: 2});
this.setState({val: 6});
```
The resulting updateQueue structure is as follows:
You can see that it is a singly linked circular list:

```
u1 ---> u2
 ^       |
 |_______|
```
- Why is the update queue circular?

Conclusion: because it makes both ends of the list easy to locate. shared.pending points to the last update, and shared.pending.next points to the first update.

Imagine a non-circular list whose pointer references the last element: to reach the head you would have to traverse the whole list. If the pointer referenced the first element instead, you would have to traverse to the tail every time an update is appended. With a circular list, you only need to remember the tail; the head is always tail.next. Understanding the concept is paramount, so let's look at the implementation:
```js
function enqueueUpdate<State>(fiber: Fiber, update: Update<State>) {
  const updateQueue = fiber.updateQueue;
  if (updateQueue === null) {
    return;
  }
  const sharedQueue: SharedQueue<State> = (updateQueue: any).shared;
  const pending = sharedQueue.pending;
  if (pending === null) {
    // This is the first update: make it circular by pointing it to itself
    update.next = update;
  } else {
    // pending is the tail of the list; pending.next is the head.
    // Point the new update at the head...
    update.next = pending.next;
    // ...and point the old tail at the new update, attaching it
    pending.next = update;
  }
  // pending always points to the last update, keeping the list circular
  sharedQueue.pending = update;
}
```
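To see the circular invariant concretely, here is a plain-JavaScript sketch of the same enqueue logic (Flow types stripped, names simplified) with a tiny walkthrough:

```js
// Sketch of the enqueue logic above, for experimentation only.
function enqueue(shared, update) {
  const pending = shared.pending;
  if (pending === null) {
    // First update: the single-element list points to itself
    update.next = update;
  } else {
    // pending is the tail; pending.next is the head
    update.next = pending.next; // new tail points to the head
    pending.next = update;      // old tail points to the new tail
  }
  // pending always tracks the tail, keeping the list circular
  shared.pending = update;
}

const shared = { pending: null };
const u1 = { payload: { val: 2 }, next: null };
const u2 = { payload: { val: 6 }, next: null };
enqueue(shared, u1);
enqueue(shared, u2);

// shared.pending === u2 (the tail), shared.pending.next === u1 (the head),
// and u1.next === u2, so the list is circular
```

Appending a third update would touch only shared.pending and two next pointers; no traversal is ever needed.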
- firstBaseUpdate and lastBaseUpdate form a linked list: baseUpdate. Relative to the current round of processing, it stores everything from the first low-priority update skipped in the previous round through the last update in the queue. As for baseState, it is the state computed from the updates that came before that first skipped update.
These two things are a little bit confusing, so here’s an example:
```
A1 -> B1 -> C2 -> D1 -> E2
```
Letters indicate the state carried by the update, and numbers indicate its priority. In the lanes model, the smaller the number, the higher the priority, so lane 1 has higher priority than lane 2.
The first time the queue is processed, the render priority is 1. When C2 is encountered, its priority is not 1, so it is skipped. By the time this round of processing finishes, the baseUpdate list is:
```
C2 -> D1 -> E2
```
After this round completes:
- firstBaseUpdate is C2
- lastBaseUpdate is E2
- baseState is AB (the state computed before C2 was skipped; the rendered result, memoizedState, is ABD)
firstBaseUpdate and lastBaseUpdate record everything from the first skipped update onward, and baseState records the state computed from the updates before that skipped update. This is done to guarantee that, once updates of all priorities have been processed, the final result matches expectations. That is, although A1 -> B1 -> C2 -> D1 -> E2 evaluates to ABD the first time at priority 1 (because the priority-2 updates are skipped), the final result must be ABCDE, which is the result of processing every Update object in the queue. Next, let's look at the updateQueue processing mechanism in detail.
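The two-pass guarantee can be simulated with a short sketch. This is a simplified model, not React's code: lanes are plain numbers, a lane counts as "insufficient" when it is larger than renderLane, and states are strings:

```js
// Simplified simulation of skip-and-redo processing.
function processQueue(baseState, updates, renderLane) {
  let state = baseState;
  let base = null;     // baseState to carry into the next pass
  const leftover = []; // baseUpdate slice for the next pass
  for (const u of updates) {
    if (u.lane > renderLane) {
      // Insufficient priority: record state at the FIRST skip only
      if (leftover.length === 0) base = state;
      leftover.push(u);
    } else {
      // Already skipping? Keep this update too, with lane 0 so it is
      // never skipped again (mirrors React's NoLane clone)
      if (leftover.length > 0) leftover.push({ ...u, lane: 0 });
      state = state + u.payload;
    }
  }
  if (leftover.length === 0) base = state;
  return { state, base, leftover };
}

const queue = [
  { payload: 'A', lane: 1 },
  { payload: 'B', lane: 1 },
  { payload: 'C', lane: 2 },
  { payload: 'D', lane: 1 },
  { payload: 'E', lane: 2 },
];

// First pass at renderLane 1: C2 and E2 are skipped
const first = processQueue('', queue, 1);
// first.state === 'ABD', first.base === 'AB',
// first.leftover carries C2 -> D1 -> E2

// Second pass at renderLane 2 starts from the recorded baseState
const second = processQueue(first.base, first.leftover, 2);
// second.state === 'ABCDE'
```

The first pass yields the temporary result ABD while remembering baseState AB and the slice C2 -> D1 -> E2; replaying that slice in the second pass restores the expected final result ABCDE.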
Update processing mechanism
Processing updates can be divided into three stages: preparation, processing, and completion. The first two phases focus on the updateQueue; the last assigns the newly computed state to the fiber.
Preparation stage
Tidying up the updateQueue. Because of priorities, a low-priority update may be skipped and left waiting for the next round, and new updates may be generated in the meantime. So when a round of processing starts there may be two queues: the one left over from the previous round and the one newly added. The leftover queue is the list from firstBaseUpdate to lastBaseUpdate; the new queue holds the updates generated since.
In the preparation phase, the two queues are merged, and the merged queue is no longer circular, so it can be traversed from head to tail. In addition, the operations above all happen on the workInProgress node's updateQueue, so the same operation must also be performed on the current node: if this render is interrupted by a higher-priority task and a new workInProgress node is later cloned from current, the updates that have not yet been processed are not lost.
Processing phase
Loop through the update queue cleaned up in the previous step. There are two main points here:
- Whether an update is processed in this round depends on its priority (update.lane) and the render priority (renderLanes).
- The calculation results of this update are based on baseState.
Insufficient priority
An update with insufficient priority is skipped. Besides the skip itself, three things are done:
- The skipped update is placed into the linked list formed by firstBaseUpdate and lastBaseUpdate (the baseUpdate list), to be processed in the next, lower-priority round.
- baseState is recorded: the result of all updates processed before this low-priority update. It is only recorded at the first skip, because the low-priority round redoes everything starting from the first skipped update.
- The priority of the skipped update is recorded and, at the end of processing, placed into workInProgress.lanes. This is the key to initiating scheduling again to redo the low-priority task.
The second point is explained in the comment at the head of the ReactUpdateQueue.js file; here it is restated in a form that is easier to follow.
```
Initial baseState: ''  (letters = state, numbers = priority; 1 > 2)
Queue: A1 -> B1 -> C2 -> D1 -> E2

First render, renderLanes = 1:
  Updates processed: [A1, B1, D1]
  baseUpdate queue:  [C2, D1, E2]   <- C2 is the first skipped update;
                                       everything after it is kept
  baseState: 'AB'                   <- result of the updates processed
                                       before C2. Note: when the next
                                       low-priority update (E2) is
                                       skipped, baseState is NOT
                                       recorded again
  Result state: 'ABD'
--------------------------------------------------------------------
Second render, renderLanes = 2:
  Base state: 'AB'                  <- computation starts from the
                                       baseState left by the last render
  Updates: [C2, D1, E2]
  Result state: 'ABCDE'
```
Sufficient priority
If an update has sufficient priority, two main things happen:
- If the baseUpdate queue is non-empty (an update was skipped earlier), the current update is also placed into the baseUpdate queue.
- Process updates and calculate new states.
This can be seen in conjunction with the enqueueing of skipped low-priority updates above. It effectively means that as soon as one update is skipped, it becomes the starting point, and all subsequent updates up to the end of the queue are kept regardless of priority. Take the queue A1 -> B2 -> C1 -> D2 as an example: when B2 is skipped, the baseUpdate queue becomes B2 -> C1 -> D2.
This is done to ensure that the final result of processing all updates matches what the user's interactions should produce. For example, although A1 and C1 are executed first and briefly display the result AC, that is only a temporary result shown to respond to the interaction promptly; C1's result actually depends on B2's. When the second render processes the queue B2 -> C1 -> D2 based on the state computed before B2 (baseState = A), the final result is ABCD. The provided example of a high-priority task jumping the queue demonstrates this.
The state changes 0 -> 2 -> 3. A lifecycle method sets the state to 1 (task A2), and a click event adds 2 to the state (task A1, higher priority). Normally A2 would be scheduled as usual, but before its render completes, A1 jumps the queue, so the update queue is A2 -> A1. To respond to the high-priority update first, A2 is skipped and only A1 is computed: the number changes from 0 to 2, baseUpdate is A2 -> A1, and baseState is 0. Then the lower-priority task is redone: processing A2 -> A1 from baseState 0 gives 1, then 1 + 2 = 3, the final result.
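The same simplified model used earlier (plain-number lanes, not React's code) reproduces this 0 -> 2 -> 3 sequence; note how A1's clone is replayed on top of A2's result in the redo pass:

```js
// Payloads may be values (replace state) or functions (derive from state),
// mirroring the two payload forms of a class component.
function apply(state, payload) {
  return typeof payload === 'function' ? payload(state) : payload;
}
// Simplified skip-and-redo processing: a lane is skipped when it is
// larger than renderLane; baseState is recorded at the first skip.
function processQueue(baseState, updates, renderLane) {
  let state = baseState;
  let base = null;
  const leftover = [];
  for (const u of updates) {
    if (u.lane > renderLane) {
      if (leftover.length === 0) base = state;
      leftover.push(u);
    } else {
      // Clone with lane 0 so it is never skipped again (like NoLane)
      if (leftover.length > 0) leftover.push({ ...u, lane: 0 });
      state = apply(state, u.payload);
    }
  }
  if (leftover.length === 0) base = state;
  return { state, base, leftover };
}

const A2 = { payload: 1, lane: 2 };            // set state to 1
const A1 = { payload: (s) => s + 2, lane: 1 }; // add 2 to the state

// High-priority pass (renderLane 1): A2 is skipped, 0 + 2 = 2
const first = processQueue(0, [A2, A1], 1);
// Low-priority redo (renderLane 2): A2 then A1 from baseState 0 -> 1 -> 3
const second = processQueue(first.base, first.leftover, 2);
```

The intermediate 2 is what the user sees immediately; the redo pass settles on 3, the result of applying both updates in order.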
Completion stage
This stage mostly performs assignments and priority marking:
- Assign updateQueue.baseState: if no update was skipped during this render, assign the newly computed state; otherwise assign the state computed before the first skipped update.
- Assign updateQueue.firstBaseUpdate and lastBaseUpdate: if any update was skipped, assign the truncated queue to updateQueue's baseUpdate list.
- Update lanes on the workInProgress node: if nothing was skipped, all updates were processed and lanes becomes empty; otherwise merge the lanes of the skipped updates into it. As mentioned before, this is the key to initiating another schedule that redoes the low-priority task.
- Update memoizedState on the workInProgress node.
Source code implementation
The basic flow of update processing has been described above; now let's look at the source implementation. The code lives in the processUpdateQueue function, which involves many linked-list operations. Since there is a lot of code, let's look at its overall structure first; the three phases are marked.
```js
function processUpdateQueue<State>(
  workInProgress: Fiber,
  props: any,
  instance: any,
  renderLanes: Lanes,
): void {
  // Preparation phase
  const queue: UpdateQueue<State> = (workInProgress.updateQueue: any);
  let firstBaseUpdate = queue.firstBaseUpdate;
  let lastBaseUpdate = queue.lastBaseUpdate;
  let pendingQueue = queue.shared.pending;
  if (pendingQueue !== null) { /* ... */ }

  if (firstBaseUpdate !== null) {
    // Processing phase
    do { /* ... */ } while (true);

    // Completion phase
    if (newLastBaseUpdate === null) {
      newBaseState = newState;
    }
    queue.baseState = ((newBaseState: any): State);
    queue.firstBaseUpdate = newFirstBaseUpdate;
    queue.lastBaseUpdate = newLastBaseUpdate;
    markSkippedUpdateLanes(newLanes);
    workInProgress.lanes = newLanes;
    workInProgress.memoizedState = newState;
  }
}
```
With the concepts and the overall structure in mind, here is the complete code, with irrelevant parts removed and comments added. Reading it against the three phases above will help; the linked-list operations alone are a bit involved.
```js
function processUpdateQueue<State>(
  workInProgress: Fiber,
  props: any,
  instance: any,
  renderLanes: Lanes,
): void {
  // Preparation phase ----------------------------------------
  // Fetch the updateQueue from the workInProgress node
  const queue: UpdateQueue<State> = (workInProgress.updateQueue: any);

  // Take the baseUpdate queue left over from the previous round
  // (hereafter "the legacy queue"), and prepare to take the update
  // queue created this time (hereafter "the new queue")
  let firstBaseUpdate = queue.firstBaseUpdate;
  let lastBaseUpdate = queue.lastBaseUpdate;

  // Take the new queue
  let pendingQueue = queue.shared.pending;
  // The following connects the new queue to the tail of the legacy queue
  if (pendingQueue !== null) {
    queue.shared.pending = null;

    // Take the tail and head of the new queue
    const lastPendingUpdate = pendingQueue;
    const firstPendingUpdate = lastPendingUpdate.next;
    // Break the circular list by pointing the tail's next to null
    lastPendingUpdate.next = null;
    // Append the new queue to the tail of the legacy queue
    if (lastBaseUpdate === null) {
      firstBaseUpdate = firstPendingUpdate;
    } else {
      // next of the last update in the legacy queue points to the first
      // update of the new queue, completing the connection
      lastBaseUpdate.next = firstPendingUpdate;
    }
    // The tail of the merged queue is the tail of the new queue
    lastBaseUpdate = lastPendingUpdate;

    // Update firstBaseUpdate and lastBaseUpdate (the baseUpdate queue)
    // on current in the same way. This effectively backs up the merged
    // queue to the current node: if this render is interrupted by a
    // higher-priority task, the next workInProgress node is cloned from
    // current, whose baseUpdate queue retains the updates, so none of
    // them are lost.
    const current = workInProgress.alternate;
    if (current !== null) {
      // This is always non-null on a ClassComponent or HostRoot
      const currentQueue: UpdateQueue<State> = (current.updateQueue: any);
      const currentLastBaseUpdate = currentQueue.lastBaseUpdate;
      if (currentLastBaseUpdate !== lastBaseUpdate) {
        if (currentLastBaseUpdate === null) {
          currentQueue.firstBaseUpdate = firstPendingUpdate;
        } else {
          currentLastBaseUpdate.next = firstPendingUpdate;
        }
        currentQueue.lastBaseUpdate = lastPendingUpdate;
      }
    }
  }

  // At this point the new queue has been merged into the legacy queue;
  // firstBaseUpdate is the head of the merged queue, which the loop
  // below processes.

  // Processing phase -------------------------------------
  if (firstBaseUpdate !== null) {
    // Get the baseState
    let newState = queue.baseState;
    // Declare newLanes: it collects the priorities of the updates
    // skipped this round and is finally marked on the WIP node
    let newLanes = NoLanes;

    // Declare newBaseState; pay attention to when it is assigned below:
    // 1. If an update is skipped, newBaseState is set to newState, the
    //    state computed so far, and eventually written to queue.baseState
    // 2. If nothing is skipped, newBaseState is set after the loop to
    //    the newly computed state, then written to queue.baseState
    let newBaseState = null;

    // newFirstBaseUpdate / newLastBaseUpdate form the baseUpdate queue
    // produced by this round: the slice from the first skipped
    // low-priority update through the last update. It will become
    // updateQueue.firstBaseUpdate / lastBaseUpdate, i.e. the legacy
    // queue for the next render.
    let newFirstBaseUpdate = null;
    let newLastBaseUpdate = null;

    // Start the loop from the head of the queue
    let update = firstBaseUpdate;
    do {
      const updateLane = update.lane;
      const updateEventTime = update.eventTime;
      // isSubsetOfLanes checks whether the update's priority is included
      // in renderLanes; if not, its priority is insufficient
      if (!isSubsetOfLanes(renderLanes, updateLane)) {
        const clone: Update<State> = {
          eventTime: updateEventTime,
          lane: updateLane,
          suspenseConfig: update.suspenseConfig,
          tag: update.tag,
          payload: update.payload,
          callback: update.callback,
          next: null,
        };
        // Put the skipped update into the baseUpdate queue
        if (newLastBaseUpdate === null) {
          newFirstBaseUpdate = newLastBaseUpdate = clone;
          // newBaseState records the result computed so far; the next
          // render, at the new priority, will compute from this state
          // when it processes the leftover queue
          newBaseState = newState;
        } else {
          // The baseUpdate queue already has updates; append the
          // current one to its tail
          newLastBaseUpdate = newLastBaseUpdate.next = clone;
        }
        /*
         * newLanes is eventually assigned to workInProgress.lanes and
         * collected into root.pendingLanes. The next render takes the
         * lanes of the skipped updates from it as renderLanes, so that
         * processUpdateQueue can process them: this is how low-priority
         * tasks get redone.
         */
        newLanes = mergeLanes(newLanes, updateLane);
      } else {
        if (newLastBaseUpdate !== null) {
          // The current update comes after a skipped low-priority
          // update: its priority is sufficient, but newLastBaseUpdate
          // is non-null, meaning something was skipped before it. So
          // this high-priority update is also cloned into baseUpdate;
          // this implements the truncation mentioned earlier, from the
          // first skipped update through the last update.
          const clone: Update<State> = {
            eventTime: updateEventTime,
            // NoLane guarantees the clone is never skipped next time
            lane: NoLane,
            suspenseConfig: update.suspenseConfig,
            tag: update.tag,
            payload: update.payload,
            callback: update.callback,
            next: null,
          };
          newLastBaseUpdate = newLastBaseUpdate.next = clone;
        }
        markRenderEventTimeAndConfig(updateEventTime, update.suspenseConfig);

        // Process the update and compute the new state
        newState = getStateFromUpdate(
          workInProgress,
          queue,
          update,
          newState,
          props,
          instance,
        );
        const callback = update.callback;
        // callback is the second argument to setState; as a side effect
        // it is pushed into the effects queue
        if (callback !== null) {
          workInProgress.effectTag |= Callback;
          const effects = queue.effects;
          if (effects === null) {
            queue.effects = [update];
          } else {
            effects.push(update);
          }
        }
      }
      // Advance the pointer to continue the traversal
      update = update.next;
      if (update === null) {
        // The merged queue is exhausted; check whether new updates came
        // in while processing, and continue with them if so
        pendingQueue = queue.shared.pending;
        if (pendingQueue === null) {
          // No updates waiting to be processed; exit the loop
          break;
        } else {
          // New updates came in; append them to the merged queue
          const lastPendingUpdate = pendingQueue;
          const firstPendingUpdate = ((lastPendingUpdate.next: any): Update<State>);
          lastPendingUpdate.next = null;
          update = firstPendingUpdate;
          queue.lastBaseUpdate = lastPendingUpdate;
          queue.shared.pending = null;
        }
      }
    } while (true);

    // If no update was skipped, newBaseState is the newly computed state
    if (newLastBaseUpdate === null) {
      newBaseState = newState;
    }

    // Completion phase ------------------------------------
    queue.baseState = ((newBaseState: any): State);
    queue.firstBaseUpdate = newFirstBaseUpdate;
    queue.lastBaseUpdate = newLastBaseUpdate;
    markSkippedUpdateLanes(newLanes);
    workInProgress.lanes = newLanes;
    workInProgress.memoizedState = newState;
  }
}
```
The logic that useReducer in hooks uses to process updates and compute state is basically the same as described here.
Conclusion
After the walkthrough above, you can see that the entire handling of updates revolves around priorities. The goal of processUpdateQueue is to process updates by priority without losing the user's intent, and it follows a fixed set of rules: once an update is skipped, remember the state computed up to that point (baseState) and the queue from that update onward (baseUpdate), and back that queue up to the current node. This is what guarantees that updates are eventually processed completely and in order, and that the final state is consistent with the result the user's interactions should produce.