React version: V17.0.3
1. Preface
Recall from the react-hooks overview that the implementation behind a hook call depends on the phase the component is in:
- Component mount phase: useState = ReactCurrentDispatcher.current.useState = HooksDispatcherOnMount.useState = mountState
- Component update phase: useState = ReactCurrentDispatcher.current.useState = HooksDispatcherOnUpdate.useState = updateState
Therefore, calling useState during the mount phase actually executes mountState, and calling it during the update phase actually executes updateState.
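To keep the walkthrough concrete, here is a minimal counter (an illustrative example, not taken from the React source) whose useState call we will trace through both phases:

import React, { useState } from 'react';

function Counter() {
  // On the first render this call runs mountState; on re-renders it runs updateState.
  const [count, setCount] = useState(0);
  return <button onClick={() => setCount(count + 1)}>{count}</button>;
}

The first render of Counter goes through mountState; each click dispatches an update, and the next render goes through updateState.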
2. Type definition
Let’s start by looking at a few type definitions in ReactFiberHooks.new.js.
2.1 Hook
We may define multiple hooks in a function component, so how does the component find the information for each corresponding hook? Let’s look at the Hook type definition:
export type Hook = {|
  // the state for the current render; the final value after the last full update
  memoizedState: any,
  // initialState when initialized, and the new base state after each dispatch is processed
  baseState: any,
  // updates that still need to be applied; kept so React can roll the data back if a render errors out
  baseQueue: Update<any, any> | null,
  // the cached update queue, storing the dispatched update actions
  queue: UpdateQueue<any, any> | null,
  // points to the Hook object of the next hook call; next links all the hooks together
  next: Hook | null,
|};
The hook information of a function component is stored on the memoizedState property of its Fiber node, which points to the first hook in the component. The hooks form a singly linked list, and each node can reach the next hook through its next pointer.
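As a rough sketch (the component below is made up for illustration), two useState calls produce two Hook nodes chained from the fiber:

import { useState } from 'react';

function Profile() {
  const [name, setName] = useState('Tom'); // first hook in the list
  const [age, setAge] = useState(18);      // second hook, reached via next
  return null;
}

// Conceptually, after Profile mounts:
// fiber.memoizedState -> Hook('Tom') -> Hook(18) -> null
// On every later render the hooks must be called in the same order,
// which is why hooks cannot appear inside conditions or loops.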
2.2 Update & UpdateQueue
Each hook’s update queue is stored on its queue property; once the update tasks in the queue have been processed, the latest state value is obtained. The types of an update task and of the update queue are defined as follows:
type Update<S, A> = {|
lane: Lane,
action: A,
eagerReducer: ((S, A) => S) | null,
eagerState: S | null,
next: Update<S, A>,
|};
export type UpdateQueue<S, A> = {|
pending: Update<S, A> | null,
interleaved: Update<S, A> | null,
lanes: Lanes,
dispatch: (A => mixed) | null,
lastRenderedReducer: ((S, A) => S) | null,
lastRenderedState: S | null,
|};
An Update represents a single state update and is the unit React schedules when processing a dispatch. UpdateQueue is the queue of pending Updates for a hook; its dispatch field holds the function used to enqueue new Updates.
3. Mount phase
During the mount phase, executing useState actually executes mountState. Let’s look at the implementation of this function.
3.1 mountState
function mountState<S>(
  initialState: (() => S) | S,
): [S, Dispatch<BasicStateAction<S>>] {
  // Create a new hook and return the current workInProgressHook
  const hook = mountWorkInProgressHook();
  // Lazy initialization: if initialState is a function, call it to get the initial value
  if (typeof initialState === 'function') {
    // $FlowFixMe: Flow doesn't like mixed types
    initialState = initialState();
  }
  // memoizedState records the value useState should return for this render;
  // baseState is initialized to initialState as well
  hook.memoizedState = hook.baseState = initialState;
  // Create a new update queue for this hook
  const queue = (hook.queue = {
    pending: null,
    interleaved: null,
    lanes: NoLanes,
    dispatch: null,
    lastRenderedReducer: basicStateReducer, // the reducer used by useState
    lastRenderedState: (initialState: any),
  });
  // Create the dispatch function (the state setter), binding the current fiber and the queue to it.
  // currentlyRenderingFiber is initialized to workInProgress in renderWithHooks().
  const dispatch: Dispatch<
    BasicStateAction<S>,
  > = (queue.dispatch = (dispatchAction.bind(
    null,
    currentlyRenderingFiber,
    queue,
  ): any));
  // Return the current state and the function that updates it
  return [hook.memoizedState, dispatch];
}
When the component mounts, mountState executes mountWorkInProgressHook to create a new hook object and assigns the initial state to the hook’s memoizedState and baseState properties. It then creates a new queue, creates a dispatch (the function that updates state) with the current fiber and the queue bound to it, and finally returns the state together with that dispatch function.
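One practical consequence of the typeof initialState === 'function' check is lazy initialization: passing a function means the initial value is computed only once, during mountState. A small sketch (the component and helper names are made up):

import React, { useState } from 'react';

// Hypothetical expensive computation we only want to run once
function buildInitialTodos() {
  return Array.from({ length: 1000 }, (_, i) => ({ id: i, done: false }));
}

function TodoList() {
  // The initializer function is invoked only inside mountState;
  // on re-renders updateState ignores the argument entirely.
  const [todos, setTodos] = useState(buildInitialTodos);
  return <div>{todos.length} todos</div>;
}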
3.2 mountWorkInProgressHook
In mountState, a new hook object is created using the mountWorkInProgressHook() function. Let’s see how it is created:
// packages/react-reconciler/src/ReactFiberHooks.new.js
// Create a new hook object and return it as the current workInProgressHook.
function mountWorkInProgressHook(): Hook {
  const hook: Hook = {
    memoizedState: null,
    baseState: null,
    baseQueue: null,
    queue: null,
    next: null,
  };

  // Hooks are stored as a linked list on the fiber's memoizedState field.
  if (workInProgressHook === null) {
    // This is the first hook in the list.
    // currentlyRenderingFiber is the work-in-progress fiber; it is named differently
    // to distinguish it from the work-in-progress hook.
    currentlyRenderingFiber.memoizedState = workInProgressHook = hook;
  } else {
    // Append to the end of the list.
    workInProgressHook = workInProgressHook.next = hook;
  }
  return workInProgressHook;
}
If the global workInProgressHook object does not exist yet (it is null), the new hook is assigned to workInProgressHook and also to the memoizedState property of currentlyRenderingFiber. If workInProgressHook already exists, the new hook is appended to its tail, extending the linked list.
3.3 dispatchAction
In mountState, the dispatch trigger is created with dispatchAction; this is the function that updates state. When the dispatch is created, the current Fiber node and the new queue are bound to it. Let’s focus on dispatchAction:
function dispatchAction<S, A>(
  fiber: Fiber,
  queue: UpdateQueue<S, A>,
  action: A,
) {
  const eventTime = requestEventTime();
  const lane = requestUpdateLane(fiber);

  // Create a new update. `action` is the value passed to setCount
  // (count + 1, count + 2, count + 3, ...)
  const update: Update<S, A> = {
    lane,
    action,
    eagerReducer: null,
    eagerState: null,
    next: (null: any),
  };

  const alternate = fiber.alternate;

  /**
   * Case 1: a render-phase update.
   * fiber === currentlyRenderingFiber (or its alternate) means this component is currently
   * rendering, so the update was triggered during the render phase. In that case:
   * 1. mark didScheduleRenderPhaseUpdate so it is handled separately later
   * 2. create or extend the circular updateQueue linked list
   */
  if (
    fiber === currentlyRenderingFiber ||
    (alternate !== null && alternate === currentlyRenderingFiber)
  ) {
    // If this is a re-render, didScheduleRenderPhaseUpdateDuringThisPass is set to true
    // and numberOfReRenders counts how many re-renders have happened.
    didScheduleRenderPhaseUpdateDuringThisPass = didScheduleRenderPhaseUpdate = true;

    // Create or extend the circular update list
    const pending = queue.pending;
    if (pending === null) {
      // This is the first update. Create a circular list.
      update.next = update;
    } else {
      update.next = pending.next;
      pending.next = update;
    }
    // queue.pending always points to the most recent update
    queue.pending = update;
  } else {
    if (isInterleavedUpdate(fiber, lane)) {
      // Case 2: an interleaved update (scheduled while another render is in progress)
      const interleaved = queue.interleaved;
      if (interleaved === null) {
        // This is the first update. Create a circular list.
        update.next = update;
        // At the end of the current render, this queue's interleaved updates will
        // be transferred to the pending queue.
        pushInterleavedQueue(queue);
      } else {
        update.next = interleaved.next;
        interleaved.next = update;
      }
      queue.interleaved = update;
    } else {
      // Case 3: a normal update outside the render phase
      const pending = queue.pending;
      if (pending === null) {
        // This is the first update. Create a circular list.
        update.next = update;
      } else {
        update.next = pending.next;
        pending.next = update;
      }
      queue.pending = update;

      // fiber.lanes === NoLanes means there is no pending update on the fiber
      if (
        fiber.lanes === NoLanes &&
        (alternate === null || alternate.lanes === NoLanes)
      ) {
        // The queue is currently empty, which means we can eagerly compute the
        // next state before entering the render phase. If the new state is the
        // same as the current state, we may be able to bail out entirely.
        const lastRenderedReducer = queue.lastRenderedReducer;
        if (lastRenderedReducer !== null) {
          let prevDispatcher;
          try {
            const currentState: S = (queue.lastRenderedState: any);
            const eagerState = lastRenderedReducer(currentState, action);
            // Stash the eagerly computed state, and the reducer used to compute
            // it, on the update object. If the reducer hasn't changed by the
            // time we enter the render phase, then the eager state can be used
            // without calling the reducer again.
            update.eagerReducer = lastRenderedReducer;
            update.eagerState = eagerState;
            if (is(eagerState, currentState)) {
              // Fast path. We can bail out without scheduling React to re-render.
              // It's still possible that we'll need to rebase this update later,
              // if the component re-renders for a different reason and by that
              // time the reducer has changed.
              return;
            }
          } catch (error) {
            // Suppress the error. It will throw again in the render phase.
          } finally {
            if (__DEV__) {
              ReactCurrentDispatcher.current = prevDispatcher;
            }
          }
        }
      }
    }

    // Schedule the update on the fiber. Synchronous tasks render immediately;
    // asynchronous tasks go through the scheduler.
    // https://juejin.cn/post/6898635086657224717#heading-16
    const root = scheduleUpdateOnFiber(fiber, lane, eventTime);

    if (isTransitionLane(lane) && root !== null) {
      let queueLanes = queue.lanes;
      // If any entangled lanes are no longer pending on the root, then they
      // must have finished. We can remove them from the shared queue, which
      // represents a superset of the actually pending lanes. In some cases we
      // may entangle more than we need to, but that's OK. In fact it's worse if
      // we *don't* entangle when we should.
      queueLanes = intersectLanes(queueLanes, root.pendingLanes);
      // Entangle the new transition lane with the other transition lanes.
      const newQueueLanes = mergeLanes(queueLanes, lane);
      queue.lanes = newQueueLanes;
      // Even if queue.lanes already include lane, we don't know for certain if
      // the lane finished since the last time we entangled it. So we need to
      // entangle it again, just to be sure.
      markRootEntangled(root, newQueueLanes);
    }
  }

  // Mark that a state update was scheduled (for the scheduling profiler)
  if (enableSchedulingProfiler) {
    markStateUpdateScheduled(fiber, lane);
  }
}
A queue data structure is maintained in dispatchAction:
// Create or extend the circular update list
const pending = queue.pending;
if (pending === null) {
  // This is the first update. Create a circular list.
  update.next = update;
} else {
  update.next = pending.next;
  pending.next = update;
}
// queue.pending always points to the most recent update
queue.pending = update;
Queue is a circular linked list with the following rules:
- queue.pending points to the most recent update
- queue.pending.next points to the first (oldest) update
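To see why both rules hold, here is a standalone sketch of the same insertion logic in plain JavaScript (not React source):

// Append an update to a circular singly linked list where
// queue.pending always points to the newest node.
function enqueueUpdate(queue, update) {
  const pending = queue.pending;
  if (pending === null) {
    // First update: it points to itself, so it is both newest and oldest.
    update.next = update;
  } else {
    // pending.next is the oldest update; keep it reachable from the new node,
    // then link the previous newest node to the new one.
    update.next = pending.next;
    pending.next = update;
  }
  queue.pending = update;
}

// Usage: enqueue three updates and walk the ring starting at the oldest.
const queue = { pending: null };
['A', 'B', 'C'].forEach(action => enqueueUpdate(queue, { action, next: null }));
let node = queue.pending.next; // oldest update ('A')
do {
  console.log(node.action);    // logs A, B, C in dispatch order
  node = node.next;
} while (node !== queue.pending.next);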
The first branch, fiber === currentlyRenderingFiber || (alternate !== null && alternate === currentlyRenderingFiber), means the update was dispatched while this component is currently rendering, i.e. a render-phase update. In that case didScheduleRenderPhaseUpdateDuringThisPass is set to true, and in renderWithHooks, while this flag is true, the component is re-rendered in a loop and numberOfReRenders counts the number of re-renders.
if (
  fiber === currentlyRenderingFiber ||
  (alternate !== null && alternate === currentlyRenderingFiber)
) {
  // This is a render-phase update: mark didScheduleRenderPhaseUpdate
  // so it gets handled separately later.
  didScheduleRenderPhaseUpdateDuringThisPass = didScheduleRenderPhaseUpdate = true;
}
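A render-phase update happens when a state setter is called while the component body is executing, for example when deriving state from props. A hedged illustration (the component is made up; the pattern itself is the documented "adjust state during render" pattern):

import React, { useState } from 'react';

function PriceLabel({ price }) {
  const [prevPrice, setPrevPrice] = useState(price);
  const [trend, setTrend] = useState('same');

  // Calling a setter during render hits the render-phase branch of dispatchAction:
  // React re-runs PriceLabel immediately (counting re-renders) instead of scheduling work.
  if (price !== prevPrice) {
    setPrevPrice(price);
    setTrend(price > prevPrice ? 'up' : 'down');
  }

  return <span>{trend}</span>;
}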
If the update is not a render-phase update and there is no pending update on the fiber, React eagerly computes the next state before entering the render phase. If the computed state (eagerState) is the same as the current state (currentState), it bails out without scheduling a render.
// fiber.lanes === NoLanes means there is no pending update on the fiber
if (
  fiber.lanes === NoLanes &&
  (alternate === null || alternate.lanes === NoLanes)
) {
  // The queue is currently empty, which means we can eagerly compute the
  // next state before entering the render phase. If the new state is the
  // same as the current state, we may be able to bail out entirely.
  const lastRenderedReducer = queue.lastRenderedReducer;
  if (lastRenderedReducer !== null) {
    const currentState: S = (queue.lastRenderedState: any);
    const eagerState = lastRenderedReducer(currentState, action);
    // Stash the eagerly computed state, and the reducer used to compute
    // it, on the update object. If the reducer hasn't changed by the
    // time we enter the render phase, then the eager state can be used
    // without calling the reducer again.
    update.eagerReducer = lastRenderedReducer;
    update.eagerState = eagerState;
    if (is(eagerState, currentState)) {
      // Fast path. We can bail out without scheduling React to re-render.
      // It's still possible that we'll need to rebase this update later,
      // if the component re-renders for a different reason and by that
      // time the reducer has changed.
      return;
    }
  }
}
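In practice, this eager bailout is one reason why calling a setter with the value the state already holds does not schedule a re-render, as in this illustrative sketch (names are made up):

import React, { useState } from 'react';

function Toggle() {
  const [open, setOpen] = useState(false);

  // With no pending updates, dispatchAction eagerly computes
  // basicStateReducer(false, false) === false; Object.is(eagerState, currentState)
  // is true, so clicking "close" while already closed does not re-render Toggle.
  return (
    <>
      <button onClick={() => setOpen(true)}>open</button>
      <button onClick={() => setOpen(false)}>close</button>
      {open ? <p>content</p> : null}
    </>
  );
}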
To recap the mount-phase flow: useState executes mountState, which calls mountWorkInProgressHook to create a hook and append it to the fiber’s hook list, stores the initial state, creates the update queue and a dispatch bound to the fiber and the queue, and finally returns [state, dispatch].
4. Update phase
Let’s take a look at updateState, the method that useState actually calls during the update phase.
4.1 updateState
function updateState<S>(
initialState: (() => S) | S,
): [S, Dispatch<BasicStateAction<S>>] {
return updateReducer(basicStateReducer, (initialState: any));
}
As you can see, updateState actually calls updateReducer with basicStateReducer. For update actions triggered by useState, basicStateReducer returns the result of executing the action if the action is a function; otherwise it returns the action value directly:
function basicStateReducer<S>(state: S, action: BasicStateAction<S>): S {
// $FlowFixMe: Flow doesn't like mixed types
return typeof action === 'function' ? action(state) : action;
}
Therefore, useState is just a special case of useReducer: its reducer is the built-in basicStateReducer, which only replaces (or functionally updates) the state, rather than a custom reducer that useReducer lets you pass in.
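The equivalence can be made explicit: from the caller's point of view, useState could be approximated with useReducer plus a hand-written basicStateReducer (a conceptual sketch, not how React wires it internally):

import { useReducer } from 'react';

// Same contract as React's internal basicStateReducer
function basicStateReducer(state, action) {
  return typeof action === 'function' ? action(state) : action;
}

// Roughly equivalent to useState from the caller's point of view
function useStateLike(initialState) {
  return useReducer(
    basicStateReducer,
    typeof initialState === 'function' ? initialState() : initialState,
  );
}

// const [count, setCount] = useStateLike(0);
// setCount(1);            // action is a value
// setCount(c => c + 1);   // action is a function of the previous state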
Next, let’s look at what updateReducer does.
4.2 updateReducer
function updateReducer<S, I, A>(
  reducer: (S, A) => S,
  initialArg: I,
  init?: I => S,
): [S, Dispatch<A>] {
  // Take the next hook on the work-in-progress fiber as the current hook
  const hook = updateWorkInProgressHook();
  const queue = hook.queue;
  invariant(
    queue !== null,
    'Should have a queue. This is likely a bug in React. Please file an issue.',
  );

  queue.lastRenderedReducer = reducer;

  // currentHook is a module-level variable pointing into the hook list of the current fiber
  const current: Hook = (currentHook: any);

  // The last rebase update that is NOT part of the base state.
  let baseQueue = current.baseQueue;

  // The last pending update that hasn't been processed yet.
  const pendingQueue = queue.pending;
  if (pendingQueue !== null) {
    // We have new updates that haven't been processed yet.
    // We'll add them to the base queue.
    if (baseQueue !== null) {
      // Merge the pending queue and the base queue.
      const baseFirst = baseQueue.next;
      const pendingFirst = pendingQueue.next;
      baseQueue.next = pendingFirst;
      pendingQueue.next = baseFirst;
    }
    current.baseQueue = baseQueue = pendingQueue;
    queue.pending = null;
  }

  // Process the updates on the base queue
  if (baseQueue !== null) {
    // We have a queue to process.
    const first = baseQueue.next;
    let newState = current.baseState;

    let newBaseState = null;
    let newBaseQueueFirst = null;
    let newBaseQueueLast = null;
    let update = first;
    do {
      const updateLane = update.lane;
      if (!isSubsetOfLanes(renderLanes, updateLane)) {
        /**
         * Low-priority update:
         * 1. the current update is skipped
         * 2. a copy of it is kept in the new base queue so it can be rebased later
         */
        // Priority is insufficient. Skip this update. If this is the first
        // skipped update, the previous update/state is the new base
        // update/state.
        const clone: Update<S, A> = {
          lane: updateLane,
          action: update.action,
          eagerReducer: update.eagerReducer,
          eagerState: update.eagerState,
          next: (null: any),
        };
        if (newBaseQueueLast === null) {
          newBaseQueueFirst = newBaseQueueLast = clone;
          newBaseState = newState;
        } else {
          newBaseQueueLast = newBaseQueueLast.next = clone;
        }
        // Update the remaining priority in the queue.
        // TODO: Don't need to accumulate this. Instead, we can remove
        // renderLanes from the original lanes.
        currentlyRenderingFiber.lanes = mergeLanes(
          currentlyRenderingFiber.lanes,
          updateLane,
        );
        markSkippedUpdateLanes(updateLane);
      } else {
        /**
         * High-priority update: apply it, and if anything was skipped before it,
         * also keep a clone of it in the new base queue.
         */
        // This update does have sufficient priority.
        if (newBaseQueueLast !== null) {
          const clone: Update<S, A> = {
            // This update is going to be committed so we never want to uncommit
            // it. Using NoLane works because 0 is a subset of all bitmasks, so
            // this will never be skipped by the check above.
            lane: NoLane,
            action: update.action,
            eagerReducer: update.eagerReducer,
            eagerState: update.eagerState,
            next: (null: any),
          };
          newBaseQueueLast = newBaseQueueLast.next = clone;
        }

        // Process this update.
        if (update.eagerReducer === reducer) {
          // If this update was processed eagerly, and its reducer matches the
          // current reducer, we can use the eagerly computed state.
          newState = ((update.eagerState: any): S);
        } else {
          const action = update.action;
          newState = reducer(newState, action);
        }
      }
      // Move to the next update in the list
      update = update.next;
    } while (update !== null && update !== first);

    if (newBaseQueueLast === null) {
      newBaseState = newState;
    } else {
      newBaseQueueLast.next = (newBaseQueueFirst: any);
    }

    // Mark that the fiber performed work, but only if the new state is
    // different from the current state.
    if (!is(newState, hook.memoizedState)) {
      markWorkInProgressReceivedUpdate();
    }

    // After the queue has been processed, store the new state and the new base queue on the hook
    hook.memoizedState = newState;
    hook.baseState = newBaseState;
    hook.baseQueue = newBaseQueueLast;

    queue.lastRenderedState = newState;
  }

  // Interleaved updates are stored on a separate queue. We aren't going to
  // process them during this render, but we do need to track which lanes
  // are remaining.
  const lastInterleaved = queue.interleaved;
  if (lastInterleaved !== null) {
    let interleaved = lastInterleaved;
    do {
      const interleavedLane = interleaved.lane;
      currentlyRenderingFiber.lanes = mergeLanes(
        currentlyRenderingFiber.lanes,
        interleavedLane,
      );
      markSkippedUpdateLanes(interleavedLane);
      interleaved = ((interleaved: any).next: Update<S, A>);
    } while (interleaved !== lastInterleaved);
  } else if (baseQueue === null) {
    // `queue.lanes` is used for entangling transitions. We can set it back to
    // zero once the queue is empty.
    queue.lanes = NoLanes;
  }

  const dispatch: Dispatch<A> = (queue.dispatch: any);
  // Corresponds to const [state, dispatch] = useReducer(reducer, initialArg, init)
  return [hook.memoizedState, dispatch];
}
updateReducer handles three cases:

- New pending updates that have not yet been processed are merged into the base queue.
- Updates on the base queue are processed according to their priority to compute the new state.
- Interleaved updates are not processed during this render; their lanes are only marked and tracked.
Let’s focus on the second case of the updateReducer:
// Process the updates on the base queue
if (baseQueue !== null) {
  // We have a queue to process.
  const first = baseQueue.next;
  let newState = current.baseState;

  let newBaseState = null;
  let newBaseQueueFirst = null;
  let newBaseQueueLast = null;
  let update = first;
  do {
    const updateLane = update.lane;
    if (!isSubsetOfLanes(renderLanes, updateLane)) {
      // Low-priority update: skip it and keep a clone in the new base queue
      const clone: Update<S, A> = {
        lane: updateLane,
        action: update.action,
        eagerReducer: update.eagerReducer,
        eagerState: update.eagerState,
        next: (null: any),
      };
      if (newBaseQueueLast === null) {
        newBaseQueueFirst = newBaseQueueLast = clone;
        newBaseState = newState;
      } else {
        newBaseQueueLast = newBaseQueueLast.next = clone;
      }
      // Keep track of the skipped lanes on the fiber
      currentlyRenderingFiber.lanes = mergeLanes(
        currentlyRenderingFiber.lanes,
        updateLane,
      );
      markSkippedUpdateLanes(updateLane);
    } else {
      // High-priority update: if anything was skipped before it, clone it into
      // the new base queue (with NoLane so it is never skipped on rebase)
      if (newBaseQueueLast !== null) {
        const clone: Update<S, A> = {
          lane: NoLane,
          action: update.action,
          eagerReducer: update.eagerReducer,
          eagerState: update.eagerState,
          next: (null: any),
        };
        newBaseQueueLast = newBaseQueueLast.next = clone;
      }
      // Apply the update: reuse the eagerly computed state if the reducer matches
      if (update.eagerReducer === reducer) {
        newState = ((update.eagerState: any): S);
      } else {
        const action = update.action;
        newState = reducer(newState, action);
      }
    }
    update = update.next;
  } while (update !== null && update !== first);

  if (newBaseQueueLast === null) {
    newBaseState = newState;
  } else {
    newBaseQueueLast.next = (newBaseQueueFirst: any);
  }

  // Mark that the fiber performed work, but only if the new state is
  // different from the current state.
  if (!is(newState, hook.memoizedState)) {
    markWorkInProgressReceivedUpdate();
  }

  // Store the results back on the hook and the queue
  hook.memoizedState = newState;
  hook.baseState = newBaseState;
  hook.baseQueue = newBaseQueueLast;

  queue.lastRenderedState = newState;
}
Updates on the base queue are handled differently depending on their priority (a worked example follows the code below).
- For a low-priority update:
  - a copy of the update is appended to the end of the new base queue
  - the new state is not computed from it
if (!isSubsetOfLanes(renderLanes, updateLane)) {
  /**
   * Low-priority update:
   * 1. the current update is skipped
   * 2. a copy of the current update is saved into the new base queue
   */
  // Priority is insufficient. Skip this update. If this is the first
  // skipped update, the previous update/state is the new base
  // update/state.
  const clone: Update<S, A> = {
    lane: updateLane,
    action: update.action,
    eagerReducer: update.eagerReducer,
    eagerState: update.eagerState,
    next: (null: any),
  };
  if (newBaseQueueLast === null) {
    newBaseQueueFirst = newBaseQueueLast = clone;
    newBaseState = newState;
  } else {
    newBaseQueueLast = newBaseQueueLast.next = clone;
  }
  // Update the remaining priority in the queue.
  // TODO: Don't need to accumulate this. Instead, we can remove
  // renderLanes from the original lanes.
  currentlyRenderingFiber.lanes = mergeLanes(
    currentlyRenderingFiber.lanes,
    updateLane,
  );
  markSkippedUpdateLanes(updateLane);
}
- For a high-priority update:
  - if any earlier update was skipped, a copy of the current update is appended to the end of the new base queue
  - the reducer is called to compute the new state
} else {
  /**
   * High-priority update:
   * 1. if any earlier update was skipped, keep a copy of this update in the new base queue
   * 2. call the reducer (or reuse the eagerly computed state) to compute the new state
   */
  // This update does have sufficient priority.
  if (newBaseQueueLast !== null) {
    const clone: Update<S, A> = {
      // This update is going to be committed so we never want to uncommit
      // it. Using NoLane works because 0 is a subset of all bitmasks, so
      // this will never be skipped by the check above.
      lane: NoLane,
      action: update.action,
      eagerReducer: update.eagerReducer,
      eagerState: update.eagerState,
      next: (null: any),
    };
    newBaseQueueLast = newBaseQueueLast.next = clone;
  }

  // Process this update.
  if (update.eagerReducer === reducer) {
    // If this update was processed eagerly, and its reducer matches the
    // current reducer, we can use the eagerly computed state.
    newState = ((update.eagerState: any): S);
  } else {
    const action = update.action;
    newState = reducer(newState, action);
  }
}
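To make the skipping-and-rebasing behavior concrete, here is a small standalone simulation of the loop above, with invented lane values and actions (plain JavaScript, not React source):

// Simplified simulation of the base-queue loop (not React source).
// Two queued updates on a counter whose base state is 0:
//   A: s => s + 1,  low-priority lane 0b10 (skipped in this render)
//   B: s => s + 10, high-priority lane 0b01 (processed in this render)
const renderLanes = 0b01;
const isSubsetOfLanes = (set, subset) => (set & subset) === subset;
const reducer = (state, action) =>
  typeof action === 'function' ? action(state) : action; // basicStateReducer

const baseState = 0;
const updates = [
  { lane: 0b10, action: s => s + 1 },  // A
  { lane: 0b01, action: s => s + 10 }, // B
];

let newState = baseState;
let newBaseState = null;
let newBaseQueue = [];
for (const update of updates) {
  if (!isSubsetOfLanes(renderLanes, update.lane)) {
    // A is skipped: remember the state before it and keep it for a later render
    if (newBaseQueue.length === 0) newBaseState = newState;
    newBaseQueue.push(update);
  } else {
    // B is applied; because A was skipped, B is also cloned (lane 0 = NoLane)
    if (newBaseQueue.length > 0) newBaseQueue.push({ ...update, lane: 0 });
    newState = reducer(newState, update.action);
  }
}

console.log(newState);     // 10 -> memoizedState rendered now
console.log(newBaseState); // 0  -> baseState kept for rebasing

For this render, memoizedState is 10 while baseState stays at 0, and both A and the NoLane clone of B remain in the base queue; once the low-priority lane is processed, the state converges to 11 and no update is lost.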
4.3 updateWorkInProgressHook
In updateReducer, the current hook is obtained through updateWorkInProgressHook(). Let’s look at the implementation of updateWorkInProgressHook:
function updateWorkInProgressHook(): Hook {
  // This function is used both for updates and for re-renders triggered by a
  // render phase update. It assumes there is either a current hook we can
  // clone, or a work-in-progress hook from a previous render pass that we can
  // use as a base. When we reach the end of the base list, we must switch to
  // the dispatcher used for mounts.

  // Determine the next hook on the current fiber
  let nextCurrentHook: null | Hook;
  if (currentHook === null) {
    const current = currentlyRenderingFiber.alternate;
    if (current !== null) {
      nextCurrentHook = current.memoizedState;
    } else {
      nextCurrentHook = null;
    }
  } else {
    nextCurrentHook = currentHook.next;
  }

  // Determine the next work-in-progress hook
  let nextWorkInProgressHook: null | Hook;
  if (workInProgressHook === null) {
    nextWorkInProgressHook = currentlyRenderingFiber.memoizedState;
  } else {
    nextWorkInProgressHook = workInProgressHook.next;
  }

  if (nextWorkInProgressHook !== null) {
    // There's already a work-in-progress. Reuse it.
    workInProgressHook = nextWorkInProgressHook;
    nextWorkInProgressHook = workInProgressHook.next;

    currentHook = nextCurrentHook;
  } else {
    // Clone from the current hook to use as the workInProgressHook.
    invariant(
      nextCurrentHook !== null,
      'Rendered more hooks than during the previous render.',
    );
    currentHook = nextCurrentHook;

    const newHook: Hook = {
      memoizedState: currentHook.memoizedState,
      baseState: currentHook.baseState,
      baseQueue: currentHook.baseQueue,
      queue: currentHook.queue,
      next: null,
    };

    if (workInProgressHook === null) {
      // This is the first hook in the list.
      currentlyRenderingFiber.memoizedState = workInProgressHook = newHook;
    } else {
      // Append to the end of the list.
      workInProgressHook = workInProgressHook.next = newHook;
    }
  }
  return workInProgressHook;
}
There are two cases:
- During a normal update render, the next hook on the current fiber is cloned to become the new workInProgressHook, appended to the work-in-progress list, and returned;
- During a re-render (triggered by a render-phase update), the work-in-progress hook list from the current pass already exists, so the existing workInProgressHook is reused and returned.
This is the end of the useState analysis, and finally we summarize the useState execution process:
5. Summary
When a function component renders, renderWithHooks receives the current workInProgress Fiber node and, based on whether a current fiber (with hook state) already exists, determines whether the component is in the mount or the update phase. It then points ReactCurrentDispatcher at the dispatcher for that phase and executes the function component itself to get its children.
While the component executes, the hooks for the corresponding phase run. The hooks of a function component form a singly linked list stored on the memoizedState property of the Fiber node, and the next hook object is reached through the next pointer (hook.next). On each hook’s queue, queue.pending points to the most recent update and queue.pending.next points to the first update.
During mount, useState executes mountWorkInProgressHook to create a new hook object and returns the initial state together with the dispatch that triggers actions. During update, useState executes updateState (which is actually updateReducer) to process the updates in the queue and obtain the latest state.
Calling dispatch triggers an action and schedules an update task: dispatchAction appends the update to the circular queue (and may eagerly compute the next state), then scheduleUpdateOnFiber enters scheduling. React eventually runs renderWithHooks again, useState executes updateState (actually updateReducer), returns the new state value, and the component re-renders with it.