First, let's review the Hook principle (see the overview article). The main points are:

- The fiber node of a function component is handled by updateFunctionComponent, which in turn calls the function component through renderWithHooks.
- Inside the function component, Hook APIs (such as useState, useEffect) create Hook objects.
  - State Hooks implement state persistence (equivalent to class component state) by maintaining fiber.memoizedState.
  - Effect Hooks maintain fiber.flags and provide side-effect callbacks (similar to class component lifecycle callbacks).
- Multiple Hook objects form a linked list mounted on fiber.memoizedState (a minimal sketch follows this list).
- During the fiber tree update phase, all Hook objects on current.memoizedState are cloned, in order, onto workInProgress.memoizedState, achieving data persistence.
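To make that chain concrete, here is a minimal sketch (my own example, not from the article or the React source) of a component whose three Hook calls produce a three-node list:

```js
import { useState, useEffect } from 'react';

// Each Hook API call creates one Hook object; the objects are linked through
// hook.next, and the head of the list is stored on fiber.memoizedState.
function Counter() {
  const [a] = useState(0);   // -> hook1 (state hook)
  const [b] = useState('x'); // -> hook2 (state hook)
  useEffect(() => {});       // -> hook3 (effect hook)
  return a + b;
}
// fiber.memoizedState -> hook1 -> hook2 -> hook3 -> null
export default Counter;
```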
On this basis, this section will deeply analyze the characteristics and implementation principles of state Hook.
Create a Hook
In the initial construction (mount) phase of the fiber tree, useState corresponds to mountState in the source code, and useReducer corresponds to mountReducer.
mountState:
```js
function mountState<S>(
  initialState: (() => S) | S,
): [S, Dispatch<BasicStateAction<S>>] {
  // 1. Create the hook
  const hook = mountWorkInProgressHook();
  if (typeof initialState === 'function') {
    initialState = initialState();
  }
  // 2. Initialize the hook's properties
  // 2.1 Set hook.memoizedState / hook.baseState
  // 2.2 Set hook.queue
  hook.memoizedState = hook.baseState = initialState;
  const queue = (hook.queue = {
    pending: null,
    dispatch: null,
    // queue.lastRenderedReducer is a built-in function
    lastRenderedReducer: basicStateReducer,
    lastRenderedState: (initialState: any),
  });
  // 2.3 Set hook.dispatch
  const dispatch: Dispatch<
    BasicStateAction<S>,
  > = (queue.dispatch = (dispatchAction.bind(
    null,
    currentlyRenderingFiber,
    queue,
  ): any));
  // 3. Return [current state, dispatch function]
  return [hook.memoizedState, dispatch];
}
```
mountReducer:
```js
function mountReducer<S, I, A>(
  reducer: (S, A) => S,
  initialArg: I,
  init?: I => S,
): [S, Dispatch<A>] {
  // 1. Create the hook
  const hook = mountWorkInProgressHook();
  let initialState;
  if (init !== undefined) {
    initialState = init(initialArg);
  } else {
    initialState = ((initialArg: any): S);
  }
  // 2. Initialize the hook's properties
  // 2.1 Set hook.memoizedState / hook.baseState
  hook.memoizedState = hook.baseState = initialState;
  // 2.2 Set hook.queue
  const queue = (hook.queue = {
    pending: null,
    dispatch: null,
    // queue.lastRenderedReducer is passed in from outside
    lastRenderedReducer: reducer,
    lastRenderedState: (initialState: any),
  });
  // 2.3 Set hook.dispatch
  const dispatch: Dispatch<A> = (queue.dispatch = (dispatchAction.bind(
    null,
    currentlyRenderingFiber,
    queue,
  ): any));
  // 3. Return [current state, dispatch function]
  return [hook.memoizedState, dispatch];
}
```
mountState and mountReducer share simple logic: they create the hook, initialize its properties, and finally return [current state, dispatch function]. The only difference between them is hook.queue.lastRenderedReducer:
- mountState uses the built-in basicStateReducer:

```js
function basicStateReducer<S>(state: S, action: BasicStateAction<S>): S {
  return typeof action === 'function' ? action(state) : action;
}
```

- mountReducer uses the custom reducer passed in from outside.
It can be seen that mountState is a special case of mountReducer, that is, useState is also a special case of useReducer, and is also the simplest case.
useState can be converted to useReducer:
```js
const [state, dispatch] = useState({ count: 0 });

// equivalent to
const [state, dispatch] = useReducer(
  function basicStateReducer(state, action) {
    return typeof action === 'function' ? action(state) : action;
  },
  { count: 0 },
);

// When the state needs to be updated, there are two ways
dispatch({ count: 1 }); // 1. Set the value directly
dispatch(state => ({ count: state.count + 1 })); // 2. Set it via a callback function
```
An example of useReducer from the official docs:
```js
const [state, dispatch] = useReducer(
  function reducer(state, action) {
    switch (action.type) {
      case 'increment':
        return { count: state.count + 1 };
      case 'decrement':
        return { count: state.count - 1 };
      default:
        throw new Error();
    }
  },
  { count: 0 },
);

// There is only one way to update the state
dispatch({ type: 'decrement' });
```
useState is a simplified encapsulation of useReducer with a special built-in reducer (the following discussion uses useState as an example). After the hook is created, [hook.memoizedState, dispatch] is returned, and calling dispatch ultimately runs the reducer to update the state.
State initialization
In `hook.memoizedState = hook.baseState = initialState;`, the initial state is saved to both hook.baseState and hook.memoizedState:
- hook.memoizedState: the current state.
- hook.baseState: the base state, used as the starting point when merging hook.baseQueue (described below).
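For reference, this is roughly how the Hook and UpdateQueue types are declared in ReactFiberHooks (a simplified sketch; exact field types may differ slightly between React versions):

```js
type Hook = {|
  memoizedState: any,                  // current state
  baseState: any,                      // base state used when merging baseQueue
  baseQueue: Update<any, any> | null,  // updates skipped in a previous render
  queue: UpdateQueue<any, any> | null, // update queue of this hook
  next: Hook | null,                   // next hook in the linked list
|};

type UpdateQueue<S, A> = {|
  pending: Update<S, A> | null,
  dispatch: (A => mixed) | null,
  lastRenderedReducer: ((S, A) => S) | null,
  lastRenderedState: S | null,
|};
```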
Finally [hook.memoizedState, dispatch] is returned, so what the function component actually uses is hook.memoizedState.
State update
Consider the following code:
```js
import { useState } from 'react';

export default function App() {
  const [count, dispatch] = useState(0);
  return (
    <button
      onClick={() => {
        dispatch(1);
        dispatch(3);
        dispatch(2);
      }}
    >
      {count}
    </button>
  );
}
```
After the first render, count = 0, and the in-memory structure of the hook object is as follows:
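The original article presents this memory state as a diagram; as an approximation in my own notation, the single state hook of `<App/>` looks roughly like this:

```js
// Approximate snapshot after the first render (my own notation):
// fiber.memoizedState points at this single state hook.
const hookSnapshot = {
  memoizedState: 0, // current state
  baseState: 0,
  baseQueue: null,
  queue: {
    pending: null,
    dispatch: '[dispatchAction bound to (fiber, queue)]',
    lastRenderedReducer: '[basicStateReducer]',
    lastRenderedState: 0,
  },
  next: null, // the component has only one hook
};
```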
Clicking the button triggers an update through the dispatch function, which is actually dispatchAction:
```js
function dispatchAction<S, A>(
  fiber: Fiber,
  queue: UpdateQueue<S, A>,
  action: A,
) {
  // 1. Create an update object
  const eventTime = requestEventTime();
  const lane = requestUpdateLane(fiber); // Legacy mode returns SyncLane
  const update: Update<S, A> = {
    lane,
    action,
    eagerReducer: null,
    eagerState: null,
    next: (null: any),
  };

  // 2. Add the update object to the hook.queue.pending queue
  const pending = queue.pending;
  if (pending === null) {
    // The first update creates a circular linked list
    update.next = update;
  } else {
    update.next = pending.next;
    pending.next = update;
  }
  queue.pending = update;

  const alternate = fiber.alternate;
  if (
    fiber === currentlyRenderingFiber ||
    (alternate !== null && alternate === currentlyRenderingFiber)
  ) {
    // Render-phase update: set the global flags
    didScheduleRenderPhaseUpdateDuringThisPass = didScheduleRenderPhaseUpdate = true;
  } else {
    // ... performance optimization part omitted (discussed below)

    // 3. Initiate a scheduled update and enter the input phase of the reconciler operation process
    scheduleUpdateOnFiber(fiber, lane, eventTime);
  }
}
```
The logic is clear:
- Create an update object, in which update.lane represents its priority (see the update priority section in Fiber Tree Construction (Basic Preparation)).
- Add the update object to the hook.queue.pending circular linked list.
  - Circular linked list feature: appending a new element and getting the first element are both O(1), so the pending pointer always points to the last element of the list (see the sketch after this list).
  - For more on linked-list usage, see React Algorithms: Linked List Operations.
- Initiate a scheduled update: call scheduleUpdateOnFiber and enter the input phase of the reconciler operation process.
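A minimal standalone sketch of that circular-queue trick (my own code, not the React source) shows why both operations are O(1):

```js
// 'pending' points to the LAST node of a circular list, so pending.next is the
// first node. Both appending a node and reading the first node are O(1).
function enqueue(pending, node) {
  if (pending === null) {
    node.next = node;         // the first node points to itself
  } else {
    node.next = pending.next; // new node points to the first node
    pending.next = node;      // old last node points to the new node
  }
  return node;                // the new node becomes 'pending' (the last node)
}

let pending = null;
pending = enqueue(pending, { action: 1 });
pending = enqueue(pending, { action: 3 });
pending = enqueue(pending, { action: 2 });
console.log(pending.action);      // 2 (last element)
console.log(pending.next.action); // 1 (first element)
```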
From the call to scheduleUpdateOnFiber onward, execution enters the react-reconciler package; the complete logic is covered in the Reconciler Operation Process article. This section only discusses the logic related to state hooks.
Note: although 3 dispatch calls are made in this example, only one render is actually performed, because the resulting scheduling requests are batched by the scheduler's throttling optimization.
During the comparison (update) phase of fiber tree construction, the function component is called again; this time useState corresponds to updateState in the source code:
```js
function updateState<S>(
  initialState: (() => S) | S,
): [S, Dispatch<BasicStateAction<S>>] {
  return updateReducer(basicStateReducer, (initialState: any));
}
```
So it actually calls updateReducer.
Before updateReducer is executed, the memory structure associated with the hook is as follows:
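The original article shows a diagram here; in my own notation, the relevant structure at this point is approximately:

```js
// Approximate state before updateReducer runs (my own notation): the three
// dispatch calls queued update(1), update(3), update(2) on queue.pending,
// with the pending pointer at the last update.
const currentHook = {
  memoizedState: 0,
  baseState: 0,
  baseQueue: null,
  queue: {
    pending: 'update(2) -> update(1) -> update(3) -> back to update(2)', // circular
    lastRenderedReducer: '[basicStateReducer]',
    lastRenderedState: 0,
  },
  next: null,
};
```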
```js
function updateReducer<S, I, A>(
  reducer: (S, A) => S,
  initialArg: I,
  init?: I => S,
): [S, Dispatch<A>] {
  // 1. Obtain the workInProgressHook object
  const hook = updateWorkInProgressHook();
  const queue = hook.queue;
  queue.lastRenderedReducer = reducer;
  const current: Hook = (currentHook: any);
  let baseQueue = current.baseQueue;

  // 2. Linked list splicing: concatenate hook.queue.pending onto current.baseQueue
  const pendingQueue = queue.pending;
  if (pendingQueue !== null) {
    if (baseQueue !== null) {
      const baseFirst = baseQueue.next;
      const pendingFirst = pendingQueue.next;
      baseQueue.next = pendingFirst;
      pendingQueue.next = baseFirst;
    }
    current.baseQueue = baseQueue = pendingQueue;
    queue.pending = null;
  }

  // 3. State calculation
  if (baseQueue !== null) {
    const first = baseQueue.next;
    let newState = current.baseState;

    let newBaseState = null;
    let newBaseQueueFirst = null;
    let newBaseQueueLast = null;
    let update = first;
    do {
      const updateLane = update.lane;
      // 3.1 Extract updates according to priority
      if (!isSubsetOfLanes(renderLanes, updateLane)) {
        // Insufficient priority: add to the new baseQueue and wait for the next render
        const clone: Update<S, A> = {
          lane: updateLane,
          action: update.action,
          eagerReducer: update.eagerReducer,
          eagerState: update.eagerState,
          next: (null: any),
        };
        if (newBaseQueueLast === null) {
          newBaseQueueFirst = newBaseQueueLast = clone;
          newBaseState = newState;
        } else {
          newBaseQueueLast = newBaseQueueLast.next = clone;
        }
        currentlyRenderingFiber.lanes = mergeLanes(
          currentlyRenderingFiber.lanes,
          updateLane,
        );
        markSkippedUpdateLanes(updateLane);
      } else {
        // Sufficient priority: merge into the state
        if (newBaseQueueLast !== null) {
          // Update the new baseQueue
          const clone: Update<S, A> = {
            lane: NoLane,
            action: update.action,
            eagerReducer: update.eagerReducer,
            eagerState: update.eagerState,
            next: (null: any),
          };
          newBaseQueueLast = newBaseQueueLast.next = clone;
        }
        if (update.eagerReducer === reducer) {
          // Performance optimization: if update.eagerReducer matches, use
          // update.eagerState directly and avoid calling the reducer again
          newState = ((update.eagerState: any): S);
        } else {
          const action = update.action;
          // Call the reducer to obtain the latest state
          newState = reducer(newState, action);
        }
      }
      update = update.next;
    } while (update !== null && update !== first);

    // 3.2 Update the hook's properties
    if (newBaseQueueLast === null) {
      newBaseState = newState;
    } else {
      newBaseQueueLast.next = (newBaseQueueFirst: any);
    }
    if (!is(newState, hook.memoizedState)) {
      markWorkInProgressReceivedUpdate();
    }
    // Save the result onto the workInProgressHook
    hook.memoizedState = newState;
    hook.baseState = newBaseState;
    hook.baseQueue = newBaseQueueLast;
    queue.lastRenderedState = newState;
  }

  const dispatch: Dispatch<A> = (queue.dispatch: any);
  return [hook.memoizedState, dispatch];
}
```
updateReducer is a relatively long function, but its logic is clear:
- Call updateWorkInProgressHook to get the workInProgressHook object.
- Linked list splicing: concatenate hook.queue.pending onto current.baseQueue.
- State calculation:
  - Updates with insufficient priority are added to the new baseQueue and wait for the next render.
  - Updates with sufficient priority are merged into the state.
  - Finally, the hook's properties are updated.
Performance optimization
Inside dispatchAction, before calling scheduleUpdateOnFiber, a performance optimization is applied to the update object:
- queue.pending contains only the current update, i.e. the current update is the first update in queue.pending.
- Call queue.lastRenderedReducer directly to compute the state after this update, called eagerState.
- If eagerState is the same as currentState, return directly without initiating a scheduled update.
- The update already mounted on queue.pending will still be merged the next time a render happens.
```js
function dispatchAction<S, A>(
  fiber: Fiber,
  queue: UpdateQueue<S, A>,
  action: A,
) {
  // ... irrelevant code omitted, only the performance optimization part is kept:

  // The following if-check guarantees that the update created here is the first
  // `update` in `queue.pending`. Why? Because if `fiber.lanes` and
  // `alternate.lanes` are both empty, this fiber has no other pending update.
  if (
    fiber.lanes === NoLanes &&
    (alternate === null || alternate.lanes === NoLanes)
  ) {
    const lastRenderedReducer = queue.lastRenderedReducer;
    if (lastRenderedReducer !== null) {
      let prevDispatcher;
      const currentState: S = (queue.lastRenderedState: any);
      const eagerState = lastRenderedReducer(currentState, action);
      // Temporarily save eagerReducer and eagerState. If reducer === update.eagerReducer
      // during the render phase, eagerState can be reused without recomputing.
      update.eagerReducer = lastRenderedReducer;
      update.eagerState = eagerState;
      if (is(eagerState, currentState)) {
        // Fast path: eagerState is the same as currentState, no need to schedule an update.
        // Note: the update has been added to queue.pending and is not dropped;
        // it will still take effect when a later update triggers a render.
        return;
      }
    }
  }
  // Initiate a scheduled update and enter the input phase of the reconciler operation process
  scheduleUpdateOnFiber(fiber, lane, eventTime);
}
```
To verify the above optimization, check out this demo:
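The original article embeds a live demo at this point; a minimal sketch of such a demo (the component name and logging are my own) might look like this:

```js
import { useState } from 'react';

// Clicking the button dispatches the current value again. Because eagerState
// equals currentState, dispatchAction returns early and no re-render is
// scheduled, so the console only logs the initial render.
export default function EagerStateDemo() {
  const [count, dispatch] = useState(0);
  console.log('render, count =', count);
  return <button onClick={() => dispatch(count)}>{count}</button>;
}
```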
Asynchronous update
The examples above all run in legacy mode, so all updates are synchronous: the update objects are merged in full, and hook.baseQueue / hook.baseState play no substantial role.
Although concurrent mode is not exposed in the v17.x release, the upcoming v18.x release will enable it fully, so this section previews the logic of merging update objects asynchronously and, along the way, deepens the understanding of hook.baseQueue and hook.baseState.
Suppose there is a queue.pending linked list whose updates have different priorities: green represents high priority, gray represents low priority, and red represents the highest priority.
Before updateReducer executes, hook.memoizedState has the following structure (update3 and update4 are low priority):
- Linked list splicing:
  - Same as in the synchronous update: queue.pending is spliced directly onto current.baseQueue.
- State calculation:
  - Only update1 and update2, the two high-priority updates, can be extracted, so in the end memoizedState = 2.
  - The remaining low-priority updates are kept and wait for the next render.
  - Starting from the first low-priority update (update3), every subsequent update is added to baseQueue. Since update2 has already been applied with high priority, it is cloned with update2.lane = NoLane, raising its priority to the highest (red).
  - baseState represents the state before the first low-priority update (update3); in this case baseState = 1.

A rough text sketch of the resulting hook follows this list.
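The original article shows the result as a diagram; the following is an approximate snapshot, in my own notation, of the hook after this high-priority render:

```js
// Approximate snapshot (my own notation, not React source) of the hook after
// the high-priority render described above:
const hookAfterHighPriorityRender = {
  memoizedState: 2, // update1 and update2 have been applied
  baseState: 1,     // the state just before the first skipped update (update3)
  // baseQueue keeps update3 and update4 (still low priority) plus a clone of
  // update2 whose lane is NoLane, so it will be re-applied on the next render.
  baseQueue: ['update3 (low)', "update2' (lane = NoLane)", 'update4 (low)'],
  queue: { pending: null },
};
```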
After a period of time, the low-priority update3 and update4 meet the render priority; updateReducer executes again and repeats the previous steps.
- Linked list splicing:
  - Since queue.pending = null, there is no substantial change before and after splicing.
- State calculation:
  - Now every update.lane satisfies the render priority, so the final memory structure is the same as with a synchronous update (memoizedState = 4, baseState = 4).
Conclusion: although the updates in the list have different priorities and several intermediate renders may occur, the final result equals merging the whole update list in order.
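As a conceptual check (my own standalone sketch, not React source), the final state is equivalent to folding the update actions in queue order with the reducer:

```js
// Folding the update actions in queue order with the built-in basicStateReducer
// yields the same final state, no matter how many intermediate renders happened.
function basicStateReducer(state, action) {
  return typeof action === 'function' ? action(state) : action;
}

const actions = [1, 3, 2, 4]; // the update actions in queue order (asynchronous example above)
const finalState = actions.reduce(basicStateReducer, 0);
console.log(finalState); // 4
```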
Summary
This section analyzed the internals of the state Hook (useState / useReducer) in depth, explained how update objects are merged in both synchronous and asynchronous updates, and showed that the final result is stored in hook.memoizedState and supplied to the function component.
Closing words
This article belongs to the state-management part of the Illustrated React Source Code series, which contains nearly 20 articles written to truly understand the React source code and thereby improve architecture and coding skills.
The first draft of the illustrated series has been completed and will continue to be updated throughout August. If there are any errors in the article, they will be corrected on GitHub as soon as possible.