This article was originally published at github.com/bigo-fronte… Follows and reposts are welcome.
A Chat About React Fiber and Time Slicing
The React idea and the advent of Fiber
The idea behind React is:
React is, in our opinion, the premier way to build big, fast Web apps with JavaScript.
However, we sometimes have a very long, deeply nested DOM list (with no list virtualization). When setState triggers a create or update, React compares the nodes before and after the change (the reconciliation stage). This comparison cannot be interrupted, and the page's main thread handles not only JS execution but also style calculation and the reflow/repaint needed for rendering. So when reconciliation (a JS task) occupies the main thread for too long, it delays normal browser reflow/repaint as well as normal user interaction (typing, clicking, selecting, etc.).
To take an extreme example, we have a very deep list (1500 levels) that changes frequently:
import React, { useState, useEffect } from 'react';

// Handle for the requestAnimationFrame loop driven by changeRandom
let raf;

function App() {
  const [randomArray, setRandomArray] = useState(
    Array.from({ length: 1500 }, () => Math.random())
  );
  useEffect(() => {
    changeRandom();
  }, []);
  const changeRandom = () => {
    setRandomArray(randomSort(randomArray));
    cancelAnimationFrame(raf);
    raf = requestAnimationFrame(changeRandom);
  };
  // Build a 1500-level nested list of <div>s, one level per array item
  const finalList = randomArray.reduce((acc, cur) => {
    acc = (
      <div key={cur} style={{ color: randomColor() }}>
        {cur} {acc}
      </div>
    );
    return acc;
  }, <div></div>);
  return (
    <div>
      <section>{finalList}</section>
    </div>
  );
}
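The demo relies on two helper functions, randomSort and randomColor, that the article does not show. Here is a minimal sketch of what they might look like; these are my own stand-ins, not the original implementations:

// Hypothetical helpers assumed by the demo above (not from the original article).
function randomSort(arr) {
  // Return a shuffled copy so setState sees a new array reference.
  return [...arr].sort(() => Math.random() - 0.5);
}

function randomColor() {
  // Random hex color, e.g. "#a3f2c4".
  return `#${Math.floor(Math.random() * 0xffffff).toString(16).padStart(6, '0')}`;
}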
From the performance panel, the changeRandom call alone takes 161ms of JS execution. Looking at the event loop, changeRandom calls setState and enters the reconciliation phase; because the list is so deep and the process cannot be interrupted, it runs far too long and blocks other tasks such as keyboard input, style calculation, reflow and repaint:
Viewed at the level of a browser frame, when the reconciliation task above runs for too long, the view cannot be updated within the 16.6ms budget of a normal refresh rate (a 161ms task spans roughly ten frames), which causes dropped frames.
So, in keeping with React's philosophy, to make the reconciliation process interruptible and to enable other experimental features such as Concurrent Mode and priority-based scheduling, React decided to rewrite its underlying implementation around Fiber.
Fiber data structure and Fiber tree construction
Again, pick any rendered DOM element: in the Elements panel, right-click the corresponding DOM node and choose "Store as global variable", switch to the Console, and type temp1.__reactInternalInstance$mszvvg3x40p (autocomplete will suggest the full key). You can then see the Fiber information corresponding to the current node.
The main fields of the Fiber data structure are:
- Tag: indicates the type of the Fiber node
export const FunctionComponent = 0; // Fiber for a function component
export const ClassComponent = 1;
export const IndeterminateComponent = 2; // Before we know whether it is function or class
export const HostRoot = 3; // Root Fiber
export const HostPortal = 4; // A subtree. Could be an entry point to a different renderer.
export const HostComponent = 5; // Fiber for a DOM element node: div, section...
export const HostText = 6; // Text node
export const Fragment = 7;
export const Mode = 8;
export const ContextConsumer = 9;
export const ContextProvider = 10;
export const ForwardRef = 11;
export const Profiler = 12;
export const SuspenseComponent = 13;
export const MemoComponent = 14;
export const SimpleMemoComponent = 15;
export const LazyComponent = 16;
export const IncompleteClassComponent = 17;
export const DehydratedSuspenseComponent = 18;
export const EventComponent = 19;
export const EventTarget = 20;
export const SuspenseListComponent = 21;
- Type: for a HostComponent Fiber, the DOM element's tag name (e.g. 'div'); for a component Fiber, the component's class or function
- StateNode: Points to the real DOM object that was created
- Return, Child and Sibling: point to the current Fiber's parent Fiber, first child Fiber, and next sibling Fiber, respectively
- Alternate: used for double buffering; the current tree and the work-in-progress tree point at each other (see the sketch after this list):
current.alternate = workInProgress
workInProgress.alternate = current
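Putting these fields together, a Fiber node can be pictured roughly as the plain object below. This is only a simplified mental model for illustration; the real FiberNode in the React source carries many more fields (pendingProps, memoizedState, effectTag, and so on):

// A simplified mental model of a Fiber node (illustrative only, not the React source).
const fiber = {
  tag: 5,              // WorkTag: 5 = HostComponent, i.e. a DOM element
  type: 'div',         // DOM tag name, or the component's function/class
  stateNode: null,     // filled with the real DOM node created in completeWork

  // Links that form the Fiber tree
  return: null,        // parent Fiber
  child: null,         // first child Fiber
  sibling: null,       // next sibling Fiber

  // Double buffering: the current tree and the work-in-progress tree
  // reference each other through `alternate`
  alternate: null,
};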
The demo builds the Fiber tree like this:
React's two rendering phases
- Render/Reconciliation (interruptible):
  - beginWork: creates or updates the child Fibers of the current Fiber node and returns the current node's first child, which becomes the next performUnitOfWork
  - completeWork: creates fiber.stateNode; for a new Fiber produced by beginWork it calls document.createElement, stores the resulting DOM node in fiber.stateNode, and that node is appended into the real DOM later, during the commit phase
- Commit (not interruptible; interrupting it could leave the DOM half-mutated and the UI in an unstable state): its main job is to commit the stateNode generated in the render phase to the real DOM
- Scheduler: the scheduling module; it schedules the render/reconciliation work and slices it into tasks of roughly 5ms each, which can be interrupted (a rough sketch of this loop follows the list)
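To make the Scheduler's role more concrete, here is a rough, self-contained sketch of the time-slicing idea. It is not React's actual source (React's Scheduler package schedules continuations through a MessageChannel-based macrotask rather than setTimeout), but the shape of the loop is the same: do one unit of work at a time and yield once the roughly 5ms slice is used up.

// Illustrative sketch of a sliced work loop (not the real React implementation).
const SLICE_BUDGET_MS = 5;
let deadline = 0;

function shouldYield() {
  // Has the current ~5ms slice been used up?
  return performance.now() >= deadline;
}

function workLoop(nextUnitOfWork, performUnitOfWork) {
  deadline = performance.now() + SLICE_BUDGET_MS;
  // Process one Fiber at a time; stop when the slice is exhausted.
  while (nextUnitOfWork !== null && !shouldYield()) {
    nextUnitOfWork = performUnitOfWork(nextUnitOfWork); // beginWork / completeWork
  }
  if (nextUnitOfWork !== null) {
    // Work remains: hand control back to the browser so input handling,
    // style calculation, layout and paint can run, then continue later.
    setTimeout(() => workLoop(nextUnitOfWork, performUnitOfWork), 0);
  }
}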
Enabling Concurrent Mode and time slicing
Fiber's time slicing requires Concurrent Mode to be enabled. That means the ReactDOM.render we use by default does not get time slicing, even though it runs on Fiber.
Enabling Concurrent Mode requires only two steps:
- Install the experimental react and react-dom packages:
npm install react@experimental react-dom@experimental
- Use ReactDOM.createRoot to create a FiberRoot, then call render on it instead of using ReactDOM.render:
ReactDOM.createRoot(rootNode).render(<App />)
With that, Concurrent Mode is enabled.
Besides the Legacy Mode we normally use via ReactDOM.render and Concurrent Mode, React also has a Blocking Mode, which is essentially an intermediate version with part of Concurrent Mode's functionality. It is created like this: ReactDOM.createBlockingRoot(rootNode).render(<App />)
Comparing the three modes: under Concurrent Mode you get SuspenseList, which controls the reveal order of Suspense components, plus priority-based rendering, interruptible pre-rendering, and so on, as well as some new hooks. For example, useTransition combined with Suspense can be used for loading optimization, and useDeferredValue caches a state value so that a low-priority but time-consuming update doesn't have to happen immediately; you get a deferred version of the state instead. These APIs are still in the experimental package and may change at any time, so I won't go into detail here; if you are interested, see Suspense for Data Fetching.
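Just to give a flavor of the API, here is a hedged sketch of useTransition keeping an input responsive while a heavier update runs at lower priority. It uses the later stable signature ([isPending, startTransition]); the experimental builds discussed in this article accepted a config object, so the exact shape may differ.

import React, { useState, useTransition } from 'react';

// Sketch only: mark an expensive list update as a low-priority transition.
function FilterableList({ allItems }) {
  const [text, setText] = useState('');
  const [items, setItems] = useState(allItems);
  const [isPending, startTransition] = useTransition();

  const onChange = (e) => {
    const next = e.target.value;
    setText(next);                      // urgent: keep the input in sync
    startTransition(() => {             // low priority: heavy filtering/render
      setItems(allItems.filter((item) => item.includes(next)));
    });
  };

  return (
    <div>
      <input value={text} onChange={onChange} />
      {isPending && <span>Updating…</span>}
      <ul>{items.map((item) => <li key={item}>{item}</li>)}</ul>
    </div>
  );
}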
Performance and user experience with time slicing enabled (Concurrent Mode vs Legacy Mode)
Now let's go back to the original demo and compare the performance panel before and after enabling Concurrent Mode. Before slicing: every update on the main thread is initiated by changeRandom, then goes through the reconciliation phase and the commit phase, and the whole thing is contained in a single Task:
After slicing is enabled: the render/reconciliation phase is broken up, and there are many Tasks of roughly 5ms each.
At this point let's add an input box to test whether the user experience really improves that much, and whether slicing the reconciliation is really that effective.
Add an input field, and keep it from affecting the deep random div List: pull the List out into its own component and wrap it in React.memo, so the input's setState doesn't re-render the List:
The outer layer:
function App() {
  const [filterText, setFilterText] = useState('');
  return (
    <div>
      <input
        value={filterText}
        onChange={(e) => setFilterText(e.target.value)}
      />
      <button>button</button>
      <section>
        <List />
      </section>
    </div>
  );
}
A random List:
// rAF handle used by the List's update loop
let raf;

const List = React.memo(function List(props) {
  const [randomArray, setRandomArray] = useState(
    Array.from({ length: 1500 }, () => Math.random())
  );
  useEffect(() => {
    changeRandom();
  }, []);
  const changeRandom = () => {
    setRandomArray(randomSort(randomArray));
    cancelAnimationFrame(raf);
    raf = requestAnimationFrame(changeRandom);
  };
  const finalList = randomArray.reduce((acc, cur) => {
    acc = (
      <span key={Math.random()} style={{ color: randomColor() }}>
        {cur} {acc}
      </span>
    );
    return acc;
  }, <span></span>);
  return <div>{finalList}</div>;
});
In Legacy Mode you can see that the input box lags slightly, mainly because the whole render/reconciliation task takes too long. Then we turn on Concurrent Mode: typing is still a little sluggish. Looking at the performance panel, the blocking now comes mainly from the commit and layout/paint work, not from render/reconciliation itself. So we can sum up: render/reconciliation is sliced and the slicing works, and in the gaps between slices other, higher-priority user work can run. However, to demonstrate the effect of slicing more clearly, I decided to exclude the impact of the commit phase and of layout/paint (reflow/repaint).
Excluding layout/paint and the commit phase to see the performance gain from slicing
- Set aside the reflow/repaint cost: wrap the List component with style={{ display: 'none' }}, so that nothing is reflowed or repainted after the render/reconciliation phase completes; the Fibers are still generated and still committed to the DOM, they just aren't rendered visually:
<section style={{ display: 'none' }}>
  <List />
</section>
To make the effect more obvious, I added two more copies of the list and increased each list's length to 3000.
// rAF handle used by the List's update loop
let raf;

const List = React.memo(function List(props) {
  const [randomArray, setRandomArray] = useState(
    Array.from({ length: 3000 }, () => Math.random())
  );
  useEffect(() => {
    changeRandom();
  }, []);
  const changeRandom = () => {
    setRandomArray(randomSort(randomArray));
    cancelAnimationFrame(raf);
    raf = requestAnimationFrame(changeRandom);
  };
  // Build one deeply nested list from the random array
  const buildList = () =>
    randomArray.reduce((acc, cur) => (
      <span key={Math.random()} style={{ color: randomColor() }}>
        {cur} {acc}
      </span>
    ), <span></span>);
  // Three copies of the list to exaggerate the reconciliation workload
  const finalList = buildList();
  const finalList1 = buildList();
  const finalList2 = buildList();
  return <div>{finalList}{finalList1}{finalList2}</div>;
});
In Legacy Mode, the performance panel now shows no Layout/Paint tasks blocking our input; what remains is the commit phase and, with the larger lists, render/reconciliation, which blocks for about 500ms and makes typing very janky. Now look at Concurrent Mode: with display: none the commit (commitRoot) work is minimal, the render/reconciliation work is sliced into small tasks, and input is no longer blocked.
Summary and other Features of Concurrent Mode
Of course, we used exaggerated examples (very deep node trees, with reflow/repaint stripped out) to look at the optimization of render/reconciliation in Concurrent Mode in isolation, and time slicing is only a small part of what Fiber enables. The Fiber architecture unlocks a number of new features for Concurrent Mode, such as useTransition, useDeferredValue and others. These are, of course, still tentative. In our demo, we could use useDeferredValue to defer state: for example, if we receive a long list in real time but don't want it to block the input box or other user interactions, we can temporarily keep the old value with useDeferredValue and pass that deferred value to a memo-wrapped component. We trade a little timeliness of the list for better interactivity, while our original state always stays up to date, so it is not quite the same as simply increasing a polling interval. Overall, Concurrent Mode unlocks a number of new capabilities, some of them experimental, but you can expect improvements in both performance and user experience when it is used.
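As a rough illustration of that useDeferredValue idea, here is a sketch only: the experimental builds also accepted an options object (e.g. { timeoutMs }), while the later stable API takes just the value.

import React, { useState, useDeferredValue } from 'react';

// Sketch: the deferred value lags behind during heavy updates, so typing stays smooth.
const SlowList = React.memo(function SlowList({ text }) {
  // Stand-in for a large, expensive list derived from `text`.
  return (
    <ul>
      {Array.from({ length: 3000 }, (_, i) => (
        <li key={i}>{text} {i}</li>
      ))}
    </ul>
  );
});

function App() {
  const [text, setText] = useState('');
  const deferredText = useDeferredValue(text);

  return (
    <div>
      <input value={text} onChange={(e) => setText(e.target.value)} />
      {/* The memoized list only re-renders when the deferred value changes. */}
      <SlowList text={deferredText} />
    </div>
  );
}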
Feel free to leave a comment and discuss. Wishing you smooth work and a happy life!
I’m bigo front end, see you next time.