preface
This article reflects my own understanding; if there are any mistakes, please point them out.
nextTick
💠 Why do we need nextTick?
Why do we need nextTick? Consider the following scenario. If every change to foo triggered the watcher synchronously, then a watcher containing time-consuming work would cause serious performance problems. So in Vue's source code, watcher updates are deferred until after nextTick.
```tsx
const Demo = createComponent({
  setup() {
    const foo = ref(0)
    const bar = ref(0)
    const change = () => {
      for (let i = 0; i < 100; i++) {
        foo.value += 1
      }
    }
    watch(foo, () => {
      bar.value += 1
    }, { lazy: true })
    return { foo, bar, change }
  },
  render() {
    const { foo, bar, change } = this
    return (
      <div>
        <p>foo: {foo}</p>
        <p>bar: {bar}</p>
        {/* clicking the button updates bar only once */}
        <button onClick={change}>change</button>
      </div>
    )
  }
})
```
Unit testing
The quickest way to understand source code is to read the corresponding unit tests. They help us grasp the exact meaning and usage of each function and variable, as well as how edge cases are handled.
nextTick source file: packages/runtime-core/src/scheduler.ts
nextTick unit test file: packages/runtime-core/__tests__/scheduler.spec.ts
The first unit test
nextTick creates a microtask. job2 is called synchronously within the current macrotask; once that macrotask finishes, the microtask queue is flushed and job1 runs, so the calls array ends up with a length of 2.
```ts
it('nextTick', async () => {
  const calls: string[] = []
  const dummyThen = Promise.resolve().then()
  const job1 = () => {
    calls.push('job1')
  }
  const job2 = () => {
    calls.push('job2')
  }
  nextTick(job1)
  job2()
  expect(calls.length).toBe(1)
  // wait for the microtask queue to be flushed
  await dummyThen
  expect(calls.length).toBe(2)
  expect(calls).toMatchObject(['job2', 'job1'])
})
```
The second unit test
A new function appears here: queueJob. We don't know its internal implementation yet, but the test tells us what it does: queueJob takes a function as an argument and stores it in a queue in order; once the current macrotask finishes and the microtasks run, the queued functions are executed in turn.
```ts
it('basic usage', async () => {
  const calls: string[] = []
  const job1 = () => {
    calls.push('job1')
  }
  const job2 = () => {
    calls.push('job2')
  }
  queueJob(job1)
  queueJob(job2)
  expect(calls).toEqual([])
  await nextTick()
  // executed in the order they were queued
  expect(calls).toEqual(['job1', 'job2'])
})
```
The third unit test
queueJob avoids pushing the same function (job) onto the queue more than once; it deduplicates queued jobs.
```ts
it('should dedupe queued jobs', async () => {
  const calls: string[] = []
  const job1 = () => {
    calls.push('job1')
  }
  const job2 = () => {
    calls.push('job2')
  }
  queueJob(job1)
  queueJob(job2)
  queueJob(job1)
  queueJob(job2)
  expect(calls).toEqual([])
  await nextTick()
  expect(calls).toEqual(['job1', 'job2'])
})
```
The fourth unit test
If queueJob(job2) is called inside job1, job2 runs right after job1 within the same flush; it does not wait for the next microtask.
```ts
it('queueJob while flushing', async () => {
  const calls: string[] = []
  const job1 = () => {
    calls.push('job1')
    queueJob(job2)
  }
  const job2 = () => {
    calls.push('job2')
  }
  queueJob(job1)
  await nextTick()
  // job2 is executed during the same flush of the microtask queue
  expect(calls).toEqual(['job1', 'job2'])
})
```
The fifth unit test
Another new function appears here: queuePostFlushCb. Its internal implementation is still unclear, but the test shows that queuePostFlushCb accepts either a single function or an array of functions as its argument.
queuePostFlushCb stores its arguments in a queue in order; when the current macrotask finishes and the microtask queue is flushed, each function in the queue is executed in turn.
```ts
it('basic usage', async () => {
  const calls: string[] = []
  const cb1 = () => {
    calls.push('cb1')
  }
  const cb2 = () => {
    calls.push('cb2')
  }
  const cb3 = () => {
    calls.push('cb3')
  }
  queuePostFlushCb([cb1, cb2])
  queuePostFlushCb(cb3)
  expect(calls).toEqual([])
  await nextTick()
  // executed in the order they were added to the queue
  expect(calls).toEqual(['cb1', 'cb2', 'cb3'])
})
```
The sixth unit test
queuePostFlushCb does not add the same function to the queue more than once.
```ts
it('should dedupe queued postFlushCb', async () => {
  const calls: string[] = []
  const cb1 = () => {
    calls.push('cb1')
  }
  const cb2 = () => {
    calls.push('cb2')
  }
  const cb3 = () => {
    calls.push('cb3')
  }
  queuePostFlushCb([cb1, cb2])
  queuePostFlushCb(cb3)
  queuePostFlushCb([cb1, cb3])
  queuePostFlushCb(cb2)
  expect(calls).toEqual([])
  await nextTick()
  expect(calls).toEqual(['cb1', 'cb2', 'cb3'])
})
```
The seventh unit test
If queuePostFlushCb(cb2) is called inside cb1, cb2 runs right after cb1 within the same flush; it does not wait for the next microtask.
```ts
it('queuePostFlushCb while flushing', async () => {
  const calls: string[] = []
  const cb1 = () => {
    calls.push('cb1')
    queuePostFlushCb(cb2)
  }
  const cb2 = () => {
    calls.push('cb2')
  }
  queuePostFlushCb(cb1)
  await nextTick()
  expect(calls).toEqual(['cb1', 'cb2'])
})
```
The eighth unit test
queueJob may be called inside a callback queued with queuePostFlushCb.
```ts
it('queueJob inside postFlushCb', async () => {
  const calls: string[] = []
  const job1 = () => {
    calls.push('job1')
  }
  const cb1 = () => {
    calls.push('cb1')
    queueJob(job1)
  }
  queuePostFlushCb(cb1)
  await nextTick()
  expect(calls).toEqual(['cb1', 'job1'])
})
```
The ninth unit test
job1 runs before cb2: queueJob has a higher execution priority than queuePostFlushCb.
it('queueJob & postFlushCb inside postFlushCb'.async() = > {const calls: string[] = []
const job1 = (a)= > {
calls.push('job1')}const cb1 = (a)= > {
calls.push('cb1')
queuePostFlushCb(cb2)
queueJob(job1)
}
const cb2 = (a)= > {
calls.push('cb2')
}
queuePostFlushCb(cb1)
await nextTick()
expect(calls).toEqual(['cb1'.'job1'.'cb2'])})Copy the code
The tenth unit test
queuePostFlushCb may be called inside a job queued with queueJob.
```ts
it('postFlushCb inside queueJob', async () => {
  const calls: string[] = []
  const job1 = () => {
    calls.push('job1')
    queuePostFlushCb(cb1)
  }
  const cb1 = () => {
    calls.push('cb1')
  }
  queueJob(job1)
  await nextTick()
  expect(calls).toEqual(['job1', 'cb1'])
})
```
The eleventh unit test
job2 runs before cb1: queueJob has a higher execution priority than queuePostFlushCb.
it('queueJob & postFlushCb inside queueJob'.async() = > {const calls: string[] = []
const job1 = (a)= > {
calls.push('job1')
queuePostFlushCb(cb1)
queueJob(job2)
}
const job2 = (a)= > {
calls.push('job2')}const cb1 = (a)= > {
calls.push('cb1')
}
queueJob(job1)
await nextTick()
expect(calls).toEqual(['job1'.'job2'.'cb1'])})Copy the code
conclusion
- `nextTick` takes a function as an argument and creates a microtask.
- `queueJob` takes a function as an argument and pushes it onto the `queue` array; the queue is flushed after the current macrotask finishes.
- `queuePostFlushCb` takes a function, or an array of functions, as an argument and pushes it onto the `postFlushCbs` queue; that queue is also flushed after the current macrotask finishes.
- `queueJob` has a higher execution priority than `queuePostFlushCb`.
- Both `queueJob` and `queuePostFlushCb` allow new members to be added while their queues are being flushed.
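To tie these points together, here is a minimal sketch of my own (not from the test suite) that exercises all three functions. The import path is an assumption; the unit tests import the functions from the scheduler module directly.

```ts
// import path is an assumption; the unit tests import from the scheduler source file
import { nextTick, queueJob, queuePostFlushCb } from '../src/scheduler'

const calls: string[] = []

// a post-flush cb: runs after the job queue has been flushed
queuePostFlushCb(() => calls.push('cb'))
// a job: runs first during the flush, even though it was queued later
queueJob(() => calls.push('job'))

console.log(calls) // [] - nothing runs until the microtask queue is flushed

nextTick().then(() => {
  console.log(calls) // ['job', 'cb'] - queueJob has higher priority
})
```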
Without further ado, let’s look directly at the source code.
Source code parsing
```ts
// ErrorCodes: an enum of internal error types
// callWithErrorHandling: an executor that wraps a call with error handling
import { ErrorCodes, callWithErrorHandling } from './errorHandling'
import { isArray } from '@vue/shared'

// the job queue; queueJob pushes its argument onto this array
const queue: Function[] = []
// the cb queue; queuePostFlushCb pushes its argument(s) onto this array
const postFlushCbs: Function[] = []
// a Promise that is already resolved, used to create microtasks
const p = Promise.resolve()
```
nextTick
nextTick is very simple: it creates a microtask so that fn is executed after the current macrotask finishes.
```ts
function nextTick(fn?: () => void): Promise<void> {
  return fn ? p.then(fn) : p
}
```
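As a quick illustration of the two branches (my own sketch, not code from the repository; the wrapping `example` function is hypothetical and only there so the `await` is valid): passing a callback schedules it on the shared promise, while calling nextTick with no argument simply returns that promise so it can be awaited.

```ts
async function example() {
  // branch 1: with a callback, fn is scheduled via p.then(fn)
  nextTick(() => {
    console.log('runs as a microtask, after the current synchronous code')
  })

  // branch 2: with no argument, the shared resolved promise is returned,
  // so callers can simply await it
  await nextTick()
  console.log('resumes here once the microtask queue has been flushed')
}
```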
queueJob
Adds a job to the queue if it is not already there, then calls queueFlush to start processing the queue.
```ts
function queueJob(job: () => void) {
  if (!queue.includes(job)) {
    queue.push(job)
    queueFlush()
  }
}
```
queuePostFlushCb
Adds cb (a function or an array of functions) to the postFlushCbs queue, then calls queueFlush to start processing the queues.
```ts
function queuePostFlushCb(cb: Function | Function[]) {
  // a single callback is pushed directly
  if (!isArray(cb)) {
    postFlushCbs.push(cb)
  } else {
    // if cb is an array, spread it onto the postFlushCbs queue
    postFlushCbs.push(...cb)
  }
  queueFlush()
}
```
queueFlush
queueFlush calls nextTick to create a microtask, so that flushJobs processes the queue and postFlushCbs queues after the current macrotask completes.
```ts
// flags: is a flush currently running / already scheduled?
let isFlushing = false
let isFlushPending = false

function queueFlush() {
  if (!isFlushing && !isFlushPending) {
    // set isFlushPending to true so that repeated calls to queueJob and
    // queuePostFlushCb do not schedule flushJobs more than once
    isFlushPending = true
    // schedule flushJobs as a microtask via nextTick
    nextTick(flushJobs)
  }
}
```
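A short sketch of my own (import path assumed, as in the earlier sketch) showing what the isFlushPending guard buys: queuing several different jobs in the same tick schedules only one flushJobs microtask, and they are all flushed together.

```ts
// import path is an assumption; the unit tests import from the scheduler source file
import { nextTick, queueJob } from '../src/scheduler'

const order: string[] = []

queueJob(() => order.push('a')) // isFlushPending flips to true, flushJobs is scheduled
queueJob(() => order.push('b')) // the guard is already set, nothing new is scheduled
queueJob(() => order.push('c')) // ditto

nextTick().then(() => {
  console.log(order) // ['a', 'b', 'c'] - all flushed by the single scheduled flushJobs
})
```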
In flushJobs, the queue is processed first, followed by postFlushCbs.
```ts
function flushJobs(seen?: CountMap) {
  isFlushPending = false
  isFlushing = true
  let job
  if (__DEV__) {
    seen = seen || new Map()
  }
  // 1. flush the job queue
  while ((job = queue.shift())) {
    if (__DEV__) {
      // in development, check whether the job has exceeded the recursion limit
      checkRecursiveUpdates(seen!, job)
    }
    // run the job through the callWithErrorHandling executor;
    // if the job throws, callWithErrorHandling catches the error
    callWithErrorHandling(job, null, ErrorCodes.SCHEDULER)
  }
  // 2. call flushPostFlushCbs to process the postFlushCbs queue
  flushPostFlushCbs(seen)
  isFlushing = false
  // if new jobs or cbs were queued while flushing,
  // call flushJobs recursively to flush them as well
  if (queue.length || postFlushCbs.length) {
    flushJobs(seen)
  }
}
```
flushPostFlushCbs deduplicates the postFlushCbs queue and then flushes it.
```ts
// deduplicate the postFlushCbs queue with a Set
const dedupe = (cbs: Function[]): Function[] => [...new Set(cbs)]

function flushPostFlushCbs(seen?: CountMap) {
  if (postFlushCbs.length) {
    // deduplicate the postFlushCbs queue
    const cbs = dedupe(postFlushCbs)
    postFlushCbs.length = 0
    if (__DEV__) {
      seen = seen || new Map()
    }
    // flush the postFlushCbs queue
    for (let i = 0; i < cbs.length; i++) {
      if (__DEV__) {
        // in development, check whether the cb has exceeded the recursion limit
        checkRecursiveUpdates(seen!, cbs[i])
      }
      // execute the cb
      cbs[i]()
    }
  }
}
```
If the same job or cb is invoked more than 100 times within one flush, the recursion limit is exceeded and an error is thrown.
```ts
// the maximum number of recursive updates
const RECURSION_LIMIT = 100
type CountMap = Map<Function, number>

function checkRecursiveUpdates(seen: CountMap, fn: Function) {
  if (!seen.has(fn)) {
    seen.set(fn, 1)
  } else {
    const count = seen.get(fn)!
    // throw once the call count exceeds the limit
    if (count > RECURSION_LIMIT) {
      throw new Error(
        'Maximum recursive updates exceeded. ' +
          "You may have code that is mutating state in your component's " +
          'render function or updated hook or watcher source function.'
      )
    } else {
      // otherwise increment the call count
      seen.set(fn, count + 1)
    }
  }
}
```
💠 Why do we need checkRecursiveUpdates to limit how many times a job or cb can be called?
In Vue 3, the watch callback is pushed onto the queue when its dependency updates and executed after nextTick. Consider the code below: each update of foo pushes the watch callback (update) onto the queue, and update itself changes foo again, so the queue would never be emptied. That is clearly a bug, so checkRecursiveUpdates tracks the recursion depth and throws an error in time.
```ts
const foo = ref(0)
const update = () => {
  foo.value += 1
}
watch(foo, update, {
  lazy: true
})
foo.value += 1
```
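To close the loop with the Demo component from the beginning: in application code, nextTick is typically awaited so that the deferred watcher update has landed before the result is read. A hedged sketch of a modified change handler, reusing the foo and bar refs from the opening example:

```ts
const change = async () => {
  for (let i = 0; i < 100; i++) {
    foo.value += 1
  }
  // the lazy watcher has not run yet; bar still holds its old value here
  await nextTick()
  // after the flush the watcher has run only once, so bar has increased by 1
  console.log(bar.value)
}
```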