Opening
In V8, all JavaScript objects are allocated on the heap.
Memory is generational
Since V8's garbage collection strategy is generational (a new generation and an old generation), the overall size of the V8 heap is the memory used by the new generation plus the memory used by the old generation.
The new generation
Objects in the new generation are short-lived objects.
The old generation
Objects in the old generation are long-lived objects, or objects that are resident in memory.
Garbage collection mechanism
Scavenge
Definition
Scavenge uses the Cheney algorithm, a garbage collection algorithm implemented with copying. It divides the heap into two halves, each called a semispace. Of the two semispaces, only one is in use at any time and the other sits idle. The semispace in use is called the From space, and the idle one is called the To space.
When garbage collection begins, the live objects in the From space are checked and copied to the To space, and the space occupied by dead objects is released. Once copying is complete, the roles of the From and To spaces are swapped. In short, during garbage collection the surviving objects are copied back and forth between the two semispaces.
Since only live objects are copied, and scenarios with short-lived objects contain only a small number of live objects, Scavenge performs extremely well in terms of time efficiency.
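The copy-and-swap idea can be sketched as a toy model in JavaScript (an illustration of the concept only, not V8's real implementation; here liveness is decided by a simple set of roots):

```javascript
// Toy model of Scavenge: two semispaces represented as arrays.
let fromSpace = [];
let toSpace = [];

function allocate(obj) {
  fromSpace.push(obj);
  return obj;
}

function scavenge(roots) {
  // Copy every live object from the From space to the To space.
  for (const obj of fromSpace) {
    if (roots.has(obj)) toSpace.push(obj);
  }
  // Dead objects are implicitly freed by discarding the From space.
  fromSpace = [];
  // Swap roles: the To space becomes the new From space.
  [fromSpace, toSpace] = [toSpace, fromSpace];
}

const live = allocate({ name: 'live' });
allocate({ name: 'dead' });
scavenge(new Set([live]));
console.log(fromSpace.length); // 1: only the live object survived the copy
```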
When an object survives multiple copies, it is considered long-lived. Such objects are moved to the old generation and managed with a different algorithm. The process of moving objects from the new generation to the old generation is called promotion.
Promotion is based mainly on two conditions:
- whether the object has already survived a Scavenge collection;
- whether the memory usage of the To space exceeds its limit (25%).
Disadvantages
Scavenge can only ever use half of the heap memory, a consequence of its space-division and copying mechanism. It is a classic space-for-time algorithm, so it cannot be applied at scale to all garbage collection.
Mark-sweep
Mark-sweep works in two phases: marking and sweeping. In contrast to Scavenge, it does not waste half of the space. Mark-sweep iterates over all objects in the heap and marks the live ones; in the subsequent sweeping phase, only the unmarked objects are cleared.
In the figure below, the objects marked in black are those that have died.
Disadvantages
After mark-sweep runs, the memory space is discontinuous. This fragmentation causes problems for subsequent memory allocation: when a large object needs to be allocated, it is likely that none of the fragments is big enough to complete the allocation, triggering garbage collection ahead of schedule.
Mark-compact
Mark-compact mainly solves mark-sweep's fragmentation problem and evolved from mark-sweep. The difference is that after dead objects are marked, the live objects are moved to one end of the heap during the cleanup process. Once the move is complete, the memory beyond the boundary is cleared directly.
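The move-to-one-end idea can be sketched as a toy compaction pass over a flat "heap" array (an illustration of the algorithm's shape, not V8's implementation):

```javascript
// Toy mark-compact on a flat "heap" array (illustration only).
// Live objects are moved to one end; everything past the boundary is cleared.
function markCompact(heap, isLive) {
  let boundary = 0;
  for (let i = 0; i < heap.length; i++) {
    if (isLive(heap[i])) {
      heap[boundary++] = heap[i]; // move each live object toward the front
    }
  }
  heap.length = boundary; // clear the memory beyond the boundary in one step
  return heap;
}

const heap = ['a', null, 'b', null, 'c']; // null = dead object
markCompact(heap, (x) => x !== null);
console.log(heap); // [ 'a', 'b', 'c' ]
```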
In the figure below, the white cells are live objects, the dark cells are dead objects, and the light cells are the empty space left after the live objects have been moved.
Incremental Marking
The basic forms of all three garbage collection algorithms need to pause the application logic and resume it only after collection finishes, to avoid inconsistencies between what the JavaScript application logic sees and what the garbage collector sees. This behavior is called "stop-the-world". In V8, the old generation is usually configured to be large and holds many live objects, so the pauses caused by the marking, sweeping, and compacting actions of a full garbage collection can be frightening.
To reduce the pause time caused by full-heap garbage collection, V8 starts with the marking phase: the work that used to be completed in one long pause is replaced by incremental marking, i.e. split into many small steps. After each step, the JavaScript application logic is allowed to run for a short while, and garbage collection alternates with the application logic until marking is complete.
With incremental marking, the maximum garbage collection pause time is reduced to about 1/6 of what it was.
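The alternation between marking steps and application logic can be sketched with a generator (a conceptual illustration only; real incremental marking works on heap pages, write barriers, and so on):

```javascript
// Sketch: incremental marking as a generator that marks a few objects
// per "step" and yields control back to the application in between.
function* incrementalMark(roots, stepSize = 2) {
  const worklist = [...roots];
  const marked = new Set();
  while (worklist.length > 0) {
    for (let i = 0; i < stepSize && worklist.length > 0; i++) {
      const obj = worklist.pop();
      if (marked.has(obj)) continue;
      marked.add(obj);
      for (const child of obj.refs || []) worklist.push(child);
    }
    yield; // pause marking; let the application logic run for a while
  }
  return marked;
}

// A small object graph: a -> b -> c
const c = { refs: [] };
const b = { refs: [c] };
const a = { refs: [b] };

const marker = incrementalMark([a], 1);
let steps = 0;
let result = marker.next();
while (!result.done) {
  steps++; // application logic would run here, between marking steps
  result = marker.next();
}
console.log(result.value.size, 'objects marked in', steps, 'steps'); // 3 in 3
```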
Causes of memory leaks
- Caches
- Queues not consumed in time
- Scopes not released
Memory as cache
Once an object is used as a cache in Node, it means it will live in the old generation. The more keys the cache stores, the more long-lived objects there are, and the garbage collector ends up doing useless work scanning and compacting them.
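One common mitigation is to put a hard limit on the in-process cache so it cannot grow without bound (a minimal sketch; the limit of 100 and the FIFO-style eviction are illustrative choices, not a prescribed policy):

```javascript
// An in-process cache with a size limit, so it cannot grow without bound.
// The default limit (100) and FIFO eviction here are illustrative choices.
class BoundedCache {
  constructor(limit = 100) {
    this.limit = limit;
    this.map = new Map();
  }
  set(key, value) {
    if (this.map.size >= this.limit) {
      // Evict the oldest entry (a Map preserves insertion order).
      const oldestKey = this.map.keys().next().value;
      this.map.delete(oldestKey);
    }
    this.map.set(key, value);
  }
  get(key) {
    return this.map.get(key);
  }
}

const cache = new BoundedCache(2);
cache.set('a', 1);
cache.set('b', 2);
cache.set('c', 3); // 'a' is evicted; the cache stays at its size limit
console.log(cache.map.size); // 2
```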
Modules also have a caching mechanism, and module objects are resident in the old generation (Node's module cache).
Processes cannot share caches, so the solution is to use an out-of-process cache, where the process itself does not store state. External cache software comes with good cache-expiration policies and free memory management.
The queue
Queues often act as intermediaries between producers and consumers, which makes them easy to overlook. In most application scenarios the consumption speed far exceeds the production speed, and memory leaks are unlikely. But once the consumption speed falls below the production speed, a backlog forms.
The superficial solution is to switch to technology with a higher consumption speed; the deeper solution is to monitor the length of the queue, and once it starts to pile up, raise an alarm through the monitoring system and notify the relevant people. Another solution is to give every asynchronous call a timeout limit: once the response fails to complete within the time limit, pass a timeout exception through the callback, so that every asynchronous call has a bounded response time.
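The timeout idea can be sketched with Promises (the text describes the same idea with callbacks; the wrapper name `withTimeout` and the error message are illustrative):

```javascript
// Give any asynchronous call a bounded response time (Promise-based sketch).
function withTimeout(promise, ms) {
  let timer;
  const timeout = new Promise((_, reject) => {
    timer = setTimeout(() => reject(new Error('timeout')), ms);
  });
  return Promise.race([promise, timeout]).finally(() => clearTimeout(timer));
}

// Usage: a slow "consumer" call that takes longer than the limit.
const slow = new Promise((resolve) => setTimeout(resolve, 200, 'done'));
withTimeout(slow, 50)
  .then((v) => console.log('result:', v))
  .catch((err) => console.log('failed:', err.message)); // failed: timeout
```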
References
Node.js is easy to understand
Conclusion
I'm still just a beginner on the front end!!