Preface

This series began as my own interview preparation. As the notes kept growing, eventually reaching almost 120,000 characters, I decided to share them with everyone.

Organizing these notes for sharing took a lot of time, at least three times as long as keeping them just for myself. If you find them useful, please bookmark and follow me. Thank you very much!

Article links

  • Checking and filling gaps for front-end interviews: (1) Debounce and throttling
  • (2) Garbage collection mechanism
  • (3) Cross-domain requests and common solutions
  • (4) Front-end local storage
  • (5) Rendering mechanism, repaint, and reflow
  • (6) Browser cache
  • (7) XSS and CSRF attacks
  • (8) Front-end encryption
  • (9) HTTP and HTTPS
  • (10) Front-end authentication
  • (11) Front-end architecture patterns: MVC/MVP/MVVM
  • (12) From entering a URL to seeing the page (including a detailed explanation of the three-way handshake and four-way wave)
  • (13) Memory leaks
  • (14) Algorithms and sorting
  • (15) Event Loop

Collection of articles:

The index (a 120,000-character collection) contains the dozen-plus articles in this series written so far. Newly added articles will not be linked in every post, so I strongly suggest liking, commenting, following, and bookmarking the index. Thank you!

Follow-up Update Plan

Design patterns, front-end engineering, project workflow, deployment, closed loops, frequently tested Vue knowledge, and other content will be added in the future. If you find the content helpful, please bookmark and follow me. Thank you very much!

Asking for a referral

I am currently also preparing to change jobs, and I would appreciate a referral from you or your HR contacts to a reliable front-end position in Wuhan. Email: [email protected]. Thanks!

Garbage collection mechanism

JavaScript has automatic garbage collection (GC), which means that the execution environment is responsible for managing the memory used during code execution. Developers no longer have to worry about memory usage, and the allocation of needed memory and the collection of unused memory are fully automated.

Memory life cycle

Memory allocated in a JS environment generally goes through the following life cycle (a short example follows the list):

  1. Memory allocation: when we declare variables, functions, and objects and execute code, the system automatically allocates memory for them
  2. Memory usage: reading from and writing to that memory, that is, using the variables, functions, and so on
  3. Memory reclamation: the garbage collection mechanism automatically reclaims memory that is no longer used
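A minimal sketch of this life cycle in plain JavaScript (the function and variable names are only illustrative):

```js
function process() {
  // 1. Allocation: memory is allocated for the array when it is created
  const items = new Array(1000).fill(0);

  // 2. Usage: reading and writing the allocated memory
  items[0] = 42;
  return items[0];
}

process();
// 3. Reclamation: after process() returns, `items` is no longer reachable,
// so the garbage collector is free to reclaim its memory at some later point.
```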

Garbage collection strategies

Mark-and-sweep algorithm

The most common method of garbage collection in JavaScript is mark-and-sweep.

This algorithm simplifies “whether an object is no longer needed” to “whether an object is reachable.”

The algorithm assumes a root object (in JavaScript, the root is the global object). The garbage collector periodically scans objects in memory, starting from the root. Anything that can be reached from the root is still needed; objects that cannot be reached from the root are marked as unused and reclaimed later.

The algorithm has two phases: mark and sweep.

  1. In the mark phase, the garbage collector starts from the root object. Every object that can be reached from the root is given a mark identifying it as reachable.
  2. In the sweep phase, the garbage collector traverses the heap linearly from beginning to end. Any object without a reachable mark has its memory reclaimed; for the objects that are marked, the mark is cleared in preparation for the next garbage collection run.

In the marking phase, B can be reached from root object 1, and E can be reached through B; in the same way, F, G, J, and K are all reachable objects.

During the collection phase, all objects that are not marked as reachable are collected by the garbage collector.
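A small JavaScript illustration of reachability (the object shapes here are made up for the example):

```js
let user = { profile: { name: "Alice" } };

// Both `user` and the nested profile object are reachable from the root
// (the global scope), so mark-and-sweep keeps them alive.

user = null;

// Now neither the user object nor its nested profile object can be reached
// from the root, so both will be marked as unreachable and swept later.
```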

When does garbage collection start? In general, with the mark-and-sweep algorithm, unreferenced objects are not reclaimed immediately. Instead, garbage objects accumulate until memory runs out; at that point the program is suspended and garbage collection begins.

Note: since 2012, all modern browsers have shipped a mark-and-sweep garbage collector. The improvements made to JavaScript garbage collection since then are refinements of the mark-and-sweep algorithm, not replacements of the algorithm itself or of its simplified definition of whether an object is no longer needed.

Defects of the mark-and-sweep algorithm

  • Objects that cannot be reached from the root object will be cleared
  • Garbage collection can leave a lot of memory fragmentation. As shown in the image above, there are three memory fragments after garbage collection; assuming one square represents one unit of memory, an object that needs three contiguous units cannot be allocated, which causes the mutator to stay suspended while the collector keeps attempting garbage collection until it runs out of memory.

Reference counting algorithm

This is the most rudimentary garbage collection algorithm. No browser uses this algorithm anymore.

This algorithm simplifies the definition of “is the object no longer needed” to “has the object been referenced by other objects”. If there is no reference to the object (zero reference), the object is collected by the garbage collection mechanism.

The meaning of reference counting is to keep track of the number of times each value has been referenced. When a variable is declared and a reference type value is assigned to it, the number of references to the value is 1. If the same value is assigned to another variable, the number of references to that value is increased by one. Conversely, if the variable containing the reference to this value takes another value, the number of references to this value is reduced by one. When the number of references to the value goes to 0, it means that the value is no longer accessible, so the memory it occupied can be reclaimed. This way, the next time the garbage collector runs, it frees the memory occupied by values that have been referenced zero times.
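A small sketch of how the count changes (illustrative only; JavaScript does not actually expose reference counts):

```js
let a = { data: "example" }; // the object's reference count is 1 (held by `a`)
let b = a;                   // the count becomes 2 (held by `a` and `b`)
a = null;                    // the count drops back to 1
b = null;                    // the count drops to 0, so the object can be reclaimed
```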

Defects of reference counting

This algorithm has one major limitation: it cannot handle circular references. If two objects are created and reference each other, they form a cycle. After the function that created them returns, they leave the function scope, so they are useless and should be reclaimable. However, the reference counting algorithm sees that each of them still holds at least one reference to the other, so neither is reclaimed.
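The classic shape of the problem, as a sketch (not tied to any particular engine):

```js
function createCycle() {
  const objA = {};
  const objB = {};
  objA.ref = objB; // objB is referenced by objA
  objB.ref = objA; // objA is referenced by objB
}

createCycle();
// After the call returns, objA and objB are unreachable from any root,
// so mark-and-sweep can reclaim them. A pure reference counter, however,
// still sees one reference on each object and would never free either of them.
```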

Chrome V8 garbage collection algorithm

Chrome's V8 engine uses a generational collection strategy, consistent with the idea behind Java's collectors. The aim is to distinguish “temporary” objects from “persistent” objects: collect “temporary” objects (the young generation) more often and “persistent” objects (the tenured generation) less often, reducing the number of objects that need to be traversed each time and therefore the duration of each GC.

Memory limits for V8

There is a limit to how much memory JavaScript can use in Node:

  1. About 1.4GB for 64-bit systems.
  2. About 0.7GB for 32-bit systems.

By default, the corresponding generational memory sizes are:

  1. On a 32-bit system, the new generation is 16MB and the old generation is 700MB.
  2. On a 64-bit system, the new generation is 32MB and the old generation is 1.4GB.

The new generation is divided into two equal chunks of memory, called semispace, each of 8MB (32-bit) or 16MB (64-bit) size.
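These limits can be inspected at runtime in Node with the built-in v8 module; the field names below come from v8.getHeapStatistics():

```js
const v8 = require("v8");

const stats = v8.getHeapStatistics();
// heap_size_limit is the maximum heap size V8 allows for this process
console.log(`Heap size limit: ${(stats.heap_size_limit / 1024 / 1024).toFixed(0)} MB`);
console.log(`Currently used:  ${(stats.used_heap_size / 1024 / 1024).toFixed(0)} MB`);
```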

This restriction can be adjusted when Node starts by passing --max-old-space-size and --max-new-space-size, as in:

```
node --max-old-space-size=1024 app.js // the unit is MB
node --max-new-space-size=1024 app.js // the unit is KB
```

The above parameters take effect when V8 is initialized and cannot be changed dynamically once set.

Reason for memory limit:

As for why V8 limits the heap size, there is a superficial reason: V8 was originally designed for browsers, where you are unlikely to encounter scenarios that use a huge amount of memory. The deeper reason lies in the limitations of V8's garbage collection mechanism. Officially, taking a 1.5GB heap as an example, V8 takes more than 50 milliseconds to do a small garbage collection and more than a second to do a non-incremental full collection. During that time the JS thread is suspended, and at that cost the performance and responsiveness of the application plummet.

V8 generational GC

V8's garbage collection strategy is mainly based on a generational mechanism: modern garbage collectors divide memory into generations according to object lifetime and then apply the most efficient algorithm to each generation.

V8 memory generation:

In V8, the memory is mainly divided into the new generation and the old generation. The new generation memory stores objects with a short lifespan, while the old generation memory stores objects with a long lifespan or resident memory, as shown in the figure below:

The overall size of the V8 heap is the memory used by the new generation plus the memory used by the old generation.

V8 Scavenge:

On top of generational collection, objects in the new generation are collected with the Scavenge algorithm, whose implementation mainly uses the Cheney algorithm.

The Cheney algorithm is a garbage collection algorithm implemented with copying. It splits the heap memory into two halves, each called a semispace. Of the two semispaces, only one is in use at a time while the other sits idle. The semispace in use is called the From space, and the idle one is called the To space.

When we allocate an object, it is first allocated in the From space. When garbage collection begins, the live objects in the From space are identified and copied to the To space, while the space occupied by dead objects is freed. When the copy is complete, the roles of the From space and the To space swap (the former From space is released and becomes the new To space, and the former To space, now holding the copied live objects, becomes the new From space). In short, garbage collection here works by copying the live objects back and forth between the two semispaces.
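A very rough JavaScript sketch of the copying idea (purely conceptual, nothing like V8's real implementation; the isLive callback stands in for the engine's liveness check):

```js
// Conceptual sketch of Scavenge-style semispace copying.
function scavenge(fromSpace, toSpace, isLive) {
  for (const obj of fromSpace) {
    if (isLive(obj)) {
      toSpace.push(obj); // live objects are copied into the To space
    }
    // Dead objects are simply not copied; their memory is reclaimed when
    // the whole From space is discarded below.
  }
  fromSpace.length = 0;        // release the old From space in one go
  return [toSpace, fromSpace]; // swap roles: To becomes From, From becomes To
}

let from = [{ live: true }, { live: false }];
let to = [];
[from, to] = scavenge(from, to, (o) => o.live);
// `from` now holds only the surviving object; `to` is empty and ready for the next round.
```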

Drawback: only half of the heap memory can be used, which is determined by the space partitioning and copying mechanism.

Advantage: Scavenge performs excellently in terms of time, because it copies only the live objects, and live objects make up only a small proportion in scenarios dominated by short-lived objects. Scavenge is a typical space-for-time algorithm, so it cannot be applied to garbage collection across the whole heap at large scale. It is, however, very well suited to the new generation, where object lifetimes are short, which is exactly the situation this algorithm fits.

Promotion: the actual heap memory used is the sum of the sizes of the new generation's two semispaces and the memory used by the old generation. When an object survives multiple copies it is considered long-lived; such long-lived objects are then moved to the old generation and managed with a different algorithm. The process by which an object moves from the new generation to the old generation is called promotion.

In a pure Scavenge process, live objects in the From space are copied to the To space, and then the two spaces swap roles. With generational garbage collection, however, live objects in the From space need to be checked before being copied to the To space: under certain conditions, long-lived objects must be moved to the old generation instead, that is, promoted.

There are two main conditions for promotion: the object has already survived a previous Scavenge, or the usage of the To space exceeds the 25% limit.

The reason for the 25% limit is that, once this Scavenge completes, the To space becomes the From space and subsequent memory allocations are made in it; if its usage is too high, those subsequent allocations will be affected. After promotion, an object is treated in the old generation as a long-lived object and handled by that generation's collection algorithms.
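A tiny sketch of the two promotion conditions described above (a hypothetical helper for illustration, not V8 code):

```js
// Hypothetical check applied while copying an object during a Scavenge.
function shouldPromote(obj, toSpace) {
  // Condition 1: the object has already survived a previous Scavenge.
  if (obj.survivedScavenge) return true;
  // Condition 2: the To space is already more than 25% full.
  if (toSpace.used / toSpace.capacity > 0.25) return true;
  return false;
}
```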

V8 old generation algorithms (Mark-Sweep && Mark-Compact)

For objects in the old generation, the proportion of live objects is large, so applying Scavenge there has two problems: first, with so many live objects, copying them would be very inefficient; second, it would still waste half the space. For these reasons, V8 uses a combination of Mark-Sweep and Mark-Compact for garbage collection in the old generation.

Mark-Sweep means mark and sweep, and it works in two phases: marking and sweeping. In contrast to Scavenge, Mark-Sweep does not split the memory space in half, so no half of the space is wasted. And unlike Scavenge, which copies live objects, Mark-Sweep traverses all objects in the heap during the marking phase and marks the live ones; in the subsequent sweeping phase, only the unmarked objects are cleared. Scavenge copies only live objects, while Mark-Sweep clears only dead objects; live objects are only a small part of the new generation, and dead objects are only a small part of the old generation, which is why both approaches are efficient.

Below is a schematic of Mark-Sweep marking in the old generation space, with the black parts representing the objects marked as dead.

Mark-Sweep's biggest problem is that the memory space becomes discontinuous after a sweep. This kind of memory fragmentation causes problems for subsequent allocation: it is very likely that at some point a large object needs to be allocated, none of the fragments is big enough to hold it, and an early, unnecessary garbage collection is triggered. Memory here is not a liquid that flows back together on its own; it is more like scattered mahjong tiles that need to be tidied up, and that tidying is what Mark-Compact does.

Mark-Compact was introduced to solve the memory fragmentation problem of Mark-Sweep. Mark-Compact is an evolution of Mark-Sweep: the difference is that, after objects are marked as dead, the live objects are moved to one end of the space during compaction. Once the move is complete, the memory beyond the boundary is cleaned up in one pass. The figure below shows Mark-Compact after it has marked and moved the live objects: the white cells are live objects, the dark cells are dead objects, and the light cells are the holes left behind by moving the live objects.

After the move is complete, the memory area behind the rightmost living object can be directly cleared to complete the reclamation.
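A compact, purely illustrative JavaScript sketch of the compaction step (modelling the heap as an array of slots, which is not how V8 represents memory):

```js
// Move live objects toward one end of the "heap", then clear everything after them.
function markCompact(heap, isLive) {
  let boundary = 0;
  for (let i = 0; i < heap.length; i++) {
    if (heap[i] !== null && isLive(heap[i])) {
      heap[boundary++] = heap[i]; // slide live objects toward the front
    }
  }
  // Everything past the boundary is dead or a hole and can be reclaimed at once.
  heap.fill(null, boundary);
  return boundary; // the next allocation starts here, with no fragmentation
}

const heap = [{ live: true }, { live: false }, { live: true }, null];
markCompact(heap, (o) => o.live); // heap is now [live, live, null, null]
```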

A simple comparison of Mark-Sweep, Mark-Compact and Scavenge

| Collection algorithm | Mark-Sweep | Mark-Compact | Scavenge |
| --- | --- | --- | --- |
| Speed | Medium | Slowest | Fastest |
| Space overhead | Small (fragmented) | Small (no fragmentation) | Double space (no fragmentation) |
| Moves objects | No | Yes | Yes |

As the table shows, the trade-off is mainly between Mark-Sweep and Mark-Compact. Because Mark-Compact has to move objects, it cannot run very fast, so V8 mainly uses Mark-Sweep and only uses Mark-Compact when there is not enough space to allocate the objects being promoted from the new generation.

Incremental Marking:

  • To keep the garbage collector from seeing a state inconsistent with the JS application logic, the three basic garbage collection algorithms all need to pause the application and resume it only after collection is finished; this behavior is called “stop-the-world”. In V8's generational collection, a small garbage collection only collects the young generation, and because the young generation is small by default and usually has few surviving objects, even a full pause has little impact. The old generation, however, is usually configured to be large and holds many surviving objects, so the pauses caused by the marking, sweeping, and compacting of a full garbage collection are far worse and needed improvement (PS: if V8's heap is 1.5GB, a small garbage collection takes more than 50ms and a non-incremental full collection takes more than a second).
  • To reduce the pause time caused by full-heap garbage collection, V8 started with the marking phase and changed the work that used to be done in a single stop into incremental marking, breaking it into many small “steps”. After each step, the JS application logic runs for a short while; garbage collection and application logic alternate in this way until the marking phase is complete.
  • With incremental marking, V8 reduced the maximum garbage collection pause time to about one sixth of what it was.
  • V8 later introduced lazy sweeping and incremental compaction, making the sweeping and compacting actions incremental as well. There are also plans to introduce parallel marking and parallel sweeping, further exploiting multi-core performance to shorten each pause.

Reducing the performance impact of garbage collection:

The following two points deserve attention:

  • Keep garbage collection to a minimum, especially full-heap garbage collection. There is not much we can do here beyond relying on V8's own optimizations.
  • Avoid memory leaks and release memory promptly; this is the part we need to pay attention to (a small example follows). For details, see the memory leak chapter of this series, which has a very detailed explanation.
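A minimal, hedged example of releasing references promptly so the collector can reclaim memory (the timer and cache names are made up for illustration):

```js
const cache = new Map();

const timer = setInterval(() => {
  cache.set(Date.now(), "some data"); // the closure keeps `cache` reachable
}, 1000);

// When this work is no longer needed, drop everything that keeps the memory reachable:
clearInterval(timer); // stop the timer so its closure can be collected
cache.clear();        // let the cached entries become unreachable
```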

Thanks and Reference

  • JS garbage collection notes
  • MDN Memory management
  • V8's garbage collection mechanism and memory limits (recommended if you want to learn more about garbage collection in Node or in general)