I plan to learn one open source npm package every week. This week's package is lru-cache.

All content of this article is open source at github.com/ahwgs/weekl…

Introduction

In everyday business scenarios, some caching may be needed to improve system performance, but a full caching service such as Redis is not always necessary. In those cases, lru-cache can solve the problem.

Today we will look at what LRU is and how to use lru-cache.

What is LRU

LRU (Least Recently Used) is an algorithm that evicts data based on historical access records. Its core idea is: "if data has been accessed recently, it has a higher chance of being accessed again in the future". Implemented with a linked list, it works as follows:

1. Insert new data into the head of the list

2. Whenever a cache hit (that is, cached data is accessed), the data is moved to the head of the list;

3. When the list is full, discard the data at the end of the list.

Consider a cache that can hold three items and the access sequence A, B, C, D, B, E:

1. At the beginning the cache is empty, so A, B, and C are inserted one after another without any problem.

2. When D is added, the cache is full. According to the LRU algorithm, A has gone unused the longest, so A is evicted.

3. When B is accessed again, B becomes the most recently used item, and C becomes the least recently used one.

4. When E is added, the cache is full again, so C is evicted, and the cache now holds E -> B -> D (most recent first).
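The walkthrough above can be reproduced with a quick sketch. This is an illustrative Map-based trace (not the lru-cache library itself); it relies on the fact that a JavaScript `Map` iterates keys in insertion order, so the first key is always the least recently used:

```javascript
// Trace the sequence A, B, C, D, B (hit), E through an LRU of capacity 3.
const capacity = 3;
const cache = new Map(); // oldest entry first, newest last

function touch(key) {
  if (cache.has(key)) {
    // Cache hit: move the entry to the most-recently-used position
    const value = cache.get(key);
    cache.delete(key);
    cache.set(key, value);
    return;
  }
  if (cache.size >= capacity) {
    // Cache full: evict the least recently used entry
    // (the first key in the Map's insertion order)
    cache.delete(cache.keys().next().value);
  }
  cache.set(key, true);
}

["A", "B", "C", "D", "B", "E"].forEach(touch);

// Most recent first: E -> B -> D, matching step 4 above
console.log([...cache.keys()].reverse().join(" -> ")); // "E -> B -> D"
```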

How to use

The installation

 npm install lru-cache --save

Usage

Basic usage, following the official documentation:

const LRU = require("lru-cache");

const cache = new LRU({
  max: 4, // hold at most 4 entries
  maxAge: 20000, // entries expire after 20 seconds
});

cache.set("key1", "ahwgs");
cache.set("key2", 123123);
cache.set("key3", { name: "ahwgs" });
cache.set("key4", true);
cache.set("key5", true); // cache is full, so the least recently used entry (key1) is evicted

console.log("result", cache.get("key1")); // undefined, evicted when key5 was added
console.log("result", cache.get("key2")); // 123123
console.log("result", cache.get("key3")); // { name: 'ahwgs' }
console.log("result", cache.get("key4")); // true
console.log("result", cache.get("key5")); // true

lru-cache provides a rich set of options and methods; see the official documentation for details.

Source code

Check out github.com/isaacs/node…

Hand-writing a simple LRU cache

class LRUCache {
  constructor(capacity) {
    this.cache = new Map();
    this.capacity = capacity;
  }
  get(key) {
    if (this.cache.has(key)) {
      // On a hit, delete and re-insert so the entry becomes the most recently used
      const temp = this.cache.get(key);
      this.cache.delete(key);
      this.cache.set(key, temp);
      return temp;
    }
    return -1;
  }
  set(key, value) {
    if (this.cache.has(key)) {
      // The key already exists: delete it so re-inserting moves it to the end
      this.cache.delete(key);
    } else if (this.cache.size >= this.capacity) {
      // The cache is full: evict the least recently used entry,
      // which is the first key in the Map's insertion order
      this.cache.delete(this.cache.keys().next().value);
    }
    this.cache.set(key, value);
  }
}

const cache = new LRUCache(4);

cache.set("a", "123");

console.log(cache.get("a")); // '123'

cache.set("a"."123");

console.log(cache.get("a"));
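The implementation above hinges on one property of JavaScript's `Map`: keys iterate in insertion order, so `cache.keys().next().value` is always the oldest (least recently used) entry, and deleting then re-inserting a key moves it to the end. A quick self-contained check of that property:

```javascript
// Map preserves insertion order; delete + set moves a key to the end.
// This is exactly what makes the Map-based LRU above work.
const m = new Map([["a", 1], ["b", 2], ["c", 3]]);

// "a" is the first (oldest) key
console.log(m.keys().next().value); // "a"

// Re-inserting "a" moves it to the most-recently-used position
const v = m.get("a");
m.delete("a");
m.set("a", v);

console.log([...m.keys()]); // [ 'b', 'c', 'a' ]
```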

Application scenarios

1. Limiting concurrent logins: some business scenarios need to cap how many devices an account can be logged in on at once. If Redis is not available, an LRU cache can do this.

2. Application evolution of LruCache in the Meituan DSP system

3. [The implementation principle and caching strategy of Vue keep-alive](www.cnblogs.com/everlose/p/)…
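Scenario 1 above can be sketched in a few lines. This is a hypothetical illustration (the function and variable names are mine, not from any library) that caps each account at three simultaneous device sessions; when a fourth device logs in, the least recently active session is kicked out:

```javascript
// Hypothetical sketch: limit an account to MAX_DEVICES concurrent sessions
// using an in-memory LRU (a plain Map here; lru-cache would behave the same).
const MAX_DEVICES = 3;
const sessions = new Map(); // deviceId -> session info, oldest first

function login(deviceId) {
  if (sessions.has(deviceId)) {
    // Same device again: refresh it to the most-recently-active position
    const info = sessions.get(deviceId);
    sessions.delete(deviceId);
    sessions.set(deviceId, info);
    return;
  }
  if (sessions.size >= MAX_DEVICES) {
    // Too many devices: kick out the least recently active session
    sessions.delete(sessions.keys().next().value);
  }
  sessions.set(deviceId, { loggedInAt: Date.now() });
}

["phone", "laptop", "tablet", "tv"].forEach(login);

// "phone" was kicked out when "tv" logged in
console.log([...sessions.keys()]); // [ 'laptop', 'tablet', 'tv' ]
```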

Disadvantages

Because this kind of LRU cache lives in the memory of a single process, it does not work in a distributed deployment: the cached data cannot be shared or synchronized between different machines.

References

1. Cache eviction algorithms: the LRU algorithm

2. Application evolution of LruCache in the Meituan DSP system