Preface
Caching can be seen everywhere in projects today, but cache memory is still a scarce resource, nowhere near as plentiful as disk, so using it efficiently, avoiding waste, and storing hot data promptly is very important. Let's take a look at the famous LRU and LFU algorithms and how they are implemented in code.
What is LRU?
LRU (Least Recently Used) is an algorithm that reorders a linked list according to access time to implement cache eviction. Frequently used data sits at the head of the list and cold data at the tail; once the list capacity is insufficient (the cache is full), the tail entry is evicted. In general, linked lists are fast at insertion and deletion but slow at lookup. To fetch elements quickly, LRU certainly cannot scan them one by one, so the list is paired with a HashMap, giving O(1) access.
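The list-plus-HashMap design described above is exactly what the JDK's `LinkedHashMap` already provides in access-order mode, so a minimal LRU sketch can be very short (the `LruCache` class name and capacity here are my own illustration, not part of the article's implementation):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Minimal LRU cache sketch on top of LinkedHashMap's access-order mode
public class LruCache<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    public LruCache(int capacity) {
        // accessOrder = true: each get() moves the entry to the tail,
        // so the head is always the least recently used entry
        super(capacity, 0.75f, true);
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        // Evict the least recently used entry once capacity is exceeded
        return size() > capacity;
    }

    public static void main(String[] args) {
        LruCache<Integer, String> cache = new LruCache<>(3);
        cache.put(1, "A");
        cache.put(2, "B");
        cache.put(3, "C");
        cache.get(1);      // touch 1 so it becomes "hot"
        cache.put(4, "D"); // evicts 2, the least recently used
        System.out.println(cache.keySet()); // [3, 1, 4]
    }
}
```

Here `removeEldestEntry` is the standard JDK hook for bounded caches; the hand-rolled list in the article's LFU code below is needed only because LFU must also track counts.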
What is LFU?
LFU (Least Frequently Used) evicts the data that has been used least often over a period of time. It is a caching algorithm for managing memory: it records and tracks how many times each block is used, and when the cache is full and more space is needed, the block with the lowest usage count is cleared. The simplest LFU implementation assigns a counter to each block loaded into the cache; each time the block is referenced, the counter is incremented by one. When the cache reaches capacity and a new block is waiting to be inserted, the system searches for the block with the lowest counter and removes it. Redis supports LFU as an eviction policy when it is enabled.
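For reference, enabling that LFU eviction in Redis is a `redis.conf` setting; the option names below exist in Redis 4.0 and later, while the values are purely illustrative:

```conf
maxmemory 100mb
# Evict the least frequently used keys across the whole keyspace
maxmemory-policy allkeys-lfu
# Optional tuning of Redis's probabilistic LFU counter (defaults shown)
lfu-log-factor 10
lfu-decay-time 1
```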
Comparison of LRU and LFU
In fact, LFU builds on LRU by also incrementing a counter on every put and get. Elements are inserted at the head of the list, each access is counted, and the list is reordered by access count, falling back to access time when the counts are equal; obsolete data is then selected from the tail of the list.
LRU and LFU have different emphases: LRU is based on when an element was last used, while LFU is based on how often it is used. The drawback of LFU is that a cache entry accessed very frequently over a short period is immediately promoted to hot data, guaranteed not to be evicted, and thus remains in memory. In reality, such data may only be accessed briefly at high frequency and then not at all for a long time; the transient burst inflates its reference count, while some newly added entries are easily evicted because their reference counts are still low.
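That drawback can be seen in a tiny simulation (the keys, counts, and plain-HashMap bookkeeping here are hypothetical, separate from the implementation below):

```java
import java.util.Collections;
import java.util.HashMap;
import java.util.Map;

// Demonstrates LFU's weakness: a short early burst makes a stale
// entry "hot" forever, so a fresh entry gets evicted first.
public class LfuDrawbackDemo {
    public static void main(String[] args) {
        // key -> access count; "old" gets a burst of 100 hits early on
        Map<String, Integer> counts = new HashMap<>();
        for (int i = 0; i < 100; i++) {
            counts.merge("old", 1, Integer::sum);
        }
        // "old" is never touched again, while "fresh" is in use right now
        counts.merge("fresh", 1, Integer::sum);
        // Pure LFU evicts the key with the smallest count: the fresh one
        String victim = Collections.min(counts.entrySet(),
                Map.Entry.comparingByValue()).getKey();
        System.out.println(victim); // fresh
    }
}
```

Schemes like Redis's LFU address this by decaying counters over time rather than letting them grow forever.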
LFU code implementation (once you understand LFU, LRU follows naturally)
1. The LFU class
package com.cloud.order.util;
import java.util.ArrayList;
import java.util.Collection;
import java.util.Collections;
import java.util.Comparator;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
public class LFU<K, V> {
/** * Capacity */
private int capacity;
private Map<K, Node> caches;
/**
 * Add an element
 *
 * @param key
 * @param value
 */
public void put(K key, V value) {
Node node = caches.get(key);
if (node == null) {
if (capacity == caches.size()) {
// Remove inactive data
K leastKey = removeLeastCount();
caches.remove(leastKey);
}
node = new Node(key, value, System.nanoTime(), 1);
caches.put(key, node);
} else {
node.value = value;
node.setCount(node.getCount() + 1);
node.setTime(System.nanoTime());
}
sort();
}
/**
 * Gets the element
 *
 * @param key
 * @return
 */
public V get(K key) {
Node node = caches.get(key);
if (node != null) {
node.setCount(node.getCount() + 1);
node.setTime(System.nanoTime());
sort();
return (V) node.value;
}
return null;
}
/** * Sort by count (then time), most used first */
private void sort() {
List<Map.Entry<K, Node>> list = new ArrayList<>(caches.entrySet());
Collections.sort(list, new Comparator<Map.Entry<K, Node>>() {
@Override
public int compare(Map.Entry<K, Node> o1, Map.Entry<K, Node> o2) {
// Calls the compareTo method overridden by Node
return o2.getValue().compareTo(o1.getValue());
}
});
caches.clear();
for (Map.Entry<K, Node> kNodeEntry : list) {
caches.put(kNodeEntry.getKey(), kNodeEntry.getValue());
}
}
/**
 * Removes the element with the smallest count (or oldest time on a tie)
 *
 * @return the key of the removed element
 */
private K removeLeastCount() {
Collection<Node> values = caches.values();
// The min method calls the compareTo method overridden by Node
Node min = Collections.min(values);
return (K) min.getKey();
}
public LFU() {
}

public LFU(int size) {
this.capacity = size;
caches = new LinkedHashMap<>(size);
}
public int getCapacity() {
return capacity;
}
public void setCapacity(int capacity) {
this.capacity = capacity;
}
public Map<K, Node> getCaches() {
return caches;
}
public void setCaches(Map<K, Node> caches) {
this.caches = caches;
}
}
2. The Node class
package com.cloud.order.util;
public class Node implements Comparable<Node> {
Object key;
Object value;
long time;
int count;
@Override
public int compareTo(Node o) {
// When the counts are equal, compare by time
int compare = Integer.compare(this.count, o.count);
if (compare == 0) {
return Long.compare(this.time, o.time);
}
return compare;
}
public Node() {
}

public Node(Object key, Object value, long time, int count) {
this.key = key;
this.value = value;
this.time = time;
this.count = count;
}
public Object getKey() {
return key;
}
public void setKey(Object key) {
this.key = key;
}
public Object getValue() {
return value;
}
public void setValue(Object value) {
this.value = value;
}
public long getTime() {
return time;
}
public void setTime(long time) {
this.time = time;
}
public int getCount() {
return count;
}
public void setCount(int count) {
this.count = count;
}
}
3. Test
package com.cloud.order.util;
import java.util.Map;
public class TestLru {
public static void main(String[] args) {
LFU<Integer, String> lfuList = new LFU<>(5);
lfuList.put(1, "A");
lfuList.put(2, "B");
lfuList.put(3, "C");
lfuList.put(4, "D");
lfuList.put(5, "E");
lfuList.put(6, "F");
Map<Integer, Node> caches = (Map<Integer, Node>) lfuList.getCaches();
// Prints F, E, D, C, B: key 1 (count 1, oldest time) was evicted when 6 was put
for (Map.Entry<Integer, Node> nodeEntry : caches.entrySet()) {
System.out.println(nodeEntry.getValue().value);
}
}
}
Written at the end: thanks for the likes!
Welcome to follow my WeChat official account [Village of the Apes]! Search [codeYuanzhicunup] on WeChat to add me for further discussion of Java interviews and learning. If you have related technical questions, feel free to leave a message. The account is mainly used for technology sharing: frequently asked interview question analysis, source code interpretation, microservice frameworks, technical hot spots, and more.