How to implement the LRU cache eviction algorithm
I have recently been reading "The Beauty of Data Structures and Algorithms", so I'm taking this opportunity to practice and write down what I've learned. Let's get right to it.
What is a cache?
Caching is a technique for improving data read performance. It is used everywhere: CPU caches, database caches, browser caches, and so on.
Common cache eviction policies
The size of a cache is limited. When the cache is full, which data should be discarded and which should be retained? That decision is made by the cache eviction policy. There are three common ones:
- First In, First Out (FIFO): the data that entered the cache earliest is evicted first.
- Least Frequently Used (LFU): the data that has been accessed the fewest times is evicted first.
- Least Recently Used (LRU): the data that has gone the longest without being accessed is evicted first.
An analogy may help you remember the three policies. Suppose your room is packed with things bought on last year's Singles' Day that turned out to be useless but were never returned, plus this year's Singles' Day shopping spree. The room is full, so which items do you clear out first: the ones that arrived earliest (FIFO), the ones you use least often (LFU), or the ones you haven't touched in the longest time (LRU)? Our cleanup instincts map neatly onto the three eviction policies.
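To make the difference concrete, here is a small runnable sketch. The access trace, the class name EvictionDemo, and the bookkeeping choices are my own illustrative assumptions, not part of the original article; the sketch replays one sequence of accesses against a cache of capacity 3 and reports which key each policy would evict next:

```java
import java.util.Collections;
import java.util.Comparator;
import java.util.HashMap;
import java.util.LinkedList;
import java.util.Map;

public class EvictionDemo {
    public static void main(String[] args) {
        // A made-up access trace for a cache of capacity 3:
        // insertions of a, b, c plus some repeat accesses
        String[] trace = {"a", "b", "b", "b", "c", "a", "a", "c"};

        LinkedList<String> fifo = new LinkedList<>(); // remembers insertion order only
        LinkedList<String> lru  = new LinkedList<>(); // head = most recently used
        Map<String, Integer> freq = new HashMap<>();  // access counts, for LFU

        for (String key : trace) {
            if (!fifo.contains(key)) {
                fifo.addLast(key);                    // FIFO ignores repeat accesses
            }
            lru.remove(key);                          // LRU moves every accessed key
            lru.addFirst(key);                        // back to the head of the list
            freq.merge(key, 1, Integer::sum);         // LFU counts every access
        }

        // The cache is now full; inserting a fourth key would evict:
        System.out.println("FIFO evicts " + fifo.getFirst()); // a (entered earliest)
        System.out.println("LRU  evicts " + lru.getLast());   // b (idle the longest)
        System.out.println("LFU  evicts " + Collections.min(  // c (fewest accesses)
                freq.keySet(), Comparator.comparingInt(freq::get)));
    }
}
```

On this particular trace each policy picks a different victim, which is exactly why the choice of eviction policy matters: the same workload can keep or lose very different data. With that distinction clear, let's focus on LRU.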
LRU eviction policy: implementation idea
- If the new data is already in the cache, remove it from its current position and put it back at the front, marking it as most recently used.
- If the new data is not in the cache, there are two cases:
  - When the cache is not full, simply put the new data at the front of the cache.
  - When the cache is full, evict the last entry (the least recently used one) first, then put the new data at the front.
From this description we can see that an LRU cache needs frequent insertions and deletions. Which data structure is good at inserting and deleting? The answer is clear: a linked list. Now let's combine the linked list with the idea above and translate it into concrete code.
LRU eviction policy: code implementation
```java
import java.util.LinkedList;

/**
 * LRU cache eviction algorithm.
 *
 * @author xuls
 */
public class LRU<T> {

    private LinkedList<T> list;   // head = most recently used, tail = least recently used
    private int capacity;         // cache capacity

    public LRU() {
        this(10);
    }

    public LRU(int capacity) {
        this.capacity = capacity;
        list = new LinkedList<>();
    }

    public void add(T data) {
        if (list.contains(data)) {
            // The data is already cached: remove it so it can move back to the head
            list.remove(data);
        } else if (list.size() >= capacity) {
            // The cache is full and the data is new: evict the tail,
            // which is the least recently used entry
            list.removeLast();
        }
        // Put the (new or re-accessed) data at the head to mark it most recently used
        list.addFirst(data);
    }

    @Override
    public String toString() {
        return "LRU{" + "list=" + list + '}';
    }

    public static void main(String[] args) {
        LRU<String> stringLRU = new LRU<>();
        for (int i = 0; i < 10; i++) {
            stringLRU.add("s" + i);   // fill the cache: s9 ends up at the head
        }
        System.out.println(stringLRU);
        stringLRU.add("s1");          // already cached: s1 moves to the head
        System.out.println(stringLRU);
        stringLRU.add("s10");         // cache is full: the tail (s0) is evicted
        System.out.println(stringLRU);
    }
}
```