LruCache's purpose is apparent from its name. It is a common caching strategy and plays an important role in everyday development; for example, Glide's Engine class uses an LRU cache together with SoftReference to cache images, reducing traffic overhead and improving image loading efficiency. Android introduced android.util.LruCache in API 12 and modified it in API 22, and the support library also provides android.support.v4.util.LruCache. What we analyze here is the LruCache from the support package.


What is the LruCache algorithm?

LRU (Least Recently Used) is the least-recently-used algorithm. LruCache maintains a LinkedHashMap internally; when data is put in, it checks whether the configured maximum size has been exceeded, and if so it evicts entries according to the least-recently-used policy. Why use a LinkedHashMap for storage? Because LinkedHashMap stores its data in a hash table plus a doubly linked list, and when it is created in access order, every get moves the accessed entry to the tail of the list. Over time, the entry at the head of the list is naturally the least recently used one.
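
As a quick illustration of the eviction order, here is a minimal sketch (assuming the support-library LruCache and the default sizeOf, so maxSize counts entries): the entry that has gone longest without being accessed is the one that gets evicted first.

import android.support.v4.util.LruCache;

// A tiny cache that holds at most 2 entries (the default sizeOf returns 1 per entry)
LruCache<String, String> cache = new LruCache<>(2);

cache.put("A", "alpha");
cache.put("B", "beta");
cache.get("A");            // touching "A" makes "B" the least recently used entry
cache.put("C", "gamma");   // over the limit: "B" is evicted

// cache.get("B") == null, while "A" and "C" are still cached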


How to use LruCache?


Initialization

In general, we use one eighth of the maximum runtime memory as the cache size and override the sizeOf method. Note that sizeOf must return sizes in the same unit as the maxSize passed to the constructor (kilobytes in the example below).

int maxMemory = (int) (Runtime.getRuntime().maxMemory() / 1024);
LruCache<String, Bitmap> cache = new LruCache<String, Bitmap>(maxMemory / 8) {
    @Override
    protected int sizeOf(@NonNull String key, @NonNull Bitmap value) {
        // Measure each entry in KB, matching the unit of maxMemory / 8
        return value.getRowBytes() * value.getHeight() / 1024;
    }
};
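
With the cache in place, a typical read path looks like the sketch below: check the cache first and only decode on a miss. The key and the decodeBitmap helper are made up for illustration.

// Hypothetical key and loader, just to show the read-through pattern
String key = "https://example.com/avatar.png";

Bitmap bitmap = cache.get(key);          // null on a cache miss
if (bitmap == null) {
    bitmap = decodeBitmap(key);          // assumed helper that decodes/downloads the image
    if (bitmap != null) {
        cache.put(key, bitmap);          // cache it for the next lookup
    }
}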


API

Public methods
final int createCount(): Returns the number of times create(Object) returned a value.
final void evictAll(): Clears the cache, calling entryRemoved(boolean, K, V, V) on each removed entry.
final int evictionCount(): Returns the number of values that have been evicted.
final V get(K key): Returns the value for key if it exists in the cache or can be created by create(K).
final int hitCount(): Returns the number of times get(K) returned a value that was already present in the cache.
final int maxSize(): For caches that do not override sizeOf(K, V), returns the maximum number of entries in the cache.
final int missCount(): Returns the number of times get(K) returned null or required a new value to be created.
final V put(K key, V value): Caches value for key.
final int putCount(): Returns the number of times put(K, V) was called.
final V remove(K key): Removes the entry for key if it exists.
void resize(int maxSize): Sets the size of the cache.
final int size(): For caches that do not override sizeOf(K, V), returns the number of entries in the cache.
final Map<K, V> snapshot(): Returns a copy of the current contents of the cache, ordered from least recently accessed to most recently accessed.
final String toString()
void trimToSize(int maxSize): Removes the eldest entries until the total of remaining entries is at or below the requested size.

A short sketch exercising a few of these methods follows.
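
This sketch assumes the default sizeOf, so maxSize is an entry count; the counter values in the comments follow from the calls shown.

LruCache<String, Integer> cache = new LruCache<>(2);

cache.put("a", 1);
cache.put("b", 2);
cache.get("a");        // hit
cache.get("missing");  // miss
cache.put("c", 3);     // evicts "b", the least recently used entry

// hitCount() == 1, missCount() == 1, putCount() == 3, evictionCount() == 1
// snapshot() iterates from least to most recently accessed: {a=1, c=3}
// toString() -> "LruCache[maxSize=2,hits=1,misses=1,hitRate=50%]"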


LruCache source code analysis

Let’s start with the constructor:

The constructor

    public LruCache(int maxSize) {
        if (maxSize <= 0) {
            throw new IllegalArgumentException("maxSize <= 0");
        } else {
            this.maxSize = maxSize;
            this.map = new LinkedHashMap<K, V>(0, 0.75f, true);
        }
    }

The constructor does two things. First, it checks that maxSize is greater than 0. Second, it initializes maxSize and the LinkedHashMap; note the third constructor argument, true, which puts the LinkedHashMap into access order. There's nothing else to it, so let's move on.
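
A plain-Java sketch of what that access-order flag means: after a get, the accessed entry sits at the tail, and the head is the least recently used entry.

import java.util.LinkedHashMap;
import java.util.Map;

Map<String, String> map = new LinkedHashMap<>(0, 0.75f, true);  // accessOrder = true
map.put("A", "1");
map.put("B", "2");
map.put("C", "3");

map.get("A");   // accessing "A" moves it to the tail of the internal linked list

// Iteration order is now B, C, A: the head ("B") is the least recently used entry,
// which is exactly the entry LruCache.trimToSize() evicts first.
System.out.println(map.keySet());   // [B, C, A]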

safeSizeOf (measure element size)

    private int safeSizeOf(K key, V value) {
        int result = this.sizeOf(key, value);
        if (result < 0) {   // a negative size is invalid
            throw new IllegalStateException("Negative size: " + key + "=" + value);
        } else {
            return result;
        }
    }

sizeOf (measure the size of an element)

This method is normally overridden. The default implementation returns 1 for every entry, so without an override maxSize is interpreted as a maximum number of entries rather than a memory size.

    protected int sizeOf(@NonNull K key, @NonNull V value) {
        return 1;   // default: every entry counts as 1
    }
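For example, a sketch of a String cache measured in kilobytes rather than entries (the unit just has to match whatever maxSize was given in the constructor); the 2-bytes-per-char estimate is an assumption for illustration:

// Cache up to 512 KB of strings; sizeOf returns each value's size in KB
LruCache<String, String> stringCache = new LruCache<String, String>(512) {
    @Override
    protected int sizeOf(@NonNull String key, @NonNull String value) {
        // Roughly 2 bytes per char; never return 0 so small entries still count
        return Math.max(1, value.length() * 2 / 1024);
    }
};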

Put method (add elements)

@Nullable
    public final V put(@NonNull K key, @NonNull V value) {
        if (key != null && value != null) {
            V previous;
            synchronized (this) {
                ++this.putCount;                            // number of put() calls
                this.size += this.safeSizeOf(key, value);   // add the size of the new value
                previous = this.map.put(key, value);        // store it in the LinkedHashMap
                if (previous != null) {                     // the key was already present: subtract the old value's size
                    this.size -= this.safeSizeOf(key, previous);
                }
            }

            if (previous != null) {
                this.entryRemoved(false, key, previous, value);
            }

            this.trimToSize(this.maxSize);                  // check whether the cache is now over its limit
            return previous;
        } else {
            throw new NullPointerException("key == null || value == null");
        }
    }

Inside a synchronized block the value is inserted into the LinkedHashMap and the bookkeeping fields are updated. Looking further down, the trimToSize call at the end seems worth a closer look.

trimToSize (check whether the cache is over its limit)

public void trimToSize(int maxSize) {
        while (true) {   // keep looping until the cache fits within maxSize
            K key;
            V value;
            synchronized (this) {
                if (this.size < 0 || (this.map.isEmpty() && this.size != 0)) {
                    // The bookkeeping is inconsistent, which means sizeOf() is misbehaving
                    throw new IllegalStateException(this.getClass().getName() + ".sizeOf() is reporting inconsistent results!");
                }

                if (this.size <= maxSize || this.map.isEmpty()) {   // already within the limit, nothing to do
                    return;
                }

                // Otherwise evict the least recently used entry, i.e. the head of the LinkedHashMap
                Map.Entry<K, V> toEvict = this.map.entrySet().iterator().next();
                key = toEvict.getKey();
                value = toEvict.getValue();
                this.map.remove(key);
                this.size -= this.safeSizeOf(key, value);
                ++this.evictionCount;   // eviction count +1
            }

            this.entryRemoved(true, key, value, null);
        }
    }

The trimToSize method checks whether the cache has grown past its limit and, if so, loops to evict the least recently used entries one by one until the cache fits again.
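
Since trimToSize (and also put and remove) notify entryRemoved for each entry that leaves the cache, a subclass can hook in there to react to evictions. A hedged sketch for a bitmap cache, reusing the maxMemory value computed earlier; the logging is purely illustrative:

LruCache<String, Bitmap> bitmapCache = new LruCache<String, Bitmap>(maxMemory / 8) {
    @Override
    protected int sizeOf(@NonNull String key, @NonNull Bitmap value) {
        return value.getRowBytes() * value.getHeight() / 1024;
    }

    @Override
    protected void entryRemoved(boolean evicted, String key, Bitmap oldValue, Bitmap newValue) {
        // evicted == true when trimToSize() removed the entry, false for put()/remove()
        if (evicted) {
            Log.d("BitmapCache", "evicted " + key);
        }
    }
};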

Get method (get elements)

@Nullable
    public final V get(@NonNull K key) {
        if (key == null) {
            throw new NullPointerException("key == null");
        } else {
            V mapValue;
            synchronized (this) {
                mapValue = this.map.get(key);
                if (mapValue != null) {
                    ++this.hitCount;    // hit count +1, return the cached value
                    return mapValue;
                }
                ++this.missCount;       // miss count +1
            }

            /*
             * Not in the cache: try to build the value with create().
             * create() must be overridden by the subclass; the default implementation returns null.
             */
            V createdValue = this.create(key);
            if (createdValue == null) {
                return null;
            } else {
                synchronized (this) {
                    // Put the newly created value into the map, with the same bookkeeping as put()
                    ++this.createCount;
                    mapValue = this.map.put(key, createdValue);
                    if (mapValue != null) {
                        // Another thread added a value while create() was running; keep that one
                        this.map.put(key, mapValue);
                    } else {
                        this.size += this.safeSizeOf(key, createdValue);
                    }
                }

                if (mapValue != null) {
                    this.entryRemoved(false, key, createdValue, mapValue);
                    return mapValue;
                } else {
                    this.trimToSize(this.maxSize);   // adding the created value may push the cache over its limit
                    return createdValue;
                }
            }
        }
    }

Create method (try to create an object)

    @Nullable
    protected V create(@NonNull K key) {
        return null;   // override this to create a value on a cache miss; the default returns null
    }
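A hedged sketch of overriding create(), so that a miss in get() computes a default value instead of returning null; the string-length cache is made up for illustration:

// A cache that computes and stores the value on a miss
LruCache<String, Integer> lengthCache = new LruCache<String, Integer>(100) {
    @Override
    protected Integer create(@NonNull String key) {
        // Called by get() when the key is absent; the result is put into the cache
        return key.length();
    }
};

int n = lengthCache.get("hello");   // miss -> create("hello") -> 5 is cached and returned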

The get and create methods are already commented in the code, so once again the logic is not complicated. But we do need to pay attention to the map's get method: since LinkedHashMap is what makes the LRU ordering possible, its internals must be doing something interesting.

Get method of LinkedHashMap

public V get(Object key) {
        Node<K,V> e;
        if ((e = getNode(hash(key), key)) == null)
            return null;
        if (accessOrder)
            afterNodeAccess(e);   // in access-order mode, move the node to the tail
        return e.value;
    }

LinkedHashMap first looks up the node and returns null if it is not found. If it is found and the map was created in access order, the afterNodeAccess method is called.

AfterNodeAccess method of LinkedHashMap

void afterNodeAccess(Node<K,V> e) { // move node to last
        LinkedHashMapEntry<K,V> last;
        // Only re-link when accessOrder is true and e is not already the tail
        if (accessOrder && (last = tail) != e) {
            LinkedHashMapEntry<K,V> p = (LinkedHashMapEntry<K,V>) e, b = p.before, a = p.after;
            // Unlink p from its current position...
            p.after = null;
            if (b == null)
                head = a;
            else
                b.after = a;
            if (a != null)
                a.before = b;
            else
                last = b;
            // ...then append p to the tail of the doubly linked list
            if (last == null)
                head = p;
            else {
                p.before = last;
                last.after = p;
            }
            tail = p;
            ++modCount;
        }
    }

So that's it! In this method LinkedHashMap moves the accessed node to the tail of its doubly linked list, which is exactly why LruCache uses LinkedHashMap as its underlying data structure.

Now that we’ve covered the main methods, let’s look at other methods.

Remove (remove elements)

    @Nullable
    public final V remove(@NonNull K key) {
        if (key == null) {   // null check
            throw new NullPointerException("key == null");
        } else {
            V previous;
            synchronized (this) {
                previous = this.map.remove(key);   // remove the entry for this key
                if (previous != null) {
                    this.size -= this.safeSizeOf(key, previous);   // subtract the removed value's size
                }
            }

            if (previous != null) {
                this.entryRemoved(false, key, previous, null);
            }

            return previous;
        }
    }

EvictAll method (remove all elements)

    public final void evictAll() {
        this.trimToSize(-1);   // passing -1 forces every entry to be evicted
    }

Other methods

    public final synchronized int size() {
        return this.size;               // current cached size
    }

    public final synchronized int maxSize() {
        return this.maxSize;            // maximum size
    }

    public final synchronized int hitCount() {
        return this.hitCount;           // number of cache hits
    }

    public final synchronized int missCount() {
        return this.missCount;          // number of cache misses
    }

    public final synchronized int createCount() {
        return this.createCount;        // number of values created by create()
    }

    public final synchronized int putCount() {
        return this.putCount;           // number of put() calls
    }

    public final synchronized int evictionCount() {
        return this.evictionCount;      // number of evicted entries
    }

    public final synchronized Map<K, V> snapshot() {
        return new LinkedHashMap<K, V>(this.map);   // copy of the current contents
    }

    public final synchronized String toString() {
        int accesses = this.hitCount + this.missCount;
        int hitPercent = accesses != 0 ? 100 * this.hitCount / accesses : 0;
        return String.format(Locale.US, "LruCache[maxSize=%d,hits=%d,misses=%d,hitRate=%d%%]",
                this.maxSize, this.hitCount, this.missCount, hitPercent);
    }
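
Since resize() and trimToSize() are public, one common pattern is to shrink the cache when the system signals memory pressure. A sketch assuming the cache is a field reachable from an Application or Activity (both implement ComponentCallbacks2):

@Override
public void onTrimMemory(int level) {
    super.onTrimMemory(level);
    if (level >= ComponentCallbacks2.TRIM_MEMORY_MODERATE) {
        cache.evictAll();                        // under heavy pressure, drop everything
    } else if (level >= ComponentCallbacks2.TRIM_MEMORY_BACKGROUND) {
        cache.trimToSize(cache.size() / 2);      // otherwise keep only half of the current contents
    }
}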

In closing

I believe that with this article you now have a clear picture of how LruCache works. If anything here is wrong, please point it out. There is no end to learning, so let's keep at it.