LruCache: Android memory optimization

What is a LruCache? What is the implementation principle of LruCache?

In fact, these two questions can be answered as one: once you know what an LruCache is, you also know how it is implemented. LRU stands for Least Recently Used, so we can infer the principle behind LruCache: when the cache is full, remove the least recently used entries and keep the most recently used ones. How is this implemented in code? Let's look at the source code.

Let's first look at the member variables:

1) Resetting the maximum cache size
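The snippets are easiest to follow with a concrete model in mind. Below is a hypothetical, trimmed-down sketch of the state LruCache keeps and a resize()-style method; the real android.util.LruCache additionally tracks hit/miss counters and measures entry sizes with sizeOf(), which this sketch replaces with a simple entry count.

```java
import java.util.LinkedHashMap;

// Hypothetical simplified sketch, not the full android.util.LruCache source.
class MiniLruCache<K, V> {
    private final LinkedHashMap<K, V> map; // accessOrder=true: iteration order = LRU order
    private int size;                      // current size (here: number of entries)
    private int maxSize;                   // capacity limit

    MiniLruCache(int maxSize) {
        if (maxSize <= 0) throw new IllegalArgumentException("maxSize <= 0");
        this.maxSize = maxSize;
        this.map = new LinkedHashMap<>(0, 0.75f, true);
    }

    // Reset the maximum cache size, then evict until the cache fits again.
    synchronized void resize(int maxSize) {
        if (maxSize <= 0) throw new IllegalArgumentException("maxSize <= 0");
        this.maxSize = maxSize;
        trimToSize(maxSize);
    }

    synchronized void put(K key, V value) {
        if (map.put(key, value) == null) size++; // only count genuinely new entries
        trimToSize(maxSize);
    }

    private void trimToSize(int maxSize) {
        while (size > maxSize) {
            K eldest = map.keySet().iterator().next(); // head = least recently used
            map.remove(eldest);
            size--;
        }
    }

    synchronized int size() { return size; }
    synchronized int maxSize() { return maxSize; }
}
```

Calling resize() with a smaller value immediately evicts entries from the head of the map until the cache fits under the new limit.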

2) Put method

As you can see, the put() method itself is straightforward. The important part is the call to trimToSize() after the new entry has been added: it checks whether the cache is full and, if so, removes the least recently used entries.
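The put-then-trim flow can be sketched with a plain access-ordered LinkedHashMap (the class and the capacity of 2 here are illustrative, not part of the real API):

```java
import java.util.LinkedHashMap;

// Hypothetical sketch of the put() flow: insert into the access-ordered map,
// then trim if the cache is now over capacity.
class PutDemo {
    static final int MAX_SIZE = 2; // illustrative capacity, counted in entries
    static final LinkedHashMap<String, Integer> map = new LinkedHashMap<>(0, 0.75f, true);

    static void put(String key, Integer value) {
        map.put(key, value);  // a new or updated entry goes to the tail
        trimToSize(MAX_SIZE); // evict if the cache is now full
    }

    static void trimToSize(int maxSize) {
        while (map.size() > maxSize) {
            // head of the iteration order = least recently used entry
            String eldest = map.keySet().iterator().next();
            map.remove(eldest);
        }
    }
}
```

Putting a third entry into this two-entry cache silently evicts the oldest one from the head of the map.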

3) Get the value from the cache

  1. The get() method in LinkedHashMap

This shows that LruCache maintains a LinkedHashMap internally, and that the LinkedHashMap is sorted by access order. When put() is called, the element is added to the collection and trimToSize() is called to check whether the cache is full; if it is, the LinkedHashMap's iterator is used to remove the head element, which is the least recently accessed one. When get() is called to read a cached object, the LinkedHashMap's get() method returns the corresponding element and moves it to the tail of the queue.
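The access-order behavior described above can be observed with a plain LinkedHashMap constructed with accessOrder=true: get() moves the hit entry to the tail, so the head is always the least recently used entry (the demo class is hypothetical):

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Demonstrates LinkedHashMap's access-order mode, the mechanism LruCache relies on.
class AccessOrderDemo {
    static List<String> keysAfterAccess() {
        // Third constructor argument accessOrder=true switches from
        // insertion order to access order.
        Map<String, Integer> map = new LinkedHashMap<>(0, 0.75f, true);
        map.put("a", 1);
        map.put("b", 2);
        map.put("c", 3);
        map.get("a"); // "a" is now the most recently used, so it moves to the tail
        return new ArrayList<>(map.keySet()); // iteration runs head (LRU) -> tail (MRU)
    }
}
```

After accessing "a", iterating the keys yields b, c, a: the untouched entries drift toward the head, where eviction happens.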

  2. Checking the size boundary

The trimToSize() method keeps removing the head element of the LinkedHashMap, i.e., the least recently accessed one, until the cache size drops below the maximum. When LruCache's get() method retrieves a cached object, that counts as one access to the element, and the queue is reordered so that it stays sorted by access order. This update happens inside LinkedHashMap's get() method.
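The interaction between get() refreshing an entry and trimToSize() evicting from the head can be sketched in one self-contained method (the class name and fixed data are illustrative):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of the trimToSize() loop: keep removing the head entry
// (least recently accessed) until the cache fits under maxSize.
class TrimDemo {
    static Map<String, Integer> trimmed(int maxSize) {
        Map<String, Integer> map = new LinkedHashMap<>(0, 0.75f, true);
        map.put("a", 1);
        map.put("b", 2);
        map.put("c", 3);
        map.get("a"); // refreshing "a" moves it to the tail, so it survives the trim
        while (map.size() > maxSize) {
            String eldest = map.keySet().iterator().next(); // current head
            map.remove(eldest);
        }
        return map;
    }
}
```

Trimming this three-entry map down to two evicts "b" (the head after "a" was refreshed), while the recently accessed "a" and the newest entry "c" survive.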