This is the 18th day of my participation in the August Challenge

Today let's talk about Redis's eviction policies. When the Redis cache fills up, that is, when memory usage reaches the configured limit, Redis evicts data according to the configured eviction policy. Since Redis 4.0 there have been eight eviction policies. Let's look at each of them.

1. noeviction: no data is evicted. This is Redis's default configuration. When the cache is full and a write request comes in, Redis no longer serves the write and returns an error. 2. volatile-random: when the cache is full, randomly delete key-value pairs that have an expiration time set. 3. volatile-ttl: when the cache is full, delete key-value pairs that have an expiration time, in order of expiration time: the sooner a key expires, the sooner it is deleted. 4. volatile-lru: when the cache is full, use the LRU (least recently used) algorithm to evict key-value pairs that have an expiration time. 5. volatile-lfu: when the cache is full, use the LFU (least frequently used) algorithm to evict key-value pairs that have an expiration time. 6. allkeys-random: when the cache is full, randomly select and delete data from among all key-value pairs. 7. allkeys-lru: when the cache is full, use the LRU algorithm to select and evict from all data. 8. allkeys-lfu: when the cache is full, use the LFU algorithm to select and evict from all data.
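To make the LRU idea concrete, here is a minimal sketch in plain Python of an LRU cache: when the cache is full, the least recently used key is discarded first. Note that real Redis does not keep an exact recency list like this; it approximates LRU by sampling a few keys, so this is only an illustration of the principle.

```python
from collections import OrderedDict

class LRUCache:
    """Minimal sketch of LRU eviction: when the cache is full,
    the least recently used key is discarded first."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()  # order of entries tracks recency

    def get(self, key):
        if key not in self.data:
            return None
        self.data.move_to_end(key)  # mark as most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # evict least recently used

cache = LRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")      # "a" becomes the most recently used key
cache.put("c", 3)   # cache is full: "b", the least recently used, is evicted
print(cache.get("b"))  # None
print(cache.get("a"))  # 1
```

LFU works analogously, except the eviction victim is chosen by access frequency rather than recency.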

In daily use, configure the policy that matches your data's access pattern. Here are three suggestions.
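For reference, the policy is selected in redis.conf via the `maxmemory-policy` directive; eviction only kicks in once `maxmemory` is set to a non-zero limit. The values below are just an illustrative example, not recommended settings:

```
# Cap memory usage; eviction is triggered only once this limit is reached
maxmemory 100mb

# Evict the least recently used key from the whole keyspace
maxmemory-policy allkeys-lru
```

The same setting can be changed at runtime with `CONFIG SET maxmemory-policy allkeys-lru`.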

1. Prefer the allkeys-lru policy. The LRU algorithm evicts the least recently used data and keeps recently accessed data in the cache, which improves application performance. If your data has a clear hot/cold distinction, allkeys-lru is the recommended choice. 2. If your data is accessed with roughly uniform frequency, with no hot/cold distinction, use allkeys-random and let eviction victims be chosen at random. 3. If some of your data must stay pinned, such as top news, choose volatile-lru and set no expiration time on the pinned data. The pinned data is then never evicted, while all other data, which carries an expiration time, is evicted by the LRU algorithm.
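Suggestion 3 can be illustrated with a toy model in plain Python (this is a sketch of volatile-lru semantics, not how Redis is implemented): only keys that have a TTL are eviction candidates, so a pinned key with no TTL survives even when the cache is full.

```python
from collections import OrderedDict

class VolatileLRUCache:
    """Toy model of volatile-lru: evict the least recently used key
    *among keys that have a TTL*; keys without a TTL are never evicted."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()  # key -> value, ordered by recency
        self.has_ttl = {}          # key -> whether an expiration is set

    def put(self, key, value, ttl=None):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        self.has_ttl[key] = ttl is not None
        if len(self.data) > self.capacity:
            self._evict()

    def _evict(self):
        # Pick the least recently used key that has a TTL; pinned keys are skipped.
        victim = next((k for k in self.data if self.has_ttl[k]), None)
        if victim is not None:
            del self.data[victim]
            del self.has_ttl[victim]
        # If no key is evictable, a real Redis would reject the write instead.

cache = VolatileLRUCache(2)
cache.put("top_news", "pinned headline")   # no TTL: never evicted
cache.put("story:1", "old story", ttl=60)
cache.put("story:2", "new story", ttl=60)  # cache full: "story:1" is evicted
print("top_news" in cache.data)  # True
print("story:1" in cache.data)   # False
```

The key names and TTL values here are made up for the example; the point is that the pinned key stays in the cache no matter how much other data flows through.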
