1. Cache processing flow
1.1 On a foreground GET request, the backend service reads the data from the cache
1.2 On a foreground POST/PUT request that creates or modifies data, the backend updates the cache accordingly
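The two flows above can be sketched in a few lines. This is a minimal illustration, not the article's actual implementation: `cache` and `db` are plain dicts standing in for Redis and the database, and `handle_get`/`handle_post` are hypothetical names.

```python
cache = {}   # stands in for Redis (hypothetical)
db = {}      # stands in for the relational database (hypothetical)

def handle_get(key):
    """GET: serve from cache; on a miss, load from the DB and backfill."""
    if key in cache:
        return cache[key]
    value = db.get(key)
    if value is not None:
        cache[key] = value        # backfill so the next GET is a cache hit
    return value

def handle_post(key, value):
    """POST/PUT: write to the DB, then refresh the cache."""
    db[key] = value
    cache[key] = value            # write-through; deleting the key also works
```

Invalidating (deleting) the cached key instead of writing through is an equally common choice; the next GET then rebuilds it from the database.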
2. Cache penetration, cache breakdown, cache avalanche
2.1 Cache Penetration
- Cache penetration refers to requests for data that exists in neither the cache nor the database. Because the cache cannot intercept such requests, every repeated request from the user goes straight to the database, which can hurt database performance.
- Diagram (omitted)
- Solutions (1. Cache the key with a null value so that repeated misses are served from the cache. 2. Intercept invalid user requests upstream, before they reach the cache layer.)
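Solution 1 can be sketched as follows. This is an illustrative sketch, assuming a dict-based cache and database; `get_with_null_caching` and the TTL values are hypothetical, and a short TTL on the cached null lets real data appear later.

```python
import time

cache = {}                    # stands in for Redis: key -> (value, expires_at)
db = {'user:1': 'Alice'}      # hypothetical stored data

NULL_TTL = 60                 # short TTL for cached "misses"
DATA_TTL = 3600               # normal TTL for real data

def get_with_null_caching(key):
    """Cache key -> None for data absent from the DB, so repeated
    requests for nonexistent keys stop reaching the database."""
    entry = cache.get(key)
    if entry is not None:
        value, expires_at = entry
        if time.time() < expires_at:
            return value               # may be None: a cached miss
    value = db.get(key)                # DB is hit at most once per TTL window
    ttl = NULL_TTL if value is None else DATA_TTL
    cache[key] = (value, time.time() + ttl)
    return value
```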
2.2 Cache Breakdown
- Cache breakdown refers to data that is absent from the cache but present in the database (usually because the cached entry has expired). When very many users read the same data at the same moment, database load spikes instantly.
- Diagram (omitted)
- Solutions (1. Set hotspot data never to expire: Redis's eviction mechanism does not touch keys with no expiry set, so such cache entries remain valid long-term. 2. Use a lock plus a task queue so that only one request at a time operates on the data, as in the snippet below.)
```python
dict_lock = {'lock': 'open'}             # initial lock state ('open' = free)
url = ['^RabbitMQ$']                     # task message-queue routing
rabbitmq_servers = ['server1', 'server2']

def open_lock(view):
    """Acquire the data-operation permission and lock it."""
    dict_lock['lock'] = 'off'
    return Response(0)                   # Response: framework response class

def close_lock(view):
    """Data used up; release the resource."""
    dict_lock['lock'] = 'open'
    return Response(1)

# The queue is first-in, first-out:
# rabbitmq.last.open_lock()    # lock before operating on the last queued element
# rabbitmq.last.delete()       # delete the element
# rabbitmq.close_lock()        # release the resource
```
2.3 Cache Avalanche
- Cache avalanche refers to a large amount of cached data reaching its expiration time at once while query volume is high, overloading or even bringing down the database. Unlike cache breakdown, where concurrent queries hit the same expired key, in a cache avalanche many different keys expire together, so a large share of queries miss the cache and fall through to the database.
- Diagram (same as cache breakdown)
- Solutions
- 1. Set a random expiration time to prevent large numbers of keys from expiring at the same moment
- 2. Set hotspot data never to expire
- 3. Use a distributed cache (multiple servers, each running one or more Redis instances) to spread hotspot data evenly across different cache databases
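Solution 1 amounts to adding jitter to each key's TTL. A minimal sketch, assuming an hour-long base TTL; `BASE_TTL`, `JITTER`, and `ttl_with_jitter` are hypothetical names, as is the Redis client call in the comment.

```python
import random

BASE_TTL = 3600          # one-hour base expiration (assumed)
JITTER = 600             # up to ten extra minutes, chosen per key

def ttl_with_jitter():
    """Randomize each key's TTL so that keys written together
    do not all expire in the same instant."""
    return BASE_TTL + random.randint(0, JITTER)

# e.g. redis_client.set(key, value, ex=ttl_with_jitter())  # hypothetical client
```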
Contributed by Mtooooo
juejin.im/post/5e9cfb…