1. Cache penetration

Cache penetration occurs when a request queries data that exists in neither the cache nor the database. Because most caching strategies load data passively and, for fault-tolerance reasons, do not write anything to the cache when the storage layer returns nothing, every request for such non-existent data goes straight to the storage layer, defeating the purpose of the cache. If users keep issuing such requests, heavy traffic can put great pressure on the DB; deliberately attacking the application with non-existent keys is also a serious problem.

Solution:

  1. For data that exists in neither the cache nor the database, cache a default value for the key, such as "NULL", with a cache expiration time. Until that entry expires, all access through this key is absorbed by the cache. If data for this key later appears in the DB, the next access after the entry expires will load and cache the new value.
  2. Add validation at the interface layer, such as user authentication or basic sanity checks based on the data's id, e.g. directly rejecting requests with id<=0.
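Solution 1 can be sketched with a toy in-memory cache standing in for Redis; the dictionary `DB`, the TTL values, and all function names here are hypothetical, chosen only to show the "cache the absence" idea:

```python
import time

# Hypothetical in-memory stand-in for a real cache such as Redis.
CACHE = {}          # key -> (value, expires_at)
NULL_TTL = 60       # short TTL for "not found" placeholders (assumed value)
DB = {"user:1": {"name": "Alice"}}  # toy database

def cache_get(key):
    """Return (value, hit) honoring per-entry expiration."""
    entry = CACHE.get(key)
    if entry is None:
        return None, False
    value, expires_at = entry
    if time.time() >= expires_at:
        del CACHE[key]          # entry expired: treat as a miss
        return None, False
    return value, True

def get_user(key):
    value, hit = cache_get(key)
    if hit:
        return value            # may be the "NULL" placeholder
    value = DB.get(key)         # miss: fall through to the database
    if value is None:
        # Cache the absence itself so repeated lookups for a
        # non-existent key stop hitting the database.
        CACHE[key] = ("NULL", time.time() + NULL_TTL)
        return "NULL"
    CACHE[key] = (value, time.time() + 300)
    return value
```

The short `NULL_TTL` matters: it bounds how long a key stays "blocked" if real data is later written to the DB under that key.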

2. Cache breakdown

Cache breakdown occurs when a hot key (such as a flash-sale item) in the cache expires at a certain point in time just as a surge of concurrent requests arrives for that key. Each request finds the cache expired and goes to the back-end DB to load the data and write it back to the cache. During the window in which the data has not yet been reloaded from the DB, the concurrent requests all break through to the DB, putting enormous pressure on it.

Cache breakdown, also known as the hot key problem, is the most classic of the three problems.

Solution:

  1. For hot keys, add a mutex so that only one request rebuilds the cache while the others wait.
  2. Set hot data to never expire.
  3. Protect resources and degrade the service under overload.
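The mutex approach in solution 1 can be sketched as follows; this is a single-process illustration using a `threading.Lock` (a distributed deployment would need a distributed lock instead), and `slow_db_load` is a made-up stand-in for the real database query:

```python
import threading
import time

CACHE = {}                      # hypothetical in-memory cache: key -> value
_load_lock = threading.Lock()   # mutex guarding the cache rebuild
db_calls = 0                    # counts how often the "database" is hit

def slow_db_load(key):
    global db_calls
    db_calls += 1
    time.sleep(0.05)            # simulate a slow database query
    return f"value-for-{key}"

def get_with_mutex(key):
    value = CACHE.get(key)
    if value is not None:
        return value            # fast path: cache hit
    with _load_lock:            # only one thread rebuilds the entry
        value = CACHE.get(key)  # double-check: another thread may have loaded it
        if value is None:
            value = slow_db_load(key)
            CACHE[key] = value
    return value

# Ten concurrent requests for the same hot key after it has expired.
threads = [threading.Thread(target=get_with_mutex, args=("hot-item",))
           for _ in range(10)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

The double-check inside the lock is what prevents the waiting threads from each issuing their own DB query once the first loader finishes.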

3. Cache avalanche

Cache avalanche occurs when a large number of cache entries, all set with the same expiration time, expire and become invalid at once. If traffic surges at that moment, the cache is almost entirely ineffective, every request falls through to the DB, and the DB comes under excessive pressure or even goes down. Unlike cache breakdown, where many requests hit the same expired key, a cache avalanche involves many different keys expiring together, so a large volume of data must be fetched from the database.

Solution:

  1. Set hot data to never expire.
  2. Randomize expiration times to prevent a large number of entries from expiring at the same moment.
  3. If the Redis cache is distributed, spread hot data evenly across the different cache nodes.
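Solution 2, randomized expiration, can be sketched in a few lines; the base TTL and jitter window below are assumed values, and in a real Redis deployment the computed TTL would be passed as the `EX` argument of `SET`:

```python
import random

BASE_TTL = 300          # base expiry in seconds (assumed value)
JITTER = 60             # random spread added per key (assumed value)

def ttl_with_jitter():
    # Spread expirations over [BASE_TTL, BASE_TTL + JITTER] so keys
    # written in the same batch do not all expire in the same instant.
    return BASE_TTL + random.randint(0, JITTER)

# A batch of 1000 keys written together now expires over a 60-second window
# instead of at a single point in time.
ttls = [ttl_with_jitter() for _ in range(1000)]
```

Spreading a batch's expirations over even a modest window turns one spike of DB reloads into a trickle.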