Obtaining resources over the Internet is slow and costly. For this reason, the HTTP protocol includes cache-control mechanisms, so that HTTP clients can cache and reuse previously fetched resources to optimize performance and improve the user experience. The caching part of HTTP has changed somewhat as the protocol has evolved, but as a back-end programmer developing a Web service, you generally only need to focus on the request header If-None-Match, the response header ETag, and the response header Cache-Control. These three HTTP headers cover most needs, and virtually all modern browsers support them. All we need to do is make sure each server response carries the correct HTTP header instructions about when, and for how long, the browser may cache the response.
Where is the cache?
In the figure above, there are three roles: the browser, the Web proxy, and the server. As the figure shows, HTTP caches live in the browser and in Web proxies. There are, of course, various caches inside the server itself, but those are not the HTTP caches discussed in this article. HTTP cache control is a convention: the server steers the cache policies of browsers and Web proxies by setting the response header Cache-Control, and cache validity is verified through the request header If-None-Match and the response header ETag.
Response header ETag
An ETag (Entity Tag) identifies a version of a resource. In a concrete implementation, an ETag can be a hash of the resource's content or an internally maintained version number. Either way, the ETag should change whenever the content of the resource changes; this is the basis on which HTTP caching works correctly.
As the previous example shows, when the server returns a response, it usually includes some metadata about the response in the HTTP headers, and ETag is one of them. In this case, the value x1323DDx is returned. When the content of the resource /file changes, the server should return a different ETag.
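As a sketch of the hash-based option mentioned above, a strong ETag can be derived from the response body, so that any change in content produces a different tag. The class and method names below are invented for illustration, and the quoted, truncated-hex format is just one plausible choice:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.HexFormat;

public class EtagDemo {

    // Derive a strong ETag from the resource content: any change in the
    // body yields a different tag, which is exactly the property that
    // HTTP cache validation relies on.
    static String etagFor(byte[] body) {
        try {
            MessageDigest md = MessageDigest.getInstance("SHA-256");
            String hex = HexFormat.of().formatHex(md.digest(body));
            // ETag values are quoted strings; a hash prefix keeps them short.
            return "\"" + hex.substring(0, 16) + "\"";
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException(e);
        }
    }

    public static void main(String[] args) {
        String v1 = etagFor("contents of /file, version 1".getBytes(StandardCharsets.UTF_8));
        String v2 = etagFor("contents of /file, version 2".getBytes(StandardCharsets.UTF_8));
        System.out.println(v1.equals(v2)); // false: changed content, changed ETag
    }
}
```

A version-number scheme works just as well, as long as the value is guaranteed to change with the content.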
Request header If-None-Match
For the same resource, such as /file in the previous example, once the browser has requested it, it holds a version of the content of /file along with that version's ETag. The next time the user needs this resource, the browser requests it from the server again, this time sending the stored ETag value in the If-None-Match request header. If that value matches the resource's current ETag, the server does not return the contents of /file; instead it returns a 304 response, telling the browser that the resource has not changed and that its cache is still valid.
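Using the ETag value from the example above, the conditional request and its bodiless reply might look like this (hostname invented for illustration):

```http
GET /file HTTP/1.1
Host: www.example.com
If-None-Match: "x1323DDx"

HTTP/1.1 304 Not Modified
ETag: "x1323DDx"
```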
As the example above shows, with If-None-Match the server can achieve the same result with a much smaller response, optimizing performance.
Response header Cache-Control
Each resource can define its own cache policy through the HTTP header Cache-Control, which controls who may cache the response, under what conditions, and for how long. The fastest requests are the ones that never reach the server: with a local copy of the response, we avoid all network latency and the data costs of transfer. To this end, the HTTP specification lets the server return a series of Cache-Control directives that control how, and for how long, a response is cached by the browser or by intermediate caches.
The Cache-Control header was defined in the HTTP/1.1 specification and replaces earlier headers (such as Expires) that were used to define response caching policies. Cache-Control is supported by all current browsers, so using it alone is sufficient.
Here are some common directives that can be set in Cache-Control.
max-age
This directive specifies the maximum time (in seconds), counted from the moment of the request, for which the fetched response may be reused. For example, Cache-Control: max-age=60 indicates that the response can be cached and reused for the next 60 seconds. Note that for the duration specified by max-age, the browser will not send any request to the server, not even a request to verify that the cache is valid. That is, if the resource changes on the server during this time, the browser is not notified and keeps using the older version. Be careful, therefore, when choosing a cache duration.
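The freshness rule above can be sketched in a few lines. This is a simplified illustration, not a full header parser; the class and method names are invented, and only a well-formed max-age directive is handled:

```java
import java.util.Optional;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class MaxAgeDemo {

    private static final Pattern MAX_AGE = Pattern.compile("max-age=(\\d+)");

    // Extract the max-age value (in seconds) from a Cache-Control header, if present.
    static Optional<Long> maxAgeOf(String cacheControl) {
        Matcher m = MAX_AGE.matcher(cacheControl);
        return m.find() ? Optional.of(Long.parseLong(m.group(1))) : Optional.empty();
    }

    // A cached response is fresh while its age is below max-age; during
    // that window the browser asks the server nothing at all.
    static boolean isFresh(String cacheControl, long ageSeconds) {
        return maxAgeOf(cacheControl).map(max -> ageSeconds < max).orElse(false);
    }

    public static void main(String[] args) {
        System.out.println(isFresh("public, max-age=60", 30)); // true: still fresh, no request sent
        System.out.println(isFresh("public, max-age=60", 61)); // false: must refetch or revalidate
    }
}
```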
public and private
If public is set, the response may be cached by the browser or by any intermediate Web proxy. public is the default, that is, Cache-Control: max-age=60 is equivalent to Cache-Control: public, max-age=60.
If the server sets private, for example Cache-Control: private, max-age=60, only the user's browser may cache the response; no intermediate Web proxy is allowed to cache it. For example, the user's browser can cache an HTML page containing the user's private information, but a CDN cannot.
no-cache
If the server sets Cache-Control: no-cache, the browser must check with the server before using the cached resource, to see whether the response has changed. If the resource has not changed, the download is avoided. This verification of whether the previous response has been modified uses the If-None-Match and ETag headers described above.
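On the server side, that revalidation decision reduces to a string comparison. This sketch (names invented, strong-ETag comparison only) returns the status code the handler would send:

```java
public class RevalidationDemo {

    // If the client's If-None-Match value equals the resource's current
    // ETag, the content is unchanged and a bodiless 304 suffices;
    // otherwise a full 200 response with the content is required.
    static int statusFor(String ifNoneMatch, String currentEtag) {
        if (ifNoneMatch != null && ifNoneMatch.equals(currentEtag)) {
            return 304; // Not Modified: browser reuses its cached copy
        }
        return 200; // OK: full response body follows
    }

    public static void main(String[] args) {
        System.out.println(statusFor("\"x1323DDx\"", "\"x1323DDx\"")); // 304
        System.out.println(statusFor("\"x1323DDx\"", "\"y9999\""));    // 200: resource changed
        System.out.println(statusFor(null, "\"x1323DDx\""));           // 200: no cached copy yet
    }
}
```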
Note that the name no-cache is a bit misleading. It does not mean that the browser stops caching the data, but that the browser must confirm the data is still consistent with the server before using the cache. If no-cache is set and the ETag implementation does not reflect changes in the resource, the browser's cached data will never be updated.
no-store
If the response carries Cache-Control: no-store, neither the browser nor any intermediate Web proxy will store the data. The next time the resource is needed, the browser can only send a new request and read the resource from the server again.
How do you determine the Cache-Control policy for a resource?
Here’s a flow chart to help you.
Common mistakes
Boot-time cache
What do we do when the application starts very slowly, and it turns out that one of the services we depend on responds very slowly?
In general, this kind of problem indicates that the dependent service cannot meet our requirements. If it is a third-party service that we do not control, we might introduce a cache.
The problem with introducing a cache at this point is that a cache-invalidation policy is hard to implement well, because the cache exists precisely to call the dependent service as rarely as possible.
Caching too early
By "early" I mean early not in the application's life cycle but in the development cycle. Sometimes we see developers estimate where the system's bottlenecks will be early in development and introduce caches there.
In practice, this approach obscures the places where performance could actually be tuned: the return value of the service will be cached anyway, so why spend time optimizing that part of the code?
Integrated cache
The "S" in the SOLID principles stands for the Single Responsibility Principle. When an application integrates a cache module directly, the cache module and the service layer become strongly coupled, and the service cannot run independently without the cache.
Cache everything
Sometimes all external calls are blindly cached in order to reduce response latency. This makes it easy for developers and maintainers to lose sight of the cache module and end up making a false assessment of the reliability of the underlying dependencies.
Cascade cache
Caching everything, or even just most content, can result in cached data that itself embeds other cached data.
If such a cascade cache structure exists in your application, the effective cache expiration time becomes uncontrolled: the topmost cache has to wait for every lower level to expire and refresh before the data it returns is fully up to date.
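In the worst case the delay compounds roughly additively: a change at the origin only becomes fully visible after each layer's TTL has run out in turn. A toy calculation, with hypothetical TTL values:

```java
import java.util.List;

public class CascadeCacheDemo {

    // Worst-case staleness of a cascade of caches: a change at the origin
    // only becomes fully visible after every layer's entry has expired in
    // sequence, so the per-layer TTL bounds add up.
    static long worstCaseStalenessSeconds(List<Long> layerTtlsSeconds) {
        return layerTtlsSeconds.stream().mapToLong(Long::longValue).sum();
    }

    public static void main(String[] args) {
        // Hypothetical stack: CDN (300 s) -> app cache (120 s) -> in-process cache (30 s)
        System.out.println(worstCaseStalenessSeconds(List.of(300L, 120L, 30L))); // 450
    }
}
```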
Cannot refresh cache
Typically, caching middleware provides tools to flush the cache. Redis, for example, provides commands that let maintenance personnel delete selected data or even flush the entire cache.
Some ad-hoc caches, however, include no such tool. Caches that simply keep data in memory, for example, usually cannot be modified or purged by external tools. In that case, if the cached data goes wrong, maintenance personnel can only restart the service, which greatly increases O&M costs and response times. Even worse, some caches write their contents to the file system as a backup; besides restarting the service, you then also have to ensure that the cached backup on the file system is deleted before the application starts.
The impact of caching
The common mistakes described above are problems introduced by caching; they simply do not exist in a cache-free system.
Deploying a system that relies heavily on caching can mean spending a lot of time waiting for caches to expire. For example, after a release it may take several hours for the CDN configuration to be refreshed and the CDN cache to turn over.
In addition, reaching for caching first when a performance bottleneck appears hides the real performance problem, which then never gets addressed. In fact, tuning the code often takes no more time than introducing a caching component.
Finally, debugging costs can rise significantly in systems that include caching components. It regularly happens that, after half a day of tracing through code, the data turns out to come from a cache and to have nothing to do with the component it should logically depend on. The same problem occurs when every relevant test case passes while the modified code was never actually exercised, because the results came from a cache.
How to use caching well?
Drop caching!
Well, much of the time caching is unavoidable. In Internet-based systems it is hard to avoid caching entirely; even HTTP itself carries cache configuration in headers such as Cache-Control: max-age=xxx.
Understand the data
If you want to cache data, you first need to understand its update policy. If you know exactly when the data needs to be updated, you can use the If-Modified-Since request header to determine whether the data the client holds needs updating: simply return a 304 Not Modified response so the client reuses its locally cached data, or return the latest data otherwise. In addition, to make better use of HTTP caching, it is recommended to use ETag to mark the version of the cached data.
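The If-Modified-Since check described above can be sketched as follows. The class and method names are hypothetical, and only the happy path of date parsing is handled:

```java
import java.time.ZonedDateTime;
import java.time.format.DateTimeFormatter;

public class IfModifiedSinceDemo {

    // Decide between 304 and 200 from the If-Modified-Since request header
    // and the resource's actual last-modified time. If the resource has not
    // changed since the client's copy, a bodiless 304 lets it reuse its cache.
    static int statusFor(String ifModifiedSince, ZonedDateTime lastModified) {
        if (ifModifiedSince == null) {
            return 200; // client holds no cached copy; send the data
        }
        ZonedDateTime since =
                ZonedDateTime.parse(ifModifiedSince, DateTimeFormatter.RFC_1123_DATE_TIME);
        return lastModified.isAfter(since) ? 200 : 304;
    }

    public static void main(String[] args) {
        ZonedDateTime lastModified = ZonedDateTime.parse(
                "Wed, 21 Oct 2015 07:28:00 GMT", DateTimeFormatter.RFC_1123_DATE_TIME);
        System.out.println(statusFor("Wed, 21 Oct 2015 07:28:00 GMT", lastModified)); // 304: unchanged
        System.out.println(statusFor("Wed, 14 Oct 2015 07:28:00 GMT", lastModified)); // 200: changed since
    }
}
```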
Optimize performance instead of using caching
As mentioned earlier, caching tends to mask potential performance problems. Whenever possible, use profiling tools to find the real cause of slow application responses and fix it: remove redundant calls, optimize SQL based on its execution plan, and so on.
Here is the code to clear all the caches of your application:
/*
 * File name: DataCleanManager.java
 * Description: clears the internal/external cache, databases,
 * SharedPreferences, files, and custom directories of the application.
 */
package com.test.DataClean;

import java.io.File;

import android.content.Context;
import android.os.Environment;

/**
 * Data cleanup manager used by this application.
 */
public class DataCleanManager {
    /**
     * Clears the application's internal cache (/data/data/com.xxx.xxx/cache).
     *
     * @param context
     */
    public static void cleanInternalCache(Context context) {
        deleteFilesByDirectory(context.getCacheDir());
    }

    /**
     * Clears all of the application's databases (/data/data/com.xxx.xxx/databases).
     *
     * @param context
     */
    public static void cleanDatabases(Context context) {
        deleteFilesByDirectory(new File("/data/data/"
                + context.getPackageName() + "/databases"));
    }

    /**
     * Clears the application's SharedPreferences (/data/data/com.xxx.xxx/shared_prefs).
     *
     * @param context
     */
    public static void cleanSharedPreference(Context context) {
        deleteFilesByDirectory(new File("/data/data/"
                + context.getPackageName() + "/shared_prefs"));
    }

    /**
     * Deletes one of the application's databases by name.
     *
     * @param context
     * @param dbName
     */
    public static void cleanDatabaseByName(Context context, String dbName) {
        context.deleteDatabase(dbName);
    }

    /**
     * Clears the contents of /data/data/com.xxx.xxx/files.
     *
     * @param context
     */
    public static void cleanFiles(Context context) {
        deleteFilesByDirectory(context.getFilesDir());
    }

    /**
     * Clears the external cache (/mnt/sdcard/Android/data/com.xxx.xxx/cache).
     *
     * @param context
     */
    public static void cleanExternalCache(Context context) {
        if (Environment.getExternalStorageState().equals(
                Environment.MEDIA_MOUNTED)) {
            deleteFilesByDirectory(context.getExternalCacheDir());
        }
    }

    /**
     * Clears files under a user-defined path. Use with care: only the files
     * directly inside the directory are deleted.
     *
     * @param filePath
     */
    public static void cleanCustomCache(String filePath) {
        deleteFilesByDirectory(new File(filePath));
    }

    /**
     * Clears all data of the application.
     *
     * @param context
     * @param filepath
     */
    public static void cleanApplicationData(Context context, String... filepath) {
        cleanInternalCache(context);
        cleanExternalCache(context);
        cleanDatabases(context);
        cleanSharedPreference(context);
        cleanFiles(context);
        for (String filePath : filepath) {
            cleanCustomCache(filePath);
        }
    }

    /**
     * Deletes only the files inside a folder; if the path passed in is a
     * plain file, it is left untouched.
     *
     * @param directory
     */
    private static void deleteFilesByDirectory(File directory) {
        if (directory != null && directory.exists() && directory.isDirectory()) {
            File[] items = directory.listFiles();
            if (items == null) {
                return;
            }
            for (File item : items) {
                item.delete();
            }
        }
    }
}
Conclusion
Caching is a very useful tool, but it is easily abused. Do not reach for caching until the last moment, and prioritize other ways of optimizing application performance.