preface

It goes like this:

I was trying to work out some business logic, stroking what's left of my hair, momentarily lost in thought……

Suddenly the chat icon flashed; I opened it and found a message from a colleague:

“MemoryCache/Memcached is two to five times more efficient than Redis.”

I read it as Memcached and thought: how could that be? Even if Redis were being used badly, it couldn't be that bad;

Because my colleague's code hadn't been committed yet, the chat discussion dragged on without a conclusion, so I went over for a face-to-face talk (to fight? No, no, no, I'm a civilized person);

A few minutes into the conversation, another colleague suddenly said: he's using Microsoft's MemoryCache.

The two names do sound alike, but the moment I heard "Microsoft" I felt I'd made a fool of myself.

I quickly had my colleague open the code and, sure enough, I really had made a fool of myself, but the dozens of minutes of discussion still turned out to be worthwhile.

Why was it still “worthwhile”?

  • I seriously doubted that Memcached (as I misheard it) could be 2 to 5 times faster than Redis; experience told me that isn't possible unless Redis is being used incorrectly;
  • If my colleague had really swapped Redis for Memcached (again, as I misheard it), that wouldn't fly: during the early technology selection, Memcached was ruled out because it doesn't fit the project's application scenario;

What’s the difference between MemoryCache and Memcached?

  • MemoryCache is not a distributed cache; it stores data in the memory of the running process. It is Microsoft's in-memory cache library, uses the CPU sensibly and performs well; because memory is allocated inside the process, network communication overhead is avoided.
  • Memcached is a distributed cache: data is stored on a shared machine and can be used by different programs.

Although Memcached is distributed, it is also memory-based; the two differ in how they organize memory for data storage, but I'm not going to dig into the source code here.

Bonus: why did Redis feel slow to my colleague?

The real scenario is a client that starts multiple threads to read data from Redis frequently. Under heavy access, some Redis reads were taking more than 20 milliseconds. For a Web project that is actually fine, since a 20-millisecond response is not perceptible to the user; but for a high-performance service program the communication requirements are higher, so here is a quick analysis of the causes of the delay, which comes down to the following points:

  • The number of clients grew, so the data stored under a commonly used Key became large (not actually huge, just relatively large, which slightly affects read speed, at the millisecond level).

    Solution: the colleague added MemoryCache as an additional cache layer, storing the frequently used Key directly in process memory to improve read performance (see the sketch after this list).

  • Frequent use of commands like KEYS * caused some commands to take around 20 milliseconds to execute (visible in the slow log);

    Solution: use the SCAN command to fetch keys instead.

  • The time taken by Redis's own persistence;

    Solution: adjust the Redis persistence strategy so that persistence runs less frequently;
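
As a rough sketch of that first fix (MemoryCache layered in front of Redis), assuming StackExchange.Redis as the Redis client and placeholder key/connection values, the read path might look roughly like this:

```csharp
using System;
using Microsoft.Extensions.Caching.Memory;
using StackExchange.Redis;

public class TwoLevelCacheSketch
{
    // In-process first-level cache and a Redis connection (connection string is a placeholder).
    private readonly IMemoryCache _local = new MemoryCache(new MemoryCacheOptions());
    private readonly IDatabase _redis = ConnectionMultiplexer.Connect("localhost:6379").GetDatabase();

    public string GetValue(string key)
    {
        // 1. Check the local MemoryCache first: no network round trip on a hit.
        if (_local.TryGetValue(key, out string cached))
            return cached;

        // 2. Fall back to Redis, then keep the hot key locally for a short while.
        string value = _redis.StringGet(key);
        if (value != null)
            _local.Set(key, value, TimeSpan.FromSeconds(30));
        return value;
    }
}
```

For the KEYS * point, note that StackExchange.Redis's IServer.Keys enumerates with SCAN on modern Redis servers rather than issuing a blocking KEYS.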

main text

Now that we are on the topic of MemoryCache, let's briefly look at how it can be used in real projects.

It mainly depends on the package Microsoft.Extensions.Caching.Memory;

MemoryCache makes it simple to set and retrieve values; on to the Demo.
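
As a minimal sketch (key names and values here are just placeholders), setting and reading a value looks roughly like this:

```csharp
using System;
using Microsoft.Extensions.Caching.Memory;

class Program
{
    static void Main()
    {
        // In real projects the cache usually comes from DI as IMemoryCache; here it is created directly.
        IMemoryCache cache = new MemoryCache(new MemoryCacheOptions());

        // Set a value with an absolute expiration of 5 minutes.
        cache.Set("user:1:name", "ZhangSan", TimeSpan.FromMinutes(5));

        // TryGetValue reads without throwing on a miss.
        if (cache.TryGetValue("user:1:name", out string name))
            Console.WriteLine($"Cache hit: {name}");

        // GetOrCreate loads and caches the value in one call when the key is missing.
        string email = cache.GetOrCreate("user:1:email", entry =>
        {
            entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5);
            return "zhangsan@example.com";   // stand-in for the real lookup
        });
        Console.WriteLine(email);
    }
}
```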

1. Console Demo

In fact, many programs run as background services rather than Web applications, so I wrote a console Demo for easy reference;

1.1 Introduce the relevant packages. Since the project uses Autofac for dependency injection and its aspect-oriented (interception) programming, the related dependency packages need to be added alongside Microsoft.Extensions.Caching.Memory.

1.2 Write sample code and register related services
  • IUserService is a simple interface;

    The Intercept and MyCache attributes on the interface and its methods are not mandatory; they are explained below;

  • UserService, the implementation of the interface;

  • MyCacheAttribute, a custom attribute used purely as a marker, with no logic inside;

  • MyInterceptor, a custom interceptor where the aspect-oriented logic is handled;

    Use Autofac to register the services; the registration is shown in the sketch under section 1.3 below.

    Note: Autofac is not required; it is used here only for its aspect-oriented (interception) features, so adopt it according to your own needs.
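
Since the original screenshots are not reproduced here, the following is a rough sketch of what those pieces might look like, assuming the Autofac and Autofac.Extras.DynamicProxy packages and a hypothetical GetUserName method (only the type names come from the article):

```csharp
using System;
using Autofac.Extras.DynamicProxy;
using Castle.DynamicProxy;
using Microsoft.Extensions.Caching.Memory;

// MyCacheAttribute: a pure marker with no logic, flagging methods whose results should be cached.
[AttributeUsage(AttributeTargets.Method)]
public class MyCacheAttribute : Attribute { }

// The Intercept attribute tells Autofac which interceptor to apply to this interface.
[Intercept(typeof(MyInterceptor))]
public interface IUserService
{
    [MyCache]
    string GetUserName(int userId);
}

// Plain business implementation; it knows nothing about caching.
public class UserService : IUserService
{
    public string GetUserName(int userId)
    {
        Console.WriteLine("Running the real business logic...");
        return $"user-{userId}";
    }
}

// MyInterceptor: the aspect-oriented cache logic lives here (used in section 1.3 below).
public class MyInterceptor : IInterceptor
{
    private readonly IMemoryCache _cache;
    public MyInterceptor(IMemoryCache cache) => _cache = cache;

    public void Intercept(IInvocation invocation)
    {
        // Only interface methods marked with [MyCache] are cached; everything else runs untouched.
        bool cacheable = invocation.Method
            .GetCustomAttributes(typeof(MyCacheAttribute), true).Length > 0;
        if (!cacheable) { invocation.Proceed(); return; }

        string key = $"{invocation.Method.Name}_{string.Join("_", invocation.Arguments)}";
        if (_cache.TryGetValue(key, out object cached))
        {
            invocation.ReturnValue = cached;   // cache hit: skip the real method
            return;
        }

        invocation.Proceed();                  // cache miss: run the real method
        _cache.Set(key, invocation.ReturnValue, TimeSpan.FromMinutes(5));
    }
}
```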

1.3 Two Methods for Cache processing

Typically, non-Web applications handle caching in one of two ways:

  • Embed the code in the business logic, reading or setting the cache inside the real business methods;

    The big disadvantage is that every piece of cached data needs cache handling added manually to the relevant business logic; the code becomes hard to maintain later, and the caching behavior is hard to control because every change requires modifying the code.

  • Aspect-oriented programming, which avoids embedding extra code in the business logic;

    With the aspect-oriented approach, the method is intercepted via a dynamic proxy and extra processing runs before and after it, as in the MyInterceptor sketch in section 1.2;

    The cache logic is handled directly in the interceptor;

    When registering the service, turn on Autofac's interception feature (see the registration sketch after this list);

    The second time around, the data comes straight from the cache; beautiful.

Note: the aspect-oriented approach is recommended, since it keeps caching pluggable and the code maintainable.
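
A rough sketch of the registration and the two calls, reusing the classes sketched in section 1.2; EnableInterfaceInterceptors comes from Autofac.Extras.DynamicProxy:

```csharp
using System;
using Autofac;
using Autofac.Extras.DynamicProxy;
using Microsoft.Extensions.Caching.Memory;

class Program
{
    static void Main()
    {
        var builder = new ContainerBuilder();

        // One shared MemoryCache instance for the whole process.
        builder.RegisterInstance(new MemoryCache(new MemoryCacheOptions()))
               .As<IMemoryCache>();

        // The interceptor must be resolvable from the container.
        builder.RegisterType<MyInterceptor>();

        // Turn on Autofac's interface interception for the business service.
        builder.RegisterType<UserService>().As<IUserService>()
               .EnableInterfaceInterceptors();

        using var container = builder.Build();
        var userService = container.Resolve<IUserService>();

        Console.WriteLine(userService.GetUserName(1)); // first call runs the real method
        Console.WriteLine(userService.GetUserName(1)); // second call is served from MemoryCache
    }
}
```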

2. WebApi Demo (project name: MemoryCacheWebApiDemo)

It is easy to use in WebApi; the MemoryCache dependency is already integrated into the framework and can be used simply by registering the service when needed. There are three common ways to handle caching in WebApi:

  • Middleware: perform the cache logic in custom middleware;
  • Filter: use the MVC filters to perform the cache logic;
  • Business-level aspect: the aspect-oriented approach, caching in the business layer by integrating Autofac, the same as in the console Demo;

Here is a Demo of the filter approach, let's go ~~~

2.1 Writing filters

Since a cached response should be served as early as possible, a ResourceFilter is used; I wrote a very detailed article about filters before (Learn .NET Core with me: MVC filters, an article you can walk through with your head held high), so there is no need to repeat it here.
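
A rough sketch of such a resource filter, with hypothetical names (MemoryCacheResourceFilter, a path-plus-query cache key, a 30-second lifetime): it answers from MemoryCache before the action runs and stores the fresh result afterwards.

```csharp
using System;
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.Mvc.Filters;
using Microsoft.Extensions.Caching.Memory;

public class MemoryCacheResourceFilter : IResourceFilter
{
    private readonly IMemoryCache _cache;
    public MemoryCacheResourceFilter(IMemoryCache cache) => _cache = cache;

    private static string BuildKey(FilterContext context) =>
        context.HttpContext.Request.Path.ToString() + context.HttpContext.Request.QueryString.ToString();

    public void OnResourceExecuting(ResourceExecutingContext context)
    {
        // Cache hit: short-circuit the pipeline, the action never executes.
        if (_cache.TryGetValue(BuildKey(context), out IActionResult cached))
        {
            context.Result = cached;
        }
    }

    public void OnResourceExecuted(ResourceExecutedContext context)
    {
        // Cache miss: store the freshly produced result for a short while.
        string key = BuildKey(context);
        if (context.Result != null && !_cache.TryGetValue(key, out _))
        {
            _cache.Set(key, context.Result, TimeSpan.FromSeconds(30));
        }
    }
}
```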

Then register the service and the custom filter in Startup.cs, as follows:
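
A rough sketch of the Startup.cs registration, assuming the filter above is applied globally (it could equally be applied to a single controller with [ServiceFilter] or [TypeFilter]):

```csharp
using Microsoft.Extensions.DependencyInjection;

public class Startup
{
    // Other Startup members (Configure, etc.) are omitted for brevity.
    public void ConfigureServices(IServiceCollection services)
    {
        // Makes IMemoryCache available to the filter's constructor.
        services.AddMemoryCache();

        services.AddControllers(options =>
        {
            // Register the custom resource filter globally; its dependencies come from DI.
            options.Filters.Add<MemoryCacheResourceFilter>();
        });
    }
}
```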

Then run it and send a few requests to see the effect (you can set a breakpoint in the filter to step through);

WebApi can also handle this at the business level by integrating Autofac; you can refer to this article (Learn ASP.NET Core with me: integrating the Autofac extension). A separate article on AOP aspect-oriented programming will be put together and shared later;

Source code address: github.com/zyq025/DotN…

conclusion

In some development projects you might be using a Dictionary for data caching. If so, try MemoryCache instead: it uses the CPU efficiently and is thread-safe. It can also serve as one level of a multi-level cache in high-concurrency scenarios, since MemoryCache supports expiration times and works well in combination with Redis.
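
For reference, a small sketch of the expiration options mentioned above (the 5-minute and 30-second values are arbitrary):

```csharp
using System;
using Microsoft.Extensions.Caching.Memory;

class ExpirationSketch
{
    static void Main()
    {
        IMemoryCache cache = new MemoryCache(new MemoryCacheOptions());

        var options = new MemoryCacheEntryOptions()
            .SetAbsoluteExpiration(TimeSpan.FromMinutes(5))   // gone 5 minutes after being set
            .SetSlidingExpiration(TimeSpan.FromSeconds(30));  // or 30 seconds after the last access

        cache.Set("hot-key", "hot-value", options);
        Console.WriteLine(cache.Get<string>("hot-key"));
    }
}
```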

A handsome guy made ugly by programming; follow “Code Variety Circle” and learn with me ~~~