Recently an intern, Xiao Zhang, noticed Caffeine, the cache framework I use in our company's project. He came to me every day to learn, saying he wanted to master Caffeine thoroughly.

Later the director got wind of this and, since more new people will be joining, asked me to organize the relevant questions and details into a tutorial so it can be shared. As far as I know this tutorial is the first of its kind on the web, combining how we apply Caffeine in our game's business scenarios with the pitfalls we have stepped on.

This is the third installment of the Caffeine tutorial, and it focuses on combining Caffeine with a level-2 cache.

Intern Zhang: The leads say that simply setting a maximum cache size on Caffeine carries a hidden risk: when the number of players online at the same time exceeds the maximum size, cached entries get evicted, which then causes frequent database reads to reload the data. They asked me to build a level-2 cache on top of Caffeine to solve this problem.

Yes, Caffeine currently provides a whole set of mechanisms that let us integrate with a level-2 cache.

Before getting into the specifics, let's introduce the concept of a CacheWriter, which can be thought of as a callback object that Caffeine invokes whenever the cache puts or removes data.

import com.github.benmanes.caffeine.cache.CacheLoader;
import com.github.benmanes.caffeine.cache.CacheWriter;
import com.github.benmanes.caffeine.cache.Caffeine;
import com.github.benmanes.caffeine.cache.LoadingCache;
import com.github.benmanes.caffeine.cache.RemovalCause;
import org.checkerframework.checker.nullness.qual.NonNull;
import org.checkerframework.checker.nullness.qual.Nullable;
import org.junit.Test;

import java.lang.ref.WeakReference;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

/**
 * @author xifanxiaxue
 * @date 2020/12/5 10:18
 * @desc
 */
public class CaffeineWriterTest {

    /**
     * Acts as a level-2 cache whose entries only live until the next GC
     */
    private Map<Integer, WeakReference<Integer>> secondCacheMap =
            new ConcurrentHashMap<>();

    @Test
    public void test() throws InterruptedException {
        // Set the maximum cache size to 1
        LoadingCache<Integer, Integer> cache = Caffeine.newBuilder()
                .maximumSize(1)
                // Set the put and remove callbacks
                .writer(new CacheWriter<Integer, Integer>() {
                    @Override
                    public void write(@NonNull Integer key, @NonNull Integer value) {
                        secondCacheMap.put(key, new WeakReference<>(value));
                        System.out.println("Trigger CacheWriter.write, key = " + key + ", put into level-2 cache");
                    }

                    @Override
                    public void delete(@NonNull Integer key, @Nullable Integer value, @NonNull RemovalCause cause) {
                        switch (cause) {
                            case EXPLICIT:
                                secondCacheMap.remove(key);
                                System.out.println("Trigger CacheWriter.delete, cause: explicit removal, key = " + key +
                                        ", removed from level-2 cache");
                                break;
                            case SIZE:
                                System.out.println("Trigger CacheWriter.delete, cause: cache size exceeded the limit, key = " + key);
                                break;
                            default:
                                break;
                        }
                    }
                })
                .build(new CacheLoader<Integer, Integer>() {
                    @Nullable
                    @Override
                    public Integer load(@NonNull Integer key) {
                        WeakReference<Integer> value = secondCacheMap.get(key);
                        if (value == null) {
                            return null;
                        }

                        System.out.println("Trigger CacheLoader.load, read key = " + key + " from level-2 cache");
                        return value.get();
                    }
                });

        cache.put(1, 1);
        cache.put(2, 2);
        // Since cache eviction is asynchronous, sleep 1 second to let it complete
        Thread.sleep(1000);

        // Exceeding the maximum size triggers eviction
        System.out.println("Get data from Caffeine, key = 1, value = " + cache.get(1));
    }
}


This example is a bit more involved because it combines Caffeine with a level-2 cache; if it were not, there would be no way to show off what Caffeine can do. First, look at the secondCacheMap field, which I use to act as the level-2 cache. Because its values are held through WeakReference, they only live until the next GC.
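If you are not familiar with WeakReference, here is a minimal standalone sketch (not part of the example above) of why a WeakReference-backed map behaves like a cache whose entries only survive until the next GC; note that System.gc() is only a hint to the JVM, so the clearing is not guaranteed on every run.

import java.lang.ref.WeakReference;

public class WeakReferenceDemo {

    public static void main(String[] args) {
        Object value = new Object();
        WeakReference<Object> ref = new WeakReference<>(value);

        // Still reachable through the strong reference 'value'
        System.out.println("before GC: " + ref.get());

        value = null;   // drop the only strong reference
        System.gc();    // only a hint; the JVM may or may not run GC right away

        // If a GC has run, the weak reference is typically cleared and prints null
        System.out.println("after GC: " + ref.get());
    }
}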

Porridge: Xiao Zhang, this example can solve your question of how to combine a level-2 cache. Can you tell me whether the value in the final print is null or non-null?

Zhang: It must be null, because the entry with key 1 was evicted when the number of cached entries exceeded the limit.

People who aren’t familiar with Caffeine’s mechanics can easily make mistakes like Zhang’s and misjudge the results.

To clarify the program's logic, here is the printed output:

Trigger CacheWriter.write, key = 1, put into level-2 cache
Trigger CacheWriter.write, key = 2, put into level-2 cache
Trigger CacheWriter.delete, cause: cache size exceeded the limit, key = 1
Trigger CacheLoader.load, read key = 1 from level-2 cache
Get data from Caffeine, key = 1, value = 1
Trigger CacheWriter.delete, cause: cache size exceeded the limit, key = 2

In this code, you can see that in CacheWriter.delete I check the RemovalCause, i.e. the reason the entry was removed. If the removal was caused by the cache exceeding its maximum size, the entry is not removed from the level-2 cache, so CacheLoader.load can still read it from there. That is why, when we finally load the data for key 1 from Caffeine, we do not get null: the data is fetched from the level-2 cache instead.

Intern Zhang: Then what about the last line of output, where CacheWriter.delete is triggered because the cache size exceeded the limit and key = 2?

That is because Caffeine's call to CacheLoader.load returns non-null data, which is put back into the cache. The cache size then exceeds the maximum again, so the entry with key 2 is evicted.

Intern Zhang: Porridge, I'd like to see what the cache hit rate looks like. Is there a way to do that?

There is. If you look at the source code, you can see that Caffeine records quite a lot of statistics internally, but we need to enable stats recording when building the cache.

import com.github.benmanes.caffeine.cache.CacheLoader;
import com.github.benmanes.caffeine.cache.Caffeine;
import com.github.benmanes.caffeine.cache.LoadingCache;
import org.checkerframework.checker.nullness.qual.NonNull;
import org.checkerframework.checker.nullness.qual.Nullable;
import org.junit.Test;

/**
 * @author xifanxiaxue
 * @date 2020/12/1
 * @desc
 */
public class CaffeineRecordTest {

    /**
     * Simulate reading data from a database
     *
     * @param key
     * @return
     */
    private int getInDB(int key) {
        return key;
    }

    @Test
    public void test() {
        LoadingCache<Integer, Integer> cache = Caffeine.newBuilder()
                // Enable stats recording
                .recordStats()
                .build(new CacheLoader<Integer, Integer>() {
                    @Override
                    public @Nullable Integer load(@NonNull Integer key) {
                        return getInDB(key);
                    }
                });
        cache.get(1);

        // Hit rate
        System.out.println(cache.stats().hitRate());
        // Eviction count
        System.out.println(cache.stats().evictionCount());
        // Average time spent loading new values (nanoseconds)
        System.out.println(cache.stats().averageLoadPenalty());
    }
}


Practical use: when Caffeine was first introduced into our game, I used this recordStats mechanism during testing only; it is not recommended for production environments. To judge whether Caffeine was worth introducing, I used a thread to periodically print the hit rate, the eviction count, and the average time taken to load new values.
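For reference, a periodic stats printer along those lines might look like the sketch below; the one-minute interval, the class name, and the cache parameter are assumptions for illustration, not our project's actual code.

import com.github.benmanes.caffeine.cache.LoadingCache;
import com.github.benmanes.caffeine.cache.stats.CacheStats;

import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class CacheStatsPrinter {

    /**
     * Periodically print the cache statistics so we can judge whether the cache is effective.
     */
    public static void schedule(LoadingCache<Integer, Integer> cache) {
        ScheduledExecutorService executor = Executors.newSingleThreadScheduledExecutor();
        executor.scheduleAtFixedRate(() -> {
            CacheStats stats = cache.stats();
            System.out.println("hitRate = " + stats.hitRate()
                    + ", evictionCount = " + stats.evictionCount()
                    + ", averageLoadPenalty(ns) = " + stats.averageLoadPenalty());
        }, 1, 1, TimeUnit.MINUTES);
    }
}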

Intern Zhang: I'm already using Caffeine, but there is a problem: if someone forgets to save the data to the database and it is then evicted, player data is lost. Does Caffeine provide a way for developers to do something when an entry is evicted?

Porridge: It’s true, Caffeine offers an elimination monitor, so we just need to save in the monitor.

import com.github.benmanes.caffeine.cache.CacheLoader;
import com.github.benmanes.caffeine.cache.Caffeine;
import com.github.benmanes.caffeine.cache.LoadingCache;
import com.github.benmanes.caffeine.cache.Scheduler;
import org.checkerframework.checker.nullness.qual.NonNull;
import org.checkerframework.checker.nullness.qual.Nullable;
import org.junit.Test;

import java.util.concurrent.TimeUnit;

/**
 * @author xifanxiaxue
 * @date 2020/11/19
 * @desc removal notification
 */
public class CaffeineRemovalListenerTest {

    @Test
    public void test() throws InterruptedException {
        LoadingCache<Integer, Integer> cache = Caffeine.newBuilder()
                .expireAfterAccess(1, TimeUnit.SECONDS)
                .scheduler(Scheduler.systemScheduler())
                // Add the removal listener
                .removalListener(((key, value, cause) -> {
                    System.out.println("Removal notification, key: " + key + ", cause: " + cause);
                }))
                .build(new CacheLoader<Integer, Integer>() {
                    @Override
                    public @Nullable Integer load(@NonNull Integer key) throws Exception {
                        return key;
                    }
                });

        cache.put(1, 2);

        Thread.sleep(2000);
    }
}

As you can see, I registered the removal listener via removalListener, so you get the following print:

Removal notification, key: 1, cause: EXPIRED

Intern Zhang: I see that when data is removed, a cause is provided, and there are several of them. What does each cause correspond to?

There are currently several reasons why data can be removed:

  • EXPLICIT: the entry was removed manually, for example through an explicit invalidate call.

  • REPLACED: the entry was replaced; when a put overwrites an existing key, the old value is removed with this cause.

  • COLLECTED: this one is a bit ambiguous; it means the value was garbage collected, which usually happens when weak or soft references are used.

  • EXPIRED: the entry expired; no further explanation needed.

  • SIZE: the entry was evicted because the number of cached entries exceeded the limit.

These are the causes for data removal. If necessary, we can handle different business logic according to the different causes, as in the small sketch below.
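One convenient detail when branching on the cause (a fact of the RemovalCause enum, not something from the examples above): wasEvicted() returns true only for the automatic causes (COLLECTED, EXPIRED, SIZE), so it can be used to separate evictions from manual removals.

import com.github.benmanes.caffeine.cache.RemovalCause;

public class RemovalCauseDemo {

    public static void main(String[] args) {
        for (RemovalCause cause : RemovalCause.values()) {
            // wasEvicted() is true for COLLECTED, EXPIRED and SIZE,
            // and false for EXPLICIT and REPLACED
            System.out.println(cause + " wasEvicted = " + cause.wasEvicted());
        }
    }
}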

Practical application: in our current project, we save data to the database when an entry is removed. After all, some developers really did forget to manually save data after finishing their logic, so we had to add a last-resort save mechanism to avoid losing data.
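A minimal sketch of that last-resort save idea, assuming a hypothetical PlayerDao and Player type (the names, size limit, and expiry settings are illustrative only, not our project's actual code):

import com.github.benmanes.caffeine.cache.Caffeine;
import com.github.benmanes.caffeine.cache.LoadingCache;
import com.github.benmanes.caffeine.cache.RemovalCause;

import java.util.concurrent.TimeUnit;

public class PlayerCacheFactory {

    /** Hypothetical player entity, for illustration only. */
    public static class Player {
        long id;
    }

    /** Hypothetical DAO, for illustration only. */
    public interface PlayerDao {
        Player load(long id);
        void save(Player player);
    }

    public static LoadingCache<Long, Player> build(PlayerDao playerDao) {
        return Caffeine.newBuilder()
                .maximumSize(10_000)
                .expireAfterAccess(30, TimeUnit.MINUTES)
                .removalListener((Long id, Player player, RemovalCause cause) -> {
                    // Last-resort save: only persist on automatic evictions.
                    // EXPLICIT / REPLACED removals are assumed to have been saved
                    // by the calling business code already.
                    if (player != null && cause.wasEvicted()) {
                        playerDao.save(player);
                    }
                })
                .build(playerDao::load);
    }
}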

If you are interested in Caffeine or want to know more about it, please follow me.

About the author: Xifan talks about programming. To learn technology, new technology, and useful technology, please search WeChat: porridge snow.