1. Guava cache principle – GET operation

org.spark_project.guava.cache.LocalCache#get(K, org.spark_project.guava.cache.CacheLoader<? super K,V>)

V get(K key, CacheLoader<? super K, V> loader) throws ExecutionException {
    int hash = this.hash(Preconditions.checkNotNull(key));
    return this.segmentFor(hash).get(key, hash, loader); // @1
}

Code @1: locate the segment for this hash and read the value for the key from that segment.
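
For context, here is a minimal usage sketch of the kind of cache being dissected in this walkthrough. It assumes the standard com.google.common.cache API (the shaded org.spark_project.guava classes behave the same way), and fetchFromDb is a made-up stand-in for whatever backing lookup the loader performs:

import com.google.common.cache.CacheBuilder;
import com.google.common.cache.CacheLoader;
import com.google.common.cache.LoadingCache;

import java.util.concurrent.ExecutionException;
import java.util.concurrent.TimeUnit;

public class LoadingCacheUsage {
    public static void main(String[] args) throws ExecutionException {
        LoadingCache<String, String> cache = CacheBuilder.newBuilder()
                .maximumSize(1000)
                .expireAfterWrite(10, TimeUnit.MINUTES)
                .build(new CacheLoader<String, String>() {
                    @Override
                    public String load(String key) {
                        // Called on a miss; this is the user-defined load referred to in the notes below.
                        return fetchFromDb(key);
                    }
                });

        // Each get() runs the LocalCache#get flow shown above:
        // hash the key, pick a segment, then look up or load within that segment.
        String value = cache.get("user:42");
        System.out.println(value);
    }

    // Hypothetical backing lookup, for illustration only.
    static String fetchFromDb(String key) {
        return "value-for-" + key;
    }
}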

V get(K key, int hash, CacheLoader<? super K, V> loader) throws ExecutionException {
    Object var16;
    try {
        if (this.count != 0) { // @1
            LocalCache.ReferenceEntry<K, V> e = this.getEntry(key, hash);
            if (e != null) { // @2
                long now = this.map.ticker.read();
                V value = this.getLiveValue(e, now);
                if (value != null) {
                    this.recordRead(e, now);
                    this.statsCounter.recordHits(1);
                    Object var17 = this.scheduleRefresh(e, key, hash, value, now, loader);
                    return var17;
                }

                LocalCache.ValueReference<K, V> valueReference = e.getValueReference();
                if (valueReference.isLoading()) {
                    Object var9 = this.waitForLoadingValue(e, key, valueReference);
                    return var9;
                }
            }
        }

        var16 = this.lockedGetOrLoad(key, hash, loader); // @3
    } catch (ExecutionException var14) {
        Throwable cause = var14.getCause();
        if (cause instanceof Error) {
            throw new ExecutionError((Error)cause);
        }

        if (cause instanceof RuntimeException) {
            throw new UncheckedExecutionException(cause);
        }

        throw var14;
    } finally {
        this.postReadCleanup();
    }

    return var16;
}

Code @1: count is the number of live entries in this segment (incremented when an entry is added, decremented when one is removed); if it is 0 there is nothing to look up and the call falls through to the load path.

Code @2: e != null means an entry exists for this key. getLiveValue then checks whether its value is still usable; if the value is gone but e.getValueReference().isLoading() is true, another thread is already loading it, so this thread waits on that load via waitForLoadingValue.

Code @3: no usable value was found, so fall through to lockedGetOrLoad, which loads the data via the user-defined load method.
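
The catch block in the get method above also determines what callers see when the user-defined load method fails: runtime exceptions come back wrapped in UncheckedExecutionException, checked exceptions in ExecutionException. A small sketch of that behavior, assuming the unshaded com.google.common.cache package:

import com.google.common.cache.CacheBuilder;
import com.google.common.cache.CacheLoader;
import com.google.common.cache.LoadingCache;
import com.google.common.util.concurrent.UncheckedExecutionException;

import java.util.concurrent.ExecutionException;

public class LoaderExceptionDemo {
    public static void main(String[] args) {
        LoadingCache<String, String> cache = CacheBuilder.newBuilder()
                .build(new CacheLoader<String, String>() {
                    @Override
                    public String load(String key) {
                        // Simulate a failing backing load.
                        throw new IllegalStateException("backing store unavailable");
                    }
                });

        try {
            cache.get("k");
        } catch (UncheckedExecutionException e) {
            // RuntimeExceptions thrown by load() surface here, rewrapped as in the catch block above.
            System.out.println("unchecked cause: " + e.getCause());
        } catch (ExecutionException e) {
            // Checked exceptions thrown by load() would arrive here instead.
            System.out.println("checked cause: " + e.getCause());
        }
    }
}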

V getLiveValue(LocalCache.ReferenceEntry<K, V> entry, long now) {
    if (entry.getKey() == null) {
        // the key has been garbage collected: drain the reference queues and return null
        this.tryDrainReferenceQueues();
        return null;
    } else {
        V value = entry.getValueReference().get();
        if (value == null) {
            // the value has been garbage collected: drain the reference queues and return null
            this.tryDrainReferenceQueues();
            return null;
        } else if (this.map.isExpired(entry, now)) { // @1
            this.tryExpireEntries(now);
            return null;
        } else {
            return value;
        }
    }
}

Code @1: checks whether the entry has expired. Expiry is handled lazily: it is evaluated during reads, and an expired entry is removed when a GET finds it. There is no background polling thread, so with this lazy deletion strategy an entry that is never accessed again can sit in memory indefinitely, which behaves like a memory leak.
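
A minimal sketch of this lazy expiry, assuming the unshaded com.google.common.cache package; FakeTicker is a hand-rolled test ticker written for this example, not a library class:

import com.google.common.base.Ticker;
import com.google.common.cache.Cache;
import com.google.common.cache.CacheBuilder;

import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicLong;

public class LazyExpiryDemo {
    // A controllable ticker so expiry can be simulated without sleeping.
    static class FakeTicker extends Ticker {
        final AtomicLong nanos = new AtomicLong();
        @Override public long read() { return nanos.get(); }
        void advance(long duration, TimeUnit unit) { nanos.addAndGet(unit.toNanos(duration)); }
    }

    public static void main(String[] args) {
        FakeTicker ticker = new FakeTicker();
        Cache<String, String> cache = CacheBuilder.newBuilder()
                .expireAfterWrite(10, TimeUnit.SECONDS)
                .ticker(ticker)
                .build();

        cache.put("k", "v");
        ticker.advance(11, TimeUnit.SECONDS);            // entry is now logically expired

        // No background thread removes it; size() typically still reports 1 here.
        System.out.println("size before access: " + cache.size());

        // A read detects the expiry lazily and returns null.
        System.out.println("getIfPresent: " + cache.getIfPresent("k"));

        // cleanUp() (or further reads/writes) actually removes expired entries.
        cache.cleanUp();
        System.out.println("size after cleanUp: " + cache.size());
    }
}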

V lockedGetOrLoad(K key, int hash, CacheLoader<? super K, V> loader) throws ExecutionException {
    LocalCache.ValueReference<K, V> valueReference = null;
    LocalCache.LoadingValueReference<K, V> loadingValueReference = null;
    boolean createNewEntry = true;
    this.lock(); // @1

    LocalCache.ReferenceEntry e;
    try {
        long now = this.map.ticker.read();
        this.preWriteCleanup(now);
        int newCount = this.count - 1; // @2
        AtomicReferenceArray<LocalCache.ReferenceEntry<K, V>> table = this.table;
        int index = hash & table.length() - 1;
        LocalCache.ReferenceEntry<K, V> first = (LocalCache.ReferenceEntry)table.get(index);

        for(e = first; e != null; e = e.getNext()) {
            K entryKey = e.getKey();
            if (e.getHash() == hash && entryKey != null && this.map.keyEquivalence.equivalent(key, entryKey)) {
                valueReference = e.getValueReference();
                if (valueReference.isLoading()) {
                    createNewEntry = false;
                } else {
                    V value = valueReference.get();
                    if (value == null) {
                        this.enqueueNotification(entryKey, hash, valueReference, RemovalCause.COLLECTED);
                    } else {
                        if (!this.map.isExpired(e, now)) {
                            this.recordLockedRead(e, now);
                            this.statsCounter.recordHits(1);
                            Object var16 = value;
                            return var16;
                        }

                        this.enqueueNotification(entryKey, hash, valueReference, RemovalCause.EXPIRED);
                    }

                    this.writeQueue.remove(e);
                    this.accessQueue.remove(e);
                    this.count = newCount;
                }
                break;
            }
        }

        if (createNewEntry) { // @3
            loadingValueReference = new LocalCache.LoadingValueReference();
            if (e == null) {
                e = this.newEntry(key, hash, first);
                e.setValueReference(loadingValueReference);
                table.set(index, e);
            } else {
                e.setValueReference(loadingValueReference);
            }
        }
    } finally {
        this.unlock();
        this.postWriteCleanup();
    }

    if (createNewEntry) { // @4
        Object var19;
        try {
            synchronized(e) {
                var19 = this.loadSync(key, hash, loadingValueReference, loader);
            }
        } finally {
            this.statsCounter.recordMisses(1);
        }

        return var19;
    } else {
        return this.waitForLoadingValue(e, key, valueReference);
    }
}

Code @1: the Segment itself extends ReentrantLock; taking the lock here ensures that only one thread loads a given missing key while the others wait, which guards against cache penetration (demonstrated in the sketch after the @4 note below).

Code @2: newCount = count - 1 is what the segment's entry count will become if the existing entry has to be removed; it is applied further down when the found entry turns out to be collected or expired.

Code @3: create a new LoadingValueReference and set it on the entry (creating the entry first if it does not exist); while this reference is in the loading state, other threads see isLoading() == true and wait instead of loading again.

Code @4: synchronously call the load path (loadSync) to obtain the user-supplied value and store it into loadingValueReference.
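
Taken together, @1 to @4 mean that only one thread runs the loader for a missing key while the rest wait on the loading entry. A small sketch to observe this, assuming the unshaded com.google.common.cache package (thread count and sleep time are arbitrary):

import com.google.common.cache.CacheBuilder;
import com.google.common.cache.CacheLoader;
import com.google.common.cache.LoadingCache;

import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.atomic.AtomicInteger;

public class SingleLoadDemo {
    public static void main(String[] args) throws Exception {
        AtomicInteger loads = new AtomicInteger();

        LoadingCache<String, String> cache = CacheBuilder.newBuilder()
                .build(new CacheLoader<String, String>() {
                    @Override
                    public String load(String key) throws Exception {
                        loads.incrementAndGet();
                        Thread.sleep(200);               // simulate a slow backing load
                        return "value-for-" + key;
                    }
                });

        ExecutorService pool = Executors.newFixedThreadPool(8);
        CountDownLatch done = new CountDownLatch(8);
        for (int i = 0; i < 8; i++) {
            pool.submit(() -> {
                try {
                    cache.get("same-key");               // all threads ask for the same missing key
                } catch (ExecutionException ignored) {
                } finally {
                    done.countDown();
                }
            });
        }
        done.await();
        pool.shutdown();

        // Only one thread executed load(); the others waited on the loading entry.
        System.out.println("load invocations: " + loads.get());
    }
}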

V loadSync(K key, int hash, LocalCache.LoadingValueReference<K, V> loadingValueReference, CacheLoader<? super K, V> loader) throws ExecutionException {
    ListenableFuture<V> loadingFuture = loadingValueReference.loadFuture(key, loader); // @1
    return this.getAndRecordStats(key, hash, loadingValueReference, loadingFuture);
}

Code @1: loadFuture invokes the user-defined loader and returns a future; the loaded value is also set into loadingValueReference.

public ListenableFuture<V> loadFuture(K key, CacheLoader<? super K, V> loader) {
    this.stopwatch.start();
    Object previousValue = this.oldValue.get();
    try {
        if (previousValue == null) {
            V newValue = loader.load(key); // @1
            return (ListenableFuture)(this.set(newValue) ? this.futureValue : Futures.immediateFuture(newValue)); // @2
        } else {
            ListenableFuture<V> newValue = loader.reload(key, previousValue);
            return newValue != null ? newValue : Futures.immediateFuture((Object)null);
        }
    } catch (Throwable var5) {
        // exception handling omitted in this excerpt
    }
}

Code @1: calls the user-defined CacheLoader.load(key) to load the value.

Code @2: set(newValue) stores the loaded value into loadingValueReference and completes its future, so threads waiting on the loading reference can read it.
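
The previousValue != null branch above is the hook behind refreshAfterWrite: when a value already exists, loadFuture calls loader.reload instead of load. A sketch of overriding reload to refresh asynchronously (AsyncReloadDemo and fetchFromSource are made up for illustration; the pattern itself follows the Guava caches documentation):

import com.google.common.cache.CacheBuilder;
import com.google.common.cache.CacheLoader;
import com.google.common.cache.LoadingCache;
import com.google.common.util.concurrent.ListenableFuture;
import com.google.common.util.concurrent.ListenableFutureTask;

import java.util.concurrent.Executor;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class AsyncReloadDemo {
    public static void main(String[] args) throws Exception {
        Executor refreshPool = Executors.newFixedThreadPool(2);

        LoadingCache<String, String> cache = CacheBuilder.newBuilder()
                .refreshAfterWrite(5, TimeUnit.MINUTES)
                .build(new CacheLoader<String, String>() {
                    @Override
                    public String load(String key) {
                        return fetchFromSource(key);     // first load, synchronous
                    }

                    @Override
                    public ListenableFuture<String> reload(String key, String oldValue) {
                        // reload() is what loadFuture() calls when a previous value exists;
                        // running it on an executor keeps readers from blocking on refresh.
                        ListenableFutureTask<String> task =
                                ListenableFutureTask.create(() -> fetchFromSource(key));
                        refreshPool.execute(task);
                        return task;
                    }
                });

        System.out.println(cache.get("k"));
    }

    // Hypothetical backing lookup used only for illustration.
    static String fetchFromSource(String key) {
        return "value-for-" + key;
    }
}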