Two years ago, I wrote a well-received blog post on front-end API request caching, covering how to cache data, cache promises, and delete entries on timeout (including how to build decorators). If you're not familiar with this, you can read that post first.

But the earlier code and solution were simplistic and intrusive to business code. That's not ideal, so I went back to study and think about Proxy.

Proxy can be understood as a layer of "interception" placed in front of the target object: all external access to the object must pass through this layer first. Proxy therefore provides a mechanism for filtering and rewriting external access. The word "proxy" is used because the object acts as a proxy for certain operations. For an introduction to Proxy and its usage, please read Ruan Yifeng's Introduction to Proxy.
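As a minimal sketch of that interception idea (the `user` object and the read counter below are purely illustrative): every property read on the proxied object goes through the `get` trap before reaching the target.

```typescript
// Purely illustrative: count every property read that passes through the trap
const user = { name: 'Alice' }

let reads = 0
const proxiedUser = new Proxy(user, {
  get(target, property, receiver) {
    reads++ // runs before the real property value is returned
    return Reflect.get(target, property, receiver)
  }
})

const n1 = proxiedUser.name // intercepted, then forwarded to the target
const n2 = proxiedUser.name // intercepted again
```

Every trap is optional; any operation without a trap falls through to the target object unchanged.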

Evolution of the project

No project is written perfectly in one go. Here's how this Proxy cache library evolved, step by step. I hope it can be of some help to you.

Adding a cache to the Proxy handler

The handler parameter of a Proxy is itself an object, and since it is an object, we can add our own data items to it. This lets us write a memoize function backed by a Map cache to improve the performance of recursive algorithms.

type TargetFun<V> = (...args: any[]) => V

function memoize<V>(fn: TargetFun<V>) {
  return new Proxy(fn, {
    // Here we attach extra properties to the handler; the alternative is
    // to add a middle layer that combines the Proxy and a cache object.
    // @ts-ignore
    cache: new Map<string, V>(),
    apply(target, thisArg, argsList) {
      // Get the current cache
      const currentCache = (this as any).cache

      // Generate the Map key directly from the arguments
      let cacheKey = argsList.toString();

      // Not cached yet: perform the call and cache the result
      if (!currentCache.has(cacheKey)) {
        currentCache.set(cacheKey, target.apply(thisArg, argsList));
      }

      // Return the cached data
      return currentCache.get(cacheKey);
    }
  });
}

We can try memoizing the Fibonacci function; the proxied version delivers a huge performance improvement:

const fibonacci = (n: number): number => (n <= 1 ? 1 : fibonacci(n - 1) + fibonacci(n - 2));
const memoizedFibonacci = memoize<number>(fibonacci);

for (let i = 0; i < 100; i++) fibonacci(30); // ~5000ms
for (let i = 0; i < 100; i++) memoizedFibonacci(30); // ~50ms

Customizing parameter serialization

We can still use the function described in the previous blog post to generate unique keys, but we no longer need the function name:

const generateKeyError = new Error("Can't generate key from function argument")

// Generate a unique key based on the function arguments
export default function generateKey(argument: any[]): string {
  try {
    return `${Array.from(argument).join(',')}`
  } catch (_) {
    throw generateKeyError
  }
}

Although the library itself can generate unique keys from function arguments, this certainly cannot cover every business case, so argument serialization needs to be user-definable.

// If a normalizer function is provided, use it; otherwise fall back to the default
const normalizer = options?.normalizer ?? generateKey

return new Proxy<any>(fn, {
  // @ts-ignore
  cache,
  apply(target, thisArg, argsList: any[]) {
    const cache: Map<string, any> = (this as any).cache

    // Generate a unique key using the normalizer
    const cacheKey: string = normalizer(argsList);

    if (!cache.has(cacheKey)) {
      cache.set(cacheKey, target.apply(thisArg, argsList));
    }

    return cache.get(cacheKey);
  }
});

Adding the Promise cache

In a previous blog post, I mentioned the drawback of caching only the data: multiple calls made at the same time, before the request has returned, each trigger their own request. So we also need to cache the Promise itself.

if (!currentCache.has(cacheKey)) {
  let result = target.apply(thisArg, argsList)

  // If the result is a promise, cache the promise itself.
  // If the returned value has a then method, treat it as a Promise
  if (result?.then) {
    result = Promise.resolve(result).catch(error => {
      // On error, delete the current promise, otherwise the error would recur
      // Due to asynchrony, this delete call always runs after the set below
      currentCache.delete(cacheKey)

      // Re-throw the error
      return Promise.reject(error)
    })
  }
  currentCache.set(cacheKey, result);
}
return currentCache.get(cacheKey);

At this point, we can cache not only the data but also the in-flight Promise of a data request.
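To make the effect concrete, here is a self-contained sketch of the idea (the `memoizePromise` name, the `fetchUser` function, and the request counter are illustrative, not part of the library): two calls issued while the first request is still pending share one cached promise, so the underlying request runs only once.

```typescript
type TargetFun<V> = (...args: any[]) => V

// Minimal promise-aware memoize, following the approach described above
function memoizePromise<V>(fn: TargetFun<V>): TargetFun<V> {
  const cache = new Map<string, any>()
  return new Proxy(fn, {
    apply(target, thisArg, argsList: any[]) {
      const cacheKey = argsList.toString()
      if (!cache.has(cacheKey)) {
        let result: any = target.apply(thisArg, argsList)
        // Cache the promise itself, not just the resolved data
        if (result?.then) {
          result = Promise.resolve(result).catch((error: unknown) => {
            // Drop failed promises so the next call can retry
            cache.delete(cacheKey)
            return Promise.reject(error)
          })
        }
        cache.set(cacheKey, result)
      }
      return cache.get(cacheKey)
    }
  })
}

// Illustrative "request": counts how many times it actually executes
let requests = 0
const fetchUser = (id: string) => {
  requests++
  return Promise.resolve({ id })
}

const memoizedFetchUser = memoizePromise(fetchUser)
const p1 = memoizedFetchUser('42')
const p2 = memoizedFetchUser('42') // issued while the first is still pending
```

Both `p1` and `p2` are the same cached promise object, which is exactly what prevents duplicate in-flight requests.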

Add the expiration deletion function

We can attach the current timestamp to the data when the cache entry is created.

// Cache entry
export default class ExpiredCacheItem<V> {
  data: V;
  cacheTime: number;

  constructor(data: V) {
    this.data = data
    // Record the system timestamp
    this.cacheTime = (new Date()).getTime()
  }
}

// In the Map-cache middle layer, determine whether an entry has expired
isOverTime(name: string) {
  const data = this.cacheMap.get(name)

  // No data (what's stored is an ExpiredCacheItem), so treat it as expired
  if (!data) return true

  // Get the current system timestamp
  const currentTime = (new Date()).getTime()

  // Compute the elapsed milliseconds between now and the stored time
  const overTime = currentTime - data.cacheTime

  // If the elapsed time exceeds the timeout, return true so data is fetched from the server again
  if (Math.abs(overTime) > this.timeout) {
    // This deletion isn't strictly necessary, but it saves a check the next time the method is called
    this.cacheMap.delete(name)
    return true
  }

  // Not expired
  return false
}

// Whether the cache has usable data
has(name: string) {
  // The entry counts as present only if it hasn't expired
  return !this.isOverTime(name)
}

At this point, we can do everything described in the previous blog post. However, it would be unsatisfying to stop there, so let's keep studying other libraries to optimize this one.

Adding Manual Management

Cache libraries generally offer manual management, so I also provide manual cache management for business use. Here we use the Proxy get trap to intercept property reads.

return new Proxy(fn, {
  // @ts-ignore
  cache,
  get: (target: TargetFun<V>, property: string) => {

    // If manual management is configured
    if (options?.manual) {
      const manualTarget = getManualActionObjFormCache<V>(cache)

      // If the accessed property is on the manual object, use it; otherwise fall through to the original object.
      // We don't check whether the function itself already has a property with the same name,
      // because manual management was explicitly enabled.
      if (property in manualTarget) {
        return manualTarget[property]
      }
    }

    // Manual management not configured: access the original object directly
    return target[property]
  },
})


export default function getManualActionObjFormCache<V>(
  cache: MemoizeCache<V>
): CacheMap<string | object, V> {
  const manualTarget = Object.create(null)

  // Expose cache operations such as set, get, delete, and clear through closures
  manualTarget.set = (key: string | object, val: V) => cache.set(key, val)
  manualTarget.get = (key: string | object) => cache.get(key)
  manualTarget.delete = (key: string | object) => cache.delete(key)
  manualTarget.clear = () => cache.clear!()

  return manualTarget
}

If your case is not this complicated, using Reflect to forward the access directly is recommended.
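As a sketch of that simpler case, the fall-through read can be delegated with `Reflect.get`, which forwards the access along with the correct receiver (the `fn` function here is illustrative):

```typescript
const fn = () => 42

const proxiedFn = new Proxy(fn, {
  get(target, property, receiver) {
    // Forward the property read to the original function object
    return Reflect.get(target, property, receiver)
  }
})

const result = proxiedFn()    // calls pass through untouched (no apply trap)
const fnName = proxiedFn.name // property reads go through Reflect.get
```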

Add WeakMap

When building the cache, we can also support WeakMap (note that WeakMap has no clear method and no size property). Here I extracted a BaseCache base class.

export default class BaseCache<V> {
  readonly weak: boolean;
  cacheMap: MemoizeCache<V>

  constructor(weak: boolean = false) {
    // Whether to use a WeakMap
    this.weak = weak
    this.cacheMap = this.getMapOrWeakMapByOption()
  }

  // Obtain a Map or WeakMap according to the configuration
  getMapOrWeakMapByOption<T>(): Map<string, T> | WeakMap<object, T> {
    return this.weak ? new WeakMap<object, T>() : new Map<string, T>()
  }
}

After that, I built the various cache classes on top of this base class.

Adding a cleanup function

When deleting a cache entry, we need to clean up its value, so we give users a dispose callback. The following class inherits from BaseCache and provides the dispose call.

export const defaultDispose: DisposeFun<any> = () => void 0

export default class BaseCacheWithDispose<V, WrapperV> extends BaseCache<WrapperV> {
  readonly weak: boolean
  readonly dispose: DisposeFun<V>

  constructor(weak: boolean = false, dispose: DisposeFun<V> = defaultDispose) {
    super(weak)
    this.weak = weak
    this.dispose = dispose
  }

  // Clean up a single value (called before delete)
  disposeValue(value: V | undefined): void {
    if (value) {
      this.dispose(value)
    }
  }

  // Clean up all values (called before clear, if the current Map has an iterator)
  disposeAllValue<V>(cacheMap: MemoizeCache<V>): void {
    for (let mapValue of (cacheMap as any)) {
      this.disposeValue(mapValue?.[1])
    }
  }
}

If the current cache is a WeakMap, there is no clear method and no iterator. I would like to add an intermediate layer to handle all of this (still under consideration, not implemented yet). For now, when clear is called on a WeakMap cache, I simply replace it with a new WeakMap.

clear() {
  if (this.weak) {
    this.cacheMap = this.getMapOrWeakMapByOption()
  } else {
    this.disposeAllValue(this.cacheMap)
    this.cacheMap.clear!()
  }
}

Adding reference counting

While studying the memoizee library, I saw the following usage:

memoized = memoize(fn, { refCounter: true });

memoized("foo", 3); // refs: 1
memoized("foo", 3); // Cache hit, refs: 2
memoized("foo", 3); // Cache hit, refs: 3
memoized.deleteRef("foo", 3); // refs: 2
memoized.deleteRef("foo", 3); // refs: 1
memoized.deleteRef("foo", 3); // refs: 0, clear foo's cache
memoized("foo", 3); // Re-executed, refs: 1

So I followed suit and added RefCache.

export default class RefCache<V> extends BaseCacheWithDispose<V, V> implements CacheMap<string | object, V> {
  // Reference counts
  cacheRef: MemoizeCache<number>

  constructor(weak: boolean = false, dispose: DisposeFun<V> = () => void 0) {
    super(weak, dispose)
    // Generate a WeakMap or Map according to the configuration
    this.cacheRef = this.getMapOrWeakMapByOption<number>()
  }


  // get, has, clear, etc. omitted

  delete(key: string | object): boolean {
    this.disposeValue(this.get(key))
    this.cacheRef.delete(key)
    this.cacheMap.delete(key)
    return true;
  }


  set(key: string | object, value: V): this {
    this.cacheMap.set(key, value)
    // Increment the reference count on set
    this.addRef(key)
    return this
  }

  // Counts can also be incremented manually
  addRef(key: string | object) {
    if (!this.cacheMap.has(key)) {
      return
    }
    const refCount: number | undefined = this.cacheRef.get(key)
    this.cacheRef.set(key, (refCount ?? 0) + 1)
  }

  getRefCount(key: string | object) {
    return this.cacheRef.get(key) ?? 0
  }

  deleteRef(key: string | object): boolean {
    if (!this.cacheMap.has(key)) {
      return false
    }

    const refCount: number = this.getRefCount(key)

    if (refCount <= 0) {
      return false
    }

    const currentRefCount = refCount - 1

    // If the remaining count is greater than 0, store it; otherwise remove the entry
    if (currentRefCount > 0) {
      this.cacheRef.set(key, currentRefCount)
    } else {
      this.cacheRef.delete(key)
      this.cacheMap.delete(key)
    }
    return true
  }
}

Modify the proxy main function:

if (!currentCache.has(cacheKey)) {
  let result = target.apply(thisArg, argsList)

  if (result?.then) {
    result = Promise.resolve(result).catch(error => {
      currentCache.delete(cacheKey)
      return Promise.reject(error)
    })
  }
  currentCache.set(cacheKey, result);

// refCounter is configured
} else if (options?.refCounter) {
  // Called again while already cached: just increment the count
  currentCache.addRef?.(cacheKey)
}

Add the LRU

LRU stands for Least Recently Used. An LRU cache keeps recently used entries and evicts the least recently used ones first, which generally yields a better hit rate than simpler eviction strategies.

Consider supporting a max item count as well as maxAge (here I use two Maps to implement the LRU; this increases memory consumption, but performs better).

Once the number of cached items reaches max, we simply make the current cacheMap the oldCacheMap and create a new cacheMap.

set(key: string | object, value: V) {
  const itemCache = new ExpiredCacheItem<V>(value)
  // If the key already exists, just overwrite it
  this.cacheMap.has(key) ? this.cacheMap.set(key, itemCache) : this._set(key, itemCache);
  return this
}

private _set(key: string | object, value: ExpiredCacheItem<V>) {
  this.cacheMap.set(key, value);
  this.size++;

  if (this.size >= this.max) {
    this.size = 0;
    this.oldCacheMap = this.cacheMap;
    this.cacheMap = this.getMapOrWeakMapByOption()
  }
}

On get: if cacheMap has an unexpired value, return it. Otherwise look in oldCacheMap; if a value is found there and has not expired, delete the old entry and move it into the new map (using _set).

get(key: string | object): V | undefined {
  // If cacheMap has the key, return its value
  if (this.cacheMap.has(key)) {
    const item = this.cacheMap.get(key);
    return this.getItemValue(key, item!);
  }

  // If oldCacheMap has the key
  if (this.oldCacheMap.has(key)) {
    const item = this.oldCacheMap.get(key);
    // Not expired
    if (!this.deleteIfExpired(key, item!)) {
      // Move it into the new map and delete the old entry
      this.moveToRecent(key, item!);
      return item!.data as V;
    }
  }
  return undefined
}


private moveToRecent(key: string | object, item: ExpiredCacheItem<V>) {
  // Delete the old entry
  this.oldCacheMap.delete(key);

  // Set it in the new map. Note: if the size reaches max, _set clears oldCacheMap, so the total never exceeds max
  this._set(key, item);
}

private getItemValue(key: string | object, item: ExpiredCacheItem<V>): V | undefined {
  // If maxAge is set, check expiry; otherwise return directly
  return this.maxAge ? this.getOrDeleteIfExpired(key, item) : item?.data;
}

private getOrDeleteIfExpired(key: string | object, item: ExpiredCacheItem<V>): V | undefined {
  const deleted = this.deleteIfExpired(key, item);
  return !deleted ? item.data : undefined;
}

private deleteIfExpired(key: string | object, item: ExpiredCacheItem<V>) {
  if (this.isOverTime(item)) {
    return this.delete(key);
  }
  return false;
}

Organizing the memoize function

At this point, we can step back from the implementation details and look at the interfaces and the main function built on top of these features.

// Program against interfaces, since other cache types may be added later
export interface BaseCacheMap<K, V> {
  delete(key: K): boolean;

  get(key: K): V | undefined;

  has(key: K): boolean;

  set(key: K, value: V): this;

  clear?(): void;

  addRef?(key: K): void;

  deleteRef?(key: K): boolean;
}

// Cache configuration
export interface MemoizeOptions<V> {
  /** Serialize the arguments */
  normalizer?: (args: any[]) => string;
  /** Whether to use a WeakMap */
  weak?: boolean;
  /** Maximum age in milliseconds; stale entries are deleted */
  maxAge?: number;
  /** Maximum number of items; entries beyond this are deleted */
  max?: number;
  /** Manage the cache manually */
  manual?: boolean;
  /** Whether to use reference counting */
  refCounter?: boolean;
  /** Callback invoked when a cache entry is deleted */
  dispose?: DisposeFun<V>;
}

// The returned function (with optional management methods)
export interface ResultFun<V> extends Function {
  delete?(key: string | object): boolean;

  get?(key: string | object): V | undefined;

  has?(key: string | object): boolean;

  set?(key: string | object, value: V): this;

  clear?(): void;

  deleteRef?(): void
}

The final memoize function is pretty much the same as the original one, and only does three things:

  • Check the arguments and throw errors if invalid
  • Pick the appropriate cache based on the options
  • Return the proxy
export default function memoize<V>(fn: TargetFun<V>, options?: MemoizeOptions<V>): ResultFun<V> {
  // Check the arguments and throw errors if invalid
  checkOptionsThenThrowError<V>(options)

  // Resolve the serialization function
  const normalizer = options?.normalizer ?? generateKey

  let cache: MemoizeCache<V> = getCacheByOptions<V>(options)

  // Return the proxy
  return new Proxy(fn, {
    // @ts-ignore
    cache,
    get: (target: TargetFun<V>, property: string) => {
      // Manual management
      if (options?.manual) {
        const manualTarget = getManualActionObjFormCache<V>(cache)
        if (property in manualTarget) {
          return manualTarget[property]
        }
      }
      return target[property]
    },
    apply(target, thisArg, argsList: any[]): V {

      const currentCache: MemoizeCache<V> = (this as any).cache

      const cacheKey: string | object = getKeyFromArguments(argsList, normalizer, options?.weak)

      if (!currentCache.has(cacheKey)) {
        let result = target.apply(thisArg, argsList)

        if (result?.then) {
          result = Promise.resolve(result).catch(error => {
            currentCache.delete(cacheKey)
            return Promise.reject(error)
          })
        }
        currentCache.set(cacheKey, result);
      } else if (options?.refCounter) {
        currentCache.addRef?.(cacheKey)
      }

      return currentCache.get(cacheKey) as V;
    }
  }) as any
}

The complete code is in memoizee-proxy. Feel free to try it out and play with it yourself.

Next steps

Testing

Test coverage isn't everything, but the Jest testing library helped me a great deal while implementing this library: it made me rethink the functionality and parameter validation that every class and function should have. Previously I only validated input at the project's main entry point, without thinking carefully about the parameters of each class and function. That isn't robust enough, because you can't control how users will use your library.

Going deeper into Proxy

In fact, the application scenarios for proxies are endless. Ruby has already proven this (see Ruby metaprogramming).

Developers can use it to create coding patterns such as (but far from limited to) tracking property access, hiding properties, preventing modification or deletion of properties, validating function parameters, validating constructor arguments, data binding, and observables.
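As one example, here is a sketch of function parameter validation with an `apply` trap (the `sum` function and the numbers-only rule are illustrative):

```typescript
const sum = (a: number, b: number) => a + b

const validatedSum = new Proxy(sum, {
  apply(target, thisArg, argsList: any[]) {
    // Reject non-numeric arguments before the real function runs
    if (argsList.some(arg => typeof arg !== 'number')) {
      throw new TypeError('sum only accepts numbers')
    }
    return Reflect.apply(target, thisArg, argsList)
  }
})

const ok = validatedSum(1, 2)

let rejected = false
try {
  ;(validatedSum as any)('1', 2)
} catch (e) {
  rejected = e instanceof TypeError
}
```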

Proxy was introduced in ES6, but the API requires fairly recent browsers, and proxy-polyfill offers only limited functionality. Still, it is already 2021, so I believe it is time to learn more about Proxy.

Going deeper into caching

Caching causes problems, no doubt about that. But it is so fast! So we need to understand the business better: which data needs to be cached, and which data can be cached.

The cache I've written so far only works at the level of a single method. Could future projects combine returned data at a finer granularity? Or should we think bigger and write a full caching layer?

Development process

While developing this project, I took small, quick steps and refactored constantly. At the very beginning, the code had only the expiration deletion feature.

But every time I finished a new feature, I went back over the library's logic and flow, trying to make the code elegant each time, because I can't think everything through on the first pass. I hope to keep making progress in future work, which will also reduce rework of the code.

Other notes

Creating functions

In fact, when I added manual management to the library, I considered copying the function directly, since functions are themselves objects, and attaching methods such as set to the copy. But there is no way to copy the scope chain.

I didn't succeed, but I learned something along the way; here are two ways of creating functions.

We normally create functions with new Function, but browsers don't provide a constructor that can create async functions directly, so we need to obtain it manually.

AsyncFunction = (async x => x).constructor

foo = new AsyncFunction('x, y, p', 'return x + y + await p')

foo(1, 2, Promise.resolve(3)).then(console.log) // 6

For global functions, we can also create a copy directly from fn.toString(), and async functions can be constructed this way as well.

function cloneFunction<T>(fn: (...args: any[]) => T): (...args: any[]) => T {
  return new Function('return ' + fn.toString())();
}

Encouragement

If you found this article helpful, I hope you can give me some encouragement by starring my GitHub blog.

Blog address

References

Front-end API request caching scheme

ECMAScript 6 Primer: Introduction to Proxy

memoizee

memoizee-proxy