Preface
When building a website with the Koa framework, you typically return index.html as the entry file, which requires setting a specific response header.
Given Koa’s minimalist design, I suspect that anyone who hasn’t read the source code will, at first, write something like this:
const Koa = require("koa");
const fs = require("fs");

const app = new Koa();

app
  .use(ctx => {
    ctx.set("Content-Type", "text/html");
    ctx.body = fs.readFileSync("index.html");
  })
  .listen(3000);
This is certainly doable and can enhance your understanding of HTTP Headers, but people who are really familiar with Koa will disdain such a primitive approach.
Although Koa’s source code is short and compact, it still provides developers with many convenient features, such as ctx.type, which this article focuses on.
Getting to know ctx.type
Let’s start with the code: just set the context’s (ctx) type to "html":
app
  .use(ctx => {
    ctx.type = "html";
    ctx.body = fs.readFileSync("index.html");
  })
  .listen(3000);
[Screenshot: inspecting the response with HTTPie (https://httpie.io/)]
The implementation is actually very simple: based on the type you pass in, Koa automatically sets the corresponding Content-Type response header.
The source code lives in lib/response.js:
module.exports = {
  // ...
  set type(type) {
    type = getType(type);
    if (type) {
      this.set("Content-Type", type);
    } else {
      this.remove("Content-Type");
    }
  },
};
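To make the mapping concrete, here is a small sketch of my own (not taken from the Koa docs) showing how a few common ctx.type values end up as Content-Type headers; the exact charset handling depends on the mime-db data shipped with your dependencies:

const Koa = require("koa");
const app = new Koa();

app.use(ctx => {
  // Each assignment below would produce the Content-Type shown in the comment:
  ctx.type = "html"; // text/html; charset=utf-8
  ctx.type = "json"; // application/json; charset=utf-8
  ctx.type = "png";  // image/png
  // A full MIME string is kept as-is (my assumption, based on mime-types' contentType):
  ctx.type = "text/plain; charset=gbk";
  ctx.body = "hello";
});

app.listen(3000);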
So what is getType? It turns out this “mysterious power” comes from a third-party package:
const getType = require("cache-content-type");
The keyword “cache” jumped out at me, and as a professional front-end engineer I realized there must be some optimization involved.
To test my guess, I looked at the file’s commit history on GitHub.
Sure enough, in July 2018 a PR about performance optimization was merged into master.
Take a look at what’s changed:
Case closed! The first piece is the mime-types library: it looks up a JSON file (mime-db/db.json) and returns a Content-Type mapping such as html -> text/html, appending charset=utf-8 where appropriate.
// db.json
{
  "text/html": {
    "source": "iana",
    "compressible": true,
    "extensions": ["html", "htm", "shtml"]
  }
}
Here’s a nice site for running Node.js packages online: npm.runkit.com/mime-types.
You can play with these libraries right in your browser, as well as view the source code, README.md, and more.
Koa optimization path: LRU
Fast forward to today: inside Koa, the direct use of mime-types has been dropped in favor of the cache-content-type library.
The source code is simpler, with just 15 lines:
"use strict";
const mimeTypes = require("mime-types");
const LRU = require("ylru");
const typeLRUCache = new LRU(100);
module.exports = type= > {
let mimeType = typeLRUCache.get(type);
if(! mimeType) { mimeType = mimeTypes.contentType(type); typeLRUCache.set(type, mimeType); }return mimeType;
};
Copy the code
The core idea is to maintain a Cache:
- When a type’s mimeType is not in the Cache, a lookup is performed and the resulting mimeType is stored in the Cache
- When the same type is requested again, the cached mimeType is returned directly and no further lookup is performed
Because the JSON file that stores the MIME information is quite large, each lookup takes time, so we want to minimize the number of lookups.
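To get a feel for the cost, here is a rough micro-benchmark sketch of my own (not from the Koa repo); the absolute numbers will vary by machine and package version, but it exercises the two paths being compared:

// Rough sketch: repeated direct lookups vs. the LRU-cached getType.
const mimeTypes = require("mime-types");
const getType = require("cache-content-type");

console.time("mime-types.contentType");
for (let i = 0; i < 1e5; i++) mimeTypes.contentType("html");
console.timeEnd("mime-types.contentType");

console.time("cache-content-type");
for (let i = 0; i < 1e5; i++) getType("html");
console.timeEnd("cache-content-type");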
The typeLRUCache has a maximum capacity of 100. Once more than 100 types have been stored, can the results of subsequent lookups no longer be cached?
No, because the typeLRUCache in the code is not the same as the plain Cache I just described.
The Cache I described is a simple space-for-time trade-off, but we can’t mindlessly expand that space. Suppose the capacity were 1000000 or 1e10: that would dramatically increase the program’s memory usage and could bring the machine to a crawl or crash it.
Thanks to the ylru library, we have a compromise.
It implements the Least Recently Used (LRU) algorithm. What does this algorithm do? Let me give you an example to make it easier to understand.
There are only two parking spaces, P1 and P2, in a neighborhood, but there are four car owners: A, B, C and D.

- On day 1, Tesla owner A uses P1
- On day 2, BMW owner B uses P2
- On day 3, Porsche owner C arrives to park and finds no free space. What happens?
  - Since BMW owner B used P2 just one day ago while Tesla owner A last used P1 two days ago, the property management moves A’s car out of its space (assume, for the sake of the story, that it can simply be towed)
  - Porsche owner C successfully parks in P1
- On the morning of day 4, BMW owner B drives out of the space
- On the afternoon of day 4, BMW owner B drives back into the space
- On day 5, Mercedes owner D arrives to park and finds both spaces taken. What happens?
  - BMW owner B used P2 just a day ago, while Porsche owner C’s car has been sitting quietly since day 3; luxury car or not, it is C’s car that gets moved out
  - Mercedes owner D successfully parks in P1

Conclusion: the total number of parking spaces stays the same, but the owners who use the spaces most frequently are the ones whose interests are protected. A space is never assigned by who came first or whose car is more expensive.
The typeLRUCache works the same way: it holds at most 100 cached results, preventing unbounded growth, while the results for recently and frequently looked-up types stay in the cache, keeping subsequent lookups efficient.
I wrote a simple LRU demo here:
class LRUCache {
  constructor(capacity) {
    this.cache = new Map();
    this.capacity = capacity;
  }

  get(key) {
    if (this.cache.has(key)) {
      // Re-insert the key so it becomes the most recently used entry.
      const temp = this.cache.get(key);
      this.cache.delete(key);
      this.cache.set(key, temp);
      return temp;
    }
    return -1;
  }

  put(key, value) {
    if (this.cache.has(key)) {
      this.cache.delete(key);
    } else if (this.cache.size >= this.capacity) {
      // Map preserves insertion order, so the first key is the least recently used.
      this.cache.delete(this.cache.keys().next().value);
    }
    this.cache.set(key, value);
  }
}

const cache = new LRUCache(2);
cache.put("A", 1);
cache.put("B", 2);
cache.get("A"); // returns 1
cache.put("C", 3); // this operation evicts "B"
cache.get("B"); // returns -1 (not found)
cache.put("D", 4); // this operation evicts "A"
cache.get("A"); // returns -1 (not found)
cache.get("C"); // returns 3
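To close the loop, here is a sketch of my own (the real cache-content-type uses ylru, not this demo class) that wraps mime-types with the LRUCache above to get roughly the same behavior as getType:

// Sketch only: a getType-like helper built on the LRUCache demo above.
const mimeTypes = require("mime-types");
const typeCache = new LRUCache(100);

function getContentType(type) {
  let mimeType = typeCache.get(type);
  if (mimeType === -1) {
    // Not cached yet: do the (relatively expensive) lookup and remember it.
    mimeType = mimeTypes.contentType(type);
    typeCache.put(type, mimeType);
  }
  return mimeType;
}

getContentType("html"); // looked up via mime-types, then cached
getContentType("html"); // served straight from the cache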
Wrapping up
And that’s a happy ending: starting from setting a single Content-Type header, we have walked through the source code of four libraries, a solid win.
If you are interested in Koa’s source code, check out Koa source analysis & implementation, which teaches you to hand-write a “copy” of Koa.