background
What is a polyfill?
As we all know, nobody wants to support IE6/7/8 anymore, and ideally nobody ever would again. But reality bites. Fine, supporting old browsers is not the end of the world in itself; the pain is everything we give up: no fancy arrow functions, no iterators, no Promise, no async/await, no import, no Array.prototype.map, no Array.prototype.filter... you simply cannot use what you would like to use. And so it all started about ten years ago.
No programmer with ambition lets reality defeat them, so all kinds of hacks sprang up. For example:
```javascript
window.Promise = (function (window) {
  if (window.Promise) {
    // the browser already has a native Promise, keep it
    return window.Promise
  }
  // otherwise fall back to a hand-written implementation
  return function Promise () {
    // compatible code
  }
})(window)
```
Code like this used to be everywhere in project directories. With such compatibility shims in place, we could pull off all kinds of fancy tricks with ease. Code like this is called a polyfill.
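A slightly fuller hand-rolled example of the same pattern, sketched here for Array.prototype.includes (illustrative only, not production code): detect the feature first, and patch the prototype only when the browser lacks it.

```javascript
// A guarded shim for Array.prototype.includes, following the same
// detect-then-patch pattern as the Promise example above.
var includesShim = function (searchElement, fromIndex) {
  var arr = Object(this)
  var len = arr.length >>> 0
  var i = Math.max(fromIndex | 0, 0)
  for (; i < len; i++) {
    var el = arr[i]
    // `x !== x` is only true for NaN, so NaN is matched correctly
    if (el === searchElement || (el !== el && searchElement !== searchElement)) {
      return true
    }
  }
  return false
}

// Only patch when the feature is missing
if (!Array.prototype.includes) {
  Array.prototype.includes = includesShim
}
```

With the guard in place, modern browsers keep their native implementation and only legacy ones pick up the shim.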
"For actually the earth had no roads to begin with, but when many men pass one way, a road is made."

Mr. Lu Xun put it well, as always. Sprinkling polyfills into a project by hand like this was never going to be a long-term solution, and so Babel was born.
Babel is a widely used transpiler. It automatically converts ES6+ code down to a specified JS version. For example:
```javascript
// before
input.map(item => item + 1);

// after transpilation
input.map(function (item) {
  return item + 1;
});
```
However, by default Babel only converts JS syntax, not new APIs such as Iterator, Generator, Promise and so on, so you need the polyfills Babel provides to cover those.
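To make the syntax-versus-API distinction concrete, here is a small sketch; the commented transpiled output is approximate, not Babel's exact result:

```javascript
// Babel rewrites the arrow-function *syntax* below into ES5, but `Promise`
// is a runtime *API*: without a polyfill loaded first, the transpiled code
// still throws on old IE.
const delay = ms => new Promise(resolve => setTimeout(resolve, ms))

// Roughly what Babel emits (syntax converted, Promise left untouched):
// var delay = function (ms) {
//   return new Promise(function (resolve) { setTimeout(resolve, ms) })
// }

delay(5).then(() => console.log('resolved'))
```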
Here are a few ways to introduce Polyfill into a project
@babel/polyfill
Import it at the project entry:

```javascript
require('babel-polyfill')
// or
import 'babel-polyfill'
```
This solves all compatibility issues in one shot by mounting everything on the global object up front. The downside is that it pulls in every polyfill, so the JS bundle gets much bigger, and browsers that already support the features download it for nothing. It is not suitable for frameworks or libraries.
@babel/preset-env
Use the babel-preset-env plugin together with the useBuiltIns option, configured mainly in package.json:
```json
{
  "presets": [
    ["env", {
      "modules": false,
      "targets": {
        "browsers": ["ie >= 9"]
      },
      "useBuiltIns": true,
      "debug": true
    }]
  ]
}
```
The advantage is that you can target specific browser versions and, most importantly, the useBuiltIns option ensures the project only pulls in the polyfills it actually needs instead of bundling everything, which is clearly an improvement over the previous approach.
polyfill.io
This approach loads the required features from a CDN via a script tag, and can decide dynamically whether a polyfill is needed based on the user's browser UA and the URL parameters.
polyfill.io/v2/polyfill…
This is more elegant than the two methods above: you don't bundle polyfills into your own project, polyfills are applied more precisely, and no user bandwidth is wasted. The downside, of course, is that you have to know which APIs the frameworks and libraries you use depend on.
research
Based on the comparison above, this time we mainly study the third approach, the CDN one.
The first is the CDN provided by polyfill.io shown above; the second is the CDN provided by Alibaba:

polyfill.alicdn.com/static/sdk?…
We could use these CDNs directly for company or personal projects, but doing so significantly increases two risks:

- depending on the stability of a third-party service
- depending on third-party code
Given these two points, it is worth investigating how to deploy our own polyfill service inside the company.
polyfill-service
Following the docs at polyfill.io/v3/, and the project at github.com/Financial-T…, we can build the service locally; the site also lets you pick features freely and generate a fixed bundle.
After investigating the code, we found the following problems:
- Node provides the back-end service
- The code structure is too complex
- It introduces too many external links
- It introduces too many unfamiliar tools (Terraform, VCL, NJK…)
- It adds maintenance and security costs later on
Code analysis
Beyond the summary above, reading the code shows that although the service is complex, the core library sits under the server folder; all the major logic lives there, so let's do a simple analysis.
Setting aside service startup, health checks and process daemons, the polyfill handling itself lives in the v3 router.
It turns out that both the polyfill and polyfill.min routes get their responses from the polyfillio package. So we can skip a layer and go straight to that package for the details.
polyfill-library
This package lives in the GitHub project:

github.com/financial-t…
As the README shows, the library exposes only five APIs, which is admirably concise. Now look at how the project is organized.
The project entry is lib/index.js, where you can see the exports. The polyfills directory holds far too many files to walk through, and the other three files under lib are all helpers, so I won't go through them one by one; here is an overview of the main methods in index.
index.js
Among the exported methods, two are central here: getPolyfills and getPolyfillString. Each can be summarized as a pipeline:
- getPolyfills

Process the parameters → get the current browser UA → check the UA cache → resolve and sort each feature's dependencies and aliases → filter the sources by UA → remove everything listed in excludes → drop methods the browser already supports → return the remaining feature objects
- getPolyfillString

Process the parameters and declare the prompt variables → call getPolyfills to get the features the user needs → group the features by dependency (featureNodes, featureEdges, unknown…) → attach comment attributes to each feature → sort and reassemble the groups → emit the matching header depending on minify → walk all dependencies → check each feature's gated flag to decide whether to guard loading → emit the min or raw source accordingly → emit the closing code → set the description of the included features → wrap everything in the callback if a callback parameter was given → return the final string
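The dependency-resolution and ordering steps above can be illustrated with a toy sketch; the registry, feature names and sources below are invented for illustration, and this is not polyfill-library's actual code:

```javascript
// Toy feature registry: each feature lists its dependencies and source
const registry = {
  Promise: { deps: ['setImmediate'], source: '/* Promise polyfill */' },
  setImmediate: { deps: [], source: '/* setImmediate polyfill */' }
}

// Depth-first resolution: emit every dependency before its dependent
function resolve (names, seen = new Set(), out = []) {
  for (const name of names) {
    if (seen.has(name)) continue
    seen.add(name)
    resolve(registry[name].deps, seen, out)
    out.push(name)
  }
  return out
}

// Concatenate the sources in dependency order, like the final bundle
function bundle (names) {
  return resolve(names).map(name => registry[name].source).join('\n')
}
```

Here resolve(['Promise']) yields ['setImmediate', 'Promise'], so the setImmediate shim is emitted before the Promise shim that needs it.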
With that summary in mind, here is a screenshot of the polyfills directory.
As you can see, each feature consists of three files: min.js is the compressed compatibility code, raw.js is the full source, and meta.json is the feature's description file. Inside meta.json, the browsers field lists the compatibility conditions, which are matched against browser UAs via a third-party package. The project's UA cache builds on top of that: lookup results are cached with the mnemonist/lru-cache tool to save the cost of repeated comparisons.
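As a rough illustration of what that LRU cache buys us, here is a minimal Map-based stand-in (not mnemonist's actual implementation):

```javascript
// Minimal LRU sketch: a Map preserves insertion order, so deleting and
// re-setting a key moves it to the "most recently used" end.
class LRU {
  constructor (max) {
    this.max = max
    this.map = new Map()
  }

  get (key) {
    if (!this.map.has(key)) return undefined
    const value = this.map.get(key)
    this.map.delete(key)
    this.map.set(key, value) // refresh recency
    return value
  }

  set (key, value) {
    if (this.map.has(key)) this.map.delete(key)
    this.map.set(key, value)
    if (this.map.size > this.max) {
      // evict the least recently used entry (first key in the Map)
      this.map.delete(this.map.keys().next().value)
    }
  }
}
```

Keyed by UA string, a cache like this means repeated requests from the same browser skip the expensive UA parsing and filtering.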
Development
That concludes a brief survey of the library the polyfill server needs. So what kind of architecture do we actually want to build?
The diagram above is a simple service architecture I drew. LB and Kubernetes are not the focus this time; we will mainly work on the nginx and Node.js parts.
So we need to do two things:

- a Node.js server that parses the URL parameters and the user's UA, and dynamically returns the polyfill code
- nginx in front of it, caching the generated code by URL and UA to take pressure off the node server
koa2
Implementing the first part in Node.js is not complicated. There are two well-known Node.js web frameworks, Express and Koa. This article is not about comparing the two or explaining Node.js web frameworks, so we will simply use Koa.
Install directly in the working directory
```shell
npm init
yarn add koa   # or: npm install koa
yarn add koa-bodyparser koa-router lodash moment polyfill-library
```
Configure package.json and write a simple app.js:
```javascript
// app.js
const Koa = require('koa')
const app = new Koa()
const port = process.env.PORT || 3344

app.use(require('koa-bodyparser')())

const router = require('./routers/index.js')
app.use(router.routes())
app.use(router.allowedMethods())

app.listen(port, () => {
  console.log(`server started at localhost:${port}`)
})
```

And the scripts section of package.json:

```json
"scripts": {
  "start": "NODE_ENV=production node ./app.js",
  "test": "NODE_ENV=test node ./app.js",
  "dev": "NODE_ENV=development node ./app.js",
  "lint": "eslint --ext .js ."
},
```
A single yarn start then brings the service up.
router
With the service running, we need to think about the router design. Based on the parameters polyfill-library supports, we build only the simplest possible polyfill service for now:
- requesting polyfill.min.js returns the compressed JS, and requesting polyfill.js returns the uncompressed JS
- appending features=Array.of&features=Set to the URL determines which polyfill features are returned
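Whichever route is hit, the features query value arrives either as a string (one feature) or an array (several). Normalizing it into the object shape getPolyfillString expects can be sketched like this, mirroring what our generator module will do:

```javascript
// Normalize ?features=... (koa yields a string for one value, an array for
// several) into { FeatureName: { flags: ['gated'] } }
function buildFeatures (features) {
  let fkey = []
  if (typeof features === 'string') fkey = [features]
  else if (Array.isArray(features)) fkey = [].concat(features)
  return fkey.reduce((result, item) => {
    result[item] = { flags: ['gated'] }
    return result
  }, {})
}
```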
For Koa's router setup we follow convention over configuration. As shown in the figure, all routers live in the routers folder and index.js is the single export.
The code for index.js is as follows
```javascript
const fs = require('fs')
const path = require('path')
const Router = require('koa-router')
const router = new Router()
const ROUTERS_PATH = './routers/'

function useRouter (customerRouters, controllerPath, prefix = '') {
  Object.keys(customerRouters).forEach(key => {
    let item = customerRouters[key]
    let method = item.method
    let action = item.action
    let uri = item.ignoreActionName ? `${prefix}` : `${prefix}/${action.name}`
    router[method](uri, async (ctx) => {
      console.log('make router succeed.', {
        'httpMethod': method,
        'routerPath': uri,
        'filePath': controllerPath,
        'actionName': action.name
      })
      await action(ctx)
    })
  })
}

function readAllRouterFileRecursive (routersPath = path.resolve(ROUTERS_PATH), prefix = '') {
  let list = fs.readdirSync(routersPath).filter(item => item !== 'index.js')
  list.forEach(item => {
    let resolvePath = routersPath + '/' + item
    let stat = fs.statSync(resolvePath)
    let isDir = stat.isDirectory()
    let isFile = stat.isFile()
    if (isDir) {
      readAllRouterFileRecursive(resolvePath, `${prefix}/${item}`)
    } else if (isFile) {
      useRouter(require(resolvePath), resolvePath, `${prefix}/${item}`)
    }
  })
}

readAllRouterFileRecursive()
module.exports = router
```
The core idea is to read the current directory, use the file and folder names as route paths, and hand each request to the exported action. polyfill.js and polyfill.min.js are then configured as follows:
```javascript
// polyfill.js router
const generatorPolyfill = require('../lib/generatorPolyfill')

module.exports = [
  {
    method: 'get',
    ignoreActionName: true,
    action: async (ctx, next) => {
      const result = await generatorPolyfill(ctx, '')
      ctx.body = result
    }
  }
]
```

```javascript
// polyfill.min.js router
const generatorPolyfill = require('../lib/generatorPolyfill')

module.exports = [
  {
    method: 'get',
    ignoreActionName: true,
    action: async (ctx, next) => {
      const result = await generatorPolyfill(ctx, 'min')
      ctx.body = result
    }
  }
]
```
The only difference is whether the min flag is passed to generatorPolyfill.
The code in generatorPolyfill.js is:
```javascript
const polyfillLibrary = require('polyfill-library')

module.exports = async (ctx, isMin) => {
  const { request, query } = ctx
  const { features } = query
  const { header } = request

  let fkey = []
  if (typeof features === 'string') {
    fkey.push(features)
  } else if (Array.isArray(features)) {
    fkey = [].concat(features)
  }

  let paramsFeatures = fkey.reduce((result, item) => {
    result[item] = { flags: ['gated'] }
    return result
  }, {})

  // const uaString = 'Mozilla/4.0 (compatible; MSIE 5.0; Windows NT 6.1; Trident/4.0)'
  const uaString = header['user-agent']
  console.log('uaString----------------->')
  console.log(uaString)
  console.log('<-------------end uaString')
  console.log(paramsFeatures)

  const result = await polyfillLibrary.getPolyfillString({
    uaString: uaString,
    minify: isMin === 'min',
    features: paramsFeatures
  })
  fkey = []
  paramsFeatures = {}
  return result
}
```
It simply hands the user's User-Agent and the URL parameters to getPolyfillString and returns the result, which is easy enough to follow. Once this is configured, we can access the service.
Based on the test requests above, the service works fine.
nginx
With the Node.js service running normally, there is one more thing to consider: the same UA with the same features always returns the same data, so we can cache the result in nginx, the layer in front of the Node.js service, and that is exactly what we do. Assuming we use test.com to proxy localhost:3344, the nginx configuration is:
```nginx
proxy_cache_path /data/nginx/cache/polyfill keys_zone=polyfill:10m use_temp_path=off;
server {
    listen 80;
    server_name test.com;
    location ~ ^/polyfill.*\.js {
        proxy_cache polyfill;
        # cache key: request URI + User-Agent
        proxy_cache_key "$request_uri$http_user_agent";
        proxy_ignore_headers Cache-Control;
        proxy_cache_min_uses 1;
        proxy_cache_valid 200 206 304 301 302 1y;
        add_header X-Cache-Status $upstream_cache_status; # added line
        proxy_pass http://127.0.0.1:3344;
    }
}
```
Then modify the hosts configuration
127.0.0.1 test.com
Now test.com resolves locally and the service can be accessed normally. Let's verify that the cache works.
On the first request, node prints a log line each time it is hit.
But when we send the same request a second time, node prints nothing: the request stops at nginx and hits the cache.
With that, the whole flow works end to end. The complete Node.js service code is here:
Github.com/wuyxp/polyf…
Deployment
Once the service is complete, deployment is a task of its own, since it involves Node.js process daemons, log collection, service monitoring and so on. See you in a follow-up article.