Performance optimization analysis
As a front-end developer you can usually say a thing or two about performance optimization, but the knowledge tends to be scattered. So how do we organize a performance optimization analysis?
This article walks through the whole process, from the user entering a URL to the page being rendered, identifies where the main performance costs lie, and then optimizes each stage accordingly.
- The browser sends a request
- DNS resolution
- Set up a TCP connection (the three-way handshake). The browser limits the number of concurrent connections per domain to six; if six connections to a domain are already open, further requests to it are queued.
- Establish a TLS connection (HTTPS)
- After the connection is established, the browser constructs the request line and request headers, attaches data associated with the domain (such as cookies) to the headers, and sends the request to the server.
- The server parses the request
- If the response is a 301 or 302 redirect, the browser re-sends the request to the address given in the Location field of the response header
- If the response is 200, the browser starts parsing the returned resource (assume the resource is index.html)
- The browser starts building the DOM Tree by parsing the HTML sequentially from top to bottom
- When the parser reaches a stylesheet link such as
<link href="foo.css" rel="stylesheet">
it downloads the style file and converts it into the CSSOM
- When it reaches a script tag such as
<script src="foo.js"></script>
it downloads and executes the script, which blocks HTML parsing
- When it reaches an image tag such as
<img src="foo.jpg" />
it downloads the image (the same applies to audio and video resources; images are just the example here)
- The browser performs layout based on the generated DOM Tree and CSSOM and produces a LayoutTree
- A LayerTree is generated from the LayoutTree (elements with certain styles, such as a different z-index, get their own layers)
- A paint list is generated from the LayerTree and then rasterized; finally the bitmaps are composited and displayed on screen
- Finally, the user interacts with the web page

In terms of performance cost, this process can be divided into three stages:
- Network phase
- Resource loading phase
- User interaction phase
Network phase
- DNS resolution
- TCP connection
- HTTP request/response

There is very little the front end can do about DNS resolution and TCP connections, so we focus on HTTP optimization. There are two main approaches:
- Try to avoid redirects
- Reduce the time spent on a single request
Try to avoid redirects
After the browser has already gone through DNS resolution, connection setup, and the request itself, a redirect forces it to repeat that whole series of steps for the new address, so redirects should be avoided whenever possible.
Reduce the time spent on a single request
1. Reduce the size of cookies
We know that every request sent to the server carries the cookies for that domain, so cookies should be kept as concise as possible.
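As a small illustration (the cookie name, value, and path are hypothetical), scoping a cookie to the path that actually needs it and limiting its lifetime keeps request headers lean:

// Hypothetical example: a preference cookie scoped to /settings only,
// so it is not attached to requests for pages and assets elsewhere on the site
document.cookie = 'theme=dark; Path=/settings; Max-Age=86400; SameSite=Lax';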
2. Enable Gzip compression on the server
The Gzip compression algorithm can greatly reduce the size of transferred text resources such as HTML, CSS, and JavaScript.
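Gzip is normally enabled in the web server or gateway configuration (nginx, Apache, and so on). As a minimal sketch, assuming a Node.js service built with Express, the compression middleware does the same job:

const express = require('express');
const compression = require('compression'); // gzip middleware for Express

const app = express();
app.use(compression());          // compress responses before sending them
app.use(express.static('dist')); // serve the built static assets

app.listen(3000);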
3. Use CDN
CDN stands for Content Delivery Network. A CDN redirects a user's request in real time to the service node closest to the user, based on information such as network traffic, the connectivity and load of each node, the distance to the user, and response time. Its purpose is to let users fetch the content they need from a nearby node, relieving network congestion and improving the response speed of the site.
A CDN has two core concepts: caching and back-to-source.
Both are easy to understand. "Caching" means a copy of the resource is stored on the CDN node, and "back-to-source" means that when the CDN node does not have the resource (usually because the cached copy has expired), it requests it from the origin server (or an upper-layer node).
Benefits of using CDN:
- We access the physically closest resource server, shortening the access path;
- The CDN already has caching policies in place, which improves access speed;
- Static resources can be spread across different CDN domains, getting around the browser's per-domain connection limit;
- Because the CDN domain differs from the site's own domain, requests to it do not carry the site's cookies.
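In a build setup, using a CDN often just means pointing the bundler's public path at the CDN domain; a sketch using webpack's output.publicPath option (the CDN URL is hypothetical):

// webpack.config.js (fragment)
module.exports = {
  output: {
    filename: '[name].[contenthash].js',
    // Hypothetical CDN domain: emitted asset URLs will point here,
    // so they are requested from the CDN instead of the page's own domain
    publicPath: 'https://static.your-cdn.example.com/assets/'
  }
};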
4. Use a Service Worker
A Service Worker can be thought of as a proxy server between a client and a server. There are many things you can do in a Service Worker, such as intercepting client requests, sending messages to the client, making requests to the server, etc. One of the most important functions is to cache resources.
In Chrome's architecture, a Service Worker runs independently of any individual page, so it has a long life cycle and can provide services for all of the site's pages while the browser is running.
Service Workers can only be used over HTTPS or on localhost.
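Before any caching logic takes effect, the page has to register the Service Worker script; a minimal registration sketch (the /sw.js path is an assumption):

// Register the Service Worker after the page has loaded,
// so registration does not compete with first-load resources
if ('serviceWorker' in navigator) {
  window.addEventListener('load', () => {
    navigator.serviceWorker.register('/sw.js')
      .then(reg => console.log('Service Worker registered, scope:', reg.scope))
      .catch(err => console.error('Service Worker registration failed:', err));
  });
}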
Workbox 3 wraps the low-level Service Worker APIs in a form that is much easier to work with. The following caching setup is written with Workbox 3:
// First import the Workbox framework
importScripts('https://storage.googleapis.com/workbox-cdn/releases/3.3.0/workbox-sw.js');

// Static resources that should stay cached offline and only be updated when a new
// version goes online can be precached with workbox.precaching
workbox.precaching.precacheAndRoute([
  '/styles/index.0c9a31.css',
  '/scripts/main.0d5770.js',
  { url: '/index.html', revision: '383676' },
]);

// HTML caching strategy
workbox.routing.registerRoute(
  new RegExp('.*\.html'),
  workbox.strategies.networkFirst()
);

// JS and CSS caching strategy
workbox.routing.registerRoute(
  new RegExp('.*\.(?:js|css)'),
  workbox.strategies.cacheFirst()
);

// Image caching strategy
workbox.routing.registerRoute(
  /.*\.(?:png|jpg|jpeg|svg|gif)/g,
  workbox.strategies.cacheFirst({
    cacheName: 'my-image-cache',
  })
);

// CDN caching strategy
workbox.routing.registerRoute(
  new RegExp('https://your\.cdn\.com/'),
  workbox.strategies.staleWhileRevalidate()
);

// Image CDN caching strategy
workbox.routing.registerRoute(
  new RegExp('https://your\.img\.cdn\.com/'),
  workbox.strategies.cacheFirst({ cacheName: 'example:img' })
);
The cacheFirst and staleWhileRevalidate calls above are the caching strategies applied to each route. Let's look at the strategies that are available.
Stale-While-Revalidate
If the requested route has a corresponding cached result, it is returned immediately, and at the same time a network request is made in the background to fetch a fresh result and update the cache. If there is no cached result, a network request is sent directly and its result is returned. This is a very safe strategy for users, because it guarantees a response as quickly as possible; its drawback is that the background requests still consume the user's bandwidth.
Cache First
If the cache contains a result for the request, it is returned directly; otherwise a network request is made, its result is written to the cache, and the result is returned to the client. This strategy suits requests whose results rarely change and that have low real-time requirements.
Network First
When a request matches this route, the network is tried first: if the network request succeeds, its result is returned to the client and written to the cache; if it fails, the most recent cached result is returned instead. This strategy suits requests whose results change often or that have real-time requirements, while still providing a fallback when the network request fails.
Network Only
Always send a normal network request and return its result to the client. This suits requests with high real-time requirements.
Cache Only
Return the cached result directly to the client. This is a good strategy for static resources that never change once they are online.
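To make these strategies less abstract, here is a rough hand-written sketch (without Workbox) of what Cache First and Network First look like inside a Service Worker's fetch handler; the cache name and routing rules are assumptions:

const CACHE_NAME = 'static-v1'; // hypothetical cache name

// Cache First: return the cached response if present, otherwise fetch and cache it
async function cacheFirst(request) {
  const cached = await caches.match(request);
  if (cached) return cached;
  const response = await fetch(request);
  const cache = await caches.open(CACHE_NAME);
  cache.put(request, response.clone());
  return response;
}

// Network First: try the network, fall back to the cache if the request fails
async function networkFirst(request) {
  try {
    const response = await fetch(request);
    const cache = await caches.open(CACHE_NAME);
    cache.put(request, response.clone());
    return response;
  } catch (err) {
    const cached = await caches.match(request);
    if (cached) return cached;
    throw err;
  }
}

self.addEventListener('fetch', event => {
  const url = new URL(event.request.url);
  // Example routing: HTML navigations go network-first, static assets cache-first
  if (event.request.mode === 'navigate') {
    event.respondWith(networkFirst(event.request));
  } else if (/\.(?:js|css|png|jpg|jpeg|svg|gif)$/.test(url.pathname)) {
    event.respondWith(cacheFirst(event.request));
  }
});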
Resource loading phase
Common resource types include HTML documents, CSS, JavaScript, images, and so on. Let's look at which aspects of resource loading can be optimized.
Packaging optimization
Among bundling tools, let's take webpack as the example.
Resource compression
Compressing JS and CSS resources with webpack 4.x
const HtmlWebpackPlugin = require('html-webpack-plugin');
const UglifyJsPlugin = require('uglifyjs-webpack-plugin');                      // plugin for compressing JS
const OptimizeCSSAssetsPlugin = require('optimize-css-assets-webpack-plugin'); // plugin for compressing CSS
const path = require('path');

const config = {
  mode: 'production',
  entry: './path/to/my/entry/file.js',
  output: {
    filename: 'my-first-webpack.bundle.js',
    path: path.resolve(__dirname, 'dist')
  },
  optimization: {
    // In webpack 4, overriding minimizer replaces the default JS minifier,
    // so a JS minimizer must be listed here alongside the CSS one
    minimizer: [
      new UglifyJsPlugin(),           // compress JS code
      new OptimizeCSSAssetsPlugin({}) // compress CSS code
    ]
  },
  module: {
    rules: [
      { test: /\.(js|jsx)$/, use: 'babel-loader' },
      { test: /\.css$/, use: ['css-loader'] }
    ]
  },
  plugins: [
    new HtmlWebpackPlugin({ template: './src/index.html' })
  ]
};

module.exports = config;
Load on demand
On-demand loading is implemented mainly with the dynamic import() syntax introduced by ES6.
function getComponent(){
return import(/* webpackChunkName: "lodash" */ 'lodash').then(({default: _})=>{
var element = document.createElement('div');
element.innerHTML = _.join(['hello','webpack'],'-');
return element;
})
}
document.addEventListener('click',()=>{
getComponent().then(component=>{
document.body.appendChild(component);
});
});
The lodash library is now loaded when the click event occurs rather than during initialization, which greatly reduces the first-load time.
Preload and prefetch
<link rel="prefetch"></link> // Resources marked as prefetch will be loaded by the browser at idle time. <link rel="preload"></link> // Preload is usually used for key resources used in this page, including key JS, fonts, CSS files. Preload will increase the load order weight of resources, so that key data can be downloaded in advance, optimize the speed of page opening.Copy the code
Webpack recognizes these prefetch/preload hints through magic comments in the code and splits the annotated module into a separate chunk.
const element = document.createElement('button');
element.innerHTML = 'Login';
element.onclick = function () {
  import(/* webpackPrefetch: true */ /* webpackChunkName: "login" */ './login.js')
    .then(({ default: loginFunc }) => {
      loginFunc();
    });
};
document.body.appendChild(element);
When webpack sees a module annotated with /* webpackPrefetch: true */, it packages it as a separate chunk and loads it via a prefetch hint; the core idea is to take advantage of the prefetch and preload capabilities provided by the browser.
The code above uses prefetch to load the login-box module. Since the login box is only needed when the login button is clicked, it does not have to be loaded as soon as the page is opened, which saves first-load time.
tree shaking
The essence of tree shaking is to eliminate useless JavaScript code.
For example, suppose we import from the lodash library:
import { forEach } from 'lodash';
forEach([1, 2, 3], () => {});
Webpack will automatically tree-shake for us and only bundle the forEach functionality that we actually use.
The prerequisite is using ES6 import and export module syntax, because the module dependency graph is built during static parsing of the code. This means the dependency on lodash's forEach is known before the code runs, so webpack's analysis can eliminate unused code and bundle only the modules that are used.
Comparison with CommonJS
- CommonJS modules are loaded at run time, while ES6 modules expose their interface at compile time.
- CommonJS loads an object (the module.exports property) that only exists after the script has run. An ES6 module is not an object; its external interface is a static definition produced during the static parsing phase of the code.
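A tiny illustration of the difference (the module paths are hypothetical): the ES6 form can be analyzed without running any code, while the CommonJS form cannot be resolved until run time:

// ES6 modules: the import is static, so at build time the bundler already
// knows that only `forEach` from './utils.js' is used and can drop the rest
import { forEach } from './utils.js';
forEach([1, 2, 3], n => console.log(n));

// CommonJS (for contrast): the exports object only exists after the module runs,
// and the required path can even be computed at run time, so static analysis
// cannot safely remove unused exports:
// const utils = require(Math.random() > 0.5 ? './utils-a.js' : './utils-b.js');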
Resource loading location optimization
By optimizing where resources are placed and when they are loaded, we can display page content and make functionality available as quickly as possible.
1. Place CSS files in the head
- Download CSS styles first to avoid pages that already have content but no style
- If the CSS is not downloaded first in the head and it later arrives containing a style such as display:none, the browser has to regenerate the CSSOM, apply it to the corresponding LayerTree, and then rasterize and render again, causing repaints and reflows.
2. Put JS files at the bottom of the body
Placing all script files at the bottom of the body keeps their download and execution from blocking DOM parsing, so the page content can be built and displayed sooner.