Summary of front-end performance optimization
Gzip compression
Gzip compression is very effective and can typically reach a compression rate of about 70%.
// npm i -D compression-webpack-plugin to install the plugin dependency
// vue.config.js
configureWebpack: config => {
  const CompressionPlugin = require('compression-webpack-plugin')
  config.plugins.push(new CompressionPlugin())
}
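The plugin also accepts options if you want finer control. A minimal sketch, assuming the standard compression-webpack-plugin options (the values are illustrative):

```js
// vue.config.js — compress only larger JS/CSS assets
const CompressionPlugin = require('compression-webpack-plugin')

module.exports = {
  configureWebpack: config => {
    config.plugins.push(new CompressionPlugin({
      test: /\.(js|css)$/, // only compress JS and CSS output
      threshold: 10240,    // skip assets smaller than 10 KB
      minRatio: 0.8        // keep the .gz file only if it is at least 20% smaller
    }))
  }
}
```

Note that the generated .gz files still need to be served: the web server (for example nginx with gzip_static on) has to be configured to return them.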
Remove the console log
Logs do not need to be printed in production. By configuring webpack, console.log calls can be removed automatically during the build.
// npm i -D terser-webpack-plugin
// vue.config.js
configureWebpack: config => {
  const TerserPlugin = require('terser-webpack-plugin')
  config.optimization.minimizer.push(
    new TerserPlugin({
      extractComments: false,
      terserOptions: { compress: { drop_console: true } } // plugin option that removes console calls
    })
  )
}
Remove SourceMap
After code is compressed, locating bugs becomes very difficult, so source maps are introduced to record the mapping between positions before and after compression. When an error occurs, it can be traced directly to the original source position, which greatly simplifies debugging.
However, a source map carries a lot of information; generating it slows the build down considerably and increases the package size, so it should be turned off in production.
// In Vue (vue.config.js)
module.exports = {
  productionSourceMap: false
}

// In React (create-react-app): in webpack.config.prod.js
const shouldUseSourceMap = false
CDN
A content delivery network (CDN) redirects a user's request in real time to the nearest service node, based on network traffic, the connection and load status of each node, the distance to the user, response time, and other information. Its purpose is to let users fetch the content they need from a nearby node, relieving network congestion and improving the response speed of the site. Deploying resources on a CDN therefore improves response time and user experience.
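In a webpack/Vue CLI project this usually means marking large libraries as externals and loading them from the CDN in index.html instead of bundling them. A minimal sketch, assuming Vue and axios are the libraries being offloaded (the CDN URLs are only illustrative):

```js
// vue.config.js — do not bundle these packages; they are provided by <script> tags from the CDN
module.exports = {
  configureWebpack: {
    externals: {
      vue: 'Vue',     // import 'vue' resolves to the global Vue injected by the CDN script
      axios: 'axios'
    }
  }
}

// public/index.html (illustrative URLs)
// <script src="https://cdn.jsdelivr.net/npm/vue@2.6.14/dist/vue.min.js"></script>
// <script src="https://cdn.jsdelivr.net/npm/axios@0.27.2/dist/axios.min.js"></script>
```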
Pre-rendering
Simply put, the rendering work that JavaScript would normally do dynamically in the browser is done at build time instead (only pages with static data can be pre-built). In other words, during the build, webpack uses prerender-spa-plugin to generate HTML with the static structure already in place.
// Install prerender-spa-plugin
npm install prerender-spa-plugin --save-dev
// install vue-meta-info
npm install vue-meta-info --save-dev
// 3. Related configuration (webpack.prod.conf.js)
// Pre-render configuration: add it to webpack.prod.conf
const path = require('path')
const PrerenderSPAPlugin = require('prerender-spa-plugin')
const Renderer = PrerenderSPAPlugin.PuppeteerRenderer

// Add to plugins
new PrerenderSPAPlugin({
  // The output path for the generated files; it can be the same as the webpack build output.
  // This is very important!!
  // This directory can only be one level deep. If it is deeper, generation reports no errors,
  // it simply gets stuck during pre-rendering.
  staticDir: path.join(__dirname, '../dist'),
  // The routes from your own routing file. For example, if /a takes a parameter, write it as /a/param1.
  routes: ['/', '/first', '/second', '/third', '/fourth', '/userCenter/userFirst', '/userCenter/userSecond', '/userCenter/userThird'],
  // This is very important; if this section is not configured, pre-compilation will not run
  renderer: new Renderer({
    inject: { foo: 'bar' },
    // headless: false,
    renderAfterDocumentEvent: 'render-event', // matches document.dispatchEvent(new Event('render-event')) in main.js
    args: ['--no-sandbox', '--disable-setuid-sandbox']
  })
})

// 4. In main.js
import MetaInfo from 'vue-meta-info'

new Vue({
  el: '#app',
  router,
  components: { App },
  template: '<App/>',
  // If mounted is not added, pre-compilation will not run
  mounted () {
    document.dispatchEvent(new Event('render-event'))
  }
})
Note: the router must use history mode. Without history mode the build still runs and generates the files, but every generated index.html will have the same content.
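For reference, a minimal vue-router setup in history mode (the route and component are placeholders):

```js
// router/index.js — history mode is required so each pre-rendered page gets its own content
import Vue from 'vue'
import Router from 'vue-router'
import First from '@/views/First.vue' // placeholder component

Vue.use(Router)

export default new Router({
  mode: 'history',
  routes: [
    { path: '/first', component: First }
  ]
})
```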
Service Worker
A service worker is a piece of JavaScript that runs in a background process of the browser. It can do many things, such as intercepting client requests, sending messages to the client, and making requests to the server; one of its most important capabilities is offline resource caching.
The service worker gives rich and flexible control over the caching process. When a page request goes through the service worker, it can hit the cache and the network at the same time, deliver the cached content to the user immediately, and then update the cache with the fresh network response.
Note: You need HTTPS to use ServiceWorker
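A minimal sketch of the cache-then-network behaviour described above (the file name and cache name are illustrative):

```js
// In the page: register the service worker (requires HTTPS, or localhost during development)
if ('serviceWorker' in navigator) {
  navigator.serviceWorker.register('/sw.js')
}

// In sw.js: answer from the cache immediately, then refresh the cache from the network
self.addEventListener('fetch', event => {
  event.respondWith(
    caches.open('v1').then(cache =>
      cache.match(event.request).then(cached => {
        const network = fetch(event.request).then(response => {
          cache.put(event.request, response.clone()) // overwrite the cache with the fresh response
          return response
        })
        return cached || network // serve cached content directly when it exists
      })
    )
  )
})
```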
HTTP cache
HTTP caches generally fall into two categories: strong caches (local caches) and negotiated caches (304 caches)
A normal refresh triggers the negotiated cache; the strong cache takes effect only when the URL is entered directly in the address bar, the page is reached through a link, and so on.
Strong cache (200). The local cache is the fastest form of caching: as long as the resource is still in the cache, the browser reads it directly from local storage without asking the server. In the Network panel of Chrome DevTools you can see that such requests return a status code of 200 and the Size column shows "from disk cache" or "from memory cache". The strong cache is controlled by two HTTP headers: Expires and Cache-Control.
Negotiated cache (304). As the name implies, the browser and the server negotiate before deciding whether the local cache can be used. If the server tells the browser that the local copy is still valid, it returns a 304 status code; the negotiation is cheap because only headers are exchanged, not the response body.
Negotiated caching can be implemented by setting two HTTP headers: Last-Modified and ETag
First, ETag is more accurate than Last-Modified.
Second, in terms of performance ETag is inferior to Last-Modified, because Last-Modified only records a timestamp, whereas ETag requires the server to compute a hash value.
Third, when both are present, the server validates against ETag first.
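A minimal Node sketch (the server, values, and port are illustrative) showing how the two kinds of caching map to response headers:

```js
const http = require('http')
const crypto = require('crypto')

const body = 'hello cache'
const etag = crypto.createHash('md5').update(body).digest('hex')

http.createServer((req, res) => {
  // Negotiated cache: compare the validator sent by the browser with the current one
  if (req.headers['if-none-match'] === etag) {
    res.writeHead(304) // tell the browser to use its local copy; no body is sent
    return res.end()
  }
  res.writeHead(200, {
    'Cache-Control': 'max-age=60', // strong cache: reuse locally for 60s without asking the server
    'ETag': etag                   // validator for the negotiated cache once max-age expires
  })
  res.end(body)
}).listen(3000)
```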
When the browser looks up a cached resource, the order is: Service Worker -> Memory Cache -> Disk Cache -> Push Cache.
Push Cache exists only for sessions, is released at the end of the session, and is cached for a short time
HTTP2
Four new features of HTTP2:
- Multiplexing: multiple requests can be sent concurrently over a single HTTP2 connection, so there is no need to establish multiple TCP connections.
- Binary framing, which encodes all messages to be transmitted in binary and divides the information into smaller message blocks.
- Header compression: Uses the HPACK technology to compress the header and reduce the packet size
- Server push. The server can send data before the client initiates a request. In other words, the server can send multiple responses to a request from the client, and the resources can be properly cached.
# nginx: enable HTTP2 on the TLS listener
server {
  listen 443 ssl http2;
}
Resource preloading
Resources are pre-loaded so that they can be rendered directly from the local cache when the user needs them.
preload
A preloaded resource is fetched during page load, before the browser's main rendering begins, so it is ready by the time it is needed.
// Preload style.css and index.js with preload
<link rel="preload" href="style.css" as="style">
<link rel="preload" href="index.js" as="script">
prefetch
After the page has finished loading, idle time is used to fetch resources in advance.
// Prefetch resources that may be needed for the next navigation
<link rel="prefetch" href="next.css">
<link rel="prefetch" href="next.js">
dns-prefetch
// Preresolve specific domain names
// Placing static resources under only one domain can effectively reduce DNS requests
<link rel="dns-prefetch" href="//fonts.googleapis.com">
Load JS asynchronously without blocking
Load js files asynchronously without blocking page rendering.
An ordinary script tag blocks DOM parsing: the browser stops parsing while the script is downloaded and executed.
defer
<script src="d.js" defer></script> <script src="e.js" defer></script>
d.js and e.js are executed in order, after the other synchronous scripts have run and before the DOMContentLoaded event fires.
async
<script src="b.js" async></script>
<script src="c.js" async></script>
The script executes as soon as it finishes downloading. (The order and timing of execution are not guaranteed; it may run before or after the DOMContentLoaded event.)
Neither defer nor async blocks DOM parsing.
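Besides the defer and async attributes, a script element created from JavaScript is async by default and does not block parsing either; a minimal sketch (the path is illustrative):

```js
// Dynamically injected scripts do not block DOM parsing
const script = document.createElement('script')
script.src = '/static/analytics.js' // illustrative path
script.onload = () => console.log('script loaded')
document.head.appendChild(script)
```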
WebP
WebP is a newer image format whose files are typically only about two-thirds the size of the equivalent JPEG; converting image resources to WebP speeds up requests.
However, WebP has browser compatibility issues, so check whether the browser supports it before using it.
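A common runtime check is the canvas heuristic below (a sketch; the element and image paths are illustrative):

```js
// Browsers that support WebP encode the canvas with the requested MIME type
function supportsWebP () {
  const canvas = document.createElement('canvas')
  canvas.width = canvas.height = 1
  return canvas.toDataURL('image/webp').indexOf('data:image/webp') === 0
}

// Usage: pick the image format based on the check
const img = document.querySelector('#hero') // illustrative element
img.src = supportsWebP() ? '/images/banner.webp' : '/images/banner.jpg'
```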
Loading state
Adding a loading state makes the wait feel shorter to the user.
A skeleton screen can serve the same purpose, letting the content appear smoothly and unobtrusively.
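A minimal sketch (element, endpoint, and markup are all illustrative) of showing a placeholder until the data arrives:

```js
const list = document.querySelector('#list')
list.innerHTML = '<li class="skeleton">Loading...</li>' // lightweight placeholder / skeleton item

fetch('/api/items') // hypothetical endpoint
  .then(res => res.json())
  .then(items => {
    // Replace the placeholder with the real content once it is ready
    list.innerHTML = items.map(item => `<li>${item.name}</li>`).join('')
  })
```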
Finally, if you would like to discuss front-end development with me, follow my public account, where I post useful content from time to time, and join the technical group for discussion.