An overview

Find the bottlenecks, then take them down one by one

Clear optimization direction

A classic interview question leads us to the main aspects of front-end optimization worth paying attention to:

What happens between the time the user enters the URL and the time the page loads

Hold this process in your mind, and look for optimization opportunities at every stage along the way. Let's briefly review the process: first, the URL is resolved into the corresponding IP address through DNS (the Domain Name System). Then a TCP connection is established with the server at that IP address. After the client sends an HTTP request, the server receives and processes it, returning the result in an HTTP response. Once the browser obtains the response data, it parses and renders it; the page is then displayed in the browser and stays ready to respond to further user interaction.

This process is divided into the following stages for analysis one by one:

  1. DNS resolution
  2. TCP connection
  3. HTTP request
  4. The server processes the request and returns data in an HTTP response
  5. The browser receives the response data, parses it, and presents the result to the user

As you can see, every navigation to a new URL goes through all five phases, so a performance optimization plan should consider each of them.

Putting theory into practice

The next task is to refine and decompose these five stages, raising a question for each and breaking it down in turn

Specific work breakdown

  • DNS resolution takes time. Can we reduce the number of lookups, or perform them ahead of time? We can:
    1. Browser DNS cache
    2. DNS prefetch
  • Is there a way to reduce the cost of the TCP three-way handshake? There is:
    1. Persistent (keep-alive) connections
    2. Preconnect
    3. Adopting SPDY (now superseded by HTTP/2)
  • For HTTP requests issued by the front end:
    1. Reduce the number of requests
    2. Reduce the size of each request
  • Deploy static resources on a CDN close to users
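As a sketch of the DNS prefetch and preconnect ideas above: both are usually expressed as plain `<link>` tags in the page head. The helper below (a hypothetical name, not from the article) just builds that markup; the CDN origin is an illustrative assumption:

```javascript
// Hypothetical helper: build a resource-hint <link> tag for a given origin.
// rel is 'dns-prefetch' (resolve DNS early) or 'preconnect' (DNS + TCP + TLS early).
function resourceHint(rel, origin) {
  return `<link rel="${rel}" href="${origin}">`
}

// Emit hints for an assumed CDN origin the page will contact later
console.log(resourceHint('dns-prefetch', 'https://cdn.example.com'))
// → <link rel="dns-prefetch" href="https://cdn.example.com">
console.log(resourceHint('preconnect', 'https://cdn.example.com'))
// → <link rel="preconnect" href="https://cdn.example.com">
```

In practice these tags are written statically into the HTML so the browser sees them as early as possible.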

The four points mentioned above are all network-level optimizations. The following are browser-level optimizations

  • Resource loading optimization
  • Server side rendering
  • Browser caching mechanism
  • DOM tree construction
  • Page layout and rendering process
  • Trade-offs between reflow and repaint
  • Proper DOM manipulation

Optimization at the network level

Network-level optimization covers the first three of the five stages analyzed in the previous chapter

  • DNS resolution
  • TCP connection
  • HTTP request/response

The first two phases depend mainly on the server side, so we focus on optimizing the HTTP request/response, which breaks down into two main directions:

  • Reduce the number of requests
  • Reduce the volume of a single request

The above two directions correspond to common operations in development: resource compression and bundling. Today this is mostly handled by build tools, so let's focus on webpack.
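As a minimal sketch (not the article's own config), webpack can serve both directions at once: production mode enables built-in minification to shrink each file, and splitChunks extracts shared modules so they are fetched once instead of duplicated. The entry path and filenames here are illustrative assumptions:

```javascript
// Illustrative webpack config: smaller files + fewer duplicated requests.
const path = require('path')

module.exports = {
  mode: 'production',            // enables built-in JS minification (Terser)
  entry: './src/index.js',       // assumed entry point
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: '[name].[contenthash].js'
  },
  optimization: {
    splitChunks: {
      chunks: 'all'              // extract modules shared between chunks into common chunks
    }
  }
}
```

Compression of the transferred bytes themselves (gzip/Brotli) is usually layered on top of this at the server or CDN.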

Webpack build bottlenecks

Every reader who has used webpack has run its "bundle" and "compress" steps; the focus here is on optimizing webpack's own performance

Optimization bottlenecks of Webpack are mainly divided into the following two aspects:

  • The build process takes a long time
  • The bundled output is too large

Webpack optimization solution

Analyzing the build output

After the project is finished, it is recommended that developers first produce a build, inspect the result, and then optimize based on what they find, so that the work is targeted

File structure visualization: webpack-bundle-analyzer

Install this visualization tool before packaging; it presents the size and dependencies of each module in the bundle as a treemap, making analysis and optimization easier

npm i --save-dev webpack-bundle-analyzer

When used, it is introduced as a plugin:

const BundleAnalyzerPlugin = require('webpack-bundle-analyzer').BundleAnalyzerPlugin

module.exports = {
    plugins: [
        new BundleAnalyzerPlugin()
    ]
}

Optimization of construction process

Optimizing loader configuration

A loader's file-transformation work is time-consuming, so the set of files each loader processes should be kept as small as possible

Take babel-loader as an example: use include/exclude to avoid unnecessary transpilation, and enable caching to improve babel-loader's efficiency

  • Use absolute paths in include/exclude, and prefer include
  • Enable babel-loader's cache with cacheDirectory=true
const path = require('path')
// ...
module: {
    rules: [
        {
            // Match both .js and .jsx files
            test: /\.jsx?$/,
            // Only transpile files under src
            include: [path.resolve(__dirname, 'src')],
            use: {
                loader: 'babel-loader?cacheDirectory=true',
                options: {
                    presets: ['@babel/preset-env']
                }
            }
        }
    ]
}
Special treatment for third-party libraries

Third-party libraries typically live in node_modules. The ideal way to deal with them is to process them only on the first build, and skip repackaging them as long as the versions of the dependencies have not changed. DllPlugin is recommended for this

Using DllPlugin splits the build into the following two steps:

  • Build the DLL bundle from a dedicated DLL configuration file (here called webpack.config.dll.js)
  • Build the business code from the webpack.config.js file

The DLL configuration file looks like this:

const path = require('path')
const webpack = require('webpack')

module.exports = {
    entry: {
        // Array of dependencies to bundle into the DLL
        vendor: [
            'prop-types',
            'babel-polyfill',
            'react',
            'react-dom',
            'react-router-dom'
        ]
    },
    output: {
        path: path.join(__dirname, 'dist'),
        filename: '[name].js',
        library: '[name]_[hash]'
    },
    plugins: [
        new webpack.DllPlugin({
            // The DllPlugin's name property must match output.library
            name: '[name]_[hash]',
            path: path.join(__dirname, 'dist', '[name]-manifest.json'),
            // context must be consistent with webpack.config.js
            context: __dirname
        })
    ]
}

Do a little configuration for the DLL in webpack.config.js:

const path = require('path')
const webpack = require('webpack')

module.exports = {
    mode: 'production',
    // Compilation entry
    entry: {
        main: './src/index.js'
    },
    // Output target
    output: {
        path: path.join(__dirname, 'dist/'),
        filename: '[name].js'
    },
    // DLL-related configuration
    plugins: [
        new webpack.DllReferencePlugin({
            context: __dirname,
            // manifest is the JSON file produced in the first step
            manifest: require('./dist/vendor-manifest.json')
        })
    ]
}
Don't let one core do all the work while the rest watch: Happypack

Because webpack runs in a single thread, multiple pending tasks must queue up and be processed one at a time; that is the drawback of a single thread. Fortunately, CPUs are multi-core: Happypack takes full advantage of this by splitting the work across multiple sub-processes that execute concurrently, improving packaging efficiency

Note: It is not recommended to use Happypack when the project is small

Happypack is easy to use: just hand the relevant loader configuration over to Happypack and tell it how many concurrent processes we need:

npm i -D happypack
const os = require('os')
const Happypack = require('happypack')
// Manually create a process pool sized to the number of CPU cores
const happyThreadPool = Happypack.ThreadPool({ size: os.cpus().length })

module.exports = {
    module: {
        rules: [
            {
                test: /\.js$/,
                // The id query parameter names the Happypack instance that handles these files; it must be unique
                loader: 'happypack/loader?id=happyBabel'
            }
        ]
    },
    plugins: [
        new Happypack({
            // The id must match the query parameter above
            id: 'happyBabel',
            // Use the thread pool created above
            threadPool: happyThreadPool,
            loaders: ['babel-loader?cacheDirectory']
        })
    ]
}

Image optimization

The popular image formats today are JPEG/JPG, PNG, Base64 (as an encoding), SVG and WebP, and CSS Sprites are still in use. The following sections introduce each format's characteristics and application scenarios

JPEG/JPG

Features: Lossy compression, small size, no transparency support

Advantages
  • Universal: an image format supported by most browsers
  • Efficient compression algorithm, resulting in small files
  • A single 24-bit image can present more than 16 million colors
Drawbacks
  • Strong color contrast in an image leads to visible blur after compression
  • No support for transparency
Application scenarios

JPG suits colorful images; even large ones remain small in file size

  • Page backgrounds
  • Carousel images
  • Product displays on e-commerce sites

PNG 8 / PNG 24

Features: supports transparency, lossless compression, larger file size

Advantages
  • Supports transparency
  • Lossless compression; a high-fidelity image format
  • PNG-8 supports up to 256 colors
  • PNG-24 supports up to 16 million colors
  • Stronger color expressiveness; finer handling of line outlines
Drawbacks
  • Large file size

In practice, it is not recommended to use PNG for images with complex colors; where PNG is needed, prefer the PNG-8 format

Application scenarios
  • Small images such as logos
  • Images with simple colors and strong contrast
  • Images that need transparency

Base64

Encodes image data into a string that can replace the image's URL

Features: reduces HTTP requests, adds encoding overhead, suited to small icons

Advantages
  • Base64-encoded images can be written directly into HTML/CSS
  • Reduces HTTP requests
Drawbacks
  • Increases the size of HTML/CSS files
  • Adds parsing time
  • After Base64 encoding, image data expands to about 4/3 of its original size
Application scenarios
  • The actual size of the image is very small (< 2 KB)
  • The image cannot be combined with other small images into a sprite
  • The image is updated infrequently
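In a webpack build, the "< 2 KB" rule of thumb above can be automated: url-loader (assuming it is installed) inlines images below a size limit as Base64 data URLs and emits files for larger ones. A sketch in the same fragment style as the loader configs earlier; the test pattern and filename template are illustrative:

```javascript
// Illustrative rule: inline images under 2 KB as Base64, emit larger ones as files.
module: {
  rules: [
    {
      test: /\.(png|jpe?g|gif)$/,
      use: {
        loader: 'url-loader',
        options: {
          limit: 2048,                          // inline only if the file is under 2 KB
          name: '[name].[contenthash].[ext]'    // filename used for images above the limit
        }
      }
    }
  ]
}
```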

SVG

Features: small size, no distortion when scaled, good compatibility

SVG (Scalable Vector Graphics) is an XML application that represents graphical information in a concise, portable form

Advantages
  • Images can be scaled up and down without distortion
  • Smaller file size
  • Highly compressible
Drawbacks
  • Higher rendering cost in the browser
  • Higher learning cost than other formats
Application scenarios
  • Place it directly in the HTML, where it becomes part of the DOM
    <body>
        <svg xmlns="http://www.w3.org/2000/svg" width="100" height="100">
            <circle cx="40" cy="45" r="50" />
        </svg>
    </body>
  • Save it as a .svg file and reference it from HTML
    <img src="./xx/filename.svg" />

WebP

Features: combines many strengths; an all-round player

WebP is an image format developed by Google for the Web to speed up image loading

Advantages
  • Supports transparency
  • Can display animated images (like GIF)
  • Rich, detailed image rendering
Drawbacks
  • Increases the burden on the server (transcoding)
  • Limited browser compatibility

To be continued