Environment
- Vue CLI 3.11.0
- Nginx 1.16.1
Project development process
- Requirements analysis, technology selection, technical architecture validation, and other such steps are omitted here so we can get straight to business
- Create the project through scaffolding (Vue CLI)
- Build the front-end architecture
- Encapsulate base components, common business components, common modules, methods, and so on
- Formal development (coding)
- Development is complete: test locally, package, deploy to the test environment, test again, and suddenly you find some problems
Deployment is complete
It runs. Good, a small moment of happiness. But the problems soon became apparent:
- Is there something wrong with the performance?
- Why is the first screen loading so slow?
- Didn't I do any performance optimization?
- Package optimization?
- Yahoo's performance rules?
- Uh oh. I don't think I did any of this
In the past, when deploying front-end applications with Apache (jQuery + HTML + CSS), we would often do the related performance optimizations, such as the widely used Yahoo! performance rules. But what about now, with a Vue stack + Nginx?
Okay, let’s start optimizing now
Analysis
How do we optimize a Vue stack + Nginx setup (along with optimizations done through packaging tools such as webpack)? It can be roughly divided into the following three categories:
- Optimization at the coding level
- Packaging optimization
- Nginx server configuration
Optimization at the coding level
There are many optimizations at this level, such as using v-if vs. v-show appropriately, virtual lists, dynamic loading, lazy loading, disabling reactivity where it isn't needed (for example pure data display, where the data never changes), and so on.
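As a small illustration of the last point, here is a minimal sketch (the component name, endpoint, and data shape are hypothetical, not from this project): freezing display-only data stops Vue 2 from making it reactive, which saves memory and initialization time for large lists.

```js
// Hypothetical display-only component: "rows" is a large dataset that never changes after load
export default {
  name: 'ReadOnlyTable',
  data () {
    return {
      rows: []
    }
  },
  async created () {
    const res = await fetch('/api/report') // hypothetical endpoint
    const list = await res.json()
    // Object.freeze() makes Vue 2 skip reactive getters/setters for this data
    this.rows = Object.freeze(list)
  }
}
```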
Packaging optimization
This article focuses on packaging optimization
Nginx server configuration
Nginx server configuration, refer to another article, Nginx server installation and configuration
Packaging optimization
Analysis
When we do packaging optimization, we first need to know what needs optimizing, which requires analyzing the packaging results
webpack-bundle-analyzer
This is a webpack plugin that analyzes the packaging results
How to use
Install the plugin
npm i webpack-bundle-analyzer -D
Configure webpack
In Vue CLI 3, webpack is configured through the vue.config.js file
```js
// vue.config.js
module.exports = {
  // ...
  // Vue CLI 3 provides an advanced technique: chaining (webpack-chain)
  chainWebpack (config) {
    // ...
    // Only apply when building for production
    if (process.env.NODE_ENV === 'production') {
      config
        .plugin('webpack-bundle-analyzer')
        .use(require('webpack-bundle-analyzer').BundleAnalyzerPlugin)
        .end()
    }
    // ...
  }
  // ...
}
```
packaging
npm run build
The default port is 8888. Visit 127.0.0.1:8888 in the browser to see the analysis of the bundle. The command line also outputs some information, such as which packages are produced and whether there are problems with them (for example, a package that is too large, or console.log statements left in the code). If something is unclear, keep reading
Analyze the packaging results
The two pictures above are the same, except that the second one shows the information for each package
Hover the mouse over the page and you'll find it is a module map that analyzes the bundle of the whole application: what parts it is composed of, with information for each part
We analyzed the page and found that:
- The bundle is a bit too big: the page analysis shows 7.95 MB, and the dist directory is 13 MB
Optimization
From the analysis above, the main problem is that the bundle is too big: 13 MB. This is not a large project, so how can the build output be so big?
Analyzing the Dist directory
source map
Source map files are enabled by default in Vue CLI 3, which is convenient for debugging during development and for locating errors accurately, but we should turn them off when packaging for production
Close the source map
```js
// vue.config.js
module.exports = {
  // ...
  productionSourceMap: false
}
```
Repackage with npm run build and look at the dist directory again: it is suddenly much smaller. The webpack-bundle-analyzer results are still the same, because we only reduced the size of the built dist directory, but that is still an optimization: it reduces the volume of code deployed to production and also protects the source code
Gzip compression
We'll see some warnings when we run npm run build:
The asset and entry files are too large, exceeding webpack's default limits. This will affect web performance
There are two ways to deal with this: change webpack's default configuration, for example raise the size limit or simply turn off the hint (not recommended, because you haven't actually optimized anything), or compress the resources. We'll compress the resources
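For reference, here is a minimal sketch of the first (not recommended) approach, using webpack's performance options in vue.config.js; the 5 MB limits are made-up values, not from this project:

```js
// vue.config.js: raising or silencing webpack's size hints instead of actually optimizing
module.exports = {
  configureWebpack: {
    performance: {
      hints: 'warning',                   // set to false to turn the hint off entirely
      maxAssetSize: 5 * 1024 * 1024,      // per-asset limit in bytes (assumed value)
      maxEntrypointSize: 5 * 1024 * 1024  // per-entry-point limit in bytes (assumed value)
    }
  }
}
```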
How to do it
There are two ways:
- Compress on the Nginx server
- Compress with webpack at build time
Here we choose the second option for the following reasons:
Nginx compression works too, but for every resource request the server compresses dynamically, in real time. That puts pressure on the server (the CPU does a lot of work), especially under high load: a request comes in, the server compresses, and from the start of the request to the end of compression the user sees nothing, so the experience is poor. The user thinks the page is stuck, retries the request, the server compresses again, the load climbs, and you get a vicious cycle. So here we compress with webpack and let Nginx serve the already-compressed resources, which performs much better. **(Note: the Nginx server still needs some configuration for this to work, see below.)**
Install the plugin
Note: the latest version, 7.1.0, has a problem, so pin an earlier version when installing
npm i -D compression-webpack-plugin
Modify vue.config.js
```js
// vue.config.js
const CompressionPlugin = require('compression-webpack-plugin')

module.exports = {
  // ...
  chainWebpack (config) {
    if (process.env.NODE_ENV === 'production') {
      config
        .plugin('compressionPlugin')
        .use(
          new CompressionPlugin({
            filename: '[path].gz[query]',
            algorithm: 'gzip',
            test: /\.(js|json|css|jpe?g|png)$/,
            // Only compress resources larger than this value (in bytes)
            threshold: 0,
            // Only compress resources whose compression ratio is below this value; the default is 0.8
            minRatio: 1,
            // Delete the source files. Note that webpack-bundle-analyzer cannot analyze deleted
            // source files, and the static server needs to be configured accordingly (see below)
            deleteOriginalAssets: true
          })
        )
    }
  }
  // ...
}
```
packaging
npm run build
The compressed result (the effect)
I kept the screenshots from before compression for comparison: at least a few megabytes are gone. Readers can try it for themselves in practice
Configure the Nginx server
Upload the packaged resource bundle to the Nginx server and deploy it, then visit the page, and find nothing shows up. Open the console and you see various 404s, meaning the resources were not found. Why? This comes down to the two ways Nginx does compression:
- Dynamic compression
As we said at the beginning, having the server compress in real time increases server load and wastes CPU
- Static compression (the way we did it)
This uses Nginx's gzip precompression module: for requests that accept gzip, it serves the pre-compressed files (the files ending in .gz; if you look at our packaged files, you'll see they end in .gz) instead of compressing dynamically; for requests that don't accept gzip, it serves the source file
Instructions
Nginx dynamic compression requires the ngx_http_gzip_module module, which is included by default. Static compression uses the ngx_http_gzip_static_module module, which allows sending pre-compressed files with the .gz extension instead of the regular files. We need to check whether Nginx already has this module (my environment already had it); otherwise Nginx needs to be recompiled with it
Check whether the module has been installed:
nginx -V
If not, install:
./configure --with-http_gzip_static_module
make && make install
Configure gzip in Nginx (static compression)
```nginx
http {
    # Enable gzip
    gzip on;
    # Enable static gzip (serve pre-compressed .gz files)
    gzip_static on;
    # Minimum file size to gzip
    gzip_min_length 1k;
    # Compression level, 1-9
    gzip_comp_level 2;
    # The MIME types to compress
    gzip_types text/plain application/javascript application/x-javascript text/css application/xml text/javascript application/x-httpd-php image/jpeg image/gif image/png;
    # Add the "Vary: Accept-Encoding" response header (recommended)
    gzip_vary on;
}
```
**Check whether the configuration is successful**
Open the browser, send a request, and look at the request information; if it looks like the figure, the configuration succeeded
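You can also check from the command line with a small Node.js sketch (the host and asset path below are placeholders, substitute your own): it requests a bundled file with Accept-Encoding: gzip and prints the Content-Encoding response header, which should be "gzip" when static compression is working.

```js
// check-gzip.js (run with: node check-gzip.js)
const http = require('http')

http.get(
  {
    host: '127.0.0.1',             // replace with your Nginx server address
    path: '/js/chunk-vendors.js',  // replace with a real asset path from dist
    headers: { 'Accept-Encoding': 'gzip' }
  },
  (res) => {
    console.log('status:', res.statusCode)
    console.log('content-encoding:', res.headers['content-encoding']) // expect "gzip"
    res.resume()
  }
)
```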
OK, the compression is finally done. Let's keep examining the analysis results for the next round of optimization
Loading on demand
After the compression above, the packaged files are much smaller: the total is now only 1.4 MB. But we still need to analyze where the bundle can be further optimized. Looking at the results, third-party libraries account for most of the volume: Element-UI and ECharts (via the V-Charts used in the project) dominate, at 1.49 MB and 2.42 MB respectively (uncompressed), as shown below:
Why are these two so large? Weren't they configured for on-demand loading during development? After checking: sure enough, on-demand loading was never set up
Element-UI on-demand loading
Here we follow the on-demand loading section of the Element-UI quick start guide
**Note**
The .babelrc file in the Element-UI docs corresponds to babel.config.js in Vue CLI 3, and es2015 needs to be changed to '@babel/preset-env'.
Otherwise there is a Babel version conflict (if you look at package.json, you will find that babel-core is 7.0) and you will get an error like:
Plugin/Preset files are not allowed to export objects, only functions.
Babel 7 scrapped babel-preset-es201x in favor of the new @babel/preset-env.
So the final babel configuration for Element-UI on-demand loading is:
```js
// babel.config.js
module.exports = {
  presets: [
    '@vue/app',
    ['@babel/preset-env', { modules: false }]
  ],
  plugins: [
    [
      'component',
      {
        libraryName: 'element-ui',
        styleLibraryName: 'theme-chalk'
      }
    ]
  ]
}
```
**The effect of Element-UI on-demand loading**
V-Charts on-demand loading (ECharts also has its own solution)
On-demand loading of V-Charts requires no extra configuration; just import what you need in the code
```js
// ...
// V-Charts on-demand loading: import only the chart components we use
import Vue from 'vue'
import VeLine from 'v-charts/lib/line'
import VePie from 'v-charts/lib/pie'
import VeGauge from 'v-charts/lib/gauge'
import VeHistogram from 'v-charts/lib/histogram'

Vue.component(VeLine.name, VeLine)
Vue.component(VePie.name, VePie)
Vue.component(VeGauge.name, VeGauge)
Vue.component(VeHistogram.name, VeHistogram)
// ...
```
With the approach above, V-Charts is loaded on demand
**The effect of V-Charts on-demand loading**
After implementing on-demand loading for Element-UI and V-Charts, only the modules we actually use are bundled, and the effect is remarkable
Route lazy loading
Route lazy loading was not part of this optimization pass, because it was already done when the project architecture was set up:
```js
// router.js
export const constRoutes = [
  {
    path: '/login',
    name: 'login',
    // Route lazy loading
    component: () => import('@/views/Login.vue'),
    hidden: true
  },
  {
    path: '/',
    component: () => import('@/views/Home.vue')
  },
  {
    path: '/dashboard',
    redirect: '/',
    meta: {
      roles: ['admin', 'editor'],
      // title: 'dashboard'
      title: 'home',
      icon: 'dashboard'
    }
  }
]
```
Route lazy loading is simple to implement: just import the components dynamically
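As a small extra (not something this project necessarily uses), webpack's webpackChunkName magic comment lets you name the lazy chunks so that related views end up in the same file, which also makes the analyzer output easier to read; the routes and views below are hypothetical:

```js
// Both views go into one named chunk, "user", fetched only when these routes are visited
const UserProfile = () => import(/* webpackChunkName: "user" */ '@/views/UserProfile.vue')
const UserSettings = () => import(/* webpackChunkName: "user" */ '@/views/UserSettings.vue')

export const userRoutes = [
  { path: '/user/profile', component: UserProfile },
  { path: '/user/settings', component: UserSettings }
]
```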
The optimization effect
Analysis diagram after optimization:
The bundle size is significantly reduced, from 13 MB at the beginning to about 1 MB now, a roughly 92% reduction, and the first-screen load time is down to milliseconds, about 50% faster than before the optimization
Conclusion
Project optimization can be divided into three categories: coding-level optimization, packaging optimization, and configuration of the Nginx (static resource) server
Code level optimization
This article doesn't go into these in detail, but here is a quick list
- v-if vs v-show
- Optimization of virtual list and long list
- Dynamic loading (lazy loading)
- Whether you really need reactivity (reactivity is a convenience but also a performance cost)
- Optimization of rendering (reduce repaints and reflows)
- Network optimization
- ...
Packaging optimization
Use tools such as webpack-bundle-analyzer, Lighthouse, and the DevTools Performance and Coverage panels to see what needs optimizing
- Remove dead code, code that will never be executed
Tree shaking automatically removes code that the project never uses from the bundle, and UglifyJS can also strip console.log calls from the project
- Lazy loading
When you've removed dead code and the bundle is still large, consider whether some resources are bundled more than once. This is especially common in multi-entry projects, where one package is referenced from several entries; common code can be extracted with code splitting. Webpack 4 does this with splitChunks, which splits out and bundles shared modules (see the sketch after this list).
- Disable source Map in the production environment
- Resource compression, gzip
- Resources are loaded on demand
- If you can use CDN, you can greatly reduce the packaging volume
A CDN can be used to load external resources (libraries that expose a global variable), such as Vue, Vuex, Vue Router, axios, ECharts, Element-UI, and so on; this requires configuring webpack's externals option (see the sketch after this list)
- Whether some resources should be split
This depends on the situation, for example splitting out CSS, or whether small images should be inlined as base64. The benefit of splitting is caching; the downsides are a slower first load and more HTTP requests, while inlining increases code size. Decide based on the actual scenario
- Setting up a build cache increases the initial build time but reduces subsequent build times
- Some infrequently changing resources are packaged separately, using splitChunks
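To make the last few points concrete, here is a minimal vue.config.js sketch combining CDN externals and splitChunks; the library names, the CDN assumption, and the chunk names are illustrative, not the configuration of this project:

```js
// vue.config.js
module.exports = {
  configureWebpack: {
    // Assumes Vue, Vue Router and axios are loaded from a CDN via <script> tags in public/index.html,
    // so webpack leaves them out of the bundle and resolves imports to these globals
    externals: {
      vue: 'Vue',
      'vue-router': 'VueRouter',
      axios: 'axios'
    },
    optimization: {
      // Extract shared modules into separate, cacheable chunks (webpack 4 splitChunks)
      splitChunks: {
        chunks: 'all',
        cacheGroups: {
          vendors: {
            test: /[\\/]node_modules[\\/]/, // everything from node_modules
            name: 'chunk-vendors',
            priority: -10
          },
          common: {
            minChunks: 2,                   // modules shared by at least two chunks
            name: 'chunk-common',
            priority: -20,
            reuseExistingChunk: true
          }
        }
      }
    }
  }
}
```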
Configure the Nginx server
Refer to another article, Nginx server installation and configuration
That's it. You can jump straight back to where we started