Tech stack: React 17 + Vite + Redux + Saga + TS on the front end, Node + Koa + MySQL + TS on the back end. Live site: blogs.chenliangliang.top. Front-end code: github.com/cll123456/b… Server code: github.com/cll123456/m…
Problem description
The project was finished, but my home page took more than 20 seconds to load (the server is the lowest spec, and the site is served over plain HTTP).
The effect
The screenshots below were taken in a different browser, so you can't say the speed-up comes from the cache.
After seeing the result you are probably satisfied, but how is it done? As you may have guessed, it is just enabling gzip. Yes, that's right.
The approach
This step is actually the hardest part for someone who only knows the concept but not the principle behind it, so let's start from the principle.
Which end generates the gzip file?
This is the real question. Most tutorials on the web simply tell you to configure nginx on the server and then run through a series of magic commands. But is that really good for beginners? Telling people what to do without explaining how it works just leaves more questions than answers, and that bothered me enough to dig into it.
Server generation
Gzip files can be generated by the server. For example:
nginx
Nginx has a gzip module: once it is enabled, nginx compresses the data (static resources and interface responses) before sending it to the client, and the client decompresses it and then parses the code. This saves bandwidth by shrinking the payload being transferred, which shortens transfer time and makes the site open faster.
node
Node also has a compression library; by configuring a few options it compresses the data (static resources and interface responses) before transmitting it, in the same way the server-side approach above does.
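Since this blog's back end is Node + Koa, a minimal sketch with the koa-compress middleware might look like this (the threshold and options are illustrative assumptions, not the project's actual configuration):

```ts
// Sketch: gzip responses in a Koa server with koa-compress.
import Koa from 'koa';
import compress from 'koa-compress';
import zlib from 'zlib';

const app = new Koa();

app.use(
  compress({
    threshold: 2048,                              // only compress bodies larger than 2 KB (assumed value)
    gzip: { flush: zlib.constants.Z_SYNC_FLUSH }, // options passed straight to zlib
    br: false,                                    // stick to gzip for this article
  })
);

app.use((ctx) => {
  // A body large enough to exceed the threshold and actually get compressed.
  ctx.body = 'hello gzip '.repeat(1000);
});

app.listen(3000);
```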
Other server stacks have similar libraries; how to use them depends on the language, so I won't expand on that here.
Client generation
Since the server can generate gzip files, why do build tools such as webpack and rollup also provide compression plugins? And you'll find those packages seem to get more downloads every week. For example:
Why generate it on the client? That's a good question. If the server generates it, then every request makes the server produce the compressed file again, wasting server CPU on each response. If the client (the build) generates it, the server only has to check whether a file with the .gz suffix already exists: if it does, it serves it directly; only if it doesn't does it compress on the fly. In other words, the compression work is moved from the server to the client. Compressing during the client build is slow, but we only package once per release, while the server has to face thousands of visitors. To me that trade-off makes sense.
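To make that concrete, here is a rough sketch (not this project's actual code; paths and details are assumptions) of a Node server that prefers a pre-built .gz file and only falls back to on-the-fly compression when none exists:

```ts
// Sketch: serve a pre-compressed `<file>.gz` when it exists, otherwise gzip per request.
import http from 'http';
import fs from 'fs';
import path from 'path';
import zlib from 'zlib';

const root = path.join(__dirname, 'dist'); // assumed build output directory

http.createServer((req, res) => {
  const filePath = path.join(root, req.url === '/' ? 'index.html' : req.url!);
  const acceptsGzip = /\bgzip\b/.test(String(req.headers['accept-encoding'] || ''));

  if (acceptsGzip && fs.existsSync(filePath + '.gz')) {
    // The build already produced a .gz file: just stream it, no CPU spent compressing.
    res.writeHead(200, { 'Content-Encoding': 'gzip' });
    fs.createReadStream(filePath + '.gz').pipe(res);
  } else if (fs.existsSync(filePath)) {
    // No pre-compressed file: compress per request (the cost we want to avoid).
    res.writeHead(200, acceptsGzip ? { 'Content-Encoding': 'gzip' } : {});
    const stream = fs.createReadStream(filePath);
    (acceptsGzip ? stream.pipe(zlib.createGzip()) : stream).pipe(res);
  } else {
    res.writeHead(404).end();
  }
}).listen(8080);
```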
Putting it into practice
Here I chose to use Nginx to configure the server.
Server-side compression
With server-side compression the client does not have to do anything: just put the build output into the corresponding directory, and when it is requested nginx automatically compresses it and sends it to the client for parsing.
Nginx configuration
Use the HttpGzip module (it compresses the output data stream on the fly).
These directives can live in any of three contexts: http, server, or location. To avoid affecting other sites, I suggest adding them to the relevant location block so the other contexts are left untouched.
```nginx
gzip on;
gzip_buffers 4 16k;
gzip_comp_level 6;
gzip_types text/plain application/javascript text/css application/xml text/javascript application/x-httpd-php;
```
| Directive | Meaning |
|---|---|
| gzip on | Enables or disables the gzip module |
| gzip_buffers 4 16k | Sets how many buffers of which size are allocated to hold the gzip-compressed output stream |
| gzip_comp_level 6 | Compression level: 1 gives the lowest compression ratio and the fastest processing, 9 the highest ratio and the slowest |
| gzip_types text/plain application/javascript … | MIME types to compress; text/html is always compressed, whether it is listed or not |
The effect
The main resource being loaded is the application bundle itself:
Client-side compression
There are also many compression tools on the client side; here I will show how to compress with webpack and with Vite, and then deploy.
webpack
Most people use this tool. Install the compression-webpack-plugin mentioned above, then add the plugin configuration to vue.config.js or webpack.config.js.
```js
// vue.config.js
const CompressionWebpackPlugin = require('compression-webpack-plugin')

module.exports = {
  configureWebpack: {
    plugins: [
      new CompressionWebpackPlugin({
        filename: '[path].gz[query]',
        algorithm: 'gzip',
        test: /\.(js|css|svg)$/,
        threshold: 10240, // assumed value: only compress assets larger than 10 KB
        minRatio: 0.8,
        deleteOriginalAssets: false
      })
    ]
  }
}
```
The effect is as follows:
vite
My project was built with Vite, so I also need to install a plugin. At first I thought it was rollup-plugin-gzip, but that turned out to be wrong: Vite has a plugin of its own, vite-plugin-compression, and it is simple to use.
```ts
// vite.config.ts
import { defineConfig } from 'vite';
import viteCompression from 'vite-plugin-compression';

export default defineConfig({
  plugins: [
    viteCompression()
  ],
});
```
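If you want more control, the plugin accepts an options object; a hedged sketch (option names as documented for vite-plugin-compression, values purely illustrative):

```ts
// Sketch: viteCompression() with explicit options — tune the values for your own project.
viteCompression({
  algorithm: 'gzip',       // or 'brotliCompress' for .br output
  ext: '.gz',              // suffix appended to the generated files
  threshold: 10240,        // skip files smaller than ~10 KB
  deleteOriginFile: false, // keep the originals so the server can fall back to them
})
```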
The effect is as follows:
Nginx configuration
Yes, nginx still has to be configured: keep the gzip module on, and additionally tell nginx to prefer the locally pre-compressed files.
```nginx
gzip_static on;
gzip_http_version 1.1;
gzip_proxied expired no-cache no-store private auth;
```
| Directive | Purpose |
|---|---|
| gzip_static on | Enables the module; you should make sure the timestamps of the compressed and uncompressed files match |
| gzip_http_version | Minimum HTTP version required for compression; it defaults to 1.1, which is the version gzip_static works with |
| gzip_proxied | When nginx acts as a reverse proxy, controls whether the responses returned by the back-end server are compressed |
The effect