The maximum number of concurrent requests to the same domain is set by the browser, and is usually 4 to 8. In this article we look at why browsers limit request concurrency and how to optimize around the limit.
Browser request concurrency
Why do browsers limit request concurrency?
Before looking at optimizations, let's examine why browsers limit concurrent requests.
1. Conserving operating system port resources
A machine has 65536 ports in total, and each TCP connection (HTTP runs over TCP) occupies one port. Operating systems typically make only about half of the total ports available for outbound requests, so unrestricted concurrency could exhaust them quickly.
2. Excessive concurrency causes frequent thread switching, hurting performance
Each HTTP request is handled by a thread, so a large number of concurrent requests causes frequent thread context switches, which are not a lightweight operation. To avoid paying this cost, the browser's request controller maintains a connection pool and reuses existing connections. The per-domain limit can therefore be seen as a connection pool of 4 to 8 connections per domain: when all connections are in use, subsequent requests are queued and executed once a connection becomes idle.
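The connection-pool behavior described above can be sketched as a small task queue. This is an illustrative model only (the `createPool` helper is hypothetical, not a browser API); the real pool lives inside the browser's network stack.

```javascript
// Sketch of the per-domain connection pool idea: at most `limit` tasks run
// at once; extra tasks wait in a queue until a slot frees up.
function createPool(limit) {
  let active = 0;
  const waiting = [];
  const next = () => {
    if (active < limit && waiting.length > 0) {
      active++;
      const { task, resolve, reject } = waiting.shift();
      task()
        .then(resolve, reject)
        .finally(() => {
          active--; // free the slot, then start the next queued task
          next();
        });
    }
  };
  return (task) =>
    new Promise((resolve, reject) => {
      waiting.push({ task, resolve, reject });
      next();
    });
}
```

With `createPool(6)`, a burst of 100 requests behaves like the browser: six in flight, the rest blocked until a connection is idle.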
3. Preventing a single client from exceeding the server's concurrency threshold
On the server side, a concurrency threshold is usually set per client source to guard against malicious attacks. If the browser did not limit per-domain concurrency, a burst of requests could push the client over the server's threshold and get it banned.
4. A good-citizen mechanism on the client
The limit also prevents two applications from fighting over resources, where the stronger one would monopolize the network while the weaker one blocks forever.
Optimization techniques
Now that we know the reasons for the browser's concurrency limit, we can optimize around it in the following ways.
Domain sharding
Split requests across multiple domains, for example 100 requests to A -> (25 to A + 25 to B + 25 to C + 25 to D). About four shard domains is recommended; too many domains can cause DNS resolution performance problems.
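A minimal sketch of the sharding idea follows; the shard hostnames and the `shardUrl` helper are hypothetical. Hashing the asset path keeps the mapping stable, so the same file always resolves to the same shard and the browser cache stays effective across page loads.

```javascript
// Hypothetical shard domains — in practice these would all serve
// the same static assets (e.g. via CDN aliases).
const SHARDS = [
  'https://static1.example.com',
  'https://static2.example.com',
  'https://static3.example.com',
  'https://static4.example.com',
];

// Map an asset path to a shard deterministically via a simple string hash,
// so repeat visits request the same URL and hit the browser cache.
function shardUrl(path) {
  let hash = 0;
  for (const ch of path) hash = (hash * 31 + ch.charCodeAt(0)) >>> 0;
  return SHARDS[hash % SHARDS.length] + path;
}
```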
Cookie-free domains
A cookie-free domain separates main-site requests, which carry cookies, from secondary static requests, which do not. When a site's cookies total 5 KB, sending 150 requests with cookies attached uploads 750 KB; on a common 1024 Kbps uplink, that takes roughly 6 seconds to send. Even though these requests may run concurrently, static resource requests rarely need cookie information, so we can serve static assets from a separate domain whose requests carry no cookies at all.
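One way to keep the static domain cookie-free is simply to scope the session cookie to the application host. A sketch, assuming hypothetical hostnames `www.example.com` (app) and `static.example.com` (assets):

```javascript
// Build a Set-Cookie header pinned to the exact host that sets it.
// Omitting the `Domain` attribute means the cookie is sent back only to
// www.example.com — static.example.com never receives it, so every static
// request saves the full cookie payload on the uplink.
function sessionCookieHeader(value) {
  return `session=${value}; Path=/; HttpOnly; Secure; SameSite=Lax`;
}
```

The key design choice is what the header leaves out: adding `Domain=example.com` would widen the cookie to all subdomains, including the static one, defeating the optimization.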
Merging small images into a sprite
Combine several small images into one large image, then use CSS background positioning to display the needed part. Sprites are a common way to reduce the number of image requests.
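The positioning math can be sketched as follows; the icon size and column count are illustrative assumptions about how the sprite sheet is laid out, and `spritePosition` is a hypothetical helper.

```javascript
// Given a sprite sheet laid out as a grid of fixed-size icons, compute the
// CSS background-position value that shows icon number `index`.
// Negative offsets shift the sheet so the wanted icon lands in view.
function spritePosition(index, iconSize = 32, columns = 10) {
  const x = (index % columns) * iconSize;
  const y = Math.floor(index / columns) * iconSize;
  return `${-x}px ${-y}px`;
}
```

Applying the result via `element.style.backgroundPosition = spritePosition(12)` then shows the thirteenth icon of the sheet.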
Set Cache-Control max-age
When we can treat a project's resources as immutable, we version the filenames and set a long Cache-Control max-age, so browsers cache them long-term and skip re-requesting them.
Lazy loading images
Lazy loading is practically a necessity for large sites, since it avoids loading content the user never sees. Users usually do not view everything beyond the first screen, and may navigate away mid-browse, so there is no need to load images or create nodes for content the user never reaches. Instead, content can be loaded and displayed when the user scrolls into its area.
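A minimal sketch of the scroll-based approach: the real image URL sits in `data-src` and is only promoted to `src` when the element nears the viewport. The `shouldLoad` and `lazyLoadImages` helpers and the 200px margin are illustrative assumptions, not a library API.

```javascript
// Pure helper: is the element close enough to the viewport to start loading?
// `margin` gives a head start so the image is ready before it scrolls in.
function shouldLoad(elementTop, viewportHeight, scrollY, margin = 200) {
  return elementTop < scrollY + viewportHeight + margin;
}

// Browser wiring (sketch): swap data-src into src once the element nears view.
function lazyLoadImages(selector = 'img[data-src]') {
  const check = () => {
    document.querySelectorAll(selector).forEach((img) => {
      const top = img.getBoundingClientRect().top + window.scrollY;
      if (shouldLoad(top, window.innerHeight, window.scrollY)) {
        img.src = img.dataset.src; // assigning src starts the real request
        img.removeAttribute('data-src'); // load each image only once
      }
    });
  };
  window.addEventListener('scroll', check, { passive: true });
  check(); // load anything already in view on first paint
}
```

Modern browsers also offer `IntersectionObserver` and the native `loading="lazy"` attribute, which achieve the same effect without a scroll listener.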
PWA (Progressive Web App)
A popular optimization in recent years is to cache the application's requests on the client through a Service Worker, so that on the next visit or refresh the user reads previous responses directly from the local machine. Caching can be controlled at a fine granularity for static resources, API requests, and so on. PWA caching has limits, however: Service Workers only work over HTTPS, and only requests within the worker's scope can be intercepted and cached. In addition, previously cached responses are cleared and refreshed the next time the Service Worker updates, which means a cached resource may require two visits to the page before the update is picked up.
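The fine-grained control mentioned above is usually expressed as per-route strategies inside the Service Worker's `fetch` handler. A sketch, where the cache name, URL patterns, and `strategyFor` routing rule are hypothetical choices:

```javascript
const CACHE_NAME = 'app-cache-v1';

// Pure routing rule: cache-first for static assets, network-only for API
// calls, network-first for everything else (e.g. HTML pages).
function strategyFor(url) {
  if (/\.(js|css|png|jpg|svg|woff2?)$/.test(url)) return 'cache-first';
  if (url.includes('/api/')) return 'network-only';
  return 'network-first';
}

// Service Worker wiring — runs only in a browser worker context over HTTPS.
if (typeof self !== 'undefined' && typeof caches !== 'undefined') {
  self.addEventListener('fetch', (event) => {
    if (strategyFor(event.request.url) !== 'cache-first') return;
    event.respondWith(
      caches.open(CACHE_NAME).then((cache) =>
        cache.match(event.request).then(
          (hit) =>
            hit ||
            fetch(event.request).then((res) => {
              cache.put(event.request, res.clone()); // save for the next visit
              return res;
            })
        )
      )
    );
  });
}
```

Bumping `CACHE_NAME` (e.g. to `app-cache-v2`) on deploy is the usual way to force the stale cache described above to be cleared.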