Enjoy Learning Class guest author: Lao Gu

Preface

"Sorry, the production server is timing out badly and requests are very slow. It looks like there are too many connections. What should we do?" a junior engineer reports. The usual move is to bump up the connection count and thread pool size a bit, restart, and watch. This is an emergency measure that treats the symptom rather than the root cause, because nobody knows what actually caused the problem. There is also a serious misconception at work: that if the thread pool is set too small, simply making it bigger will speed up requests. So how do we set the thread pool size reasonably?

The problem

Suppose you have two tasks to handle: task A and task B.

Plan 1: One thread executes task A and then task B, one after the other.
Plan 2: Two threads execute tasks A and B at the same time.

Which plan is faster? I think a lot of people would say plan 2: run tasks A and B in parallel on two threads, so it must be faster. Is that really so? Before answering, let's review a little background.
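To make the two plans concrete, here is a minimal sketch that times them. The class and method names are hypothetical, and the "task" is an arbitrary CPU-only loop chosen for illustration; on a multi-core machine the result will differ from a single-core machine, which is exactly the point the following sections explore.

```java
// Hypothetical sketch: time Plan 1 (one thread, sequential) vs.
// Plan 2 (two threads) for two purely CPU-bound tasks.
public class PlanComparison {

    // A stand-in task: pure CPU work, no IO.
    static long task() {
        long sum = 0;
        for (int i = 0; i < 5_000_000; i++) sum += i;
        return sum;
    }

    // Plan 1: a single thread runs task A, then task B.
    public static long runPlan1() {
        long t0 = System.nanoTime();
        task();
        task();
        return (System.nanoTime() - t0) / 1_000_000; // elapsed ms
    }

    // Plan 2: two threads run tasks A and B concurrently.
    public static long runPlan2() {
        long t0 = System.nanoTime();
        Thread a = new Thread(PlanComparison::task);
        Thread b = new Thread(PlanComparison::task);
        a.start();
        b.start();
        try {
            a.join();
            b.join();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return (System.nanoTime() - t0) / 1_000_000; // elapsed ms
    }

    public static void main(String[] args) {
        System.out.println("Plan 1 (sequential) ms: " + runPlan1());
        System.out.println("Plan 2 (two threads) ms: " + runPlan2());
    }
}
```

Try running it on your own machine before reading on; the answer depends on how many CPU cores you have and on what the tasks actually do.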

How threads execute

Multi-threaded execution is scheduled by the CPU. A single CPU core executes only one thread at a time; what we see as threads A and B running "concurrently" is an illusion. To make the user feel that tasks are being performed simultaneously, the operating system uses time-slice round-robin scheduling: the state of the current task is saved, the state of the next task is loaded, and execution continues with that task. This saving and reloading of task state is called a context switch.

Context switching takes time. Now look back at the problem above: which plan is faster? Some of you might now say plan 1, because it involves no thread switching. Hold that thought and read on.

Why multithreading

Given that thread context switches cost time, why use multithreading at all? Confused? Think about the flow of a real business request.



The flow of the image above:

1. The user sends a request over the network. 2. The web server parses the request. 3. The server queries the back-end database to obtain the data.

This is the normal request flow when we handle business. Now let's look at what kind of processing the computer does at each step.

1. Network request —–> network IO
2. Parse request —–> CPU
3. Request database —–> network IO
4. MySQL queries data —–> disk IO
5. MySQL returns data —–> network IO
6. Process data —–> CPU
7. Return data to user —–> network IO

In real business we don't just do CPU computation; we also do network IO and disk IO, and those are very time-consuming. If one thread runs the entire flow above, only 2 of the steps actually use the CPU; all the other steps are IO. While the thread is waiting on IO, the CPU sits idle, and CPU utilization is low.
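The idea that a second thread can use the CPU while the first waits on IO can be sketched in a few lines. This is a toy illustration, not production code: the class name is hypothetical and the "IO wait" is simulated with a 100 ms `Thread.sleep`, standing in for a database or network call.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Sketch: while one thread "waits on IO" (simulated by sleep),
// another thread does CPU work, so total time is about the IO
// wait alone rather than IO wait plus CPU time.
public class IoOverlap {

    public static long overlapped() {
        ExecutorService pool = Executors.newFixedThreadPool(2);
        long t0 = System.nanoTime();

        // Thread 1: pretend IO wait (e.g. a database round trip).
        Future<?> io = pool.submit(() -> {
            try {
                Thread.sleep(100);
            } catch (InterruptedException ignored) {
            }
        });

        // Thread 2: CPU work that runs during the IO wait.
        Future<?> cpu = pool.submit(() -> {
            long sum = 0;
            for (int i = 0; i < 1_000_000; i++) sum += i;
        });

        try {
            io.get();
            cpu.get();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
        pool.shutdown();
        return (System.nanoTime() - t0) / 1_000_000; // elapsed ms
    }

    public static void main(String[] args) {
        // Roughly 100 ms: the CPU work overlapped with the IO wait.
        System.out.println("elapsed ms: " + overlapped());
    }
}
```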

Now you know what multithreading is for: yes, improving CPU utilization.

Improving QPS/TPS

How do we measure system performance? The main indicators are QPS/TPS.

QPS/TPS: the number of requests/transactions the system can process per second.
Concurrency: the number of requests/transactions the system can process simultaneously.
Response time: the average time to process a single request/transaction.

QPS/TPS = concurrency / response time

That is, the greater the concurrency, the greater the QPS. So many people assume that increasing the thread pool size raises concurrency and therefore raises QPS. But notice that QPS is also inversely proportional to response time: the longer the response time, the lower the QPS.

Increasing the number of threads can raise concurrency and thus QPS, but more threads also push up response time, because of the context switching we discussed above. So how should the number of threads be set?

How do I set the number of threads

So how do we allocate threads? Here is a formula:

Optimal number of threads = ((thread wait time + thread CPU time) / thread CPU time) × number of CPUs

Note: this formula was also shared by predecessors. I have read articles about the optimization of Taobao's front-end systems; they use a similar formula, but treat the number of CPUs in more detail.

Continuing our example: suppose the server has 4 CPU cores, a task's CPU time is 20 ms, and its wait time (network IO, disk IO) is 80 ms. Then the optimal number of threads is ((80 + 20) / 20) × 4 = 20.
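The worked example above can be captured in a small helper. The method and class names are hypothetical; the formula is the one given in this article.

```java
// Sketch of the sizing formula from this article:
// threads = ((wait time + CPU time) / CPU time) * number of CPUs
public class PoolSize {

    static int optimalThreads(double waitMs, double cpuMs, int cpus) {
        return (int) (((waitMs + cpuMs) / cpuMs) * cpus);
    }

    public static void main(String[] args) {
        // 80 ms wait, 20 ms CPU, 4 cores -> ((80 + 20) / 20) * 4 = 20
        System.out.println(optimalThreads(80, 20, 4));
        // A purely CPU-bound task (0 ms wait) on 4 cores -> 4 threads
        System.out.println(optimalThreads(0, 20, 4));
    }
}
```

Note how a task with zero wait time collapses to one thread per CPU, while heavily IO-bound tasks justify many more threads.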

The formula shows that the longer a thread spends waiting, the more threads you should configure, which matches our analysis above: keep the CPU fully utilized. From another angle, the right thread count depends on your own business, so you need to load-test and settle on a reasonable value.

General rules of thumb

You may ask: many different kinds of work share one thread pool, unlike the simple case above, so how do we set the size then? Again, by stress testing and adjusting. That said, our predecessors have summarized some baseline values (ultimately, actual performance decides):

CPU-intensive (in-memory computation): set the thread count to number of CPUs + 1 or number of CPUs × 2. With 4 cores, that is 5 or 8.
IO-intensive (file operations, network operations, database operations): set roughly number of CPUs / (1 − 0.9), where 0.9 is a typical blocking coefficient. With 4 cores, that is 4 / (1 − 0.9) = 40.
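These rules of thumb can feed directly into a `ThreadPoolExecutor`. A minimal sketch, assuming a blocking coefficient of 0.9 and a bounded queue of 1000 (both made-up values you would tune by load testing, as the article stresses):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

// Sketch: derive pool sizes from the rules of thumb above and
// build a thread pool with them. Values are illustrative only.
public class PoolConfig {

    // CPU-intensive heuristic: cores + 1.
    static int cpuIntensiveSize(int cores) {
        return cores + 1;
    }

    // IO-intensive heuristic: cores / (1 - blocking coefficient).
    static int ioIntensiveSize(int cores, double blockingCoefficient) {
        return (int) (cores / (1 - blockingCoefficient));
    }

    public static void main(String[] args) {
        int cores = Runtime.getRuntime().availableProcessors();
        int ioThreads = ioIntensiveSize(cores, 0.9);

        ExecutorService ioPool = new ThreadPoolExecutor(
                ioThreads,                        // core pool size
                ioThreads,                        // maximum pool size
                60, TimeUnit.SECONDS,             // idle keep-alive
                new LinkedBlockingQueue<>(1000)); // bounded queue, avoids unbounded backlog
        ioPool.shutdown();

        // With 4 cores: CPU pool = 5, IO pool = 40, matching the text.
        System.out.println("cpu pool: " + cpuIntensiveSize(4)
                + ", io pool: " + ioIntensiveSize(4, 0.9));
    }
}
```

The bounded queue is a deliberate choice: an oversized unbounded queue just hides the bottleneck as growing latency, which is the very symptom described in the preface.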

Conclusion

Today we looked at how to size a thread pool and at some common misconceptions. By now we should have a new understanding of threads: don't just crudely crank up the count, as before. Instead, analyze why requests are slow, find where the system's bottleneck is, and reduce the time spent at that bottleneck.

As further reading, I recommend looking at Redis and Nginx: why are they so fast? The answers share a lot of common ground with the ideas in this article. Thanks for reading!
