Why use thread pools
The main job of a thread pool is to control the number of running threads: submitted tasks are placed in a queue and picked up by the pool's threads. If the number of tasks exceeds what the threads can handle, the excess tasks wait in the queue; when a thread finishes its current task, it pulls the next one from the queue and executes it.
Main features: thread reuse, control of the maximum number of concurrent threads, and centralized thread management.
- Reduced resource consumption. Reusing already-created threads lowers the cost of thread creation and destruction
- Faster response. When a task arrives, it can run immediately without waiting for a thread to be created
- Better thread manageability. Threads are a scarce resource; creating them without limit not only consumes system resources but also reduces system stability. A thread pool allows unified allocation, tuning, and monitoring
Introduction to thread pools
Architecture implementation
In Java, thread pools are implemented through the Executor framework, which consists of the Executor and ExecutorService interfaces and the Executors, ThreadPoolExecutor, and ScheduledThreadPoolExecutor classes. The relationship between Executors and Executor is the same as that between Collections and Collection, or Arrays and arrays: the former is a utility class that provides many convenient factory and helper methods.
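A minimal sketch of how the framework is used in practice (the pool size and task bodies here are arbitrary illustration values, not part of the original text):

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ExecutorFrameworkDemo {
    public static void main(String[] args) throws Exception {
        // Executors is the utility class; it hands back an ExecutorService
        ExecutorService pool = Executors.newFixedThreadPool(3);

        // Submit a Runnable (no result) and a Callable (returns a result)
        pool.execute(() -> System.out.println(Thread.currentThread().getName() + " runs a Runnable"));
        Future<Integer> future = pool.submit(() -> 1 + 1);
        System.out.println("Callable result: " + future.get());

        // Always shut the pool down so its worker threads can exit
        pool.shutdown();
    }
}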
Commonly used methods
There are five types of thread pools, of which the following three are the most common
- Executors.newFixedThreadPool(int): a fixed number of threads; suited to long-running tasks and offers stable performance. The concrete implementation:
public static ExecutorService newFixedThreadPool(int nThreads) {
return new ThreadPoolExecutor(nThreads, nThreads,
0L, TimeUnit.MILLISECONDS,
new LinkedBlockingQueue<Runnable>());
}
Main features:
- Creates a fixed-length thread pool that controls the maximum number of concurrent threads; tasks beyond that limit wait in the queue
- newFixedThreadPool sets corePoolSize and maximumPoolSize to the same value
- It uses a LinkedBlockingQueue

A short usage sketch follows.
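A minimal usage sketch of newFixedThreadPool (the task count and sleep time are arbitrary illustration values):

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class FixedThreadPoolDemo {
    public static void main(String[] args) throws InterruptedException {
        // At most 3 tasks run concurrently; the rest wait in the LinkedBlockingQueue
        ExecutorService pool = Executors.newFixedThreadPool(3);
        for (int i = 1; i <= 10; i++) {
            final int taskId = i;
            pool.execute(() -> {
                System.out.println(Thread.currentThread().getName() + " handles task " + taskId);
                try {
                    TimeUnit.MILLISECONDS.sleep(200); // simulate some work
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.SECONDS);
    }
}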
- Executors.newSingleThreadExecutor(): a single thread; suitable for scenarios where tasks must be executed one at a time. The concrete implementation:
public static ExecutorService newSingleThreadExecutor() {
    return new FinalizableDelegatedExecutorService
        (new ThreadPoolExecutor(1, 1,
                                0L, TimeUnit.MILLISECONDS,
                                new LinkedBlockingQueue<Runnable>()));
}
Main features:
- Creates a single-threaded executor that uses one worker thread to execute tasks, guaranteeing that all tasks run in submission order
- newSingleThreadExecutor sets both corePoolSize and maximumPoolSize to 1
- It uses a LinkedBlockingQueue
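A minimal sketch of the ordering guarantee (illustrative only):

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class SingleThreadExecutorDemo {
    public static void main(String[] args) {
        ExecutorService pool = Executors.newSingleThreadExecutor();
        // All tasks run on the same worker thread, in submission order: 1, 2, 3
        for (int i = 1; i <= 3; i++) {
            final int taskId = i;
            pool.execute(() -> System.out.println(
                    Thread.currentThread().getName() + " runs task " + taskId));
        }
        pool.shutdown();
    }
}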
- Executors.newCachedThreadPool(): dynamically expands the number of threads; suitable for executing many short-lived asynchronous tasks, or for servers with a light load. The concrete implementation:
public static ExecutorService newCachedThreadPool() {
    return new ThreadPoolExecutor(0, Integer.MAX_VALUE,
                                  60L, TimeUnit.SECONDS,
                                  new SynchronousQueue<Runnable>());
}
Main features:
- Creates a cacheable thread pool. If the pool grows beyond what is needed, idle threads are reclaimed flexibly; if no idle thread is available, a new thread is created
- newCachedThreadPool sets corePoolSize to 0 and maximumPoolSize to Integer.MAX_VALUE, and uses a SynchronousQueue, which means a thread is created (or reused) as soon as a task arrives; a thread that stays idle for 60 seconds is destroyed

A short usage sketch follows.
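A minimal sketch of this elastic behaviour (the burst size is an arbitrary illustration; the pool size is read by casting to ThreadPoolExecutor, which is the concrete type newCachedThreadPool returns, as the source above shows):

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class CachedThreadPoolDemo {
    public static void main(String[] args) throws InterruptedException {
        ExecutorService pool = Executors.newCachedThreadPool();
        // A burst of tasks: the pool grows, because a SynchronousQueue holds no tasks
        for (int i = 0; i < 20; i++) {
            pool.execute(() -> {
                try {
                    TimeUnit.MILLISECONDS.sleep(500); // simulate short-lived work
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });
        }
        System.out.println("Threads after burst: " + ((ThreadPoolExecutor) pool).getPoolSize());
        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.SECONDS);
    }
}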
As you can see from the three implementations above, these thread pools are all ultimately built on a single class: ThreadPoolExecutor.
- Creating a ThreadPoolExecutor
When you create a thread pool through the factory methods inside Executors, you are actually constructing a ThreadPoolExecutor object with default values for its constructor parameters. Its constructor is as follows:
public ThreadPoolExecutor(int corePoolSize,
int maximumPoolSize,
long keepAliveTime,
TimeUnit unit,
BlockingQueue<Runnable> workQueue,
ThreadFactory threadFactory,
RejectedExecutionHandler handler) {
if (corePoolSize < 0 ||
maximumPoolSize <= 0 ||
maximumPoolSize < corePoolSize ||
keepAliveTime < 0)
throw new IllegalArgumentException();
if (workQueue == null || threadFactory == null || handler == null)
throw new NullPointerException();
this.acc = System.getSecurityManager() == null ?
null :
AccessController.getContext();
this.corePoolSize = corePoolSize;
this.maximumPoolSize = maximumPoolSize;
this.workQueue = workQueue;
this.keepAliveTime = unit.toNanos(keepAliveTime);
this.threadFactory = threadFactory;
this.handler = handler;
}
The meanings of the parameters are as follows:
- corePoolSize: the number of resident core threads
- maximumPoolSize: the maximum number of threads the pool can hold concurrently; must be greater than or equal to 1
- workQueue: the queue that holds submitted tasks that have not yet been executed
- keepAliveTime: the idle lifetime of non-core threads. When too many tasks pile up in the queue, extra threads (up to maximumPoolSize) are created as temporary threads to work through the queue; a temporary thread that stays idle longer than keepAliveTime is destroyed, leaving only the core threads
- unit: the time unit of keepAliveTime
- threadFactory: the factory used to create the pool's threads
- rejectedExecutionHandler: the rejection policy, i.e., the strategy for handling newly submitted tasks when the queue is full and the number of worker threads has reached maximumPoolSize
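A sketch of constructing a pool by hand with all seven parameters (the concrete values are arbitrary illustrations, not recommendations from the original text):

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.Executors;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class ManualPoolDemo {
    public static void main(String[] args) {
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                2,                                    // corePoolSize: resident core threads
                5,                                    // maximumPoolSize: upper bound on threads
                60L, TimeUnit.SECONDS,                // keepAliveTime + unit for non-core threads
                new ArrayBlockingQueue<>(3),          // workQueue: bounded queue of pending tasks
                Executors.defaultThreadFactory(),     // threadFactory
                new ThreadPoolExecutor.AbortPolicy()  // handler: rejection policy (the default)
        );
        pool.execute(() -> System.out.println("task runs on " + Thread.currentThread().getName()));
        pool.shutdown();
    }
}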
How these parameters come into play when tasks are submitted:
- After the thread pool is created, it waits for task submissions
- When execute() is called to submit a task, the pool makes the following decisions:
  - If the number of running threads is less than corePoolSize, a new thread is created to run the task
  - If the number of running threads is greater than or equal to corePoolSize, the task is put into the queue
  - If the queue is full and the number of running threads is less than maximumPoolSize, a non-core thread is created to run the task
  - If the number of threads has reached maximumPoolSize, the pool applies its saturation (rejection) policy
- When a thread finishes a task, it pulls the next task from the queue and executes it
- When a thread has been idle longer than keepAliveTime, the pool checks whether the number of currently running threads is greater than corePoolSize; if so, the thread is destroyed
- After all tasks are complete, the pool eventually shrinks back to corePoolSize threads

A sketch of this dispatch order is shown below.
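A minimal sketch of the dispatch order, using illustrative sizes (corePoolSize 2, maximumPoolSize 5, queue capacity 3): the pool can accept at most maximumPoolSize + queue capacity = 8 simultaneous tasks before the rejection policy fires.

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.RejectedExecutionException;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class DispatchOrderDemo {
    public static void main(String[] args) {
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                2, 5, 60L, TimeUnit.SECONDS, new ArrayBlockingQueue<>(3));
        for (int i = 1; i <= 10; i++) {
            final int taskId = i;
            try {
                // Tasks 1-2 start core threads, 3-5 wait in the queue, 6-8 start
                // non-core threads, and 9-10 are rejected by the default AbortPolicy
                pool.execute(() -> {
                    try {
                        TimeUnit.SECONDS.sleep(1); // keep threads busy so the order is visible
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                    }
                });
            } catch (RejectedExecutionException e) {
                System.out.println("task " + taskId + " rejected");
            }
        }
        pool.shutdown();
    }
}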
Thread rejection policy
When the queue is full and the number of threads has reached maximumPoolSize, subsequently submitted tasks are handled by a rejection policy. There are four built-in policies:
- AbortPolicy (default): throws a RejectedExecutionException directly, preventing the system from running normally
- CallerRunsPolicy: a "caller runs" back-pressure mechanism that neither discards the task nor throws an exception; instead, it pushes the task back to the caller's thread, slowing down the flow of new submissions
- DiscardOldestPolicy: discards the task that has waited longest in the queue and then tries to submit the current task again
- DiscardPolicy: silently discards the task without processing it or throwing an exception. This is the best policy when losing tasks is acceptable
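A minimal sketch of choosing a policy other than the default (the sizes are illustrative; the handler is the last constructor argument):

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.Executors;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class RejectionPolicyDemo {
    public static void main(String[] args) {
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                1, 2, 60L, TimeUnit.SECONDS,
                new ArrayBlockingQueue<>(1),
                Executors.defaultThreadFactory(),
                new ThreadPoolExecutor.CallerRunsPolicy()); // overflow runs on the caller's thread
        for (int i = 1; i <= 6; i++) {
            final int taskId = i;
            pool.execute(() -> {
                System.out.println("task " + taskId + " on " + Thread.currentThread().getName());
                try {
                    TimeUnit.MILLISECONDS.sleep(200); // keep workers busy so some tasks overflow
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });
        }
        pool.shutdown();
    }
}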
Configuring a reasonable number of threads
CPU-intensive
A CPU-intensive task does a lot of computation with no blocking, so the CPU runs at full speed. Such tasks can only be accelerated by multithreading on a genuinely multi-core CPU. Configure as few threads as possible: a pool of (number of CPU cores + 1) threads
IO-intensive
IO-intensive task threads do not execute on the CPU the whole time (much of it is spent blocked on IO), so configure more threads, for example CPU cores x 2
- Alternative formula: threads = CPU cores / (1 - blocking coefficient), with a blocking coefficient of 0.8-0.9. For example, on an 8-core CPU: 8 / (1 - 0.9) = 80 threads
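A sketch of computing these sizes at runtime (the 0.9 blocking coefficient is the example value above):

public class ThreadCountDemo {
    public static void main(String[] args) {
        int cores = Runtime.getRuntime().availableProcessors();

        // CPU-intensive: cores + 1
        int cpuBound = cores + 1;

        // IO-intensive: cores / (1 - blocking coefficient), e.g. coefficient 0.9
        double blockingCoefficient = 0.9;
        int ioBound = (int) (cores / (1 - blockingCoefficient));

        System.out.println("cores = " + cores
                + ", CPU-bound threads = " + cpuBound
                + ", IO-bound threads = " + ioBound);
    }
}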
Final recommendations
Do not use Executors to create thread pools; create a ThreadPoolExecutor directly instead. The drawbacks of the Executors factory methods are:
- newFixedThreadPool and newSingleThreadExecutor: the main problem is that the allowed queue length is Integer.MAX_VALUE, so piled-up requests can consume a large amount of memory and even cause an OOM
- newCachedThreadPool and newScheduledThreadPool: the main problem is that the maximum number of threads is Integer.MAX_VALUE, so a very large number of threads may be created, also potentially leading to an OOM