
Thread pool

A thread pool is a managed collection of threads. Using a thread pool is a good way to improve performance: the pool creates a number of idle threads at system startup, and when the program submits a task to the pool, the pool uses one of those threads to execute it. After execution, the thread does not die; it returns to the pool, becomes idle again, and waits for the next task.

Advantages of a thread pool: by reusing existing threads, the overhead of thread creation and destruction is reduced and system response speed is improved. When a task arrives, a thread is taken directly from the pool, so the task can be executed immediately without waiting for a thread to be created. Thread manageability is also improved: if threads are created without limit, they not only consume system resources but also reduce system stability (excessive memory usage can cause an OOM, and too many threads cause excessive CPU context switching).

Java thread pool creation and main parameters

No matter what type of thread pool is created (FixedThreadPool, CachedThreadPool, ...), the ThreadPoolExecutor constructor is ultimately called: newFixedThreadPool (fixed number of threads), newCachedThreadPool (pool of cacheable threads), newSingleThreadExecutor (single-thread pool), newScheduledThreadPool (pool for scheduled and periodic execution).

public ThreadPoolExecutor(int corePoolSize,                     // number of threads kept in the pool long-term
                          int maximumPoolSize,                  // maximum number of threads the pool may create
                          long keepAliveTime,                   // idle time after which threads beyond corePoolSize are reclaimed
                          TimeUnit unit,                        // time unit for keepAliveTime
                          BlockingQueue<Runnable> workQueue,    // queue holding tasks waiting to execute
                          ThreadFactory threadFactory,          // factory used to create new threads
                          RejectedExecutionHandler handler)     // policy applied when a task is rejected

corePoolSize: the number of threads to keep in the pool, even if they are idle, unless allowCoreThreadTimeOut is set. (The pool holds on to corePoolSize threads whether or not they have work, unless allowCoreThreadTimeOut is set.)

maximumPoolSize: the maximum number of threads to allow in the pool, i.e. the maximum number of threads the thread pool can create.

keepAliveTime: when the number of threads is greater than the core size, this is the maximum time that excess idle threads will wait for new tasks before terminating. (Survival time: if a thread beyond the core count receives no new task within keepAliveTime, it is reclaimed.)

unit: the time unit for the keepAliveTime argument.

workQueue: the queue used for holding tasks before they are executed. This queue holds only the Runnable tasks submitted by the execute method. (Queue of tasks waiting to be executed: when more tasks are submitted than the core threads can handle, the submitted tasks are stored here.)

threadFactory: the factory the executor uses when it creates a new thread. For example, you can customize thread names so that, when analyzing a thread dump, you can tell where each thread came from.

handler: the handler to use when execution is blocked because the thread bounds and queue capacities are reached. (Rejection policy: when the queue is full and the maximum number of threads are all busy, the pool cannot handle newly submitted tasks; this decides which rejection policy is applied.)
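As a rough sketch of how these seven parameters come together (the pool sizes, queue capacity, and thread-name prefix below are illustrative values, not from the original text):

import java.util.concurrent.*;
import java.util.concurrent.atomic.AtomicInteger;

public class PoolConfigDemo {
    public static void main(String[] args) {
        AtomicInteger counter = new AtomicInteger();
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                2,                                    // corePoolSize: threads kept alive even when idle
                4,                                    // maximumPoolSize: upper bound on thread count
                60L, TimeUnit.SECONDS,                // keepAliveTime + unit for idle non-core threads
                new ArrayBlockingQueue<>(100),        // workQueue: bounded queue for waiting tasks
                r -> new Thread(r, "biz-pool-" + counter.incrementAndGet()), // threadFactory: custom names
                new ThreadPoolExecutor.AbortPolicy()  // handler: default rejection policy
        );
        pool.execute(() -> System.out.println("task runs on " + Thread.currentThread().getName()));
        pool.shutdown();
    }
}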

Thread pool task execution process

When a task is submitted and the number of live core threads in the pool is less than corePoolSize, the pool creates a core thread to process the submitted task.

If the core threads of the pool are all in use, that is, the number of threads equals corePoolSize, a newly submitted task is placed in the workQueue to wait for execution.

When the number of threads in the pool is equal to the corePoolSize and the workQueue is full, determine whether the number of threads has reached maximumPoolSize. If not, create a non-core thread to execute the submitted task.

If the current number of threads reaches maximumPoolSize and new tasks come along, the rejection policy is applied directly.
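A minimal sketch of this flow, using made-up sizes (1 core thread, a 1-slot queue, 2 maximum threads): the first task occupies the core thread, the second waits in the queue, the third triggers a non-core thread, and the fourth is rejected.

import java.util.concurrent.*;

public class ExecutionFlowDemo {
    public static void main(String[] args) {
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                1, 2, 0L, TimeUnit.SECONDS, new ArrayBlockingQueue<>(1));
        Runnable slowTask = () -> {
            try { Thread.sleep(1000); } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
        };
        pool.execute(slowTask); // 1st: runs on the core thread
        pool.execute(slowTask); // 2nd: corePoolSize reached, waits in the workQueue
        pool.execute(slowTask); // 3rd: queue full, a non-core thread is created
        try {
            pool.execute(slowTask); // 4th: queue and pool both full -> rejection policy (AbortPolicy by default)
        } catch (RejectedExecutionException e) {
            System.out.println("4th task rejected: " + e);
        }
        pool.shutdown();
    }
}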

Thread pool rejection policy

The thread pool rejection policies fall into several kinds. AbortPolicy: throws a RejectedExecutionException directly; this is the default policy.

DiscardPolicy: Do nothing, simply discard the task.

DiscardOldestPolicy: Discards the next task to be executed (discarding the oldest task in the execution queue in an attempt to make room for the current submitted task).

CallerRunsPolicy: the task is executed in the calling thread (the submitter runs the task itself).
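Beyond the four built-in policies, RejectedExecutionHandler can also be implemented directly. The handler below is a hypothetical example, not from the original text: it logs the rejection and drops the task, similar in spirit to DiscardPolicy but with visibility.

import java.util.concurrent.RejectedExecutionHandler;
import java.util.concurrent.ThreadPoolExecutor;

// Hypothetical custom policy: log the rejection and silently drop the task.
public class LoggingDiscardPolicy implements RejectedExecutionHandler {
    @Override
    public void rejectedExecution(Runnable r, ThreadPoolExecutor executor) {
        System.err.println("Task " + r + " rejected; pool size=" + executor.getPoolSize()
                + ", queue size=" + executor.getQueue().size());
    }
}

It would be passed as the handler argument of the ThreadPoolExecutor constructor, just like the built-in policies.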

Work queues

Several typical work queues are listed below. ArrayBlockingQueue: a bounded blocking queue implemented with an array, with first-in-first-out ordering.

LinkedBlockingQueue: a blocking queue based on a linked-list structure that schedules tasks in FIFO order. Its capacity can optionally be set; if it is not, the queue is effectively unbounded, with a maximum length of Integer.MAX_VALUE. Throughput is usually higher than that of ArrayBlockingQueue. The newFixedThreadPool thread pool uses this queue.

PriorityBlockingQueue: an unbounded blocking queue that supports priority ordering, implemented with a binary heap.

DelayQueue: a queue in which tasks are executed only after a given delay. Elements are ordered by their scheduled execution time from earliest to latest, and otherwise by insertion order. The newScheduledThreadPool thread pool uses this kind of queue.

SynchronousQueue: a blocking queue that does not store elements. Each insert operation blocks until another thread performs a corresponding remove operation. Throughput is usually higher than that of LinkedBlockingQueue. The newCachedThreadPool thread pool uses this queue.

LinkedTransferQueue: an unbounded blocking queue composed of a linked-list structure. Compared with other blocking queues, it has additional tryTransfer and transfer methods.

LinkedBlockingDeque: a double-ended (two-way) blocking queue backed by a linked-list structure.
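To make the differences concrete, here is a small sketch (the queue capacities and pool sizes are arbitrary) showing how a few of these queues are constructed before being handed to a ThreadPoolExecutor:

import java.util.concurrent.*;

public class WorkQueueChoices {
    public static void main(String[] args) {
        BlockingQueue<Runnable> bounded   = new ArrayBlockingQueue<>(100); // bounded, FIFO
        BlockingQueue<Runnable> unbounded = new LinkedBlockingQueue<>();   // capacity defaults to Integer.MAX_VALUE
        BlockingQueue<Runnable> handoff   = new SynchronousQueue<>();      // stores no elements, direct handoff
        BlockingQueue<Runnable> priority  = new PriorityBlockingQueue<>(); // unbounded, ordered by priority

        // Any of them can serve as the workQueue parameter:
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                2, 4, 60L, TimeUnit.SECONDS, bounded);
        pool.shutdown();
    }
}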

Common thread pools

SingleThreadExecutor:

public static ExecutorService newSingleThreadExecutor() {
        return new FinalizableDelegatedExecutorService
            (new ThreadPoolExecutor(1, 1,
                                    0L, TimeUnit.MILLISECONDS,
                                    new LinkedBlockingQueue<Runnable>()));
    }

It creates a single thread. It applies to tasks that need to be executed sequentially, and to scenarios where at most one thread should be active at any point in time.

SingleThreadExecutor's corePoolSize and maximumPoolSize are both set to 1, so only one thread can be alive at a time. It uses the unbounded LinkedBlockingQueue (first-in, first-out, so tasks are completed in order) as the work queue of the thread pool.

When there are no threads in the thread pool, a new thread is created to perform the task.

When there is already a thread in the current thread pool, new tasks are added to the LinkedBlockingQueue.

After the thread completes its first task, it keeps taking tasks from the LinkedBlockingQueue in an infinite loop. keepAliveTime is 0.

Usage scenario: suitable for serial execution of tasks, executing them one by one.
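A minimal usage sketch (the three print tasks are illustrative): tasks submitted to a SingleThreadExecutor are executed one after another on the same worker thread.

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class SingleThreadExecutorDemo {
    public static void main(String[] args) {
        ExecutorService pool = Executors.newSingleThreadExecutor();
        for (int i = 1; i <= 3; i++) {
            final int taskNo = i;
            // Tasks run strictly in submission order on the single worker thread.
            pool.execute(() ->
                    System.out.println("task " + taskNo + " on " + Thread.currentThread().getName()));
        }
        pool.shutdown();
    }
}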

FixedThreadPool:

public static ExecutorService newFixedThreadPool(int nThreads) {
        return new ThreadPoolExecutor(nThreads, nThreads,
                                      0L, TimeUnit.MILLISECONDS,
                                      new LinkedBlockingQueue<Runnable>());
    }

Features: the number of core threads is the same as the maximum number of threads; keepAliveTime is 0L, which means extra threads terminate immediately; the blocking queue is an unbounded LinkedBlockingQueue.

FixedThreadPool is a pool with a fixed number of threads. When threads are idle, they are not reclaimed unless the pool is closed. When all threads are active, new tasks wait until a thread becomes idle.

Note: because newFixedThreadPool uses the unbounded blocking queue LinkedBlockingQueue, if the threads pick up tasks that take a long time to execute, tasks will accumulate in the queue and the machine's memory usage will spike, eventually causing an OOM.

Usage scenario: suitable for CPU-intensive tasks. It keeps the CPU occupied by worker threads for a long time while allocating as few threads as possible, i.e. it is suitable for executing long-running tasks.
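A small sketch of using a fixed pool for CPU-bound work (the thread count of 4 and the dummy computation are illustrative assumptions):

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class FixedThreadPoolDemo {
    public static void main(String[] args) throws InterruptedException {
        // A common choice for CPU-bound work is a size close to the number of cores.
        ExecutorService pool = Executors.newFixedThreadPool(4);
        for (int i = 0; i < 8; i++) {
            pool.execute(() -> {
                long sum = 0;
                for (long j = 0; j < 1_000_000; j++) sum += j;   // dummy CPU-bound task
                System.out.println(Thread.currentThread().getName() + " -> " + sum);
            });
        }
        pool.shutdown();
        pool.awaitTermination(10, TimeUnit.SECONDS);
    }
}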

CachedThreadPool:

public static ExecutorService newCachedThreadPool() {
        return new ThreadPoolExecutor(0, Integer.MAX_VALUE,
                                      60L, TimeUnit.SECONDS,
                                      new SynchronousQueue<Runnable>());
    }

Features:

The maximum number of threads is Integer.MAX_VALUE; the blocking queue is a SynchronousQueue (which stores no elements, so a submitted task must be handed directly to a thread); the idle lifetime of non-core threads is 60 seconds.

Usage scenario: performing a large number of short-lived tasks. Because maximumPoolSize is effectively unbounded, whenever tasks are submitted faster than the pool's threads can process them, new threads are created; each submitted task is therefore handled by a thread immediately. CachedThreadPool is suitable for processing a large number of tasks that each take little time.
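A minimal usage sketch (the five print tasks are illustrative): short tasks are picked up by reused idle threads, and new threads are created only when none are free.

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class CachedThreadPoolDemo {
    public static void main(String[] args) {
        ExecutorService pool = Executors.newCachedThreadPool();
        for (int i = 1; i <= 5; i++) {
            final int taskNo = i;
            pool.execute(() ->
                    System.out.println("short task " + taskNo + " on " + Thread.currentThread().getName()));
        }
        pool.shutdown();
    }
}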

ScheduledThreadPool:

Features: the maximum number of threads is Integer.MAX_VALUE, and the work queue is a DelayedWorkQueue.

Non-core threads have a lifetime of 0, so the thread pool contains only a fixed number of core threads.

There are two ways to submit tasks:

scheduleAtFixedRate: the task is executed periodically at a fixed rate.

scheduleWithFixedDelay: the next execution starts a fixed delay after the previous one finishes.

Usage scenario: scenarios where tasks are executed periodically and the number of threads needs to be limited.
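A minimal sketch showing the two submission styles (the pool size, delays, and periods are arbitrary values chosen for illustration):

import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class ScheduledThreadPoolDemo {
    public static void main(String[] args) {
        ScheduledExecutorService pool = Executors.newScheduledThreadPool(2);

        // Runs every 1 second, measured from the start of the previous run.
        pool.scheduleAtFixedRate(
                () -> System.out.println("fixed-rate tick"), 0, 1, TimeUnit.SECONDS);

        // Runs 1 second after the previous run finishes.
        pool.scheduleWithFixedDelay(
                () -> System.out.println("fixed-delay tick"), 0, 1, TimeUnit.SECONDS);

        // Let the periodic tasks run for a few seconds, then stop the pool.
        pool.schedule(pool::shutdown, 5, TimeUnit.SECONDS);
    }
}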