Hello, today I'm sharing Java thread pools with you. Take out your little notebook and write this down.

1. Why use thread pools

1. Creating and destroying threads one at a time wastes resources and leads to frequent GC

2. Without unified management, threads compete with each other for resources

2. ThreadPoolExecutor

ThreadPoolExecutor has four overloaded constructors. The one with the most parameters is shown here, since it makes the meaning of the others clear:

public ThreadPoolExecutor(int corePoolSize,
                          int maximumPoolSize,
                          long keepAliveTime,
                          TimeUnit unit,
                          BlockingQueue<Runnable> workQueue,
                          ThreadFactory threadFactory,
                          RejectedExecutionHandler handler)

Description of each parameter (in everyday development the five-parameter constructor is used more often, but let's look at what all seven parameters mean):

corePoolSize: the number of core threads in the thread pool.

maximumPoolSize: the maximum number of threads the pool may hold.

keepAliveTime: the timeout for non-core threads. When a non-core thread has been idle for longer than keepAliveTime, it is reclaimed. If the pool's allowCoreThreadTimeOut property is set to true, this timeout also applies to core threads.

unit: the time unit of keepAliveTime (nanoseconds, microseconds, milliseconds, seconds, minutes, hours, days, and so on).

workQueue: the task queue of the thread pool, holding tasks that have been submitted but not yet executed. Tasks arrive here through ThreadPoolExecutor's execute method.

threadFactory: creates new threads for the pool; the default factory is usually fine.

handler: the rejection policy applied when a new task cannot be executed (usually because the pool has reached its maximum number of threads or has been shut down). By default the pool throws a RejectedExecutionException when it cannot accept a new task.
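To make these parameters concrete, here is a minimal sketch that builds a pool with the full seven-argument constructor; the sizes, queue capacity, and task are arbitrary illustrative values:

import java.util.concurrent.*;

public class PoolDemo {
    public static void main(String[] args) {
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                2,                                     // corePoolSize
                4,                                     // maximumPoolSize
                60L,                                   // keepAliveTime for idle non-core threads
                TimeUnit.SECONDS,                      // unit of keepAliveTime
                new ArrayBlockingQueue<Runnable>(10),  // workQueue: bounded task queue
                Executors.defaultThreadFactory(),      // threadFactory
                new ThreadPoolExecutor.AbortPolicy()); // handler: reject by throwing an exception

        pool.execute(() -> System.out.println(Thread.currentThread().getName()));
        pool.shutdown();
    }
}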

  • About the workQueue

1. ArrayBlockingQueue: a BlockingQueue with a fixed size. Its constructor takes an int that sets the capacity. Elements stored in an ArrayBlockingQueue are accessed in FIFO (first-in, first-out) order.

2. LinkedBlockingQueue: its constructor optionally takes an int capacity, so the queue can be created bounded or unbounded. If no capacity is given, the default capacity is Integer.MAX_VALUE.

3. PriorityBlockingQueue: similar to LinkedBlockingQueue, except that its elements are not ordered FIFO; the access order is determined by the elements' ordering. (This means elements stored in a PriorityBlockingQueue must implement Comparable, or a Comparator must be supplied.)

4. SynchronousQueue: a thread-safe BlockingQueue in which an inserting producer thread must wait for a consumer thread to remove the element; a SynchronousQueue has no internal capacity. It therefore cannot be peeked at or iterated, and an element exists only at the moment you try to take it. Think of the producer and consumer waiting for each other and then leaving together.
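For reference, this is how each of these queues is typically constructed when used as a workQueue; the capacities are arbitrary example values:

import java.util.concurrent.*;

public class QueueChoices {
    public static void main(String[] args) {
        // Bounded FIFO queue; the capacity must be given up front
        BlockingQueue<Runnable> array = new ArrayBlockingQueue<>(100);

        // Optionally bounded; without an argument the capacity is Integer.MAX_VALUE
        BlockingQueue<Runnable> bounded = new LinkedBlockingQueue<>(100);
        BlockingQueue<Runnable> unbounded = new LinkedBlockingQueue<>();

        // Ordered by the elements' natural ordering or by a supplied Comparator
        BlockingQueue<Runnable> priority = new PriorityBlockingQueue<>();

        // Holds no elements; every insert waits for a matching remove
        BlockingQueue<Runnable> handoff = new SynchronousQueue<>();
    }
}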

  • Rejection policies

AbortPolicy: rejects the task and throws an exception; this is the default policy.

CallerRunsPolicy: the thread that called execute runs the task itself.

DiscardOldestPolicy: discards the oldest unprocessed task in the queue and then retries the new task.

DiscardPolicy: discards the new task without throwing an exception.
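All four policies are nested classes of ThreadPoolExecutor and are passed as the handler argument; since RejectedExecutionHandler has a single method, a custom handler can also be written as a lambda. A brief sketch (the logging handler is just an illustration):

import java.util.concurrent.*;

public class RejectionPolicies {
    public static void main(String[] args) {
        RejectedExecutionHandler abort = new ThreadPoolExecutor.AbortPolicy();           // throws RejectedExecutionException (default)
        RejectedExecutionHandler caller = new ThreadPoolExecutor.CallerRunsPolicy();     // runs the task on the submitting thread
        RejectedExecutionHandler oldest = new ThreadPoolExecutor.DiscardOldestPolicy();  // drops the oldest queued task, then retries
        RejectedExecutionHandler discard = new ThreadPoolExecutor.DiscardPolicy();       // silently drops the new task

        // A custom handler, for example one that only logs the dropped task
        RejectedExecutionHandler logging = (task, pool) ->
                System.err.println("Rejected " + task + " from " + pool);
    }
}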

3. How the thread pool executes tasks

When the number of threads has not reached corePoolSize, a new core thread is created to execute the task.

When the core threads are all in use, the task is placed on the blocking queue.

When the queue is full but the maximum number of threads has not been reached, a new non-core thread is created to execute the task (important).

When the queue is full and the maximum number of threads has been reached, the rejection policy is applied.
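These four steps can be observed with a deliberately tiny pool. This is only an illustrative sketch (core size 1, maximum 2, queue capacity 1), so the fourth task is handed to the rejection policy:

import java.util.concurrent.*;

public class FlowDemo {
    public static void main(String[] args) {
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                1, 2, 30L, TimeUnit.SECONDS,
                new ArrayBlockingQueue<Runnable>(1),
                Executors.defaultThreadFactory(),
                new ThreadPoolExecutor.AbortPolicy());

        Runnable slow = () -> {
            try { Thread.sleep(1000); } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
        };

        pool.execute(slow); // 1st task: a core thread is created
        pool.execute(slow); // 2nd task: the core thread is busy, so the task waits in the queue
        pool.execute(slow); // 3rd task: the queue is full, so a non-core thread is created
        try {
            pool.execute(slow); // 4th task: queue full and maximum reached, so it is rejected
        } catch (RejectedExecutionException e) {
            System.out.println("4th task rejected: " + e);
        }
        pool.shutdown();
    }
}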

4. Other thread pools

1. FixedThreadPool

A fixed-size thread pool. You specify the pool size; corePoolSize equals maximumPoolSize, and the blocking queue is a LinkedBlockingQueue with a capacity of Integer.MAX_VALUE.

The number of threads in the pool stays constant. When a new task is submitted, it is executed immediately if an idle thread is available; otherwise it waits in the blocking queue. The thread count of a fixed-size pool never changes.

Because it uses an unbounded LinkedBlockingQueue to hold pending tasks, the queue can grow rapidly and exhaust system resources when tasks are submitted frequently.

In addition, when the pool is idle, that is, when there are no runnable tasks, it does not release its worker threads and continues to occupy some system resources, so it needs to be shut down explicitly.
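For reference, Executors.newFixedThreadPool(nThreads) is built on ThreadPoolExecutor roughly like this; the sketch below only reproduces the equal core/maximum sizes and the unbounded queue described above:

import java.util.concurrent.*;

public class FixedPoolSketch {
    // Roughly what Executors.newFixedThreadPool(nThreads) returns:
    // equal core and maximum sizes, no idle timeout, unbounded LinkedBlockingQueue
    static ExecutorService fixed(int nThreads) {
        return new ThreadPoolExecutor(
                nThreads, nThreads,
                0L, TimeUnit.MILLISECONDS,
                new LinkedBlockingQueue<Runnable>());
    }

    public static void main(String[] args) {
        ExecutorService pool = fixed(4);
        pool.execute(() -> System.out.println(Thread.currentThread().getName()));
        pool.shutdown(); // idle worker threads are not reclaimed until the pool is shut down
    }
}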

2. SingleThreadExecutor

The single-thread executor keeps exactly one worker thread, and its blocking queue is a LinkedBlockingQueue with the default capacity of Integer.MAX_VALUE, so a flood of incoming requests can pile up in the queue and cause an OOM (see the sketch below).
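Roughly, newSingleThreadExecutor() is the one-thread version of the same construction, which is exactly where the OOM risk comes from (the JDK additionally wraps the result so it cannot be reconfigured, but the sizes and queue are what matter here):

import java.util.concurrent.*;

public class SinglePoolSketch {
    // Roughly equivalent to Executors.newSingleThreadExecutor():
    // one core thread, one maximum thread, unbounded LinkedBlockingQueue
    static ExecutorService single() {
        return new ThreadPoolExecutor(
                1, 1,
                0L, TimeUnit.MILLISECONDS,
                new LinkedBlockingQueue<Runnable>());
    }
}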

3. Executors.newCachedThreadPool()

A cacheable thread pool: when a task arrives, the pool checks whether an idle thread is available; if so, it is reused directly, otherwise a new thread is created and added to the pool.

It is typically used for short-lived asynchronous tasks. The pool is effectively unbounded; when a new task arrives and a previous task has already finished, the thread that ran the previous task is reused instead of creating a new thread every time.

Cached threads are kept alive for 60 seconds by default. corePoolSize is 0, maximumPoolSize is Integer.MAX_VALUE, and the blocking queue is a SynchronousQueue.

The SynchronousQueue is a direct hand-off queue, which forces the pool to create a new thread for a new task whenever no idle thread is available.

When a worker thread has been idle (not executing any task) for longer than keepAliveTime (60 seconds), it terminates and is reclaimed. When a new task is then submitted and no free thread exists, a new thread has to be created to run it, which incurs some system overhead.

If a large number of tasks are submitted at once and they are not particularly quick to execute, the pool will create an equally large number of new threads, which can quickly exhaust the system's resources.
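The composition described above corresponds roughly to the following sketch: no core threads, an effectively unbounded maximum, a 60-second idle timeout, and a SynchronousQueue that hands each task straight to a thread:

import java.util.concurrent.*;

public class CachedPoolSketch {
    // Roughly what Executors.newCachedThreadPool() returns
    static ExecutorService cached() {
        return new ThreadPoolExecutor(
                0, Integer.MAX_VALUE,
                60L, TimeUnit.SECONDS,
                new SynchronousQueue<Runnable>());
    }
}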

4. ScheduledThreadPool

Creates a fixed-length thread pool that supports scheduled and periodic task execution.

A timed thread pool that can run tasks periodically, typically for jobs such as synchronizing data at regular intervals.

scheduleAtFixedRate: runs the task at a fixed rate; the period is measured from the start of one execution to the start of the next.

scheduleWithFixedDelay: runs the task with a fixed delay; the delay is measured from the end of one successful execution to the start of the next.
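A minimal usage sketch of both scheduling methods; the pool size, tasks, and intervals are arbitrary examples:

import java.util.concurrent.*;

public class ScheduledDemo {
    public static void main(String[] args) {
        ScheduledExecutorService scheduler = Executors.newScheduledThreadPool(2);

        // Fixed rate: a new run starts every 5 seconds, measured from the start of the previous run
        scheduler.scheduleAtFixedRate(
                () -> System.out.println("sync data"), 0, 5, TimeUnit.SECONDS);

        // Fixed delay: each run starts 5 seconds after the previous run finished
        scheduler.scheduleWithFixedDelay(
                () -> System.out.println("cleanup"), 0, 5, TimeUnit.SECONDS);
    }
}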

5. Why does Ali recommend custom thread pools

The newFixedThreadPool and newSingleThreadExecutor factory methods use a LinkedBlockingQueue as the task queue, whose default capacity is Integer.MAX_VALUE, while newCachedThreadPool allows the pool to grow to Integer.MAX_VALUE threads.

Because the request queue of FixedThreadPool and SingleThreadExecutor can hold up to Integer.MAX_VALUE tasks, a large number of requests may accumulate, resulting in an OOM.

Because CachedThreadPool allows up to Integer.MAX_VALUE threads to be created, a large number of threads may be created, also resulting in an OOM.

6. Other

1. shutdown() shuts down the thread pool without affecting tasks that have already been submitted; they are still executed.

2. shutdownNow() shuts down the thread pool and attempts to interrupt the tasks that are currently executing.

3. allowCoreThreadTimeOut(boolean) allows idle core threads to be reclaimed after the keep-alive timeout.
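A brief sketch of these lifecycle calls on a ThreadPoolExecutor; the pool configuration and the wait timeout are arbitrary illustrative values:

import java.util.concurrent.*;

public class LifecycleDemo {
    public static void main(String[] args) throws InterruptedException {
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                2, 4, 60L, TimeUnit.SECONDS, new LinkedBlockingQueue<Runnable>(100));

        // Let idle core threads also be reclaimed after the keep-alive time
        pool.allowCoreThreadTimeOut(true);

        pool.execute(() -> System.out.println("work"));

        // Stop accepting new tasks, but let already submitted tasks finish
        pool.shutdown();
        if (!pool.awaitTermination(5, TimeUnit.SECONDS)) {
            // Try to interrupt running tasks and drain the queue
            pool.shutdownNow();
        }
    }
}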

4. Create a thread pool in singleton mode

import com.google.common.util.concurrent.ThreadFactoryBuilder;

import java.util.concurrent.*;

/**
 * Asynchronous task processor
 */
public class AsyncTaskExecutor {

    public static final int CORE_POOL_SIZE = 10;
    public static final int MAX_POOL_SIZE = 40;
    public static final int KEEP_ALIVE_TIME = 1000;
    public static final int BLOCKING_QUEUE_SIZE = 1000;

    /** Thread pool for asynchronous tasks */
    private static final ThreadPoolExecutor processExecutor = new ThreadPoolExecutor(
            CORE_POOL_SIZE,
            MAX_POOL_SIZE,
            KEEP_ALIVE_TIME,
            TimeUnit.MICROSECONDS,
            new LinkedBlockingQueue<Runnable>(BLOCKING_QUEUE_SIZE),
            new ThreadFactoryBuilder().setNameFormat("boomoom-thread-pool-%d").build(),
            new ThreadPoolExecutor.DiscardPolicy());

    private AsyncTaskExecutor() {}

    /**
     * Submit a task for asynchronous execution.
     *
     * @param task the task to run
     */
    public static void execute(Runnable task) {
        processExecutor.submit(task);
    }
}

The difference between the lazy-style and eager-style (hungry) singleton

1. The eager (hungry) style is thread-safe: the static instance is created when the class is loaded, is then used by the whole system, and never changes.

2. For the lazy style to be thread-safe, double-checked locking must be used and the instance field must be declared volatile to prevent instruction reordering (see the sketch below).
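A minimal sketch of both forms; the volatile field in the lazy version is what prevents the instruction reordering mentioned above:

// Eager ("hungry") style: the instance is created when the class is loaded, so it is inherently thread-safe
class EagerSingleton {
    private static final EagerSingleton INSTANCE = new EagerSingleton();

    private EagerSingleton() {}

    public static EagerSingleton getInstance() {
        return INSTANCE;
    }
}

// Lazy style with double-checked locking: volatile prevents a partially constructed
// instance from becoming visible to other threads
class LazySingleton {
    private static volatile LazySingleton instance;

    private LazySingleton() {}

    public static LazySingleton getInstance() {
        if (instance == null) {                     // first check, without the lock
            synchronized (LazySingleton.class) {
                if (instance == null) {             // second check, under the lock
                    instance = new LazySingleton();
                }
            }
        }
        return instance;
    }
}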

Well, that's all for today's article. I hope it helps those of you who are still confused in front of the screen.