Thread pools are one of the most frequently examined topics in a Java interview. Given the same question, one candidate can talk for five minutes and another for half an hour; the difference is depth of knowledge. Below are a few questions that come up again and again in interviews. Can you answer them? If not, read on and bookmark this article.
- The function of the various parameters of the thread pool, a brief description of the thread pool workflow.
- What are the common thread pools, and where are they applicable?
- Can threads using unbounded queues cause memory spikes?
Java thread pool concept
As the name suggests, a thread pool manages a pool of threads. Compared with creating and running threads by hand, it has the following advantages:
- Reduces the overhead of thread creation and destruction.
- Improves response speed: when a task arrives, taking a ready thread from the pool is much faster than creating one on the spot.
- Improves thread manageability: threads are a scarce resource, and creating them without limit consumes system resources and degrades system stability. A thread pool allows threads to be allocated, tuned, and monitored in a unified way.
Creating a Java thread pool
No matter which type of thread pool you create (FixedThreadPool, CachedThreadPool, and so on), the ThreadPoolExecutor constructor is ultimately called. Its parameters are explained in more detail below.
```java
public ThreadPoolExecutor(int corePoolSize,
                          int maximumPoolSize,
                          long keepAliveTime,
                          TimeUnit unit,
                          BlockingQueue<Runnable> workQueue) {
    this(corePoolSize, maximumPoolSize, keepAliveTime, unit, workQueue,
         Executors.defaultThreadFactory(), defaultHandler);
}
```
**corePoolSize:** the number of core threads; in layman's terms, the maximum number of threads that stay resident in the pool.
**maximumPoolSize:** the maximum number of threads the pool may run at once (core plus non-core).
**keepAliveTime:** the longest time an idle non-core thread may stay alive in the pool.
**unit:** the time unit of keepAliveTime.
**workQueue:** the blocking queue that holds tasks waiting to be executed.
**threadFactory:** the factory used to create worker threads (the five-argument constructor above falls back to Executors.defaultThreadFactory()).
**handler:** the thread pool saturation policy, applied when both the pool and the queue are full; see the construction sketch below.
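To make the parameters concrete, here is a minimal sketch that builds a pool with all seven arguments spelled out. The sizing choices (2 core threads, 4 maximum, a 30-second keep-alive, a queue of 100 tasks) are purely illustrative assumptions, not recommendations.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.Executors;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class PoolConfigDemo {
    public static void main(String[] args) {
        // Illustrative sizing: 2 resident threads, up to 4 in total,
        // idle non-core threads reclaimed after 30 seconds,
        // a bounded queue of 100 tasks, and the default AbortPolicy.
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                2,                                    // corePoolSize
                4,                                    // maximumPoolSize
                30L, TimeUnit.SECONDS,                // keepAliveTime + unit
                new ArrayBlockingQueue<>(100),        // workQueue
                Executors.defaultThreadFactory(),     // threadFactory
                new ThreadPoolExecutor.AbortPolicy()  // handler (saturation policy)
        );

        pool.execute(() ->
                System.out.println("task running on " + Thread.currentThread().getName()));
        pool.shutdown();
    }
}
```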
Thread pool execution flow
When a new task is submitted, the thread pool processes it as follows:
- Check whether the number of core threads has reached the threshold corePoolSize. If not, create a new core thread to execute the task.
- If the number of core threads has reached corePoolSize, check whether the workQueue is full. If not, add the new task to the blocking queue.
- If the queue is full, check whether the total number of threads has reached the threshold maximumPoolSize. If not, create a non-core thread to execute the task; if it has, apply the thread pool saturation policy.
There are several thread pool saturation policies (a sketch that triggers the default one follows this list):
- AbortPolicy: throws a RejectedExecutionException; this is the default policy.
- DiscardPolicy: silently discards the task.
- DiscardOldestPolicy: discards the oldest task in the queue (the head) and then retries the new task.
- CallerRunsPolicy: runs the task in the thread that submitted it (the caller of execute()).
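The following sketch exercises the flow above with the default AbortPolicy. The pool sizes and the 1-second sleep are illustrative assumptions chosen so that the fourth task finds both the pool and the queue full and is rejected.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.RejectedExecutionException;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class SaturationDemo {
    public static void main(String[] args) {
        // 1 core thread, at most 2 threads, a queue of 1: the 4th task is rejected.
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                1, 2, 0L, TimeUnit.SECONDS, new ArrayBlockingQueue<>(1));

        Runnable slowTask = () -> {
            try { Thread.sleep(1000); } catch (InterruptedException ignored) { }
        };

        for (int i = 1; i <= 4; i++) {
            try {
                pool.execute(slowTask);
                System.out.println("task " + i + " accepted");
            } catch (RejectedExecutionException e) {
                // AbortPolicy (the default) signals saturation with an exception
                System.out.println("task " + i + " rejected");
            }
        }
        pool.shutdown();
    }
}
```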
[Figure: thread pool execution flow]
[Figure: thread pool structure]
Several typical work queues
**ArrayBlockingQueue:** a bounded blocking queue backed by an array, with first-in, first-out ordering.
**LinkedBlockingQueue:** a blocking queue backed by a linked list, also first-in, first-out. If no capacity is given it defaults to Integer.MAX_VALUE, making it effectively unbounded.
**PriorityBlockingQueue:** an unbounded blocking queue that orders elements by priority, implemented with a binary heap.
**DelayQueue:** an unbounded blocking queue whose elements each carry a delay; an element can only be taken from the queue once its delay has expired, and the head of the queue is the element whose delay expires soonest.
**SynchronousQueue:** a blocking queue that stores no elements; each insert must wait for another thread to take the element, otherwise the insert stays blocked. (A short demo of these queue behaviors follows.)
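A minimal sketch illustrating two of these behaviors: a bounded ArrayBlockingQueue rejecting an offer once full, and a SynchronousQueue rejecting an offer when no consumer is waiting. The capacity of 2 is an arbitrary illustrative choice.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.SynchronousQueue;

public class QueueDemo {
    public static void main(String[] args) {
        // Bounded queue: offer() returns false once capacity (here 2) is reached.
        ArrayBlockingQueue<String> bounded = new ArrayBlockingQueue<>(2);
        System.out.println(bounded.offer("a")); // true
        System.out.println(bounded.offer("b")); // true
        System.out.println(bounded.offer("c")); // false, queue is full

        // SynchronousQueue holds no elements: offer() fails unless a
        // consumer is already waiting in take()/poll().
        SynchronousQueue<String> handoff = new SynchronousQueue<>();
        System.out.println(handoff.offer("x")); // false, nobody is waiting
    }
}
```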
Several typical thread pools:
SingleThreadExecutor
```java
public static ExecutorService newSingleThreadExecutor() {
    return new FinalizableDelegatedExecutorService
        (new ThreadPoolExecutor(1, 1, 0L, TimeUnit.MILLISECONDS,
                                new LinkedBlockingQueue<Runnable>()));
}
```
Creates a pool with a single worker thread. It is suitable for tasks that must execute sequentially, with at most one thread active at any point in time. Both corePoolSize and maximumPoolSize of SingleThreadExecutor are set to 1, and the unbounded LinkedBlockingQueue is used as the work queue.
- When there is no thread in the pool, a new thread is created to execute the task.
- When the pool already has its one thread, new tasks are added to the LinkedBlockingQueue.
- After the thread finishes its first task, it keeps fetching tasks from the LinkedBlockingQueue in an infinite loop and executing them.
Usage scenario: serial execution of tasks.
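A minimal usage sketch: three tasks submitted to a SingleThreadExecutor run one after another on the same worker thread, in submission order. The task bodies are placeholders.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class SingleThreadDemo {
    public static void main(String[] args) {
        ExecutorService pool = Executors.newSingleThreadExecutor();
        // Tasks run sequentially on the single worker thread, in submission order.
        for (int i = 1; i <= 3; i++) {
            int id = i;
            pool.execute(() ->
                    System.out.println("task " + id + " on " + Thread.currentThread().getName()));
        }
        pool.shutdown();
    }
}
```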
FixedThreadPool
```java
public static ExecutorService newFixedThreadPool(int nThreads, ThreadFactory threadFactory) {
    return new ThreadPoolExecutor(nThreads, nThreads,
                                  0L, TimeUnit.MILLISECONDS,
                                  new LinkedBlockingQueue<Runnable>(),
                                  threadFactory);
}
```
corePoolSize equals maximumPoolSize, so the pool contains only core threads, and the unbounded blocking queue LinkedBlockingQueue is used as the work queue.
A FixedThreadPool is a thread pool with a fixed number of threads. Idle threads are not reclaimed unless the pool is shut down; when all threads are busy, new tasks wait until a thread becomes free.
- If the number of running threads is less than corePoolSize, a new thread is created to execute the task.
- Once the thread count reaches corePoolSize, new tasks are placed in the LinkedBlockingQueue.
- After a thread finishes its current task, it repeatedly fetches the next task from the LinkedBlockingQueue and executes it.
Usage scenario: suited to CPU-intensive work. It keeps the thread count small and stable so the CPU is not oversubscribed, which also makes it appropriate for long-running tasks.
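A sketch of the common sizing heuristic for CPU-bound work: one thread per available core. The dummy summation loop and the factor of two on the task count are illustrative assumptions.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class FixedPoolDemo {
    public static void main(String[] args) {
        // For CPU-bound work, a common heuristic is one thread per core.
        int cores = Runtime.getRuntime().availableProcessors();
        ExecutorService pool = Executors.newFixedThreadPool(cores);

        for (int i = 0; i < cores * 2; i++) {
            pool.submit(() -> {
                long sum = 0;
                for (long j = 0; j < 10_000_000L; j++) sum += j; // CPU-bound loop
                return sum;
            });
        }
        pool.shutdown();
    }
}
```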
CachedThreadPool
```java
public static ExecutorService newCachedThreadPool() {
    return new ThreadPoolExecutor(0, Integer.MAX_VALUE,
                                  60L, TimeUnit.SECONDS,
                                  new SynchronousQueue<Runnable>());
}
```
The number of core threads is 0 and the upper bound on the total thread count is Integer.MAX_VALUE, which effectively means an unlimited number of non-core threads can be created.
Execution flow:
- When a task is submitted, SynchronousQueue.offer is called to hand it off. If an idle thread in the pool is already waiting on SynchronousQueue.poll, the pairing succeeds and the task is given to that idle thread.
- Otherwise the pairing fails, and a new thread is created to handle the task.
- When a thread in the pool becomes idle, it calls SynchronousQueue.poll with a 60-second timeout, waiting for a newly submitted task. If no task arrives within 60 seconds, the idle thread is terminated.
[Figure: CachedThreadPool execution flow]
[Figure: CachedThreadPool structure]
Usage scenario: executing a large number of short-lived tasks. Because maximumPoolSize is effectively unbounded, whenever tasks are submitted faster than the existing threads can process them, new threads keep being created, so every submitted task gets a thread immediately. That makes CachedThreadPool suitable for handling a large number of small tasks.
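A short sketch of that scenario: many quick tasks submitted to a CachedThreadPool, which reuses idle workers where possible and spins up new ones only when none is free. The task count of 20 is arbitrary.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class CachedPoolDemo {
    public static void main(String[] args) {
        ExecutorService pool = Executors.newCachedThreadPool();
        // Many short-lived tasks: idle threads are reused when possible,
        // new ones are created only when no worker is free.
        for (int i = 1; i <= 20; i++) {
            int id = i;
            pool.execute(() ->
                    System.out.println("short task " + id + " on " + Thread.currentThread().getName()));
        }
        pool.shutdown();
    }
}
```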
ScheduledThreadPoolExecutor
```java
public static ScheduledExecutorService newScheduledThreadPool(int corePoolSize) {
    return new ScheduledThreadPoolExecutor(corePoolSize);
}

public ScheduledThreadPoolExecutor(int corePoolSize) {
    super(corePoolSize, Integer.MAX_VALUE, 0, NANOSECONDS,
          new DelayedWorkQueue());
}
```
The maximum thread count is Integer.MAX_VALUE, the work queue is a DelayedWorkQueue, and the keep-alive time for non-core threads is 0, so in practice the pool keeps only a fixed number of core threads.
There are two ways to submit a task:
- scheduleAtFixedRate: executes the task periodically at a fixed rate, measured from the start of one execution to the start of the next.
- scheduleWithFixedDelay: starts the next execution a fixed delay after the previous execution has finished.
See the periodic thread pool newScheduledThreadPool for more details
**Usage scenario:** executing tasks periodically while limiting the number of threads.
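A minimal sketch of both scheduling styles. The pool size of 2, the task names, and the 5-second periods are illustrative assumptions; the program keeps running until shutdown() is called on the pool.

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class ScheduledDemo {
    public static void main(String[] args) {
        ScheduledExecutorService pool = Executors.newScheduledThreadPool(2);

        // Fixed rate: runs every 5 seconds, measured from the start of each run.
        pool.scheduleAtFixedRate(
                () -> System.out.println("heartbeat"), 0, 5, TimeUnit.SECONDS);

        // Fixed delay: next run starts 5 seconds after the previous run finishes.
        pool.scheduleWithFixedDelay(
                () -> System.out.println("cleanup"), 0, 5, TimeUnit.SECONDS);

        // pool.shutdown() would stop both schedules.
    }
}
```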
**Interview question:** can using a thread pool with an unbounded queue cause memory to spike?
Answer: yes. newFixedThreadPool uses the unbounded queue LinkedBlockingQueue. If tasks take a long time to execute, new tasks pile up in the queue, the machine's memory usage spikes, and the process can eventually fail with an OutOfMemoryError (OOM).
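One common way to guard against this, sketched below on the assumption that losing some throughput is preferable to running out of memory, is to bound the work queue and choose an explicit saturation policy such as CallerRunsPolicy, which applies back-pressure to the submitting thread. All sizes here are illustrative.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class BoundedQueuePool {
    public static void main(String[] args) {
        // A bounded queue caps how many tasks can pile up in memory;
        // CallerRunsPolicy applies back-pressure by running the task
        // in the submitting thread once the queue and pool are full.
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                4, 8, 60L, TimeUnit.SECONDS,
                new ArrayBlockingQueue<>(1_000),
                new ThreadPoolExecutor.CallerRunsPolicy());

        pool.execute(() -> System.out.println("bounded pool task"));
        pool.shutdown();
    }
}
```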