Executor framework

Executor
The Executor framework uses a two-level scheduling model for tasks (based on the HotSpot VM thread model):
- At the upper level, Java multithreaded programs typically break the application into several tasks, and a user-level scheduler (the Executor framework) maps these tasks onto a fixed number of threads.
- At the lower level, the operating system kernel maps these threads onto the hardware processors.
Structure

The Executor framework consists mainly of three parts:
- Tasks. Includes the interfaces that a task to be executed needs to implement: the Runnable interface or the Callable interface.
- Task execution. Includes Executor, the core interface of the task-execution mechanism, and the ExecutorService interface, which extends Executor. The Executor framework has two key classes that implement the ExecutorService interface: ThreadPoolExecutor and ScheduledThreadPoolExecutor.
- Results of asynchronous computation. Includes the Future interface and the FutureTask class, which implements the Future interface.
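A minimal sketch of how the three parts fit together (the class name, pool size, and task body are made up for illustration): a task is written as a Callable, handed to an ExecutorService for execution, and its result is read back through a Future.

```java
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ExecutorDemo {
    public static void main(String[] args) throws Exception {
        // Task execution: a thread pool created through the Executors factory class.
        ExecutorService pool = Executors.newFixedThreadPool(2);

        // Task: a Callable that returns a result.
        Callable<Integer> task = () -> 1 + 1;

        // Result of the asynchronous computation: a Future (a FutureTask under the hood).
        Future<Integer> future = pool.submit(task);
        System.out.println(future.get()); // blocks until the result is available, prints 2

        pool.shutdown();
    }
}
```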
(Figure: classes and interfaces of the Executor framework)
Members of the Executor framework

- ThreadPoolExecutor: usually created with the Executors factory class (see the sketch after this list).
  - SingleThreadExecutor: uses a single thread. Applicable to tasks that must be executed sequentially, with no more than one thread active at any point in time.
  - FixedThreadPool: uses a fixed number of threads. Applicable to heavily loaded servers.
  - CachedThreadPool: creates new threads as needed and is unbounded in size. Applicable to small programs that execute many short-lived asynchronous tasks, or to lightly loaded servers.
- ScheduledThreadPoolExecutor: usually created with the Executors factory class.
  - A ScheduledThreadPoolExecutor containing several threads: applicable when multiple background threads are needed to execute periodic tasks and the number of background threads must be limited to meet resource-management requirements.
  - A ScheduledThreadPoolExecutor containing a single thread: applicable when a single background thread is needed to execute periodic tasks sequentially.
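A short sketch of creating each member through the Executors factory class (the thread counts are placeholder values):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;

public class ExecutorsFactoryDemo {
    public static void main(String[] args) {
        // ThreadPoolExecutor variants
        ExecutorService single = Executors.newSingleThreadExecutor(); // sequential execution, one thread
        ExecutorService fixed  = Executors.newFixedThreadPool(4);     // fixed number of threads
        ExecutorService cached = Executors.newCachedThreadPool();     // grows on demand

        // ScheduledThreadPoolExecutor variants
        ScheduledExecutorService scheduledMulti  = Executors.newScheduledThreadPool(4);
        ScheduledExecutorService scheduledSingle = Executors.newSingleThreadScheduledExecutor();

        single.shutdown();
        fixed.shutdown();
        cached.shutdown();
        scheduledMulti.shutdown();
        scheduledSingle.shutdown();
    }
}
```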
The Future interface
- FutureTask is the implementation class of Future; it represents the result of an asynchronous computation.

The Runnable and Callable interfaces
- Runnable does not return a result; Callable can return a result (see the sketch below).
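A minimal sketch of the difference (the class name and task bodies are arbitrary): a Runnable produces no result, while a Callable wrapped in a FutureTask makes its result available through the Future interface.

```java
import java.util.concurrent.Callable;
import java.util.concurrent.FutureTask;

public class RunnableVsCallable {
    public static void main(String[] args) throws Exception {
        // Runnable: performs work but returns no result.
        Runnable runnable = () -> System.out.println("side effect only");
        new Thread(runnable).start();

        // Callable: returns a result. FutureTask implements both Runnable and Future,
        // so it can be run by a Thread (or a thread pool) and queried for its result.
        Callable<String> callable = () -> "result";
        FutureTask<String> futureTask = new FutureTask<>(callable);
        new Thread(futureTask).start();
        System.out.println(futureTask.get()); // blocks until the Callable completes
    }
}
```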
ThreadPoolExecutor breakdown

- Four components (mapped to constructor parameters in the sketch after this list):
  - corePool: the size of the core thread pool.
  - maximumPool: the maximum size of the thread pool.
  - BlockingQueue: a work queue used to temporarily hold tasks.
  - RejectedExecutionHandler: the handler called by the execute() method when the ThreadPoolExecutor has been shut down or is saturated (the maximum pool size has been reached and the work queue is full).
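A hedged sketch of how the four components appear in the ThreadPoolExecutor constructor (all sizes, the keep-alive time, the queue choice, and the rejection policy below are example values, not recommendations):

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class ThreadPoolExecutorDemo {
    public static void main(String[] args) {
        BlockingQueue<Runnable> workQueue = new ArrayBlockingQueue<>(100); // temporarily holds tasks

        ThreadPoolExecutor executor = new ThreadPoolExecutor(
                4,                                      // corePool: core pool size
                8,                                      // maximumPool: maximum pool size
                60L, TimeUnit.SECONDS,                  // keep-alive time for threads above the core size
                workQueue,                              // BlockingQueue: the work queue
                new ThreadPoolExecutor.AbortPolicy());  // RejectedExecutionHandler: used by execute()
                                                        // when the pool is shut down or saturated

        executor.execute(() -> System.out.println("task running"));
        executor.shutdown();
    }
}
```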
- Three kinds of ThreadPoolExecutor (see the sketch after this list):
  - FixedThreadPool: a reusable thread pool with a fixed number of threads. It uses the unbounded queue LinkedBlockingQueue as the thread pool's work queue (the queue's capacity is Integer.MAX_VALUE).
  - SingleThreadExecutor: an Executor that uses a single worker thread. It also uses the unbounded queue LinkedBlockingQueue as the thread pool's work queue (capacity Integer.MAX_VALUE).
  - CachedThreadPool: a thread pool that creates new threads as needed. It uses SynchronousQueue, a queue with no capacity, as the thread pool's work queue.
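Based on the work queues named above, the pools returned by Executors correspond roughly to the following ThreadPoolExecutor configurations. This is a sketch of the idea, not the literal JDK code; consult the JDK source for the exact keep-alive values and the wrapper returned by newSingleThreadExecutor.

```java
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.SynchronousQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class FactoryEquivalents {
    // FixedThreadPool: corePool == maximumPool == nThreads, unbounded LinkedBlockingQueue.
    static ThreadPoolExecutor fixed(int nThreads) {
        return new ThreadPoolExecutor(nThreads, nThreads,
                0L, TimeUnit.MILLISECONDS,
                new LinkedBlockingQueue<>());
    }

    // SingleThreadExecutor: a fixed pool of one thread, unbounded LinkedBlockingQueue.
    static ThreadPoolExecutor single() {
        return new ThreadPoolExecutor(1, 1,
                0L, TimeUnit.MILLISECONDS,
                new LinkedBlockingQueue<>());
    }

    // CachedThreadPool: no core threads, an effectively unbounded maximum,
    // and a SynchronousQueue with no capacity as the work queue.
    static ThreadPoolExecutor cached() {
        return new ThreadPoolExecutor(0, Integer.MAX_VALUE,
                60L, TimeUnit.SECONDS,
                new SynchronousQueue<>());
    }

    public static void main(String[] args) {
        ThreadPoolExecutor pool = fixed(4);
        pool.execute(() -> System.out.println("running on a fixed pool"));
        pool.shutdown();
    }
}
```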
ScheduledThreadPoolExecutor breakdown

- ScheduledThreadPoolExecutor inherits from ThreadPoolExecutor. It is mainly used to run tasks after a given delay, or to execute tasks periodically.
- Execution in a ScheduledThreadPoolExecutor is mainly divided into two steps:
  - When the scheduleAtFixedRate() or scheduleWithFixedDelay() method is called, a ScheduledFutureTask (an implementation of the RunnableScheduledFuture interface) is added to the DelayQueue of the ScheduledThreadPoolExecutor.
  - A thread in the thread pool takes the ScheduledFutureTask from the DelayQueue and executes the task.
(Figure: the operation mechanism of ScheduledThreadPoolExecutor)
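A brief sketch of scheduling a periodic task (the class name, delay, period, and pool size are arbitrary example values):

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class ScheduledDemo {
    public static void main(String[] args) {
        ScheduledExecutorService scheduler = Executors.newScheduledThreadPool(1);

        // Internally, this call wraps the task in a ScheduledFutureTask and adds it to the DelayQueue;
        // a pool thread takes it from the DelayQueue when its delay expires and runs it.
        scheduler.scheduleAtFixedRate(
                () -> System.out.println("periodic task"),
                1, 5, TimeUnit.SECONDS); // initial delay of 1s, then every 5s

        // scheduleWithFixedDelay(...) is the variant that measures the period
        // from the end of one run to the start of the next.
        // The scheduler keeps running until shutdown() is called.
    }
}
```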
Blocking queues in Java
- JDK 7 provides seven blocking queues:
  - ArrayBlockingQueue: a bounded blocking queue backed by an array.
  - LinkedBlockingQueue: a bounded blocking queue backed by a linked list.
  - PriorityBlockingQueue: an unbounded blocking queue that supports priority ordering.
  - DelayQueue: an unbounded blocking queue implemented using a priority queue.
  - SynchronousQueue: a blocking queue that does not store elements.
  - LinkedTransferQueue: an unbounded blocking queue backed by a linked list.
  - LinkedBlockingDeque: a double-ended blocking queue backed by a linked list.
- ArrayBlockingQueue: an array-backed bounded blocking queue. By default, threads do not access the queue fairly; fairness is implemented with a reentrant lock:
```java
public ArrayBlockingQueue(int capacity, boolean fair) {
    if (capacity <= 0)
        throw new IllegalArgumentException();
    this.items = new Object[capacity];
    lock = new ReentrantLock(fair);
    notEmpty = lock.newCondition();
    notFull = lock.newCondition();
}
```
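A small producer/consumer sketch using ArrayBlockingQueue (the capacity and items are arbitrary): put() blocks when the queue is full, take() blocks when it is empty.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class BoundedQueueDemo {
    public static void main(String[] args) throws InterruptedException {
        // Bounded queue with capacity 2; 'true' requests fair (FIFO) access for blocked threads.
        BlockingQueue<String> queue = new ArrayBlockingQueue<>(2, true);

        Thread producer = new Thread(() -> {
            try {
                queue.put("a");
                queue.put("b");
                queue.put("c"); // blocks here until a consumer takes an element
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        producer.start();

        System.out.println(queue.take()); // "a"
        System.out.println(queue.take()); // "b"
        System.out.println(queue.take()); // "c"
        producer.join();
    }
}
```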
- LinkedBlockingQueue: a bounded blocking queue backed by a linked list; its default capacity is Integer.MAX_VALUE.
- PriorityBlockingQueue: an unbounded blocking queue that supports priorities. By default, elements are ordered in ascending natural order.
- DelayQueue: an unbounded blocking queue that supports delayed retrieval of elements, implemented on top of PriorityQueue.
  - Application scenarios:
    - Cache system design: use a DelayQueue to hold the validity periods of cached elements, and have a thread poll the DelayQueue; as soon as an element can be taken from the DelayQueue, its cache validity period has expired (a sketch of this idea follows below).
    - Scheduled task scheduling: use a DelayQueue to hold the tasks to be executed that day along with their execution times; once a task is obtained from the DelayQueue, execute it. TimerQueue, for example, is implemented using DelayQueue.
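A sketch of the cache-expiry idea (the class, field names, and timing are made up for illustration): each cached item implements Delayed with its expiry time, so a take() on the DelayQueue only returns once that time has passed.

```java
import java.util.concurrent.DelayQueue;
import java.util.concurrent.Delayed;
import java.util.concurrent.TimeUnit;

public class CacheExpiryDemo {
    // Hypothetical cache entry: becomes available from the DelayQueue once its validity period ends.
    static class ExpiringKey implements Delayed {
        final String key;
        final long expireAtMillis;

        ExpiringKey(String key, long ttlMillis) {
            this.key = key;
            this.expireAtMillis = System.currentTimeMillis() + ttlMillis;
        }

        @Override
        public long getDelay(TimeUnit unit) {
            return unit.convert(expireAtMillis - System.currentTimeMillis(), TimeUnit.MILLISECONDS);
        }

        @Override
        public int compareTo(Delayed other) {
            return Long.compare(getDelay(TimeUnit.MILLISECONDS), other.getDelay(TimeUnit.MILLISECONDS));
        }
    }

    public static void main(String[] args) throws InterruptedException {
        DelayQueue<ExpiringKey> queue = new DelayQueue<>();
        queue.put(new ExpiringKey("user:42", 1000)); // expires in about 1 second

        // take() blocks until the head element's delay has elapsed, i.e. the cache entry has expired.
        ExpiringKey expired = queue.take();
        System.out.println("expired: " + expired.key);
    }
}
```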
- SynchronousQueue: a blocking queue that does not store elements. It supports fair access to the queue, but by default threads access it using a non-fair policy.
- LinkedTransferQueue: an unbounded blocking TransferQueue backed by a linked list.
  - Compared with other blocking queues, it additionally provides the transfer and tryTransfer methods (see the sketch below):
    - transfer: if a consumer is waiting to receive an element, the transfer method hands the element passed in by the producer to that consumer immediately. If no consumer is waiting, the element is stored at the tail node of the queue, and the method does not return until the element has been consumed.
    - tryTransfer: tests whether the element can be handed directly to a waiting consumer; if no consumer is waiting, it returns false immediately.
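A short sketch of the two methods (the class name and timings are arbitrary): tryTransfer returns false immediately when no consumer is waiting, while transfer blocks until a consumer takes the element.

```java
import java.util.concurrent.LinkedTransferQueue;
import java.util.concurrent.TimeUnit;

public class TransferDemo {
    public static void main(String[] args) throws InterruptedException {
        LinkedTransferQueue<String> queue = new LinkedTransferQueue<>();

        // No consumer is waiting yet, so tryTransfer fails immediately and the element is not enqueued.
        System.out.println(queue.tryTransfer("x")); // false

        // Start a consumer that will take an element shortly.
        Thread consumer = new Thread(() -> {
            try {
                TimeUnit.MILLISECONDS.sleep(100);
                System.out.println("consumed: " + queue.take());
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        consumer.start();

        // transfer blocks until the element has been handed to (and taken by) a consumer.
        queue.transfer("y");
        System.out.println("transfer returned after the consumer took the element");
        consumer.join();
    }
}
```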
- LinkedBlockingDeque: a double-ended blocking queue backed by a linked list.