• Executor framework

  • Two-level scheduling model of the Executor framework (based on HotSpot)

    • At the upper level, Java multithreaded programs typically break the application into several tasks and then use a user-level scheduler (the Executor framework) to map these tasks onto a fixed number of threads;
    • Underneath, the operating system kernel maps these threads onto the hardware processors.

    (Figure: a two-level scheduling model for tasks)

  • Structure
    • Three main parts
      • Tasks. The interfaces a task to be executed needs to implement: the Runnable interface or the Callable interface.
      • Task execution. Includes Executor, the core interface of the task execution mechanism, and the ExecutorService interface that extends Executor. The Executor framework has two key classes that implement ExecutorService (ThreadPoolExecutor and ScheduledThreadPoolExecutor).
      • The result of asynchronous computation. Includes the Future interface and the FutureTask class that implements Future.

    (Figure: classes and interfaces of the Executor framework)
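
A minimal sketch (not part of the original notes) of how the three parts fit together: a Callable task is wrapped in a FutureTask, handed to an ExecutorService for execution, and the result is read back through the Future API. The class and variable names are illustrative.

import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.FutureTask;

public class ExecutorStructureDemo {
    public static void main(String[] args) throws Exception {
        // Task: a Callable (a Runnable could be used instead, but returns no result)
        Callable<Integer> task = () -> 6 * 7;

        // Result of asynchronous computation: FutureTask implements the Future interface
        FutureTask<Integer> futureTask = new FutureTask<>(task);

        // Task execution: an ExecutorService (a ThreadPoolExecutor under the hood)
        ExecutorService executor = Executors.newSingleThreadExecutor();
        executor.execute(futureTask);            // FutureTask is also a Runnable

        System.out.println(futureTask.get());    // blocks until the result is available
        executor.shutdown();
    }
}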

  • Members of the Executor framework
    • ThreadPoolExecutor: usually created with the Executors factory class.
      • SingleThreadExecutor
        • Uses a single thread; suitable for tasks that need to execute sequentially, with no more than one thread active at any point in time.
      • FixedThreadPool
        • Uses a fixed number of threads; suitable for heavily loaded servers.
      • CachedThreadPool
        • Creates new threads as needed and is unbounded in size; suitable for small programs that run many short-lived asynchronous tasks, or for lightly loaded servers.
    • ScheduledThreadPoolExecutor: usually created with the Executors factory class.
      • A ScheduledThreadPoolExecutor containing several threads.
        • Created with a fixed number of threads; suitable for scenarios that need several background threads to execute periodic tasks while limiting the number of background threads to meet resource management requirements.
      • A ScheduledThreadPoolExecutor containing only a single thread.
        • Created with a single thread; suitable for scenarios that need a single background thread to execute periodic tasks in sequence.
    • Future interface
      • FutureTask is the implementation class that represents the result of an asynchronous computation.
    • Runnable interface and Callable interface
      • Runnable returns no result.
      • Callable can return a result.
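
For illustration, a hedged sketch of how the members listed above are typically obtained through the Executors factory class; the task bodies and the sleep at the end are placeholders.

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class ExecutorsFactoryDemo {
    public static void main(String[] args) throws Exception {
        ExecutorService single = Executors.newSingleThreadExecutor();   // SingleThreadExecutor
        ExecutorService fixed  = Executors.newFixedThreadPool(4);       // FixedThreadPool
        ExecutorService cached = Executors.newCachedThreadPool();       // CachedThreadPool
        ScheduledExecutorService scheduled =
                Executors.newScheduledThreadPool(2);                    // ScheduledThreadPoolExecutor

        single.execute(() -> System.out.println("sequential task"));
        fixed.submit(() -> System.out.println("fixed-pool task"));
        cached.submit(() -> System.out.println("short-lived task"));
        scheduled.schedule(() -> System.out.println("delayed task"), 1, TimeUnit.SECONDS);

        TimeUnit.SECONDS.sleep(2);  // crude wait so the delayed task runs before shutdown
        single.shutdown();
        fixed.shutdown();
        cached.shutdown();
        scheduled.shutdown();
    }
}
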
  • ThreadPoolExecutor in detail
  • Four components
    • corePool: Size of the core thread pool.
    • maximumPool: Size of the maximum thread pool.
    • BlockingQueue: a work queue used to temporarily save tasks.
    • RejectedExecutionHandler: the handler that the execute() method calls when the ThreadPoolExecutor has been shut down or is saturated (the maximum pool size is reached and the work queue is full).
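
A hedged sketch of how the four components above map onto the ThreadPoolExecutor constructor; the concrete sizes, queue type, and the AbortPolicy handler are illustrative choices, not values from the notes.

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class ThreadPoolExecutorDemo {
    public static void main(String[] args) {
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                2,                                   // corePool: core thread pool size
                4,                                   // maximumPool: maximum thread pool size
                60L, TimeUnit.SECONDS,               // keep-alive time for idle non-core threads
                new ArrayBlockingQueue<>(10),        // BlockingQueue: work queue holding pending tasks
                new ThreadPoolExecutor.AbortPolicy() // RejectedExecutionHandler: called when saturated or shut down
        );

        pool.execute(() -> System.out.println("task running in " + Thread.currentThread().getName()));
        pool.shutdown();
    }
}
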
  • Three kinds of ThreadPoolExecutor
    • FixedThreadPool
      • A reusable thread pool with a fixed number of threads.
      • Uses the unbounded queue LinkedBlockingQueue as the thread pool's work queue (the queue's capacity is Integer.MAX_VALUE).
    • SingleThreadExecutor
      • An Executor that uses a single worker thread.
      • Uses the unbounded queue LinkedBlockingQueue as the thread pool's work queue (the queue's capacity is Integer.MAX_VALUE).
    • CachedThreadPool
      • A thread pool that creates new threads as needed.
      • Uses the capacity-less SynchronousQueue as the thread pool's work queue.
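
Roughly what the three factory methods configure, sketched from the descriptions above; this is simplified relative to the real JDK implementation and shown only to highlight the queue choices.

import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.SynchronousQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class FactoryConfigurationsSketch {
    // FixedThreadPool: fixed number of threads, unbounded LinkedBlockingQueue
    static ThreadPoolExecutor fixed(int nThreads) {
        return new ThreadPoolExecutor(nThreads, nThreads,
                0L, TimeUnit.MILLISECONDS,
                new LinkedBlockingQueue<>());        // capacity Integer.MAX_VALUE
    }

    // SingleThreadExecutor: one worker thread, unbounded LinkedBlockingQueue
    static ThreadPoolExecutor single() {
        return fixed(1);                             // simplified: the JDK also wraps it so it cannot be reconfigured
    }

    // CachedThreadPool: threads created on demand, SynchronousQueue holds no elements
    static ThreadPoolExecutor cached() {
        return new ThreadPoolExecutor(0, Integer.MAX_VALUE,
                60L, TimeUnit.SECONDS,
                new SynchronousQueue<>());
    }
}
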
  • ScheduledThreadPoolExecutor in detail
    • ScheduledThreadPoolExecutor inherits from ThreadPoolExecutor. It is mainly used to run tasks after a given delay, or to execute tasks periodically.
    • The execution of ScheduledThreadPoolExecutor is divided mainly into two steps:
      • When ScheduledThreadPoolExecutor's scheduleAtFixedRate() or scheduleWithFixedDelay() method is called, a ScheduledFutureTask (an implementation of the RunnableScheduledFuture interface) is added to ScheduledThreadPoolExecutor's DelayQueue.
      • A thread in the thread pool retrieves the ScheduledFutureTask from the DelayQueue and executes the task.

(Figure: ScheduledThreadPoolExecutor operation mechanism)
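
A short hedged example of the two scheduling methods mentioned above; the periods and task bodies are arbitrary.

import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class ScheduledDemo {
    public static void main(String[] args) throws InterruptedException {
        ScheduledExecutorService scheduler = Executors.newScheduledThreadPool(2);

        // Internally, each call wraps the task in a ScheduledFutureTask and adds it to the DelayQueue
        scheduler.scheduleAtFixedRate(
                () -> System.out.println("fixed-rate tick"), 0, 500, TimeUnit.MILLISECONDS);
        scheduler.scheduleWithFixedDelay(
                () -> System.out.println("fixed-delay tick"), 0, 500, TimeUnit.MILLISECONDS);

        TimeUnit.SECONDS.sleep(2);   // let a few periods elapse
        scheduler.shutdown();
    }
}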


  • Blocking queues in Java

    • JDK 7 provides seven blocking queues
      • ArrayBlockingQueue: A bounded blocking queue composed of array structures.
      • LinkedBlockingQueue: A bounded blocking queue consisting of a linked list structure.
      • PriorityBlockingQueue: An unbounded blocking queue that supports priority sorting.
      • DelayQueue: an unbounded blocking queue implemented using a priority queue.
      • SynchronousQueue: A blocking queue that does not store elements.
      • LinkedTransferQueue: An unbounded blocking queue consisting of a linked list structure.
      • LinkedBlockingDeque: A bidirectional blocking queue consisting of a linked list structure.
  • ArrayBlockingQueue: a bounded blocking queue backed by an array. By default threads do not access the queue fairly; fairness is implemented with a reentrant lock.

public ArrayBlockingQueue(int capacity, boolean fair) {
    if (capacity <= 0)
        throw new IllegalArgumentException();
    this.items = new Object[capacity];   // backing array
    lock = new ReentrantLock(fair);      // fair flag controls lock fairness
    notEmpty = lock.newCondition();      // signalled when an element is added
    notFull = lock.newCondition();       // signalled when an element is removed
}
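
A small usage sketch (not from the source) showing the blocking put/take behaviour of a fair ArrayBlockingQueue with capacity 2:

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class ArrayBlockingQueueDemo {
    public static void main(String[] args) throws InterruptedException {
        // capacity 2, fair = true: waiting threads are served in FIFO order
        BlockingQueue<String> queue = new ArrayBlockingQueue<>(2, true);

        Thread producer = new Thread(() -> {
            try {
                queue.put("a");
                queue.put("b");
                queue.put("c");   // blocks until the consumer takes an element
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        producer.start();
        Thread.sleep(100);                      // let the producer fill the queue
        System.out.println(queue.take());       // "a"; unblocks the producer
        System.out.println(queue.take());       // "b"
        System.out.println(queue.take());       // "c"
        producer.join();
    }
}
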
  • LinkedBlockingQueue: a bounded blocking queue backed by a linked list; the default capacity is Integer.MAX_VALUE.
  • PriorityBlockingQueue: an unbounded blocking queue that supports priorities. By default, elements are sorted in ascending natural order.
  • DelayQueue: an unbounded blocking queue that supports delayed retrieval of elements, implemented on top of PriorityQueue.
    • Application Scenarios:
      • Cache system design: a DelayQueue can hold cached elements together with their expiry times, with one thread polling the DelayQueue; once an element can be taken from the DelayQueue, its cache entry has expired.
      • Scheduled task execution: a DelayQueue holds the tasks to be executed that day along with their execution times; once a task is taken from the DelayQueue, it is executed. TimerQueue, for example, is implemented with a DelayQueue.
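
A hedged sketch of the delayed-retrieval idea: a hypothetical ExpiringEntry element implements Delayed, and take() only returns it once its delay has elapsed.

import java.util.concurrent.DelayQueue;
import java.util.concurrent.Delayed;
import java.util.concurrent.TimeUnit;

public class DelayQueueDemo {
    // A hypothetical cache entry that expires after a given time-to-live
    static class ExpiringEntry implements Delayed {
        final String key;
        final long expireAtNanos;

        ExpiringEntry(String key, long ttlMillis) {
            this.key = key;
            this.expireAtNanos = System.nanoTime() + TimeUnit.MILLISECONDS.toNanos(ttlMillis);
        }

        @Override
        public long getDelay(TimeUnit unit) {
            return unit.convert(expireAtNanos - System.nanoTime(), TimeUnit.NANOSECONDS);
        }

        @Override
        public int compareTo(Delayed other) {
            return Long.compare(getDelay(TimeUnit.NANOSECONDS), other.getDelay(TimeUnit.NANOSECONDS));
        }
    }

    public static void main(String[] args) throws InterruptedException {
        DelayQueue<ExpiringEntry> queue = new DelayQueue<>();
        queue.put(new ExpiringEntry("session-2", 1000));
        queue.put(new ExpiringEntry("session-1", 500));

        // take() blocks until the head element's delay has expired
        System.out.println("expired: " + queue.take().key);  // session-1 after ~500 ms
        System.out.println("expired: " + queue.take().key);  // session-2 after ~1000 ms
    }
}
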
  • SynchronousQueue: a blocking queue that does not store elements. It supports fair access; by default threads access the queue using a non-fair policy.
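
A minimal handoff sketch, assuming fair mode is chosen explicitly (the no-argument constructor uses the non-fair policy):

import java.util.concurrent.SynchronousQueue;

public class SynchronousQueueDemo {
    public static void main(String[] args) throws InterruptedException {
        // true = fair mode; the default constructor uses the non-fair policy
        SynchronousQueue<String> queue = new SynchronousQueue<>(true);

        Thread consumer = new Thread(() -> {
            try {
                System.out.println("received: " + queue.take());
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        consumer.start();

        queue.put("hello");   // blocks until the consumer takes the element; nothing is stored
        consumer.join();
    }
}
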
  • LinkedTransferQueue: an unbounded blocking TransferQueue backed by a linked list.
    • Compared with other blocking queues, it adds the transfer and tryTransfer methods.
      • transfer: if a consumer is waiting to receive an element, the method hands the producer's element to that consumer immediately; if no consumer is waiting, the element is stored at the tail of the queue and the method does not return until the element has been consumed.
      • tryTransfer: as above, it tests whether the element can be handed directly to a consumer, but if no consumer is waiting it returns false immediately.
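
A brief sketch of transfer versus tryTransfer; the timing values are arbitrary.

import java.util.concurrent.LinkedTransferQueue;
import java.util.concurrent.TimeUnit;

public class LinkedTransferQueueDemo {
    public static void main(String[] args) throws InterruptedException {
        LinkedTransferQueue<String> queue = new LinkedTransferQueue<>();

        // No consumer is waiting yet, so tryTransfer returns false immediately
        System.out.println("tryTransfer: " + queue.tryTransfer("ignored"));

        Thread consumer = new Thread(() -> {
            try {
                TimeUnit.MILLISECONDS.sleep(200);
                System.out.println("received: " + queue.take());
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        consumer.start();

        // transfer blocks until the element has been handed to a consumer
        queue.transfer("handed-off");
        consumer.join();
    }
}
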
  • LinkedBlockingDeque: a bidirectional (double-ended) blocking queue backed by a linked list.