Original link: www.jianshu.com Author: Sun Qiang Jimmy

Anyone familiar with Java multithreaded programming knows that creating too many threads can easily cause a memory overflow, which is why thread pool technology is necessary.

Contents

1 Advantages of thread pools

2 Use of thread pools

3 How thread pools work

4 Thread pool parameters

4.1 workQueue

4.2 threadFactory

4.3 Rejection policy (handler)

5 Functional thread pools

5.1 Fixed-length thread pool (FixedThreadPool)

5.2 Scheduled thread pool (ScheduledThreadPool)

5.3 Cacheable thread pool (CachedThreadPool)

5.4 Single-threaded thread pool (SingleThreadExecutor)

5.5 Comparison

6 Summary

References

1 Advantages of thread pools

In general, thread pools have the following advantages:

(1) Reduce resource consumption. Reusing already-created threads lowers the cost of thread creation and destruction.

(2) Improve response speed. When a task arrives, it can be executed immediately without waiting for a new thread to be created.

(3) Improve thread manageability. Threads are a scarce resource; creating them without limit not only consumes system resources but also reduces system stability. A thread pool allows threads to be allocated, tuned, and monitored in a unified way.

2 Use of thread pools

The real implementation class for a thread pool is ThreadPoolExecutor, which has four constructors:

```java
public ThreadPoolExecutor(int corePoolSize, int maximumPoolSize, long keepAliveTime,
                          TimeUnit unit, BlockingQueue<Runnable> workQueue) {
    this(corePoolSize, maximumPoolSize, keepAliveTime, unit, workQueue,
         Executors.defaultThreadFactory(), defaultHandler);
}

public ThreadPoolExecutor(int corePoolSize, int maximumPoolSize, long keepAliveTime,
                          TimeUnit unit, BlockingQueue<Runnable> workQueue,
                          ThreadFactory threadFactory) {
    this(corePoolSize, maximumPoolSize, keepAliveTime, unit, workQueue,
         threadFactory, defaultHandler);
}

public ThreadPoolExecutor(int corePoolSize, int maximumPoolSize, long keepAliveTime,
                          TimeUnit unit, BlockingQueue<Runnable> workQueue,
                          RejectedExecutionHandler handler) {
    this(corePoolSize, maximumPoolSize, keepAliveTime, unit, workQueue,
         Executors.defaultThreadFactory(), handler);
}

public ThreadPoolExecutor(int corePoolSize, int maximumPoolSize, long keepAliveTime,
                          TimeUnit unit, BlockingQueue<Runnable> workQueue,
                          ThreadFactory threadFactory, RejectedExecutionHandler handler) {
    if (corePoolSize < 0 ||
        maximumPoolSize <= 0 ||
        maximumPoolSize < corePoolSize ||
        keepAliveTime < 0)
        throw new IllegalArgumentException();
    if (workQueue == null || threadFactory == null || handler == null)
        throw new NullPointerException();
    this.corePoolSize = corePoolSize;
    this.maximumPoolSize = maximumPoolSize;
    this.workQueue = workQueue;
    this.keepAliveTime = unit.toNanos(keepAliveTime);
    this.threadFactory = threadFactory;
    this.handler = handler;
}
```

As you can see, it takes the following parameters:

  • corePoolSize (required): the number of core threads. By default core threads stay alive forever, but if allowCoreThreadTimeOut is set to true they are also reclaimed after the idle timeout.
  • maximumPoolSize (required): the maximum number of threads the pool can hold. Once the number of active threads reaches this value and the work queue is full, new tasks are handed to the rejection policy.
  • keepAliveTime (required): the idle timeout for threads. A non-core thread idle for longer than this is reclaimed; if allowCoreThreadTimeOut is true, core threads time out as well.
  • unit (required): the time unit for keepAliveTime. TimeUnit.MILLISECONDS, TimeUnit.SECONDS, and TimeUnit.MINUTES are commonly used.
  • workQueue (required): the task queue. Runnable tasks submitted through the pool's execute() method are held here; it is a BlockingQueue implementation.
  • threadFactory (optional): the thread factory, which specifies how the pool creates new threads.
  • handler (optional): the rejection policy, i.e. the saturation strategy applied when the pool and the work queue are both full.
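
For illustration, here is a minimal sketch that constructs a pool with all seven parameters spelled out; the sizes, queue capacity, and policy below are arbitrary example values, not recommendations:

```java
import java.util.concurrent.*;

public class PoolConfigExample {
    public static void main(String[] args) {
        // Illustrative sizes chosen only to show each parameter in place.
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                2,                                       // corePoolSize
                4,                                       // maximumPoolSize
                60L, TimeUnit.SECONDS,                   // keepAliveTime + unit
                new ArrayBlockingQueue<Runnable>(10),    // workQueue (bounded)
                Executors.defaultThreadFactory(),        // threadFactory
                new ThreadPoolExecutor.AbortPolicy());   // handler (rejection policy)

        pool.execute(() -> System.out.println("task running on " + Thread.currentThread().getName()));
        pool.shutdown();
    }
}
```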

The thread pool usage flow is as follows:

```java
ThreadPoolExecutor threadPool = new ThreadPoolExecutor(CORE_POOL_SIZE, MAXIMUM_POOL_SIZE,
        KEEP_ALIVE, TimeUnit.SECONDS, sPoolWorkQueue, sThreadFactory);

// Submit a task to the thread pool
threadPool.execute(new Runnable() {
    @Override
    public void run() {
        ... // the work performed by the thread
    }
});

// Shut down the thread pool
threadPool.shutdown();    // sets the pool state to SHUTDOWN and interrupts threads that are not executing tasks
threadPool.shutdownNow(); // sets the pool state to STOP, attempts to stop all running or suspended tasks, and returns the list of tasks still waiting to run
```
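
In practice, shutdown() is often paired with awaitTermination() so the program waits for queued tasks to finish before forcing a stop. A minimal sketch, assuming a 60-second grace period (an arbitrary choice):

```java
// Graceful shutdown: stop accepting new tasks, then wait for queued tasks to complete.
threadPool.shutdown();
try {
    if (!threadPool.awaitTermination(60, TimeUnit.SECONDS)) {
        threadPool.shutdownNow(); // still not finished: force-stop the remaining tasks
    }
} catch (InterruptedException e) {
    threadPool.shutdownNow();
    Thread.currentThread().interrupt();
}
```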

3 How thread pools work

Let's walk through how a thread pool handles a submitted task; this also makes the parameters easier to understand. When execute() is called, the pool proceeds as follows:

  1. If the number of running threads is less than corePoolSize, a new core thread is created to run the task.
  2. Otherwise, the task is placed in the workQueue and waits for a thread to become free.
  3. If the queue is full and the number of running threads is less than maximumPoolSize, a new non-core thread is created to run the task.
  4. If the queue is full and the number of threads has already reached maximumPoolSize, the rejection policy (handler) is applied.

With this flow in mind, the task queue, the thread factory, and the rejection policy deserve a closer look.
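
The sketch below exercises this order with deliberately small, arbitrary sizes (core = 2, max = 3, queue capacity = 2): the first two tasks get core threads, the next two wait in the queue, the fifth triggers a non-core thread, and the sixth is rejected:

```java
import java.util.concurrent.*;

public class PoolFlowDemo {
    public static void main(String[] args) {
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                2, 3, 30L, TimeUnit.SECONDS,
                new ArrayBlockingQueue<Runnable>(2)); // core=2, max=3, queue capacity=2

        Runnable longTask = () -> {
            try { Thread.sleep(5000); } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
        };

        for (int i = 1; i <= 6; i++) {
            try {
                pool.execute(longTask);
                System.out.println("task " + i + ": pool size = " + pool.getPoolSize()
                        + ", queued = " + pool.getQueue().size());
            } catch (RejectedExecutionException e) {
                System.out.println("task " + i + " rejected"); // queue full and max threads reached
            }
        }
        pool.shutdown();
    }
}
```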

4 Thread pool parameters

4.1 workQueue

The task queue is based on a blocking queue and follows the producer–consumer pattern; in Java this is the BlockingQueue interface. The JDK already provides seven commonly used implementations:

  1. ArrayBlockingQueue: a bounded blocking queue backed by an array, implemented as a circular queue with head and tail pointers.
  2. LinkedBlockingQueue: a blocking queue backed by a linked list; a capacity may be given, but if none is specified it defaults to Integer.MAX_VALUE, making it effectively unbounded.
  3. PriorityBlockingQueue: an unbounded blocking queue that supports priority ordering. Elements must either implement Comparable or be ordered by a Comparator supplied to the queue. Ordering has nothing to do with arrival time; it simply prioritizes tasks.
  4. DelayQueue: similar to PriorityBlockingQueue, an unbounded blocking queue backed by a priority heap. Elements must implement the Delayed interface, and a task can only be taken from the queue once its delay has expired.
  5. SynchronousQueue: a blocking queue that stores no elements. A consumer thread calling take() blocks until a producer puts an element, which the consumer then receives and returns with; likewise, a producer calling put() blocks until a consumer takes the element.
  6. LinkedBlockingDeque: an optionally bounded, double-ended blocking queue backed by a doubly linked list. Being double-ended, it can be used FIFO (first in, first out) like an ordinary queue or FILO (first in, last out) like a stack.
  7. LinkedTransferQueue: an unbounded blocking queue that combines features of ConcurrentLinkedQueue, LinkedBlockingQueue, and SynchronousQueue; when used inside a ThreadPoolExecutor it behaves like an unbounded LinkedBlockingQueue.

Note the difference between bounded and unbounded queues: with a bounded queue, the rejection policy is executed once the queue is full and the number of threads has reached maximumPoolSize. With an unbounded queue, setting maximumPoolSize is effectively meaningless, because tasks can always be added to the queue and extra threads are never created.
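
To see the unbounded case concretely, here is a small sketch assuming a default LinkedBlockingQueue: although maximumPoolSize is 4, the pool never grows past its single core thread, because every extra task simply queues up:

```java
import java.util.concurrent.*;

public class UnboundedQueueDemo {
    public static void main(String[] args) throws InterruptedException {
        // Unbounded queue: maximumPoolSize (4) is never reached.
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                1, 4, 30L, TimeUnit.SECONDS, new LinkedBlockingQueue<Runnable>());

        for (int i = 0; i < 10; i++) {
            pool.execute(() -> {
                try { Thread.sleep(500); } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
            });
        }
        Thread.sleep(100);
        System.out.println("pool size = " + pool.getPoolSize());        // stays at 1
        System.out.println("queued tasks = " + pool.getQueue().size()); // the rest wait in the queue
        pool.shutdownNow();
    }
}
```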

4.2 threadFactory

A thread factory specifies how the pool creates threads; it implements the ThreadFactory interface and its **newThread(Runnable r)** method. The Executors class already provides a default thread factory:

```java
/** The default thread factory. */
private static class DefaultThreadFactory implements ThreadFactory {
    private static final AtomicInteger poolNumber = new AtomicInteger(1);
    private final ThreadGroup group;
    private final AtomicInteger threadNumber = new AtomicInteger(1);
    private final String namePrefix;

    DefaultThreadFactory() {
        SecurityManager s = System.getSecurityManager();
        group = (s != null) ? s.getThreadGroup()
                            : Thread.currentThread().getThreadGroup();
        namePrefix = "pool-" + poolNumber.getAndIncrement() + "-thread-";
    }

    public Thread newThread(Runnable r) {
        Thread t = new Thread(group, r,
                              namePrefix + threadNumber.getAndIncrement(),
                              0);
        if (t.isDaemon())
            t.setDaemon(false);
        if (t.getPriority() != Thread.NORM_PRIORITY)
            t.setPriority(Thread.NORM_PRIORITY);
        return t;
    }
}
```
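
A custom factory is mostly useful for giving worker threads recognizable names in logs and thread dumps. Below is a minimal sketch (the class name, prefix, and daemon setting are arbitrary illustrative choices); an instance can be passed as the threadFactory argument of the constructors shown earlier:

```java
import java.util.concurrent.ThreadFactory;
import java.util.concurrent.atomic.AtomicInteger;

// A simple naming thread factory; prefix and daemon flag are illustrative choices.
class NamedThreadFactory implements ThreadFactory {
    private final AtomicInteger counter = new AtomicInteger(1);
    private final String prefix;

    NamedThreadFactory(String prefix) {
        this.prefix = prefix;
    }

    @Override
    public Thread newThread(Runnable r) {
        Thread t = new Thread(r, prefix + "-" + counter.getAndIncrement());
        t.setDaemon(false); // keep worker threads non-daemon
        return t;
    }
}
```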

4.3 Rejection policy (handler)

When the work queue is full and the number of threads has reached maximumPoolSize, the rejection policy is applied. A rejection policy implements the RejectedExecutionHandler interface and its **rejectedExecution(Runnable r, ThreadPoolExecutor executor)** method. ThreadPoolExecutor already provides four rejection policies:

  1. AbortPolicy (default): discards the task and throws a RejectedExecutionException.
  2. CallerRunsPolicy: the task is executed by the thread that submitted it (the caller of execute()).
  3. DiscardPolicy: silently discards the task without throwing an exception; use it when rejected tasks can simply be ignored.
  4. DiscardOldestPolicy: discards the oldest task at the head of the queue and then retries submitting the current task.
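
If none of the built-in policies fits, a handler of your own can log, persist, or apply back-pressure. A minimal sketch that logs the rejection and then falls back to running the task on the submitting thread (the same effect as CallerRunsPolicy; the class name and behavior here are illustrative assumptions):

```java
import java.util.concurrent.RejectedExecutionHandler;
import java.util.concurrent.ThreadPoolExecutor;

// Hypothetical handler: log the rejection, then run the task on the caller's thread.
class LogAndCallerRunsHandler implements RejectedExecutionHandler {
    @Override
    public void rejectedExecution(Runnable r, ThreadPoolExecutor executor) {
        System.err.println("Task rejected, queue size = " + executor.getQueue().size());
        if (!executor.isShutdown()) {
            r.run(); // same fallback behavior as CallerRunsPolicy
        }
    }
}
```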

5 Functional thread pools

If configuring ThreadPoolExecutor directly feels too cumbersome, Executors already wraps four commonly used functional thread pools:

  • Fixed-length thread pool (FixedThreadPool)
  • Scheduled thread pool (ScheduledThreadPool)
  • Cacheable thread pool (CachedThreadPool)
  • Single-threaded thread pool (SingleThreadExecutor)

5.1 Fixed-length thread pool (FixedThreadPool)

Source of the creation methods:

```java
public static ExecutorService newFixedThreadPool(int nThreads) {
    return new ThreadPoolExecutor(nThreads, nThreads,
                                  0L, TimeUnit.MILLISECONDS,
                                  new LinkedBlockingQueue<Runnable>());
}

public static ExecutorService newFixedThreadPool(int nThreads, ThreadFactory threadFactory) {
    return new ThreadPoolExecutor(nThreads, nThreads,
                                  0L, TimeUnit.MILLISECONDS,
                                  new LinkedBlockingQueue<Runnable>(),
                                  threadFactory);
}
```
  • Features: all threads are core threads and the thread count is fixed; the task queue is a LinkedBlockingQueue with the default capacity of Integer.MAX_VALUE (effectively unbounded).
  • Application scenario: controlling the maximum number of concurrent threads.

Example:

```java
// 1. Create a fixed-length thread pool with a fixed size of 3 threads
ExecutorService fixedThreadPool = Executors.newFixedThreadPool(3);
// 2. Create the Runnable task to execute
Runnable task = new Runnable() {
    public void run() {
        System.out.println("Execute task");
    }
};
// 3. Submit the task to the thread pool
fixedThreadPool.execute(task);
```

5.2 Scheduled thread pool (ScheduledThreadPool)

Source of the creation methods:

```java
private static final long DEFAULT_KEEPALIVE_MILLIS = 10L;

public static ScheduledExecutorService newScheduledThreadPool(int corePoolSize) {
    return new ScheduledThreadPoolExecutor(corePoolSize);
}

public ScheduledThreadPoolExecutor(int corePoolSize) {
    super(corePoolSize, Integer.MAX_VALUE,
          DEFAULT_KEEPALIVE_MILLIS, MILLISECONDS,
          new DelayedWorkQueue());
}

public static ScheduledExecutorService newScheduledThreadPool(
        int corePoolSize, ThreadFactory threadFactory) {
    return new ScheduledThreadPoolExecutor(corePoolSize, threadFactory);
}

public ScheduledThreadPoolExecutor(int corePoolSize, ThreadFactory threadFactory) {
    super(corePoolSize, Integer.MAX_VALUE,
          DEFAULT_KEEPALIVE_MILLIS, MILLISECONDS,
          new DelayedWorkQueue(), threadFactory);
}
```
  • Features: the number of core threads is fixed, while the number of non-core threads is effectively unlimited (Integer.MAX_VALUE) and idle non-core threads are reclaimed after 10 ms; the task queue is a DelayedWorkQueue, a delay-based blocking queue.
  • Application scenario: executing scheduled or periodic tasks.

Example:

```java
// 1. Create a scheduled thread pool with 5 core threads
ScheduledExecutorService scheduledThreadPool = Executors.newScheduledThreadPool(5);
// 2. Create the Runnable task to execute
Runnable task = new Runnable() {
    public void run() {
        System.out.println("Execute task");
    }
};
// 3. Submit the task to the thread pool
scheduledThreadPool.schedule(task, 1, TimeUnit.SECONDS);                        // run once after a 1-second delay
scheduledThreadPool.scheduleAtFixedRate(task, 10, 1000, TimeUnit.MILLISECONDS); // run every 1000 ms after an initial 10 ms delay
```
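
Continuing the example above, ScheduledExecutorService also provides scheduleWithFixedDelay(), which measures the period from the end of one run to the start of the next, whereas scheduleAtFixedRate() measures it from start to start. A brief sketch with arbitrary delays:

```java
// Fixed rate: the next run is scheduled 1000 ms after the *start* of the previous run.
scheduledThreadPool.scheduleAtFixedRate(task, 0, 1000, TimeUnit.MILLISECONDS);

// Fixed delay: the next run starts 1000 ms after the *end* of the previous run,
// so a slow task stretches the effective period.
scheduledThreadPool.scheduleWithFixedDelay(task, 0, 1000, TimeUnit.MILLISECONDS);
```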

5.3 Cacheable thread pool (CachedThreadPool)

Source of the creation methods:

```java
public static ExecutorService newCachedThreadPool() {
    return new ThreadPoolExecutor(0, Integer.MAX_VALUE,
                                  60L, TimeUnit.SECONDS,
                                  new SynchronousQueue<Runnable>());
}

public static ExecutorService newCachedThreadPool(ThreadFactory threadFactory) {
    return new ThreadPoolExecutor(0, Integer.MAX_VALUE,
                                  60L, TimeUnit.SECONDS,
                                  new SynchronousQueue<Runnable>(),
                                  threadFactory);
}
```
  • Features: no core threads; the number of non-core threads is effectively unlimited (Integer.MAX_VALUE) and a thread idle for 60 seconds is reclaimed; the task queue is a SynchronousQueue, which stores no elements.
  • Application scenario: executing a large number of short, less time-consuming tasks.

Example:

```java
// 1. Create a cacheable thread pool
ExecutorService cachedThreadPool = Executors.newCachedThreadPool();
// 2. Create the Runnable task to execute
Runnable task = new Runnable() {
    public void run() {
        System.out.println("Execute task");
    }
};
// 3. Submit the task to the thread pool
cachedThreadPool.execute(task);
```

5.4 Single-threaded thread pool (SingleThreadExecutor)

Source of the creation methods:

```java
public static ExecutorService newSingleThreadExecutor() {
    return new FinalizableDelegatedExecutorService
        (new ThreadPoolExecutor(1, 1,
                                0L, TimeUnit.MILLISECONDS,
                                new LinkedBlockingQueue<Runnable>()));
}

public static ExecutorService newSingleThreadExecutor(ThreadFactory threadFactory) {
    return new FinalizableDelegatedExecutorService
        (new ThreadPoolExecutor(1, 1,
                                0L, TimeUnit.MILLISECONDS,
                                new LinkedBlockingQueue<Runnable>(),
                                threadFactory));
}
```
  • Features: exactly one core thread and no non-core threads, so tasks execute one after another on the same thread; the task queue is a LinkedBlockingQueue with the default capacity of Integer.MAX_VALUE (effectively unbounded).
  • Application scenario: operations that should not run concurrently but may block on I/O, such as database or file operations, keeping them off the UI thread so it stays responsive.

Example:

```java
// 1. Create a single-threaded thread pool
ExecutorService singleThreadExecutor = Executors.newSingleThreadExecutor();
// 2. Create the Runnable task to execute
Runnable task = new Runnable() {
    public void run() {
        System.out.println("Execute task");
    }
};
// 3. Submit the task to the thread pool
singleThreadExecutor.execute(task);
```

5.5 Comparison

The four functional thread pools differ only in how they fill in the ThreadPoolExecutor parameters:

| Thread pool | Core threads | Maximum threads | Keep-alive | Task queue | Typical use |
| --- | --- | --- | --- | --- | --- |
| FixedThreadPool | nThreads | nThreads | 0 | LinkedBlockingQueue (effectively unbounded) | Limiting the number of concurrent threads |
| ScheduledThreadPool | corePoolSize | Integer.MAX_VALUE | 10 ms | DelayedWorkQueue | Delayed and periodic tasks |
| CachedThreadPool | 0 | Integer.MAX_VALUE | 60 s | SynchronousQueue | Many short, less time-consuming tasks |
| SingleThreadExecutor | 1 | 1 | 0 | LinkedBlockingQueue (effectively unbounded) | Sequential tasks such as database or file I/O |

6 Summary

Executors makes thread pools easy to create, but the Alibaba Java Development Manual forbids creating pools through Executors and requires constructing a ThreadPoolExecutor directly. Spelling out the parameters yourself makes the pool's running rules explicit and avoids the risk of resource exhaustion (a sketch of this explicit style follows the list below).

The Executors factory methods have the following drawbacks:

  • FixedThreadPool and SingleThreadExecutor: the main problem is that the work queue is a LinkedBlockingQueue with no practical capacity limit, so pending requests can pile up and consume large amounts of memory, possibly causing an OOM.
  • CachedThreadPool and ScheduledThreadPool: the main problem is that the maximum number of threads is Integer.MAX_VALUE, so a very large number of threads may be created, possibly causing an OOM.
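
Following that advice, a pool would be built explicitly with a bounded queue and an explicit rejection policy. A minimal sketch, where all sizes and the choice of CallerRunsPolicy are assumptions for illustration:

```java
import java.util.concurrent.*;

public class ManualPoolExample {
    public static void main(String[] args) {
        // Explicit configuration: bounded queue and explicit rejection policy,
        // so the pool's limits are visible at the call site.
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                4,                                          // corePoolSize (illustrative)
                8,                                          // maximumPoolSize (illustrative)
                60L, TimeUnit.SECONDS,                      // keep-alive for non-core threads
                new ArrayBlockingQueue<Runnable>(100),      // bounded work queue
                Executors.defaultThreadFactory(),
                new ThreadPoolExecutor.CallerRunsPolicy()); // back-pressure instead of silent growth

        pool.execute(() -> System.out.println("task on " + Thread.currentThread().getName()));
        pool.shutdown();
    }
}
```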

References

  • Android multithreading: ThreadPool full parsing

  • Still creating a thread pool using Executors? Beware of Memory overflow

  • Alibaba Java Development Manual
