Introduction to Java thread pools and core concepts

One, what it is

Java provides utility classes for managing threads (a thread management API).

Main functions: thread scheduling and reuse, and control over the number of threads.

Benefit: it avoids the performance overhead associated with frequent thread creation.

Two, how to use

  1. Use the JDK’s own thread pool

    // Use the built-in thread pool in the JDK
    ExecutorService service = Executors.newSingleThreadExecutor();
    service.execute(()->{
        // The logical code to execute
    });

    Important: Be familiar with the core parameters of several thread pools and their application scenarios (helpful to understand custom thread pools).

  2. Use custom thread pools

    // Custom thread pool
    ExecutorService customService = new ThreadPoolExecutor(
            0,                           // Core pool size
            Integer.MAX_VALUE,           // Maximum capacity of the thread pool
            60L,                         // Non-core thread lifetime
            TimeUnit.SECONDS,            // Time unit
            new SynchronousQueue<>(),    // Blocking queue, used to store tasks
            runnable -> {
                // Thread factory, used to name threads in the thread pool
                Thread thread = new Thread(runnable, "my Thread: ");
                return thread;
            },
            (runnable, threadPoolExecutor) -> {
                // Custom reject policy, triggered when the blocking queue is full and the
                // number of threads has reached the maximum capacity of the thread pool;
                // it rejects the user's submission request
            });

    // Submit the task
    customService.execute(() -> {
        // The logical code to execute
    });

    Important: Be familiar with the core parameters of a custom thread pool and their meanings.

Three, basic concepts

3.1 Core pool size (corePoolSize)

The core pool size is an important parameter when creating a thread pool. When a new task is submitted to the pool, if the number of threads currently running in the pool is less than the core pool size, a core thread is created directly to run the task.

3.2 Maximum thread pool size (maximumPoolSize)

The maximum thread pool capacity is the maximum number of threads that can exist in the thread pool at the same time. The reject policy is triggered when the blocking queue is full and the number of core threads + non-core threads has reached the maximum capacity of the thread pool.
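A minimal sketch of when rejection actually happens (the class name and parameter values below are illustrative choices, not from the original): with a core size of 1, a maximum size of 2, and a bounded queue of capacity 1, the fourth submitted task is rejected because the queue is full and the pool already holds the maximum number of threads.

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.RejectedExecutionException;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class RejectionDemo {
    public static void main(String[] args) {
        // Core size 1, maximum size 2, bounded queue of capacity 1:
        // at most 2 running tasks + 1 queued task can be accepted at once
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                1, 2, 0L, TimeUnit.MILLISECONDS,
                new ArrayBlockingQueue<>(1));

        for (int i = 1; i <= 4; i++) {
            final int id = i;
            try {
                pool.execute(() -> {
                    try {
                        Thread.sleep(1000); // keep the worker busy
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                    }
                });
                System.out.println("task " + id + " accepted");
            } catch (RejectedExecutionException e) {
                // Default AbortPolicy: thrown for the 4th task
                System.out.println("task " + id + " rejected");
            }
        }
        pool.shutdown();
    }
}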

3.3 Blocking queue (BlockingQueue)

Blocking queues are used to store tasks submitted by users to the thread pool. In the producer-consumer model, blocking queues play the role of transferring data generated by producers to consumers. Since blocking queues can ensure thread safety, users do not have to worry too much about thread safety.

Features:

As the name implies, the most important feature of a blocking queue is that it blocks. When the queue is empty and a consumer still makes a take request, the request is blocked until an element is inserted into the queue. Similarly, when the queue is full and a producer needs to put data into it, the put request is blocked until there is free space in the queue.

Figure 3.1 Blocking queue PUT

Figure 3.2 Blocking queue Take

Classification:

Bounded queue: the user specifies the queue's capacity; once the queue is full, further put requests are blocked.

Unbounded queue: the capacity defaults to a very large value (for example Integer.MAX_VALUE), so it can be considered effectively unbounded.
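A minimal sketch of the blocking behaviour of a bounded queue (the class name, capacity, and element values are illustrative): put() blocks when the queue is full, take() blocks when it is empty.

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class BlockingQueueDemo {
    public static void main(String[] args) throws InterruptedException {
        // Bounded queue with capacity 2: put() blocks when full, take() blocks when empty
        BlockingQueue<String> queue = new ArrayBlockingQueue<>(2);

        Thread consumer = new Thread(() -> {
            try {
                for (int i = 0; i < 3; i++) {
                    // take() blocks until an element is available
                    System.out.println("took: " + queue.take());
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        consumer.start();

        // The third put() blocks until the consumer has taken an element
        queue.put("task-1");
        queue.put("task-2");
        queue.put("task-3");

        consumer.join();
    }
}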

Four, thread pools in the JDK

4.1 SingleThreadExecutor (single-thread pool)

Definition:

new ThreadPoolExecutor(
  1,                            // Core pool size
  1,                            // Maximum capacity of the thread pool
  0L,                           // Non-core thread lifetime
  TimeUnit.MILLISECONDS,        // Time unit
  new LinkedBlockingQueue(),    // Blocking queue
  threadFactory);               // Thread factory

SingleThreadExecutor has one and only one core thread. All submitted tasks are executed by this core thread, so at runtime tasks are executed serially on a single thread.

Procedure: task 1 is created and submitted to the thread pool; the pool is empty, so a core thread is created to run it. Then task 2 is created and submitted; the core thread is occupied, so task 2 is added to the blocking queue to wait for execution.
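For reference, a short usage sketch (the class name and task count are illustrative): both tasks run on the same core thread, one after the other.

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class SingleThreadDemo {
    public static void main(String[] args) {
        ExecutorService service = Executors.newSingleThreadExecutor();

        // Both tasks run on the same core thread, one after the other
        for (int i = 1; i <= 2; i++) {
            final int id = i;
            service.execute(() ->
                    System.out.println("task " + id + " on " + Thread.currentThread().getName()));
        }
        service.shutdown();
    }
}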

4.2 FixedThreadPool

Definition:

new ThreadPoolExecutor(
  nThreads,                     // Core pool size - user-defined
  nThreads,                     // Maximum thread pool capacity - user-defined
  0L,                           // Non-core thread lifetime
  TimeUnit.MILLISECONDS,        // Time unit
  new LinkedBlockingQueue(),    // Blocking queue
  threadFactory);               // Thread factory

FixedThreadPool is a fixed-size thread pool whose size is determined by the user. As you can see from the initialization parameters, a fixed-size pool contains only core threads.

Procedure: when a task is submitted, if the number of core threads has not yet reached the limit, a core thread is created and executes the task immediately; otherwise the task is queued and waits for execution.
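A short usage sketch (the class name, pool size, and task count are illustrative): the usual way to obtain this pool is Executors.newFixedThreadPool.

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class FixedPoolDemo {
    public static void main(String[] args) {
        // 2 core threads; extra tasks wait in the unbounded LinkedBlockingQueue
        ExecutorService service = Executors.newFixedThreadPool(2);

        for (int i = 1; i <= 5; i++) {
            final int id = i;
            service.execute(() ->
                    System.out.println("task " + id + " on " + Thread.currentThread().getName()));
        }
        service.shutdown();
    }
}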

4.3 CachedThreadPool

Definition:

new ThreadPoolExecutor(
  0,                            // Core pool size
  2147483647,                   // Maximum capacity of the thread pool
  60L,                          // Non-core thread lifetime
  TimeUnit.SECONDS,             // Time unit
  new SynchronousQueue(),       // Blocking queue
  threadFactory);               // Thread factory

There are no core threads in the cached thread pool, only non-core threads with a lifetime of 60 seconds, which means an idle thread can be reused within 60 seconds before its resources are reclaimed.

Process: when the first task is submitted to the thread pool, it is handed to the blocking queue and waits for a thread to take it; if an idle thread exists, it is reused to run the task, otherwise a new non-core thread is created to run it. Only after the first task has been taken out can the second submitted task enter the queue, because SynchronousQueue does not cache tasks: an element added to the queue must be removed immediately, otherwise subsequent attempts to add to the queue will block.

In other words, the blocking queue of the cached thread pool does not cache any tasks.
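A short usage sketch (the class name and task count are illustrative): the usual way to obtain this pool is Executors.newCachedThreadPool().

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class CachedPoolDemo {
    public static void main(String[] args) {
        ExecutorService service = Executors.newCachedThreadPool();

        // Each task is handed off through the SynchronousQueue:
        // an idle thread is reused if one exists, otherwise a new non-core thread is created
        for (int i = 1; i <= 3; i++) {
            final int id = i;
            service.execute(() ->
                    System.out.println("task " + id + " on " + Thread.currentThread().getName()));
        }
        service.shutdown();
    }
}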

4.4 ScheduledThreadPool (Delayed ThreadPool)

Definition:

new ThreadPoolExecutor(
  corePoolSize,                                         // Core pool size
  2147483647,                                           // Maximum capacity of the thread pool
  10L,                                                  // Non-core thread lifetime
  TimeUnit.MILLISECONDS,                                // Time unit
  new ScheduledThreadPoolExecutor.DelayedWorkQueue(),   // Blocking queue
  threadFactory,                                        // Thread factory
  handler);                                             // Reject policy

The delayed thread pool mainly performs scheduled and periodic tasks.
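A short usage sketch (the class name and delay values are illustrative): the usual way to obtain this pool is Executors.newScheduledThreadPool, which supports one-shot delayed tasks and fixed-rate periodic tasks.

import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class ScheduledPoolDemo {
    public static void main(String[] args) {
        ScheduledExecutorService scheduler = Executors.newScheduledThreadPool(1);

        // Run once, 2 seconds after submission
        scheduler.schedule(
                () -> System.out.println("delayed task"), 2, TimeUnit.SECONDS);

        // Run every 3 seconds, after an initial 1-second delay
        scheduler.scheduleAtFixedRate(
                () -> System.out.println("periodic task"), 1, 3, TimeUnit.SECONDS);

        // Note: the periodic task keeps the scheduler alive until shutdown() is called
    }
}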

Five, summary

  1. What is a thread pool? It is a set of thread management APIs that integrates the creation, scheduling, and reuse of threads.

  2. Why use a thread pool? The biggest advantages of a thread pool are thread management and thread reuse. Thread creation is known to be an extremely expensive operation because it involves interacting with the operating system. Imagine creating 100,000 threads with new Thread(): there is a very high probability of an OOM. With a thread pool this does not happen, because threads are reused, and reusing threads means reusing resources.

  3. Core parameters of the thread pool?

    1. Core pool size: when the number of threads in the pool is smaller than the core pool size, the pool immediately creates a core thread to run a newly submitted task; even after the task completes, the core thread's resources are not reclaimed.
    2. Maximum capacity of the thread pool: the maximum number of threads the pool may hold. When the blocking queue is full and the number of core threads + non-core threads has reached this maximum, the reject policy is triggered.
    3. Non-core thread lifetime: how long a non-core thread is kept alive after it finishes executing a task. During this lifetime the thread can be reused; if no task reuses it within the specified time, its resources are reclaimed.
    4. Time unit: the time unit of the non-core thread lifetime.
    5. Blocking queue: it acts as the hand-off between producers and consumers. When a producer submits a task, the task is actually added to the blocking queue; when a consumer (a worker thread) gets a task, it actually takes it from the blocking queue. The biggest feature of a blocking queue is blocking: when the queue is full, a producer's submit operation blocks until there is free space; likewise, when the queue is empty, a consumer's take operation blocks until a task is committed to the queue.
    6. Thread factory: a factory for creating threads. You can customize the factory to provide named threads.
    7. Reject policy: triggered when the blocking queue is full and the number of core threads + non-core threads has reached the maximum capacity of the thread pool (a sketch of a custom thread factory and reject policy follows this list).
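A minimal sketch of points 6 and 7 (the class name, thread-name prefix, and pool parameters are illustrative assumptions, not values from the article): a custom thread factory gives workers readable names, and a custom reject policy decides what happens to tasks that cannot be accepted.

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class CustomPoolDemo {
    public static void main(String[] args) {
        AtomicInteger counter = new AtomicInteger(1);

        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                2, 4, 60L, TimeUnit.SECONDS,
                new ArrayBlockingQueue<>(10),
                // Thread factory: give each worker thread a readable name
                runnable -> new Thread(runnable, "worker-" + counter.getAndIncrement()),
                // Reject policy: runs when the queue is full and the pool is at maximum size;
                // here it simply logs and drops the task
                (runnable, executor) -> System.out.println("task rejected: " + runnable));

        pool.execute(() ->
                System.out.println("running on " + Thread.currentThread().getName()));
        pool.shutdown();
    }
}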