Concurrency and parallelism

A concurrent program is one in which two or more threads exist at the same time. On a single-core processor, the threads take turns on the CPU; they co-exist, each in some state of execution, but never run at the same instant. On a multi-core processor, threads can execute in parallel, spread across the available cores.

For example, if you create 4 threads, but the CPU is single-core, then the concurrency is 4 and the parallelism is only 1. If the CPU is dual-core, the concurrency is still 4 and the parallelism is 2.
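This distinction can be sketched in Java (the setting the thread-pool sections below assume). The class and thread names here are illustrative: the number of threads we start sets the concurrency, while availableProcessors() bounds the parallelism.

```java
public class ConcurrencyDemo {
    public static void main(String[] args) throws InterruptedException {
        // Four threads exist concurrently regardless of how many cores there are.
        Thread[] threads = new Thread[4];
        for (int i = 0; i < 4; i++) {
            final int id = i;
            threads[i] = new Thread(() -> System.out.println("thread " + id + " running"));
            threads[i].start();
        }
        for (Thread t : threads) t.join();
        // Concurrency: 4 threads. Parallelism: at most this many run simultaneously.
        int cores = Runtime.getRuntime().availableProcessors();
        System.out.println("cores (upper bound on parallelism): " + cores);
    }
}
```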

How processes communicate with each other

  1. Pipes
  2. Message queues
  3. Shared memory
  4. Semaphores
  5. Sockets
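Of these, sockets are the easiest to sketch in self-contained Java. In real IPC the two ends would live in separate processes; here both ends run in one JVM purely so the example is runnable as-is (class name and messages are illustrative).

```java
import java.io.*;
import java.net.*;

public class SocketIpcDemo {
    public static void main(String[] args) throws Exception {
        try (ServerSocket server = new ServerSocket(0)) { // port 0 = any free port
            int port = server.getLocalPort();
            // "Server process": accepts one connection and echoes one line back.
            Thread serverThread = new Thread(() -> {
                try (Socket s = server.accept();
                     BufferedReader in = new BufferedReader(new InputStreamReader(s.getInputStream()));
                     PrintWriter out = new PrintWriter(s.getOutputStream(), true)) {
                    out.println("echo: " + in.readLine());
                } catch (IOException e) {
                    throw new UncheckedIOException(e);
                }
            });
            serverThread.start();
            // "Client process": connects, sends a line, reads the reply.
            try (Socket client = new Socket("localhost", port);
                 PrintWriter out = new PrintWriter(client.getOutputStream(), true);
                 BufferedReader in = new BufferedReader(new InputStreamReader(client.getInputStream()))) {
                out.println("hello");
                System.out.println(in.readLine()); // prints "echo: hello"
            }
            serverThread.join();
        }
    }
}
```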

Threads and processes

  • A process can contain multiple threads.
  • It is difficult to share data between different processes, but data can be shared between different threads in the same process.
  • Processes consume more computer resources than threads.
  • Processes are isolated from one another, so one process crashing does not bring down another; within a process, however, the failure of one thread can bring down the entire process.
  • Memory shared between threads can be locked: while one thread holds the lock on shared data, other threads must wait for it to release the lock before they can use that data.
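The locking point above can be sketched with Java's built-in monitor locks. This is an illustrative example, not taken from the text: two threads increment a shared counter, and synchronized makes each thread wait until the other releases the lock.

```java
public class SharedCounterDemo {
    private static int counter = 0;                  // shared memory within one process
    private static final Object lock = new Object(); // monitor guarding the counter

    public static void main(String[] args) throws InterruptedException {
        Runnable task = () -> {
            for (int i = 0; i < 10_000; i++) {
                synchronized (lock) { // other threads wait here until the lock is free
                    counter++;
                }
            }
        };
        Thread t1 = new Thread(task), t2 = new Thread(task);
        t1.start(); t2.start();
        t1.join(); t2.join();
        System.out.println(counter); // always 20000 thanks to the lock
    }
}
```

Without the synchronized block, the two unsynchronized increments would interleave and the final count would usually be less than 20000.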

The thread pool life cycle

  1. RUNNING, SHUTDOWN, STOP, TIDYING, TERMINATED
  2. The difference between shutdown() and shutdownNow(): shutdown() is a graceful shutdown. The pool stops accepting new tasks but finishes all tasks already submitted, including those still waiting in the queue; the pool is then in the SHUTDOWN state. shutdownNow() is an immediate shutdown: the pool stops accepting new tasks, attempts to interrupt the tasks currently running, and removes the queued tasks that have not yet started, returning them as a list; the pool is then in the STOP state.
  3. After shutdown() or shutdownNow() has been called and all tasks have finished, the pool passes through TIDYING and enters the TERMINATED state.
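A small sketch of the shutdownNow() behavior (illustrative class name; the exact count of removed tasks can vary with timing): queued tasks that never started are handed back to the caller, and the running task is interrupted.

```java
import java.util.List;
import java.util.concurrent.*;

public class ShutdownDemo {
    public static void main(String[] args) throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(1);
        for (int i = 0; i < 3; i++) {
            final int id = i;
            pool.submit(() -> {
                try { Thread.sleep(100); } catch (InterruptedException e) { return; }
                System.out.println("task " + id + " done");
            });
        }
        // shutdownNow() interrupts the running task and returns the queued,
        // never-started tasks; shutdown() would instead let them all finish.
        List<Runnable> pending = pool.shutdownNow();
        System.out.println("tasks removed from the queue: " + pending.size()); // typically 2
        pool.awaitTermination(1, TimeUnit.SECONDS);
        System.out.println("terminated: " + pool.isTerminated());
    }
}
```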

Six states of a thread

In Java, Thread.State defines exactly six states:

  • NEW: created but not yet started
  • RUNNABLE: executing, or ready and waiting for a CPU time slice
  • BLOCKED: waiting to acquire a monitor lock
  • WAITING: waiting indefinitely for another thread, e.g. in Object.wait() or Thread.join()
  • TIMED_WAITING: waiting with a timeout, e.g. in Thread.sleep() or wait(timeout)
  • TERMINATED: the run() method has completed
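A thread's state can be observed with getState(). This sketch (illustrative class name) shows three of Java's six Thread.State values; the middle observation depends on timing, but the thread is usually asleep by then.

```java
public class ThreadStateDemo {
    public static void main(String[] args) throws InterruptedException {
        Thread t = new Thread(() -> {
            try { Thread.sleep(50); } catch (InterruptedException ignored) { }
        });
        System.out.println(t.getState()); // NEW: not started yet
        t.start();
        Thread.sleep(10);                 // give it time to enter sleep()
        System.out.println(t.getState()); // usually TIMED_WAITING (inside Thread.sleep())
        t.join();
        System.out.println(t.getState()); // TERMINATED: run() has finished
    }
}
```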

Why do we need thread pools

  • It lets us manage threads centrally and avoids the resource cost of repeatedly creating and destroying them.
  • It improves response time: when a task arrives, an idle pooled thread can run it immediately, which is much faster than creating a new thread for it.
  • It enables reuse: when a thread finishes a task it returns to the pool, so the same threads serve many tasks and resources are saved.
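The reuse point can be made visible with a single-thread pool (illustrative sketch): two separately submitted tasks report the same worker thread, because the pool hands the finished thread the next task instead of creating a new one.

```java
import java.util.concurrent.*;

public class ReuseDemo {
    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newSingleThreadExecutor();
        // Each task returns the name of the thread that ran it.
        String first  = pool.submit(() -> Thread.currentThread().getName()).get();
        String second = pool.submit(() -> Thread.currentThread().getName()).get();
        System.out.println(first.equals(second)); // true: one thread served both tasks
        pool.shutdown();
    }
}
```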

The core parameters of the thread pool

  • corePoolSize: the number of core threads in the pool
  • maximumPoolSize: the maximum number of threads the pool may hold
  • keepAliveTime: the idle lifetime of a non-core thread
  • unit: the time unit of keepAliveTime
  • workQueue: the blocking queue that holds waiting tasks
  • threadFactory: the factory used to create threads
  • handler: the saturation (rejection) policy of the pool
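These seven parameters map directly onto the full ThreadPoolExecutor constructor. The sizes below are arbitrary illustration values, not recommendations:

```java
import java.util.concurrent.*;

public class PoolParamsDemo {
    public static void main(String[] args) {
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                2,                                    // corePoolSize
                4,                                    // maximumPoolSize
                60L,                                  // keepAliveTime
                TimeUnit.SECONDS,                     // unit
                new ArrayBlockingQueue<>(8),          // workQueue
                Executors.defaultThreadFactory(),     // threadFactory
                new ThreadPoolExecutor.AbortPolicy()  // handler (saturation policy)
        );
        System.out.println(pool.getCorePoolSize() + " / " + pool.getMaximumPoolSize()); // 2 / 4
        pool.shutdown();
    }
}
```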

Thread pool task execution process

  • When a task is submitted, the pool first checks whether the number of live threads is smaller than corePoolSize. If so, a new core thread is created to execute the task.
  • If the core threads are all occupied, the pool checks whether the workQueue is full. If not, the task is added to the queue.
  • If the workQueue is full, the pool checks whether the number of live threads has reached maximumPoolSize. If not, a non-core thread is created to execute the task.
  • If maximumPoolSize has been reached, the rejection policy is applied.
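The flow above can be observed directly. In this sketch the tiny sizes (core = 1, max = 2, queue capacity = 1) are chosen only to make each branch visible, and the tasks block on a latch so the pool's state holds still long enough to inspect:

```java
import java.util.concurrent.*;

public class ExecutionFlowDemo {
    public static void main(String[] args) throws InterruptedException {
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                1, 2, 0L, TimeUnit.SECONDS, new ArrayBlockingQueue<>(1));
        CountDownLatch release = new CountDownLatch(1);
        Runnable blocker = () -> {
            try { release.await(); } catch (InterruptedException ignored) { }
        };
        pool.execute(blocker); // 1st task: a core thread is created to run it
        Thread.sleep(50);      // let the core thread pick the task up
        pool.execute(blocker); // 2nd task: core thread busy, task is queued
        pool.execute(blocker); // 3rd task: queue full, a non-core thread is created
        Thread.sleep(50);      // let the second worker start
        System.out.println("threads alive: " + pool.getPoolSize());    // 2
        System.out.println("tasks queued:  " + pool.getQueue().size()); // 1
        release.countDown();   // unblock all tasks so the pool can drain
        pool.shutdown();
    }
}
```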

Four rejection strategies

  • AbortPolicy: throws a RejectedExecutionException (the default policy)
  • DiscardPolicy: silently discards the task
  • DiscardOldestPolicy: discards the oldest task in the queue and retries the submission
  • CallerRunsPolicy: runs the task in the thread that submitted it
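A sketch of the default AbortPolicy in action (illustrative class name): with one thread and a one-slot queue, the third submission saturates the pool and is rejected with an exception.

```java
import java.util.concurrent.*;

public class RejectionDemo {
    public static void main(String[] args) throws InterruptedException {
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                1, 1, 0L, TimeUnit.SECONDS,
                new ArrayBlockingQueue<>(1),
                new ThreadPoolExecutor.AbortPolicy()); // default: throw on saturation
        Runnable sleepy = () -> {
            try { Thread.sleep(200); } catch (InterruptedException ignored) { }
        };
        pool.execute(sleepy); // runs on the single thread
        Thread.sleep(50);     // let the worker pick it up
        pool.execute(sleepy); // fills the one-slot queue
        try {
            pool.execute(sleepy); // saturated: AbortPolicy throws
        } catch (RejectedExecutionException e) {
            System.out.println("rejected by AbortPolicy");
        }
        pool.shutdown();
    }
}
```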

The task queue of the thread pool

  • ArrayBlockingQueue: a bounded, array-backed, first-in-first-out blocking queue
  • LinkedBlockingQueue: a linked-node, first-in-first-out (FIFO) blocking queue; effectively unbounded, since its default capacity is Integer.MAX_VALUE
  • DelayQueue: a delay queue, ordered by the specified execution time in ascending order, and by insertion order when times are equal; used by newScheduledThreadPool
  • PriorityBlockingQueue: an unbounded blocking queue with priority ordering
  • SynchronousQueue: a queue that holds no elements; each insert must wait for another thread to remove the element, otherwise the insert blocks; used by newCachedThreadPool
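Two of these behaviors are easy to check directly (an illustrative sketch): a bounded ArrayBlockingQueue stops accepting elements at capacity and preserves FIFO order, while a SynchronousQueue rejects an offer when no consumer is waiting.

```java
import java.util.concurrent.*;

public class QueueDemo {
    public static void main(String[] args) {
        BlockingQueue<String> bounded = new ArrayBlockingQueue<>(2);
        System.out.println(bounded.offer("a")); // true
        System.out.println(bounded.offer("b")); // true
        System.out.println(bounded.offer("c")); // false: capacity reached
        System.out.println(bounded.poll());     // "a": first in, first out

        // A SynchronousQueue has no capacity at all: an offer succeeds only
        // if another thread is already waiting to take.
        BlockingQueue<String> sync = new SynchronousQueue<>();
        System.out.println(sync.offer("x"));    // false: no taker is waiting
    }
}
```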

Several common thread pools

newFixedThreadPool, which is suitable for executing long-running tasks.

  • The number of core threads equals the maximum number of threads
  • There are no non-core threads, so the idle lifetime is irrelevant
  • The blocking queue is the unbounded LinkedBlockingQueue
  • Problem: the unbounded queue can grow without limit when tasks pile up, causing memory spikes.
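A minimal usage sketch: with a fixed pool of 2 threads, the extra tasks simply wait in the unbounded queue, and nothing is ever rejected, which is exactly where the memory-spike risk comes from.

```java
import java.util.concurrent.*;

public class FixedPoolDemo {
    public static void main(String[] args) throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(2); // core == max == 2
        for (int i = 0; i < 5; i++) {
            final int id = i;
            // Tasks beyond the 2 threads wait in the LinkedBlockingQueue.
            pool.execute(() ->
                    System.out.println("task " + id + " on " + Thread.currentThread().getName()));
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.SECONDS);
    }
}
```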

newCachedThreadPool, used to execute a large number of small, short-lived tasks concurrently.

  • The number of core threads corePoolSize is 0
  • The maximum number of threads maximumPoolSize is Integer.MAX_VALUE
  • The keepAliveTime for non-core threads is 60 seconds
  • Blocking queues use SynchronousQueue
  • Problem: When tasks are submitted faster than they can be processed, too many threads are created, depleting CPU and memory resources.

newSingleThreadExecutor, suitable for serial task execution scenarios.

  • The number of core threads corePoolSize is 1
  • The maximum number of threads maximumPoolSize is also 1
  • The blocking queue uses the unbounded queue LinkedBlockingQueue
  • The keepAliveTime for non-core threads is 0

newScheduledThreadPool, suitable for scenarios where tasks are executed periodically or after a delay.

  • The maximum number of threads maximumPoolSize is Integer.MAX_VALUE
  • The number of core threads corePoolSize is set by the caller, commonly 1
  • The keepAliveTime for non-core threads is 0
  • scheduleAtFixedRate(): runs at a fixed rate; the period is measured from the start of each run
  • scheduleWithFixedDelay(): runs with a fixed delay measured from the end of the previous run
  • The blocking queue is DelayedWorkQueue
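A minimal sketch of periodic scheduling (illustrative class name and timings): the task fires at a fixed rate until the returned ScheduledFuture is cancelled.

```java
import java.util.concurrent.*;

public class ScheduledDemo {
    public static void main(String[] args) throws InterruptedException {
        ScheduledExecutorService scheduler = Executors.newScheduledThreadPool(1);
        // scheduleAtFixedRate: period measured from each start time;
        // scheduleWithFixedDelay would measure from each completion instead.
        ScheduledFuture<?> ticker = scheduler.scheduleAtFixedRate(
                () -> System.out.println("tick"),
                0, 50, TimeUnit.MILLISECONDS); // initial delay 0, period 50 ms
        Thread.sleep(180);    // let a few ticks fire
        ticker.cancel(false); // stop the periodic task
        scheduler.shutdown();
    }
}
```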