Preamble: this series is best read in order:
- Concepts – Processes and threads, tasks and queues
- GCD for iOS Multithreading
- NSOperation for iOS Multithreading
Processes and Threads
1. Definitions
- Process: an application program that is running in the system. Apps such as WeChat and QQ are each a process.
- Thread: the basic execution unit of a process and the basic unit of processor scheduling. All of a process's tasks execute on its threads, and every process has at least one. By default a program starts with a single thread, called the main thread (or UI thread).
2. Relationships and differences
- Threads in the same process share that process's address space and resources (memory, I/O, CPU time, etc.), but execute independently of one another.
- In protected mode, a crashing process has no effect on other processes, but a crashing thread kills its entire process, so multiple processes are more robust than multiple threads. Process switching, however, consumes significant resources and is inefficient, so when frequent switching is required, threads are the better choice. Likewise, if you need concurrent operations that share some variables, only threads will do, not processes.
- Processes run independently, while threads must live inside a process.
Tasks and Queues
- Task: an operation to perform, in other words the piece of code you execute on a thread (in GCD's case, a block). A task can be executed in two ways, synchronously or asynchronously; the key differences are whether the caller waits for queued tasks to finish and whether new threads can be started.
- Synchronous execution (sync):
  - After a task is added to the specified queue, the caller waits until that task finishes before continuing.
  - The task runs on the current thread; sync cannot start a new thread.
- Asynchronous execution (async):
  - Adds a task to the specified queue and returns immediately, without waiting for the task to finish.
  - Tasks may run on new threads; async has the ability to start new threads.
Note that a synchronously executed task is not switched away while it waits on I/O, so that time is wasted; if every task is purely CPU-bound, there is no difference between multithreaded and single-threaded execution on a single core.
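A minimal sketch of the sync/async difference using `DispatchQueue` (the Swift face of GCD); the queue label is an arbitrary example:

```swift
import Dispatch
import Foundation

// A sample concurrent queue; the label is an arbitrary example.
let queue = DispatchQueue(label: "com.example.demo", attributes: .concurrent)
var order: [String] = []
let done = DispatchSemaphore(value: 0)

// sync: the caller blocks until the block has finished.
queue.sync { order.append("sync task") }
order.append("after sync")        // always runs after the sync task

// async: returns immediately; the block runs later, possibly on a new thread.
queue.async {
    order.append("async task")
    done.signal()
}
done.wait()                       // wait only so we can inspect the result
print(order)                      // ["sync task", "after sync", "async task"]
```

The semaphore is only there so the demo can observe the async task's result; it is not part of the sync/async semantics themselves.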
- Queue: here, the queue waiting to execute tasks, i.e. the structure used to store tasks. A queue is a special linear list that follows the FIFO (first in, first out) principle. GCD has two kinds of queues, serial and concurrent, and both are FIFO. The main differences between the two are the order of execution and the number of threads they may start.
- Serial Dispatch Queue: executes only one task at a time, one after another. (Only one thread is used; the next task starts after the current one finishes.)
- Concurrent Dispatch Queue: allows multiple tasks to execute concurrently (simultaneously). (Multiple threads may be started, so tasks run at the same time.)
Combining the two execution modes with the two kinds of queues gives four cases:
- Synchronous execution + concurrent queue
- Asynchronous execution + concurrent queue
- Synchronous execution + serial queue
- Asynchronous execution + serial queue
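For example, asynchronous execution + serial queue may start a new thread, yet the tasks still complete one at a time in FIFO order (a small sketch; the queue label is arbitrary):

```swift
import Dispatch

let serial = DispatchQueue(label: "com.example.serial") // arbitrary label
let group = DispatchGroup()
var results: [Int] = []

// Asynchronous execution + serial queue: each call returns immediately,
// but the serial queue still runs the blocks one at a time, in order.
for i in 1...3 {
    serial.async(group: group) { results.append(i) }
}
group.wait()           // block until all three tasks have finished
print(results)         // [1, 2, 3]
```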
Queue / Execution | Concurrent queue | Serial queue | Main queue (special serial queue) | Global queue (special concurrent queue)
---|---|---|---|---
Synchronous (sync) | No new thread; tasks run serially | No new thread; tasks run serially | Deadlock (when invoked from the main thread) | No new thread; tasks run serially
Asynchronous (async) | New threads started; tasks run concurrently | One new thread started; tasks run sequentially | No new thread; tasks run serially on the main thread | New threads started; tasks run concurrently
Notes:
- The concurrency of a concurrent queue takes effect only with asynchronous execution (dispatch_async).
- The main queue is a special serial queue, and the global queues are special concurrent queues; both are provided by the system.
The main queue is essentially an ordinary serial queue, but invoking main queue + synchronous execution from the main thread causes a deadlock: the sync task appended to the main queue and the main thread's own work wait for each other, blocking the main queue and ultimately deadlocking the thread it lives on (the main thread).
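The deadlock pattern, and one common way to guard against it (`runOnMain` is a hypothetical helper name, not a system API):

```swift
import Dispatch
import Foundation

// Deadlock: on the main thread, this sync call waits for the block,
// while the block waits for the main thread to become free.
// DispatchQueue.main.sync { print("never reached") }   // do NOT do this

// Guard: only hop through the main queue when not already on the main thread.
func runOnMain(_ work: () -> Void) {
    if Thread.isMainThread {
        work()                              // already on main: call directly
    } else {
        DispatchQueue.main.sync(execute: work)
    }
}

var called = false
runOnMain { called = true }   // safe even when invoked from the main thread
print(called)                 // true
```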
Parallelism and Concurrency
1. The difference between parallelism and concurrency
- Concurrency: tasks take turns on the processor at different points in time; they are never running at the same instant.
- Parallelism: each task is assigned to its own processor and runs independently; at any given instant, multiple tasks really are running simultaneously.
On a single-core processor, multiple threads can only execute concurrently (the CPU switches back and forth between them quickly); multi-core processors allow true parallelism, i.e. genuinely simultaneous execution.
Since a single-core processor cannot be truly parallel, is multithreading still worthwhile on a single-core CPU?
Usually a task spends time not only on the CPU but also on I/O (querying a database, fetching web pages, reading and writing files, etc.). While one thread waits on I/O, the CPU would otherwise sit idle, so another thread can use it to compute. Running multiple threads together keeps both the I/O devices and the CPU busy.
If every task is purely CPU-bound, there is no difference between multiple threads and a single thread in the single-core case.
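A sketch of that effect, simulating an I/O wait with a sleep (timings are approximate and machine-dependent):

```swift
import Dispatch
import Foundation

// Simulate a task that mostly waits on I/O (e.g. a network request).
func fakeIO() { Thread.sleep(forTimeInterval: 0.1) }

// Sequential: roughly 3 x 0.1 s; the CPU idles through each wait.
let t0 = Date()
for _ in 0..<3 { fakeIO() }
let sequential = Date().timeIntervalSince(t0)

// Concurrent: the three waits overlap, so the total is roughly 0.1 s.
let group = DispatchGroup()
let t1 = Date()
for _ in 0..<3 {
    DispatchQueue.global().async(group: group) { fakeIO() }
}
group.wait()
let concurrent = Date().timeIntervalSince(t1)

print(sequential > concurrent)   // overlapping the waits saves time
```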
Modern systems generally virtualize resources and reclaim them flexibly, so as a rule it is worth using multithreading wherever it applies.
2. Interview question: why does multithreading improve execution efficiency?
1. On a multi-core processor, the work can be divided into multiple threads that execute in parallel, improving execution efficiency.
2. On a single-core processor, multiple threads can only run concurrently, not in parallel: the CPU rapidly switches back and forth, executing one specific task at any given moment. Ignoring blocking, concurrent multithreaded execution is actually slower than single-threaded execution; too many threads also load the CPU, threads consume memory, and creating and destroying threads is expensive.
Multithreading increases efficiency by raising CPU utilization. Database access, disk I/O, and similar operations are far slower than executing code on the CPU; in a single-threaded environment, these blocking operations leave the CPU idle while waiting for them to complete. For programs that block in this way, multithreading avoids idle waiting and improves CPU utilization.
Given these drawbacks of multithreading, today's web servers do not rely on many threads or processes to support massive concurrency; instead they use other techniques such as asynchronous I/O.