Grand Central Dispatch (GCD) is Apple's preferred multithreading solution for developers. Multithreaded development involves a lot of detail, so below I use examples to explain GCD step by step. Please read closely, and be sure to run the code in Xcode or a Playground several times and compare the results. After working through this article, you will find Swift multithreading much easier to master.

In the first half of this article I keep the wording as simple as possible to lower the barrier to entry; as the article goes on, I dig into the details step by step.

Part One: The Foundations

1. Serial, parallel, synchronous, asynchronous

  • Serial: in this article, a serial queue. Multiple tasks placed on a serial queue can only run one after another: the second task cannot start until the first has finished, and so on until every task has run.
  • Parallel: in this article, a concurrent (parallel) queue. Multiple tasks placed on a concurrent queue can run at the same time.
  • Synchronous: in this article, synchronous execution of tasks. Multiple tasks run one after another on a single thread, in a fixed order that matches the order in which they were submitted. The total time is the sum of the time taken by every task.
  • Asynchronous: in this article, asynchronous execution of tasks. All the tasks still get executed, but they run on multiple threads at the same time, so the completion order is random and unpredictable. The total time is roughly the time taken by the longest task (see the sketch after this list).
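To make the timing difference concrete, here is a minimal sketch of my own (not from the original article); the label "demo" and the one-second sleeps are arbitrary. Two one-second tasks dispatched synchronously take about two seconds in total, while the same two tasks dispatched asynchronously to a concurrent queue overlap and finish after about one second. Run it in a Playground (or any program that stays alive long enough for the asynchronous work to finish).

import Foundation

let queue = DispatchQueue(label: "demo", attributes: .concurrent)

// Synchronous: the caller waits for each task in turn,
// so the total time is the sum of both tasks (about 2 s).
let syncStart = Date()
queue.sync { sleep(1) }
queue.sync { sleep(1) }
print("sync total: \(Date().timeIntervalSince(syncStart)) s")

// Asynchronous: both tasks start immediately on background threads
// and overlap, so together they take about as long as the longest one (about 1 s).
let asyncStart = Date()
queue.async {
    sleep(1)
    print("task 1 done after \(Date().timeIntervalSince(asyncStart)) s")
}
queue.async {
    sleep(1)
    print("task 2 done after \(Date().timeIntervalSince(asyncStart)) s")
}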

2. DispatchWorkItem

DispatchWorkItem is essentially a task: you write the code you want to execute as a closure and pass it in when the DispatchWorkItem is initialized. This makes it easier to manage the task later and keeps your code cleaner.

Original text: The work you want to perform, encapsulated in a way that lets you attach a completion handle or execution dependencies.

A DispatchWorkItem can be initialized in two ways. In normal cases, use the first form (more on the special cases later):

// 1. Plain initialization
let item1 = DispatchWorkItem {
    print("item1")
}

// 2. Initialization with a QoS class and flags
let item2 = DispatchWorkItem(qos: .userInteractive, flags: .barrier) {
    print("item2")
}

3. DispatchQueue Overview

Dispatch queue: an object that manages tasks to be executed serially or concurrently on your app's main thread or on a background thread.

Original text: An object that manages the execution of tasks serially or concurrently on your app’s main thread or on a background thread.

There are three types of DispatchQueue:

  • Main queue
  • Global queue
  • Custom queue

3.1 Main Queue

The main queue is a serial queue associated with the main thread. UI-related operations must be executed on the main queue.

let mainQueue = DispatchQueue.main

3.2 Global Queue

The global queue runs on background threads and is shared across the system. It is a concurrent queue used to process concurrent tasks.

let globalQueue = DispatchQueue.global()

3.3 Custom Queue (Serial by default)

A custom queue runs on a background thread and is a serial queue by default. Passing .concurrent for the attributes parameter at initialization creates a concurrent queue instead:

// Serial queue (the default)
let serialQueue = DispatchQueue(label: "test")
// Concurrent queue
let concurrentQueue = DispatchQueue(label: "test", attributes: .concurrent)

4. DispatchGroup Overview

Dispatch group: a group into which you can put multiple tasks so that they can be managed as a single unit.

Original text: A group of tasks that you monitor as a single unit.

DispatchGroup makes it easy to manage multiple tasks. For example, when all of the tasks in the same group have completed, GCD can send a notification and you can perform a follow-up action. Common methods:

  • notify(): delivers a notification once all tasks in the group have completed, without blocking the current thread.
  • wait(): blocks the current thread until all tasks in the group have completed or the wait times out (a minimal sketch follows this list).
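As a quick taste before the full examples in Part Two, here is a minimal sketch of my own (not from the original article) of the notify() pattern; the sleeps and messages are arbitrary. Two asynchronous tasks join a group, and the notification block runs on the main queue once both have finished.

import Foundation

let group = DispatchGroup()
let queue = DispatchQueue.global()

queue.async(group: group) {
    sleep(1)
    print("task 1 done")
}
queue.async(group: group) {
    sleep(2)
    print("task 2 done")
}

// Runs only after both tasks above have finished; the current thread is not blocked.
group.notify(queue: .main) {
    print("all tasks done")
}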

Part Two: Hands-On Practice

5. Use DispatchQueue

Create a Playground project and define four work items to be reused by the calls below; this greatly reduces the amount of code that follows. Please run the code and collect the results yourself; here I will only discuss them:

let item1 = DispatchWorkItem {
    for i in 0...4 {
        print("item1 -> \(i) thread: \(Thread.current)")
    }
}
let item2 = DispatchWorkItem {
    for i in 0...4 {
        print("item2 -> \(i) thread: \(Thread.current)")
    }
}
let item3 = DispatchWorkItem {
    for i in 0...4 {
        print("item3 -> \(i) thread: \(Thread.current)")
    }
}
let item4 = DispatchWorkItem {
    for i in 0...4 {
        print("item4 -> \(i) thread: \(Thread.current)")
    }
}

5.1 Asynchronous Execution

// Add asynchronous tasks to the main queue
let mainQueue = DispatchQueue.main
mainQueue.async(execute: item1)
mainQueue.async(execute: item2)
mainQueue.async(execute: item3)
mainQueue.async(execute: item4)

// Add asynchronous tasks to the global queue
let globalQueue = DispatchQueue.global()
globalQueue.async(execute: item1)
globalQueue.async(execute: item2)
globalQueue.async(execute: item3)
globalQueue.async(execute: item4)

// Add asynchronous tasks to a custom serial queue
let serialQueue = DispatchQueue(label: "serial")
serialQueue.async(execute: item1)
serialQueue.async(execute: item2)
serialQueue.async(execute: item3)
serialQueue.async(execute: item4)

// Add asynchronous tasks to a custom concurrent queue
let concurrentQueue = DispatchQueue(label: "concurrent", attributes: .concurrent)
concurrentQueue.async(execute: item1)
concurrentQueue.async(execute: item2)
concurrentQueue.async(execute: item3)
concurrentQueue.async(execute: item4)

Note: executing asynchronous tasks on a serial queue produces exactly the same result as executing synchronous tasks on it.

5.2 Synchronous Execution

// Add synchronous tasks to the main queue
let mainQueue = DispatchQueue.main
mainQueue.sync(execute: item1)
mainQueue.sync(execute: item2)
mainQueue.sync(execute: item3)
mainQueue.sync(execute: item4)

// Add synchronous tasks to the global queue
let globalQueue = DispatchQueue.global()
globalQueue.sync(execute: item1)
globalQueue.sync(execute: item2)
globalQueue.sync(execute: item3)
globalQueue.sync(execute: item4)

// Add synchronous tasks to a custom serial queue
let serialQueue = DispatchQueue(label: "serial")
serialQueue.sync(execute: item1)
serialQueue.sync(execute: item2)
serialQueue.sync(execute: item3)
serialQueue.sync(execute: item4)

// Add synchronous tasks to a custom concurrent queue
let concurrentQueue = DispatchQueue(label: "concurrent", attributes: .concurrent)
concurrentQueue.sync(execute: item1)
concurrentQueue.sync(execute: item2)
concurrentQueue.sync(execute: item3)
concurrentQueue.sync(execute: item4)

Note: executing synchronous tasks on a concurrent queue gives exactly the same result as executing synchronous or asynchronous tasks on a serial queue. Do not mix synchronous tasks into the main queue, or you will cause a deadlock.

5.3 Mixing Synchronous and Asynchronous Execution

// Mixed on the main queue (the synchronous task causes a deadlock; see the note below)
let mainQueue = DispatchQueue.main
mainQueue.sync(execute: item1) // synchronous task
mainQueue.async(execute: item2)
mainQueue.async(execute: item3)
mainQueue.async(execute: item4)

// Mixed on the global queue: the synchronous task prints in order,
// the asynchronous tasks print in random order
let globalQueue = DispatchQueue.global()
globalQueue.sync(execute: item1) // synchronous task
globalQueue.async(execute: item2)
globalQueue.async(execute: item3)
globalQueue.async(execute: item4)

// Mixed on a custom serial queue: everything prints in order
let serialQueue = DispatchQueue(label: "serial")
serialQueue.sync(execute: item1) // synchronous task
serialQueue.async(execute: item2)
serialQueue.async(execute: item3)
serialQueue.async(execute: item4)

// Mixed on a custom concurrent queue: the synchronous task prints in order,
// the asynchronous tasks print in random order. In this example the
// asynchronous tasks start only after the synchronous task has finished.
let concurrentQueue = DispatchQueue(label: "concurrent", attributes: .concurrent)
concurrentQueue.sync(execute: item1) // synchronous task
concurrentQueue.async(execute: item2)
concurrentQueue.async(execute: item3)
concurrentQueue.async(execute: item4)

Note: as before, mixing a synchronous task into the main queue causes a deadlock; on the other queues the synchronous task simply finishes before the asynchronous tasks submitted after it.

6. Deadlock Analysis

6.1 Main Queue Deadlock

As mentioned above, synchronous tasks must not be mixed into the main queue, or a deadlock results. The main queue is a serial queue that can only run on the main thread; it cannot create new threads, so all of its code must run on that one thread. Under normal circumstances there is a continuous stream of asynchronous tasks on the main queue (for example the work that keeps refreshing the UI; call it A). Suppose a synchronous task B is mixed in after A. The sync call blocks the main thread until B has finished, but B is queued behind A and cannot start until the main thread is free again, which it never will be while it is blocked. Each side waits for the other, nothing can proceed, and the deadlock freezes or crashes the program.

Attempting to synchronously execute a work item on the main queue results in a deadlock.

// Asynchronous tasks followed by a synchronous task on the main queue: deadlock
let mainQueue = DispatchQueue.main
mainQueue.async(execute: item1)
mainQueue.async(execute: item2)
mainQueue.async(execute: item3)
mainQueue.sync(execute: item4) // synchronous task

One might wonder: what if A comes after B? Would that avoid the deadlock? It may not look like a deadlock, but the Playground crashes every time code like this runs, because there are already asynchronous tasks on the main queue that we cannot see.

let mainQueue = DispatchQueue.main
mainQueue.sync(execute: item1) // synchronous task
mainQueue.async(execute: item2)
mainQueue.async(execute: item3)
mainQueue.async(execute: item4)

Therefore, we can simply assume that a synchronous task must never be placed on the main queue, or it will certainly deadlock.

6.2 Deadlocks on Other Queues

Given the above, can the other kinds of queues also deadlock? Let's try:

  • A custom serial queue with a nested synchronous task deadlocks
let serialQueue = DispatchQueue(label: "serial")

// Deadlock: a synchronous task nested inside a synchronous task
serialQueue.sync {
    print("thread: \(Thread.current)")
    serialQueue.sync {
        print("thread: \(Thread.current)")
    }
}

// Deadlock: a synchronous task nested inside an asynchronous task
serialQueue.async {
    print("thread: \(Thread.current)")
    serialQueue.sync {
        print("thread: \(Thread.current)")
    }
}

// No deadlock: an asynchronous task nested inside a synchronous task
serialQueue.sync {
    print("thread: \(Thread.current)")
    serialQueue.async {
        print("thread: \(Thread.current)")
    }
}

// No deadlock: an asynchronous task nested inside an asynchronous task
serialQueue.async {
    print("thread: \(Thread.current)")
    serialQueue.async {
        print("thread: \(Thread.current)")
    }
}
  • A concurrent queue with nested synchronous tasks does not deadlock
let concurrentQueue = DispatchQueue(label: "concurrent", attributes: .concurrent)

// No deadlock: a synchronous task nested inside an asynchronous task
concurrentQueue.async {
    print("async thread: \(Thread.current)")
    concurrentQueue.sync {
        print("sync thread: \(Thread.current)")
    }
}

// No deadlock: a synchronous task nested inside a synchronous task
concurrentQueue.sync {
    print("sync thread: \(Thread.current)")
    concurrentQueue.sync {
        print("sync thread: \(Thread.current)")
    }
}

// No deadlock: an asynchronous task nested inside a synchronous task
concurrentQueue.sync {
    print("sync thread: \(Thread.current)")
    concurrentQueue.async {
        print("async thread: \(Thread.current)")
    }
}

// No deadlock: an asynchronous task nested inside an asynchronous task
concurrentQueue.async {
    print("async thread: \(Thread.current)")
    concurrentQueue.async {
        print("async thread: \(Thread.current)")
    }
}

6.3 Deadlock Summary

As you can see, nesting a synchronous task on a custom serial queue also causes a deadlock, so deadlocks are not unique to the main queue. But why do these cases deadlock? What is the core reason? Run the following code and look at the result:

Print (" dispatchqueue.main mainqueue.async ") let mainQueue = dispatchqueue.main GlobalQueue = dispatchqueue.global () globalqueue.sync (execute: Item2) print("=> DispatchQueue(label: "serial") serialQueue. Sync (execute: Item3)// Sync task print("=> execute complete 3") let concurrentQueue = DispatchQueue(label: "concurrent", attributes: .concurrent) concurrentQueue.sync(execute: item4)// Synchronization task print("=> Execute complete all") Running result: => Execute complete 1 Item2 -> 0 Thread: <NSThread: 0x7fbf2cc0e7e0>{number = 1, name = main} item2 -> 1 thread: <NSThread: 0x7fbf2cc0e7e0>{number = 1, name = main} item2 -> 2 thread: <NSThread: 0x7fbf2cc0e7e0>{number = 1, name = main} item2 -> 3 thread: <NSThread: 0x7fbf2cc0e7e0>{number = 1, name = main} item2 -> 4 thread: <NSThread: 0x7FBF2cc0e7e0 >{number = 1, name = main} => 0x7fbf2cc0e7e0>{number = 1, name = main} item3 -> 1 thread: <NSThread: 0x7fbf2cc0e7e0>{number = 1, name = main} item3 -> 2 thread: <NSThread: 0x7fbf2cc0e7e0>{number = 1, name = main} item3 -> 3 thread: <NSThread: 0x7fbf2cc0e7e0>{number = 1, name = main} item3 -> 4 thread: <NSThread: 0x7FBF2cc0e7e0 >{number = 1, name = main} => 0x7fbf2cc0e7e0>{number = 1, name = main} item4 -> 1 thread: <NSThread: 0x7fbf2cc0e7e0>{number = 1, name = main} item4 -> 2 thread: <NSThread: 0x7fbf2cc0e7e0>{number = 1, name = main} item4 -> 3 thread: <NSThread: 0x7fbf2cc0e7e0>{number = 1, name = main} item4 -> 4 thread: <NSThread: 0x7FBF2CC0e7e0 >{number = 1, name = main} => All Item1 -> 0 thread: 0x7fbf2cc0e7e0>{number = 1, name = main} item1 -> 1 thread: <NSThread: 0x7fbf2cc0e7e0>{number = 1, name = main} item1 -> 2 thread: <NSThread: 0x7fbf2cc0e7e0>{number = 1, name = main} item1 -> 3 thread: <NSThread: 0x7fbf2cc0e7e0>{number = 1, name = main} item1 -> 4 thread: <NSThread: 0x7fbf2cc0e7e0>{number = 1, name = main}Copy the code

Do you see the problem? The four groups of code produce exactly the same kind of result; even the thread information is identical. Everything runs on the main thread, and the first group (the asynchronous one) is executed last. In other words:

  • The tasks on the main queue (which can only be asynchronous) and the synchronous tasks submitted to the other queues all run on the main thread, because that is the thread they were submitted from (and there is only one main thread).
  • A thread does not care whether a task is synchronous or asynchronous; only the queue does.
  • Threads do not deadlock; queues do.

The root cause of the deadlock when a synchronous task is added to the main queue is:

  • The main queue can only run on the main thread (worth repeating).
  • The main queue cannot start a background thread to do anything else.
  • Once a synchronous task is mixed into the main queue, it and the existing asynchronous tasks end up waiting for each other, which is a deadlock.

Adding synchronous tasks to a custom serial queue does not, by itself, deadlock because:

A custom serial queue can use both the main thread and a background thread (it can start at most one background thread). When a custom serial queue receives a synchronous task it runs it on the thread that submitted it (the main thread in these examples); when it receives an asynchronous task it runs it on a background thread. So there is no deadlock.

Adding synchronous tasks to a concurrent queue does not deadlock because:

A concurrent queue can use the main thread and background threads at the same time (it can start one or more background threads, up to 64 on some devices). When a concurrent queue receives a synchronous task it runs it on the thread that submitted it (the main thread in these examples); when it receives an asynchronous task it runs it on a background thread. So there is no deadlock.

A custom serial queue on which one task (A, synchronous or asynchronous) nests another synchronous task (B) deadlocks because:

Task A with the nested task B is equivalent to A1 -> B -> A2, where B is a synchronous task sitting after A1 and before A2. Because the queue is serial, B cannot start until A has finished; but A cannot finish until the sync call returns, and the sync call cannot return until B has run. Each side is waiting for the other, so a serial queue must never nest a synchronous task on itself (see the annotated sketch below).
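Here is the same structure annotated in A1/B/A2 terms; it is simply the nested-sync case from section 6.2 with comments added, and it will hang if you run it.

let serialQueue = DispatchQueue(label: "serial")

serialQueue.async {          // task A
    print("A1")              // A1 runs fine
    serialQueue.sync {       // B cannot start until A has finished...
        print("B")
    }
    print("A2")              // ...but A cannot finish until B returns: deadlock
}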

7. DispatchQueue Switching

7.1 Background

This chapter simulates a network request: the app requests data from the network (task A, which takes 10 s), processes the data once it arrives (task B, which takes 5 s), and finally refreshes the UI.

If A and B are both synchronous tasks, the main queue will deadlock, and on any other queue the interface will freeze for 15 s. If you don't believe it, put the following two thread-sleeping calls into a UIViewController in an app:

override func viewDidAppear(_ animated: Bool) {
    // 1. Synchronous task on the global queue
    DispatchQueue.global().sync {
        sleep(15) // the current thread sleeps for 15 seconds
    }
    // 2. Asynchronous task on the main queue
    DispatchQueue.main.async {
        sleep(15) // the current thread sleeps for 15 seconds
    }
}

As expected, both calls freeze the interface for 15 seconds. Recall from the previous chapter that a synchronous task submitted from the main thread ends up running on the main thread, and a long-running task on the main thread makes the interface stutter badly, so:

Never run a time-consuming task synchronously when it could run asynchronously; the interface pays the debt incurred by every long synchronous task.

If A and B are both asynchronous tasks, you still must not put them on the main queue; that also freezes the interface for 15 s, because the main thread would be running the long tasks and the interface would stutter badly.

Long, time-consuming tasks should never be performed on the main queue; the interface also pays the debt incurred by long asynchronous tasks on the main queue.

Having said that, you should now have a good understanding of how queues work.

7.2 A Network Request Example

Now let's look at the correct way to handle a network request with GCD: define A and B as asynchronous tasks nested on a concurrent queue, and finally switch to the main queue to refresh the UI. This keeps the interface as smooth as possible.

let queue = DispatchQueue(label: "com.apple.request", attributes: .concurrent)
// Task A: request data from the network
queue.async {
    print("start requesting data \(Date()) thread: \(Thread.current)")
    sleep(10)
    print("data request finished \(Date()) thread: \(Thread.current)")
    // Task B: process the data
    queue.async {
        print("start processing data \(Date()) thread: \(Thread.current)")
        sleep(5)
        print("data processing finished \(Date()) thread: \(Thread.current)")
        // Switch back to the main queue to refresh the UI
        DispatchQueue.main.async {
            print("UI refreshed successfully \(Date()) thread: \(Thread.current)")
        }
    }
}

Running result:
start requesting data ... thread: <NSThread: 0x7ff917d8c0c0>{number = 4, name = (null)}
data request finished ... thread: <NSThread: 0x7ff917d8c0c0>{number = 4, name = (null)}
start processing data ... thread: <NSThread: 0x7ff8f7d0c190>{number = 3, name = (null)}
data processing finished ... thread: <NSThread: 0x7ff8f7d0c190>{number = 3, name = (null)}
UI refreshed successfully ... thread: <NSThread: 0x7ff917c0e7e0>{number = 1, name = main}

As you can see, both the queues and the threads switch exactly as expected. GCD queue switching is like a set of Russian dolls, nested layer by layer. If the nesting goes wrong, go back to Chapter 6 on deadlock analysis to find the cause and fix it.

8. Use DispatchGroup

If you want to execute another task only after several tasks have all completed, use DispatchGroup. Those tasks can be placed on the same queue or on different queues.

DispatchGroup common methods:

group.wait(): blocks the current thread until all tasks in the group have completed.

group.notify(): once all tasks in the group have completed, the notification block is dispatched asynchronously; the current thread is not blocked.

8.1 Use group.notify() to rewrite the network request example from the previous chapter:

let group = DispatchGroup()
let queue = DispatchQueue(label: "com.apple.request", attributes: .concurrent)
// Task A: request data from the network
queue.async(group: group) {
    print("start requesting data \(Date()) thread: \(Thread.current)")
    sleep(10)
    print("data request finished \(Date()) thread: \(Thread.current)")
    // Task B: process the data
    queue.async(group: group) {
        print("start processing data \(Date()) thread: \(Thread.current)")
        sleep(5)
        print("data processing finished \(Date()) thread: \(Thread.current)")
    }
}
print("current thread: \(Thread.current)")
group.notify(queue: queue) {
    DispatchQueue.main.async {
        print("UI refreshed successfully \(Date()) thread: \(Thread.current)")
    }
}

Running result:
current thread: <NSThread: ...>{number = 1, name = main}
start requesting data 2020-08-06 06:45:22 +0000 thread: <NSThread: 0x7fe312f30b60>{number = 4, name = (null)}
data request finished ... thread: <NSThread: 0x7fe312f30b60>{number = 4, name = (null)}
start processing data 2020-08-06 06:45:32 +0000 thread: <NSThread: 0x7fe312e70d70>{number = 5, name = (null)}
data processing finished 2020-08-06 06:45:37 +0000 thread: <NSThread: 0x7fe312e70d70>{number = 5, name = (null)}
UI refreshed successfully ... thread: <NSThread: 0x7fe312c0e7e0>{number = 1, name = main}

As expected, the result is the same.

8.2 Simplify the code by listening for the notification on the main queue and refreshing the UI there directly:

let group = DispatchGroup()
let queue = DispatchQueue(label: "com.apple.request", attributes: .concurrent)
// Task A: request data from the network
queue.async(group: group) {
    print("start requesting data \(Date()) thread: \(Thread.current)")
    sleep(10)
    print("data request finished \(Date()) thread: \(Thread.current)")
    // Task B: process the data
    queue.async(group: group) {
        print("start processing data \(Date()) thread: \(Thread.current)")
        sleep(5)
        print("data processing finished \(Date()) thread: \(Thread.current)")
    }
}
print("current thread: \(Thread.current)")
group.notify(queue: DispatchQueue.main) {
    print("UI refreshed successfully \(Date()) thread: \(Thread.current)")
}

Running result:
current thread: <NSThread: ...>{number = 1, name = main}
start requesting data 2020-08-06 06:49:31 +0000 thread: <NSThread: 0x7fc608c80370>{number = 4, name = (null)}
data request finished ... thread: <NSThread: 0x7fc608c80370>{number = 4, name = (null)}
start processing data ... thread: <NSThread: 0x7fc608d2b200>{number = 5, name = (null)}
data processing finished ... thread: <NSThread: 0x7fc608d2b200>{number = 5, name = (null)}
UI refreshed successfully ... thread: <NSThread: 0x7fc608c0e7e0>{number = 1, name = main}

As you would expect, the results are still consistent.

8.3 Use group.wait() to rewrite it:

let group = DispatchGroup()
let queue = DispatchQueue(label: "com.apple.request", attributes: .concurrent)
// Task A: request data from the network
queue.async(group: group) {
    print("start requesting data \(Date()) thread: \(Thread.current)")
    sleep(10)
    print("data request finished \(Date()) thread: \(Thread.current)")
    // Task B: process the data
    queue.async(group: group) {
        print("start processing data \(Date()) thread: \(Thread.current)")
        sleep(5)
        print("data processing finished \(Date()) thread: \(Thread.current)")
    }
}
print("current thread: \(Thread.current)")
group.notify(queue: DispatchQueue.main) {
    print("UI refreshed successfully \(Date()) thread: \(Thread.current)")
}
group.wait() // blocks the current (main) thread until the group is done
print("group.wait() returned, thread: \(Thread.current)")

Running result:
current thread: <NSThread: ...>{number = 1, name = main}
start requesting data ... thread: <NSThread: 0x7fe1ad538580>{number = 4, name = (null)}
data request finished ... thread: <NSThread: 0x7fe1ad538580>{number = 4, name = (null)}
start processing data 2020-08-06 06:53:10 +0000 thread: <NSThread: 0x7fe1b8010060>{number = 5, name = (null)}
data processing finished 2020-08-06 06:53:15 +0000 thread: <NSThread: 0x7fe1b8010060>{number = 5, name = (null)}
UI refreshed successfully ... thread: <NSThread: 0x7fe1ad40e7e0>{number = 1, name = main}

You can see that group.wait() does block the current thread.

Part Three: Advanced Topics

9. Suspending and Resuming a DispatchQueue

The example in Chapter 7 has three levels of nesting, which is not many, but it already hints at nesting hell. This chapter removes the nesting by suspending and resuming a queue; you can apply the same idea to deeper nesting in the future.

let group = DispatchGroup()
let queue1 = DispatchQueue(label: "com.apple.request", attributes: .concurrent)
let queue2 = DispatchQueue(label: "com.apple.response", attributes: .concurrent)
queue2.suspend() // suspend queue2 until the network data has arrived

// Task A: request data from the network
queue1.async(group: group) {
    print("start requesting data \(Date()) thread: \(Thread.current)")
    sleep(10)
    print("data request finished \(Date()) thread: \(Thread.current)")
    queue2.resume() // the network request is done: resume queue2 so the data can be processed
}

// Task B: process the data (queued on the suspended queue2)
queue2.async(group: group) {
    print("start processing data \(Date()) thread: \(Thread.current)")
    sleep(5)
    print("data processing finished \(Date()) thread: \(Thread.current)")
}

print("current thread: \(Thread.current)")
group.notify(queue: DispatchQueue.main) {
    print("UI refreshed successfully \(Date()) thread: \(Thread.current)")
}
print("main thread code finished")

10. Thread Safety

If a variable can be read and written by multiple threads at the same time, the result is unpredictable, and extra work is needed to make access to it thread-safe (see the sketch below).
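To see the problem, here is a minimal data-race sketch of my own (not from the original article); the counter, label and loop count are arbitrary. A plain Int incremented from many asynchronous tasks usually does not end up at the expected value, because the read-modify-write is not atomic; depending on the run it may even crash, which is exactly the point.

import Foundation

var counter = 0
let queue = DispatchQueue(label: "race", attributes: .concurrent)
let group = DispatchGroup()

for _ in 0..<1000 {
    queue.async(group: group) {
        counter += 1 // unsynchronized read-modify-write
    }
}
group.wait()
print("counter = \(counter)") // often less than 1000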

10.1 Using a Barrier

Set a DispatchWorkItem's flags to .barrier to turn it into a barrier task: it starts only after every task submitted before it has finished, and the tasks submitted after it start only after the barrier has finished. (Note: barriers have no effect on global queues; use a custom concurrent queue.)

import Foundation

let item1 = DispatchWorkItem {
    for i in 0...4 {
        print("item1 -> \(i) thread: \(Thread.current)")
    }
}
let item2 = DispatchWorkItem {
    for i in 0...4 {
        print("item2 -> \(i) thread: \(Thread.current)")
    }
}
let item3 = DispatchWorkItem(flags: .barrier) {
    for i in 0...4 {
        print("item3 barrier -> \(i) thread: \(Thread.current)")
    }
}
let item4 = DispatchWorkItem {
    for i in 0...4 {
        print("item4 -> \(i) thread: \(Thread.current)")
    }
}
let item5 = DispatchWorkItem {
    for i in 0...4 {
        print("item5 -> \(i) thread: \(Thread.current)")
    }
}
let queue = DispatchQueue(label: "test", attributes: .concurrent)
queue.async(execute: item1)
queue.async(execute: item2)
queue.async(execute: item3)
queue.async(execute: item4)
queue.async(execute: item5)

Running result:
item1 -> 0 thread: <NSThread: 0x7fd6055c07d0>{number = 2, name = (null)}
item2 -> 0 thread: <NSThread: 0x7fd60560b7f0>{number = 3, name = (null)}
item1 -> 1 thread: <NSThread: 0x7fd6055c07d0>{number = 2, name = (null)}
item2 -> 1 thread: <NSThread: 0x7fd60560b7f0>{number = 3, name = (null)}
item1 -> 2 thread: <NSThread: 0x7fd6055c07d0>{number = 2, name = (null)}
item2 -> 2 thread: <NSThread: 0x7fd60560b7f0>{number = 3, name = (null)}
item2 -> 3 thread: <NSThread: 0x7fd60560b7f0>{number = 3, name = (null)}
item1 -> 3 thread: <NSThread: 0x7fd6055c07d0>{number = 2, name = (null)}
item2 -> 4 thread: <NSThread: 0x7fd60560b7f0>{number = 3, name = (null)}
item1 -> 4 thread: <NSThread: 0x7fd6055c07d0>{number = 2, name = (null)}
item3 barrier -> 0 thread: <NSThread: 0x7fd6055c07d0>{number = 2, name = (null)}
item3 barrier -> 1 thread: <NSThread: 0x7fd6055c07d0>{number = 2, name = (null)}
item3 barrier -> 2 thread: <NSThread: 0x7fd6055c07d0>{number = 2, name = (null)}
item3 barrier -> 3 thread: <NSThread: 0x7fd6055c07d0>{number = 2, name = (null)}
item3 barrier -> 4 thread: <NSThread: 0x7fd6055c07d0>{number = 2, name = (null)}
item4 -> 0 thread: <NSThread: 0x7fd6055c07d0>{number = 2, name = (null)}
item5 -> 0 thread: <NSThread: 0x7fd60560b7f0>{number = 3, name = (null)}
item4 -> 1 thread: <NSThread: 0x7fd6055c07d0>{number = 2, name = (null)}
item4 -> 2 thread: <NSThread: 0x7fd6055c07d0>{number = 2, name = (null)}
item5 -> 1 thread: <NSThread: 0x7fd60560b7f0>{number = 3, name = (null)}
item4 -> 3 thread: <NSThread: 0x7fd6055c07d0>{number = 2, name = (null)}
item4 -> 4 thread: <NSThread: 0x7fd6055c07d0>{number = 2, name = (null)}
item5 -> 2 thread: <NSThread: 0x7fd60560b7f0>{number = 3, name = (null)}
item5 -> 3 thread: <NSThread: 0x7fd60560b7f0>{number = 3, name = (null)}
item5 -> 4 thread: <NSThread: 0x7fd60560b7f0>{number = 3, name = (null)}

10.2 Using DispatchSemaphore as a Lock

"DispatchSemaphore" is usually translated as "semaphore", which is not a term most people meet in everyday life, so to make it easier to picture, think of it as a traffic light. DispatchSemaphore is initialized with a single parameter, value (the number of permits), which says how many more cars may pass, in other words how many asynchronous tasks may run at the same time. DispatchSemaphore has two methods:

  • wait(): each call decreases the permit count by 1; when the count reaches zero the light is red and everyone has to wait.
  • signal(): each call increases the permit count by 1.

10.2.1 A 9×9 multiplication-table example to get a feel for DispatchSemaphore:

let semaphore = DispatchSemaphore(value: 1)
let queue = DispatchQueue(label: "concurrent", attributes: .concurrent)
// Run 9 asynchronous tasks
for i in 1...9 {
    queue.async {
        semaphore.wait()
        var str = ""
        for j in 1...9 {
            // Pad the values so the columns line up
            let value = i * j
            let tempStr = value <= 9 ? " \(value) " : "\(value) "
            str += tempStr
        }
        print(str)
        semaphore.signal()
    }
}

Running result:
 1  2  3  4  5  6  7  8  9
 2  4  6  8 10 12 14 16 18
 3  6  9 12 15 18 21 24 27
 4  8 12 16 20 24 28 32 36
 5 10 15 20 25 30 35 40 45
 6 12 18 24 30 36 42 48 54
 7 14 21 28 35 42 49 56 63
 8 16 24 32 40 48 56 64 72
 9 18 27 36 45 54 63 72 81

The multiplication table comes out in perfect order.

10.2.2 Comment out semaphore.wait() and semaphore.signal() and run it a few more times:

let semaphore = DispatchSemaphore(value: 1)
let queue = DispatchQueue(label: "concurrent", attributes: .concurrent)
// Run 9 asynchronous tasks
for i in 1...9 {
    queue.async {
        // semaphore.wait()
        var str = ""
        for j in 1...9 {
            // Pad the values so the columns line up
            let value = i * j
            let tempStr = value <= 9 ? " \(value) " : "\(value) "
            str += tempStr
        }
        print(str)
        // semaphore.signal()
    }
}

Running result:
 5 10 15 20 25 30 35 40 45
 4  8 12 16 20 24 28 32 36
 3  6  9 12 15 18 21 24 27
 1  2  3  4  5  6  7  8  9
 8 16 24 32 40 48 56 64 72
 9 18 27 36 45 54 63 72 81
 2  4  6  8 10 12 14 16 18
 6 12 18 24 30 36 42 48 54
 7 14 21 28 35 42 49 56 63

The multiplication table is now out of control.

To understand it better, go back to the first example, set the DispatchSemaphore's initial value to 2 or 3, and run the program several times; you will get a feel for how the number of permits affects how far the output drifts out of order (a sketch follows below).
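For reference, here is what that experiment might look like; this is my variation, not code from the original article. With value: 2, at most two of the asynchronous tasks hold a permit at any moment, so the rows can interleave, but never more than two tasks run at once.

let semaphore = DispatchSemaphore(value: 2)
let queue = DispatchQueue(label: "concurrent", attributes: .concurrent)

for i in 1...9 {
    queue.async {
        semaphore.wait()   // take one of the two permits
        var str = ""
        for j in 1...9 {
            let value = i * j
            str += value <= 9 ? " \(value) " : "\(value) "
        }
        print(str)
        semaphore.signal() // return the permit
    }
}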

10.3 Protecting a Variable with a Serial Queue and a Computed Property

import Foundation

let queue = DispatchQueue(label: "test")
var a: Int = 10
var b: Int {
    get {
        return queue.sync {
            print("thread = \(Thread.current)")
            return a
        }
    }
    set {
        queue.sync {
            print("thread = \(Thread.current)")
            a = newValue
        }
    }
}
b = 30 // write b (and therefore a)
print("a = \(a) b = \(b) thread = \(Thread.current)")

Running result:
thread = <NSThread: 0x7f8018c0e7e0>{number = 1, name = main}
thread = <NSThread: 0x7f8018c0e7e0>{number = 1, name = main}
a = 30 b = 30 thread = <NSThread: 0x7f8018c0e7e0>{number = 1, name = main}

Try changing set to write asynchronously and consider the result.

11. DispatchQoS

DispatchQoS: the quality of service, or execution priority, applied to a task. You can think of it as the task's status or rank, and it can be attached to both a DispatchWorkItem and a DispatchQueue.

It is like airline passengers with status: they wait in the VIP lounge, fly first class, and get the most attentive service first. With no status, just an ID card, you are happy as long as you arrive safely; with no ID at all, you take the bus. Original text: The quality of service, or the execution priority, to apply to tasks.

There are several types of DispatchQoS:

  • userInteractive: tasks related to user interaction are handled first, to keep the interface smooth
  • userInitiated: tasks the user actively started, handled with high priority
  • default: the default level; the task is handled normally
  • utility: tasks the user is not actively watching
  • background: maintenance, cleanup and other low-importance work, done whenever there is time
  • unspecified: no quality-of-service information at all; in the analogy, not even an ID card

DispatchQoS is only a simple priority flag, so why put it in the advanced part? Because for the vast majority of developers it is unnecessary and only adds complexity. Fancy tricks and extra code tend to mean extra bugs; as many books put it, the less code, the fewer bugs, so it is best to keep your code simple. When you get to the point of polishing the user experience, improving efficiency and optimizing power consumption (that is, when your application quality and code are already at an advanced level), that is the time to reach for this flag. That is why I treat DispatchQoS as advanced content.

11.1 Adding the DispatchQoS flag to DispatchWorkItem:

import Foundation

let item1 = DispatchWorkItem(qos: .userInteractive) {
    for i in 0...9999 {
        print("--item1 -> \(i) thread: \(Thread.current)")
    }
}
let item2 = DispatchWorkItem(qos: .unspecified) {
    for i in 0...9999 {
        print("item2 -> \(i) thread: \(Thread.current)")
    }
}
let queue = DispatchQueue(label: "test1", attributes: .concurrent)
queue.async(execute: item1)
queue.async(execute: item2)

In my run, item1 had already finished while item2 had only printed up to 3824. The loop count needs to be large, otherwise the effect is not obvious.

11.2 Adding a DispatchQoS Flag to the DispatchQueue:

import Foundation

let item1 = DispatchWorkItem {
    for i in 0...9999 {
        print("--item1 -> \(i) thread: \(Thread.current)")
    }
}
let item2 = DispatchWorkItem {
    for i in 0...9999 {
        print("item2 -> \(i) thread: \(Thread.current)")
    }
}
let queue1 = DispatchQueue(label: "test1", qos: .userInteractive, attributes: .concurrent)
let queue2 = DispatchQueue(label: "test2", qos: .unspecified, attributes: .concurrent)
queue1.async(execute: item1)
queue2.async(execute: item2)

In my run, item1 had already finished while item2 had only printed up to 3798. Here the loop does not need to be very large for the effect to show; explore it for yourself.

Conclusion

To master Swift multithreading, use it more in practice and keep re-thinking and refining how you use it; the technique will soon serve you well. Multithreading is powerful, but please do not abuse it, and do not use it just to show off. Today's CPUs are already very fast, executing billions of operations every second, while the screen refreshes only tens to hundreds of times per second, so a great deal of code finishes in the blink of an eye. Use multithreading where it is genuinely needed; clean code, fewer problems and a reliable application matter more.

Things to watch out for

  • When writing multithreaded code, get into the habit of printing Thread.current so you can check whether the code is running on the thread you intended.
  • Watch out for DispatchQueue nesting when singletons call one another. When one singleton calls another, wrap the call in DispatchQueue.main.async {} if necessary to switch back to the main thread first, and let the called singleton do its own queue switching inside its methods. Otherwise, once threads are nested several levels deep, the code runs out of control and becomes impossible to rescue (a sketch follows this list).
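As an illustration of that last point, here is a hypothetical sketch; DataManager and Logger are made-up singletons, not types from the original article. The background work hops back to the main queue before calling into the other singleton, and that singleton does its own queue switching internally.

import Foundation

final class Logger {
    static let shared = Logger()
    private let queue = DispatchQueue(label: "logger")

    func log(_ message: String) {
        // The singleton switches to its own queue internally.
        queue.async { print("[log] \(message)") }
    }
}

final class DataManager {
    static let shared = DataManager()
    private let queue = DispatchQueue(label: "data", attributes: .concurrent)

    func refresh() {
        queue.async {
            // ... background work ...
            DispatchQueue.main.async {
                // Back on the main thread before calling another singleton.
                Logger.shared.log("refresh finished")
            }
        }
    }
}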

About the author

My name is Zhu Gongge, a developer based in Shanghai, working on iOS, Swift, SwiftUI and TypeScript.