I’m taking part in Juejin (Nuggets) Creators Camp #4, learning and sharing together!
Preface
Recently I attended the Juejin Creators Camp, and one lesson really stuck with me. The line I remember most is that ten filler articles are not worth one high-quality article. It suddenly dawned on me that some of my earlier articles were indeed a bit thin. Sometimes I worried that an article that was too long would tire readers out; sometimes I simply did not have the time, so I split a topic into several parts.
Now that I think about it, why do I write articles at all? To get a raise? To feel proud of the likes? Mostly I just want to summarize what I have learned so that I keep growing, and if that helps others along the way, even better. In short, as long as I do not go against that original intention, it is all good.
NSThread
NSThread is an object-oriented wrapper around pthread. It is Apple’s official object-oriented threading technology: easy to use, and it lets us operate on thread objects directly, but it requires us to manage the thread’s life cycle ourselves.
Creating a thread with NSThread
We initialize the thread directly with initWithBlock; [thread start] starts it.
- (void)nsthreadDemo {
    NSThread *thread = [[NSThread alloc] initWithBlock:^{
        // Print the current thread
        NSLog(@"%@", [NSThread currentThread]);
    }];
    [thread start];
}
Besides this one, there are other initializers as well.
- (instancetype)initWithTarget:(id)target selector:(SEL)selector object:(nullable id)argument;
There are also class methods that create and start a thread in one step, without calling start.
+ (void)detachNewThreadWithBlock:(void (^)(void))block;
+ (void)detachNewThreadSelector:(SEL)selector toTarget:(id)target withObject:(nullable id)argument;
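A minimal sketch of how the detach variants might be used; the method names detachDemo and doWork: are hypothetical helpers added here only for illustration.

- (void)detachDemo {
    // Creates and starts a thread in one step; no [thread start] is needed,
    // but we do not get an NSThread object back to manage.
    [NSThread detachNewThreadWithBlock:^{
        NSLog(@"block thread: %@", [NSThread currentThread]);
    }];
    [NSThread detachNewThreadSelector:@selector(doWork:) toTarget:self withObject:@"hello"];
}

// Hypothetical helper; runs on the detached thread
- (void)doWork:(NSString *)arg {
    NSLog(@"selector thread: %@, arg: %@", [NSThread currentThread], arg);
}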
Putting a thread to sleep
sleepUntilDate puts the current thread to sleep until the given date.
+ (void)sleepUntilDate:(NSDate *)date;
sleepForTimeInterval puts the current thread to sleep for the given number of seconds.
+ (void)sleepForTimeInterval:(NSTimeInterval)ti;
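A quick sketch (not from the original example) showing both sleep calls inside a thread block:

- (void)sleepDemo {
    [NSThread detachNewThreadWithBlock:^{
        // Sleep for 2 seconds
        [NSThread sleepForTimeInterval:2.0];
        // Sleep until a specific date, here 1 second from now
        [NSThread sleepUntilDate:[NSDate dateWithTimeIntervalSinceNow:1.0]];
        NSLog(@"woke up on %@", [NSThread currentThread]);
    }];
}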
Controlling a thread
To start a thread, we use the start method.
- (void)start;
To cancel a thread, we call cancel.
- (void)cancel;
To get the current thread, we use currentThread.
[NSThread currentThread]
Main thread execution
We can simply use an NSObject method: performSelectorOnMainThread gets us back onto the main thread.
- (void)performSelectorOnMainThread:(SEL)aSelector withObject:(nullable id)arg waitUntilDone:(BOOL)wait;
Executing on a child thread, then refreshing the UI on the main thread
Here is an example that puts NSThread to use.
- (void)nsthreadDemo {
    NSThread *thread = [[NSThread alloc] initWithBlock:^{
        // Print the current thread
        NSLog(@"%@", [NSThread currentThread]);
        // Simulate a time-consuming task
        [NSThread sleepForTimeInterval:2.0];
        // Go back to the main thread
        [self performSelectorOnMainThread:@selector(updateUI) withObject:nil waitUntilDone:NO];
    }];
    [thread start];
}

- (void)updateUI {
    NSLog(@"%@", [NSThread currentThread]);
    self.view.backgroundColor = [UIColor systemRedColor];
}
In practice NSThread is rarely used to manipulate threads directly, so let's move on to the next one.
NSOperation
NSOperation is a higher-level wrapper built on top of GCD and is more object-oriented in use. It is simpler to use than GCD and manages the life cycle automatically, which I like. Let's take a look.
Using NSOperation and NSOperationQueue
In GCD we create a queue and add tasks as blocks; the queue then schedules the tasks.
Since NSOperation is built on GCD, its logic is generally the same.
Here NSOperationQueue is the operation queue and NSOperation is the operation (the task).
The specific steps are as follows:
- Create operation: Encapsulate the operation to be performed in an NSOperation object.
- Create a queue: Create an NSOperationQueue object.
- Add operations to the queue: Add the NSOperation object to the NSOperationQueue object.
- (void)operationDemo {
    // Create the queue
    NSOperationQueue *queue = [[NSOperationQueue alloc] init];
    // Create the operation
    NSBlockOperation *operation = [NSBlockOperation blockOperationWithBlock:^{
        NSLog(@"%@", [NSThread currentThread]);
    }];
    // Add the operation to the queue
    [queue addOperation:operation];
}
Queue control
cancelAllOperations: cancels all operations in the current queue.
- (void)cancelAllOperations;
suspended: set to YES to pause the queue.
@property (getter=isSuspended) BOOL suspended;
mainQueue: the main queue.
[NSOperationQueue mainQueue]
currentQueue: the current queue.
[NSOperationQueue currentQueue]
waitUntilAllOperationsAreFinished: waits until all operations in the queue have finished before continuing; it blocks the current thread.
- (void)waitUntilAllOperationsAreFinished;
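A small sketch of how these controls could be combined; the method name and the operation contents are made up for illustration.

- (void)queueControlDemo {
    NSOperationQueue *queue = [[NSOperationQueue alloc] init];
    // Pause the queue before adding work, so nothing starts yet
    queue.suspended = YES;
    for (int i = 0; i < 3; i++) {
        [queue addOperationWithBlock:^{
            NSLog(@"op %d on %@", i, [NSThread currentThread]);
        }];
    }
    // Resume the queue so the operations can run
    queue.suspended = NO;
    // Block the current thread until every operation has finished
    [queue waitUntilAllOperationsAreFinished];
    NSLog(@"all operations finished");
    // cancelAllOperations would cancel any operations that have not started yet
    // [queue cancelAllOperations];
}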
NSOperationQueue controls the maximum concurrency
This time we add tasks to the queue directly with addOperationWithBlock.
If we set the maximum concurrency to 1, the queue behaves as a serial queue.
- (void)operationDemo {
    // Create the queue
    NSOperationQueue *queue = [[NSOperationQueue alloc] init];
    // Make it behave as a serial queue
    queue.maxConcurrentOperationCount = 1;
    for (int i = 0; i < 5; i++) {
        [queue addOperationWithBlock:^{
            NSLog(@"%d: %@", i, [NSThread currentThread]);
        }];
    }
}
Print result:
2022-02-17 16:17:38.379723+0800 Multithreading[80881:1209674] 0: <NSThread: 0x600003108380>{number = 6, name = (null)}
2022-02-17 16:17:38.380011+0800 Multithreading[80881:1209674] 1: <NSThread: 0x600003108380>{number = 6, name = (null)}
2022-02-17 16:17:38.380219+0800 Multithreading[80881:1209674] 2: <NSThread: 0x600003108380>{number = 6, name = (null)}
2022-02-17 16:17:38.380460+0800 Multithreading[80881:1209672] 3: <NSThread: 0x6000031023c0>{number = 3, name = (null)}
2022-02-17 16:17:38.380838+0800 Multithreading[80881:1209672] 4: <NSThread: 0x6000031023c0>{number = 3, name = (null)}
Result analysis: the output is indeed sequential, that is, the operations execute one after another.
If we set the maximum concurrency to a value greater than 1, the queue behaves as a concurrent queue.
- (void)operationDemo {
    // Create the queue
    NSOperationQueue *queue = [[NSOperationQueue alloc] init];
    // Make it behave as a concurrent queue
    queue.maxConcurrentOperationCount = 2;
    for (int i = 0; i < 5; i++) {
        [queue addOperationWithBlock:^{
            NSLog(@"%d: %@", i, [NSThread currentThread]);
        }];
    }
}
Take a look at the print result:
You can see that the execution order is no longer sequential. Also keep in mind that the maximum concurrency count is not the number of threads created; how many threads are created is decided by the system, not by us.
Operation dependencies and priorities
Operations can depend on each other. Where there is no dependency, the operation with the higher priority executes first.
- (void)operationDemo1 {
    // Create the queue
    NSOperationQueue *queue = [[NSOperationQueue alloc] init];
    // Make it behave as a concurrent queue
    queue.maxConcurrentOperationCount = 2;

    NSBlockOperation *operation1 = [NSBlockOperation blockOperationWithBlock:^{
        NSLog(@"%d: %@", 1, [NSThread currentThread]);
    }];
    // Normal priority
    [operation1 setQueuePriority:NSOperationQueuePriorityNormal];

    NSBlockOperation *operation2 = [NSBlockOperation blockOperationWithBlock:^{
        NSLog(@"%d: %@", 2, [NSThread currentThread]);
    }];
    // 2 depends on 1
    [operation2 addDependency:operation1];

    NSBlockOperation *operation3 = [NSBlockOperation blockOperationWithBlock:^{
        NSLog(@"%d: %@", 3, [NSThread currentThread]);
    }];
    // High priority
    [operation3 setQueuePriority:NSOperationQueuePriorityHigh];

    NSBlockOperation *operation4 = [NSBlockOperation blockOperationWithBlock:^{
        NSLog(@"%d: %@", 4, [NSThread currentThread]);
    }];
    // 4 depends on 3
    [operation4 addDependency:operation3];

    [queue addOperation:operation1];
    [queue addOperation:operation2];
    [queue addOperation:operation3];
    [queue addOperation:operation4];
}
Let's analyze it first: the queue has a maximum concurrency of 2. Operation 2 depends on operation 1, and operation 4 depends on operation 3, so 2 runs after 1 and 4 runs after 3. Operation 1 has normal priority and operation 3 has high priority, so 3 executes before 1.
Let’s take a look at the print:
The order is 3 -> 1 -> 4 -> 2, just as we analyzed.
Executing on a child thread, then refreshing the UI on the main thread
Finally, we execute a task on a child thread, then return to the main thread.
- (void)operationDemo2 {
    // Create the queue
    NSOperationQueue *queue = [[NSOperationQueue alloc] init];
    [queue addOperationWithBlock:^{
        NSLog(@"%@", [NSThread currentThread]);
        sleep(2);
        // Back to the main queue
        [[NSOperationQueue mainQueue] addOperationWithBlock:^{
            NSLog(@"%@", [NSThread currentThread]);
        }];
    }];
}
That looks fine. Now let's look at GCD.
GCD
GCD also manages the life cycle automatically and is one of the most common ways we deal with threads day to day. The core of GCD is "queue + execution mode": first create a queue, then add tasks to it, and the system executes the tasks according to the queue type and the way they were submitted.
A previous article, "iOS multithreading part one: the relationship between processes, threads, and queues", already covered sync + serial, sync + concurrent, async + serial, and async + concurrent, so this article covers the other aspects of GCD.
Executing on a child thread, then refreshing the UI on the main thread
The following code looks similar to the NSOperation version; this is also how we typically use it.
- (void)gcddemo1
{
dispatch_async(dispatch_get_global_queue(0, 0), ^{
sleep(2);
NSLog(@"%@",[NSThread currentThread]);
dispatch_async(dispatch_get_main_queue(), ^{
NSLog(@"%@",[NSThread currentThread]);
});
});
}
Here we used the global queue; a custom concurrent queue created with dispatch_queue_create("jj.com", DISPATCH_QUEUE_CONCURRENT) works just as well.
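For reference, a minimal sketch of the same flow on a custom concurrent queue (the method name is made up here):

- (void)gcddemoCustomQueue {
    // A custom concurrent queue behaves just like the global queue above
    dispatch_queue_t queue = dispatch_queue_create("jj.com", DISPATCH_QUEUE_CONCURRENT);
    dispatch_async(queue, ^{
        sleep(2);
        NSLog(@"background: %@", [NSThread currentThread]);
        dispatch_async(dispatch_get_main_queue(), ^{
            NSLog(@"main: %@", [NSThread currentThread]);
        });
    });
}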
dispatch_once
We can create a singleton with dispatch_once.
@implementation GCDDemo
+ (instancetype)shareInstance {
static GCDDemo *demo;
static dispatch_once_t onceToken;
dispatch_once(&onceToken, ^{
demo = [GCDDemo new];
});
return demo;
}
@end
The block passed to dispatch_once executes only once over the lifetime of the program, and it is thread-safe even when called from multiple threads.
dispatch_after
We can use dispatch_after to run code after a delay.
- (void)gcddemo2 {
    dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(0.5 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
        NSLog(@"2:%@", [NSThread currentThread]);
    });
    NSLog(@"1:%@", [NSThread currentThread]);
}
dispatch_after does not block the current thread; after the delay, the block is submitted to the main queue.
dispatch_barrier_async
dispatch_barrier_async is a barrier function. Since this is the asynchronous barrier, it does not block the current thread. So what does it do?
Let’s look at this example:
- (void)dispatch_barrier_async_request
{
dispatch_queue_t queue = dispatch_queue_create("jj.com", DISPATCH_QUEUE_CONCURRENT);
dispatch_async(queue, ^{
NSLog(@"2:%@", [NSThread currentThread]);
});
dispatch_async(queue, ^{
NSLog(@"3:%@", [NSThread currentThread]);
});
dispatch_barrier_async(queue, ^{
NSLog(@"4:%@", [NSThread currentThread]);
});
dispatch_async(queue, ^{
NSLog(@"5:%@", [NSThread currentThread]);
});
NSLog(@"1:%@", [NSThread currentThread]);
}
As an asynchronous barrier, it waits for the tasks submitted to the queue before it to finish, then runs its own block, and only then lets the tasks submitted after it execute. Let's take a look at the results:
It does not block the main thread, and dispatch_barrier_async really does act as a barrier.
Besides this usage, it also plays an important role in a multi-read, single-write data scheme. Let's look at the code:
@property (nonatomic, strong) dispatch_queue_t concurrent_queue;
@property (nonatomic, strong) NSMutableDictionary *dataCenterDic;
Start by defining a concurrent queue and a dictionary that can be accessed by multiple threads.
- (instancetype)init {
    self = [super init];
    if (self) {
        // Create the concurrent queue
        self.concurrent_queue = dispatch_queue_create("read_write_queue", DISPATCH_QUEUE_CONCURRENT);
        // Create the data dictionary
        self.dataCenterDic = [NSMutableDictionary dictionary];
    }
    return self;
}
After initialization, we add a read method and a write method.
#pragma mark - Read data
- (id)jj_objectForKey:(NSString *)key {
    __block id obj;
    // Read synchronously on the concurrent queue
    dispatch_sync(self.concurrent_queue, ^{
        obj = [self.dataCenterDic objectForKey:key];
    });
    return obj;
}

#pragma mark - Write data
- (void)jj_setObject:(id)obj forKey:(NSString *)key {
    // Write with an asynchronous barrier
    dispatch_barrier_async(self.concurrent_queue, ^{
        NSLog(@"write --%@", obj);
        [self.dataCenterDic setObject:obj forKey:key];
    });
}
From outside, we call the read and write methods concurrently.
- (void)readWriteLock {
    dispatch_queue_t queue = dispatch_queue_create("jj.com", DISPATCH_QUEUE_CONCURRENT);
    for (int i = 0; i < 5; i++) {
        dispatch_async(queue, ^{
            [self.rwLock jj_setObject:[NSString stringWithFormat:@"jj---%d", i] forKey:@"key"];
        });
    }
    for (int i = 0; i < 5; i++) {
        dispatch_async(queue, ^{
            NSLog(@"read 1--%@", [self.rwLock jj_objectForKey:@"key"]);
        });
    }
}
Let’s take a look at the print:
The read and write output is correct, so this acts as a read-write lock. You could also use pthread_rwlock, which will be covered in more detail in the next article.
dispatch_group
Dispatch groups are generally used to run several asynchronous tasks and then, once they have all finished, run a final task in dispatch_group_notify.
- (void)dispatch_group_request {
    // Create a dispatch group
    dispatch_group_t group = dispatch_group_create();
    // Create a concurrent queue
    dispatch_queue_t queue = dispatch_queue_create("jj.com", DISPATCH_QUEUE_CONCURRENT);
    // Task 1
    dispatch_group_async(group, queue, ^{
        // Simulate a delay
        sleep(1);
        NSLog(@"1--%@", [NSThread currentThread]);
    });
    // Task 2
    dispatch_group_async(group, queue, ^{
        // Simulate a delay
        sleep(1);
        NSLog(@"2--%@", [NSThread currentThread]);
    });
    // Runs after the group's tasks have finished
    dispatch_group_notify(group, dispatch_get_main_queue(), ^{
        NSLog(@"3--%@", [NSThread currentThread]);
    });
    NSLog(@"0--%@", [NSThread currentThread]);
}
Let’s look at the output:
The dispatch_group_notify task is executed after the first two tasks are completed.
Many people then wonder: what if, inside dispatch_group_async, I call a third-party networking framework to make an asynchronous request? The request runs on the framework's own queue, so my dispatch_group_async block finishes before the response comes back, and dispatch_group_notify fires too early.
In that case we pair the group with dispatch_group_enter and dispatch_group_leave.
A dispatch_group_enter / dispatch_group_leave pair is equivalent to dispatch_group_async; they are just used differently, as shown below.
- (void)dispatch_group_request1 {
    // Create a dispatch group
    dispatch_group_t group = dispatch_group_create();
    // Create a concurrent queue
    dispatch_queue_t queue = dispatch_queue_create("jj.com", DISPATCH_QUEUE_CONCURRENT);
    // Enter the group
    dispatch_group_enter(group);
    // Task 1
    dispatch_async(queue, ^{
        sleep(1);
        NSLog(@"1--%@", [NSThread currentThread]);
        // Leave the group
        dispatch_group_leave(group);
    });
    // Enter the group
    dispatch_group_enter(group);
    // Task 2
    dispatch_async(queue, ^{
        sleep(1);
        NSLog(@"2--%@", [NSThread currentThread]);
        // Leave the group
        dispatch_group_leave(group);
    });
    dispatch_group_notify(group, dispatch_get_main_queue(), ^{
        NSLog(@"3--%@", [NSThread currentThread]);
    });
    NSLog(@"0--%@", [NSThread currentThread]);
}
Let’s take a look at the print result first:
dispatch_group_enter and dispatch_group_leave must be used in pairs. After an enter, dispatch_group_notify will not fire until the matching dispatch_group_leave has been called. A semaphore can actually achieve the same effect.
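To connect this back to the networking scenario above, here is a minimal sketch of the enter/leave pattern around a callback-based request. fetchDataWithURL:completion: is a hypothetical asynchronous API standing in for whatever your networking framework provides.

- (void)dispatch_group_network_request {
    dispatch_group_t group = dispatch_group_create();
    for (NSString *url in @[@"url1", @"url2"]) {
        // Enter the group before starting the asynchronous request
        dispatch_group_enter(group);
        // fetchDataWithURL:completion: is a hypothetical async API, used only for illustration
        [self fetchDataWithURL:url completion:^(id response) {
            NSLog(@"finished %@: %@", url, response);
            // Leave only when the callback fires, not when this block is submitted
            dispatch_group_leave(group);
        }];
    }
    // Fires only after every enter has been balanced by a leave
    dispatch_group_notify(group, dispatch_get_main_queue(), ^{
        NSLog(@"all requests finished");
    });
}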
dispatch_semaphore
A dispatch_semaphore is created with an initial value. That value controls which threads proceed and which wait, and it can also be used to cap the number of concurrent GCD tasks. With an initial value of 1 it acts like a synchronization lock, which will be explained in the next article.
- dispatch_semaphore_create: creates a semaphore and sets its initial value (the total number of signals).
- dispatch_semaphore_signal: sends a signal, increasing the semaphore's value by 1.
- dispatch_semaphore_wait: if the semaphore's value is greater than 0, execution continues and the value is decreased by 1; if the value is 0, it waits (blocking the current thread) until the value becomes greater than 0 again.
Let's look at an example where two tasks must finish before a third one runs.
- (void)dispatch_semaphore_t_request {
    // Create a semaphore with an initial value of 0
    dispatch_semaphore_t sema = dispatch_semaphore_create(0);
    // Create a concurrent queue
    dispatch_queue_t queue = dispatch_queue_create("jj.com", DISPATCH_QUEUE_CONCURRENT);
    dispatch_async(queue, ^{
        sleep(1);
        NSLog(@"1--%@", [NSThread currentThread]);
        // Signal: +1
        dispatch_semaphore_signal(sema);
    });
    dispatch_async(queue, ^{
        sleep(1);
        NSLog(@"2--%@", [NSThread currentThread]);
        // Signal: +1
        dispatch_semaphore_signal(sema);
    });
    dispatch_async(dispatch_get_main_queue(), ^{
        // Wait for the two signals
        dispatch_semaphore_wait(sema, DISPATCH_TIME_FOREVER);
        dispatch_semaphore_wait(sema, DISPATCH_TIME_FOREVER);
        NSLog(@"0--%@", [NSThread currentThread]);
    });
}
Let’s take a look at the print result first:
It indeed ensures that the two tasks complete before the final task runs.
Tasks 1 and 2 are dispatched asynchronously and each sleeps for 1 second before signalling. Meanwhile the block on the main queue reaches dispatch_semaphore_wait while the semaphore's value is still 0, so it has to wait.
When task 1 (or task 2) calls dispatch_semaphore_signal, the value goes up to 1, one wait returns and brings it back to 0; after the second signal the second wait returns and task 0 can continue.
It does not matter whether task 1 or task 2 finishes first: the final task must wait for both, because each dispatch_semaphore_wait is matched one-to-one with a dispatch_semaphore_signal.
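As mentioned earlier, a semaphore created with a value greater than 0 can also cap concurrency. A sketch, assuming we want at most 2 tasks running at the same time (the method name and task contents are made up here):

- (void)semaphoreLimitDemo {
    // Initial value 2: at most 2 tasks run at the same time
    dispatch_semaphore_t sema = dispatch_semaphore_create(2);
    dispatch_queue_t queue = dispatch_queue_create("jj.com", DISPATCH_QUEUE_CONCURRENT);
    for (int i = 0; i < 6; i++) {
        dispatch_async(queue, ^{
            // Wait for a free slot (-1), blocking this worker thread if none is free
            dispatch_semaphore_wait(sema, DISPATCH_TIME_FOREVER);
            NSLog(@"task %d on %@", i, [NSThread currentThread]);
            sleep(1);
            // Release the slot (+1)
            dispatch_semaphore_signal(sema);
        });
    }
}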
dispatch_source
The most common everyday use of dispatch_source is the timer. A dispatch_source timer is not affected by the RunLoop; it is a system-level event source, triggered by the system with high precision.
- (void)dispatch_source_request {
    if (_timer) {
        // The timer is already running
        return;
    }
    self.timer = dispatch_source_create(DISPATCH_SOURCE_TYPE_TIMER, 0, 0, dispatch_get_global_queue(0, 0));
    // Start now, fire every 1 second, no leeway
    dispatch_source_set_timer(_timer, DISPATCH_TIME_NOW, 1 * NSEC_PER_SEC, 0 * NSEC_PER_SEC);
    dispatch_source_set_event_handler(_timer, ^{
        if (self.timeout <= 0) {
            // Countdown finished, cancel the timer
            dispatch_source_cancel(self.timer);
            self.timer = nil;
        } else {
            dispatch_async(dispatch_get_main_queue(), ^{
                self.timeout--;
                NSLog(@"countdown --%ld", self.timeout);
            });
        }
    });
    // Start the timer
    dispatch_resume(_timer);
    // Suspend
    // dispatch_suspend(timer);
}
Conclusion
- NSThread: object-oriented and easy to use, lets you manipulate thread objects directly, but the life cycle must be managed manually.
- NSOperation: a wrapper built on top of GCD, more object-oriented to use, supports dependencies, priorities, and a maximum concurrency count, and manages the life cycle automatically.
- GCD: designed to replace thread technologies such as NSThread, offers flexible control of threads and queues along with other powerful features, and manages the life cycle automatically.
References
- iOS advanced multithreading: NSThread in detail
- iOS multithreading: a summary of NSOperation and NSOperationQueue
- iOS multithreading: an exhaustive summary of GCD