
What is a lag

A lag is what we usually call a dropped frame or a stutter on screen. The CPU computes the content to be displayed, the GPU renders it, the rendered result is written into the frame buffer (framebuffer), and the video controller reads the framebuffer and sends it to the monitor for display.

If the CPU is busy, the video controller would waste time waiting on a single buffer. So the system introduces a front buffer and a back buffer, the double-buffering mechanism. In the interval between two frames, if a frame has not finished rendering, it is discarded, i.e. the frame is dropped. The switch between the two buffers is driven by the vertical sync signal (VSync): whatever finishes within that interval is displayed normally, and whatever does not finish is dropped.
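As a rough sketch (assuming a 60 Hz display; the names are illustrative), the budget each frame has between two VSync signals is:

```objc
// A minimal sketch, assuming a 60 Hz display (the names are illustrative).
static double FrameBudgetSeconds(double refreshRate) {
    return 1.0 / refreshRate;   // at 60 Hz this is ≈ 0.0167 s, i.e. ~16.7 ms per frame
}
// If the CPU + GPU work for one frame exceeds this budget, the frame misses the
// next VSync and is dropped, which the user perceives as a lag.
```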

**CADisplayLink** lag detection

YYFPSLabel in YYKit uses CADisplayLink to measure FPS, because CADisplayLink is a timer driven by VSync; the refresh rate is usually 60 Hz.

```objc
_link = [CADisplayLink displayLinkWithTarget:[YYWeakProxy proxyWithTarget:self]
                                    selector:@selector(tick:)];
[_link addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
```
```objc
- (void)tick:(CADisplayLink *)link {
    if (_lastTime == 0) {           // first tick: just record the start timestamp
        _lastTime = link.timestamp;
        return;
    }
    _count++;
    NSTimeInterval delta = link.timestamp - _lastTime;
    if (delta < 1) return;          // accumulate ticks for about one second
    _lastTime = link.timestamp;
    float fps = _count / delta;     // frames actually drawn per second
    _count = 0;
    // ... update the on-screen label with fps
}
```
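The YYWeakProxy above exists because CADisplayLink retains its target, which would otherwise create a retain cycle with the view or controller. A minimal sketch of such a weakly-forwarding proxy (illustrative only, not YYKit's actual implementation) could look like this:

```objc
#import <Foundation/Foundation.h>

// A minimal weak-proxy sketch (not YYKit's implementation): it holds the real
// target weakly and forwards every message to it, so CADisplayLink does not
// keep the target alive.
@interface WeakProxy : NSProxy {
    __weak id _target;
}
+ (instancetype)proxyWithTarget:(id)target;
@end

@implementation WeakProxy
+ (instancetype)proxyWithTarget:(id)target {
    WeakProxy *proxy = [WeakProxy alloc];   // NSProxy has no -init
    proxy->_target = target;
    return proxy;
}
- (NSMethodSignature *)methodSignatureForSelector:(SEL)sel {
    // Fall back to a dummy "void, self, _cmd" signature once the target is gone
    return [_target methodSignatureForSelector:sel]
        ?: [NSMethodSignature signatureWithObjCTypes:"v@:"];
}
- (void)forwardInvocation:(NSInvocation *)invocation {
    if (_target) [invocation invokeWithTarget:_target];
}
@end
```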

RunLoop lag detection

  1. First add an Observer to the main thread's RunLoop together with a callback (CallBack). In the callback, record the current activity and **send a signal** with dispatch_semaphore_signal (+1).
```objc
- (void)registerObserver {
    CFRunLoopObserverContext context = {0, (__bridge void *)self, NULL, NULL, NULL};
    // NSIntegerMax: lowest priority
    CFRunLoopObserverRef observer = CFRunLoopObserverCreate(kCFAllocatorDefault,
                                                            kCFRunLoopAllActivities,
                                                            YES,
                                                            NSIntegerMax,
                                                            &CallBack,
                                                            &context);
    CFRunLoopAddObserver(CFRunLoopGetMain(), observer, kCFRunLoopCommonModes);
}
```

```objc
static void CallBack(CFRunLoopObserverRef observer, CFRunLoopActivity activity, void *info)
{
    LGBlockMonitor *monitor = (__bridge LGBlockMonitor *)info;
    monitor->activity = activity;
    // Send a signal
    dispatch_semaphore_t semaphore = monitor->_semaphore;
    dispatch_semaphore_signal(semaphore);
}
```
  2. Start a resident while loop on a child thread and create a semaphore whose dispatch_semaphore_wait timeout is 1s. If the main thread is not stalled, the wait on the child thread is answered almost immediately by the main thread's signal and the return value is 0; a non-zero return value indicates a stall.
```objc
- (void)startMonitor {
    // Create the semaphore
    _semaphore = dispatch_semaphore_create(0);
    // Watch the main thread from a background queue
    dispatch_async(dispatch_get_global_queue(0, 0), ^{
        while (YES) {
            // Wait at most 1 second. st != 0 means the semaphore was not signaled in time,
            // i.e. the main RunLoop did not finish its work within 1 second.
            long st = dispatch_semaphore_wait(self->_semaphore,
                                              dispatch_time(DISPATCH_TIME_NOW, 1 * NSEC_PER_SEC));
            if (st != 0) {
                if (self->activity == kCFRunLoopBeforeSources || self->activity == kCFRunLoopAfterWaiting) {
                    if (++self->_timeoutCount < 2) {
                        NSLog(@"timeoutCount==%lu", (unsigned long)self->_timeoutCount);
                        continue;
                    }
                    // A stall of about one second is very likely a real freeze;
                    // only report after two consecutive timeouts to avoid flooding the log.
                    NSLog(@"More than two consecutive freezes detected");
                }
            }
            self->_timeoutCount = 0;
        }
    });
}
```

WeChat's lag detection tool

The addMonitorThread method in Matrix works in much the same way as the RunLoop approach above.

How to solve the problem

  1. Pre-layout: after the network request returns, compute the layout (e.g. text sizes and cell heights) from the data in advance; see the sketch after this list
  2. Pre-decode & pre-render (when loading images)
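A minimal pre-layout sketch (the Feed model, its fields, and the padding values are assumptions made for illustration): the cell height is computed once on a background queue right after the response is parsed, so heightForRowAtIndexPath: only returns a cached number.

```objc
#import <UIKit/UIKit.h>

// Assumed model for illustration only.
@interface Feed : NSObject
@property (nonatomic, copy)   NSString *text;
@property (nonatomic, assign) CGFloat cellHeight;   // pre-computed layout result
@end
@implementation Feed
@end

// Compute all cell heights off the main thread, then call back on the main queue.
static void PreLayoutFeeds(NSArray<Feed *> *feeds, CGFloat cellWidth, dispatch_block_t completion) {
    dispatch_async(dispatch_get_global_queue(QOS_CLASS_UTILITY, 0), ^{
        for (Feed *feed in feeds) {
            CGRect bounds = [feed.text boundingRectWithSize:CGSizeMake(cellWidth - 24, CGFLOAT_MAX)
                                                    options:NSStringDrawingUsesLineFragmentOrigin
                                                 attributes:@{NSFontAttributeName: [UIFont systemFontOfSize:15]}
                                                    context:nil];
            feed.cellHeight = ceil(CGRectGetHeight(bounds)) + 24;   // text height + padding
        }
        dispatch_async(dispatch_get_main_queue(), completion);      // e.g. reload the table view
    });
}
```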

UIImageView is, in effect, a model of the image-loading flow.

We can look at the third-party library SDWebImage: after the download task succeeds, the result is delivered in this callback:

```objc
- (void)URLSession:(NSURLSession *)session task:(NSURLSessionTask *)task didCompleteWithError:(NSError *)error
```

After the result comes back, SDWebImage decodes the image asynchronously on its coding queue (a background thread) via SDImageLoaderDecodeImageData. Why do this? When we call [UIImage imageWithData:], the system decodes the data on the main thread by default. In practice most of the lag time is spent on loading images, so SDWebImage moves the decoding work onto a background thread.
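The pre-decoding idea can be sketched without SDWebImage: draw the image once into a bitmap on a background queue to force decompression, and hand only the decoded image to the main thread. A minimal sketch (not SDWebImage's actual SDImageLoaderDecodeImageData):

```objc
#import <UIKit/UIKit.h>

// A minimal pre-decoding sketch (illustrative, not SDWebImage's implementation):
// drawing the image into a renderer forces decompression here, instead of lazily
// on the main thread when the image is first displayed.
static void DecodeImageAsync(NSData *data, void (^completion)(UIImage *decoded)) {
    dispatch_async(dispatch_get_global_queue(QOS_CLASS_USER_INITIATED, 0), ^{
        UIImage *image = [UIImage imageWithData:data];
        UIImage *decoded = image;
        if (image) {
            UIGraphicsImageRenderer *renderer =
                [[UIGraphicsImageRenderer alloc] initWithSize:image.size];
            decoded = [renderer imageWithActions:^(UIGraphicsImageRendererContext *ctx) {
                [image drawInRect:CGRectMake(0, 0, image.size.width, image.size.height)];
            }];
        }
        dispatch_async(dispatch_get_main_queue(), ^{
            completion(decoded);   // e.g. imageView.image = decoded;
        });
    });
}
```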

Image encoding and decoding can also be extended through codec plugins; for audio and video there is FFmpeg.

  3. Load on demand: when cells are scrolling quickly, process only the data of the currently visible cells
  4. Asynchronous rendering: move drawing operations onto a background queue, so the main thread is only responsible for displaying the result (see the sketch after this list)
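A minimal asynchronous-rendering sketch (the function name and layout are illustrative; production code such as YYAsyncLayer also handles cancellation when the content changes): the expensive drawing runs on a background queue and the main thread only assigns the finished bitmap to layer.contents.

```objc
#import <UIKit/UIKit.h>

// Draw the text into an off-screen bitmap on a background queue,
// then hand the result to the layer on the main thread.
static void AsyncRenderText(NSAttributedString *text, CALayer *layer, CGSize size) {
    dispatch_async(dispatch_get_global_queue(QOS_CLASS_USER_INITIATED, 0), ^{
        UIGraphicsImageRenderer *renderer = [[UIGraphicsImageRenderer alloc] initWithSize:size];
        UIImage *image = [renderer imageWithActions:^(UIGraphicsImageRendererContext *ctx) {
            [[UIColor whiteColor] setFill];
            [ctx fillRect:CGRectMake(0, 0, size.width, size.height)];   // background
            [text drawInRect:CGRectMake(0, 0, size.width, size.height)]; // expensive text drawing
        }];
        dispatch_async(dispatch_get_main_queue(), ^{
            layer.contents = (__bridge id)image.CGImage;   // display only, no drawing on main
        });
    });
}
```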

UIView & CALayer

What does Commit Transaction do

  1. Layout: build the view frames; [UIView layoutSubviews] and [CALayer layoutSublayers] are traversed here
  2. Display: draw the content; drawRect: and displayLayer: (bitmap drawing)
  3. Prepare: additional Core Animation work, such as image decoding
  4. Commit: package the layers and send them to the Render Server
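These four steps run when a transaction is committed: layer property changes are collected into an implicit transaction that is committed at the end of the current runloop turn. An explicit CATransaction makes that boundary visible (a small sketch; the function name is illustrative):

```objc
#import <QuartzCore/QuartzCore.h>

// A small sketch: layer property changes are recorded into a transaction, and the
// layout / display / prepare / commit steps run when that transaction is committed
// (implicitly at the end of the runloop turn, or explicitly as below).
static void MoveLayer(CALayer *layer) {
    [CATransaction begin];
    [CATransaction setDisableActions:YES];    // suppress the implicit animation
    layer.position = CGPointMake(100, 100);   // recorded in the current transaction
    [CATransaction commit];                   // layer tree is packaged and sent to the Render Server
}
```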