
UIView rendering

The core of iOS view rendering is Core Animation, whose rendering hierarchy runs, in order: layer tree -> presentation tree -> render tree

CPU phase

  1. Layout (frame calculation)
  2. Display (Core Graphics drawing)
  3. Prepare (QuartzCore/Core Animation)
  4. Commit via IPC (the packaged layer tree and animation properties)
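
The Layout and Display steps map onto methods you can override on a view. A minimal Swift sketch (the class name is illustrative, not from the article); note that merely implementing draw(_:) is what forces the Core Graphics/backing-store path described above:

    import UIKit

    final class BadgeView: UIView {

        // Layout step: the CPU computes frames here before anything is drawn.
        override func layoutSubviews() {
            super.layoutSubviews()
            // position subviews (omitted)
        }

        // Display step: overriding draw(_:) makes Core Animation allocate a
        // CPU-backed bitmap (backing store) for this layer, which is then
        // filled with Core Graphics on the CPU.
        override func draw(_ rect: CGRect) {
            guard let ctx = UIGraphicsGetCurrentContext() else { return }
            ctx.setFillColor(UIColor.systemRed.cgColor)
            ctx.fillEllipse(in: rect.insetBy(dx: 2, dy: 2))
        }
    }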

OpenGL ES stage

  1. Generate
  2. Bind
  3. Buffer Data
  4. Enable
  5. Set Pointers
  6. Draw
  7. Delete
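
These seven steps map directly onto the GL buffer-object API (OpenGL ES is deprecated in favor of Metal on modern iOS, but the calls still illustrate the flow). A minimal sketch of one vertex buffer's lifecycle, assuming a valid EAGLContext is already current; all identifiers are illustrative:

    import OpenGLES

    // Triangle vertex positions (x, y); OpenGL ES works in floating point.
    let vertices: [GLfloat] = [0, 1,  -1, -1,  1, -1]

    var vbo: GLuint = 0
    glGenBuffers(1, &vbo)                                    // 1. Generate
    glBindBuffer(GLenum(GL_ARRAY_BUFFER), vbo)               // 2. Bind
    vertices.withUnsafeBytes { buf in                        // 3. Buffer Data
        glBufferData(GLenum(GL_ARRAY_BUFFER), buf.count, buf.baseAddress,
                     GLenum(GL_STATIC_DRAW))
    }
    glEnableVertexAttribArray(0)                             // 4. Enable
    glVertexAttribPointer(0, 2, GLenum(GL_FLOAT),            // 5. Set Pointers
                          GLboolean(GL_FALSE), 0, nil)
    glDrawArrays(GLenum(GL_TRIANGLES), 0, 3)                 // 6. Draw
    glDeleteBuffers(1, &vbo)                                 // 7. Delete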

GPU phase

  1. Receive the submitted textures and vertex descriptions (triangles)
  2. Apply transforms
  3. Composite and blend layers (off-screen rendering, etc.)
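
Step 3 is where most GPU cost shows up in practice. A few CALayer configurations that commonly trigger off-screen rendering or extra blending (an illustrative snippet; the exact behaviour varies across iOS versions):

    import UIKit

    let avatar = UIImageView(image: UIImage(named: "avatar"))

    // Corner rounding combined with clipping can force an off-screen pass
    // when the layer's content has to be masked.
    avatar.layer.cornerRadius = 8
    avatar.layer.masksToBounds = true

    // A shadow without an explicit path is rendered off-screen; supplying
    // shadowPath is the usual mitigation.
    avatar.layer.shadowOpacity = 0.4
    avatar.layer.shadowPath = UIBezierPath(rect: avatar.bounds).cgPath

    // A non-opaque layer must be blended with whatever is behind it;
    // marking it opaque with a solid background avoids that blending.
    avatar.isOpaque = true
    avatar.backgroundColor = .white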

Conclusion

  1. First, the CPU performs the view's frame layout and prepares the hierarchy of views and layers, checking whether drawRect: or drawLayer:inContext: is overridden.
  2. The CPU packages the processed view and layer hierarchy and submits it via an IPC (inter-process communication) channel to the render server, which consists of OpenGL ES and the GPU.
  3. The render server first hands the layer data to OpenGL ES for texture generation and shading, producing the front and back frame buffers; these are then swapped according to the display hardware's refresh rate, normally paced by the VSync signal and CADisplayLink.
  4. Finally, the back frame buffer that is about to be displayed is handed to the GPU, which gathers the images and shapes, applies transforms and textures, and blends the layers; the result then appears on the screen.

The core principles of rendering on iOS therefore revolve around the frame buffers, the VSync signal, and CADisplayLink.

Core Animation

Core Animation is not just "animation", as the name suggests; it is the core of the entire display pipeline, living inside the QuartzCore framework.

Core Animation relies on OpenGL ES for GPU rendering and on Core Graphics for CPU rendering.

Core Animation registers an Observer in the RunLoop that listens for the BeforeWaiting and Exit events. This Observer has an order of 2,000,000, a lower priority than other common observers, so it runs after them. When a touch event arrives, the RunLoop is woken up and the app's code performs operations such as creating and adjusting the view hierarchy, setting a UIView's frame, changing a CALayer's opacity, or adding an animation to a view. These changes are captured by CALayer and recorded as intermediate state via CATransaction (the CATransaction documentation touches on this, but not completely). Once all of this work is done and the RunLoop is about to go to sleep (or exit), the observers are notified; in its callback, the Observer registered by Core Animation merges all the intermediate state and submits it to the GPU for display. If there is an animation, Core Animation triggers this process repeatedly through mechanisms such as CADisplayLink.
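
For illustration, registering such an observer with the public CFRunLoop API looks roughly like this (a hedged sketch of the mechanism, not Core Animation's actual implementation):

    import Foundation

    // Watch the same activities Core Animation is described as observing.
    let activities: CFOptionFlags =
        CFRunLoopActivity.beforeWaiting.rawValue | CFRunLoopActivity.exit.rawValue

    // A large order value (2,000,000) means this observer runs after most others.
    let observer = CFRunLoopObserverCreateWithHandler(
        kCFAllocatorDefault, activities, true, 2_000_000) { _, _ in
        // This is roughly where Core Animation merges the intermediate layer
        // state and commits it (the public equivalent is CATransaction.flush()).
    }
    CFRunLoopAddObserver(CFRunLoopGetMain(), observer, .commonModes)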

CPU rendering function

CPU-side rendering work falls mainly into the following five areas:

  1. Layout calculation
  2. Lazy view loading
  3. Core Graphics drawing: if you implement drawRect: or drawLayer:inContext: on a view, or the corresponding CALayerDelegate methods, there is a large performance cost before you draw anything. To support arbitrary drawing of layer content, Core Animation must create a backing image in memory comparable in size to the view; once drawing finishes, that image data has to be sent to the render server via IPC. On top of that, Core Graphics drawing is slow, so it is a poor choice in performance-critical paths.
  4. Image decompression (see the sketch after this list)
  5. Layer packaging
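
Image decompression normally happens lazily on the main thread the first time an image is drawn. A common mitigation is to force decoding in the background by drawing the image into a bitmap context yourself; a hedged sketch (the function name and queue are illustrative):

    import UIKit

    /// Force-decodes a UIImage off the main thread so the render pass
    /// does not pay the decompression cost later.
    func decodeInBackground(_ image: UIImage, completion: @escaping (UIImage) -> Void) {
        DispatchQueue.global(qos: .userInitiated).async {
            let renderer = UIGraphicsImageRenderer(size: image.size)
            let decoded = renderer.image { _ in
                image.draw(at: .zero)   // drawing forces decompression
            }
            DispatchQueue.main.async { completion(decoded) }
        }
    }

On iOS 15 and later, UIImage's prepareForDisplay(completionHandler:) provides the same effect without hand-rolled drawing.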

OpenGL ES rendering functions

In simple terms, OpenGL ES shades layers, samples them, generates textures, binds data, and produces the front and back frame buffers.
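
Generating a texture and binding data reduce to a handful of GL calls. A minimal sketch, assuming a current EAGLContext and some raw RGBA pixel data (all values illustrative):

    import OpenGLES

    // Raw RGBA8 pixels for a 2x2 texture, purely illustrative.
    let pixels: [GLubyte] = Array(repeating: 255, count: 2 * 2 * 4)

    var texture: GLuint = 0
    glGenTextures(1, &texture)                               // generate
    glBindTexture(GLenum(GL_TEXTURE_2D), texture)            // bind
    glTexParameteri(GLenum(GL_TEXTURE_2D),
                    GLenum(GL_TEXTURE_MIN_FILTER), GL_LINEAR)
    glTexImage2D(GLenum(GL_TEXTURE_2D), 0, GL_RGBA, 2, 2, 0, // upload pixel data
                 GLenum(GL_RGBA), GLenum(GL_UNSIGNED_BYTE), pixels)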

GPU rendering function

The GPU composites the data held in the generated front and back frame buffers as the situation requires. GPU load typically comes from off-screen rendering, layer blending, and lazy loading.

Front and back frame buffers & VSync signal

The iOS display is driven by the VSync signal, which is generated by the hardware clock and emitted 60 times per second (hardware-dependent; about 59.97 Hz on a real iPhone). After the iOS graphics service receives the VSync signal, it notifies the app via IPC. At startup the app's RunLoop registers a CFRunLoopSource that receives the clock-signal notification through a mach_port, and that Source's callback then drives all of the app's animation and display.
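
At the application level, CADisplayLink is the public way to hook into this VSync-paced callback; a minimal sketch (the class and selector names are illustrative):

    import UIKit

    final class AnimationDriver {
        private var displayLink: CADisplayLink?

        func start() {
            // The callback fires once per screen refresh, in step with VSync.
            let link = CADisplayLink(target: self, selector: #selector(step))
            link.add(to: .main, forMode: .common)
            displayLink = link
        }

        @objc private func step(_ link: CADisplayLink) {
            // Drive per-frame work here, e.g. advance an animation by
            // link.targetTimestamp - link.timestamp.
        }

        func stop() {
            // CADisplayLink retains its target; invalidate() breaks the link.
            displayLink?.invalidate()
            displayLink = nil
        }
    }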

Frame buffer: a buffer that receives rendering results; it designates the memory area where the GPU stores those results.

Multiple frame buffers can exist at the same time, but the pixels shown on screen are controlled by the color values stored in one particular frame buffer, the front frame buffer. The program's rendering results are usually written to other frame buffers, including the back frame buffer; once rendering into the back frame buffer is complete, the front and back buffers are swapped (the OS performs the swap).

The front frame buffer determines the colors of the pixels displayed on screen and is swapped with the back frame buffer at the appropriate moment.
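
For an OpenGL ES-backed layer, this swap is what presenting the renderbuffer accomplishes: the app draws into the back buffer and then asks the context to put it on screen. A hedged sketch, assuming an EAGLContext and a color renderbuffer whose storage comes from a CAEAGLLayer (both names are illustrative):

    import OpenGLES

    func presentFrame(context: EAGLContext, colorRenderbuffer: GLuint) {
        // ... draw calls targeting the back buffer go here ...
        glBindRenderbuffer(GLenum(GL_RENDERBUFFER), colorRenderbuffer)
        // Hands the finished back buffer to the compositor; front and back
        // buffers are effectively swapped for this layer.
        _ = context.presentRenderbuffer(Int(GL_RENDERBUFFER))
    }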

Core Animation's compositor combines the OpenGL ES layers, UIView layers, status bar layer, and so on, blends them into the back frame buffer to produce the final colors, and then swaps the back frame buffer forward. OpenGL ES stores coordinates as floating-point values; vertex data of other types is converted to floating point as well.