Preface:

Recently, I studied the rendering process and principle of GPU and App.

First of all, thanks to the QiShare team for their guidance and support, and to Pengge (@snow) for reviewing and helping with this article.

Next, let’s begin our journey of discovery today.


1. First, let's talk about the GPU

The GPU (Graphics Processing Unit):

Also known as the graphics processor, it is the “core” of the graphics card.

It is mainly responsible for graphics computation; with its highly parallel architecture, it computes and displays an image pixel by pixel.

Simply put, the GPU works by converting “3D coordinates” into “2D coordinates”, and then converting the “2D coordinates” into “actual colored pixels”.

The GPU's concrete pipeline can be divided into “six stages”, as follows:

Vertex Shader => Shape Assembly => Geometry Shader => Rasterization => Fragment Shader => Tests and Blending

  • Stage 1: Vertex Shader

The input of this stage is Vertex Data, a collection of vertices. The main purpose of the vertex shader is to convert 3D coordinates into “2D” coordinates, and it can also do some basic processing on vertex attributes. (In short: it determines the points of the shape.)
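As a rough illustration of that 3D-to-2D conversion, here is a toy perspective projection (not any particular graphics API; the camera is assumed to look down the positive z axis, and `focal_length` is an invented parameter):

```python
def project_vertex(x, y, z, focal_length=1.0):
    """Toy perspective projection: map a 3D point to 2D screen space.

    Points farther away (larger z) land closer to the screen center,
    which is the essence of the 3D -> 2D conversion described above.
    """
    if z <= 0:
        raise ValueError("point must be in front of the camera")
    scale = focal_length / z
    return (x * scale, y * scale)

print(project_vertex(2.0, 1.0, 2.0))  # → (1.0, 0.5)
print(project_vertex(2.0, 1.0, 4.0))  # → (0.5, 0.25), same point pushed farther away
```

A real vertex shader does this with 4x4 matrices and a perspective divide, but the effect sketched here is the same: each vertex's 3D position becomes a 2D screen position.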

  • Stage 2: Shape Assembly

This stage takes all the vertices output by the vertex shader as input and assembles them into the shapes of the specified primitives. A primitive describes how to render vertex data: as points, lines, or triangles. This stage is also called primitive assembly. (In short: it determines the lines of the shape.)
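A minimal sketch of the assembly idea, assuming triangle primitives (the function name and data layout are invented for illustration):

```python
def assemble_triangles(vertices):
    """Group a flat vertex list into triangle primitives (3 vertices each).

    This mirrors what primitive assembly does with the vertex shader's
    output: raw points become shapes the later stages can rasterize.
    """
    if len(vertices) % 3 != 0:
        raise ValueError("triangle primitives need a multiple of 3 vertices")
    return [tuple(vertices[i:i + 3]) for i in range(0, len(vertices), 3)]

quad = [(0, 0), (1, 0), (1, 1),   # first triangle
        (0, 0), (1, 1), (0, 1)]   # second triangle
print(assemble_triangles(quad))   # two triangles covering a unit square
```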

  • Stage 3: Geometry Shader

This stage takes a set of vertices in the form of a primitive as input, and can construct entirely new primitives (or other primitives) from them to generate other geometric shapes. (In short: it determines the triangles that form the geometry.)

  • Stage 4: Rasterization

This stage maps primitives to the corresponding pixels on the final screen, generating fragments. A fragment contains all the data needed to render one pixel. (In short: it turns the image into actual screen pixels.)
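The primitive-to-fragment mapping can be sketched with a toy rasterizer: walk the triangle's bounding box and keep the integer pixels that pass an edge test (a simplified model; real GPUs do this massively in parallel with sub-pixel precision):

```python
def rasterize_triangle(v0, v1, v2):
    """Toy rasterizer: map a triangle to the integer pixels it covers.

    Each covered pixel becomes a fragment that the fragment shader
    will later color. Vertices are assumed counter-clockwise.
    """
    def edge(a, b, p):
        # Signed area test: which side of edge a->b does point p lie on?
        return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

    xs = [v0[0], v1[0], v2[0]]
    ys = [v0[1], v1[1], v2[1]]
    fragments = []
    for y in range(min(ys), max(ys) + 1):          # scan the bounding box
        for x in range(min(xs), max(xs) + 1):
            p = (x, y)
            w0, w1, w2 = edge(v1, v2, p), edge(v2, v0, p), edge(v0, v1, p)
            if w0 >= 0 and w1 >= 0 and w2 >= 0:    # inside all three edges
                fragments.append(p)
    return fragments

print(rasterize_triangle((0, 0), (3, 0), (0, 3)))  # → 10 pixels covering the triangle
```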

  • Stage 5: Fragment Shader

This stage first clips the input fragments: clipping discards all pixels outside the view, which improves execution efficiency. Then the fragment shader colors each remaining fragment. (In short: it colors the screen pixels.)

  • Stage 6: Tests and Blending

This stage checks the fragment's depth value (the z coordinate) to determine whether the pixel is in front of or behind the pixels of other layers, and whether it should be discarded. It also checks the alpha value, which defines a pixel's transparency, and blends the layers accordingly. (In short: it checks layer depth and transparency, and blends the layers.)
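The depth test can be modeled in a few lines (a toy model only: the framebuffer is just a dict keyed by pixel, and the convention that smaller z means closer to the viewer is an assumption for illustration):

```python
def depth_test(framebuffer, x, y, z, color):
    """Toy depth test: keep a fragment only if it is nearer than what is
    already stored at that pixel (smaller z = closer to the viewer here).
    """
    stored_z, _ = framebuffer.get((x, y), (float("inf"), None))
    if z < stored_z:
        framebuffer[(x, y)] = (z, color)  # fragment wins: replace the pixel
        return True
    return False                          # fragment is behind: discard it

fb = {}
depth_test(fb, 0, 0, 0.8, "blue")   # first fragment always lands
depth_test(fb, 0, 0, 0.3, "red")    # nearer fragment overwrites it
depth_test(fb, 0, 0, 0.5, "green")  # farther fragment is discarded
print(fb[(0, 0)])  # → (0.3, 'red')
```

This is why a fragment that was fully shaded can still never reach the screen: something nearer already owns the pixel.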

(PS: This is a key point that I will address in the upcoming “App Performance Optimization In Action” blog series.)

Therefore, even after an output color has been calculated for a pixel in the fragment shader, the final pixel color may turn out completely different after the testing and blending of layers.

For blending, the GPU uses the following formula to compute the final, actual pixel color:

R = S + D * (1 - Sa)

  • R: Result, the final pixel color.
  • S: Source, the source pixel (the layer pixel above).
  • D: Destination, the destination pixel (the layer pixel below).
  • Sa: Source alpha, the transparency of the source pixel.

That is: Result = S (top) color + D (bottom) color * (1 - S (top) transparency).
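Applied to a single color channel, the formula is one line of code (a minimal sketch; as the formula implies, the source color is assumed to be premultiplied by its alpha):

```python
def blend(source, dest, source_alpha):
    """Blend one channel of a source pixel over a destination pixel.

    Implements R = S + D * (1 - Sa) from the formula above, where the
    source channel value S is premultiplied by the source alpha Sa.
    """
    return source + dest * (1.0 - source_alpha)

# A fully opaque source (Sa = 1) completely covers the destination:
print(blend(0.8, 0.3, 1.0))            # → 0.8
# A half-transparent source lets half of the destination through:
print(round(blend(0.4, 0.6, 0.5), 3))  # → 0.7
```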

The complete flow of the GPU rendering pipeline is shown in the figure below:

Q: CPU vs. GPU?

Here is a slide shared by our leader (@Moon Shadow):

Since every pixel of the screen must be refreshed every frame, the GPU's parallel way of working is the more efficient one.


Having briefly covered the GPU rendering pipeline, let's talk about the rendering process and principles of Apps. iOS App rendering mainly falls into the following three types:

  • Native rendering
  • Big front-end rendering (WebView, React-Native-like frameworks)
  • Flutter rendering

2. Native rendering

Speaking of native rendering, the first things that come to mind are the iOS rendering frameworks we are most familiar with: UIKit, SwiftUI, Core Animation, Core Graphics, Core Image, OpenGL ES, and Metal.

  • UIKit: the most commonly used UI framework in daily development; we draw the interface by setting the layout and related properties of UIKit components. In fact, UIView itself does not have the ability to display on screen; it is the view's CALayer property that has the display capability. (UIView inherits from UIResponder and is mainly responsible for responding to the user's operations; iOS event delivery is implemented by traversing the view tree.)
  • SwiftUI: a new “declarative UI” framework launched by Apple at WWDC 2019, written in Swift. With one set of code you can develop for and adapt to iOS, iPadOS, macOS, and watchOS. (I wrote a simple SwiftUI demo last year for reference: Write a simple page with SwiftUI.)
  • Core Animation: core animation, a compositing engine. It combines the different visual contents on screen as quickly as possible; the contents are split into separate layers (CALayer) stored in the layer tree.
  • Core Graphics: an advanced drawing engine based on Quartz, mainly used to draw images at run time.
  • Core Image: image rendering performed ahead of run time; it processes existing images efficiently.
  • OpenGL ES: OpenGL for Embedded Systems, GLES for short, a subset of OpenGL. It is implemented by the GPU vendors and lets you control the GPU programmatically through C/C++.
  • Metal: implemented by Apple; Metal 2 was launched at WWDC 2017, and its rendering performance is higher than OpenGL ES. It was created because OpenGL ES could not take full advantage of Apple's chips.

So, what does the iOS native rendering process consist of? It is mainly divided into the following four steps:

  • Step 1: Update the view tree and layer tree. (Corresponding to View hierarchy and Layer hierarchy on View respectively)

  • Step 2: The CPU calculates the contents to be displayed in the next frame (view creation, layout calculation, view drawing, image decoding). When the runloop enters the kCFRunLoopBeforeWaiting or kCFRunLoopExit state, it notifies the registered observers, packages the layers, and sends the packaged data to the Render Server, a separate process responsible for rendering. All of this is collectively called the Commit Transaction.

  • Step 3: After the data arrives at the Render Server, it is deserialized to obtain the layer tree. Based on the layers' order, their RGBA values, and their frames, the layer tree is filtered and converted into the render tree. The render tree's information is then passed on to OpenGL ES / Metal.

  • Step 4: The Render Server invokes the GPU. The GPU then performs the work described earlier: vertex shader, shape assembly, geometry shader, rasterization, fragment shader, and tests and blending. After completing these six stages, the data computed by the CPU and GPU is displayed on every pixel of the screen.

So, for the overall flow of iOS native rendering, I also drew a graph:


3. Big front-end rendering

1. WebView:

For WebView rendering, most of the work is done inside WebKit. WebKit itself is based on the Layer Rendering architecture of macOS, and iOS is likewise based on this architecture. So, in terms of how rendering is implemented, its performance should not differ much from native.

But why, then, is WebView rendering noticeably slower than native rendering?

  • First, the initial load. There are extra network requests and script-parsing steps. Even when loading a local web page, the WebView has more scripts to parse than native does: it must additionally parse the HTML + CSS + JavaScript code.

  • Second, the performance of interpreted execution. Parsing and executing JavaScript is slower than running native code. Especially for complex logic and heavy computation, the WebView's interpreted execution is much slower than native.

  • Third, the WebView's rendering process is independent. Every frame update invokes the GPU process through IPC, causing frequent inter-process communication and the corresponding performance cost. Moreover, the two processes cannot share texture resources, so the GPU cannot rasterize the context directly and must wait until the WebView passes the context to it via IPC. This hurts the GPU's own performance as well.

Therefore, WebView rendering is less efficient than native rendering.

2. React-Native-like frameworks (solutions that use the JavaScriptCore engine as a virtual machine)

For example React Native, Weex, mini programs, and so on.

React Native calls the iOS native rendering directly, but adds a JSON + JavaScript script-parsing step. It uses the JavaScriptCore engine to associate “JS” with the “native controls”, and thereby controls iOS native controls from JS. (In a nutshell, the JSON is a mapping table from the scripting language to the native language, in which the KEY is a symbol the scripting language recognizes and the VALUE is a symbol the native language recognizes.)
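The idea of such a mapping table can be sketched as a plain dictionary. Note that the component names and the fallback behavior below are invented for illustration only; they are not React Native's actual mapping format:

```python
# Toy "script symbol -> native control" mapping table, in the spirit of
# the JSON table described above. All names here are hypothetical.
component_map = {
    "RCTView": "UIView",
    "RCTText": "UILabel",
    "RCTImageView": "UIImageView",
}

def resolve_native_control(script_symbol):
    """Look up which native control a script-side symbol maps to.

    Unknown symbols fall back to a plain view (an invented convention
    for this sketch, not real React Native behavior).
    """
    return component_map.get(script_symbol, "UIView")

print(resolve_native_control("RCTText"))     # → UILabel
print(resolve_native_control("RCTUnknown"))  # → UIView (fallback)
```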

JavaScriptCore: JavaScriptCore is the bridge between iOS native code and JS. It was originally the engine that interprets and executes JavaScript code inside WebKit. Currently, Apple has the JavaScriptCore engine and Google has the V8 engine.

However, RN, like WebView, also faces the problem of JS interpretation performance.

Therefore, in terms of rendering efficiency: WebView < React-Native-like < native. (This is because JSON is less complex than HTML + CSS.)


4. Flutter rendering

First, there is a YouTube video titled “Flutter’s Rendering Pipeline” about Flutter Rendering.

1. Structure of Flutter:

As you can see, Flutter rewrites the UI framework, reimplementing everything from the UI controls down to the rendering itself. It does not rely on the native controls of the iOS or Android platforms; it relies on the Skia graphics library in the Engine (C++) layer and on the system's graphics drawing interfaces. That is why you get the same experience on different platforms.

2. Flutter rendering process:

Simply put, a Flutter interface is made up of widgets. All the widgets together form the Widget Tree. When the interface is updated, the Widget Tree is updated first, then the Element Tree, and finally the RenderObject Tree.

The logic for updating a widget's child is as follows:

                  | newWidget == null                   | newWidget != null
  child == null   | returns null                        | returns a new Element
  child != null   | removes the old child, returns null | returns the old child if it can be updated; otherwise returns a new Element
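The decision table above can be sketched as a small function. This is a simplified model of what Flutter's Element.updateChild does (the real logic compares runtimeType and key on Widget objects; here widgets are reduced to plain "Type:value" strings, purely for illustration):

```python
def can_update(old_widget, new_widget):
    # Flutter compares runtimeType and key; here we compare a "Type:" tag.
    return old_widget.split(":")[0] == new_widget.split(":")[0]

def update_child(child, new_widget):
    """Simplified model of the widget-update table above.

    `child` is the existing element (or None); `new_widget` is the new
    configuration (or None). Returns the element to keep, or None.
    """
    if new_widget is None:
        return None                       # row "newWidget == null": drop the child
    if child is None:
        return {"widget": new_widget}     # no existing child: inflate a new element
    if can_update(child["widget"], new_widget):
        child["widget"] = new_widget      # same type: update the old child in place
        return child
    return {"widget": new_widget}         # type changed: discard old, create new

child = update_child(None, "Text:hello")   # new element created
child = update_child(child, "Text:world")  # same type → updated in place
print(child)  # → {'widget': 'Text:world'}
```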

Then comes the rendering process. A Flutter render pass includes Build, the Widget Tree, the Element Tree, the RenderObject Tree, Layout, Paint, the Composited Layer, and so on. In Flutter's C++ layer, the Skia library is used to combine the layers and generate textures, and the OpenGL interface is used to submit the rendering content to the GPU for rasterization and compositing. Once submitted to the GPU process, the compositing and on-screen display process is basically similar to iOS native rendering, so the performance is similar as well.

For more details, see: How Does Flutter Actually render?

5. Summary and comparison

Rendering          | Language               | Performance | Target developers
native             | Objective-C / Swift    | high        | iOS developers
WebView            | HTML, CSS, JavaScript  | low         | front-end developers
React-Native-like  | JavaScript             | medium      | front-end developers
Flutter            | Dart                   | high        | Dart developers

But the advantages of Flutter are:

  1. Cross-platform: it runs on both iOS and Android.
  2. Hot Reload: it saves the time of recompiling code, greatly improving development efficiency.
  3. The future release of, and support for, Google's new system “Fuchsia”. If Fuchsia can reach mobile in the future and gradually replace Android, then, since Fuchsia's upper layer is written with Flutter, Flutter development will become a must in the mobile space. And since Flutter supports cross-platform development, the technology stacks of other areas would become less and less valuable.

Apple's hopes, of course, lie with SwiftUI. If Fuchsia ultimately fails, SwiftUI may also come to support cross-platform development. SwiftUI itself also supports hot reloading. Perhaps it, too, has such a future.

I look forward to Apple's online WWDC 2020 this June, and hope it brings us a different kind of surprise.

References and acknowledgements:

  1. iOS Development Master Class (Dai Ming)
  2. The GPU You Don't Know (Moon Shadow)
  3. Flutter: From Load to Display (Shengwen)
  4. UIKit Performance Optimization Practice (BestSwifter)
  5. WWDC14: Advanced Graphics and Animations for iOS Apps