Rendering pipeline
The OpenGL graphics pipeline for triangle rendering works as follows: the host program fills memory buffers managed by OpenGL with an array of vertices; these vertices are projected into screen space, assembled into triangles, and rasterized into pixel-sized fragments. Finally, each fragment is assigned a color value and written to the frame buffer. Modern GPUs gain their flexibility by delegating the "project vertices into screen space" and "assign color values" stages to programs called shaders.
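As a hedged illustration of the "host program fills memory buffers" step, the C sketch below creates a vertex buffer object and uploads a small position array with the standard glGenBuffers / glBindBuffer / glBufferData calls. Context creation, the function loader (glad is assumed here), and error handling are omitted, and the triangle data is made up for the example.

```c
#include <glad/glad.h>   /* assumed OpenGL function loader; GLEW or similar also works */

/* Three vertices of a triangle: x, y, z positions in object space (example data). */
static const GLfloat triangle_vertices[] = {
    -0.5f, -0.5f, 0.0f,
     0.5f, -0.5f, 0.0f,
     0.0f,  0.5f, 0.0f,
};

/* Create a buffer managed by OpenGL and copy the vertex array into it. */
GLuint create_vertex_buffer(void)
{
    GLuint vbo;
    glGenBuffers(1, &vbo);               /* allocate a buffer name        */
    glBindBuffer(GL_ARRAY_BUFFER, vbo);  /* make it the active vertex buffer */
    glBufferData(GL_ARRAY_BUFFER,        /* upload the vertex data        */
                 sizeof(triangle_vertices),
                 triangle_vertices,
                 GL_STATIC_DRAW);
    return vbo;
}
```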
1. Vertex data
The vertex data (mainly vertex attributes: position, normal, color, and texture coordinates) is the input to the rendering pipeline; it is supplied as a vertex array.
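To show how such attributes are described to OpenGL, the sketch below declares a hypothetical interleaved vertex struct (position, normal, color, texture coordinates) and registers it with glVertexAttribPointer. The attribute locations 0 through 3 are assumptions and must match whatever vertex shader is used.

```c
#include <stddef.h>      /* offsetof */
#include <glad/glad.h>   /* assumed OpenGL function loader */

/* A hypothetical interleaved vertex layout matching the attributes named above. */
typedef struct {
    GLfloat position[3];   /* x, y, z    */
    GLfloat normal[3];     /* nx, ny, nz */
    GLfloat color[4];      /* r, g, b, a */
    GLfloat texcoord[2];   /* u, v       */
} Vertex;

/* Tell OpenGL how to interpret the currently bound vertex buffer.
 * The attribute locations (0..3) are assumptions and must match the shader. */
void describe_vertex_layout(void)
{
    glEnableVertexAttribArray(0);
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, sizeof(Vertex),
                          (const void *)offsetof(Vertex, position));
    glEnableVertexAttribArray(1);
    glVertexAttribPointer(1, 3, GL_FLOAT, GL_FALSE, sizeof(Vertex),
                          (const void *)offsetof(Vertex, normal));
    glEnableVertexAttribArray(2);
    glVertexAttribPointer(2, 4, GL_FLOAT, GL_FALSE, sizeof(Vertex),
                          (const void *)offsetof(Vertex, color));
    glEnableVertexAttribArray(3);
    glVertexAttribPointer(3, 2, GL_FLOAT, GL_FALSE, sizeof(Vertex),
                          (const void *)offsetof(Vertex, texcoord));
}
```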
2. Vertex shader
The GPU reads each selected vertex from the vertex array and runs it through the vertex shader, a program that takes a set of vertex attributes as input and outputs a new set of attributes. At a minimum, the vertex shader computes the projected position of the vertex in screen space. It can also produce other varying outputs, such as color or texture coordinates.
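A minimal GLSL vertex shader for this stage is sketched below as a C string suitable for glShaderSource. The uniform u_mvp and the input/output names are illustrative assumptions; only gl_Position is required by the pipeline.

```c
/* A minimal GLSL vertex shader, stored as a C string for glShaderSource().
 * It projects each vertex with an assumed model-view-projection matrix
 * (u_mvp) and passes color and texture coordinates on to later stages. */
static const char *vertex_shader_src =
    "#version 330 core\n"
    "layout(location = 0) in vec3 a_position;\n"
    "layout(location = 2) in vec4 a_color;\n"
    "layout(location = 3) in vec2 a_texcoord;\n"
    "uniform mat4 u_mvp;   /* assumed projection * view * model matrix */\n"
    "out vec4 v_color;\n"
    "out vec2 v_texcoord;\n"
    "void main() {\n"
    "    gl_Position = u_mvp * vec4(a_position, 1.0); /* required output */\n"
    "    v_color     = a_color;      /* extra varying outputs */\n"
    "    v_texcoord  = a_texcoord;\n"
    "}\n";
```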
3. Shape assembly
In most cases, the assembled primitives are triangles. In this stage, the GPU connects the vertices projected into screen space to form triangles, taking them in the order specified by the element array and grouping them into sets of three. Vertices can be grouped in several different ways (a draw-call sketch follows the list):
- Triangle list: treat each group of three elements as a separate triangle
- Triangle strip: reuse the last two vertices of each triangle as the first two vertices of the next one
- Triangle fan: join the first element to each of the following pairs of elements
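In OpenGL, the grouping is selected by the primitive type passed to the draw call. The hedged sketch below shows the three modes with glDrawElements; it assumes a vertex array object and an element (index) buffer are already bound, and index_count is a hypothetical parameter.

```c
#include <glad/glad.h>   /* assumed OpenGL function loader */

/* The grouping mode is chosen by the primitive type passed to the draw call.
 * Assumes a vertex array object and an element (index) buffer are bound;
 * `index_count` is a hypothetical parameter for illustration. */
void draw_primitives(GLsizei index_count)
{
    /* Separate triangles: indices are consumed three at a time. */
    glDrawElements(GL_TRIANGLES, index_count, GL_UNSIGNED_INT, 0);

    /* Triangle strip: each new index forms a triangle with the previous two. */
    /* glDrawElements(GL_TRIANGLE_STRIP, index_count, GL_UNSIGNED_INT, 0); */

    /* Triangle fan: each consecutive pair of indices is joined to the first. */
    /* glDrawElements(GL_TRIANGLE_FAN, index_count, GL_UNSIGNED_INT, 0); */
}
```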
4. The rasterizer
The rasterizer generates the fragments consumed by the fragment shader by mapping primitives to their corresponding pixels on the final screen. Clipping is performed before the fragment shader runs: it discards everything that lies outside the view, which improves execution efficiency.
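Frustum clipping itself is fixed-function and needs no application code, but the mapping from clip space to screen pixels that the rasterizer uses is controlled by the viewport. The short sketch below sets it to a hypothetical window size.

```c
#include <glad/glad.h>   /* assumed OpenGL function loader */

/* The rasterizer maps clipped primitives into the pixel rectangle defined by
 * the viewport; the window dimensions here are example values. */
void set_render_target_size(int window_width, int window_height)
{
    glViewport(0, 0, window_width, window_height);
}
```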
5. Fragment shader
The main purpose of the fragment shader is to compute the final color of each pixel; this is where most of OpenGL's advanced effects are produced. Fragment shaders typically receive data about the 3D scene (such as lighting, shadows, and light colors) and use it to compute the color of the final pixel, which is then written to the frame buffer. Common fragment shader operations include texture mapping and lighting. Because the fragment shader runs independently for every pixel drawn, it can implement even the most complex special effects.
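A minimal GLSL fragment shader matching the vertex shader sketched earlier is shown below, again as a C string. It combines a texture lookup with a simple diffuse term; the uniform names, the placeholder surface normal, and the lighting model are illustrative assumptions rather than a complete lighting implementation.

```c
/* A minimal GLSL fragment shader, stored as a C string for glShaderSource().
 * It samples a texture, modulates it by the interpolated vertex color, and
 * applies a simple diffuse factor. The uniforms (u_texture, u_light_dir) and
 * the placeholder normal are assumptions for illustration only. */
static const char *fragment_shader_src =
    "#version 330 core\n"
    "in vec4 v_color;\n"
    "in vec2 v_texcoord;\n"
    "uniform sampler2D u_texture;   /* assumed bound texture unit */\n"
    "uniform vec3 u_light_dir;      /* assumed normalized light direction */\n"
    "out vec4 frag_color;\n"
    "void main() {\n"
    "    vec4 base = texture(u_texture, v_texcoord) * v_color;\n"
    "    /* Placeholder normal; a real shader would receive an interpolated\n"
    "       normal from the vertex shader. */\n"
    "    vec3 normal = vec3(0.0, 0.0, 1.0);\n"
    "    float diffuse = max(dot(normal, u_light_dir), 0.0);\n"
    "    frag_color = vec4(base.rgb * diffuse, base.a);\n"
    "}\n";
```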