OpenGL ES series

  • GLKit que
  • Introduction and rendering process
  • GLKit app loads images
  • Filter 1 – Split-screen filter
  • Filter 2 – Grayscale, color inversion, vortex, mosaic
  • Filter 3 – Zoom, soul out-of-body, flash white, glitch

I. A brief introduction

OpenGL ES (OpenGL for Embedded Systems) is an advanced 3D graphics application programming interface (API) for handheld and embedded systems.

OpenGL ES is currently the dominant graphics API on smartphones. Supported platforms include iOS, Android, BlackBerry, Bada, Linux, and Windows.

OpenGL (Open Graphics Library) is used for visualizing 2D and 3D data. It is a versatile, open-standard graphics library that supports 2D and 3D digital content creation, mechanical and architectural design, virtual prototyping, flight simulation, video games, and more. You use OpenGL to configure and submit data to a 3D graphics pipeline: vertices are transformed and lit, grouped into primitives, and rasterized to create a 2D image. OpenGL converts function calls into graphics commands that are sent to the underlying graphics hardware. Because this hardware is dedicated to processing graphics commands, OpenGL drawing is typically very fast. OpenGL for Embedded Systems (OpenGL ES) is a simplified version of OpenGL that eliminates redundant functionality, providing a library that is both easy to learn and easy to implement in mobile graphics hardware.

II. Rendering process

1. Vertex shaders

Input to vertex shaders:

  • Shader program – the vertex shader source code/executable that describes the operations performed on each vertex.

  • Vertex shader inputs (attributes) – per-vertex data supplied in vertex arrays.

  • Uniforms – constant data used by the vertex/fragment shader.

  • Samplers – a special uniform type that represents the textures used by the vertex shader.

Vertex shader responsibilities:

  • Matrix transformation of positions

  • Computing per-vertex colors from the lighting formula

  • Generating/transforming texture coordinates

Code sample:

```glsl
// attribute -- vertex position, 4-component vector
attribute vec4 position;
// attribute -- texture coordinate, 2-component vector
attribute vec2 textCoordinate;
// uniform -- rotation matrix
uniform mat4 rotateMatrix;
// varying -- passes the texture coordinate on to the fragment shader
varying lowp vec2 varyTextCoord;

void main() {
    // Pass the texture coordinate through
    varyTextCoord = textCoordinate;

    vec4 vPos = position;
    // vPos = vPos * rotateMatrix; // apply the vertex transform

    // Output the vertex position to the next pipeline stage
    gl_Position = vPos;
}
```

Summary: the vertex shader performs custom per-vertex operations, implementing new transforms, lighting, or vertex-based effects that the traditional fixed-function pipeline cannot.

2. Primitive assembly

Primitives: points, lines, triangles, etc.

Primitive assembly: vertex data is assembled into individual primitives; clipping, perspective division, and the viewport transformation are performed at this stage.

  • The primitive type and vertex indices determine the individual primitives to be rendered.
  • For each primitive and its corresponding vertices, the primitive assembly stage clips the vertex shader outputs, performs perspective division, applies the viewport transformation, and then passes the primitive on to rasterization.

3. Rasterization

Rasterization draws the corresponding primitives (points, lines, triangles) and converts each primitive into a set of two-dimensional fragments; that is, it turns the primitives computed from the vertex data into candidate pixels, which are then processed by the fragment shader.

4. Fragment shader

Fragment shaders are also known as pixel shaders.

Input to the fragment shader:

  • Shader program – the fragment shader source code/executable that describes the operations performed on each fragment.

  • Input variables (varyings) – the rasterizer's per-fragment outputs, interpolated from the vertex shader outputs.

  • Uniforms – constant data used by the vertex or fragment shader.

  • Samplers – a special uniform type that represents the textures used by the fragment shader.

Fragment shader responsibilities:

  • Computing colors

  • Fetching texture values

  • Filling pixels with color values (texture values/color values)

Code example:

```glsl
// varying -- texture coordinate interpolated from the vertex shader
varying lowp vec2 varyTextCoord;
// uniform -- texture sampler
uniform sampler2D colorMap;

void main() {
    // gl_FragColor = texture2D(<texture sampler>, <texture coordinate>);
    gl_FragColor = texture2D(colorMap, varyTextCoord);
}
```

Summary: It can be used to fill every pixel in an image/video/graphic with color (for example, adding a filter to a video actually changes the color fill of every pixel in the video).

5. Per-fragment operations

  • Pixel ownership test: determines whether the pixel at position (Xw, Yw) in the framebuffer is currently owned by OpenGL ES. For example, if a view displaying the OpenGL ES framebuffer is obscured by another view, the windowing system may decide that the obscured pixels do not belong to the OpenGL ES context, so those pixels are never displayed. The pixel ownership test is performed internally by OpenGL ES and is not controlled by the developer.
  • Scissor test: determines whether (Xw, Yw) lies within the scissor rectangle that is part of the OpenGL ES state. If the fragment is outside the scissor region, it is discarded.
  • Depth test: compares the incoming fragment's depth value with the depth value stored in the framebuffer to determine whether the fragment is discarded.
  • Blending: combines the newly generated fragment color with the color value stored at that location in the framebuffer.
  • Dithering: can be used to minimize the artifacts caused by storing color values in the framebuffer with limited precision.

III. EGL (Embedded Graphics Library)

  • OpenGL ES commands require a rendering context and a drawing surface to draw graphic images.
  • Rendering context: stores the relevant OpenGL ES state.
  • Drawing surface: the surface on which primitives are drawn; it specifies the types of buffers required for rendering, such as the color buffer, depth buffer, and stencil buffer.
  • The OpenGL ES API does not specify how to create a rendering context or how the context is connected to the native windowing system. EGL is the interface between Khronos rendering APIs (such as OpenGL ES) and the native window system. The only platform that supports OpenGL ES but not EGL is iOS: Apple provides its own implementation of the EGL API, called EAGL.
  • Because each windowing system is defined differently, EGL provides the basic opaque type EGLDisplay, which encapsulates all system dependencies and is used to interface with the native windowing system.

Because OpenGL ES is a C-based API, it is very portable and widely supported, and as a C API it integrates seamlessly with Objective-C Cocoa Touch applications. The OpenGL ES specification does not define a windowing layer; instead, the host operating system must provide functions to create an OpenGL ES rendering context, which receives commands, and a framebuffer, to which the results of drawing commands are written. Using OpenGL ES on iOS requires iOS classes to set up and present the drawing surface, and platform-neutral APIs to render its contents.