First, what is OpenGL ES?

OpenGL ES (OpenGL for Embedded Systems) is a subset of the OpenGL 3D graphics API, designed for embedded devices such as mobile phones, PDAs, and game consoles. The official Apple documentation for OpenGL ES describes it as follows:

The Open Graphics Library (OpenGL) is used for visualizing 2D and 3D data. It is a multipurpose open-standard graphics library that supports applications for 2D and 3D digital content creation, mechanical and architectural design, virtual prototyping, flight simulation, video games, and more. You use OpenGL to configure a 3D graphics pipeline and submit data to it. Vertices are transformed and lit, assembled into primitives, and rasterized to create a 2D image. OpenGL is designed to translate function calls into graphics commands that can be sent to underlying graphics hardware. Because this underlying hardware is dedicated to processing graphics commands, OpenGL drawing is typically very fast. OpenGL for Embedded Systems (OpenGL ES) is a simplified version of OpenGL that eliminates redundant functionality to provide a library that is both easier to learn and easier to implement in mobile graphics hardware.


Second, the OpenGL ES pipeline

1. Vertex shader

In OpenGL ES, a vertex shader usually involves the following inputs:

  • 1. Shader program – the vertex shader source code/executable that describes the operations performed on each vertex
  • 2. Vertex shader inputs (attributes) – per-vertex data supplied through vertex arrays
  • 3. Uniform variables – constant data used by the vertex/fragment shader
  • 4. Samplers – a special type of uniform variable that represents the textures used by the vertex shader

What a vertex shader typically does:

  • 1. Transform positions with matrices
  • 2. Compute the lighting formula to generate per-vertex colors
  • 3. Generate or transform texture coordinates

Summary: it can be used to perform custom calculations and to implement new transformations, lighting, or vertex-based effects that the traditional fixed-function pipeline does not allow.

Vertex shader code:

attribute vec4 position;          // per-vertex position, supplied by the app
attribute vec2 textCoordinate;    // per-vertex texture coordinate
uniform mat4 rotateMatrix;        // rotation matrix, constant for the whole draw call
varying lowp vec2 varyTextCoord;  // passed on to the fragment shader after interpolation

void main()
{
    // Hand the texture coordinate through to the rasterizer/fragment shader
    varyTextCoord = textCoordinate;

    // Apply the rotation to the vertex position
    vec4 vPos = position;
    vPos = vPos * rotateMatrix;
    gl_Position = vPos;
}

2. Primitive assembly

After the vertex shader, the next stage is primitive assembly. Primitives are points, lines, triangles, and so on; primitive assembly groups the transformed vertex data into these primitives. The primitive type and the vertex indices determine the individual primitives to be rendered. For each individual primitive and its corresponding vertices, this stage clips the vertex shader outputs against the view volume, performs perspective division and the viewport transformation, and then passes the result on to the rasterization stage.
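The viewport transformation is the one part of this stage an application typically configures itself. A minimal sketch (viewWidth/viewHeight are placeholder values, not from the original article):

// Map normalized device coordinates to window coordinates;
// (0, 0) is the lower-left corner of the viewport, width/height are in pixels.
glViewport(0, 0, (GLsizei)viewWidth, (GLsizei)viewHeight);

// Optionally remap the depth range used by the viewport transform (the default is [0, 1]).
glDepthRangef(0.0f, 1.0f);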

3. Rasterization

At this stage the corresponding primitives (points/lines/triangles) are drawn. Rasterization is the process of converting primitives into a set of two-dimensional fragments, which are then processed by the fragment shader. These two-dimensional fragments represent the pixels that can be drawn on the screen.

4. Fragment shader

In OpenGL ES, a fragment shader usually involves the following inputs:

  • 1. Shader program – the fragment shader source code/executable that describes the operations performed on each fragment
  • 2. Input variables (varyings) – the rasterizer's outputs, produced by interpolating the vertex shader outputs for each fragment
  • 3. Uniform variables – constant data used by the vertex/fragment shader
  • 4. Samplers – a special type of uniform variable that represents the textures used by the fragment shader

What a fragment shader typically does:

  • 1. Calculate colors
  • 2. Sample texture values
  • 3. Fill each pixel with a color value (texture value/color value)

Summary: it can be used to fill every pixel of an image/video/graphic with a color (for example, adding a filter to a video really means changing the color that each pixel of the video is filled with).

Fragment shader code:

varying lowp vec2 varyTextCoord;  // interpolated texture coordinate from the vertex shader
uniform sampler2D colorMap;       // the texture to sample

void main()
{
    // Sample the texture and write the fragment's color
    gl_FragColor = texture2D(colorMap, varyTextCoord);
}

5. Per-fragment operations:

  • Pixel ownership test: determines whether the pixel at position (Xw, Yw) in the framebuffer currently belongs to the OpenGL ES context. For example, if a view displaying an OpenGL ES framebuffer is partially covered by another view, the window system can decide that the covered pixels do not belong to the OpenGL ES context, so not all of those pixels are displayed. The pixel ownership test is handled internally by OpenGL ES and the window system; it is not controlled by the developer.
  • Scissor test: determines whether (Xw, Yw) lies inside the scissor rectangle that is part of the OpenGL ES state. If the fragment is outside the scissor region, it is discarded.
  • Depth test: compares the incoming fragment's depth value with the depth value stored in the depth buffer to decide whether the fragment is rejected.
  • Blending: combines the newly generated fragment color with the color value stored at that location in the framebuffer.
  • Dithering: can be used to minimize artifacts caused by storing color values in the framebuffer with limited precision.
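For reference, here is a minimal sketch (not from the original article) of how an application typically switches these stages on and sets their parameters; the pixel ownership test, as noted above, is not something the application configures:

// Scissor test: only fragments inside this rectangle survive
glEnable(GL_SCISSOR_TEST);
glScissor(0, 0, 200, 200);

// Depth test: keep a fragment only if its depth comparison passes
glEnable(GL_DEPTH_TEST);
glDepthFunc(GL_LESS);

// Blending: combine the new fragment color with the color already in the framebuffer
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

// Dithering is enabled by default and can be turned off when it is not wanted
glDisable(GL_DITHER);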

Third, EGL (Embedded Graphics Library)

OpenGL ES can be used on iOS, Android, Linux and other mobile terminals, but the interconnection between OpenGL ES and each platform needs to be provided and maintained by each platform. OpenGL ES is only responsible for low-level rendering.

  • OpenGL ES commands require a rendering context and a drawing surface to draw graphic images.
  • Rendering context: stores the relevant OpenGL ES state.
  • Drawing surface: the surface used to draw primitives; it specifies the types of buffers required for rendering, such as the color buffer, depth buffer, and stencil buffer.
  • The OpenGL ES API itself does not define how to create a rendering context or how that context is connected to the native windowing system. EGL is the Khronos interface between rendering APIs such as OpenGL ES and the native windowing system. The one platform that supports OpenGL ES but not EGL is iOS: Apple provides its own implementation of the EGL API, called EAGL.
  • Because every window system has a different definition, EGL provides a basic opaque type, EGLDisplay, which encapsulates all system dependencies for interfacing with the native window system.

Because OpenGL ES is a C-based API, it is very portable and widely supported. As a C API, it integrates seamlessly with Objective-C Cocoa Touch applications. The OpenGL ES specification does not define a window layer; Instead, the managed operating system must provide functions to create an OpenGL ES rendering context that receives commands and a frame buffer to write the results of any drawing commands. Using OpenGL ES on iOS requires the use of iOS classes to set up and render drawing surfaces, and the use of platform-neutral apis to render their contents.
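To make the EAGL point concrete, here is a minimal, simplified sketch (hand-written setup without GLKit, trimmed compared with what a production app needs) of an iOS view providing its own rendering context and drawing surface through EAGLContext and CAEAGLLayer. someEAGLView is a hypothetical UIView subclass whose +layerClass returns [CAEAGLLayer class]:

#import <OpenGLES/ES3/gl.h>
#import <QuartzCore/QuartzCore.h>

// Create the rendering context and make it the current context
EAGLContext *ctx = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES3];
[EAGLContext setCurrentContext:ctx];

// Create a framebuffer with a color renderbuffer whose storage comes from the view's CAEAGLLayer
GLuint framebuffer, colorRenderbuffer;
glGenFramebuffers(1, &framebuffer);
glBindFramebuffer(GL_FRAMEBUFFER, framebuffer);

glGenRenderbuffers(1, &colorRenderbuffer);
glBindRenderbuffer(GL_RENDERBUFFER, colorRenderbuffer);
[ctx renderbufferStorage:GL_RENDERBUFFER fromDrawable:(CAEAGLLayer *)someEAGLView.layer];
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, colorRenderbuffer);

// ... issue OpenGL ES drawing commands here ...

// Present the renderbuffer's contents on the screen
[ctx presentRenderbuffer:GL_RENDERBUFFER];

GLKit, covered next, wraps exactly this kind of setup for you.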

Fourth, what is GLKit?

The GLKit framework is designed to simplify application development based on OpenGL/OpenGL ES; its appearance sped up OpenGL ES and OpenGL application development. It provides a math library, background texture loading, pre-created shader effects, and standard views and view controllers that implement the render loop. The GLKit framework provides functions and classes that reduce the work needed to create a new shader-based application, or to support existing applications that rely on the fixed-function vertex or fragment processing provided by earlier versions of OpenGL ES.

Two important classes:

  • GLKView: provides a place to draw (a view)
  • GLKViewController (an extension of the standard UIKit design pattern): used to manage and render the contents of the drawing view

Although Apple has moved from OpenGL ES to Metal, iOS developers can continue to use GLKit for now.

GLKit features

  • Loading textures
  • Provides high performance mathematics
  • Provides common shaders
  • Provides views and view controllers.

GLKit texture loading

1) GLKTextureInfo – information about a newly created OpenGL texture
@interface GLKTextureInfo : NSObject <NSCopying>
{
@private
    GLuint                      name;
    GLenum                      target;
    GLuint                      width;
    GLuint                      height;
    GLuint                      depth;
    GLKTextureInfoAlphaState    alphaState;
    GLKTextureInfoOrigin        textureOrigin;
    BOOL                        containsMipmaps;
    GLuint                      mimapLevelCount;
    GLuint                      arrayLength;
}

@property (readonly) GLuint                     name;
@property (readonly) GLenum                     target;
@property (readonly) GLuint                     width;
@property (readonly) GLuint                     height;
@property (readonly) GLuint                     depth;
@property (readonly) GLKTextureInfoAlphaState   alphaState;
@property (readonly) GLKTextureInfoOrigin       textureOrigin;
@property (readonly) BOOL                       containsMipmaps;
@property (readonly) GLuint                     mimapLevelCount;
@property (readonly) GLuint                     arrayLength;

@end

The GLKTextureInfo object has a number of properties. Common ones are as follows:

  • name: the texture name in the OpenGL context
  • target: the target the texture is bound to
  • height: height of the loaded texture
  • width: width of the loaded texture
  • textureOrigin: origin position of the loaded texture
  • alphaState: alpha-component state of the loaded texture
  • containsMipmaps: Boolean value, whether the loaded texture contains mipmaps

2) GLKTextureLoader – simplifies loading textures from various resource files

GLKTextureLoader is used to load textures and is initialized as follows:

-initWithSharegroup: initializes a new texture loader object, associating it with an EAGL sharegroup
-initWithShareContext: initializes a new texture loader object, associating it with a rendering context

The common loading methods are as follows (a short usage sketch follows this list):

  • Load a texture from a file

+ textureWithContentsOfFile:options:error: – loads a 2D texture image from a file and creates a new texture from the data
– textureWithContentsOfFile:options:queue:completionHandler: – asynchronously loads a 2D texture image from a file and creates a new texture from the data

  • Load a texture from a URL

+ textureWithContentsOfURL:options:error: – loads a 2D texture image from a URL and creates a new texture from the data
– textureWithContentsOfURL:options:queue:completionHandler: – asynchronously loads a 2D texture image from a URL and creates a new texture from the data

  • Create a texture from an in-memory representation

+ textureWithContentsOfData:options:error: – loads a 2D texture image from memory and creates a new texture from the data
– textureWithContentsOfData:options:queue:completionHandler: – asynchronously loads a 2D texture image from memory and creates a new texture from the data

  • Create a texture from a CGImage

+ textureWithCGImage:options:error: – loads a 2D texture image from a Quartz image and creates a new texture from the data
– textureWithCGImage:options:queue:completionHandler: – asynchronously loads a 2D texture image from a Quartz image and creates a new texture from the data

  • Create a cube-map texture from a URL

+ cubeMapWithContentsOfURL:options:error: – loads a cube-map texture image from a single URL and creates a new texture from the data
– cubeMapWithContentsOfURL:options:queue:completionHandler: – asynchronously loads a cube-map texture image from a single URL and creates a new texture from the data

  • Create a cube-map texture from files

+ cubeMapWithContentsOfFile:options:error: – loads a cube-map texture image from a single file and creates a new texture from the data
– cubeMapWithContentsOfFile:options:queue:completionHandler: – asynchronously loads a cube-map texture image from a single file and creates a new texture from the data
+ cubeMapWithContentsOfFiles:options:error: – loads a cube-map texture image from a series of files and creates a new texture from the data
– cubeMapWithContentsOfFiles:options:queue:completionHandler: – asynchronously loads cube-map texture images from a series of files and creates a new texture from the data
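For illustration, here is a minimal sketch of the asynchronous variant; it assumes context is the EAGLContext created elsewhere (as in the cases below) and reuses the sample image name from the case study:

GLKTextureLoader *loader = [[GLKTextureLoader alloc] initWithSharegroup:context.sharegroup];
NSString *path = [[NSBundle mainBundle] pathForResource:@"Jessica" ofType:@"jpg"];
NSDictionary *options = @{GLKTextureLoaderOriginBottomLeft : @YES};

[loader textureWithContentsOfFile:path
                          options:options
                            queue:NULL   // NULL: the completion handler is called on the main queue
                completionHandler:^(GLKTextureInfo *textureInfo, NSError *error) {
    if (error) {
        NSLog(@"Texture load failed: %@", error);
        return;
    }
    // The GLKTextureInfo object describes the texture that was created
    NSLog(@"name=%u target=%u size=%ux%u",
          textureInfo.name, textureInfo.target, textureInfo.width, textureInfo.height);
}];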

2. GLKView – GLKit's view for rendering OpenGL ES content

  • GLKView provides a default implementation of an OpenGL ES view for drawing content
  • Initializing the view

– initWithFrame:context: initializes a new view
delegate – the view's delegate

  • Configuring the framebuffer objects

drawableColorFormat – the format of the color render buffer
drawableDepthFormat – the format of the depth render buffer
drawableStencilFormat – the format of the stencil render buffer
drawableMultisample – the format of the multisample buffer

  • Framebuffer properties

drawableWidth – the width of the underlying framebuffer object, in pixels
drawableHeight – the height of the underlying framebuffer object, in pixels

  • Drawing the view's contents

context – the OpenGL ES context used when drawing the view's contents
– bindDrawable binds the underlying framebuffer object as the target of OpenGL ES rendering commands
enableSetNeedsDisplay – a Boolean value that specifies whether the view responds to messages that invalidate its contents
– display immediately redraws the view's contents
snapshot – draws the view's contents and returns them as a new image object

  • Deleting the view's framebuffer object

– deleteDrawable deletes the drawable object associated with the view

  • GLKViewDelegate – callback methods for the GLKView object

– glkView:drawInRect: draws the view's contents (the delegate must implement this method)

  • GLKViewController – a view controller that manages the OpenGL ES render loop
  • Updating

– (void)glkViewControllerUpdate:

  • Configuring the frame rate

preferredFramesPerSecond – the rate at which the view controller calls the view and updates its contents
framesPerSecond – the actual rate at which the view controller calls the view and updates its contents

  • Configuring the GLKViewController delegate

delegate – the view controller's delegate

  • Controlling frame updates

paused – a Boolean value that indicates whether the render loop is paused
pauseOnWillResignActive – a Boolean value that indicates whether the view controller pauses the render loop when the app resigns the active state
resumeOnDidBecomeActive – a Boolean value that indicates whether the view controller resumes the render loop when the app becomes active

  • Getting information about the view update state

framesDisplayed – the number of frame updates sent by the view controller since it was created
timeSinceFirstResume – the amount of time elapsed since the view controller first resumed sending update events
timeSinceLastResume – the amount of time elapsed since the view controller last resumed sending update events
timeSinceLastUpdate – the amount of time elapsed since the view controller last called the delegate's glkViewControllerUpdate: method
timeSinceLastDraw – the amount of time elapsed since the view controller last called the view's display method

  • GLKViewControllerDelegate – render loop callback methods (a small sketch follows below)

– glkViewControllerUpdate: called before each frame is displayed, to handle update events
– glkViewController:willPause: called before the render loop is paused or resumed
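A minimal sketch of implementing these two callbacks (not from the original article; _rotation is a hypothetical member variable used by the drawing code):

- (void)glkViewControllerUpdate:(GLKViewController *)controller {
    // Advance animation state once per frame, before the view is drawn
    _rotation += controller.timeSinceLastUpdate * 0.5;
}

- (void)glkViewController:(GLKViewController *)controller willPause:(BOOL)pause {
    NSLog(@"Render loop %@", pause ? @"will pause" : @"will resume");
}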

  • GLKBaseEffect – a simple lighting/shading system for shader-based OpenGL rendering
  • Naming the effect

label – gives the effect a name

  • Configuring the model-view transformation

transform – the model-view, projection, and texture transformations bound to the effect and applied to vertex data

  • Configuring lighting

lightingType – the strategy used to calculate lighting for each fragment, a GLKLightingType value

  • GLKLightingType

GLKLightingTypePerVertex – lighting is calculated at each vertex of a triangle and then interpolated across the triangle
GLKLightingTypePerPixel – the inputs to the lighting calculation are interpolated across the triangle and lighting is calculated for each fragment

  • Configuring the lights

lightModelTwoSided – a Boolean value that indicates whether lighting is calculated for both sides of primitives
material – the material used to calculate lighting for rendered primitives
lightModelAmbientColor – the ambient color applied to all primitives rendered by the effect
light0 – the first light in the scene
light1 – the second light in the scene
light2 – the third light in the scene

  • Configuring textures

texture2d0 – the first texture property
texture2d1 – the second texture property
textureOrder – the order in which the textures are applied to rendered primitives

  • Configuring colors

colorMaterialEnabled – a Boolean value that indicates whether per-vertex color attributes are used when calculating the interaction between light and material
useConstantColor – a Boolean value that indicates whether the constant color is used when no per-vertex color data is supplied
constantColor – the constant color used when no per-vertex color data is supplied

  • Preparing to draw

prepareToDraw – prepares the effect for rendering (a configuration sketch follows below)
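Putting the properties above together, here is a minimal sketch of configuring a GLKBaseEffect before drawing (illustrative values, not from the original article; textureInfo and the vertex data are assumed to have been set up as in the texture-loading case below):

GLKBaseEffect *effect = [[GLKBaseEffect alloc] init];

// Model-view and projection transforms
effect.transform.projectionMatrix = GLKMatrix4MakePerspective(GLKMathDegreesToRadians(60.0), 16.0/9.0, 0.1, 100.0);
effect.transform.modelviewMatrix  = GLKMatrix4MakeTranslation(0.0, 0.0, -3.0);

// One light, computed per fragment
effect.lightingType = GLKLightingTypePerPixel;
effect.light0.enabled = GL_TRUE;
effect.light0.diffuseColor = GLKVector4Make(1.0, 1.0, 1.0, 1.0);
effect.light0.position = GLKVector4Make(0.0, 1.0, 1.0, 0.0);

// First texture unit (textureInfo comes from GLKTextureLoader)
effect.texture2d0.enabled = GL_TRUE;
effect.texture2d0.name = textureInfo.name;

// prepareToDraw must be called before each draw call that uses the effect
[effect prepareToDraw];
glDrawArrays(GL_TRIANGLES, 0, 6);   // assumes vertex data has already been uploaded and enabled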

Fifth, case analysis

1. Use GLKit to change the current view's background color

As mentioned above, GLKit has two important classes, GLKView and GLKViewController, so we either make the current view inherit from GLKView or make the current view controller inherit from GLKViewController. This case uses a storyboard, so it is enough to change the class of the view controller's view to GLKView:

Then we go to the .m file.

  • Import the related headers and define an EAGLContext member variable:
#import <OpenGLES/ES3/gl.h>
#import <OpenGLES/ES3/glext.h>

@interface ViewController ()
{
    EAGLContext *context;
}
@end
  • The viewDidLoad method:
- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.

    //1. Initialize the context & set the current context
    /*
     EAGLContext is Apple's rendering layer for OpenGL ES on the iOS platform.
     kEAGLRenderingAPIOpenGLES1 = 1,   // fixed-function pipeline
     kEAGLRenderingAPIOpenGLES2 = 2,
     kEAGLRenderingAPIOpenGLES3 = 3,
     */
    context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES3];

    // Check whether the context was created successfully
    if (!context) {
        NSLog(@"Create ES context Failed");
    }
    // Set the current context
    [EAGLContext setCurrentContext:context];

    //2. Get the GLKView and attach the context to it
    GLKView *view = (GLKView *)self.view;
    view.context = context;

    //3. Set the background color
    glClearColor(0, 0, 1, 1.0);
}
  • Implement the GLKView delegate method:
/*
 The GLKView object makes its OpenGL ES context the current context and binds its framebuffer
 as the target of OpenGL ES rendering commands. The delegate method should then draw the view's contents.
 */
- (void)glkView:(GLKView *)view drawInRect:(CGRect)rect {
    glClear(GL_COLOR_BUFFER_BIT);
}

Ok, let’s look at the effect:

Perfect!!

2. Use GLKit to load an image texture and display it on the view

  • Add a new member variable based on case 1:
@interface ViewController ()
{
    EAGLContext *context;
    GLKBaseEffect *cEffect;
}
@end
  • Initialize OpenGL ES:
- (void)setUpConfig {
    //1. Initialize the context & set the current context
    /*
     EAGLContext is Apple's rendering layer for OpenGL ES on the iOS platform.
     kEAGLRenderingAPIOpenGLES1 = 1,   // fixed-function pipeline
     kEAGLRenderingAPIOpenGLES2 = 2,
     kEAGLRenderingAPIOpenGLES3 = 3,
     */
    context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES3];
    // Check whether the context was created successfully
    if (!context) {
        NSLog(@"Create ES context Failed");
    }
    // Set the current context
    [EAGLContext setCurrentContext:context];

    //2. Get the GLKView and attach the context to it
    GLKView *view = (GLKView *)self.view;
    view.context = context;

    /*
     drawableColorFormat: the color render buffer format.
     OpenGL ES has a buffer that stores the colors that will be displayed on the screen; this
     property sets the color format of each pixel in that buffer.
     GLKViewDrawableColorFormatRGBA8888 = 0   // default; 8 bits per component (RGBA), so 4 bytes per pixel
     GLKViewDrawableColorFormatRGB565         // use this if your app can work with a smaller color range

     drawableDepthFormat: the depth render buffer format.
     GLKViewDrawableDepthFormatNone = 0       // no depth buffer
     GLKViewDrawableDepthFormat16
     GLKViewDrawableDepthFormat24
     If you need a depth buffer (typically for 3D rendering), choose GLKViewDrawableDepthFormat16
     or GLKViewDrawableDepthFormat24; GLKViewDrawableDepthFormat16 consumes fewer resources.
     */
    //3. Configure the render buffers created by the view
    view.drawableColorFormat = GLKViewDrawableColorFormatRGBA8888;
    view.drawableDepthFormat = GLKViewDrawableDepthFormat16;

    //4. Set the background color
    glClearColor(0, 0, 1, 1.0);
}

Compared with case 1, this step adds configuration of the color buffer and depth buffer created by the view:

3. Configure the render buffers created by the view.

view.drawableColorFormat = GLKViewDrawableColorFormatRGBA8888;
view.drawableDepthFormat = GLKViewDrawableDepthFormat16;
  • Create vertex data
-(void)setUpVertexData {
    //1. Set up the vertex array (vertex coordinates, texture coordinates)
    /*
     Texture coordinates range over [0,1]; the origin is the lower-left corner (0,0),
     so (0,0) is the lower-left corner of the texture image and (1,1) is the upper-right corner.
     */
    GLfloat vertexData[] = {
         0.5, -0.5, 0.0f,    1.0f, 0.0f, // lower right
         0.5,  0.5, 0.0f,    1.0f, 1.0f, // upper right
        -0.5,  0.5, 0.0f,    0.0f, 1.0f, // upper left

         0.5, -0.5, 0.0f,    1.0f, 0.0f, // lower right
        -0.5,  0.5, 0.0f,    0.0f, 1.0f, // upper left
        -0.5, -0.5, 0.0f,    0.0f, 0.0f, // lower left
    };

    /*
     Vertex array: the developer can choose to keep the vertex data in CPU memory and pass it
     into the pipeline directly from memory when drawing; this is called a vertex array.
     Vertex buffer: for better performance, allocate a block of video memory in advance and copy
     the vertex data into it; this block of video memory is called a vertex buffer.
     */
    //2. Create a vertex buffer object and get its identifier
    GLuint bufferID;
    glGenBuffers(1, &bufferID);
    //3. Bind the vertex buffer
    glBindBuffer(GL_ARRAY_BUFFER, bufferID);
    //4. Copy the vertex data into the vertex buffer
    glBufferData(GL_ARRAY_BUFFER, sizeof(vertexData), vertexData, GL_STATIC_DRAW);

    /*
     (1) On iOS, all vertex shader attribute channels are disabled by default for performance
     reasons, which means the vertex data is not visible on the shader (server) side even after
     glBufferData has copied it from memory into the vertex buffer (GPU memory). The channel must
     be opened with glEnableVertexAttribArray, which allows the vertex shader to read the data
     copied from the CPU to the GPU.

     (2) glVertexAttribPointer(GLuint indx, GLint size, GLenum type, GLboolean normalized,
                               GLsizei stride, const GLvoid *ptr)
     Purpose: describes how vertex data is read from the buffer.
     Parameters:
       index      - index of the vertex attribute to modify
       size       - number of components read per vertex (position has 3: x,y,z; color has 4: r,g,b,a; texture has 2)
       type       - data type of each component: GL_BYTE, GL_UNSIGNED_BYTE, GL_SHORT,
                    GL_UNSIGNED_SHORT, GL_FIXED, or GL_FLOAT; the initial value is GL_FLOAT
       normalized - whether fixed-point values should be normalized (GL_TRUE) or used as-is (GL_FALSE)
       stride     - byte offset between consecutive vertex attributes; 0 means tightly packed
       ptr        - pointer to the first component of the first vertex attribute in the array; initial value 0
     */
    // Vertex position data
    glEnableVertexAttribArray(GLKVertexAttribPosition);
    glVertexAttribPointer(GLKVertexAttribPosition, 3, GL_FLOAT, GL_FALSE, sizeof(GLfloat) * 5, (GLfloat *)NULL + 0);

    // Texture coordinate data
    glEnableVertexAttribArray(GLKVertexAttribTexCoord0);
    glVertexAttribPointer(GLKVertexAttribTexCoord0, 2, GL_FLOAT, GL_FALSE, sizeof(GLfloat) * 5, (GLfloat *)NULL + 3);
}

The purpose of this step is to create the vertex data used to display the image and to determine where the image appears in the view. The picture is a quadrilateral that we split into two triangles, so we need six vertices.

It is worth mentioning that the vertex coordinates are defined in a one-dimensional array; as discussed earlier, this is the convention in OpenGL, and OpenGL ES works the same way.

  • Load the image texture and display it
- (void)setUpTexture {
    //1. Get the image path
    NSString *filePath = [[NSBundle mainBundle] pathForResource:@"Jessica" ofType:@"jpg"];

    //2. Set texture parameters
    // The origin of texture coordinates is the lower-left corner, but an image's origin is the
    // upper-left corner, so the texture needs to be flipped when it is loaded.
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:@(1), GLKTextureLoaderOriginBottomLeft, nil];
    GLKTextureInfo *textureInfo = [GLKTextureLoader textureWithContentsOfFile:filePath options:options error:nil];

    //3. Use GLKBaseEffect to bind the texture
    cEffect = [[GLKBaseEffect alloc] init];
    cEffect.texture2d0.enabled = GL_TRUE;
    cEffect.texture2d0.name = textureInfo.name;
}
  • Implement the GLKViewDelegate proxy method:
/*
 The GLKView object makes its OpenGL ES context the current context and binds its framebuffer
 as the target of OpenGL ES rendering commands. The delegate method should then draw the view's contents.
 */
- (void)glkView:(GLKView *)view drawInRect:(CGRect)rect {
    //1. Clear the color buffer
    glClear(GL_COLOR_BUFFER_BIT);

    //2. Prepare the effect for drawing
    [cEffect prepareToDraw];

    //3. Draw the two triangles (6 vertices)
    glDrawArrays(GL_TRIANGLES, 0, 6);
}
  • Call them from viewDidLoad:
- (void)viewDidLoad {
    [super viewDidLoad];
    //1. OpenGL ES related initialization
    [self setUpConfig];
    //2. Load the vertex/texture coordinate data
    [self setUpVertexData];
    //3. Load the texture data (via GLKBaseEffect)
    [self setUpTexture];
}

Ok, run it

Note: when using GLKit, remember to implement the GLKViewDelegate proxy method.

Remember to like this post if you found it helpful! I hear that readers who leave a like pass every exam and win every prize. ღ(´ · ᴗ · ‘)