The main purpose of this case is to understand GLSL custom shaders and how they are used

The effect of this case is to load and display an image using custom GLSL shaders, as shown below

The overall flow chart of the case is as follows

The process is mainly divided into four modules

  • Preparation: project creation, view customization, property setup, etc
  • Custom shaders: use GLSL to write custom vertex and fragment shaders
  • Initialization: create the layer and context, clear the buffers, and set up the RenderBuffer & FrameBuffer
  • Drawing: mainly GLSL loading, vertex data processing, texture loading, and finally drawing to the screen

Preparation

How to create a shader file

  • Command + N, start creating a file

  • Select iOS –> Other –> Empty and click Next

  • Enter a file name, such as shaderv.vsh, and click Create

Custom shaders

The custom shader is essentially a string; when writing it in Xcode there is no syntax highlighting or code completion, so be careful!

Vertex shader

  • Define two attribute variables: position for the vertex coordinates and textCoordinate for the texture coordinates
  • Define a varying variable, varyTextCoord, that bridges to the fragment shader and passes the texture coordinates from the vertex shader to the fragment shader
  • main function: if the vertex is not transformed, the vertex coordinates are assigned directly to the built-in variable gl_Position; if the vertex is transformed, the transformed result (the final vertex coordinate data) is assigned to gl_Position
    attribute vec4 position;        // vertex coordinates
    attribute vec2 textCoordinate;  // texture coordinates
    varying lowp vec2 varyTextCoord;

    void main() {
        // pass the texture coordinates to the fragment shader through varying
        varyTextCoord = textCoordinate;
        gl_Position = position;
    }

Fragment shader

  • The precision of the float type must be specified in the fragment shader; if it is omitted, errors may occur
  • Define the bridging variable from the vertex shader, varyTextCoord (the texture coordinates); it must be exactly the same as in the vertex shader, otherwise the texture coordinate data will not be passed through
  • Define a uniform-modified texture sampler, colorMap, used to fetch the texel for each pixel of the texture coordinates
  • main function: mainly fills in the texture color. The built-in function texture2D gets the final color value; it takes two parameters, 1 the texture sampler and 2 the texture coordinate, returns a vec4 value, and that result is assigned to the built-in variable gl_FragColor
    precision highp float;            // specify the default precision of float
    varying lowp vec2 varyTextCoord;  // texture coordinates
    uniform sampler2D colorMap;       // texture sampler

    void main() {
        // texture2D(texture sampler, texture coordinates)
        // assign the sampled texel to the built-in variable gl_FragColor
        gl_FragColor = texture2D(colorMap, varyTextCoord);
    }

Initialization

Initialization is divided into four parts (a possible calling sequence is sketched after the list)

  • SetupLayer: create the layer
  • SetupContext: create the context
  • DeleteRenderAndFrameBuffer: clear the buffers
  • SetupRenderBuffer & setupFrameBuffer: set up the RenderBuffer & FrameBuffer
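
A minimal sketch of how these steps might be driven in order, assuming the method names used in this article; the renderLayer name for the drawing step is an assumption, not taken from the original project:

    // Possible driver for the initialization and drawing steps (sketch; renderLayer is a hypothetical name)
    - (void)layoutSubviews {
        [self setupLayer];                   // 1. create the layer
        [self setupContext];                 // 2. create the context
        [self deleteRenderAndFrameBuffer];   // 3. clear the buffers
        [self setupRenderBuffer];            // 4. set up the RenderBuffer
        [self setupFrameBuffer];             //    and the FrameBuffer
        [self renderLayer];                  // 5. draw (hypothetical method name)
    }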

SetupLayer function: creates a layer

The layer is mainly used as the carrier for displaying the OpenGL ES drawing content. The function flow is as follows

  • There are two ways to create the special layer
    • 1) Use UIView’s native layer directly. Because UIView’s native layer inherits from CALayer, while the layer to be created inherits from CAEAGLLayer, the class method layerClass must be overridden to return [CAEAGLLayer class]
    • 2) Create the layer with init: call [[CAEAGLLayer alloc] init] to create a new layer and add it to the view’s layer
    • In this case, we use the layer that comes with the view
  • Set the contentScaleFactor so the layer size matches the screen size
  • Set the drawable properties
    • kEAGLDrawablePropertyRetainedBacking can only be true or false
    • kEAGLDrawablePropertyColorFormat has the following three values
kEAGLDrawablePropertyColorFormat enumerated values:

  • kEAGLColorFormatRGBA8: 32-bit RGBA color value (8 bits per component, 4 * 8 = 32 bits)
  • kEAGLColorFormatRGB565: 16-bit RGB color value
  • kEAGLColorFormatSRGBA8: standard red, green, and blue (sRGB); the color space is based on device-independent color coordinates, so the same color coordinates correspond to the same color when transferred between different devices, unaffected by each device’s own color characteristics

Here are some descriptions of the two properties

  • kEAGLDrawablePropertyRetainedBacking: whether the contents of the drawable surface are retained after it is displayed (default: false)
  • kEAGLDrawablePropertyColorFormat: the internal color buffer format of the drawable surface (default: kEAGLColorFormatRGBA8)
    // 1. Create the layer
    - (void)setupLayer {
        self.myEagLayer = (CAEAGLLayer *)self.layer;

        // 2. Set the scale
        [self setContentScaleFactor:[[UIScreen mainScreen] scale]];

        // 3. Set the drawable properties
        self.myEagLayer.drawableProperties = [NSDictionary dictionaryWithObjectsAndKeys:
                                              @false, kEAGLDrawablePropertyRetainedBacking,
                                              kEAGLColorFormatRGBA8, kEAGLDrawablePropertyColorFormat, nil];
    }

    + (Class)layerClass {
        return [CAEAGLLayer class];
    }

The setupContext function creates the context

The context is used to store OpenGL ES state; it is a state machine. Whether you use GLKit or GLSL, a context is required

  • Create the context, specifying the OpenGL ES rendering API version (2 or 3), and check whether the context was created successfully
  • Set the current context to the created context, and check whether setting it succeeded
  • Assign it to the global context property
    // 2. Create the context
    - (void)setupContext {
        // 1. Specify the OpenGL ES rendering API version
        EAGLRenderingAPI api = kEAGLRenderingAPIOpenGLES2;

        // 2. Create the context
        EAGLContext *context = [[EAGLContext alloc] initWithAPI:api];

        // 3. Check whether the context was created successfully
        if (!context) {
            NSLog(@"Create context failed!");
            return;
        }

        // 4. Set the current graphics context
        if (![EAGLContext setCurrentContext:context]) {
            NSLog(@"setCurrentContext failed");
            return;
        }

        // 5. Assign the local variable to the global property
        self.myContext = context;
    }

DeleteRenderAndFrameBuffer function: clear the buffers

The purpose of clearing the buffers is to remove residual data and prevent it from affecting the current drawing

Two buffers need to be cleared: the RenderBuffer and the FrameBuffer

    // 3. Clear the buffers
    - (void)deleteRenderAndFrameBuffer {
        // Clear the render buffer
        glDeleteRenderbuffers(1, &_myColorRenderBuffer);
        self.myColorRenderBuffer = 0;

        // Clear the frame buffer
        glDeleteFramebuffers(1, &_myColorFrameBuffer);
        self.myColorFrameBuffer = 0;
    }

Set up the RenderBuffer & FrameBuffer

Before setting it up, let’s talk about RenderBuffer and FrameBuffer

  • RenderBuffer: a 2D image buffer allocated by the application; it needs to be attached to the FrameBuffer
  • FrameBuffer: a collection of attachment points for the color, depth, and stencil buffers, commonly called an FBO. It is a manager used to manage the RenderBuffer; the FrameBuffer itself does not actually store the data

The graph below illustrates the relationship between the two

  • The FrameBuffer has three attachment points
    • Color Attachment: manages the texture and color buffers
    • Depth Attachment: manages the Depth Buffer; the depth test affects what ends up in the color buffer
    • Stencil Attachment: manages the Stencil Buffer
  • The RenderBuffer has three types of buffer
    • Depth Buffer: stores depth values
    • Texture buffer: stores the texels corresponding to the texture coordinates, color values, etc
    • Stencil Buffer: stores stencil values

SetupRenderBuffer function

Define a RenderBuffer ID and request an identifier, bind the identifier to GL_RENDERBUFFER, and bind the layer’s drawable storage to the RenderBuffer object

The configuration process is as follows

    // 4. Set up the RenderBuffer
    - (void)setupRenderBuffer {
        // 1. Define a buffer ID
        GLuint buffer;

        // 2. Request an identifier
        glGenRenderbuffers(1, &buffer);

        // 3. Assign it to the global property
        self.myColorRenderBuffer = buffer;

        // 4. Bind the identifier to GL_RENDERBUFFER
        glBindRenderbuffer(GL_RENDERBUFFER, self.myColorRenderBuffer);

        // 5. Bind the layer's drawable storage to the OpenGL ES render buffer
        [self.myContext renderbufferStorage:GL_RENDERBUFFER fromDrawable:self.myEagLayer];
    }

SetupFrameBuffer function

Define a FrameBuffer ID, request an identifier, and bind it to GL_FRAMEBUFFER. Then bind the RenderBuffer to the FrameBuffer’s GL_COLOR_ATTACHMENT0 attachment point with the glFramebufferRenderbuffer function, so that the FrameBuffer manages the RenderBuffer and the RenderBuffer stores the relevant data into the corresponding buffer

The setup process is as follows

    // 5. Set up the FrameBuffer
    - (void)setupFrameBuffer {
        // 1. Define a buffer ID
        GLuint buffer;

        // 2. Request an identifier
        glGenFramebuffers(1, &buffer);

        // 3. Assign it to the global property
        self.myColorFrameBuffer = buffer;

        // 4. Bind the identifier to GL_FRAMEBUFFER
        glBindFramebuffer(GL_FRAMEBUFFER, self.myColorFrameBuffer);

        /* 5. Attach the render buffer myColorRenderBuffer to the GL_COLOR_ATTACHMENT0 attachment point
           glFramebufferRenderbuffer(GLenum target, GLenum attachment, GLenum renderbuffertarget, GLuint renderbuffer)
           parameter 1: framebuffer target
           parameter 2: attachment point
           parameter 3: render buffer target
           parameter 4: the render buffer to attach */
        glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, self.myColorRenderBuffer);
    }

Note that the RenderBuffer and FrameBuffer must be bound in order: the RenderBuffer is set up first, then the FrameBuffer, and only then is the RenderBuffer attached to the FrameBuffer

Drawing

The overall process diagram is drawn as shown in the diagram

It mainly includes five parts

  • Initialize: initialize the background color, clear the buffer, and set the viewport size
  • GLSL custom shader loading: the general steps for loading custom shaders are read --> load --> compile --> link the program --> use
  • Vertex data setup and processing: pass the vertex coordinates and texture coordinates into the custom vertex shader
  • Load the texture: decompress the PNG/JPG image into a bitmap and read the texel for each pixel
  • Draw: start drawing, store the result in the RenderBuffer, and display the image on the screen from the RenderBuffer

Initialization

Note that you need to set the viewport size to match the screen size

    // Set the clear color & clear the screen
    glClearColor(0.3f, 0.45f, 0.5f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);

    // Set the viewport size to match the screen
    CGFloat scale = [[UIScreen mainScreen] scale];
    glViewport(self.frame.origin.x * scale, self.frame.origin.y * scale,
               self.frame.size.width * scale, self.frame.size.height * scale);

GLSL custom shader loading

The loading of custom shaders is divided into the following steps

  • Read the custom shader files
  • compileShader & loadShaders functions: compile & load the shaders
  • Link the program & determine whether the link succeeded
  • Use the program

To read the custom shader files, get each file’s path and pass it into the loadShaders function for loading

NSString *vertFile = [[NSBundle mainBundle] pathForResource:@"shaderv" ofType:@"vsh"];    NSString *fragFile = [[NSBundle mainBundle] pathForResource:@"shaderf" ofType:@"fsh"];

LoadShaders & compileShader

  • [compileShader function]: to load/attach shaders to the program, they first need to be compiled, in the following steps

    • Read the source string from the shader file at the given path and convert it to a C string of type GLchar
    • According to the shader type passed in (type), call the glCreateShader function to create a shader with a unique ID; no source code is attached to it yet
    • Attach the shader source that was read to the created shader with the glShaderSource function, and return the shader’s ID to the loadShaders function so it can use the corresponding shader by ID
    • Compile the source code attached to the shader into object code with the glCompileShader function
    // Compile the shader
    - (void)compileShader:(GLuint *)shader type:(GLenum)type file:(NSString *)file {
        // 1. Read the shader source string from the file path
        NSString *content = [NSString stringWithContentsOfFile:file encoding:NSUTF8StringEncoding error:nil];
        const GLchar *source = (GLchar *)[content UTF8String];
        NSLog(@"content %@, source %s", content, source);

        // 2. Create a shader (according to type)
        *shader = glCreateShader(type);

        // 3. Attach the shader source code to the shader object
        // parameter 1: shader, the shader object to compile
        // parameter 2: numOfStrings, the number of source strings, here 1
        // parameter 3: strings, the shader program source code (the real shader source)
        // parameter 4: lenOfStrings, an array of string lengths, or NULL meaning the strings are null-terminated
        glShaderSource(*shader, 1, &source, NULL);
        NSLog(@"shader %d", *shader);

        // 4. Compile the shader source into object code
        glCompileShader(*shader);
    }
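
As a small aid for the shader troubleshooting discussed later, a minimal sketch of checking the compile result could be appended at the end of compileShader; glGetShaderiv and glGetShaderInfoLog are standard OpenGL ES calls, but this check is not part of the original method above:

    // Optional sketch: verify that compilation succeeded (not in the original compileShader)
    GLint compileStatus;
    glGetShaderiv(*shader, GL_COMPILE_STATUS, &compileStatus);
    if (compileStatus == GL_FALSE) {
        GLchar message[512];
        glGetShaderInfoLog(*shader, sizeof(message), NULL, message);
        NSLog(@"Shader compile error: %s", message);
    }
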
  • [loadShaders function]: after compiling the vertex shader and fragment shader and getting back their IDs, attach both shaders to the program with the glAttachShader function, release the shaders that are no longer needed, and assign the program to the global property

    // Load the shaders
    - (GLuint)loadShaders:(NSString *)vert Withfrag:(NSString *)frag {
        // 1. Define two temporary shader objects
        GLuint verShader, fragShader;

        // Create the program
        GLint program = glCreateProgram();

        // 2. Compile the vertex shader and the fragment shader
        // parameter 1: the compiled shader (returned by reference)
        // parameter 2: the shader type, GL_VERTEX_SHADER or GL_FRAGMENT_SHADER
        // parameter 3: the file path
        [self compileShader:&verShader type:GL_VERTEX_SHADER file:vert];
        [self compileShader:&fragShader type:GL_FRAGMENT_SHADER file:frag];
        NSLog(@"verShader %d, fragShader: %d", verShader, fragShader);

        // 3. Attach the compiled shaders to the final program
        glAttachShader(program, verShader);
        glAttachShader(program, fragShader);

        // 4. Release the shaders that are no longer needed
        glDeleteShader(verShader);
        glDeleteShader(fragShader);

        return program;
    }

Link the program

  • Link the program with the glLinkProgram function
  • Use the glGetProgramiv function with GL_LINK_STATUS to get the link status and determine whether linking succeeded or failed
  • If linking fails, obtain the error log with the glGetProgramInfoLog function and troubleshoot according to the error message (see the sketch below)
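
A minimal sketch of these three steps; the program property name self.myPrograme follows this article’s naming, and the exact log handling is an assumption:

    // Link the program and check the result (sketch)
    glLinkProgram(self.myPrograme);

    GLint linkStatus;
    glGetProgramiv(self.myPrograme, GL_LINK_STATUS, &linkStatus);
    if (linkStatus == GL_FALSE) {
        // Obtain the error log and print it for troubleshooting
        GLchar message[512];
        glGetProgramInfoLog(self.myPrograme, sizeof(message), NULL, message);
        NSLog(@"Program link error: %s", message);
    } else {
        NSLog(@"Program link success!");
    }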

Note 1: If the link fails, you need to do the following checks

  • Generally the shader has a problem:
    • 1) Check whether the shader was written incorrectly
    • 2) Check whether the places where values are passed were written incorrectly
  • If you are sure the shader is fine, check the calls in layoutSubviews again. (The author once hit a case with no error message at all; breakpoint debugging showed the shader was always 0 even though the shader file had a normal length. After a frantic search, the cause turned out to be carelessness in a call: a context call had been written as a layer call.)

Note 2: If the link is successful but the image is not loaded

  • Check the RenderBuffer and FrameBuffer settings for problems (one quick check is sketched below)
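
One quick way to verify the FrameBuffer setup is glCheckFramebufferStatus, a standard OpenGL ES call; this check is not in the original project code and is shown only as a sketch:

    // Optional sketch: after setupFrameBuffer, verify that the framebuffer is complete
    GLenum status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
    if (status != GL_FRAMEBUFFER_COMPLETE) {
        NSLog(@"Framebuffer is incomplete: 0x%x", status);
    }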

Use the program: activate the successfully linked program with the glUseProgram function

 glUseProgram(self.myPrograme);

Vertex data setup and processing

The vertex data is stored in an array; passing the vertex coordinates and texture coordinates into the custom vertex shader takes three steps

  • Set the vertex data: mainly initialize the vertex coordinates and texture coordinates
  • Open the vertex buffer: used to copy vertex data from the CPU to the GPU
  • Open the vertex/fragment attribute channels

There’s nothing to say about setting the vertex data, it’s just a one-dimensional array, what’s the next two steps

Open the vertex buffer. This part is the same as opening the buffer when using the GLKit framework; there are four steps

  • Define a vertex buffer ID with GLuint
  • Request a vertex buffer identifier with the glGenBuffers function
  • Bind the buffer identifier to GL_ARRAY_BUFFER with the glBindBuffer function
  • Copy the vertex data to the GPU with the glBufferData function
    GLuint attrBuffer;
    glGenBuffers(1, &attrBuffer);
    glBindBuffer(GL_ARRAY_BUFFER, attrBuffer);
    glBufferData(GL_ARRAY_BUFFER, sizeof(attrArr), attrArr, GL_DYNAMIC_DRAW);

In iOS, the attribute channels are closed by default and must be enabled manually. The data contains both vertex coordinates and texture coordinates, so the channel has to be enabled twice. Enabling it here differs from doing so in the GLKit framework.

  • This example uses a custom shader, so you need to get the entry to the vertex attribute yourself.
  • In GLKit, we use a packaged fixed shader that specifies the entry

Opening the channels with a custom shader generally involves the following three steps (compared with GLKit, there is only one extra step of getting the entry; the latter two steps are largely unchanged)

  • Get the entry to the vertex attribute with the glGetAttribLocation function; it takes two parameters, the program and the variable name string from the custom vertex shader file. Emphasis: the string of the second argument must match the corresponding variable name in the shader file!
  • Enable reading data from the buffer for that attribute with the glEnableVertexAttribArray function
  • Set how the data is read with the glVertexAttribPointer function
    // ----- vertex coordinates -----
    // (1) glGetAttribLocation: get the entry of the vertex attribute "position";
    //     the string must be consistent with the variable name in the vertex shader
    GLuint position = glGetAttribLocation(self.myPrograme, "position");

    // (2) enable reading data from the buffer for this attribute
    glEnableVertexAttribArray(position);

    // (3) set how the data is read
    // parameter 1: index, the index of the vertex attribute
    // parameter 2: size, the number of components per vertex attribute, 1, 2, 3 or 4 (default 4)
    // parameter 3: type, the data type of each component, commonly GL_FLOAT, GL_BYTE, GL_SHORT (default GL_FLOAT)
    // parameter 4: normalized, whether fixed-point data should be normalized (GL_FALSE here)
    // parameter 5: stride, the offset between consecutive vertex attributes (default 0)
    // parameter 6: a pointer to the first component of the first vertex attribute in the array
    glVertexAttribPointer(position, 3, GL_FLOAT, GL_FALSE, sizeof(GLfloat) * 5, NULL);

    // ----- texture coordinates -----
    // (1) get the entry of "textCoordinate"; the string must be consistent with the shader
    GLuint textCoor = glGetAttribLocation(self.myPrograme, "textCoordinate");

    // (2) enable the attribute channel
    glEnableVertexAttribArray(textCoor);

    // (3) set how the data is read: 2 components, offset by 3 floats into each vertex
    glVertexAttribPointer(textCoor, 2, GL_FLOAT, GL_FALSE, sizeof(GLfloat) * 5, (float *)NULL + 3);

Loading textures

The setupTexture function decompresses the PNG/JPG image into a bitmap and loads it as texture data. Core Graphics is used to decompress the image. The texture-loading process is shown below

  • Image decompression: convert the UIImage to a CGImageRef
  • Image redraw: use the common CGContextRef context and call CGContextDrawImage with the default drawing method. Before redrawing, you need to obtain the image’s size, width, and height, because these values are used when drawing
  • Bind the texture with the glBindTexture function: when there is only one texture, the default texture ID 0 can be used, and 0 is always active, so glGenTextures can be omitted
  • Set the texture parameters: use the glTexParameteri function to set the magnification/minification filter modes and the S/T wrap modes
  • Load the texture with the glTexImage2D function and, when finished, free the pointer to the texture data
    // Load the texture from an image
    - (GLuint)setupTexture:(NSString *)fileName {
        // 1. Convert the UIImage to a CGImageRef & check whether the image was obtained successfully
        CGImageRef spriteImage = [UIImage imageNamed:fileName].CGImage;
        if (!spriteImage) {
            NSLog(@"Failed to load image %@", fileName);
            exit(1);
        }

        // 2. Get the image width and height
        size_t width = CGImageGetWidth(spriteImage);
        size_t height = CGImageGetHeight(spriteImage);

        // 3. Get the image bytes: width * height * 4 (RGBA)
        GLubyte *spriteData = (GLubyte *)calloc(width * height * 4, sizeof(GLubyte));

        // 4. Create the context
        /* parameter 1: data, the memory address of the bitmap to draw into
           parameter 2: width, the bitmap width in pixels
           parameter 3: height, the bitmap height in pixels
           parameter 4: bitsPerComponent, the bits per pixel component, e.g. 8 for 32-bit RGBA
           parameter 5: bytesPerRow, the bytes per row of the bitmap
           parameter 6: colorSpace, the color space used by the bitmap
           parameter 7: bitmapInfo, e.g. kCGImageAlphaPremultipliedLast for RGBA */
        CGContextRef spriteContext = CGBitmapContextCreate(spriteData, width, height, 8, width * 4,
                                                           CGImageGetColorSpace(spriteImage),
                                                           kCGImageAlphaPremultipliedLast);

        // 5. Draw the image with CGContextDrawImage
        /* CGContextDrawImage uses the Core Graphics framework; its coordinate system differs from UIKit.
           UIKit's origin is at the top-left of the screen, Core Graphics' origin is at the bottom-left. */
        CGRect rect = CGRectMake(0, 0, width, height);
        CGContextDrawImage(spriteContext, rect, spriteImage);

        // 6. Release the context after drawing
        CGContextRelease(spriteContext);

        // 7. Bind the texture to the default texture ID
        glBindTexture(GL_TEXTURE_2D, 0);

        // 8. Set the texture parameters: min/mag filters and S/T wrap modes
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

        // 9. Load the texture data
        float fw = width, fh = height;
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, fw, fh, 0, GL_RGBA, GL_UNSIGNED_BYTE, spriteData);

        // 10. Free the texture data pointer
        free(spriteData);
        return 0;
    }
  • Summary: General steps of image decompression
    • Convert UIImage to CGImageRef & determine whether the image was successfully obtained
    • Get the image’s size, width, and height
    • Get the image bytes = width * height * 4 (RGBA)
    • Create the CGContextRef context
    • Draw the image with CGContextDrawImage

Setting the texture sampler: the main purpose is to obtain the color value of the corresponding pixel in the texture, that is, the texel

  • Get the entry to the fragment shader uniform with the glGetUniformLocation function; it takes two parameters, the program and the variable name string from the custom fragment shader file (colorMap). Emphasis again: the string of the second argument must match the corresponding variable name in the shader file!
  • Set the sampler with the glUniform1i function; it takes two parameters: the first is the entry to the fragment shader uniform (essentially an ID), and the second is the texture unit, which here uses the default 0
glUniform1i(glGetUniformLocation(self.myPrograme, "colorMap"), 0);

Drawing

Start drawing, store the result in the RenderBuffer, and display the image on the screen from the RenderBuffer

  • Call the glDrawArrays function, specifying how the primitives are connected, to draw
  • Call the context’s presentRenderbuffer method to render the drawn image to the screen for display
    // Draw, connecting the vertices as triangles
    glDrawArrays(GL_TRIANGLES, 0, 6);

    // Display from the render buffer to the screen
    [self.myContext presentRenderbuffer:GL_RENDERBUFFER];

See github-10_glSL_01_ load image for the complete code