OpenGL ES 2.0 for iPhone Tutorial Part 2: Textures. This part introduces texturing in OpenGL ES.
In this series of tutorials, our aim is to demystify OpenGL ES 2.0 in a step-by-step, hands-on way. In the first part, OpenGL Tutorial for iOS: OpenGL ES 2.0, we covered the basics of initializing OpenGL, creating vertex and fragment shaders, and rendering a simple rotating cube on the screen. In this part, we move to the next stage: adding textures to the cube. Warning: I'm not an OpenGL expert! I'm teaching myself and taking notes as I go, so if you spot any silly mistakes, please correct them. OK, let's learn more about OpenGL ES 2.0 textures!
Getting started
You can download the sample project from Part 1, or use your own. Compile and run, and you'll see a rotating cube.
The cube is currently green and red because we assigned colors to the vertices; we haven't used any textures yet. But don't worry, that's exactly what we'll do in this section. First download the textures and unzip them into a folder. Drag the folder into your Xcode project, make sure "Copy items into destination group's folder" is checked, and click Finish. You'll see two pictures: one that looks like a small piece of carpet and one that looks like a fish. Let's start by applying the carpet piece to each face of the cube.
Reading pixel data
Our first step is to get the image data into OpenGL. The problem is that OpenGL doesn't accept the image formats programmers typically use (like PNG). Instead, OpenGL requires you to decode the image into raw pixel data in a buffer, in a format you specify. Fortunately, you can easily produce such a buffer with a few built-in Quartz 2D (Core Graphics) functions; if you've worked through a Core Graphics 101 series before, many of the calls here will be familiar. There are four main steps:
- Get a Core Graphics image reference. Since we'll use Core Graphics to extract the raw pixel data, we need a reference to the image. That's easy: we can use the CGImage property of UIImage.
- Create a Core Graphics bitmap context. A bitmap context is backed by a buffer in memory that will hold the raw pixel data.
- Draw the image into the context. We do this using a simple Core Graphics function call. The buffer then contains the raw pixel data.
- Pass the pixel data to OpenGL. We create an OpenGL texture object, get its unique ID (its "name"), and then call the appropriate function to upload the pixel data to OpenGL.
OK, now let's see what the code looks like. In OpenGLView.m, add a new method above initWithFrame:
- (GLuint)setupTexture:(NSString *)fileName {
    // 1
    CGImageRef spriteImage = [UIImage imageNamed:fileName].CGImage;
    if (!spriteImage) {
        NSLog(@"Failed to load image %@", fileName);
        exit(1);
    }

    // 2
    size_t width = CGImageGetWidth(spriteImage);
    size_t height = CGImageGetHeight(spriteImage);
    GLubyte *spriteData = (GLubyte *)calloc(width * height * 4, sizeof(GLubyte));
    CGContextRef spriteContext = CGBitmapContextCreate(spriteData, width, height, 8,
        width * 4, CGImageGetColorSpace(spriteImage), kCGImageAlphaPremultipliedLast);

    // 3
    CGContextDrawImage(spriteContext, CGRectMake(0, 0, width, height), spriteImage);
    CGContextRelease(spriteContext);

    // 4
    GLuint texName;
    glGenTextures(1, &texName);
    glBindTexture(GL_TEXTURE_2D, texName);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, (GLsizei)width, (GLsizei)height, 0,
        GL_RGBA, GL_UNSIGNED_BYTE, spriteData);

    free(spriteData);
    return texName;
}
That's a fair amount of code, but it follows the same four steps summarized above. Let's look at them one by one.
- Get the Core Graphics image reference. We use UIImage's imageNamed: method to load the image, then read its CGImage property.
- Create a Core Graphics bitmap context. To create a bitmap context you must allocate its buffer yourself. We use a couple of function calls to get the image's width and height, then request width * height * 4 bytes. Why multiply by 4? When the image is drawn into the context, one byte is written for each of red, green, blue, and alpha per pixel, 4 bytes in total. Why one byte each? Because when we create the context with CGBitmapContextCreate, we pass 8 bits (one byte) as the fourth argument, the bits per component.
- Draw the image into the context. This step is also very simple. All we need to do is call the Core Graphics function to draw the image in the specified rectangle. Once you’ve done that, you’re ready to release the context.
- Pass the pixel data to OpenGL. First we call glGenTextures to create a texture object with a unique ID (its "name"). Then we call glBindTexture to bind the new texture to the current texture unit. Next, glTexParameteri sets a texture parameter: here we set GL_TEXTURE_MIN_FILTER (the filter used when the texture has to be shrunk, e.g. for distant objects) to GL_NEAREST (pick the single nearest texel). Roughly, GL_NEAREST gives a pixel-art look, while GL_LINEAR looks smooth.
Note: if you aren't using mipmaps, setting GL_TEXTURE_MIN_FILTER is mandatory. I didn't know that at first, and without this line nothing showed up. It turns out this is listed in OpenGL Common Mistakes. The last step is to call glTexImage2D to pass the pixel buffer we created to OpenGL. When you call this function, you specify the format of the incoming pixel data; here we use GL_RGBA and GL_UNSIGNED_BYTE, meaning one byte each for red, green, blue, and alpha. OpenGL supports other pixel formats too (see how Cocos2D handles pixel formats), but here we choose the simplest case. Once the image data has been passed to OpenGL, we can free the pixel buffer; it's no longer needed, because OpenGL has copied the texture data to the GPU. Finally, we return the texture's ID (name), which we'll use when drawing.
Using texture data
We've now encapsulated a method that loads an image, passes it to OpenGL, and returns the texture's name. Next we'll use it to texture the cube, starting with the vertex and fragment shaders. Replace SimpleVertex.glsl with the following:
attribute vec4 Position;
attribute vec4 SourceColor;
varying vec4 DestinationColor;
uniform mat4 Projection;
uniform mat4 Modelview;
attribute vec2 TexCoordIn; // New
varying vec2 TexCoordOut; // New
void main(void) {
DestinationColor = SourceColor;
gl_Position = Projection * Modelview * Position;
TexCoordOut = TexCoordIn; // New
}
Here, we declare a new attribute, TexCoordIn. Remember: an attribute is a value you can set per vertex, so for each vertex we can specify which texture coordinate it maps to. Texture coordinates are a bit unusual in that they run from 0 to 1: (0, 0) is the bottom-left corner of the texture and (1, 1) is the top-right. However, Core Graphics flips the image when it is loaded, so in this code (0, 1) is actually the bottom-left and (0, 0) the top-left, which takes some getting used to. We also declare a new varying, TexCoordOut, and pass TexCoordIn through to it. Remember: OpenGL automatically interpolates varying variables across each triangle before the fragment shader runs. So, for example, if we map the bottom-left corner of a square we're texturing to (0, 0) and the bottom-right to (1, 0), then when rendering the pixel in between on the bottom edge, our fragment shader will automatically be passed (0.5, 0).
Replace SimpleFragment.glsl with the following:
varying lowp vec4 DestinationColor;
varying lowp vec2 TexCoordOut; // New
uniform sampler2D Texture; // New
void main(void) {
gl_FragColor = DestinationColor * texture2D(Texture, TexCoordOut); // New
}
Previously we used DestinationColor directly as the output color gl_FragColor; now we multiply it by the color sampled from the texture at the given coordinates. texture2D is the built-in GLSL function for sampling a texture.
Now our new shaders are ready to use. Open OpenGLView.h and add the following new instance variables:
GLuint _floorTexture;
GLuint _fishTexture;
GLuint _texCoordSlot;
GLuint _textureUniform;
These variables track the names of the two textures (floor and fish), the slot of the new attribute, and the new texture uniform.
Then open OpenGLView.m and make the following changes:
// Add texture coordinates to the Vertex struct as follows
typedef struct {
    float Position[3];
    float Color[4];
    float TexCoord[2]; // New
} Vertex;

// Add texture coordinates to Vertices as follows
const Vertex Vertices[] = {
    {{1, -1, 0}, {1, 0, 0, 1}, {1, 0}},
    {{1, 1, 0}, {1, 0, 0, 1}, {1, 1}},
    {{-1, 1, 0}, {0, 1, 0, 1}, {0, 1}},
    {{-1, -1, 0}, {0, 1, 0, 1}, {0, 0}},
    {{1, -1, -1}, {1, 0, 0, 1}, {1, 0}},
    {{1, 1, -1}, {1, 0, 0, 1}, {1, 1}},
    {{-1, 1, -1}, {0, 1, 0, 1}, {0, 1}},
    {{-1, -1, -1}, {0, 1, 0, 1}, {0, 0}}
};

// Add to end of compileShaders
_texCoordSlot = glGetAttribLocation(programHandle, "TexCoordIn");
glEnableVertexAttribArray(_texCoordSlot);
_textureUniform = glGetUniformLocation(programHandle, "Texture");
// Add to end of initWithFrame
_floorTexture = [self setupTexture:@"tile_floor.png"];
_fishTexture = [self setupTexture:@"item_powerup_fish.png"];
// Add inside render:, right before glDrawElements
glVertexAttribPointer(_texCoordSlot, 2, GL_FLOAT, GL_FALSE,
sizeof(Vertex), (GLvoid*) (sizeof(float) * 7));
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, _floorTexture);
glUniform1i(_textureUniform, 0);
Given what we've covered, most of the code above is straightforward. The only notable lines are the last three, which bind the texture we loaded to the Texture uniform declared in the fragment shader. First, we activate the texture unit we want to load the texture into. iOS devices provide between 2 and 8 texture units, which helps performance when you need to sample several textures in a single draw. In this article we only need one texture at a time, so we use the first unit (GL_TEXTURE0). Then, we bind the texture object _floorTexture to the currently active texture unit. Finally, we set the Texture uniform to the index of that unit (0 here). Note: the first and third lines are not strictly required, and in many cases you won't see them at all, because OpenGL uses GL_TEXTURE0 as the active unit by default and uniforms default to 0. I include them because they make the mechanics clearer for beginners. Compile and run, and you should see a textured cube.
As shown above, the cube looks good on the front, but the sides are slightly skewed and distorted. Why is that?
Fixing the distortion
The problem is that we set only one texture coordinate per vertex and then reused those vertices across faces. For example, we mapped the front face's bottom-left vertex to (0, 0). But on the left face that same vertex is the bottom-right corner, where (0, 0) makes no sense; it should be (1, 0). In OpenGL, a vertex is not just its position: it is the unique combination of its position, color, texture coordinates, and any other attributes. Replace your Vertices and Indices arrays with the following, which gives each face its own vertex/color/texture-coordinate combinations so the texture maps correctly:
#define TEX_COORD_MAX 1
const Vertex Vertices[] = {
// Front
{{1, -1, 0}, {1, 0, 0, 1}, {TEX_COORD_MAX, 0}},
{{1, 1, 0}, {0, 1, 0, 1}, {TEX_COORD_MAX, TEX_COORD_MAX}},
{{-1, 1, 0}, {0, 0, 1, 1}, {0, TEX_COORD_MAX}},
{{-1, -1, 0}, {0, 0, 0, 1}, {0, 0}},
// Back
{{1, 1, -2}, {1, 0, 0, 1}, {TEX_COORD_MAX, 0}},
{{-1, -1, -2}, {0, 1, 0, 1}, {TEX_COORD_MAX, TEX_COORD_MAX}},
{{1, -1, -2}, {0, 0, 1, 1}, {0, TEX_COORD_MAX}},
{{-1, 1, -2}, {0, 0, 0, 1}, {0, 0}},
// Left
{{-1, -1, 0}, {1, 0, 0, 1}, {TEX_COORD_MAX, 0}},
{{-1, 1, 0}, {0, 1, 0, 1}, {TEX_COORD_MAX, TEX_COORD_MAX}},
{{-1, 1, -2}, {0, 0, 1, 1}, {0, TEX_COORD_MAX}},
{{-1, -1, -2}, {0, 0, 0, 1}, {0, 0}},
// Right
{{1, -1, -2}, {1, 0, 0, 1}, {TEX_COORD_MAX, 0}},
{{1, 1, -2}, {0, 1, 0, 1}, {TEX_COORD_MAX, TEX_COORD_MAX}},
{{1, 1, 0}, {0, 0, 1, 1}, {0, TEX_COORD_MAX}},
{{1, -1, 0}, {0, 0, 0, 1}, {0, 0}},
// Top
{{1, 1, 0}, {1, 0, 0, 1}, {TEX_COORD_MAX, 0}},
{{1, 1, -2}, {0, 1, 0, 1}, {TEX_COORD_MAX, TEX_COORD_MAX}},
{{-1, 1, -2}, {0, 0, 1, 1}, {0, TEX_COORD_MAX}},
{{-1, 1, 0}, {0, 0, 0, 1}, {0, 0}},
// Bottom
{{1, -1, -2}, {1, 0, 0, 1}, {TEX_COORD_MAX, 0}},
{{1, -1, 0}, {0, 1, 0, 1}, {TEX_COORD_MAX, TEX_COORD_MAX}},
{{-1, -1, 0}, {0, 0, 1, 1}, {0, TEX_COORD_MAX}},
{{-1, -1, -2}, {0, 0, 0, 1}, {0, 0}}
};
const GLubyte Indices[] = {
// Front
0, 1, 2,
2, 3, 0,
// Back
4, 5, 6,
4, 5, 7,
// Left
8, 9, 10,
10, 11, 8,
// Right
12, 13, 14,
14, 15, 12,
// Top
16, 17, 18,
18, 19, 16,
// Bottom
20, 21, 22,
22, 23, 20
};
Just like last time, I worked out the mapping by unfolding the cube on a piece of paper, which is a very good exercise and one I recommend. Note: compared to last time, we now repeat a lot of data. I don't know of a smarter way to solve this while keeping the texture coordinates distinct per face (if there is one, please let me know!). Compile and run, and we get a very nicely textured cube:
Repeating the texture
In OpenGL, it is very easy to tile a texture repeatedly across a surface. The carpet image we're using happens to be a seamless texture, so let's try repeating it a few times on each face.
Simply change one line in OpenGLView.m:
#define TEX_COORD_MAX 4
Now each face of the cube maps its texture coordinates from (0, 0) in one corner to (4, 4) in the opposite corner, instead of (0, 0) to (1, 1).
When mapping texture coordinates, OpenGL behaves as if each coordinate were taken modulo 1. For example, a texture coordinate of 1.5 maps to the texture as if it were 0.5.
Compile and run, and you’ll see that the texture repeats perfectly on the cube.
Note: this works automatically because the default value for GL_TEXTURE_WRAP_S and GL_TEXTURE_WRAP_T is GL_REPEAT. If you don't want the texture to repeat like this (maybe you want coordinates clamped to the edge pixels instead), you can override this parameter with glTexParameteri.
Adding the sticker
Next, we're going to put the fish image on top of the cube. The code below is extra practice with what we've already learned. Add these new instance variables to OpenGLView.h:
GLuint _vertexBuffer;
GLuint _indexBuffer;
GLuint _vertexBuffer2;
GLuint _indexBuffer2;
Previously, we had only one vertex buffer and one index buffer, so we created them once as the active buffers and never needed to refer to them again. Now we need two sets of vertex/index buffers, one for the cube and one for the fish rectangle, so we need to keep references to them.
Make the following changes in OpenGLView.m:
// 1) Add to the top of the file
const Vertex Vertices2[] = {
    {{0.5, -0.5, 0.01}, {1, 1, 1, 1}, {1, 1}},
    {{0.5, 0.5, 0.01}, {1, 1, 1, 1}, {1, 0}},
    {{-0.5, 0.5, 0.01}, {1, 1, 1, 1}, {0, 0}},
    {{-0.5, -0.5, 0.01}, {1, 1, 1, 1}, {0, 1}},
};

const GLubyte Indices2[] = { 1, 0, 2, 3 };

// 2) Replace setupVBOs with the following
- (void)setupVBOs {
    glGenBuffers(1, &_vertexBuffer);
    glBindBuffer(GL_ARRAY_BUFFER, _vertexBuffer);
    glBufferData(GL_ARRAY_BUFFER, sizeof(Vertices), Vertices, GL_STATIC_DRAW);

    glGenBuffers(1, &_indexBuffer);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, _indexBuffer);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(Indices), Indices, GL_STATIC_DRAW);

    glGenBuffers(1, &_vertexBuffer2);
    glBindBuffer(GL_ARRAY_BUFFER, _vertexBuffer2);
    glBufferData(GL_ARRAY_BUFFER, sizeof(Vertices2), Vertices2, GL_STATIC_DRAW);

    glGenBuffers(1, &_indexBuffer2);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, _indexBuffer2);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(Indices2), Indices2, GL_STATIC_DRAW);
}

// 3) Add inside render:, right after the call to glViewport
glBindBuffer(GL_ARRAY_BUFFER, _vertexBuffer);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, _indexBuffer);

// 4) Add to the bottom of render:, right before [_context presentRenderbuffer:...]
glBindBuffer(GL_ARRAY_BUFFER, _vertexBuffer2);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, _indexBuffer2);
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, _fishTexture);
glUniform1i(_textureUniform, 0);
glUniformMatrix4fv(_modelViewUniform, 1, 0, modelView.glMatrix);
glVertexAttribPointer(_positionSlot, 3, GL_FLOAT, GL_FALSE,
    sizeof(Vertex), 0);
glVertexAttribPointer(_colorSlot, 4, GL_FLOAT, GL_FALSE,
    sizeof(Vertex), (GLvoid*) (sizeof(float) * 3));
glVertexAttribPointer(_texCoordSlot, 2, GL_FLOAT, GL_FALSE, sizeof(Vertex), (GLvoid*) (sizeof(float) * 7));
glDrawElements(GL_TRIANGLE_STRIP, sizeof(Indices2)/sizeof(Indices2[0]), GL_UNSIGNED_BYTE, 0);
In the first block of code, we define a new vertex array for the rectangle that will carry the fish texture. Note that the rectangle is slightly smaller than the cube face and sits slightly higher in z; otherwise it would be discarded by the depth test. In the second block, we store the vertex/index buffer names in instance variables instead of local variables, and we create a second pair of vertex/index buffers for the fish rectangle from the new Vertices2/Indices2 arrays. In the third block, we bind the cube's vertex/index buffers before drawing it. In the fourth block, we bind the fish rectangle's vertex/index buffers, load the fish texture, and set all the attributes. Notice that we draw with a new primitive mode, GL_TRIANGLE_STRIP: after the first triangle, each additional vertex forms a new triangle with the two vertices before it. This shrinks the index buffer by reusing vertices. I'm only showing how it works here; for a detailed explanation, see iOS – OpenGLES vertex buffer objects (VBO).
The result is as follows:
The fish is drawn where we wanted it, but it doesn't blend well with the rest of the scene. In render:, before drawing, set the blend mode with these two lines:
glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
glEnable(GL_BLEND);
The first line configures blending with glBlendFunc: the source factor is GL_ONE (take the source color as-is; remember the image was loaded with premultiplied alpha) and the destination factor is GL_ONE_MINUS_SRC_ALPHA (keep the destination wherever the source is transparent). For more discussion of blend modes, check out this tutorial; there's also a cool online tool to help you visualize them. The second line enables blending. Run, and you will now have a weirdly textured cube with a weird fish on top.
Where to go from here?
Here is the finished sample project. At this point, you've touched on some of the most important parts of OpenGL ES 2.0: supplying vertices, creating vertex buffer objects, creating shader programs, adding textures, and so on. There's still a lot to learn, and if you want to go deeper, I recommend Philip Rideout's book iPhone 3D Programming.