Hello everyone! This article is about textures in OpenGL ES. I have a project on my GitHub, OpenGLES2.0 SamplesForAndroid, where I keep adding learning samples; the articles and code are updated in sync. You are welcome to follow it. Link: github.com/kenneycode/…

In the previous example, we rendered simple colors. What if we wanted to render an image? To do this, we need to create a texture, load an image into the texture, and then sample the texture in the Fragment Shader to render it.

We create a texture with glGenTextures and set some parameters. The texture we get here is just an ID:

// Create a texture
val textures = IntArray(1)
GLES20.glGenTextures(textures.size, textures, 0)
val imageTexture = textures[0]

GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, imageTexture)
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST)
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_NEAREST)
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE)
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE)

After creation, the texture is still empty, and we need to fill it with content. We first decode an image into a bitmap and copy its pixel data into a ByteBuffer, because glTexImage2D, which loads the bitmap data into the texture, accepts a ByteBuffer:

val bitmap = Util.decodeBitmapFromAssets("image_0.jpg")
val b = ByteBuffer.allocate(bitmap.width * bitmap.height * 4)
bitmap.copyPixelsToBuffer(b)
b.position(0)
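As a side note on the buffer size: width * height * 4 assumes the bitmap is decoded as ARGB_8888, i.e. 4 bytes per pixel, which matches the GL_RGBA upload below. A minimal sketch of that sizing logic (the dimensions here are placeholders standing in for bitmap.width / bitmap.height):

```kotlin
// Sizing sketch for the pixel buffer above, assuming an ARGB_8888 bitmap
// (4 bytes per pixel), which matches the GL_RGBA / GL_UNSIGNED_BYTE upload.
fun bufferSizeForRgba(width: Int, height: Int): Int = width * height * 4

// Placeholder dimensions standing in for bitmap.width / bitmap.height:
val pixelBuffer = java.nio.ByteBuffer.allocate(bufferSizeForRgba(128, 64))
// bitmap.copyPixelsToBuffer(...) would advance the position to the end,
// which is why the code above rewinds with b.position(0) before uploading.
pixelBuffer.position(0)
```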

Next, let's load the above ByteBuffer into the texture using glTexImage2D:

GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 
                    0, 
                    GLES20.GL_RGBA, 
                    bitmap.width, 
                    bitmap.height, 
                    0, 
                    GLES20.GL_RGBA, 
                    GLES20.GL_UNSIGNED_BYTE, 
                    b)

The fragment shader needs to sample the texture and render it. Let’s look at how to use the texture in the Fragment Shader:

// vertex shader
precision mediump float;
attribute vec4 a_position;
attribute vec2 a_textureCoordinate;
varying vec2 v_textureCoordinate;
void main() {
    v_textureCoordinate = a_textureCoordinate;
    gl_Position = a_position;
}

// fragment shader
precision mediump float;
varying vec2 v_textureCoordinate;
uniform sampler2D u_texture;
void main() {
    gl_FragColor = texture2D(u_texture, v_textureCoordinate);
}

The key point is uniform sampler2D u_texture; which declares a 2D sampler used to sample the texture.

varying vec2 v_textureCoordinate; is the interpolated texture coordinate passed from the Vertex Shader. The varying qualifier was described in the previous article, Android OpenGL ES 2.0 Fragment Shader.

gl_FragColor = texture2D(u_texture, v_textureCoordinate); samples the color at coordinate v_textureCoordinate from the texture and uses it as the output of the Fragment Shader. As we can see, the output of the fragment shader is always a color: in the previous article we set the color ourselves, while here the color is sampled from the texture, so the final rendering shows the texture.

Now that the texture and shader are ready, how do we link them together? First, since u_texture is a uniform, we need to get its location with glGetUniformLocation:

val uTextureLocation = GLES20.glGetUniformLocation(programId, "u_texture")

We then specify which texture unit this location corresponds to. Here we use texture unit 0:

GLES20.glUniform1i(uTextureLocation, 0)

Some readers may be wondering: what is a texture unit? A texture unit can be thought of as a kind of register: before OpenGL uses a texture, we must put it on a texture unit, and OpenGL then fetches the texture from the unit we specify.

We just pointed uTextureLocation at texture unit 0, but we never seem to have explicitly put our texture into texture unit 0. What's going on? OpenGL uses texture unit 0 by default, and since we never changed the active texture unit, our glBindTexture call bound the texture to unit 0. If we want to use another texture unit, we can select it with glActiveTexture before binding, for example:

GLES20.glActiveTexture(GLES20.GL_TEXTURE1)

If our example is changed to use texture unit 1, then uTextureLocation should also be pointed at texture unit 1:

GLES20.glUniform1i(uTextureLocation, 1)
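One detail worth spelling out: the GL_TEXTUREi enums are consecutive integers, so unit i is GLES20.GL_TEXTURE0 + i, while the value passed to glUniform1i is the plain unit index, not the enum. A minimal sketch (the constant values below are the standard GLES ones, matching GLES20.GL_TEXTURE0 and GLES20.GL_TEXTURE1):

```kotlin
// The GL_TEXTUREi enums are consecutive: GL_TEXTUREi == GL_TEXTURE0 + i.
// These values match GLES20.GL_TEXTURE0 / GLES20.GL_TEXTURE1.
val GL_TEXTURE0 = 0x84C0
val GL_TEXTURE1 = 0x84C1

fun textureUnitEnum(unitIndex: Int): Int = GL_TEXTURE0 + unitIndex

// So the unit-1 version of the setup can be written as:
// GLES20.glActiveTexture(GLES20.GL_TEXTURE0 + 1)   // same as GLES20.GL_TEXTURE1
// GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, imageTexture)
// GLES20.glUniform1i(uTextureLocation, 1)          // unit index, not the enum value
```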

Let’s look at vertex coordinates and texture coordinates:

private val vertexData = floatArrayOf(-1f, -1f, -1f, 1f, 1f, 1f, -1f, -1f, 1f, 1f, 1f, -1f)
private val textureCoordinateData = floatArrayOf(0f, 1f, 0f, 0f, 1f, 0f, 0f, 1f, 1f, 0f, 1f, 1f)

Passing the vertex coordinates to the shader should be familiar by now; passing the texture coordinates works in almost the same way, only the values differ.
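Before glVertexAttribPointer can read either array, it must be wrapped in a native-order direct FloatBuffer. That preparation step is plain java.nio and independent of GL, so it can be sketched on its own (toFloatBuffer is a hypothetical helper name):

```kotlin
import java.nio.ByteBuffer
import java.nio.ByteOrder
import java.nio.FloatBuffer

// Wrap a float array in a native-order direct buffer, as glVertexAttribPointer expects.
fun toFloatBuffer(data: FloatArray): FloatBuffer =
    ByteBuffer.allocateDirect(data.size * 4)   // 4 bytes per float
        .order(ByteOrder.nativeOrder())
        .asFloatBuffer()
        .put(data)
        .apply { position(0) }                 // rewind so GL reads from the start

val vertexBuffer = toFloatBuffer(floatArrayOf(-1f, -1f, -1f, 1f, 1f, 1f))
// Then, e.g.:
// GLES20.glVertexAttribPointer(location, 2, GLES20.GL_FLOAT, false, 0, vertexBuffer)
```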

As mentioned earlier, vertex coordinates tell OpenGL which region to render; each axis ranges from -1 to 1, with the origin at the center. Coordinates can actually go beyond -1 and 1 without causing an error, but anything outside that range falls outside the rendered area and simply isn't visible.

The origin of texture coordinates is at the lower left corner, and each axis ranges from 0 to 1. Again, coordinates can go beyond 0 and 1, and the behavior outside that range depends on the texture wrap parameters we set.

In this example we use the GL_TRIANGLES drawing mode; drawing modes were covered in my previous article, Android OpenGL ES 2.0 — Drawing mode. In this mode, every three points form an independent triangle, so the two triangles formed by the texture coordinates are rendered onto the two triangles specified by the vertex coordinates. Let's look at the effect:

For example, the vertex coordinate (-1, -1) corresponds to the texture coordinate (0, 1): the lower left corner of the rendering area maps to the upper left corner of the texture. In that case the rendered image should appear upside down, yet what we see is the correct result.

This is because our texture comes from a Bitmap, whose coordinate origin is at the upper left corner, i.e. vertically flipped relative to OpenGL's texture coordinate system. Flipping the texture coordinates once more cancels that out, giving the upright result.
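The flip described above is just the transform v -> 1 - v applied to the V (second) component of each coordinate pair. A small sketch (flipV is a hypothetical helper, not part of the project):

```kotlin
// Flip the V (vertical) component of interleaved (u, v) texture coordinates,
// converting between bitmap-style (origin top-left) and GL-style (origin bottom-left).
fun flipV(coords: FloatArray): FloatArray =
    FloatArray(coords.size) { i -> if (i % 2 == 1) 1f - coords[i] else coords[i] }

// The article's coordinates are the flipped form of the "natural" GL ones:
val glStyle = floatArrayOf(0f, 0f, 0f, 1f, 1f, 1f)
val bitmapStyle = flipV(glStyle)  // (0,1), (0,0), (1,0)
```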

When we created the texture, we set several parameters for the texture. Let’s take a look:

GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST)
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_NEAREST)
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE)
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE)

GL_TEXTURE_MIN_FILTER and GL_TEXTURE_MAG_FILTER are texture filter parameters that specify how to sample when the texture is rendered smaller or larger than its original size. Here we use GL_NEAREST, which picks the nearest pixel. Another common option is GL_LINEAR, which interpolates linearly; GL_NEAREST produces more jagged edges than GL_LINEAR. We enlarge the vertex coordinates by 5 times, i.e. to -5~5, to obtain a magnification effect. Here are the two magnified results:

GL_TEXTURE_WRAP_S and GL_TEXTURE_WRAP_T specify how to sample when texture coordinates fall outside the 0~1 range. The most common options are GL_CLAMP_TO_EDGE and GL_REPEAT, which extend the edge pixels and repeat the texture, respectively. Let's change the texture coordinates to 0~3 and see what happens:

That covers the basics of textures. The code is in my GitHub project OpenGLES2.0 SamplesForAndroid; the sample corresponding to this article is SampleTexture. Project link: github.com/kenneycode/…

Thanks for reading!