Contents

  1. Basic concepts related to textures
  2. Texture rendering process and key methods
  3. Practice (texture loading, 2/3/8-way split screens, mirroring, mixing texture with color)
  4. Problems encountered
  5. Takeaways

1. Basic Concepts

A texture is a 2D image (1D and 3D textures also exist) used to add detail to an object: applied like a sticker, it makes the object look like whatever the image represents, which makes the rendered scene more realistic.

Texture coordinates

In OpenGL, the texture coordinate system uses the lower-left corner of the texture as its origin, while an image's pixels are stored from the upper left to the lower right, so we need to "flip" the coordinates along the Y axis.

In other words, the image coordinate system has (0,0) in the upper-left corner of the image, while the texture coordinate system has (0,0) in the lower-left corner of the texture.
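If we upload the bitmap as decoded, the image appears upside down. One fix, sketched below under the assumption that decoding happens in the Java layer (the helper name flipVertically is mine), is to flip the bitmap along the Y axis before uploading it. The alternative, used by the vertex data later in this article, is to flip the T values in the texture coordinates instead.

import android.graphics.Bitmap;
import android.graphics.Matrix;

// Hypothetical helper: flips a bitmap vertically so that image row 0
// ends up at texture coordinate T = 0 (the bottom of the texture).
public static Bitmap flipVertically(Bitmap source) {
    Matrix matrix = new Matrix();
    matrix.postScale(1f, -1f); // invert the Y axis
    return Bitmap.createBitmap(source, 0, 0,
            source.getWidth(), source.getHeight(), matrix, true);
}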

Texture mapping

To map a texture onto a triangle, we need to specify which part of the texture corresponds to each vertex. Each vertex is therefore associated with a texture coordinate that tells OpenGL which part of the texture image to sample from; fragment interpolation then fills in the coordinates for the remaining fragments of the shape.

Texture unit

A texture unit is a reference to a texture object that a shader can sample from. A texture is bound to the currently active texture unit by calling glBindTexture; if no unit is explicitly activated, GL_TEXTURE0 is used by default.

glActiveTexture: activates a texture unit, so that subsequent glBindTexture calls bind to that unit.

Why is the sampler2D variable a uniform when we never assign it with a regular glUniform call? We actually use glUniform1i: it assigns a texture-unit index (a "location" value) to the sampler, which is what lets us use several textures in a single fragment shader.
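As a rough sketch (not from the original project; the uniform names u_Texture0/u_Texture1 and the textureId variables are assumptions), this is how two textures could be bound to two texture units and wired to two samplers in one fragment shader, placed after useProgram():

// Assumed fragment shader declarations:
//   uniform sampler2D u_Texture0;
//   uniform sampler2D u_Texture1;
int texture0Location = glGetUniformLocation(programId, "u_Texture0");
int texture1Location = glGetUniformLocation(programId, "u_Texture1");

glActiveTexture(GL_TEXTURE0);              // select texture unit 0
glBindTexture(GL_TEXTURE_2D, textureIdA);  // bind the first texture to it
glUniform1i(texture0Location, 0);          // sampler 0 reads from unit 0

glActiveTexture(GL_TEXTURE1);              // select texture unit 1
glBindTexture(GL_TEXTURE_2D, textureIdB);  // bind the second texture to it
glUniform1i(texture1Location, 1);          // sampler 1 reads from unit 1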

Texture wrap

GL_REPEAT: the default; the texture image is repeated. GL_MIRRORED_REPEAT: like the default, but the image is mirrored on each repetition. GL_CLAMP_TO_EDGE: clamps the coordinates to the 0–1 range; out-of-range coordinates repeat the edge pixels, producing a stretched-edge pattern. GL_CLAMP_TO_BORDER: coordinates outside the range are drawn with a user-specified border color.

Texture filtering

GL_NEAREST: uses the texel closest to the texture coordinate. This is OpenGL's default filtering; it is the fastest but gives the coarsest result. GL_LINEAR: interpolates between the texels nearest the texture coordinate. This is the most widely used mode; it gives a smoother result and is still fast.

Mipmaps are a series of texture images in which each level has half the width and height (a quarter of the pixels) of the previous one, all the way down to 1x1.

The idea behind mipmapping is simple: once the distance from the viewer exceeds a certain threshold, OpenGL switches to the mipmap level that best matches the object's distance. Because the object is far away, the lower resolution is not noticeable to the user, and sampling from the smaller images is also good for performance.

GL_NEAREST_MIPMAP_NEAREST: uses the nearest mipmap level and samples it with nearest-neighbour filtering. GL_LINEAR_MIPMAP_NEAREST: uses the nearest mipmap level and samples it with linear filtering. GL_NEAREST_MIPMAP_LINEAR: linearly interpolates between the two nearest mipmap levels, sampling each with nearest-neighbour filtering. GL_LINEAR_MIPMAP_LINEAR: linearly interpolates between the two nearest mipmap levels, sampling each with linear filtering. Mipmaps are generated with glGenerateMipmap, as shown in the loading code below.

2. Texture rendering process and key methods

final int[] textureObjectIds = new int[1];
// Generate one texture object
glGenTextures(1, textureObjectIds, 0);
if (textureObjectIds[0] == 0) {
    return 0;
}
final BitmapFactory.Options options = new BitmapFactory.Options();
options.inScaled = false;
final Bitmap bitmap = BitmapFactory.decodeResource(
        context.getResources(), resourceId, options);
if (bitmap == null) {
    glDeleteTextures(1, textureObjectIds, 0);
    return 0;
}
// Bind the texture ID to the GL_TEXTURE_2D target
glBindTexture(GL_TEXTURE_2D, textureObjectIds[0]);
// Set texture wrapping to GL_REPEAT
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
// Set texture filtering
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
// Upload the bitmap into the bound texture
texImage2D(GL_TEXTURE_2D, 0, bitmap, 0);
// Generate the mipmap chain
glGenerateMipmap(GL_TEXTURE_2D);
// Recycle the bitmap; its pixel data now lives in GPU memory
bitmap.recycle();
// Unbind the texture
glBindTexture(GL_TEXTURE_2D, 0);

Key methods

glActiveTexture

glGenTextures

glBindTexture

glTexParameteri

glTexImage2D

glGenerateMipmap

glUniform1i

3. Practice (texture loading, 2/3/8-way split screens, mirroring, mixing texture with color)

We usually want to use a JPG or PNG image file as a model's texture, but OpenGL has no API for decoding such files into the pixel array it needs. On Android we can decode the image into a Bitmap in the Java layer and upload it as a texture; here we use a photo of the Canton Tower light festival as the texture.

final BitmapFactory.Options options = new BitmapFactory.Options();
      options.inScaled = false;

      // Read in the resource
      final Bitmap bitmap = BitmapFactory.decodeResource(
          context.getResources(), resourceId, options);

texImage2D(GL_TEXTURE_2D, 0, bitmap, 0);

1. First, write the vertex shader and fragment shader GLSL programs

// texture_vertex_shader.glsl
// Vertex position
attribute vec4 a_Position;
// Texture coordinate
attribute vec2 a_TextureCoordinates;
varying vec2 v_TextureCoordinates;

void main() {
    v_TextureCoordinates = a_TextureCoordinates;
    gl_Position = a_Position;
}

// texture_fragment_shader.glsl
precision mediump float;
// Texture unit sampler
uniform sampler2D u_TextureUnit;
// Texture coordinates
varying vec2 v_TextureCoordinates;

void main() {
    gl_FragColor = texture2D(u_TextureUnit, v_TextureCoordinates);
}

2. Then, write the texture program

public class TextureShaderProgram extends ShaderProgram {
    // Uniform location
    private final int uTextureUnitLocation;

    // Attribute locations
    private final int aPositionLocation;
    private final int aTextureCoordinatesLocation;

    public TextureShaderProgram(Context context) {
        String vertexCode = ShaderHelper.loadAsset(
                MyApplication.getContext().getResources(), "texture_vertex_shader.glsl");
        String fragmentCode = ShaderHelper.loadAsset(
                MyApplication.getContext().getResources(), "texture_fragment_shader.glsl");
        programId = ShaderHelper.loadProgram(vertexCode, fragmentCode);

        // Texture unit location
        uTextureUnitLocation = glGetUniformLocation(programId, U_TEXTURE_UNIT);
        // Vertex position location
        aPositionLocation = glGetAttribLocation(programId, A_POSITION);
        // Texture coordinate location
        aTextureCoordinatesLocation = glGetAttribLocation(programId, A_TEXTURE_COORDINATES);
    }

    public int getPositionAttributeLocation() {
        return aPositionLocation;
    }

    public int getTextureCoordinatesAttributeLocation() {
        return aTextureCoordinatesLocation;
    }
}
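The ShaderProgram base class and ShaderHelper are not listed in this article. A minimal sketch of what the base class might look like, matching the names used above (the constant values are my assumption, taken from the GLSL in step 1; the real implementation in the project may differ):

public abstract class ShaderProgram {
    // Names must match the declarations in the GLSL shaders
    protected static final String U_TEXTURE_UNIT = "u_TextureUnit";
    protected static final String A_POSITION = "a_Position";
    protected static final String A_TEXTURE_COORDINATES = "a_TextureCoordinates";

    protected int programId;

    public void useProgram() {
        // Make this program current for subsequent draw calls
        android.opengl.GLES20.glUseProgram(programId);
    }
}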

3. Next, generate vertex data

public class GuangzhouTa {
    private static final int POSITION_COMPONENT_COUNT = 2;
    private static final int TEXTURE_COORDINATES_COMPONENT_COUNT = 2;
    private static final int STRIDE = (POSITION_COMPONENT_COUNT
            + TEXTURE_COORDINATES_COMPONENT_COUNT) * BYTES_PER_FLOAT;

    private final VertexArray vertexArray;

    public GuangzhouTa(float[] vertexData) {
        vertexArray = new VertexArray(vertexData);
    }

    public void bindData(TextureShaderProgram textureProgram) {
        // Position attribute: starts at offset 0 in each vertex
        vertexArray.setVertexAttribPointer(
                0,
                textureProgram.getPositionAttributeLocation(),
                POSITION_COMPONENT_COUNT,
                STRIDE);
        // Texture coordinate attribute: starts after the position components
        vertexArray.setVertexAttribPointer(
                POSITION_COMPONENT_COUNT,
                textureProgram.getTextureCoordinatesAttributeLocation(),
                TEXTURE_COORDINATES_COMPONENT_COUNT,
                STRIDE);
    }

    public void draw() {
        // 6 vertices drawn as a triangle fan covering the full screen
        glDrawArrays(GL_TRIANGLE_FAN, 0, 6);
    }
}
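The VertexArray helper used above is not listed either. A minimal sketch of what it might look like (the real class in the project may differ): it copies the vertex data into a native FloatBuffer and exposes glVertexAttribPointer with an offset into that buffer.

import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

import static android.opengl.GLES20.GL_FLOAT;
import static android.opengl.GLES20.glEnableVertexAttribArray;
import static android.opengl.GLES20.glVertexAttribPointer;

public class VertexArray {
    private static final int BYTES_PER_FLOAT = 4;
    private final FloatBuffer floatBuffer;

    public VertexArray(float[] vertexData) {
        // Copy the Java float[] into native memory that OpenGL can read
        floatBuffer = ByteBuffer
                .allocateDirect(vertexData.length * BYTES_PER_FLOAT)
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer()
                .put(vertexData);
    }

    public void setVertexAttribPointer(int dataOffset, int attributeLocation,
                                       int componentCount, int stride) {
        floatBuffer.position(dataOffset);
        glVertexAttribPointer(attributeLocation, componentCount, GL_FLOAT,
                false, stride, floatBuffer);
        glEnableVertexAttribArray(attributeLocation);
        floatBuffer.position(0);
    }
}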

4. Then load the texture and obtain its texture ID

public static int loadTexture(Context context, int resourceId) {
    final int[] textureObjectIds = new int[1];
    // Generate one texture object
    glGenTextures(1, textureObjectIds, 0);
    if (textureObjectIds[0] == 0) {
        return 0;
    }
    final BitmapFactory.Options options = new BitmapFactory.Options();
    options.inScaled = false;
    final Bitmap bitmap = BitmapFactory.decodeResource(
            context.getResources(), resourceId, options);
    if (bitmap == null) {
        glDeleteTextures(1, textureObjectIds, 0);
        return 0;
    }
    // Bind the texture ID to the GL_TEXTURE_2D target
    glBindTexture(GL_TEXTURE_2D, textureObjectIds[0]);
    // Set texture wrapping to GL_REPEAT
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
    // Set texture filtering
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    // Upload the bitmap into the bound texture
    texImage2D(GL_TEXTURE_2D, 0, bitmap, 0);
    // Generate the mipmap chain
    glGenerateMipmap(GL_TEXTURE_2D);
    // Recycle the bitmap; its pixel data now lives in GPU memory
    bitmap.recycle();
    // Unbind the texture
    glBindTexture(GL_TEXTURE_2D, 0);
    return textureObjectIds[0];
}

5. Finally, draw in the Renderer's onDrawFrame

public class GuangZhouTaRenderer implements Renderer {
    private final Context context;
    private GuangzhouTa guangzhouta;
    private TextureShaderProgram textureProgram;
    private int textureId;

    public GuangZhouTaRenderer(Context context) {
        this.context = context;
    }

    @Override
    public void onSurfaceCreated(GL10 glUnused, EGLConfig config) {
        glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
        guangzhouta = new GuangzhouTa(VertexDataUtils.VERTEX_DATA);
        textureProgram = new TextureShaderProgram(context);
        textureId = TextureHelper.loadTexture(context, R.drawable.guangzhou);
    }

    @Override
    public void onSurfaceChanged(GL10 glUnused, int width, int height) {
        glViewport(0, 0, width, height);
    }

    @Override
    public void onDrawFrame(GL10 glUnused) {
        glClear(GL_COLOR_BUFFER_BIT);
        textureProgram.useProgram();
        textureProgram.setUniforms(textureId);
        guangzhouta.bindData(textureProgram);
        guangzhouta.draw();
    }
}

public class TextureShaderProgram {
    ....
    public void setUniforms(int textureId) {
        // Activate texture unit 0
        glActiveTexture(GL_TEXTURE0);
        // Bind the texture ID to the active unit
        glBindTexture(GL_TEXTURE_2D, textureId);
        // Tell the sampler uniform to read from texture unit 0
        glUniform1i(uTextureUnitLocation, 0);
    }
    ....
}
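For completeness, a minimal sketch (not shown in the original article; the Activity name and setup details are assumptions) of attaching the renderer to a GLSurfaceView:

import android.app.Activity;
import android.opengl.GLSurfaceView;
import android.os.Bundle;

public class MainActivity extends Activity {
    private GLSurfaceView glSurfaceView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        glSurfaceView = new GLSurfaceView(this);
        glSurfaceView.setEGLContextClientVersion(2);            // request an OpenGL ES 2.0 context
        glSurfaceView.setRenderer(new GuangZhouTaRenderer(this));
        setContentView(glSurfaceView);
    }

    @Override
    protected void onResume() {
        super.onResume();
        glSurfaceView.onResume();   // resume the GL rendering thread
    }

    @Override
    protected void onPause() {
        super.onPause();
        glSurfaceView.onPause();    // pause the GL rendering thread
    }
}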

The vertex data array used above is:

public static final float[] VERTEX_DATA = {
    // Order of coordinates: X, Y, S, T
    // Triangle Fan
     0f,  0f, 0.5f, 0.5f,
    -1f, -1f, 0f,   1f,
     1f, -1f, 1f,   1f,
     1f,  1f, 1f,   0f,
    -1f,  1f, 0f,   0f,
    -1f, -1f, 0f,   1f
};

The result is as follows:

We can see the image is stretched. Why? The original texture has an aspect ratio of 1:1, while a phone screen is typically about 9:16. Horizontally the image fills the width one-to-one, but vertically it is stretched by a factor of roughly 16/9. How do we handle this? Compute the S and T texture coordinates in the vertex data from the actual screen aspect ratio.
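A rough sketch of that calculation (the helper name and the idea of building the data from the surface size, e.g. in onSurfaceChanged, are my assumptions): for a square texture drawn across a portrait screen, stretch the T range by height / width so the image keeps its proportions.

// Hypothetical helper, e.g. called from onSurfaceChanged(gl, width, height)
private float[] buildVertexData(int width, int height) {
    // Vertical stretch factor for a 1:1 texture on a width x height surface
    float tMax = (float) height / (float) width;   // e.g. 16 / 9 ≈ 1.78
    float tCenter = tMax / 2f;
    return new float[] {
            // X, Y, S, T  (triangle fan)
             0f,  0f, 0.5f, tCenter,
            -1f, -1f, 0f,   tMax,
             1f, -1f, 1f,   tMax,
             1f,  1f, 1f,   0f,
            -1f,  1f, 0f,   0f,
            -1f, -1f, 0f,   tMax
    };
}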

Rather than computing it exactly, for simplicity I will just double the T coordinates of the vertex data:

public static final float[] SPLIT_SCREEN_2_VERTEX_DATA = {
    // Order of coordinates: X, Y, S, T
    // Triangle Fan
     0f,  0f, 0.5f, 1f,
    -1f, -1f, 0f,   2f,
     1f, -1f, 1f,   2f,
     1f,  1f, 1f,   0f,
    -1f,  1f, 0f,   0f,
    -1f, -1f, 0f,   2f
};

It looks like this (2 split screens)

Remember we set the texture wrapping to GL_REPEAT above; coordinates outside the 0–1 range are drawn by repeating the texture.

   glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_WRAP_S,GL_REPEAT);
        glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_WRAP_T,GL_REPEAT);

The effect of mirrored repetition:

 glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_WRAP_S,GL_MIRRORED_REPEAT);
        glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_WRAP_T,GL_MIRRORED_REPEAT);

Clamping to the edge looks like this (not pretty):

       glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_WRAP_S,GL_CLAMP_TO_EDGE);
        glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_WRAP_T,GL_CLAMP_TO_EDGE);

The three-way and eight-way split screens are similar; we only need to modify the vertex data. The data and results are as follows.

public static final float[] SPLIT_SCREEN_3_VERTEX_DATA = {
    // Order of coordinates: X, Y, S, T
    // Triangle Fan
     0f,  0f, 0.5f, 1.5f,
    -1f, -1f, 0f,   3f,
     1f, -1f, 1f,   3f,
     1f,  1f, 1f,   0f,
    -1f,  1f, 0f,   0f,
    -1f, -1f, 0f,   3f
};

public static final float[] SPLIT_SCREEN_8_VERTEX_DATA = {
    // Order of coordinates: X, Y, S, T
    // Triangle Fan
     0f,  0f, 1f, 2f,
    -1f, -1f, 0f, 4f,
     1f, -1f, 2f, 4f,
     1f,  1f, 2f, 0f,
    -1f,  1f, 0f, 0f,
    -1f, -1f, 0f, 4f
};

Mixing texture with color

The split-screen and mirror effects above are all achieved by changing the S and T texture coordinates in the vertex data. What if we want to mix a color into the result? Building on the first result:

Add a color attribute and varying to the vertex shader; in the fragment shader, multiply the sampled texture color by the color vector when producing gl_FragColor; look up the color attribute location in the GL program; and add RGB values to the vertex data:

attribute vec4 a_Position;
attribute vec3 a_Color;
attribute vec2 a_TextureCoordinates;
varying vec2 v_TextureCoordinates;
varying vec3 v_Color;

void main() {
    v_TextureCoordinates = a_TextureCoordinates;
    v_Color = a_Color;
    gl_Position = a_Position;
}

precision mediump float;
uniform sampler2D u_TextureUnit;
varying vec2 v_TextureCoordinates;
varying vec3 v_Color;

void main() {
    gl_FragColor = texture2D(u_TextureUnit, v_TextureCoordinates) * vec4(v_Color, 1.0);
}

public static final float[] SPLIT_SCREEN_2_VERTEX_DATA = {
    // Order of coordinates: X, Y, R, G, B, S, T
    // Triangle Fan
     0f,  0f, 1.0f, 0.0f, 0.0f, 0.5f, 1f,
    -1f, -1f, 1.0f, 1.0f, 0.0f, 0f,   2f,
     1f, -1f, 0.0f, 0.0f, 1.0f, 1f,   2f,
     1f,  1f, 0.0f, 0.0f, 0.0f, 1f,   0f,
    -1f,  1f, 1.0f, 0.0f, 0.0f, 0f,   0f,
    -1f, -1f, 1.0f, 1.0f, 0.0f, 0f,   2f
};

public class GuangzhouTa {
    private static final int POSITION_COMPONENT_COUNT = 2;
    private static final int COLOR_COMPONENT_COUNT = 3;
    private static final int TEXTURE_COORDINATES_COMPONENT_COUNT = 2;
    private static final int STRIDE = (POSITION_COMPONENT_COUNT
            + COLOR_COMPONENT_COUNT
            + TEXTURE_COORDINATES_COMPONENT_COUNT) * BYTES_PER_FLOAT;

    private final VertexArray vertexArray;

    public GuangzhouTa(float[] vertexData) {
        vertexArray = new VertexArray(vertexData);
    }

    public void bindData(TextureShaderProgram textureProgram) {
        vertexArray.setVertexAttribPointer(
                0,
                textureProgram.getPositionAttributeLocation(),
                POSITION_COMPONENT_COUNT,
                STRIDE);
        vertexArray.setVertexAttribPointer(
                POSITION_COMPONENT_COUNT,
                textureProgram.getColorAttributeLocation(),
                COLOR_COMPONENT_COUNT,
                STRIDE);
        vertexArray.setVertexAttribPointer(
                POSITION_COMPONENT_COUNT + COLOR_COMPONENT_COUNT,
                textureProgram.getTextureCoordinatesAttributeLocation(),
                TEXTURE_COORDINATES_COMPONENT_COUNT,
                STRIDE);
    }

    public void draw() {
        glDrawArrays(GL_TRIANGLE_FAN, 0, 6);
    }
}
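TextureShaderProgram also needs to look up and expose the new color attribute. A minimal sketch of that addition (partial class; the attribute name "a_Color" follows the vertex shader above, and the rest of the class stays as in step 2):

public class TextureShaderProgram extends ShaderProgram {
    // ... fields and lookups from step 2 ...
    private final int aColorLocation;

    public TextureShaderProgram(Context context) {
        // ... existing shader loading and location lookups ...
        // "a_Color" must match the attribute declared in the vertex shader
        aColorLocation = glGetAttribLocation(programId, "a_Color");
    }

    public int getColorAttributeLocation() {
        return aColorLocation;
    }
}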

4. References

OpenGL ES 3.0 Programming Guide; OpenGL Programming Guide (the "Red Book")

[Android OpenGL ES 2.0 drawing: rendering texture]
[From 0 to start OpenGL learning (five) - texture]
[OpenGL texture detail (top)]
[OpenGL texture detail (bottom) practice]
[OpenGL (twelve) Texture Mapping]
[OpenGL Texture Display]
[Texture]

5. Takeaways

  1. Understood texture coordinates, texture units, texture wrapping, texture filtering, mipmaps, etc.
  2. Understood the texture loading process and the key APIs involved
  3. Practiced loading textures to become familiar with the concepts and the workflow
  4. Analyzed and solved problems such as the texture being stretched or appearing upside down

Thank you for reading

Next we will practice adding filter effects to the camera preview. You are welcome to follow the official account "audio and video development journey" to learn and grow together.

Feedback and discussion are welcome.

The original link