Introduction
Setting colors alone in JavaScript WebGL only gets you so far, so sooner or later you need to use images. That means working with textures in WebGL, which is more involved than you might expect.
- My GitHub
Using an image
Textures add detail to rendered objects and are used on all kinds of objects in 3D games. Starting from the earlier rectangle-drawing example, the main changes are the following:
- Data
- Vertex shader
- Fragment shader
- Buffer texture coordinate data
- Load and create a texture
- Draw
Data
Prepare an image. To map the texture onto the rectangle, specify a texture coordinate for each vertex of the rectangle.
2D texture coordinates lie on the x and y axes and range from 0 to 1. They start at (0, 0), which corresponds to the lower-left corner of the image, and end at (1, 1), which corresponds to the upper-right corner. The corresponding texture coordinates are:
const texCoords = [
  1.0, 1.0, // top right corner
  0.0, 1.0, // top left corner
  0.0, 0.0, // bottom left corner
  1.0, 0.0, // bottom right corner
];
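For reference, these texture coordinates pair one-to-one with the rectangle's vertex positions from the earlier rectangle-drawing example. A minimal sketch of what that position and index data might look like (the exact values come from your own rectangle code, so treat these as illustrative):

// Hypothetical vertex positions, in the same order as the texture coordinates
const vertices = [
  1.0, 1.0, 0.0, // top right corner
  -1.0, 1.0, 0.0, // top left corner
  -1.0, -1.0, 0.0, // bottom left corner
  1.0, -1.0, 0.0, // bottom right corner
];
// Two triangles forming the rectangle, used later by drawElements
const indices = [0, 1, 2, 0, 2, 3];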
Vertex shader
Texture coordinates need to be buffered and passed in. Add the attribute aVertexTextureCoord to the vertex shader; its value is passed on to the fragment shader through the varying vTextureCoord.
const source = `
  attribute vec3 aVertexPos;
  attribute vec2 aVertexTextureCoord;
  varying highp vec2 vTextureCoord;
  void main(void) {
    gl_Position = vec4(aVertexPos, 1);
    vTextureCoord = aVertexTextureCoord;
  }
`;
Fragment shader
Receive the texture coordinates in the fragment shader and declare the texture sampler uSampler. Note that it is a uniform, a global variable that can be accessed at any stage and that has no value assigned yet. The built-in function texture2D samples the texture to produce the final color.
const fragmentSource = `
  varying highp vec2 vTextureCoord;
  uniform sampler2D uSampler;
  void main(void) {
    gl_FragColor = texture2D(uSampler, vTextureCoord);
  }
`;
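Both shader sources assume a compiled and linked shaderProgram. A minimal sketch of how such a program could be built with the standard WebGL calls (compileShader and initShaderProgram are hypothetical helpers, not part of the original article):

// Compile one shader of the given type from its source string
function compileShader(gl, type, sourceCode) {
  const shader = gl.createShader(type);
  gl.shaderSource(shader, sourceCode);
  gl.compileShader(shader);
  if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
    console.error(gl.getShaderInfoLog(shader));
  }
  return shader;
}
// Link the vertex and fragment shaders into a program and use it
function initShaderProgram(gl, vertexSource, fragmentSource) {
  const program = gl.createProgram();
  gl.attachShader(program, compileShader(gl, gl.VERTEX_SHADER, vertexSource));
  gl.attachShader(program, compileShader(gl, gl.FRAGMENT_SHADER, fragmentSource));
  gl.linkProgram(program);
  gl.useProgram(program);
  return program;
}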
Buffer texture coordinate data
Texture coordinate data also needs to be buffered.
/**
 * Buffers texture coordinate data and activates it
 * @param {*} gl WebGL context
 * @param {*} shaderProgram Shader program
 * @param {*} data Texture coordinate data
 */
function setTextureBuffers(gl, shaderProgram, data) {
// Create a blank buffer object
const buffer = gl.createBuffer();
// Bind the target
gl.bindBuffer(gl.ARRAY_BUFFER, buffer);
// WebGL does not support using JavaScript primitive array types directly
const dataFormat = new Float32Array(data);
// Initialize the data store
gl.bufferData(gl.ARRAY_BUFFER, dataFormat, gl.STATIC_DRAW);
// Get the corresponding data index
const texCoord = gl.getAttribLocation(
shaderProgram,
"aVertexTextureCoord"
);
// Parse vertex data
gl.vertexAttribPointer(texCoord, 2, gl.FLOAT, false, 0, 0);
// Enable vertex attributes. Vertex attributes are disabled by default.
gl.enableVertexAttribArray(texCoord);
}
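The vertex positions and triangle indices are buffered in much the same way. Below is a sketch assuming the illustrative vertices and indices arrays from the data section and the aVertexPos attribute from the vertex shader (the function names are hypothetical):

// Buffer the vertex positions and point aVertexPos at them
function setPositionBuffers(gl, shaderProgram, data) {
  const buffer = gl.createBuffer();
  gl.bindBuffer(gl.ARRAY_BUFFER, buffer);
  gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(data), gl.STATIC_DRAW);
  const pos = gl.getAttribLocation(shaderProgram, "aVertexPos");
  gl.vertexAttribPointer(pos, 3, gl.FLOAT, false, 0, 0);
  gl.enableVertexAttribArray(pos);
}
// Buffer the triangle indices used by drawElements
function setIndexBuffers(gl, data) {
  const buffer = gl.createBuffer();
  gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, buffer);
  gl.bufferData(gl.ELEMENT_ARRAY_BUFFER, new Uint16Array(data), gl.STATIC_DRAW);
}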
Load and create a texture
Make sure the image is loaded before using it. Once you have the image data, you need to create a texture object.
function loadImage(gl) {
var img = new Image();
img.onload = (e) => {
createTexture(gl, e.target);
};
img.src = "./1.jpg";
}
function createTexture(gl, source) {
const texture = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, texture);
// Reverse the Y direction of the image
gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true);
// Texture coordinates are horizontally filled with s
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
// Texture coordinates are filled vertically with t
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
// Texture magnification
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
// Texture reduction
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
// Assign the image data to the texture object
gl.texImage2D(
gl.TEXTURE_2D,
0,
gl.RGBA,
gl.RGBA,
gl.UNSIGNED_BYTE,
source
);
// Activate texture unit 0
gl.activeTexture(gl.TEXTURE0);
}
createTexture creates the texture object, and bindTexture binds it to a target. The source here is a 2D image, so the first parameter is gl.TEXTURE_2D for a 2D texture and the second parameter is the texture object. After binding, subsequent texture operations apply to this texture.
The pixelStorei call flips the image along the y axis, because the image coordinate system and the texture coordinate system point in opposite vertical directions.
The texParameteri method sets texture parameters. If you want to use images of arbitrary sizes, both the horizontal (s) and vertical (t) wrap modes need to be set as above; otherwise only images whose dimensions are powers of two will be displayed.
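In WebGL 1 only power-of-two images support mipmaps and repeat wrapping, which is why the clamping above matters for arbitrary image sizes. A common way to handle both cases right after the texImage2D call is sketched below (the isPowerOf2 helper is illustrative, not part of the original code):

// Returns true when a dimension is a power of two (e.g. 256, 512)
function isPowerOf2(value) {
  return (value & (value - 1)) === 0;
}
if (isPowerOf2(source.width) && isPowerOf2(source.height)) {
  // Power-of-two images can use generated mipmaps
  gl.generateMipmap(gl.TEXTURE_2D);
} else {
  // Other sizes must clamp and avoid mipmap-based filters
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
}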
The texImage2D method assigns the texture source to the texture object, in this case passing the pixel data of the image to the texture object so that the image can be seen when the texture is drawn.
The activeTexture method activates the specified texture unit. Texture units range from 0 to gl.MAX_COMBINED_TEXTURE_IMAGE_UNITS - 1; only one is used here, gl.TEXTURE0. The first texture unit is active by default, so this line of code could be removed.
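If a second texture were ever needed, it would go on another texture unit. A short hedged sketch (texture2 and uSampler2 are hypothetical names, not from this article):

// Bind a second texture object to texture unit 1
gl.activeTexture(gl.TEXTURE1);
gl.bindTexture(gl.TEXTURE_2D, texture2);
// Point a second sampler uniform at unit 1
gl.uniform1i(gl.getUniformLocation(shaderProgram, "uSampler2"), 1);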
Draw
The sampler uniform declared in the fragment shader is set with the uniform1i method before drawing. The second parameter is the texture unit index, where 0 means the first texture unit.
/**
 * Draw
 * @param {*} gl WebGL context
 * @param {*} shaderProgram Shader program
 */
function draw(gl, shaderProgram) {
// Get the texture sampler
const samplerUniform = gl.getUniformLocation(shaderProgram, "uSampler");
// Specify the texture unit associated with the global variable
gl.uniform1i(samplerUniform, 0);
gl.drawElements(gl.TRIANGLES, 6, gl.UNSIGNED_SHORT, 0);
}
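Putting the pieces together, the overall flow could look like the following sketch. It reuses the hypothetical helpers from earlier sections (initShaderProgram, setPositionBuffers, setIndexBuffers) alongside the article's setTextureBuffers and loadImage; since the image loads asynchronously, the actual draw call belongs at the end of the onload callback:

function main() {
  const gl = document.querySelector("canvas").getContext("webgl");
  const shaderProgram = initShaderProgram(gl, source, fragmentSource);
  setPositionBuffers(gl, shaderProgram, vertices);
  setIndexBuffers(gl, indices);
  setTextureBuffers(gl, shaderProgram, texCoords);
  // loadImage is asynchronous; call draw(gl, shaderProgram) once createTexture has run
  loadImage(gl);
}
main();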
The effect
The finished example looks like this:
If you compare it with the original image, you can see that the rendered image is stretched rather than adapting to the image's aspect ratio.
Resources
- WebGL texture tutorial 1: The basic use of textures
- texture
The animated series League of Legends: Arcane has been very popular lately. I watched it and it really is excellent, worth seeing even if you don't play the game.
I think it will attract some new players.