The previous sections covered how to draw 2D and 3D planar geometry in WebGL (and if you skipped them, this section can still be read on its own). In the 2D plane we can use WebGL to draw whatever shapes we need, but in practice we don't always want to draw every graphic from scratch. Texture mapping solves this problem: it fills WebGL geometry with an existing image.

  • Texture basics
  • Texture mapping of 2D graphics
  • Texture mapping for multiple texture units
  • Transition effects between different textures

The original article is on my blog: github.com/fortheallli…

The source code for this series is: github.com/fortheallli…

1. Texture basics

First of all, what is a texture, and what is texture mapping?

Attach an image to the surface of a geometry

The image is called a texture, and the job of attaching it is called texture mapping. Essentially, texture mapping takes colors from the image and assigns them to positions on the surface of the geometry, so that the image appears on that surface.

To understand how this mapping from the texture (image) to the surface of the geometry works, we first need to understand clip coordinates and texture coordinates.

Clip coordinates

The clip coordinate system in WebGL looks like this:

As the schematic of the clip coordinate system shows, the center of the clip plane in WebGL is (0,0). For the two-dimensional clip plane, the horizontal axis runs from (-1,0) to (1,0) and the vertical axis runs from (0,-1) to (0,1).

In other words, clip coordinates along either axis lie in the interval [-1,1].

Note that the clip plane determines how geometry maps onto the canvas, and because the canvas is two-dimensional, we only care about the x and y axes here; the z coordinate is used for depth in 3D rendering.

Texture coordinates

Texture coordinates, unlike clip coordinates, range not over [-1,1] but over [0,1]:

During mapping, each vertex in clip coordinates is paired with a point in texture coordinates; this pairing determines which part of the texture is sampled and pasted onto the geometry.
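For a full-screen quad this pairing can be written out explicitly. A minimal sketch (the vertex layout is illustrative and not taken from this article's source code): each vertex carries a clip-space position in [-1, 1] followed by a texture coordinate in [0, 1].

// Interleaved data for a quad drawn with gl.TRIANGLE_STRIP:
// clip-space x, y in [-1, 1] followed by texture s, t in [0, 1]
const verticesTexCoords = new Float32Array([
//   x     y      s    t
   -1.0,  1.0,   0.0, 1.0,   // top-left of the quad -> top-left of the texture
   -1.0, -1.0,   0.0, 0.0,   // bottom-left
    1.0,  1.0,   1.0, 1.0,   // top-right
    1.0, -1.0,   1.0, 0.0,   // bottom-right
]);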

Note that the image's own coordinate system and the texture coordinate system agree horizontally but point in opposite directions vertically, so the vertical axis has to be flipped. In WebGL this is done with:

 gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL,1)

which flips the Y axis when texture data is uploaded.

Disable Chrome security check

In this example, the HTML file is opened directly from the local file system. If the page then makes an Ajax request (which the browser treats as cross-origin for local files), the security check reports the following error:

Cross origin requests are only supported for HTTP....

Of course, this is not a problem if you serve the files from a local server. One lazy workaround is to disable the security check; on macOS, for example, launch Chrome from a terminal with the command:

open /Applications/Google\ Chrome.app --args --allow-file-access-from-files

Alternatively, you can install http-server to quickly start a local server.
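For example, with Node.js installed, that looks like this:

npm install -g http-server
http-server .    # serve the current directory, by default at http://127.0.0.1:8080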

2. Texture mapping of 2D graphics

Let's take texture mapping of a 2D shape as an example to introduce how texture mapping is implemented in WebGL.

(1) First, create a texture object

function createTexture (gl, filter, data) {
  let textureRef = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, textureRef);
  // Flip the Y axis so the image is not rendered upside down.
  gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, 1);
  // How to sample when the texture is scaled up or down.
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, filter);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, filter);
  // How to handle texture coordinates outside [0, 1].
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
  // Upload the pixel source (image, video frame, or another canvas).
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, data);
  return textureRef;
}

This is an example of creating a texture object. gl.createTexture() creates the texture, which is then bound to the gl.TEXTURE_2D target. In addition, because the Y axis used when rendering a texture runs opposite to the Y axis of the image, the texture must be flipped vertically for the result to appear the right way up; that is what gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, 1) does.

In addition, gl.texParameteri specifies how the texture is sampled when it is larger or smaller than the area it is rendered into (the magnification and minification filters), and how coordinates outside [0,1] are handled (the wrap modes).
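As a sketch of the common choices (these are standard WebGL constants; which combination is best depends on the texture and how it is drawn):

// Magnification / minification filters:
// gl.LINEAR  - smooth interpolation between texels
// gl.NEAREST - pick the nearest texel (blocky, but cheaper)
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);

// Wrapping outside the [0, 1] texture coordinate range:
// gl.CLAMP_TO_EDGE works for any image size; gl.REPEAT requires
// power-of-two dimensions in WebGL 1.
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);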

Finally, the following call extracts pixels from data and stores them in the texture object:

gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, data);

Here data can be an image, a video, or even the rendered output of another canvas. In other words, the source of a WebGL texture is not limited to images; it can also be a frame of a video or another canvas.
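For example, assuming a playing video element or a second canvas exists on the page (the videoElement and offscreenCanvas variables below are hypothetical), the same call accepts them directly:

// Upload the current video frame (or another canvas) into the bound texture.
// videoElement / offscreenCanvas are assumed to be defined elsewhere on the page.
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, videoElement);
// gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, offscreenCanvas);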

(2) Load an image into the texture

 let image = new Image()
 image.src = './cubetexture.png'
 image.onload = function(){
    let textureRef = createTexture(gl, gl.LINEAR, image);
    gl.activeTexture(gl.TEXTURE0)             // activate texture unit 0
    gl.bindTexture(gl.TEXTURE_2D, textureRef)
    gl.uniform1i(u_Sampler, 0);               // tell the sampler to read from texture unit 0
    gl.drawArrays(gl.TRIANGLE_STRIP, 0, 4);   // draw the quad
  }


The code above loads the image, creates the texture in the onload callback with the createTexture function defined earlier, and then activates texture unit 0. WebGL supports multiple texture units, so several textures can be rendered into the same viewport at once. After activating the texture unit, we pass 0 to the sampler uniform u_Sampler in the shader.
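The texture coordinates read by the fragment shader come from the vertex shader through a varying. A minimal vertex shader sketch (the attribute names a_Position and a_TexCoord are illustrative; they may be named differently in the article's full source):

attribute vec4 a_Position;    // clip-space position of the vertex
attribute vec2 a_TexCoord;    // texture coordinate of the vertex
varying lowp vec2 v_TexCoord; // interpolated and handed to the fragment shader
void main() {
  gl_Position = a_Position;
  v_TexCoord = a_TexCoord;
}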

Finally, in the fragment shader, we read the texture through the sampler (bound to a texture unit) at the interpolated texture coordinates and render it into the drawing area:

    uniform sampler2D u_Sampler;  // sampler bound to a texture unit
    varying lowp vec2 v_TexCoord; // texture coordinates from the vertex shader
    void main() {
      gl_FragColor = texture2D(u_Sampler, v_TexCoord);
    }

We ended up with the correct image rendered in the specified area, as shown below:

3. Texture mapping with multiple texture units

As mentioned earlier, WebGL supports multiple texture units at the same time; texture units exist precisely so that multiple textures can be handled simultaneously. In the example above we used only one texture unit to render a single texture into the render area. Next, we render two images into the same area using two texture units.




Texture picture 1



Texture picture 2

We use two texture units in WebGL, gl.TEXTURE0 and gl.TEXTURE1, and modify the code as follows.

The first is the loading and rendering logic:

  let textures = []

  let image = new Image();
  image.onload = function(){
     let textureRef = createTexture(gl, gl.LINEAR, image);
     gl.activeTexture(gl.TEXTURE0);
     gl.bindTexture(gl.TEXTURE_2D, textureRef);
     textures.push(textureRef);
  }
  image.src = './cubetexture.png'

  let image1 = new Image();
  image1.onload = function(){
    let textureRef1 = createTexture(gl, gl.LINEAR, image1);
    gl.activeTexture(gl.TEXTURE1);
    gl.bindTexture(gl.TEXTURE_2D, textureRef1);
    textures.push(textureRef1);
  }
  image1.src = './img.png'

  // Activate the texture units and draw (run this once both images have loaded,
  // i.e. when textures.length === 2)
  gl.uniform1i(u_Sampler, 0);   // texture unit 0
  gl.uniform1i(u_Sampler1, 1);  // texture unit 1
  gl.activeTexture(gl.TEXTURE0);
  gl.bindTexture(gl.TEXTURE_2D, textures[0]);
  gl.activeTexture(gl.TEXTURE1);
  gl.bindTexture(gl.TEXTURE_2D, textures[1]);
  gl.drawArrays(gl.TRIANGLE_STRIP, 0, 4);  // draw the same quad as in the single-texture example

Finally, modify the fragment shader:

varying lowp vec2 v_TexCoord;
uniform sampler2D u_Sampler;
uniform sampler2D u_Sampler1;
void main(){
  // Sample both textures at the same coordinate and add the colors.
  gl_FragColor = texture2D(u_Sampler, v_TexCoord) + texture2D(u_Sampler1, v_TexCoord);
}

The final render result is:




The render result combining the two texture units
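Note that simply adding the two samples, as in the shader above, can clip toward white where both images are bright. A common alternative, sketched below, is to blend them with GLSL's mix(), here as an even 50/50 blend:

gl_FragColor = mix(
  texture2D(u_Sampler, v_TexCoord),
  texture2D(u_Sampler1, v_TexCoord),
  0.5
);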

The completed code address is: github.com/fortheallli…

4. Transition effects between different textures

By using multiple texture units, we can achieve transitions between images. First, let’s see what transitions are.

As the name implies, a transition is an animated effect that switches from picture A to picture B, similar to the fade-in/fade-out effects in a PowerPoint presentation. In WebGL a texture is essentially a picture, so by controlling over time how much each texture contributes to the rendered output, we can achieve a transition between textures.

The GIF above shows some of these transition effects.
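The simplest possible transition is a plain cross-fade: sample both textures and blend them with a progress value that runs from 0 to 1. A minimal sketch, using the getFromColor/getToColor helpers and the progress uniform introduced below:

vec4 transition (vec2 uv) {
  // progress = 0.0 shows only the "from" texture, 1.0 only the "to" texture.
  return mix(getFromColor(uv), getToColor(uv), progress);
}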

Let’s look at transitions between images or textures using the same two images above.

Look directly at the transition code in the fragment shader:

// Relies on progress, bounces, shadow_colour, shadow_height and PI
// being declared elsewhere in the shader (as uniforms or constants).
vec4 transition (vec2 uv) {
  float time = progress;
  float stime = sin(time * PI / 2.);
  float phase = time * PI * bounces;
  float y = (abs(cos(phase))) * (1.0 - stime);
  float d = uv.y - y;
  return mix(
    mix(
      getToColor(uv),
      shadow_colour,
      step(d, shadow_height) * (1. - mix(
        ((d / shadow_height) * shadow_colour.a) + (1.0 - shadow_colour.a),
        1.0,
        smoothstep(0.95, 1., progress) // fade out the shadow at the end
      ))
    ),
    getFromColor(vec2(uv.x, uv.y + (1.0 - y))),
    step(d, 0.0)
  );
}

The transition function determines how the rendered output switches between the textures over time. The two textures are sampled through getFromColor and getToColor:

    vec4 getToColor(vec2  uv){
      return texture2D(u_Sampler,uv);
    }
    vec4 getFromColor(vec2 uv){
      return texture2D(u_Sampler1,uv);
    }

The final fragment color is the result of calling transition with the texture coordinates:

void main() {
    gl_FragColor =  transition(v_TexCoord);
}

Finally, to animate the result, we write a timer that dynamically changes the progress uniform used by transition from 0 to 1.


let i = 0.01; // current progress value

setInterval(() => {
  if (textures.length === 2) {         // wait until both textures are loaded
    if (i >= 1) {
      i = 0.01;                        // loop the transition
    }
    gl.uniform1i(u_Sampler, 0);        // texture unit 0
    gl.uniform1i(u_Sampler1, 1);       // texture unit 1
    gl.activeTexture(gl.TEXTURE0);
    gl.bindTexture(gl.TEXTURE_2D, textures[0]);
    gl.activeTexture(gl.TEXTURE1);
    gl.bindTexture(gl.TEXTURE_2D, textures[1]);
    let progress = gl.getUniformLocation(shaderProgram, 'progress');
    gl.uniform1f(progress, i);
    i += 0.05;
    gl.drawArrays(gl.TRIANGLE_STRIP, 0, 4);
  }
}, 100)

We can see the final animated result below:

The completed code address is: github.com/fortheallli…
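As a side note, setInterval is not synchronized with the display's refresh. A smoother variant (a sketch, reusing the same textures, u_Sampler/u_Sampler1 and shaderProgram variables as above, with an assumed duration) drives progress from requestAnimationFrame:

let start = null;
const DURATION = 2000; // milliseconds for one full transition (an assumed value)

function tick(timestamp) {
  if (start === null) start = timestamp;
  // progress loops through [0, 1] based on elapsed time
  const value = ((timestamp - start) % DURATION) / DURATION;
  if (textures.length === 2) {
    gl.uniform1i(u_Sampler, 0);
    gl.uniform1i(u_Sampler1, 1);
    gl.activeTexture(gl.TEXTURE0);
    gl.bindTexture(gl.TEXTURE_2D, textures[0]);
    gl.activeTexture(gl.TEXTURE1);
    gl.bindTexture(gl.TEXTURE_2D, textures[1]);
    const progress = gl.getUniformLocation(shaderProgram, 'progress');
    gl.uniform1f(progress, value);
    gl.drawArrays(gl.TRIANGLE_STRIP, 0, 4);
  }
  requestAnimationFrame(tick);
}
requestAnimationFrame(tick);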

In addition, https://github.com/gl-transitions/gl-transitions collects a wide variety of transition effects. Transitions can be applied not only to pictures but also to video frames; the next article will cover how to render video in WebGL and how to apply transition effects between video frames.