Preface
This article gives an overview of WebGL, covering the following aspects:
- GPU vs CPU
- The difference between Canvas2D and WebGL drawing
- Rendering pipeline
- Shader Overview
- Data exchange between the GPU and CPU
This article will not dwell on specific code.
CPU vs GPU
Both the CPU and GPU are processing units, but their structures are different. To put it in perspective, the CPU is like a large industrial pipeline: the tasks waiting to be processed pass through the pipeline one after another, so the speed at which the CPU gets through these tasks depends entirely on how long a single task takes.
The CPU pipeline can only execute tasks one at a time, but the CPU's ability to process a single task is very powerful, which is enough for large tasks. Image processing is different: the logic for processing an image is usually not very complex, but an image is composed of tens of thousands of pixels, and processing each pixel is one small task. Pushing so many tiny tasks one by one through the CPU pipeline is like using a horse to pull a train. That's where the GPU comes in.
GPUs are made up of a large number of small processing units. Each unit is nowhere near as powerful as a CPU core, but there are so many of them that each can take on one simple task, and together they can process a huge number of pixels simultaneously. If you want a metaphor, the GPU works like the movable type printing invented by our ancestors: all the characters are laid out at once and then printed onto the paper in a single press. That "press" is the GPU at work.
Canvas2D vs WebGL
Canvas2D
Anyone who has used Canvas drawing will be familiar with the following code:
```js
const ctx = canvas.getContext('2d');

// width/height are the canvas dimensions; rectWidth/rectHeight
// are the dimensions of the rectangle we want to draw
function ctxDraw() {
  ctx.fillStyle = '#f60';
  // Center the rectangle on the canvas
  ctx.fillRect((width - rectWidth) / 2, (height - rectHeight) / 2, rectWidth, rectHeight);
}
```
We set the fill color and then called fillRect to draw a rectangle. This is imperative and procedural.
WebGL
How do we draw the same rectangle using WebGL? See the following example code:
```js
function glDraw() {
  const vertexShader = `
    precision mediump float;
    attribute vec4 a_position;
    void main () { gl_Position = a_position; }
  `;
  const fragmentShader = `
    precision mediump float;
    void main () { gl_FragColor = vec4(1.0, 0.5, 0.0, 1.0); }
  `;
  // util.initWebGL is one of the utility methods mentioned below:
  // it compiles both shaders and attaches them to a WebGLProgram
  const program = util.initWebGL(gl, vertexShader, fragmentShader);
  gl.linkProgram(program);
  gl.useProgram(program);
  // Six 2D vertices: two triangles that together form a rectangle
  const points = new Float32Array([
    -0.5, -0.5,  0.5, -0.5,  0.5,  0.5,
     0.5,  0.5, -0.5,  0.5, -0.5, -0.5,
  ]);
  const buffer = gl.createBuffer();
  gl.bindBuffer(gl.ARRAY_BUFFER, buffer);
  gl.bufferData(gl.ARRAY_BUFFER, points, gl.STATIC_DRAW);
  const a_positionLocation = gl.getAttribLocation(program, 'a_position');
  gl.enableVertexAttribArray(a_positionLocation);
  gl.vertexAttribPointer(a_positionLocation, 2, gl.FLOAT, false, 0, 0);
  // Draw to the canvas (the default framebuffer)
  gl.bindFramebuffer(gl.FRAMEBUFFER, null);
  gl.drawArrays(gl.TRIANGLES, 0, 6);
}
```
Drawing a rectangle takes this much code in WebGL, and that's not counting the utility methods. So what kind of approach does WebGL use to draw graphics?
We can think of WebGL as a giant circuit: we can customize the routing of some wires, add some components to the circuit, and so on, and then we just press the power switch and the circuit runs by itself.
Rendering pipeline
Let's explore how this circuit actually works. Here we call the whole process this circuit performs the "render pipeline".
The render pipeline is divided into the following steps:
- The vertex shader processes vertices
- Primitive assembly
- Rasterization
- Fragment shader coloring
- Testing & blending
We will now elaborate on each step in turn.
The vertex shader processes vertices
First, let's explain what "shader" means. You can simply think of a shader as a program; it's just a slightly different name.
In the vertex shader, we process the vertex information passed to the GPU (for example, when we drew the rectangle above, we passed the vertex data to WebGL via the bufferData method). We may need to perform a clip-space transformation, translation, scaling, rotation, and so on. All of these operations are performed on vertices and directly change their positions.
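For instance, a minimal vertex shader that translates every vertex might look like the sketch below (u_translation is an illustrative uniform, not part of the rectangle example above):

```glsl
precision mediump float;
attribute vec4 a_position;
// u_translation is a hypothetical uniform holding an x/y offset
uniform vec2 u_translation;
void main () {
  // Offset every vertex position on the x/y plane
  gl_Position = a_position + vec4(u_translation, 0.0, 0.0);
}
```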
Primitive assembly
After the vertex shader runs, we have the vertex positions we want; for our rectangle, we actually passed in six points. In this step, we need to tell the GPU how to combine those six points, that is, which points form a group. Here, we take every three points as a group, and each group represents a triangle. The process of assembling vertices into basic shapes is called primitive assembly (the only basic shapes WebGL can assemble are points, lines, and triangles).
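The assembly mode is chosen when the draw call is issued. As a sketch, assuming the six vertices from the rectangle example are already in the bound buffer, each call below is an alternative way to assemble them:

```js
// The first argument to drawArrays selects how vertices are assembled
gl.drawArrays(gl.POINTS, 0, 6);    // six isolated points
gl.drawArrays(gl.LINES, 0, 6);     // three line segments (vertices in pairs)
gl.drawArrays(gl.TRIANGLES, 0, 6); // two triangles (groups of three)
```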
Rasterization
In the previous step, we told the GPU how to assemble our vertices. So far we still only have six vertices and an assembly rule, but how do we turn them into a rectangle on the screen? That is the job of rasterization. A simple way to rasterize is:
Traverse all the pixels, and for each one determine in turn whether it falls inside the primitive we just assembled. If it does, it moves on to the next step (coloring).
Besides determining which pixels fall inside the shape, rasterization also interpolates the non-positional vertex data, giving each pixel extra information. Because such a pixel carries more than just color information, we call it a "fragment".
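To make the idea concrete, here is a naive CPU-side sketch of rasterizing a single triangle (the function names are made up for illustration; real GPUs use far more efficient hardware algorithms):

```js
// Signed area test: which side of edge a->b does (px, py) lie on?
function edge(a, b, px, py) {
  return (b.x - a.x) * (py - a.y) - (b.y - a.y) * (px - a.x);
}

// Each vertex is an { x, y } object in pixel coordinates
function rasterizeTriangle(v0, v1, v2, width, height) {
  const fragments = [];
  for (let y = 0; y < height; y++) {
    for (let x = 0; x < width; x++) {
      const px = x + 0.5, py = y + 0.5; // sample at the pixel center
      const w0 = edge(v1, v2, px, py);
      const w1 = edge(v2, v0, px, py);
      const w2 = edge(v0, v1, px, py);
      // Inside if the point lies on the same side of all three edges
      if ((w0 >= 0 && w1 >= 0 && w2 >= 0) || (w0 <= 0 && w1 <= 0 && w2 <= 0)) {
        // A fragment: a pixel candidate that will be handed to the
        // fragment shader for coloring (interpolated data omitted here)
        fragments.push({ x, y });
      }
    }
  }
  return fragments;
}
```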
Fragment shader coloring
Rasterization determines which fragments fall inside our shape; now we just need to color those fragments. The easiest way is to set a single color. Of course, fragment shaders can also be very complex: lighting, materials and so on are basically all done in the fragment shader.
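For instance, instead of one fixed color, a fragment shader can compute a simple Lambert diffuse term. This sketch assumes a v_normal varying supplied by the vertex shader and a u_lightDir uniform supplied from JS; both names are illustrative:

```glsl
precision mediump float;
// v_normal and u_lightDir are hypothetical inputs for this sketch
varying vec3 v_normal;
uniform vec3 u_lightDir;
void main () {
  // Brightness depends on the angle between the surface normal
  // and the light direction
  float diffuse = max(dot(normalize(v_normal), normalize(u_lightDir)), 0.0);
  gl_FragColor = vec4(vec3(diffuse), 1.0);
}
```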
Test & mix
Tests include depth testing and stencil testing; here we briefly cover depth testing. Because WebGL can draw multiple objects that overlap each other, depth testing discards the occluded parts of an object so they are not shown.
Blending combines a fragment's color with the color already in the framebuffer according to their transparency values, and we can set different blend modes to achieve different results.
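Both of these stages are configured from JS rather than programmed. A typical setup, as a sketch, might be:

```js
// Enable depth testing so occluded fragments are discarded
gl.enable(gl.DEPTH_TEST);

// Enable blending and choose a common blend mode:
// result = src.rgb * src.a + dst.rgb * (1 - src.a)
gl.enable(gl.BLEND);
gl.blendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA);
```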
Of the steps above, two are generally programmable: vertex processing and fragment coloring. The programs that run in these stages are called shaders. Next, let's take a closer look at them.
Overview of shaders
A shader is a computational program that runs on the GPU, mainly used for graphics processing.
Vertex Shader
The vertex shader mainly processes vertices and the data attached to them. Shaders contain some built-in variables, such as gl_Position:
```glsl
// a_position is a variable whose location can be obtained at JS runtime
// in order to pass data into WebGL
attribute vec4 a_position;
uniform mat4 u_matrix;
void main () {
  // gl_Position is a variable built into GLSL; its value is handed
  // to the next stage of the rendering pipeline.
  // u_matrix * a_position is a matrix-vector product; the result
  // is still a vector, and we assign it to gl_Position
  gl_Position = u_matrix * a_position;
}
```
Fragment Shader
The main job of the fragment shader is coloring: we can assign each fragment a color through a series of computations. Consider the following program:
```glsl
precision mediump float;
void main () {
  // Paint every fragment yellow (R = 1.0, G = 1.0, B = 0.0, A = 1.0)
  gl_FragColor = vec4(1.0, 1.0, 0.0, 1.0);
}
```
This code can be interpreted as follows:
```js
// data is assumed to be the pixel buffer of a width × height RGBA image
for (let y = 0; y < height; y++) {
  for (let x = 0; x < width; x++) {
    // The body of the double loop is equivalent to the fragment shader:
    // it runs once per pixel and writes a yellow RGBA value
    data[(y * width + x) * 4 + 0] = 255; // R
    data[(y * width + x) * 4 + 1] = 255; // G
    data[(y * width + x) * 4 + 2] = 0;   // B
    data[(y * width + x) * 4 + 3] = 255; // A
  }
}
```
It is important to understand that the fragment shader executes for every fragment simultaneously, and there are no dependencies between fragments.
Storage qualifier
In the programs above, you may have noticed keywords such as attribute and uniform. These are called storage qualifiers:
- attribute: appears only in vertex shaders and represents per-vertex data. Attribute values are interpolated during rasterization. Data can be passed into attribute variables from outside WebGL.
- uniform: can appear in both the vertex shader and the fragment shader and represents a uniform value, shared by every vertex/fragment.
- varying: can appear in both the vertex shader and the fragment shader and represents interpolated values. During rasterization, the GPU assigns the interpolated attribute results to varying variables, which act as a bridge between the vertex shader and the fragment shader (see the sketch after this list).
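As a minimal sketch of the three qualifiers working together (a_color and v_color are illustrative names, not from the examples above), the vertex shader receives a per-vertex a_color attribute and hands it to the fragment shader through a varying:

```glsl
// Vertex shader: attribute in, varying out
attribute vec4 a_position;
attribute vec4 a_color;   // per-vertex color
uniform mat4 u_matrix;    // same value for every vertex
varying vec4 v_color;     // interpolated, then read by the fragment shader

void main () {
  gl_Position = u_matrix * a_position;
  v_color = a_color;
}
```

```glsl
// Fragment shader: receives the interpolated varying
precision mediump float;
varying vec4 v_color;

void main () {
  gl_FragColor = v_color;
}
```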
Data transfer
So how do we pass data from the JS runtime to WebGL? This mainly breaks down into three parts:
- Passing attribute variable data (usually vertex information)
- Passing general uniform variable data (integers, floats, vectors, matrices), usually auxiliary information such as time, which a program can use to compute values that change as time passes
- Passing textures
Passing attribute variables
Passing attribute variable data requires a WebGLBuffer, a built-in WebGL data structure.
Steps:
- Create a WebGLBuffer (gl.createBuffer())
- Bind the buffer to ARRAY_BUFFER (gl.bindBuffer())
- Pass in the data (gl.bufferData())
Here ARRAY_BUFFER acts as a bridge. We actually pass data to ARRAY_BUFFER, which is bound to the WebGLBuffer we created in the previous step, so the data ends up written into that WebGLBuffer.
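Put together, the three steps might look like the following sketch, where positions is assumed to be a Float32Array like the one in the rectangle example and program is the linked WebGLProgram:

```js
// 1. Create the WebGLBuffer
const buffer = gl.createBuffer();
// 2. Bind it to the ARRAY_BUFFER binding point
gl.bindBuffer(gl.ARRAY_BUFFER, buffer);
// 3. Write data "to ARRAY_BUFFER", which lands in our buffer
gl.bufferData(gl.ARRAY_BUFFER, positions, gl.STATIC_DRAW);

// Then describe how the attribute reads from the bound buffer
const loc = gl.getAttribLocation(program, 'a_position');
gl.enableVertexAttribArray(loc);
gl.vertexAttribPointer(loc, 2, gl.FLOAT, false, 0, 0);
```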
Passing uniform variables
The process of passing a uniform variable is relatively simple. The steps are as follows:
- Obtain the location of the uniform variable in the WebGL program via the API (gl.getUniformLocation)
- Fill data in at that location via the API (gl.uniform1f, gl.uniform1i, gl.uniform2f, ...)
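As a sketch, assuming the shader declares `uniform float u_time;`:

```js
// 1. Look up the uniform's location in the linked program
const u_timeLocation = gl.getUniformLocation(program, 'u_time');

// 2. Fill data in at that location; the suffix encodes the type:
//    1f = one float, 1i = one int, 2f = two floats (vec2), etc.
gl.uniform1f(u_timeLocation, performance.now() / 1000);
```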
Passing textures
First, we need to understand what a texture is. Simply put, a texture is an image. In the Web world, a texture source can be an HTML tag (such as <img>), an ImageBitmap, or a TypedArray object.
Passing a texture is similar to passing an attribute variable; instead of a WebGLBuffer we use a WebGLTexture object, and we also need to set parameters on the WebGLTexture object.
Steps:
- Create a texture object (WebGLTexture) (gl.createTexture())
- Bind the texture object (gl.bindTexture)
- Set texture parameters (gl.texParameteri)
- Pass in the texture data (gl.texImage2D)
Similarly, gl.TEXTURE_2D acts as the bridge here: we appear to operate on gl.TEXTURE_2D, but because we bound our texture object to TEXTURE_2D in advance, we are indirectly operating on the WebGLTexture object.
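The four steps, sketched with a loaded image element (the image variable and the specific texture parameters here are just one possible configuration):

```js
// 1. Create the texture object
const texture = gl.createTexture();
// 2. Bind it to the TEXTURE_2D binding point
gl.bindTexture(gl.TEXTURE_2D, texture);
// 3. Set texture parameters (wrapping and filtering)
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
// 4. Upload the image "to TEXTURE_2D", which lands in our texture
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);
```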
(Figure: how JS passes data to WebGL.)
Remember how we likened WebGL to a giant circuit? Filling WebGL with data is like adding components (WebGLTexture, WebGLBuffer) to the circuit. Changing the binding between WebGLBuffer/ARRAY_BUFFER or WebGLTexture/TEXTURE_2D is like rewiring the circuit. Once everything is in place, the gl.drawArrays API acts as the power switch: call it and the whole circuit runs automatically.
Conclusion
This article has covered some of the more important concepts in WebGL. We first introduced the difference between how the CPU and GPU process data: the CPU executes tasks one by one, serially, while the GPU processes data in parallel and can handle a huge number of tasks at once, although each of its processing cores is individually less powerful.
Then we introduced the difference between drawing with Canvas2D, which is imperative, and drawing with WebGL, which is more like "wiring": imagine WebGL as a huge circuit where we lay out the wires and components in advance, and once the switch is pressed the circuit runs on its own.
The rendering pipeline and shaders may still feel unfamiliar, but that's okay; as you learn more you will find that everything falls into place. Follow-up hands-on articles are on the way. Stay tuned.