📖 Preface
In the last article, we drew our first cube, and its colors were truly flashy 😜 But what if we change every face of the cube to plain white? It should look something like the following:
In real life, though, a pure white cube in front of us does not look like that. Even if it is pure white, we can clearly see the cube's edges and corners, as well as its light and dark sides! This shows how important lighting is for making our WebGL world realistic 😉
🔆 Light
Light plays a vital role in our lives; crop growth and human industry are inseparable from it. The sun provides the light we need, and the importance of light is reflected even in mythology: Apollo was a son of Zeus, king of the gods.
Modern people often call Apollo the sun god, but in fact it was only around the 5th century BC that he became associated with Helios, the sun god of the indigenous Greek tradition. In late Greek mythology, Apollo had already taken on the character of a sun god 🔥
God said, “Let there be light,” and there was light. (From the Bible)
So let's add a ray of sunshine to our white cube. “The setting sun says nothing, yet it warms everything it touches.” A white object turns warm under the setting sun, which matches our everyday experience. As for how to add light to our WebGL world, keep the following two questions in mind first:
- The light sources in our life include the sun, lamps, flames, and so on — so what types of light source are there?
- The color an object appears to be is determined by the color of the light it reflects (an object that reflects no light looks pitch black), but how exactly does an object reflect light?
Types of light source
There are two common types of light source in daily life: the parallel light source and the point light source. Natural sunlight is parallel; a light bulb is a typical point source. In addition, there is ambient light, which is used to simulate indirect light in the real world — that is, light from a source that has bounced off walls or other objects.
Parallel light: as the name implies, parallel light rays are parallel to each other, and parallel light has a direction. It can be thought of as light from a source infinitely far away. Because the sun is so far from the Earth, sunlight can be treated as parallel by the time it reaches us. Defining a parallel light is simple: it only needs a direction and a color.
Point light: light emitted from a single point in all directions. A point light can represent real-life bulbs, flames, and so on. We need to specify the position and color of the point light source; the direction of the light is then calculated from the light's position and the point of the scene being illuminated, because it differs at every point in the scene.
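To make this concrete, here is a plain-JavaScript sketch of how a point light's direction is derived at each surface point (the helper name `pointLightDirection` is mine, not a WebGL API):

```javascript
// The light direction at a surface point lit by a point light:
// it points from the surface point toward the light source, normalized.
function pointLightDirection(lightPos, surfacePoint) {
  const d = [
    lightPos[0] - surfacePoint[0],
    lightPos[1] - surfacePoint[1],
    lightPos[2] - surfacePoint[2],
  ];
  const len = Math.hypot(d[0], d[1], d[2]);
  return d.map((c) => c / len);
}

// Unlike a parallel light, the direction differs from point to point:
console.log(pointLightDirection([0, 3, 0], [0, 0, 0])); // [0, 1, 0]
console.log(pointLightDirection([0, 3, 0], [3, 3, 0])); // [-1, 0, 0]
```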
Ambient light: the light from a source (point or parallel) that has been reflected several times by walls and other objects before reaching the surface. Ambient light strikes an object from all directions with equal intensity.
Types of reflection
The direction and color of the light an object reflects depend mainly on two factors: the incident light and the type of surface. The incident light information includes its direction and color, while the surface information includes the object's inherent color (also known as the base color) and its reflection characteristics. A surface reflects light in two ways, diffuse reflection and ambient reflection, each described below.
Diffuse reflection
Diffuse reflection applies to parallel and point light sources. It is the ideal reflection model for most rough materials in real life (such as plastic and rock), where the reflected light is uniform in all directions:
In diffuse reflection, the color of the reflected light depends on the color of the incident light, the surface base color, and the angle of incidence formed between the incident light and the surface. The angle of incidence is defined as the angle between the incident light and the surface normal and is denoted θ; the color of the diffuse light can then be calculated with the following formula (the multiplication is performed component by component on the RGB values):

&lt;diffuse color&gt; = &lt;incident light color&gt; × &lt;surface base color&gt; × cos θ
Ambient reflection
Ambient reflection is the reflection of ambient light. Here the direction of the reflected light can be regarded as the opposite of the incident direction. Since ambient light strikes an object uniformly with equal intensity from every direction, the reflected light is also uniform in all directions:
And how do we get the color of ambient light?
The incident light here is simply the ambient light, so:

&lt;ambient reflected color&gt; = &lt;ambient light color&gt; × &lt;surface base color&gt;

When diffuse reflection and ambient reflection are both present, their sum gives the color the object finally appears:

&lt;surface color&gt; = &lt;diffuse color&gt; + &lt;ambient reflected color&gt;
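As a quick sanity check, the math above fits in a few lines of JavaScript (a sketch of the formulas only, not WebGL code; `surfaceColor` is a made-up name, and colors are RGB triples in the 0.0–1.0 range):

```javascript
// surface color = diffuse + ambient, computed component-wise on RGB:
//   diffuse = lightColor * baseColor * cosθ
//   ambient = ambientColor * baseColor
function surfaceColor(lightColor, ambientColor, baseColor, cosTheta) {
  return baseColor.map(
    (c, i) => lightColor[i] * c * cosTheta + ambientColor[i] * c
  );
}

// A white surface lit head-on (cosθ = 1) by a dim white light plus weak ambient:
console.log(surfaceColor([0.8, 0.8, 0.8], [0.2, 0.2, 0.2], [1, 1, 1], 1.0)); // [1, 1, 1]
// At a grazing angle (cosθ = 0) only the ambient term remains:
console.log(surfaceColor([0.8, 0.8, 0.8], [0.2, 0.2, 0.2], [1, 1, 1], 0.0)); // [0.2, 0.2, 0.2]
```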
Diffuse reflection under parallel light
According to the formula above, three pieces of data are needed to calculate the diffuse color under parallel light: the color of the incident parallel light, the surface base color, and the angle of incidence θ between the incident light and the surface. The light color and base color are easy to obtain, but we cannot know in advance at what angle the light hits each surface. However, we do know the orientation of each surface, and when specifying the light source we choose the light's direction; from these two pieces of information we can calculate the angle of incidence ☀️
In high school, we learned the following formula:

a · b = |a| |b| cos θ
The dot product of two vectors equals the product of their magnitudes times the cosine of the angle between them, so when both vectors are unit vectors the dot product is exactly cos θ. This requires adjusting both the light direction and the normal to unit length while keeping their directions unchanged, a process called normalization. GLSL ES provides a built-in normalize function we can use directly!
Note also that the light direction we use here is actually the reverse of the direction of propagation — it points from the surface toward the light source — because the angle between that direction and the normal is exactly the angle of incidence.
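Putting the last two points together, here is a small JavaScript sketch (helper names are mine; inside a shader, GLSL's built-in normalize, dot, and max functions do this same work):

```javascript
// Normalize a 3-component vector to unit length.
function normalize(v) {
  const len = Math.hypot(v[0], v[1], v[2]);
  return v.map((c) => c / len);
}

// Dot product of two 3-component vectors.
function dot(a, b) {
  return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
}

// For unit vectors, the dot product is exactly cos θ.
const normal = [0, 1, 0];             // surface faces straight up
const toLight = normalize([0, 1, 1]); // light 45° overhead
const cosTheta = Math.max(dot(toLight, normal), 0.0); // clamp back faces to 0
console.log(cosTheta.toFixed(4)); // "0.7071", i.e. cos 45°
```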
🏖 Add illumination to the system
First shine a ray of warm sun on the cube!
Parallel light
Add variables to the vertex shader to receive the normal vector, light color, and light direction:
```glsl
// ...
attribute vec4 a_Normal;        // Normal vector
uniform vec3 u_LightColor;      // Light color
uniform vec3 u_LightDirection;  // Light direction (normalized)
void main() {
  gl_Position = u_FinalMatrix * a_Position;
  // Normalize the normal vector
  vec3 normal = normalize(vec3(a_Normal));
  // Compute the dot product of the light direction and the normal
  float dotResult = max(dot(u_LightDirection, normal), 0.0);
  // Calculate the color of the diffuse light
  vec3 diffuse = u_LightColor * a_Color.rgb * dotResult;
  v_Color = vec4(diffuse, a_Color.a);
}
```
When computing the dot product we wrap it in the max function: if the light source is behind the surface the cosine is negative, and max clamps it to 0.0 so that face simply receives no diffuse light. Now add some code on the JavaScript side to set up the lighting:
```javascript
function main() {
  // ...
  const u_LightColor = gl.getUniformLocation(gl.program, 'u_LightColor');
  const u_LightDirection = gl.getUniformLocation(gl.program, 'u_LightDirection');
  // ...
  gl.uniform3f(u_LightColor, 0.8, 0.8, 0.0); // Set the light color to yellow
  const lightDirection = new Vector3([0.5, 3.0, 4.0]); // Light direction (world coordinates)
  lightDirection.normalize(); // Normalize
  gl.uniform3fv(u_LightDirection, lightDirection.elements);
  // ...
}
```
You can also use the gl-matrix utility functions (used in AntV) for this vector math! Next we define the normal vectors in the initVertexBuffers function:
```javascript
function initVertexBuffers(gl) {
  // ...
  // Normal vectors: one per vertex, four vertices per face
  const normals = new Float32Array([
     0.0,  0.0,  1.0,   0.0,  0.0,  1.0,   0.0,  0.0,  1.0,   0.0,  0.0,  1.0, // front
     1.0,  0.0,  0.0,   1.0,  0.0,  0.0,   1.0,  0.0,  0.0,   1.0,  0.0,  0.0, // right
     0.0,  1.0,  0.0,   0.0,  1.0,  0.0,   0.0,  1.0,  0.0,   0.0,  1.0,  0.0, // up
    -1.0,  0.0,  0.0,  -1.0,  0.0,  0.0,  -1.0,  0.0,  0.0,  -1.0,  0.0,  0.0, // left
     0.0, -1.0,  0.0,   0.0, -1.0,  0.0,   0.0, -1.0,  0.0,   0.0, -1.0,  0.0, // down
     0.0,  0.0, -1.0,   0.0,  0.0, -1.0,   0.0,  0.0, -1.0,   0.0,  0.0, -1.0, // back
  ]);
  initArrayBuffer(gl, 'a_Normal', normals, 3, gl.FLOAT);
  // ...
  return indices.length;
}

function initArrayBuffer(gl, attribute, data, num, type) {
  const buffer = gl.createBuffer();
  gl.bindBuffer(gl.ARRAY_BUFFER, buffer);
  gl.bufferData(gl.ARRAY_BUFFER, data, gl.STATIC_DRAW);
  const a_attribute = gl.getAttribLocation(gl.program, attribute);
  gl.vertexAttribPointer(a_attribute, num, type, false, 0, 0);
  gl.enableVertexAttribArray(a_attribute);
}
```
This gives the effect of the cube at the beginning of the article 😽
Point light
In the figure above, you can see that the angle between the light from a point source and the surface varies from place to place, so passing a single light direction to the vertex shader no longer works. But we can compute the angle between the light and the normal inside the vertex shader from the position of the light source 😜 Let's start writing the shader:
```glsl
// ...
uniform mat4 u_ModelMatrix;
uniform vec3 u_LightPosition;
void main() {
  gl_Position = u_FinalMatrix * a_Position;
  vec3 normal = normalize(vec3(a_Normal));
  // Compute the world coordinates of the vertex
  vec4 vertexPosition = u_ModelMatrix * a_Position;
  // Compute the light direction and normalize it
  vec3 lightDirection = normalize(u_LightPosition - vec3(vertexPosition));
  float dotResult = max(dot(lightDirection, normal), 0.0);
  vec3 diffuse = u_LightColor * a_Color.rgb * dotResult;
  v_Color = vec4(diffuse, a_Color.a);
}
```
Do you still remember what the model matrix does? On the JavaScript side, we now need to pass the point light's position and the model matrix to the vertex shader:
```javascript
function main() {
  // ...
  const u_LightPosition = gl.getUniformLocation(gl.program, 'u_LightPosition');
  gl.uniform3f(u_LightPosition, 0.0, 3.0, 4.0);
  const u_ModelMatrix = gl.getUniformLocation(gl.program, 'u_ModelMatrix');
  const modelMatrix = new Matrix4();
  // Perform a series of transformations...
  gl.uniformMatrix4fv(u_ModelMatrix, false, modelMatrix.elements);
  // ...
}
```
The browser then displays the following:
So far, have you noticed that the vertex shader only calculates the color values of the four vertices of each face, while the colors of the rest of the face come from varying interpolation (see the article on variable interpolation)? This can sometimes produce unnatural lighting. To solve that, we calculate the color on a per-fragment basis instead, which only requires modifying the shader program. The vertex shader:
```glsl
// ...
void main() {
  gl_Position = u_FinalMatrix * a_Position;
  // Compute the world coordinates of the vertex
  v_Position = vec3(u_ModelMatrix * a_Position);
  v_Normal = normalize(vec3(u_NormalMatrix * a_Normal));
  v_Color = a_Color;
}
```
The fragment shader:
```glsl
precision mediump float;
uniform vec3 u_LightColor;
uniform vec3 u_LightPosition;
varying vec3 v_Normal;
varying vec3 v_Position;
varying vec4 v_Color;
void main() {
  // Normalize the normal (interpolation may change its length)
  vec3 normal = normalize(v_Normal);
  // Compute the light direction and normalize it
  vec3 lightDirection = normalize(u_LightPosition - v_Position);
  // Compute the dot product of the light direction and the normal
  float dotResult = max(dot(lightDirection, normal), 0.0);
  // Calculate the diffuse color
  vec3 diffuse = u_LightColor * v_Color.rgb * dotResult;
  gl_FragColor = vec4(diffuse, v_Color.a);
}
```
The final effect is shown below:
There doesn't seem to be much difference here, but when you encounter an unnatural lighting effect, consider switching to per-fragment color calculation to solve it!
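The varying interpolation at work here is just a linear blend between vertex values. A toy JavaScript sketch of what the GPU does along an edge between two vertices (`lerpColor` is a made-up name):

```javascript
// Linearly interpolate a varying (here an RGB color) between two vertices.
// t = 0 gives the first vertex's value, t = 1 the second's.
function lerpColor(c0, c1, t) {
  return c0.map((v, i) => v + (c1[i] - v) * t);
}

// Halfway between a red vertex and a blue vertex:
console.log(lerpColor([1, 0, 0], [0, 0, 1], 0.5)); // [0.5, 0, 0.5]
```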
Ambient light
Add a variable to the vertex shader to receive the color of the ambient light:
```glsl
// ...
uniform vec3 u_AmbientLight;
void main() {
  // ...
  // Calculate the ambient reflection
  vec3 ambient = u_AmbientLight * a_Color.rgb;
  // Add the ambient and diffuse light together
  v_Color = vec4(diffuse + ambient, a_Color.a);
}
```
Pass ambient light data to the vertex shader in JavaScript:
```javascript
function main() {
  // ...
  const u_AmbientLight = gl.getUniformLocation(gl.program, 'u_AmbientLight');
  gl.uniform3f(u_AmbientLight, 0.2, 0.2, 0.2);
  // ...
}
```
Add ambient light and the effect is as follows:
Because ambient light is now added on top of the previous single light source, the cube appears brighter 🤗 Is that the end? No way! Objects in the WebGL world don't sit still. We have already covered graphic transformations, and when an object transforms, the lighting effect changes too!
Shine some light on restless objects
Let's review the formula above:

&lt;diffuse color&gt; = &lt;incident light color&gt; × &lt;surface base color&gt; × cos θ
Since we use unit vectors to compute cos θ, this can be rewritten as:

&lt;diffuse color&gt; = &lt;incident light color&gt; × &lt;surface base color&gt; × (&lt;light direction&gt; · &lt;normal direction&gt;)
Let's be clear: when an object transforms, the color of the incident light, the surface base color, and the direction of the light do not change — only the normal vector of each face can change. Now, what kinds of transformation are there in geometry? Rotation, scaling, and translation (see the article on graphic transformations). Which of them affect the normal vector?
The far left is the initial position, with the blue arrow representing the normal vector of a face; the following images show the geometry after translation, rotation, and scaling. We can observe:
- Translation does not affect the normal vector of the face;
- Rotation changes the normal vector of the face;
- Scaling does not seem to change the normal vector of the face.
But is that really the case? Take a look at the picture below:
Here I doubled the width of the rotated shape along the x-axis, and it is obvious that the normal vector changed: scaling can change the normal vector of a face after all. So how do we compute the normal vector of a face after a transformation? Time for a grand introduction to a magic matrix: **the inverse transpose matrix!** Simply multiply the pre-transformation normal vector by the inverse transpose of the model matrix. Both transpose and inverse operations are available in the gl-matrix utilities:
The transpose function transposes a matrix, and the invert function computes its inverse. I won't paste the detailed code here — it's simple enough that I believe you can finish it yourself 👍🏻
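To see numerically why the inverse transpose is needed, here is a minimal plain-JavaScript sketch (not the gl-matrix API; it uses a diagonal scale matrix so the inverse transpose is just the reciprocal of each diagonal entry):

```javascript
// Non-uniform scale: stretch x by 2.
const scale = [2, 1, 1];
// For a diagonal matrix, the inverse transpose is 1/s on each diagonal entry.
const inverseTranspose = scale.map((s) => 1 / s); // [0.5, 1, 1]

function normalize(v) {
  const len = Math.hypot(v[0], v[1], v[2]);
  return v.map((c) => c / len);
}

// Normal of the plane x + y = 0 before the transform.
const n = normalize([1, 1, 0]);

// Wrong: transforming the normal with the model matrix itself.
const wrong = normalize(n.map((c, i) => scale[i] * c));
// Right: transforming it with the inverse transpose.
const right = normalize(n.map((c, i) => inverseTranspose[i] * c));

// After scaling, points (t, -t, 0) become (2t, -t, 0), i.e. the plane
// x/2 + y = 0, whose true normal is proportional to (1, 2, 0).
console.log(right); // ≈ [0.4472, 0.8944, 0], i.e. (1, 2, 0) / √5 — correct
console.log(wrong); // ≈ [0.8944, 0.4472, 0] — no longer perpendicular
```

The transformed surface direction is (2, -1, 0), so a correct normal must give a zero dot product with it — which `right` does and `wrong` does not.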
🎬 Conclusion
Lighting is an essential part of the 3D world: it makes the scene more realistic and the user experience more pleasant! That's all for our fun with the lighting system. Please follow our public account, Refactor — we will share more interesting and useful articles with you in the future.