Preface

This is the fourth part of the three.js series. The first three are:

Write a rain animation with three.js

Write a small scene with three.js

Write a reflective ball with three.js

The basics of three.js are covered in the first part, Write a rain animation with three.js, so they will not be repeated in the examples below.

Today we start learning shaders by building a simple 3D earth with shader-driven animation. Let’s get started.

Introduction to shaders

Drawing in WebGL is based on shaders. Shaders provide a flexible and powerful way to draw 2D or 3D graphics, and every WebGL program must use them.

The shader language (GLSL) is similar to C. When we write WebGL programs, the shader code is embedded in the JavaScript as strings.

For example, to draw a dot on the screen, the code looks like this:

<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8">
  <meta http-equiv="X-UA-Compatible" content="IE=edge">
  <meta name="viewport" content="width=device-width, initial-scale=1.0">
  <title>Document</title>
  <style>
    body { margin: 0px; text-align: center; }
  </style>
</head>
<body>
  <canvas id="webgl"></canvas>
</body>
<script>
  var canvas = document.getElementById('webgl')
  canvas.height = window.innerHeight
  canvas.width = window.innerWidth
  var gl = canvas.getContext('webgl')
  gl.clearColor(0.0, 0.0, 0.0, 1.0)

  // Vertex shader: specifies the position and size of the point
  var VSHADER_SOURCE = `
    void main () {
      gl_Position = vec4(0.5, 0.5, 0.0, 1.0); // x: 0.5, y: 0.5, z: 0.0
      gl_PointSize = 10.0;
    }
  `
  // Fragment shader: specifies the color of the point
  var FSHADER_SOURCE = `
    void main () {
      gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0); // the color of the point
    }
  `

  initShaders(gl, VSHADER_SOURCE, FSHADER_SOURCE)
  gl.clear(gl.COLOR_BUFFER_BIT)
  gl.drawArrays(gl.POINTS, 0, 1) // draw one point

  function initShaders(gl, vshader, fshader) {
    var program = createProgram(gl, vshader, fshader);
    if (!program) {
      console.log('Failed to create program');
      return false;
    }
    gl.useProgram(program);
    gl.program = program;
    return true;
  }

  function createProgram(gl, vshader, fshader) {
    var vertexShader = loadShader(gl, gl.VERTEX_SHADER, vshader);
    var fragmentShader = loadShader(gl, gl.FRAGMENT_SHADER, fshader);
    if (!vertexShader || !fragmentShader) {
      return null;
    }
    var program = gl.createProgram();
    if (!program) {
      return null;
    }
    gl.attachShader(program, vertexShader);
    gl.attachShader(program, fragmentShader);
    gl.linkProgram(program);
    var linked = gl.getProgramParameter(program, gl.LINK_STATUS);
    if (!linked) {
      var error = gl.getProgramInfoLog(program);
      console.log('Failed to link program: ' + error);
      gl.deleteProgram(program);
      gl.deleteShader(fragmentShader);
      gl.deleteShader(vertexShader);
      return null;
    }
    return program;
  }

  function loadShader(gl, type, source) {
    var shader = gl.createShader(type);
    if (shader == null) {
      console.log('unable to create shader');
      return null;
    }
    gl.shaderSource(shader, source);
    gl.compileShader(shader);
    var compiled = gl.getShaderParameter(shader, gl.COMPILE_STATUS);
    if (!compiled) {
      var error = gl.getShaderInfoLog(shader);
      console.log('Failed to compile shader: ' + error);
      gl.deleteShader(shader);
      return null;
    }
    return shader;
  }
</script>
</html>

The above code draws a dot in the upper right area of the screen.

Drawing this point requires three pieces of information: position, size, and color.

  • The vertex shader specifies the position and size of a point. In the code below, gl_Position, gl_PointSize and gl_FragColor are shader built-in variables.

var VSHADER_SOURCE = `
  void main () {
    gl_Position = vec4(0.5, 0.5, 0.0, 1.0); // specify the position of the point
    gl_PointSize = 10.0;                    // specify the size of the point
  }
`

  • The fragment shader specifies the color of the point.

var FSHADER_SOURCE = `
  void main () {
    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0); // specify the color of the point
  }
`

Attribute variables and uniform variables

In the example above, we specified the position, size, and color of the point directly in the shader. In practice, this information is usually passed to the shader from JS.

The variables used to communicate between JS code and shader code are attribute variables and uniform variables.

Which variable to use depends on the data itself. Attribute variables are used to pass vertex-specific data, and uniform variables are used to pass vertex-independent data.

In the following example, the coordinates of the points to be drawn are passed in by JS.

// Vertex shader
var VSHADER_SOURCE = `
  attribute vec4 a_Position;  // declare an attribute variable a_Position to receive the vertex position passed in from JS
  void main () {
    gl_Position = a_Position; // assign a_Position to gl_Position
    gl_PointSize = 10.0;
  }
`
// Fragment shader
var FSHADER_SOURCE = `
  void main () {
    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
  }
`
initShaders(gl, VSHADER_SOURCE, FSHADER_SOURCE)

// Get the storage location of a_Position and pass the vertex coordinates to it from JS
var a_Position = gl.getAttribLocation(gl.program, 'a_Position')
gl.vertexAttrib3f(a_Position, 0.5, 0.5, 0.0)
gl.drawArrays(gl.POINTS, 0, 1)
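For contrast, here is a minimal sketch of passing vertex-independent data through a uniform variable. It reuses the vertex shader above; the name u_FragColor and the chosen color are only for illustration and are not from the original example.

// Fragment shader: the color now comes from JS through a uniform (illustrative sketch)
var FSHADER_SOURCE = `
  precision mediump float;
  uniform vec4 u_FragColor; // the same value applies to every vertex/fragment
  void main () {
    gl_FragColor = u_FragColor;
  }
`
initShaders(gl, VSHADER_SOURCE, FSHADER_SOURCE)

// Look up the uniform's storage location and set its value from JS
var u_FragColor = gl.getUniformLocation(gl.program, 'u_FragColor')
gl.uniform4f(u_FragColor, 1.0, 0.0, 0.0, 1.0)
gl.drawArrays(gl.POINTS, 0, 1)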

Varying variables

What we pass from JS to the shaders is usually vertex-related data. For example, to draw a triangle, the vertex positions and colors are passed in from JS. The three vertex positions determine where the triangle is, but what determines the color of every point inside the triangle?

This is where varying comes in.

Color calculation in WebGL:

The vertex shader receives the position and color data of each vertex passed in from JS. Based on that vertex data, interpolation computes the color value of every fragment (which can be understood as the smallest renderable point making up the image) in the region between the vertices. The interpolation is done automatically by the WebGL system.

The computed color value of each fragment is then passed to the fragment shader, which renders the image based on these per-fragment color values.

Passing the data from the vertex shader to the fragment shader is done through varying variables.

The code is as follows.

  • Vertex shader code
var VSHADER_SOURCE = `
  attribute vec4 a_Position;
  attribute vec4 a_Color; // receive the vertex color passed in from JS
  varying vec4 v_Color;   // varying variable: WebGL interpolates the per-fragment color from the vertex colors and passes it to the fragment shader
  void main () {
    gl_Position = a_Position;
    v_Color = a_Color;    // assign a_Color to the varying variable v_Color
  }
`
  • Fragment shader code
var FSHADER_SOURCE = `
  precision mediump float;
  varying vec4 v_Color; // receives the interpolated color; must have the same name and type as the varying in the vertex shader
  void main () {
    gl_FragColor = v_Color;
  }
`
  • JS code

// Vertex positions and colors
var verticesColors = new Float32Array([
  // the first two values of each row are x and y (z defaults to 0), the last three are the color
   0.0,  0.5,  1.0, 0.0, 0.0, // first point
  -0.5, -0.5,  0.0, 1.0, 0.0, // second point
   0.5, -0.5,  0.0, 0.0, 1.0  // third point
])
var vertexColorBuffer = gl.createBuffer()
gl.bindBuffer(gl.ARRAY_BUFFER, vertexColorBuffer)
gl.bufferData(gl.ARRAY_BUFFER, verticesColors, gl.STATIC_DRAW)

var FSIZE = verticesColors.BYTES_PER_ELEMENT
var a_Position = gl.getAttribLocation(gl.program, 'a_Position')
gl.vertexAttribPointer(a_Position, 2, gl.FLOAT, false, FSIZE * 5, 0)
gl.enableVertexAttribArray(a_Position)
var a_Color = gl.getAttribLocation(gl.program, 'a_Color')
gl.vertexAttribPointer(a_Color, 3, gl.FLOAT, false, FSIZE * 5, FSIZE * 2)
gl.enableVertexAttribArray(a_Color)

// Draw a triangle: the first argument is gl.TRIANGLES
gl.drawArrays(gl.TRIANGLES, 0, 3)

Here is the final drawing:

A simple understanding of texture mapping

In the example above, we are specifying a color value for each vertex.

By extension, texture mapping specifies texture coordinates for each vertex, and the WebGL system interpolates the texture coordinate of each fragment from the vertex texture coordinates.

Based on the texture image passed in and the texture coordinate of each fragment, the color value (texel) at that coordinate in the texture image is looked up and used as the fragment's color when rendering.

Features of texture coordinates:

  • The bottom left corner of the texture image is the origin (0, 0).
  • To the right is the positive direction of the horizontal axis, and the maximum value of the horizontal axis is 1 (right edge of the image).
  • Upward is the positive direction of the vertical axis, and the maximum value of the vertical axis is 1 (upper edge of the image).

Regardless of the size of the texture image, texture coordinates always range from 0 to 1 on both axes.
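A minimal sketch of what this looks like in shader code: the vertex shader passes each vertex's texture coordinate on through a varying, and the fragment shader samples the texture at the interpolated coordinate. The names a_TexCoord, v_TexCoord and u_Sampler are chosen here only for illustration.

// Vertex shader: forward the per-vertex texture coordinate
var VSHADER_SOURCE = `
  attribute vec4 a_Position;
  attribute vec2 a_TexCoord;
  varying vec2 v_TexCoord;
  void main () {
    gl_Position = a_Position;
    v_TexCoord = a_TexCoord; // interpolated per fragment by WebGL
  }
`
// Fragment shader: look up the texel at the interpolated coordinate
var FSHADER_SOURCE = `
  precision mediump float;
  uniform sampler2D u_Sampler; // the texture image
  varying vec2 v_TexCoord;
  void main () {
    gl_FragColor = texture2D(u_Sampler, v_TexCoord);
  }
`

On the JS side, the image is loaded, uploaded to the GPU with gl.texImage2D, and bound to u_Sampler through a texture unit; three.js handles all of this for us, as the next section shows.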

Draw a 3D earth

Drawing with raw WebGL involves tedious steps and APIs. Fortunately, we can use three.js.

The ShaderMaterial in three.js allows us to customize our own shaders and directly manipulate pixels. We just need to understand the basics of shaders.
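Unlike raw WebGL, with ShaderMaterial we do not have to declare the common attributes and uniforms ourselves: three.js automatically prepends declarations such as position, uv, projectionMatrix, modelViewMatrix and viewMatrix (plus a default float precision) to our shader code. A minimal skeleton might look like the sketch below; the uColor uniform is a made-up example, not part of the earth demo.

// Minimal ShaderMaterial skeleton (illustrative sketch)
var material = new THREE.ShaderMaterial({
  uniforms: {
    uColor: { value: new THREE.Color(0xff0000) } // data passed from JS to the shaders
  },
  vertexShader: `
    varying vec2 vUv;
    void main () {
      vUv = uv; // position, uv, projectionMatrix, modelViewMatrix are provided by three.js
      gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    }
  `,
  fragmentShader: `
    uniform vec3 uColor;
    varying vec2 vUv;
    void main () {
      gl_FragColor = vec4(uColor * vUv.y, 1.0); // simple vertical gradient tint
    }
  `
})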

Start drawing the earth.
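The snippets that follow assume a scene, camera, renderer and lights already exist. A minimal setup might look like this; the exact camera position and lights are assumptions, not code from the original article.

// Basic scene setup assumed by the later snippets
var scene = new THREE.Scene()
var camera = new THREE.PerspectiveCamera(45, window.innerWidth / window.innerHeight, 0.1, 1000)
camera.position.set(0, 0, 100)
var renderer = new THREE.WebGLRenderer({ antialias: true })
renderer.setSize(window.innerWidth, window.innerHeight)
document.body.appendChild(renderer.domElement)
// MeshPhongMaterial only shows up when the scene is lit
scene.add(new THREE.AmbientLight(0x444444))
var light = new THREE.DirectionalLight(0xffffff, 1)
light.position.set(1, 1, 1)
scene.add(light)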

The base sphere

Drawing the base sphere is relatively simple: just use the materials provided by three.js. Materials are covered in more detail in Write a reflective ball with three.js.

var loader = new THREE.TextureLoader()
var group = new THREE.Group()
var geometry = new THREE.SphereGeometry(20, 30, 30)
var earthMaterial = new THREE.MeshPhongMaterial({ // create the material
  map: loader.load('./images/earth.png'),            // base color texture
  specularMap: loader.load('./images/specular.png'), // specular map: controls which parts of the surface are shinier and which are duller
  normalMap: loader.load('./images/normal.png'),     // normal map: adds finer bumps and wrinkles
  normalScale: new THREE.Vector2(3, 3)
})
var sphere = new THREE.Mesh(geometry, earthMaterial)
group.add(sphere)

Flow of the atmosphere

Use ShaderMaterial to write custom shaders. The flow of the atmosphere is achieved by shifting the texture coordinates a little on every frame of the requestAnimationFrame render loop. To make the flow look more natural, a noise texture is added.

// Vertex shader
var VSHADER_SOURCE = `
  varying vec2 vUv;
  void main () {
    vUv = uv;
    gl_Position = projectionMatrix * viewMatrix * modelMatrix * vec4(position, 1.0);
  }
`
// Fragment shader
var FSHADER_SOURCE = `
  uniform float time;         // time, updated from JS every frame
  uniform sampler2D fTexture; // flow (cloud) texture
  uniform sampler2D nTexture; // noise texture
  varying vec2 vUv;
  void main () {
    vec2 newUv = vUv + vec2(0.0, 0.02) * time;   // shift the texture coordinates over time
    vec4 noiseRGBA = texture2D(nTexture, newUv); // sample the noise texture
    newUv.x += noiseRGBA.r * 0.2;                // perturb the coordinates with the noise so the flow looks natural
    newUv.y += noiseRGBA.g * 0.2;
    gl_FragColor = texture2D(fTexture, newUv);
  }
`
var flowTexture = loader.load('./images/flow.png')
flowTexture.wrapS = THREE.RepeatWrapping
flowTexture.wrapT = THREE.RepeatWrapping
var noiseTexture = loader.load('./images/noise.png')
noiseTexture.wrapS = THREE.RepeatWrapping
noiseTexture.wrapT = THREE.RepeatWrapping

var flowMaterial = new THREE.ShaderMaterial({
  uniforms: {
    fTexture: { value: flowTexture },
    nTexture: { value: noiseTexture },
    time: { value: 0.0 }
  },
  vertexShader: VSHADER_SOURCE,   // vertex shader
  fragmentShader: FSHADER_SOURCE, // fragment shader
  transparent: true
})
var flowGeometry = new THREE.SphereGeometry(20.001, 30, 30)
var fsphere = new THREE.Mesh(flowGeometry, flowMaterial)
group.add(fsphere)
scene.add(group)

We create a group, add both the base sphere and the atmosphere sphere to it, and treat them as a whole: rotation and position are set by modifying the group's properties directly.

var clock = new THREE.Clock()
var animate = function () {
  requestAnimationFrame(animate)
  var delta = clock.getDelta()
  group.rotation.y += 0.002                 // rotate the group as a whole
  flowMaterial.uniforms.time.value += delta // advance the time uniform so the atmosphere keeps flowing
  renderer.render(scene, camera)
}
animate()

Halo

The halo is created with a Sprite, a plane that always faces the camera. It is used here to simulate a halo: no matter how the sphere rotates, it always appears to be surrounded by the glow.

var ringMaterial = new THREE.SpriteMaterial({ // create a sprite material
  map: loader.load('./images/ring.png')
})
var sprite = new THREE.Sprite(ringMaterial) // create the sprite
sprite.scale.set(53, 53, 1)                 // set the sprite's size
group.add(sprite)

Final effect: