Preface

Following on from the previous article, “The Threejs Tool Class Thing”: as front-end developers we run into the term GPU from time to time, but rarely feel how close it is to us, and yet how far away. This article introduces that relationship through the shader, which no 3D effect can do without.

Here are some interesting questions that may be answered after reading this article.

  • Can the front end control how the GPU renders?
  • What is the “GPU acceleration” we often hear about?
  • What is a shader?
  • Why are special effects better written with shaders?

GPU acceleration is the most familiar of these: it can be triggered with CSS properties such as transform: translateZ(0).
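
For instance, a minimal sketch of triggering layer promotion from JS (the element id is made up for illustration):

// A sketch: hint the browser to move an element onto its own
// GPU-composited layer. "box" is a hypothetical element id.
const el = document.getElementById('box');
el.style.transform = 'translateZ(0)'; // the classic hack
el.style.willChange = 'transform';    // the modern, explicit hint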

A lot of unfamiliar terms are about to come up, so take your time; let’s start with the shader.

Render Pipelines and Shaders

To understand what a shader is, you first need to understand what a rendering pipeline is. Suppose we have a pile of graphics data; all the rendering pipeline does is take that data (points, lines, faces, etc.) through a series of stages and render it to the screen. This is also the main workflow of graphics cards (GPUs). The shader is a very important stage of that workflow, and because this stage is editable it is also known as the programmable pipeline.

Taking a sphere model as an example: the model object contains information about the sphere’s position in space and the points that make it up. The approximate flow of the rendering pipeline looks like this:

Sphere vertex and face data input -> vertex shader -> primitive assembly -> rasterization -> fragment shader -> blending and testing -> final output to the device.

In Threejs, shaders can be understood as the tools that determine the positions of an object’s vertices and the colors of its surfaces; they correspond to the vertex shader and fragment shader stages of the rendering pipeline. That should give you a first idea of what a shader is. A minimal sketch follows, and after that, two ways of writing dynamic effects in Threejs.
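
To make the two stages concrete, here is a minimal, illustrative sketch of a Threejs material that supplies both shaders inline (the orange color and the sphere are arbitrary choices):

// A minimal sketch of the two programmable stages in Threejs.
const material = new THREE.ShaderMaterial({
    // Vertex shader: runs once per vertex and outputs a clip-space position.
    vertexShader: `
        void main() {
            gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
        }
    `,
    // Fragment shader: runs once per rasterized fragment and outputs a color.
    fragmentShader: `
        void main() {
            gl_FragColor = vec4(1.0, 0.5, 0.2, 1.0);
        }
    `,
});
const sphere = new THREE.Mesh(new THREE.SphereGeometry(1, 32, 32), material);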

CPU rendering versus GPU rendering

This story starts with the birth of the GPU. We all know a screen has a huge number of pixels; a 4K screen, for example, is 4096 × 2160, roughly 8.8 million pixels. Clearly, serial processing on a single CPU thread was not appropriate; a device that could process large amounts of data in parallel was needed, and so GPUs were born, e.g. the 1280 stream processors in a GTX 1060.

Note that each pixel on the screen knows nothing about what the other pixels are rendering, which takes some getting used to when writing shaders.

So what is the difference between CPU rendering and GPU rendering? It comes down to whether object state is processed on the CPU or on the GPU. Take particle motion animation in Threejs as an example: with CPU rendering, the position of every particle has to be maintained in JS every frame. That works, but the main problem is that JS runs on a single CPU thread, which can cause performance problems. With GPU rendering, the particle positions are computed in the shader each frame, and that work is done by the GPU.

CPU rendering

// CPU rendering: update every particle position in JS each frame
const points = new THREE.Points(geometry, material);
function render(time) {
    const position = points.geometry.attributes.position;
    for (let i = 0; i < position.count; i++) {
        position.setX(i, 100 * Math.cos(time));
        position.setY(i, 100 * Math.sin(time));
    }
    position.needsUpdate = true; // re-upload the positions to the GPU
    requestAnimationFrame(render); // keep the loop going
}
requestAnimationFrame(render);

GPU rendering

// GPU rendering
// Vertex shader
// vertexShader.glsl
uniform float time;
void main(){
    vec3 uPosition = position; // keep the built-in z, move x/y over time
    uPosition.x = 100. * cos(time);
    uPosition.y = 100. * sin(time);

    gl_PointSize = 4.;
    gl_Position = projectionMatrix * modelViewMatrix * vec4(uPosition, 1.);
}

If we had 20,000 particles, processing the vertex positions sequentially on a CPU would be much slower than processing them in parallel on a GPU. Moreover, the common shader functions compile down to hardware-level operations in the GPU, so the data is processed about as fast as the hardware allows. That is why the “hardware acceleration” we talk about actually accelerates.

GLSL syntax

The shader code above is written in GLSL (OpenGL Shading Language). The syntax reference can be found here; the rules are quite simple, so this article will not introduce the syntax itself. The specific effects and figures of the built-in functions will be covered in a later article, “On the use of shader built-in functions and effects” (in draft).

Let’s look at how shaders are written.

Vertex shaders (vertexShader) and fragment shaders (fragmentShader)

In Threejs / WebGL, the shaders we can write are the vertex shader and the fragment shader.

Vertex shaders process the position, size, color, and texture coordinates of each vertex, while fragment shaders process each fragment (pixel candidate) in the area covered by the rasterized primitive.

The common way to use shaders

The most common approach is to write the shaders inside script tags:

// main.html
<script id="vertexShader" type="x-shader/x-vertex">
void main(){
    gl_PointSize = 4.;
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}
</script>

// main.html
<script id="fragmentShader" type="x-shader/x-fragment">
uniform vec3 color;
void main(){
    gl_FragColor = vec4(color , 1.0);
}
</script>

Then extract the strings wherever they are needed:

// usage
const vertexShader = document.getElementById('vertexShader').innerText;
const fragmentShader = document.getElementById('fragmentShader').innerText;
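
These strings can then be handed to a ShaderMaterial. A brief sketch (the color value is arbitrary; the uniform name matches the `color` declared in the fragment shader above):

const material = new THREE.ShaderMaterial({
    uniforms: {
        color: { value: new THREE.Color(0x338899) }, // feeds "uniform vec3 color"
    },
    vertexShader,
    fragmentShader,
});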

Using shaders more gracefully

In a Webpack environment, raw-loader can help: it lets us put the GLSL in separate files.

// vShader.glsl
void main(){
    gl_PointSize = 4.;
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}

// fShader.glsl
uniform vec3 color;
void main(){
    gl_FragColor = vec4(color , 1.0);
}

const vShader = require('!raw-loader!./vShader.glsl').default;
const fShader = require('!raw-loader!./fShader.glsl').default;
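
Alternatively, instead of the inline `!raw-loader!` prefix, the loader can be registered once in the Webpack config. A sketch:

// webpack.config.js -- a sketch: treat every .glsl file as a raw string
module.exports = {
    module: {
        rules: [
            { test: /\.glsl$/, use: 'raw-loader' },
        ],
    },
};

// Then a plain import is enough:
// import vShader from './vShader.glsl';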

More modular use

Github.com/glslify/gls…

Passing data from Threejs to shaders

First of all, data transfer is one-way; shader code does not provide event callbacks (it would be strange if the pixels on screen did). When Threejs creates a material, it passes object positions, normals, textures, and so on into the vertex shader by default. Custom variables can be passed in through either the uniform route or the attribute route.

Uniform variables are passed in when the shader material is created and are then used in the shader under the same name. Attributes are properties of the object’s geometry; they are not passed in when the material is created, but automatically when the Mesh is created.

How uniforms and attributes are constructed

// ...
// Pass a default color through uniforms
const material = new THREE.ShaderMaterial({
	uniforms: {
		uColor: {
			value: new THREE.Color(0x338899),
		},
		test: {
			value: 0.5,
		},
	},
	vertexShader: vShader,
	fragmentShader: fShader,
});
// ...
// positionTest: redefined vertex positions
// const positionTest = new Float32Array(...);
geometry.setAttribute(
    "positionTest",
    new THREE.BufferAttribute(positionTest, 3)
);
// ...
// Vertex shader
// Use custom vertex position positionTest
attribute vec3 positionTest;
void main(){
    gl_Position = projectionMatrix * modelViewMatrix * vec4(positionTest, 1.0);
}

// Fragment shader
// Use a uniform variable
uniform vec3 uColor;
void main(){
    gl_FragColor = vec4(uColor, 1.0);
}

This allows you to use custom vertices and custom color values.

Dynamic effects with shaders

So how do we make what the shader renders move? There are many ways; animation is a function of time, so we can pass the time in during render. We can also use the tween.js library to generate values that change over time and pass those in.

requestAnimationFrame(callback) passes the callback the current time in milliseconds as its first argument. When tween.js is used, TWEEN.update() has to be called inside render.
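
A sketch of the simplest version, driving a time uniform straight from the timestamp (assuming the material declares `uniform float time;` and that renderer, scene, and camera already exist):

function render(time) {
    material.uniforms.time.value = time / 1000; // ms -> seconds
    // TWEEN.update(); // needed here when tween.js is in play
    renderer.render(scene, camera);
    requestAnimationFrame(render);
}
requestAnimationFrame(render);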

The graphic effect shown above is also very simple: it is generated with the fract and step functions.

// fragmentShader.glsl
varying vec2 vUv;
uniform float uVal;
uniform vec2 uResolution;

// Adjust the UV coordinates for the object's aspect ratio
vec2 centerUv(vec2 uv, vec2 resolution){
    float aspect = resolution.x / resolution.y;
    uv.x *= aspect;
    return uv;
}

void main(){
    vec2 cUv = centerUv(vUv, uResolution);
    float dis = distance(cUv, vec2(0.5, 0.5));

    gl_FragColor = vec4(vec3(step(0.05, fract(dis * 10.0 - uVal)) + uVal), 1.0);
}

Here tween.js is used to generate the periodic change over time.

function crateMaterial(uResolution: THREE.Vector2): ShaderMaterial {
	const material = new THREE.ShaderMaterial({
		uniforms: {
			uVal: {
				value: 0,
			},
			uResolution: {
				value: uResolution,
			},
		},
		vertexShader: vShaderScale,
		fragmentShader: fShaderScale,
	});

	// Feed a periodic change into the shader
	const pos = { uVal: 0 };
	const tween = new TWEEN.Tween(pos)
		.to({ uVal: 1 }, 2000)
		.easing(TWEEN.Easing.Quadratic.InOut)
		.onUpdate(function ({ uVal }) {
			material.uniforms.uVal.value = uVal;
		});
	const tweenBack = new TWEEN.Tween(pos)
		.to({ uVal: 0 }, 2000)
		.easing(TWEEN.Easing.Quadratic.InOut)
		.onUpdate(function ({ uVal }) {
			material.uniforms.uVal.value = uVal;
		});
	tween.chain(tweenBack);
	tweenBack.chain(tween);
	tween.start();
	return material;
}
threetool.continuousRender((time) => {
	TWEEN.update();
});

Scenarios with custom shaders

Having read this far, don’t you feel that shaders are quite powerful? They can customize how vertices and fragments are displayed, and their performance is good compared with CPU rendering. So is it better to use custom shaders for everything in Threejs? No, and again, the answer falls out of the question: what is a shader? To recap, what a shader does is one stage (or a couple of stages) of the GPU rendering pipeline, so no figure can be rendered without shaders at all. How, then, do you render geometry without writing custom shaders? The answer is that the materials in Threejs come with their own default shaders, such as the Lambert shader, whose vertex shader is written around the diffuse lighting model. Most of the time we can use these directly and avoid unnecessary duplicated development.

Looking at the Threejs source code, you can see shaders for the various materials, each corresponding to a different type of Material.
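
For example, a sketch of rendering a sphere with the built-in Lambert shader instead of a custom one (assuming an existing scene; the color and light values are arbitrary):

// MeshLambertMaterial ships with its own vertex/fragment shaders,
// so simple diffuse lighting needs no custom GLSL.
const sphere = new THREE.Mesh(
    new THREE.SphereGeometry(1, 32, 32),
    new THREE.MeshLambertMaterial({ color: 0x338899 })
);
scene.add(sphere);
scene.add(new THREE.DirectionalLight(0xffffff, 1)); // Lambert shading needs a light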

Borrowing Threejs’ built-in shaders

Take a look at the process of using shaders in native WebGL. Using a shader requires the following steps (a complete sketch follows the list):

  • Compile the shaders: gl.compileShader(shader);
  • Create a GLSL program object: const program = gl.createProgram();
  • Attach the shaders to the program object: gl.attachShader(program, shader);
  • Link the program object: gl.linkProgram(program);
  • Activate the program object: gl.useProgram(program);
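
Put together, a minimal sketch of that native flow might look like this (error handling kept minimal; gl is an existing WebGLRenderingContext, and vertexSrc / fragmentSrc are GLSL strings):

// Compile both shaders, then create, link, and activate a program.
function createProgram(gl, vertexSrc, fragmentSrc) {
    const compile = (type, src) => {
        const shader = gl.createShader(type);
        gl.shaderSource(shader, src);
        gl.compileShader(shader);
        if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
            throw new Error(gl.getShaderInfoLog(shader));
        }
        return shader;
    };
    const program = gl.createProgram();
    gl.attachShader(program, compile(gl.VERTEX_SHADER, vertexSrc));
    gl.attachShader(program, compile(gl.FRAGMENT_SHADER, fragmentSrc));
    gl.linkProgram(program);
    gl.useProgram(program);
    return program;
}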

So to overwrite a shader provided by Threejs, you need to step in before the shader is compiled. Threejs exposes the onBeforeCompile lifecycle hook on materials for exactly this purpose.

const material = new THREE.MeshBasicMaterial();

material.onBeforeCompile = (shader) => {
	// replace some code to do something
	// (mutating shader is enough; the return value is not used)
	// shader.vertexShader = xxx;
	// shader.fragmentShader = xxx;
};
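
As a more concrete sketch, here is one way to patch the built-in fragment shader: inject a uniform and append to a known chunk. The chunk name comes from the built-in shader source; the uTint uniform is made up for illustration:

const material = new THREE.MeshBasicMaterial({ color: 0xffffff });
material.onBeforeCompile = (shader) => {
    shader.uniforms.uTint = { value: new THREE.Color(0x338899) };
    shader.fragmentShader = shader.fragmentShader
        // declare the new uniform before main()
        .replace('void main() {', 'uniform vec3 uTint;\nvoid main() {')
        // append a tint after the dithering chunk near the end
        .replace(
            '#include <dithering_fragment>',
            '#include <dithering_fragment>\n\tgl_FragColor.rgb *= uTint;'
        );
};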

As an aside, if you look at the source code of Three’s shaders, you might wonder how they can use #include for modularization.

A closer look reveals that each #include directive is actually expanded by a regular-expression-driven String.replace over the shader string, which is probably the most primitive implementation of modularity.
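
A rough sketch of that mechanism (illustrative, not Three’s exact code):

import { ShaderChunk } from 'three';

// Expand every "#include <name>" with the chunk of the same name,
// recursing because chunks can include other chunks.
function resolveIncludes(source) {
    return source.replace(/#include +<([\w\d./]+)>/g, (match, name) =>
        resolveIncludes(ShaderChunk[name])
    );
}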

Wrapping up

The next post will be about writing a particle-effects transform plugin, building on the content of these two posts.

The source code

Original writing is not easy; please contact the author before reprinting. And do leave a like: likes are what motivate the author to open-source code and keep updating. Here is the Threejs utility class code covered in the previous article, which this article may also use. The Demo from the previous article is included in the Threejs utility class, and the plugin in the next article will inherit from this utility class.

Planned posts in this series

  • The Threejs Tool Class Thing
  • About writing a particle effects transform plugin (in final preparation)
  • “On the use of shader built-in functions and effects” (in draft)
    • Various functions and special effects
  • “This thing about shader lighting” (draft)
    • Flat shading
    • Per-vertex shading (Gouraud shading)
    • Per-pixel shading (Phong shading)
    • Blinn-Phong lighting model
    • Fresnel effect
    • Cartoon (toon) shaders
  • “On local coordinates, world coordinates, and projection coordinates” (in draft)
    • Local coordinates
    • World coordinates
    • Projection coordinates
    • Matrix transformation
  • About the GitHub homepage Earth effect
  • About this D3js thing
  • About the visualization of a data diagram
  • About writing a little jump game
    • Scenario generation
    • Collision detection
    • The game logic