Preface

The most important thing in learning programming is to actually do it! Before you start, though, you need a project to work in, so today I'm going to share how to build a WebGL project with Vue 3.

This is not to say that you need React, Vue, or any other front-end library/framework to write WebGL projects; you can still write them comfortably in plain JavaScript. But building on top of a front-end framework has real benefits: for one, my company uses Vue, so writing with Vue fits the existing technology stack; for another, it is a chance to learn Vue 3. Why not?

Writing WebGL programs with Vue 3

First I chose @vue/cli. Compared with Vite, Webpack is more familiar to me, so the learning cost stays low and manageable. All right, let's cut to the chase!

Building the Vue 3 project

Create a project with the @vue/cli command line:

vue create webgl-test

I'll write the Vue components in JSX style, so I also need to install the JSX plugin:

npm install @vue/babel-plugin-jsx -D

Next, configure Babel and add the plugins attribute to babel.config.js:

module.exports = {
  plugins: ['@vue/babel-plugin-jsx']
}

Next, create a clickpoints.js file; the example below renders a point wherever you click on the canvas.

import { defineComponent, ref, onMounted } from 'vue';

export const ClickedPointes = defineComponent({
    setup () {
        const root = ref(null);
        const canvasDown = (e) => {};
        return () => <canvas ref={root} onClick={canvasDown} width="400" height="400"></canvas>;
    }
});

This uses Vue 3's setup function (see the official documentation for the syntax).

Writing the WebGL code

Before writing, let’s talk about the WebGL process. It can be roughly divided into these five steps:

  • Obtain the <canvas> element and create the WebGL drawing context
  • Write the vertex shader and fragment shader source code
  • Create the shader objects, then load and compile the shader source
  • Create the program object, then attach and link the shader objects
  • Draw

Get the canvas element and create the WebGL drawing context: use ref in Vue to get the canvas element.

const root = ref(null);
onMounted(() => {
    const canvas = root.value;
});
return () => <canvas ref={root}></canvas>;

This lets you retrieve the canvas element in the onMounted lifecycle hook. The canvas is then used to create a WebGL drawing context, which is abstracted into a function for convenience.

const getWebglContext = (canvas) => {
  const ctx = canvas.getContext('webgl');
  return ctx;
};
onMounted(() => {
    const canvas = root.value;
    const gl = getWebglContext(canvas);
});

Write the vertex shader and fragment shader source code: shaders produce visual effects and are written in a C-like programming language called a shading language. OpenGL ES 2.0 adopts the OpenGL Shading Language (GLSL), and its variant is called the OpenGL ES Shading Language (GLSL ES). WebGL is based on OpenGL ES 2.0, so its shaders are also written in GLSL ES.

In JavaScript, shader programs are “embedded” as strings. WebGL requires two kinds of shaders:

  • Vertex shader: describes the properties of a vertex (position, color, and so on). A vertex is a point in two- or three-dimensional space.
  • Fragment shader: a program that operates on fragments. A fragment is a WebGL term that can be loosely understood as a pixel.
const VShader = `
  attribute vec4 a_Position;
  void main() {
    gl_Position = a_Position;
    gl_PointSize = 10.0;
  }
`;
const FShader = `
  void main() {
    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
  }
`;

In this case, the shader program is very simple, just setting the vertex position and size and the color of the fragment. Note that gl_Position is a built-in variable and must be assigned, otherwise the shader will not work properly.

Unlike JavaScript, GLSL ES is a strongly typed language, and variable types must be explicitly specified.

  • float: a floating-point number
  • vec4: a vector made up of four floating-point numbers

The vertex shader controls the point's position and size, and the fragment shader controls the point's color. Both start executing from their main function.

Create the shader objects, then load and compile the shader source: to get a shader that can be loaded onto the GPU and used to draw geometry, you need to create a shader object, load the source code into it, and then compile it. Since vertex and fragment shader objects are created the same way, this step can also be abstracted into a function:

const loadShader = (gl, type, source) => {
  const shader = gl.createShader(type);
  if (shader === null) {
    console.log('unable to create shader');
    return null;
  }
  gl.shaderSource(shader, source);
  gl.compileShader(shader);

  const compiled = gl.getShaderParameter(shader, gl.COMPILE_STATUS);
  if (!compiled) {
    const error = gl.getShaderInfoLog(shader);
    console.log('Failed to compile shader: ' + error);
    gl.deleteShader(shader);
    return null;
  }
  return shader;
};

  • Create a shader object with the gl.createShader method; the parameter is either gl.VERTEX_SHADER or gl.FRAGMENT_SHADER.
  • Load the source code into the shader object with the gl.shaderSource method; the first parameter is the shader object and the second is the shader source code.
  • After loading, call the gl.compileShader method to compile the shader.
  • Finally, check the compile status with the gl.getShaderParameter method; if compilation failed, delete the shader object with the gl.deleteShader method.

Create the program object, then attach and link the shader objects: once the shader objects exist, you need to create a program object.

const createProgram = (gl, vshader, fshader) => {
  const vertexShader = loadShader(gl, gl.VERTEX_SHADER, vshader);
  const flagShader = loadShader(gl, gl.FRAGMENT_SHADER, fshader);
  if (!vertexShader || !flagShader) {
    return null;
  }
  const program = gl.createProgram();
  if (!program) {
    return null;
  }
  gl.attachShader(program, vertexShader);
  gl.attachShader(program, flagShader);

  gl.linkProgram(program);
  const linked = gl.getProgramParameter(program, gl.LINK_STATUS);
  if (!linked) {
    const error = gl.getProgramInfoLog(program);
    console.log('Failed to link program: ' + error);
    gl.deleteProgram(program);
    gl.deleteShader(flagShader);
    gl.deleteShader(vertexShader);
    return null;
  }
  gl.useProgram(program);
  gl.program = program;
  return program;
};
  • Call the gl.createProgram method to create a program object.
  • Use the gl.attachShader method to attach the compiled vertex shader and fragment shader objects to the program object.
  • Call the gl.linkProgram method to link them; if linking succeeds, you have a usable program object.
  • Call the gl.useProgram method to tell the WebGL engine to use this program object when drawing.

After linking, the WebGL implementation binds the attributes used by the vertex shader and the fragment shader to generic attribute indices. The WebGL implementation reserves a fixed number of slots for vertex attributes, and a generic attribute index identifies one of those slots.
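To see how these helpers fit together before moving on to drawing, here is a minimal sketch of the initialization in onMounted, reusing getWebglContext, createProgram, VShader and FShader from above; the clearColor/clear calls mirror the ones used later, and the sketch is an illustration rather than the article's exact code:

let gl = null;

onMounted(() => {
  const canvas = root.value;
  gl = getWebglContext(canvas);
  // Compile, link and activate the shaders defined earlier
  createProgram(gl, VShader, FShader);
  // Set a black background and clear the canvas once before any clicks
  gl.clearColor(0.0, 0.0, 0.0, 1.0);
  gl.clear(gl.COLOR_BUFFER_BIT);
});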

Drawing: after this series of initialization steps, the shaders exist in the WebGL system. WebGL parses the shaders and identifies their attribute variables. Each variable has a storage location, and data is passed to a variable through that location. For example, to pass data to the vertex shader's a_Position variable, we first use the gl.getAttribLocation method to ask the WebGL system for the variable's location.

let a_Position = gl.getAttribLocation(gl.program, 'a_Position');

The first argument is the program object, since it contains both the vertex shader and the fragment shader, and the second argument is the name of the variable you want to look up.
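gl.getAttribLocation returns -1 when the variable cannot be found (for example, if its name is misspelled), so a small sanity check right after the line above is worth adding; this check is my addition rather than part of the original example:

if (a_Position < 0) {
  console.log('Failed to get the storage location of a_Position');
}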

Once you have the variable's location, you need to send data to it; in this example the data comes from mouse clicks. The information about where the mouse was clicked is stored in the event object e, and the coordinates can be read from e.clientX and e.clientY. But these coordinates cannot be used directly:

  • The coordinates of the mouse click are browser coordinates, not coordinates of the canvas element
  • The canvas coordinate system differs from the WebGL coordinate system: the origin position and the positive direction of the Y axis are different

First, convert the coordinates from browser coordinates to canvas coordinates, and then to WebGL coordinates:

const getBounding = (e) => {
  const canvas = root.value;
  let x = e.clientX;
  let y = e.clientY;
  const rect = e.target.getBoundingClientRect();
  x = ((x - rect.left) - canvas.width / 2) / (canvas.width / 2);
  y = (canvas.height / 2 - (y - rect.top)) / (canvas.height / 2);
  return {
    x,
    y
  };
};
  • Use the getBoundingClientRect method to get the canvas's bounding rectangle; rect.left and rect.top are the canvas's origin in browser coordinates, so (x - rect.left) and (y - rect.top) convert the browser coordinates into canvas coordinates.
  • To go from the canvas coordinate system to the WebGL coordinate system, first find the center of the canvas: (canvas.width / 2, canvas.height / 2).
  • Use (x - rect.left) - canvas.width / 2 and canvas.height / 2 - (y - rect.top) to shift the origin to the center of the canvas (this also flips the Y axis).
  • Finally, the canvas X axis runs from 0 to canvas.width and the Y axis from 0 to canvas.height, while WebGL coordinates run from -1.0 to 1.0 on each axis, so dividing the shifted x and y by the half-width and half-height completes the conversion (see the worked example below).
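For example, on a 400 × 400 canvas, a click at canvas coordinates (300, 100) becomes x = (300 - 200) / 200 = 0.5 and y = (200 - 100) / 200 = 0.5, i.e. a point in the upper-right quadrant of the WebGL coordinate system.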
const canvasDown = (e) => {
  const { x, y } = getBounding(e);
  g_points.push(x);
  g_points.push(y);
  gl.clear(gl.COLOR_BUFFER_BIT);
  const len = g_points.length;
  for (let i = 0; i < len; i += 2) {
    gl.vertexAttrib3f(a_Position, g_points[i], g_points[i + 1], 0.0);
    gl.drawArrays(gl.POINTS, 0, 1);
  }
};

Use the gl.vertexAttrib3f method to pass the data to the a_Position variable, then call the gl.drawArrays method to draw a point.

Look at the effect:

Refactoring the WebGL code

This works, but for an example that only renders a point wherever you click, the process is far too involved. And this boilerplate is the same for every WebGL application, so can it be abstracted away?

Moreover, as mentioned earlier, the shading language is a real programming language, yet in JavaScript it lives inside a plain string, with no tooling support of its own.

Based on these points, let's rework the WebGL program. Write the shaders in separate files: first install two loaders:

npm install glslify-loader raw-loader

Next, configure Webpack. In a project created with @vue/cli, Webpack is configured in the vue.config.js file.

module.exports = {
  chainWebpack: config => {
    config.module
      .rule('webgl')
      .test(/\.(glsl|vs|fs|vert|frag)$/)
      .exclude
        .add(/node_modules/)
        .end()
      .use('raw-loader')
        .loader('raw-loader')
        .end()
      .use('glslify-loader')
        .loader('glslify-loader')
        .end()
  }
}

After the configuration is complete, if you are not sure it took effect, you can print the final Webpack configuration with the following command:

vue inspect > output.js

This produces the output.js file, which contains the final Webpack configuration. Then install two VS Code extensions:

  • glsl-literal: for syntax highlighting
  • GLSL Lint: for linting

Now you can write the .glsl files.
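For reference, the two files are simply the inline shader strings from earlier moved into their own files, matching the vshader.glsl and fshader.glsl paths imported in the next snippet; shown here as a sketch:

// vshader.glsl
attribute vec4 a_Position;
void main() {
  gl_Position = a_Position;
  gl_PointSize = 10.0;
}

// fshader.glsl
void main() {
  gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
}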

Streamlining the WebGL workflow with the regl library:

npm install regl

Then bring it in (here via require):

import { defineComponent, ref, onMounted } from 'vue';
import VSHADER from './vshader.glsl';
import FSHADER from './fshader.glsl';
const regl = require('regl');

export const ClickedPointes = defineComponent({
    setup () {
        const root = ref(null);
        const g_points = [];
        let gl = null;
        let drawPoint = null;
        // ... getWebglContext and getBounding are the same as before
        const canvasDown = (e) => {
            const point = getBounding(e);
            gl.clear(gl.COLOR_BUFFER_BIT);
            g_points.push(point);
            g_points.forEach(item => {
                // pass the coordinates as the 'point' prop read by reglCtx.prop('point')
                drawPoint({ point: [item.x, item.y] });
            });
        };
        onMounted(() => {
          gl = getWebglContext(root.value);
          const reglCtx = regl(gl);
          drawPoint = reglCtx({
            frag: () => FSHADER,
            vert: () => VSHADER,
            count: 1,
            primitive: 'points',
            attributes: {
              'a_Position': reglCtx.prop('point')
            }
          });
          gl.clearColor(0.0, 0.0, 0.0, 1.0);

          // Clear <canvas>
          gl.clear(gl.COLOR_BUFFER_BIT);
        });
        return () => <canvas ref={root} onClick={canvasDown} width="400" height="400"></canvas>;
    }
});

Now all you need to do is configure the regl command, and a few lines of code replace all of the earlier boilerplate.

That wraps up a basic WebGL project. If there are better examples, please share them in the comments, I'm happy to learn from them!

In closing

For more articles, please visit my GitHub. If you like them, please give the repository a star; it is also a form of encouragement for the author.