This is the sixth day of my participation in the First Challenge 2022.

Preface

Following Hello WebGPU — Rotating Cube (Juejin), where we introduced how to manipulate a 3D object in WebGPU and learned that, besides filling in data when creating a buffer, we can also write data with the device.queue.writeBuffer API, today we continue working with the cube: this time we will texture it ~

Map the cube

Mapping the cube is divided into the following steps:

  1. “Dress pattern” – create the texture
  2. “Dress print” – copy the original image's pixel data into the texture object
  3. “Manufacturing specification” – create the sampler
  4. “Send express” – pass the texture and sampler to the GPU
  5. “Dress according to instructions” – render in the shader

Clothing proofing

Now let’s put some nice clothes on the bare cube. First we need to “make” the dress:

cubeTexture = device.createTexture({
  dimension: '2d',
  size: [imageBitmap.width, imageBitmap.height],
  format: 'rgba8unorm',
  usage:
    GPUTextureUsage.TEXTURE_BINDING |
    GPUTextureUsage.COPY_DST |
    GPUTextureUsage.RENDER_ATTACHMENT,
});


The meanings of parameters are as follows:

  • dimension: a texture can be '1d', '2d', or '3d'; here we use an ordinary 2D texture.
  • size: the extent of the texture, interpreted according to dimension; here, the width and height of the image.
  • format: 'rgba8unorm'
    • r, g, b, a: the red, green, blue, and alpha channels
    • unorm: unsigned normalized, i.e. values are mapped into the range 0 to 1. Besides unorm, the following suffixes exist:
      • snorm: signed normalized
      • uint: unsigned integer
      • sint: signed integer
      • float: floating point
  • usage: all three flags are needed here: TEXTURE_BINDING so the shader can sample the texture, COPY_DST so image data can be copied into it, and RENDER_ATTACHMENT because copyExternalImageToTexture requires it on the destination texture.
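To make the format and usage fields concrete, here is a plain-JavaScript sketch. The bit values are the ones the WebGPU spec defines for GPUTextureUsage; they are hard-coded here only so the snippet can run outside a browser, where you would use the global GPUTextureUsage object instead:

```javascript
// Spec-defined GPUTextureUsage bit flags (hard-coded for illustration).
const TEXTURE_BINDING = 0x04;   // sample the texture in a shader
const COPY_DST = 0x02;          // copy image data into the texture
const RENDER_ATTACHMENT = 0x10; // required by copyExternalImageToTexture

// usage is a bitmask: OR the flags together.
const usage = TEXTURE_BINDING | COPY_DST | RENDER_ATTACHMENT;
console.log(usage); // 22 (0x16)

// "8unorm": an unsigned 8-bit channel value is normalized into [0, 1].
const unorm = (byte) => byte / 255;
console.log(unorm(0), unorm(255), unorm(128).toFixed(3)); // 0 1 0.502
```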

Garment printing

Now the “proofing” of the clothes is finished, but there is no pattern on them yet. We need an image, and we need to copy its pixels over:

    device.queue.copyExternalImageToTexture(
      { source: imageBitmap },
      { texture: cubeTexture },
      [imageBitmap.width, imageBitmap.height]
    );


copyExternalImageToTexture takes three arguments:

  • source: a GPUImageCopyExternalImage object, which needs the following properties:
    • source (required): can only be an ImageBitmap, HTMLCanvasElement, or OffscreenCanvas. Note that an HTMLImageElement is not accepted here, although it is in WebGL: WebGPU requires an already decoded image, whereas WebGL decodes the image for you. Passing a pre-decoded ImageBitmap is in fact also a known trick for improving texture-upload performance in WebGL.
    • origin (optional): the point in the source image at which to start copying.
  • destination: a GPUImageCopyTextureTagged object, which needs the following properties:
    • texture (required): a GPUTexture.
    • origin (optional): where the copied data should be placed inside the destination texture.
  • copySize: the size of the texture region to copy.
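To see how the three arguments fit together, here is a small sketch. buildCopyArgs is a helper name made up for this illustration, and it only assembles plain objects, so no GPU is needed to run it:

```javascript
// Hypothetical helper: assembles the three arguments that
// device.queue.copyExternalImageToTexture expects.
function buildCopyArgs(imageBitmap, texture) {
  return [
    { source: imageBitmap },                  // GPUImageCopyExternalImage
    { texture },                              // GPUImageCopyTextureTagged
    [imageBitmap.width, imageBitmap.height],  // copySize
  ];
}

// Any object with width/height stands in for an ImageBitmap here.
const [src, dst, size] = buildCopyArgs({ width: 256, height: 128 }, { label: 't' });
console.log(size); // [ 256, 128 ]
```

In a real page you would then spread the result into the call: device.queue.copyExternalImageToTexture(...buildCopyArgs(imageBitmap, cubeTexture)).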

Manufacturing specification

Now the clothes are finished and on their way to the GPU, but the GPU may not know how to put on the new clothes, so we also need to send it an instruction sheet:

  const sampler = device.createSampler({
    magFilter: 'linear',
    minFilter: 'linear',
  });

Now the spec exists. What does it actually do, and how does it compare with what we would write in WebGL?

this.texture = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, this.texture);
gl.texImage2D(gl.TEXTURE_2D, level, internalFormat, srcFormat, srcType, img);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);

In WebGL, the instructions are sent to the GPU together with the product when the texture is created. In WebGPU, the instruction (the sampler) is sent separately, which has one advantage: it improves reusability. If another similar garment is sent to the GPU later, the previous instruction can simply be reused.
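As for what magFilter and minFilter actually change: 'nearest' picks the single closest texel, while 'linear' blends neighboring texels. Here is a plain-JavaScript 1-D illustration of the idea (not the WebGPU API, just the math a sampler performs):

```javascript
// A one-dimensional "texture" with two texel values.
const texels = [0, 100];

// 'nearest': snap the sample coordinate to the closest texel.
function sampleNearest(texels, x) {
  return texels[Math.min(texels.length - 1, Math.round(x))];
}

// 'linear': blend the two neighboring texels by distance.
function sampleLinear(texels, x) {
  const i = Math.floor(x);
  const t = x - i;
  const a = texels[i];
  const b = texels[Math.min(texels.length - 1, i + 1)];
  return a * (1 - t) + b * t;
}

console.log(sampleNearest(texels, 0.4)); // 0   (texel 0 wins)
console.log(sampleLinear(texels, 0.4));  // 40  (60/40 blend)
```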

Send the Courier

Now we come to the familiar part: sending the package. As in the Hello WebGPU — Rotating Cube (Juejin) article, we need to create a bind group to deliver the data to the GPU.

  const uniformBindGroup = device.createBindGroup({
    layout: pipeline.getBindGroupLayout(0),
    entries: [
      {
        binding: 0,
        resource: {
          buffer: uniformBuffer,
        },
      },
      {
        binding: 1,
        resource: sampler,
      },
      {
        binding: 2,
        resource: cubeTexture.createView(),
      },
    ],
  });

Pack these pieces of data together and wait for the final delivery (submitting the draw call) ~!
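As a sketch of that final delivery, the bind group is handed over inside a render pass roughly like this. The encoder methods are the real GPURenderPassEncoder API, but the function name recordDraw and its parameters are made up for this illustration; a full frame also needs beginRenderPass, end, and queue.submit:

```javascript
// Hypothetical sketch: recording the draw call that "delivers" the package.
function recordDraw(passEncoder, pipeline, uniformBindGroup, verticesBuffer, vertexCount) {
  passEncoder.setPipeline(pipeline);
  passEncoder.setBindGroup(0, uniformBindGroup); // hand over the bind group
  passEncoder.setVertexBuffer(0, verticesBuffer);
  passEncoder.draw(vertexCount);                 // submit the draw call
}
```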

Dress according to instructions

OK, after shipping we move over to the GPU side. The GPU receives the goods and the shader starts running; let's see how it reads the instructions and gets dressed.

[[group(0), binding(1)]] var mySampler: sampler;
[[group(0), binding(2)]] var myTexture: texture_2d<f32>;

[[stage(fragment)]]
fn main([[location(0)]] fragUV: vec2<f32>,
        [[location(1)]] fragPosition: vec4<f32>) -> [[location(0)]] vec4<f32> {
  return textureSample(myTexture, mySampler, fragUV) * fragPosition;
}


As you can see, we declare a variable mySampler of type sampler. The WGSL standard explains this:

Sampler. Mediates access to a sampled texture.

Simply put, the sampler mediates reads from the texture: how the texture is read depends on the sampler's configuration.

textureSample is a WGSL built-in function that samples a texture given a texture, a sampler, and texture coordinates. It is similar to WebGL's texture2D function, with the addition of an explicit sampler object.
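The `* fragPosition` at the end of the shader is a component-wise vec4 multiply. In plain JavaScript, the same math looks like this (just an illustration of the operation, not shader code; the sample values are made up):

```javascript
// Component-wise multiply of two vec4s, as WGSL's `*` does for vectors.
const mulVec4 = (a, b) => a.map((v, i) => v * b[i]);

const texel = [1.0, 0.5, 0.25, 1.0];       // color returned by textureSample
const fragPosition = [0.5, 1.0, 1.0, 1.0]; // interpolated position
console.log(mulVec4(texel, fragPosition)); // [ 0.5, 0.5, 0.25, 1 ]
```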

Finally, let’s see what our cube looks like in clothes.

Conclusion

Today we learned how to map the cube. In a nutshell, it is divided into the following steps:

  1. “Dress pattern” – create the texture
  2. “Dress print” – copy the original image's pixel data into the texture object
  3. “Manufacturing specification” – create the sampler
  4. “Send express” – pass the texture and sampler to the GPU
  5. “Dress according to instructions” – render in the shader

The remaining steps are similar to what we learned before, so today's content is not too complicated. If you're interested, give it a try yourself. Well, if you found this article useful, don't forget to like it. Your encouragement is the author's motivation to keep updating!