Using the G graphics renderer from Ant Group's AntV together with the audio data exposed by the Web Audio API's AudioContext, this project implements audio visualization effects similar to the ones in NetEase Cloud Music's player.

Project Address:

  • ⚡ GitHub
  • 💡 Demo

🍭 About the G renderer

G is an easy-to-use, efficient and powerful 2D rendering engine that provides Canvas, SVG and other rendering targets. Several major open source visualization projects are built on top of G, such as the grammar-of-graphics library G2 and the graph visualization library G6.

As a low-level renderer, it ships with many commonly used shapes, a complete simulation of DOM events, and a smooth animation implementation, which is exactly what we need to build these audio effects.

There are comparable alternatives to G, such as ECharts' ZRender. In my personal opinion ZRender provides a richer API, but it is a little harder to get started with than G, whose API is relatively simple.

There is also the heavyweight D3, which is lower level and has a richer API than either of the two above, but is harder to pick up. Some methods in G also appear to borrow ideas from D3's algorithms.

The official documentation of G (a small complaint here: it still has a lot of room for improvement; it is too terse, and many of the APIs are only mentioned without much explanation of how to use them).

🌟 Reading audio data with AudioContext

The prerequisite for any audio effect animation is getting the audio data of a track. After browsing a few approaches online, I found that AudioContext provides the relevant APIs.

Principle (a minimal sketch follows this list):

  • First, create an Analyser with AudioContext.createAnalyser()
  • Associate an audio source with the Analyser. The common source types are:
    • createMediaElementSource(): wraps an audio or video element (the approach chosen here)
    • createMediaStreamSource(): wraps an audio media stream from the local machine or the network
  • Create a volume node with createGain() and connect the Analyser to it, then connect the gain node to the AudioContext destination
  • Call AnalyserNode.getByteFrequencyData(), which copies the current frequency data into the Uint8Array passed in, to finally read the audio data
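
As a minimal sketch of this wiring in plain Web Audio code (before it is wrapped in a class below; the audio element lookup is illustrative):

const audioCtx = new AudioContext();
const analyser = audioCtx.createAnalyser();
analyser.fftSize = 256;                          // yields 128 frequency bins

// associate an <audio> element as the source
const audioEl = document.querySelector('audio') as HTMLAudioElement;
const source = audioCtx.createMediaElementSource(audioEl);
const gainNode = audioCtx.createGain();

source.connect(analyser);                        // source -> analyser
analyser.connect(gainNode);                      // analyser -> volume node
gainNode.connect(audioCtx.destination);          // volume node -> speakers

const data = new Uint8Array(analyser.frequencyBinCount);
analyser.getByteFrequencyData(data);             // copy the current frequency data into the array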

To encapsulate the above operations into a class for easy initialization, refer to the following code:

// src/plugins/MusicVisualizer.ts
const _analyser = new window.AudioContext(); // the shared AudioContext instance

type MusicVisualizerOptions = {
    audioEl?: HTMLAudioElement;
    size?: number;
}
export class MusicVisualizer {
    private analyser: AnalyserNode;
    private gainNode: GainNode;
    private audioSource?: MediaElementAudioSourceNode;
    private options: MusicVisualizerOptions & {
        size: number
    };
    private visualArr: Uint8Array;
    constructor(options?: MusicVisualizerOptions) {
        const defaultOptions = {
            size: 128
        }
        this.options = { ...defaultOptions, ...options }
        this.analyser = _analyser.createAnalyser();
        this.analyser.fftSize = this.options.size * 2;
        this.gainNode = _analyser.createGain();
        this.gainNode.connect(_analyser.destination);
        this.analyser.connect(this.gainNode);
        if (this.options.audioEl) {
            this.audioSource = _analyser.createMediaElementSource(this.options.audioEl)
            this.audioSource.connect(this.analyser)
        }
        this.visualArr = new Uint8Array(this.analyser.frequencyBinCount);
        this.resumeAudioContext();
    }
    // Newer versions of Chrome require a user interaction before audio can be played via JS
    private resumeAudioContext() {
        if (_analyser) {
            const resumeAudio = () => {
                if (_analyser.state === 'suspended') _analyser.resume();
                document.removeEventListener('click', resumeAudio)
            }
            document.addEventListener('click', resumeAudio)
        }
    }
    // Switch the audio element
    setAudioEl(el: HTMLAudioElement) {
        if (this.audioSource) {
            this.audioSource.disconnect(this.analyser)
        }
        this.audioSource = _analyser.createMediaElementSource(el)
        this.audioSource.connect(this.analyser)
    }
    // Get the frequency domain data
    getVisualizeValue() {
        this.analyser.getByteFrequencyData(this.visualArr)
        return this.visualArr;
    }
    // Change the volume
    changeVolumn(value: number) {
        this.gainNode.gain.value = value
    }
    // Tear down
    destory() {
        this.analyser.disconnect(this.gainNode);
        this.audioSource?.disconnect(this.analyser)
        this.gainNode.disconnect(_analyser.destination);
    }
}

With that in place, calling getVisualizeValue() returns the real-time frequency data; requestAnimationFrame is used to read and render it on every frame.
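
As a rough usage sketch (the draw() function and the audio element lookup are placeholders, not part of the project code):

import { MusicVisualizer } from './plugins/MusicVisualizer'

const visualizer = new MusicVisualizer({
  audioEl: document.querySelector('audio') as HTMLAudioElement,
  size: 128
})

// placeholder: update the G shapes for one frame based on the frequency data
function draw(data: Uint8Array) { /* ... */ }

function tick() {
  draw(visualizer.getVisualizeValue())  // Uint8Array with 128 values in the 0-255 range
  requestAnimationFrame(tick)
}
requestAnimationFrame(tick)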

Another thing to note: when the audio source is a network audio file, the audio data may not be readable. This is usually caused by cross-origin restrictions and requires adding the crossOrigin="anonymous" attribute to the audio tag. CDN resources rarely restrict the Access-Control-Allow-Origin header, but if a cross-origin error still appears after adding the attribute, the resource's URL really is restricted; in that case consider an Nginx reverse proxy or a server-side workaround.

<audio controls src={audioURL} crossOrigin="anonymous"></audio>

🌈 Visual effects implementation

The following describes how some of the project's effects are implemented.

Album picture rotation animation

Since every example needs the rotating album cover animation, creating the album cover is extracted into a shared helper for convenience.

To draw a circular image in G you need to use a clip; this is not covered in the documentation, but can be found on GitHub.

The rotation cannot be simulated directly with basic attributes, so a matrix transformation is used: shape.getMatrix() gets the initial matrix, and the matrix for each ratio is then computed with transform.

transform is a matrix transformation shortcut method provided alongside G (from @antv/matrix-util). It takes two parameters: the first is the current matrix, the second is an array of actions. The action sequence for rotating around a point (translate to the origin, rotate, translate back) is:

['t', -x, -y], ['r', radian], ['t', x, y]

The code reference is as follows:

import { Canvas } from "@antv/g-canvas";
import { ext } from '@antv/matrix-util';

const { transform } = ext // matrix transformation shortcut provided alongside G

type ImageCircleConfig = {
  x: number;
  y: number;
  r: number;
  shadowColor?: string
}
export function getImageCircle(canvas: Canvas, { x, y, r, shadowColor }: ImageCircleConfig) {
  const shadowConfig = shadowColor ? {
    shadowColor,
    shadowBlur: 16
  } : {}
  canvas.addShape('circle', {
    attrs: {
      x,
      y,
      r,
      fill: '#262626',
      ...shadowConfig
    }
  })
  const shape = canvas.addShape('image', {
    attrs: {
      x: x - r,
      y: y - r,
      width: 2 * r,
      height: 2 * r,
      img: `https://source.unsplash.com/random/${2 * r}x${2 * r}?Nature`
    }
  })
  shape.setClip({
    type: 'circle',
    attrs: {
      x,
      y,
      r
    }
  })
  // Rotation animation
  const matrix = shape.getMatrix()
  const radian = 2 * Math.PI // rotate 360 degrees
  shape.animate((ratio: number) => {
    return {
      matrix: transform(matrix, [
        ['t', -x, -y],
        ['r', radian * ratio],
        ['t', x, y],
      ])
    }
  }, {
    duration: 10000,
    repeat: true
  })
  // Pause the animation until playback starts
  setTimeout(() => {
    shape.pauseAnimate()
  })
  return shape
}

A point on a circle

Points on a circle come up constantly in the examples. For the bar effect (Example 1), the initial coordinates are 64 points evenly distributed around the circle.

The x and y offsets can be calculated from the angle between the current point and the center of the circle with some simple trigonometry.

As shown in the figure, l = cos(θ) * r and t = sin(θ) * r, so the coordinates of point A are obtained by adding these offsets to the coordinates of the center O.

// POINT_NUM = 64 bars
sArr.current = Array.from({ length: POINT_NUM }, (item, index: number) => {
    const deg = index * (360 / POINT_NUM) - 150;  // angle of the current point
    const l = Math.cos(deg * Math.PI / 180)       // offset coefficient in the x direction
    const t = Math.sin(deg * Math.PI / 180)       // offset coefficient in the y direction
    const r = R + OFFSET
    return (canvas.current as Canvas).addShape('rect', {
        attrs: {
            width: RECT_WIDTH,
            height: RECT_WIDTH,
            radius: RECT_WIDTH / 2,
            x: X + l * r - RECT_WIDTH / 2,
            y: Y + t * r - RECT_WIDTH / 2,
            fill: RECT_COLOR
        }
    }).rotateAtPoint(X + l * r, Y + t * r, (deg - 90) * Math.PI / 180)
})

Each bar then needs to be rotated so that it lines up around the circle; rotateAtPoint rotates it by the corresponding angle around its initial point.

Almost every example needs to compute the coordinates of points around the circle first, and this is how they are computed.
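
The later snippets also reference a helper like getPointByIndex; a minimal sketch of such a helper, assuming the same X, Y, R, OFFSET and POINT_NUM constants as above (the project's actual signature may differ):

// returns the [x, y] coordinates of sample point `index`, pushed outward by `offset`
function getPointByIndex(index: number, offset = 0): [number, number] {
  const deg = index * (360 / POINT_NUM) - 150   // angle of this sample point
  const l = Math.cos(deg * Math.PI / 180)       // x offset coefficient
  const t = Math.sin(deg * Math.PI / 180)       // y offset coefficient
  const r = R + OFFSET + offset                 // base radius plus the extra offset
  return [X + l * r, Y + t * r]
}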

Draw circles using Path

Some scenes need animations that deform the circle (Examples 2 and 3), which cannot be achieved with a circle shape, so a Path is used instead.

In the initial state, before playback starts, a plain circle is shown by default. Rather than creating a separate circle instance, the circle is drawn directly as a Path, and the same Path instance is then modified in the subsequent animations.

A circular Path can be generated from two arc commands; refer to the code below.

export function getCirclePath(cx: number, cy: number, r: number) {
  return `M ${cx - r}, ${cy}
  a ${r}, ${r} 0 1, 0 ${r * 2}, 0
  a ${r}, ${r} 0 1, 0 ${-r * 2}, 0`
}
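
As a rough usage sketch (the stroke attributes are illustrative, not taken from the project), the generated path is handed to a 'path' shape, and later animations simply write a new path onto the same instance via attr('path', ...):

const ring = canvas.addShape('path', {
  attrs: {
    path: getCirclePath(X, Y, R),  // the default circle shown before playback
    stroke: '#e9dcf7',             // illustrative color
    lineWidth: 2
  }
})
// in later frames: ring.attr('path', newPath)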

Forming a smooth curve through points

Simply connecting a set of points into a straight-segment line looks visually jarring, and the Path is still not smooth enough when it changes.

In this case we can use interpolation to insert intermediate points between consecutive target points and make the Path smoother; this is typically done with a cubic spline interpolation algorithm.

D3 has a number of curve interpolation algorithms built in that can be used directly. In this example, D3's curveCardinalClosed is used to generate a smooth Path through each group of points.

// s-path.tsx
import { line, curveCardinalClosed } from 'd3'
// some other code...
useEffect(() => {
    if (props.data?.length) {
        const pathArr: any[] = [[], [], [], []]
        getArray(props.data).map((item, index) => {
            pathArr[index % 4].push(getPointByIndex(index, item * item / 65025 * POINT_OFFSET + 4))
        })
        pathArr.map((item, index) => {
            // generate a smooth Path through the point group with D3's curveCardinalClosed interpolation
            const path = line().x((d: [number, number]) => d[0]).y((d: [number, number]) => d[1]).curve(curveCardinalClosed)(item)
            sPathArr.current[index]?.attr('path', path)
        })
    }
}, [ props.data ])

For other examples of curve smoothing algorithms, see a demo I wrote a while ago: Click Here

Points on a circle follow the circle as it enlarges

In the animation of Example 5, points on the circle move along with the circle as it expands. This can be implemented in two ways:

The first is to simulate the expanding circle with a Path, then in each animation frame call Path.getPoint(ratio: number) to get the coordinates of the corresponding point on the current circle.

The second is to compute the point's position for the current frame directly: with trigonometry, the circle's scale offset and the ratio, the current coordinates of the point can be calculated.

When I implemented the first approach the result was not ideal (I am not sure whether setTimeout was the cause), so I abandoned it and chose the second approach.
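
For reference, the first approach would have looked roughly like the sketch below, based on the Path.getPoint call described above (circlePath and dot are hypothetical shape instances):

// the expanding circle is a 'path' shape whose path is rewritten each frame
circlePath.animate((ratio: number) => ({
  path: getCirclePath(X, Y, R + ratio * CIRCLE_SCALE_OFFSET)
}), animateOption)

// the dot reads its position back from the growing path on each frame
dot.animate((ratio: number) => {
  const point = circlePath.getPoint(ratio)  // point on the path at this progress
  return { x: point.x, y: point.y }
}, animateOption)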

Some reference code is as follows:

Array.from({ length: CIRCLE_NUM }, (item, index) => {
    circleArrStart.current.push(false)
    // circle: the expanding circle
    circleArr.current.push(addCircle())
    circleArr.current[index].animate((ratio: number) => {
        return {
            r: R + ratio * CIRCLE_SCALE_OFFSET,
            // path: getCirclePath(X, Y, R + ratio * 80),
            opacity: ratio > 0.02 && ratio < 0.9 ? 0.8 - ratio * 0.8 : 0
        }
    }, animateOption)
    // circle-dot: the point on the expanding circle
    circleDotArr.current.push(addCircleDot())
    circleDotDegArr.current.push(0)
    circleDotArr.current[index].animate((ratio: number) => {
        if (props.data && ratio < 0.05 && !circleDotDegArr.current[index]) {
            circleDotDegArr.current[index] = pickStartPoint()
        } else if (ratio > 0.9) {
            circleDotDegArr.current[index] = 0
        }
        const deg = circleDotDegArr.current[index] + ratio * 360 - 180
        const l = Math.cos(deg * Math.PI / 180)
        const t = Math.sin(deg * Math.PI / 180)
        const r = R + ratio * CIRCLE_SCALE_OFFSET
        return {
            x: X + l * r,
            y: Y + t * r,
            r: DOT_R * (1 - ratio / 2),
            opacity: ratio > 0.05 && ratio < 0.9 ? 0.8 - ratio * 0.8 : 0
        }
    }, animateOption)
})

Implementing the particle effect

Example 6 is a particle effect, and it was also one of the most time-consuming examples to implement, so its implementation principle is worth walking through.

Initialize the album circle as in the other examples.

Next, prepare to initialize the particles. A circle is used as the particle shape, kept as small as possible. A shadow effect could be enabled, but performance suffers, so shadows are turned off here.

Then define the number of particles around each sample point. There are currently 64 audio sample points with 12 particles each (more is possible, at the cost of more computation), so the final particle count is 64 × 12.

Particles are generated around each sample point using random values: take the sample point's current angle and apply a small random offset so the particles are distributed evenly.

The hard part of a particle effect is the animation, namely choosing a suitable floating function. Here a sine function produces an even left-and-right drift, and setTimeout randomly delays each particle's creation time, so the particles float with a loose, regular rhythm.

When defining the particle animation, the actual x and y coordinates of each frame are computed from the sine function and the ratio. The current audio data is also factored in, so particles at an active sample point float a little higher, that is, get a larger offset, which requires adjusting the animation further.

// POINT_NUM = 64
// PARTICLE_NUM = 12
Array.from({ length: POINT_NUM }, (point, index1) => {
    Array.from({ length: PARTICLE_NUM }, (particle, index2) => {
        const deg = index1 * (360 / POINT_NUM) - 150 + (Math.random() - 0.5) * 10;
        const l = Math.cos(deg * Math.PI / 180)
        const t = Math.sin(deg * Math.PI / 180)
        const r = R + OFFSET
        const x = X + l * r
        const y = Y + t * r
        const particleShape = (canvas.current as Canvas).addShape('circle', {
            attrs: {
                x,
                y,
                r: 0.8,
                fill: '#fff',
                opacity: 0,
                // ⚠ enabling shadows drops frames
                // shadowColor: '#fcc8d9',
                // shadowBlur: 1
            }
        })
        particleShape.animate((ratio: number) => {
            const deg = index1 * (360 / POINT_NUM) - 150 + Math.sin(ratio * 20) * 4;
            const l = Math.cos(deg * Math.PI / 180)
            const t = Math.sin(deg * Math.PI / 180)
            const _index = POINT_NUM * index1 + index2
            if (particleActiveArr.current[_index]) {
                if (ratio < 0.02) {
                    particleActiveArr.current[_index] =
                        index1 >= currentActiveIndex.current - 1 && index1 <= currentActiveIndex.current + 1
                        ? POINT_ACTIVE_MOVE_LENGTH
                        : POINT_MOVE_LENGTH
                } else if (ratio > 0.98) {
                    particleActiveArr.current[_index] = POINT_MOVE_LENGTH
                }
            }
            const offset = particleActiveArr.current[_index] || POINT_MOVE_LENGTH
            return {
                x: x + l * ratio * offset,
                y: y + t * ratio * offset,
                opacity: 1 - ratio
            }
        }, {
            duration: POINT_CREATE_DELAY,
            repeat: true,
            easing: 'easeSinInOut'
        })
        particleArr.current.push(particleShape)
        particleStartArr.current.push(false)
        particleActiveArr.current.push(POINT_MOVE_LENGTH)
    })
})

✨ Other notes

This is a practice project built with Vite, React and TypeScript. I do not use React very often, so please point out anything that is wrong or badly written in the project.

And if you have a nice effect in mind, feel free to open an issue or PR to discuss how to implement it.

Project Github: Click Here

Project Demo: Click Here

Past articles I recommend

  • Vite + Vue3 develop a custom browser start page website
  • How to implement a lightweight breakpoint continuation personal web disk system