Recently I listened to a lovely song, “Flowers All the Way”, and it made me want to build a music spectrum visualization with three.js. The final effect looks like this:

The code is available here: github.com/QuarkGluonP…

There are two things you can learn from implementing this effect:

  • Decoding audio and processing it with the AudioContext API
  • Rendering a 3D scene with three.js

So what are we waiting for? Let’s get started.

Approach

To visualize a music spectrum, the first thing you need is the spectrum data, and that comes from the AudioContext API.

The AudioContext API decodes audio and runs it through a series of processing steps, each of which is called a node.

We need an analyser node to extract the spectrum data after decoding, and we then pass the audio on for playback. So there are three processing nodes: Source, Analyser, and Destination.

const audioCtx = new AudioContext();

const source = audioCtx.createBufferSource();
const analyser = audioCtx.createAnalyser();

audioCtx.decodeAudioData(audioData /* the audio binary data */, function(decodedData) {
    source.buffer = decodedData;
    source.connect(analyser);
    analyser.connect(audioCtx.destination);
});

The audio is decoded, a BufferSource node is created to hold the decoded data, the source is connected to the Analyser to extract the spectrum data, and the Analyser is connected to the Destination for playback.

Call source.start() to start feeding the audio data through, so the analyser can read the music spectrum and the destination plays the sound.

The analyser API for getting the audio spectrum data looks like this:

const frequencyData = new Uint8Array(analyser.frequencyBinCount);
analyser.getByteFrequencyData(frequencyData);

frequencyData holds analyser.frequencyBinCount values: 1024 by default, since frequencyBinCount is half of the default fftSize of 2048. If you average them in groups of 50, you’re left with only Math.ceil(1024 / 50) = 21 frequency bins.

Then you can plot the spectrum data using three.js.

Those 21 values can be plotted as 21 BoxGeometry cubes. For the material we use MeshPhongMaterial (named after Bùi Tường Phong, who proposed this reflection model), whose distinguishing feature is a shiny, light-reflecting surface; MeshBasicMaterial, by contrast, is not affected by lights at all.
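To make the difference concrete, here is a minimal sketch contrasting the two materials (the color is just an example value, not tied to the code below):

// MeshPhongMaterial is shaded per light and shows specular highlights
const lit = new THREE.MeshPhongMaterial({ color: 'yellowgreen' });

// MeshBasicMaterial renders a flat color and ignores all lights in the scene
const unlit = new THREE.MeshBasicMaterial({ color: 'yellowgreen' });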

Then we add the petal rain effect we built in an earlier article: Sprites (planes that always face the camera) carry the petal textures, and their positions are updated frame by frame.

Get an introduction to three.js by playing “A Shower of Flowers”

Then set the light and camera respectively:

For lighting, we use a PointLight, which shines outward from a single position. Combined with the Phong material, it produces the reflective highlights.

For the camera we use a PerspectiveCamera. It views the scene from a single point, so nearby objects appear large and distant ones small, which gives a stronger sense of depth. An OrthographicCamera, by contrast, uses parallel projection: there is no near-large, far-small effect, and objects are the same size no matter how far away they are.
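A quick sketch of the two constructors (the values here are illustrative, not the ones used later):

// perspective: field of view, aspect ratio, near/far clipping planes; near looks large, far looks small
const perspectiveCamera = new THREE.PerspectiveCamera(45, width / height, 0.1, 1000);

// orthographic: left/right/top/bottom/near/far define a box; same size at any distance
const orthographicCamera = new THREE.OrthographicCamera(-width / 2, width / 2, height / 2, -height / 2, 0.1, 1000);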

The scene is then rendered by the Renderer and refreshed frame by frame using requestAnimationFrame.

Next we write the code:

Code implementation

We first fetch the audio file from the server and turn it into an ArrayBuffer.

ArrayBuffer is an API for storing binary data. It is similar to Blob and Buffer (the sketch after the list below shows how the three relate):

  • ArrayBuffer is a generic API for storing binary data provided by the JS language itself
  • Blob is a browser-provided API for file processing
  • Buffer is an API provided by Node.js for IO operations
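As a rough sketch of how the three relate (the Buffer line only runs under Node.js, and the variable names are just for illustration):

const bytes = new Uint8Array([1, 2, 3]);
const arrayBuffer = bytes.buffer;          // plain JS: a raw chunk of binary data
const blob = new Blob([arrayBuffer]);      // browser: a file-flavored wrapper around binary data
// const buf = Buffer.from(arrayBuffer);   // Node.js: a byte view used for IO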

Here, of course, we use an ArrayBuffer to store the audio binary.

fetch('./music/Flowers All the Way.mp3')
    .then(function(response) {
        if (!response.ok) {
            throw new Error("HTTP error, status = " + response.status);
        }
        return response.arrayBuffer();
    })
    .then(function(arrayBuffer) {
        // ... decode and process the ArrayBuffer here
    });

Then use the AudioContext API for the decoding and subsequent processing, divided across the Source, Analyser, and Destination nodes:

let audioCtx = new AudioContext();
let source, analyser;

function getData() {
    source = audioCtx.createBufferSource();
    analyser = audioCtx.createAnalyser();

    return fetch('./music/Flowers All the Way.mp3')
        .then(function(response) {
            if (!response.ok) {
                throw new Error("HTTP error, status = " + response.status);
            }
            return response.arrayBuffer();
        })
        .then(function(arrayBuffer) {
            // return the decode promise, so callers only continue once the buffer is ready
            return audioCtx.decodeAudioData(arrayBuffer).then(function(decodedData) {
                source.buffer = decodedData;
                source.connect(analyser);
                analyser.connect(audioCtx.destination);
            });
        });
}

Now we have the audio and can process it with the AudioContext, but we can’t play it straight away: browsers restrict autoplay, so audio may only start after the user has interacted with the page.

To work within this limitation, we listen for the mousedown event and start playback on the user’s first click.

function triggerHandler() {
    getData().then(function() {
        source.start(0); // start playing from position 0

        create();  // create the various three.js objects
        render();  // start rendering
    });
    document.removeEventListener('mousedown', triggerHandler);
}
document.addEventListener('mousedown', triggerHandler);
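One caveat the original code doesn’t handle: some browsers put an AudioContext created before any user gesture into a “suspended” state, where it stays silent until explicitly resumed. A hedged addition for the click handler would be:

// assumption: the context may have been auto-suspended before the first user gesture
if (audioCtx.state === 'suspended') {
    audioCtx.resume();
}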

You can then create various objects in the 3D scene:

Create the cubes:

Since the spectrum has 1024 values, averaging them in groups of 50 means we only need to render 21 cubes:

const cubes = new THREE.Group();

const STEP = 50;
const CUBE_NUM = Math.ceil(1024 / STEP);

for (let i = 0; i < CUBE_NUM; i++) {
    const geometry = new THREE.BoxGeometry(10, 10, 10);
    const material = new THREE.MeshPhongMaterial({ color: 'yellowgreen' });
    const cube = new THREE.Mesh(geometry, material);

    cube.translateX((10 + 10) * i);

    cubes.add(cube);
}
cubes.translateX(-(10 + 10) * CUBE_NUM / 2);

scene.add(cubes);

Each cube is a Mesh whose geometry is a BoxGeometry with length, width, and height of 10, and whose material is a MeshPhongMaterial with the color yellowgreen.

Each cube is displaced along the X-axis by its index times the spacing, and the whole group is then shifted by half its total width to center it (with 21 cubes spaced 20 units apart, the group spans 420 units, so it moves by -210).

The spectrum can be visualized through these cubes.

Then come the petals, created with Sprite, a plane that always faces the camera. Each one gets a random texture map and a random position.

const FLOWER_NUM = 400;
/**
 * petal group
 */
const petal = new THREE.Group();

var flowerTexture1 = new THREE.TextureLoader().load("img/flower1.png");
var flowerTexture2 = new THREE.TextureLoader().load("img/flower2.png");
var flowerTexture3 = new THREE.TextureLoader().load("img/flower3.png");
var flowerTexture4 = new THREE.TextureLoader().load("img/flower4.png");
var flowerTexture5 = new THREE.TextureLoader().load("img/flower5.png");
var imageList = [flowerTexture1, flowerTexture2, flowerTexture3, flowerTexture4, flowerTexture5];

for (let i = 0; i < FLOWER_NUM; i++) {
    var spriteMaterial = new THREE.SpriteMaterial({
        map: imageList[Math.floor(Math.random() * imageList.length)],
    });
    var sprite = new THREE.Sprite(spriteMaterial);
    petal.add(sprite);

    sprite.scale.set(40, 50, 1);
    sprite.position.set(2000 * (Math.random() - 0.5), 500 * Math.random(), 2000 * (Math.random() - 0.5));
}

scene.add(petal);

After adding the spectrum cubes and a swarm of petals to the scene, all the objects are created.

Then set up the camera. We use a PerspectiveCamera, specifying the field of view, the aspect ratio of the viewing area, and the nearest and farthest visible distances:

const width = window.innerWidth;
const height = window.innerHeight;

const camera = new THREE.PerspectiveCamera(45, width / height, 0.1, 1000);
camera.position.set(0, 300, 400);
camera.lookAt(scene.position);

Then set up the light with a point light source:

const pointLight = new THREE.PointLight(0xffffff);
pointLight.position.set(0, 300, 40);
scene.add(pointLight);

The renderer can then draw the scene frame by frame in combination with requestAnimationFrame.

const renderer = new THREE.WebGLRenderer();

function render() {
    renderer.render(scene, camera);
    requestAnimationFrame(render);
}
render();

During rendering, each frame recalculates the positions of the petals and the heights of the spectrum cubes.

Each petal falls continuously until it drops below a certain height, then returns to the top:

petal.children.forEach(sprite => {
    sprite.position.y -= 5;
    sprite.position.x += 0.5;
    if (sprite.position.y < -height / 2) {
        sprite.position.y = height / 2;
    }
    if (sprite.position.x > 1000) {
        sprite.position.x = -1000;
    }
});

For the spectrum cubes, use the analyser to fetch the latest spectrum data, compute the average of each group, and assign it to each cube’s scale.y.

// get the spectrum data
const frequencyData = new Uint8Array(analyser.frequencyBinCount);
analyser.getByteFrequencyData(frequencyData);

// compute the average of each group
const averageFrequencyData = [];
for (let i = 0; i < frequencyData.length; i += STEP) {
    // the last group may be shorter than STEP, since 1024 is not a multiple of 50
    const end = Math.min(i + STEP, frequencyData.length);
    let sum = 0;
    for (let j = i; j < end; j++) {
        sum += frequencyData[j];
    }
    averageFrequencyData.push(sum / (end - i));
}

// set each cube's scale.y
for (let i = 0; i < averageFrequencyData.length; i++) {
    cubes.children[i].scale.y = Math.floor(averageFrequencyData[i] * 0.4);
}

You can also rotate the scene around the X-axis, turning it by a small angle each frame.

scene.rotateX(0.005);

Finally, add the orbit controls (OrbitControls). They let you use the mouse to adjust the camera: the viewing distance, the angle, and so on.

const controls = new THREE.OrbitControls(camera);
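A side note: the script-tag build used here lets OrbitControls default to listening on the whole document. In current three.js releases, OrbitControls ships as an ES module and is usually given the renderer’s canvas explicitly. A sketch of that form, assuming a module setup:

import { OrbitControls } from 'three/examples/jsm/controls/OrbitControls.js';

const controls = new OrbitControls(camera, renderer.domElement);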

The end result is this: the petals fly and the spectral cube dances to the music.

The full code is up on GitHub:

github.com/QuarkGluonP…

Here’s a full copy as well:

<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <title>Music spectrum visualization</title>
    <style>
        body {
            margin: 0;
            overflow: hidden;
        }
    </style>
    <script src="./js/three.js"></script>
    <script src="./js/OrbitControls.js"></script>
</head>
<body>
<script>
    let audioCtx = new AudioContext();
    let source, analyser;

    function getData() {
        source = audioCtx.createBufferSource();
        analyser = audioCtx.createAnalyser();

        return fetch('./music/Flowers All the Way.mp3')
            .then(function(response) {
                if (!response.ok) {
                    throw new Error("HTTP error, status = " + response.status);
                }
                return response.arrayBuffer();
            })
            .then(function(arrayBuffer) {
                // return the decode promise, so start() only runs after the buffer is ready
                return audioCtx.decodeAudioData(arrayBuffer).then(function(decodedData) {
                    source.buffer = decodedData;
                    source.connect(analyser);
                    analyser.connect(audioCtx.destination);
                });
            });
    }

    function triggerHandler() {
        getData().then(function() {
            source.start(0);
            create();
            render();
        });
        document.removeEventListener('mousedown', triggerHandler);
    }
    document.addEventListener('mousedown', triggerHandler);

    const STEP = 50;
    const CUBE_NUM = Math.ceil(1024 / STEP);
    const FLOWER_NUM = 400;

    const width = window.innerWidth;
    const height = window.innerHeight;

    const scene = new THREE.Scene();

    const camera = new THREE.PerspectiveCamera(45, width / height, 0.1, 1000);

    const renderer = new THREE.WebGLRenderer();
    /**
     * petal group
     */
    const petal = new THREE.Group();

    /**
     * spectrum cubes
     */
    const cubes = new THREE.Group();

    function create() {
        const pointLight = new THREE.PointLight(0xffffff);
        pointLight.position.set(0, 300, 40);
        scene.add(pointLight);

        camera.position.set(0, 300, 400);
        camera.lookAt(scene.position);

        renderer.setSize(width, height);
        document.body.appendChild(renderer.domElement);

        renderer.render(scene, camera);

        for (let i = 0; i < CUBE_NUM; i++) {
            const geometry = new THREE.BoxGeometry(10, 10, 10);
            const material = new THREE.MeshPhongMaterial({ color: 'yellowgreen' });
            const cube = new THREE.Mesh(geometry, material);
            cube.translateX((10 + 10) * i);
            cube.translateY(1);

            cubes.add(cube);
        }
        cubes.translateX(-(10 + 10) * CUBE_NUM / 2);

        var flowerTexture1 = new THREE.TextureLoader().load("img/flower1.png");
        var flowerTexture2 = new THREE.TextureLoader().load("img/flower2.png");
        var flowerTexture3 = new THREE.TextureLoader().load("img/flower3.png");
        var flowerTexture4 = new THREE.TextureLoader().load("img/flower4.png");
        var flowerTexture5 = new THREE.TextureLoader().load("img/flower5.png");
        var imageList = [flowerTexture1, flowerTexture2, flowerTexture3, flowerTexture4, flowerTexture5];

        for (let i = 0; i < FLOWER_NUM; i++) {
            var spriteMaterial = new THREE.SpriteMaterial({
                map: imageList[Math.floor(Math.random() * imageList.length)],
            });
            var sprite = new THREE.Sprite(spriteMaterial);
            petal.add(sprite);

            sprite.scale.set(40, 50, 1);
            sprite.position.set(2000 * (Math.random() - 0.5), 500 * Math.random(), 2000 * (Math.random() - 0.5));
        }

        scene.add(cubes);
        scene.add(petal);
    }

    function render() {
        petal.children.forEach(sprite => {
            sprite.position.y -= 5;
            sprite.position.x += 0.5;
            if (sprite.position.y < -height / 2) {
                sprite.position.y = height / 2;
            }
            if (sprite.position.x > 1000) {
                sprite.position.x = -1000;
            }
        });

        const frequencyData = new Uint8Array(analyser.frequencyBinCount);
        analyser.getByteFrequencyData(frequencyData);

        const averageFrequencyData = [];
        for (let i = 0; i < frequencyData.length; i += STEP) {
            // the last group may be shorter than STEP, since 1024 is not a multiple of 50
            const end = Math.min(i + STEP, frequencyData.length);
            let sum = 0;
            for (let j = i; j < end; j++) {
                sum += frequencyData[j];
            }
            averageFrequencyData.push(sum / (end - i));
        }
        for (let i = 0; i < averageFrequencyData.length; i++) {
            cubes.children[i].scale.y = Math.floor(averageFrequencyData[i] * 0.4);
        }

        scene.rotateX(0.005);
        renderer.render(scene, camera);

        requestAnimationFrame(render);
    }

    const controls = new THREE.OrbitControls(camera);

</script>
</body>
</html>

Conclusion

In this article we learned how to do spectral visualization of audio.

First of all, the audio data is fetched with fetch and stored in an ArrayBuffer, the standard JS API for storing binary data. Similar APIs are Blob and Buffer: Blob is the browser’s API for binary data in files, and Buffer is the Node.js API for IO data.

We then use the AudioContext API, which is organized as a series of nodes, to get the spectrum data and play the audio: the Source holds the audio data, the Analyser extracts the spectrum data, and the Destination plays it.

After that, the 3D scene is drawn. The spectrum cubes and the petal rain use Mesh and Sprite respectively. A Mesh is an object composed of a geometry and a material; here we used BoxGeometry and the reflective MeshPhongMaterial. A Sprite is a plane that always faces the camera, which is what the petals are displayed with.

Then we set a point light; together with the Phong material, it produces the reflective effect.

Using a perspective camera gives the 3D near-large, far-small perspective effect; an orthographic camera can’t do this, since it uses parallel projection and objects stay the same size no matter how far away they are.

Then, in each frame of rendering, we change the petals’ positions and fetch fresh spectrum data to update the cubes’ scale.y.

In this article we learned not only how to get audio spectrum data with the AudioContext, but also how to do 3D rendering with three.js. The combination of data and rendering is what visualization is all about: presenting data in a way that shows it off better.

Visualization is one application of three.js; games are another, and we’ll explore those later.