What is audio visualization

Audio visualization, as the name suggests, is the process of taking the waveform, frequency, and other data from audio and converting it into images displayed on the screen. With it, we can build some cool front-end music interfaces.

Below, I will walk through an audio visualization case from the Cloud Music technology team to help beginners quickly build their own cool audio visualization interface.


Before developing such a cool audio interface, let's talk about canvas first.

What is the canvas

Canvas is a container element introduced in HTML5 for drawing graphics, usually driven by JavaScript. The audio visualization below only needs a handful of canvas methods, so we will first help beginners understand some canvas properties and methods by building a small page-countdown example.

Let's look at the code first:

	<canvas id="myCanvas">
    </canvas>
    <script>
        const canvas = document.getElementById('myCanvas');
        var ctx = canvas.getContext('2d');
        ctx.fillStyle = 'red';
        ctx.font = "50px Verdana";
        let dis = 550;
        let i = 10;
        function animation() {
            requestAnimationFrame(function () {
                if (dis >= 0) {
                    --dis;
                    if (dis % 50 === 0) {
                        ctx.clearRect(0, 0, 300, 150);
                        ctx.fillText(i--, 100, 100);
                    }
                    animation();
                }
            });
        }
        animation();
    </script>

We create a canvas element on the HTML page and set its id to myCanvas; its default size is 300 by 150 pixels. With the canvas created, we can draw graphics from our JavaScript script.

Find the element with getElementById(), get its 2D rendering context ctx, set the fill color to red and the font to 50px Verdana, define a dis variable for loop control, and define an i variable to hold the countdown number. Then we can start drawing the countdown numbers in the canvas.

Create an animation function. In this function we use requestAnimationFrame, an HTML5 API designed specifically for requesting animation frames. Compared with a setTimeout timer, it does not drop frames and looks smoother.

Inside requestAnimationFrame, while dis >= 0 we decrement dis (550 times in total) and perform one draw every 50 decrements. We clear a rectangle on the canvas with the clearRect(x, y, width, height) method; its four parameters are the x and y coordinates of the top-left corner of the rectangle to clear, and the width and height of the rectangle, in pixels. The number is then drawn with fillText(text, x, y, maxWidth): text is the string to output, and x and y are the coordinates at which to draw it. Note: both values are relative to the canvas. The last parameter, maxWidth, is optional and caps the rendered width of the text.

Then we recursively call the animation() function until the countdown is over, and finally call animation() once from the outside. At this point, a simple countdown interface is complete. We can also set the canvas size with window.innerWidth and window.innerHeight.
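To sanity-check the countdown logic above, here is a pure-JavaScript simulation (a hypothetical helper, not part of the project, runnable in Node) that collects which numbers get drawn as dis counts down:

```javascript
// Simulate the draw condition from the countdown example:
// dis counts down from 550, and a number is drawn whenever dis % 50 === 0.
function simulateCountdown(start, step) {
    var drawn = [];
    var i = 10;       // same starting number as the example
    var dis = start;
    while (dis >= 0) {
        --dis;
        if (dis % step === 0) {
            drawn.push(i--); // this frame would call fillText(i--)
        }
    }
    return drawn;
}

console.log(simulateCountdown(550, 50)); // [10, 9, 8, 7, 6, 5, 4, 3, 2, 1, 0]
```

This confirms the parameters give exactly eleven draws, counting 10 down to 0.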

Canvas can draw all kinds of graphics; refer to the canvas documentation for more information.

Now that we’re done talking about Canvas, it’s time to get down to business.

Web Audio

Before we begin, we also need to understand what Web Audio is.

Web Audio is a set of APIs for processing and analyzing audio on the Web. It lets users perform audio operations inside an audio context, features modular routing, and also lets us control the spatialization of audio.

With Web Audio, we can both fetch audio data and map it to visuals, which is what we will implement in the following sections.

Project implementation

We'll start by creating a Canvas element on the page, along with an audio tag and an a tag that acts as the play button.

Then get the audio and a elements in JavaScript and set the click event for the A tag.

		var btn = document.getElementById('play-btn');
        var audio = document.getElementById('audio');
        btn.addEventListener('click', function () {
            btn.style.display = 'none';
            audio.play();
            onloadAudio();
        });

In onloadAudio(), we get the canvas element, size it to fill the page, and then get its 2D rendering context.

		var canvas = document.getElementById('canvas');
        canvas.width = window.innerWidth;
        canvas.height = window.innerHeight;
        var ctx = canvas.getContext('2d');

Create an AudioContext object that controls the creation of the nodes it contains and the execution of audio processing and decoding operations.

		var audioCtx = new (window.AudioContext || window.webkitAudioContext)();

Create an **AnalyserNode** with the createAnalyser() method to obtain the audio's time and frequency data and achieve audio data visualization.

		var analyser = audioCtx.createAnalyser();
		analyser.fftSize = 512;

fftSize is the window size of the fast Fourier transform. According to MDN, its value must be a power of 2 between 32 and 32768; the default is 2048. The value of fftSize also determines the length of the frequency data.
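The relationship between fftSize and the data you get back can be made concrete with a small helper (a hypothetical illustration, not project code; the 44100 Hz sample rate is an assumption, matching a common AudioContext default):

```javascript
// How fftSize maps to analyser output: you get fftSize / 2 frequency bins,
// and each bin covers sampleRate / fftSize Hz of the spectrum.
function analyserInfo(fftSize, sampleRate) {
    var bins = fftSize / 2;             // equals analyser.frequencyBinCount
    var hzPerBin = sampleRate / fftSize; // frequency resolution of each bin
    return { bins: bins, hzPerBin: hzPerBin };
}

var info = analyserInfo(512, 44100);
console.log(info.bins);     // 256 frequency bins
console.log(info.hzPerBin); // 86.1328125 Hz per bin
```

So with fftSize = 512 we trade frequency resolution for fewer, chunkier bars, which suits a visualizer.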

Next, connect the audio node to the AudioContext as the input of the whole audio analysis. We use a MediaElementAudioSourceNode created from the audio element as the input source, connect it to the analyser, and then connect the analyser to the output device.

		var source = audioCtx.createMediaElementSource(audio);
        source.connect(analyser);
        analyser.connect(audioCtx.destination);

Next get the frequency array.

        var bufferLength = analyser.frequencyBinCount;
        var dataArray = new Uint8Array(bufferLength);

frequencyBinCount is a read-only property equal to half of fftSize, so the Uint8Array() here is 256 elements long.
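A quick Node-runnable look (a hypothetical demo, separate from the project code) at the buffer that getByteFrequencyData will fill: a Uint8Array holds byte values from 0 to 255, and out-of-range writes wrap around modulo 256.

```javascript
// The same buffer shape the analyser fills each frame.
var bufferLength = 256; // frequencyBinCount for fftSize = 512
var dataArray = new Uint8Array(bufferLength);

console.log(dataArray.length); // 256
dataArray[0] = 300;            // out-of-range writes wrap: 300 % 256
console.log(dataArray[0]);     // 44
```

This 0–255 range is convenient: each value can be used directly as a bar height in pixels.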

Then set the width of each bar; the height is only declared as a variable without a value, to be set dynamically from the dataArray[] array on every frame.

		var WIDTH = canvas.width;
        var HEIGHT = canvas.height;
        var barWidth = WIDTH / bufferLength * 1.5;
        var barHeight;
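Plugging in some hypothetical numbers (a 1024px-wide canvas is an assumption, not from the source) shows what the 1.5 factor does:

```javascript
// Bar width for a 1024px canvas and 256 frequency bins.
var WIDTH = 1024;       // assumed canvas.width
var bufferLength = 256; // frequencyBinCount for fftSize = 512
var barWidth = WIDTH / bufferLength * 1.5;
console.log(barWidth); // 6
```

Note that each bar also gets a 2px gap in the drawing loop, so 256 bars need 256 × 8 = 2048px; on a 1024px canvas only the lower-frequency half of the bars is on screen, which often looks fine for music since the highest bins tend to carry little energy.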

Draw the bars

Define a renderFrame() function to draw the sound bars; on each frame, clear the entire canvas and refresh the frequency array.

		ctx.clearRect(0, 0, WIDTH, HEIGHT);
        analyser.getByteFrequencyData(dataArray);

Use a for loop to set the height of each rectangle, derive a background color from the height, then draw and fill each rectangle. The function then schedules itself again so the bars update every frame.

		var x = 0;
        for (var i = 0; i < bufferLength; i++) {
            barHeight = dataArray[i];
            var r = barHeight + 25 * (i / bufferLength);
            var g = 250 * (i / bufferLength);
            var b = 50;
            ctx.fillStyle = "rgb(" + r + "," + g + "," + b + ")";
            ctx.fillRect(x, HEIGHT - barHeight, barWidth, barHeight);
            x += barWidth + 2;
        }
        requestAnimationFrame(renderFrame);
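The color mapping above can be pulled out into a standalone function (a hypothetical refactor, runnable in Node) to see what it produces: taller bars push the red channel up, and higher-frequency bins get redder and greener.

```javascript
// Same arithmetic as the loop body: color depends on bar height and bin index.
function barColor(barHeight, i, bufferLength) {
    var r = barHeight + 25 * (i / bufferLength);
    var g = 250 * (i / bufferLength);
    var b = 50;
    return "rgb(" + r + "," + g + "," + b + ")";
}

console.log(barColor(100, 128, 256)); // "rgb(112.5,125,50)"
```

The fractional channel values are harmless; browsers accept non-integer numbers inside rgb().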

Finally, run the code and experience your own visual music.

Summary

This article briefly introduces the use of Canvas and how to obtain the frequency data of Audio through the relevant API of Web Audio.

However, canvas and Web Audio can do far more than this. With some imagination and creativity, readers can develop even more interesting projects.

Attached project source: github.com/anpeier/les…