Foreword:

Recently, while researching new technology, I happened to see someone on Nuggets sharing a series of H5 technology applications, mainly covering core H5 device interfaces such as audio, media streams, user location, and ambient light, so I decided to work through that series' outline myself. This article uses the audio object AudioContext to build a small DEMO. Of course, I also looked at other people's DEMOs along the way. I hope you enjoy it ~

As usual, I'm posting my blog address as well; don't forget to give it a like.

Result:

Git address:

Github.com/ry928330/we…

Function Description:

1. Visually display the rhythm of the music loaded on the page.
2. Click the mode switch to display the current music's rhythm as a time-domain waveform.

Implementation details:

The whole DEMO is relatively simple and is mainly an exercise in some Web Audio-related APIs, so I will introduce the APIs used in this DEMO. One of the trickier parts is the falling effect of the little square that sits on top of each bar in the bar chart, and I'll explain that as well. All the animation you see on the page is drawn on a canvas, but I won't go into the canvas drawing itself. Now, on to the subject.

Create the audio context (the sound environment) as follows:

window.AudioContext = window.AudioContext || window.webkitAudioContext || window.mozAudioContext;
var audioContext = new AudioContext(); 
var analyser = audioContext.createAnalyser(); 
analyser.fftSize = 2048;
var audio = document.getElementById('audio');
var audioSrc = audioContext.createMediaElementSource(audio); 
audioSrc.connect(analyser);
analyser.connect(audioContext.destination);

It's only a few lines of code; let me explain them one by one.

1.window.AudioContext = window.AudioContext || window.webkitAudioContext || window.mozAudioContext;
2.var audioContext = new AudioContext(); 

Statements 1 and 2 create the audio context in which all subsequent operations on the audio take place, analogous to getting a drawing context from a canvas with getContext. The different prefixes are there for compatibility with different browsers.

3.var analyser = audioContext.createAnalyser();
4.analyser.fftSize = 2048;

Statement 3 creates an AnalyserNode, which you can use to obtain the time-domain and frequency-domain data of the audio signal. Statement 4 then sets the number of FFT (fast Fourier transform) sample points used for the frequency analysis. The value of the fftSize attribute must be a power of 2 between 32 and 32768; the default is 2048. We don't actually need that many points, so later we only select some of them to show the rhythm of the music.
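As a quick illustration (this snippet is just for reference, not part of the DEMO), frequencyBinCount is always half of fftSize, and fftSize rejects values that are not a power of 2:

analyser.fftSize = 2048;
console.log(analyser.frequencyBinCount); // 1024 frequency bins available for visualization
// analyser.fftSize = 1000; // would throw: fftSize must be a power of 2 between 32 and 32768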

5.var audioSrc = audioContext.createMediaElementSource(audio);

Statement 5 creates a MediaElementAudioSourceNode associated with an HTMLMediaElement, which lets you play and process the audio coming from a video or audio element. Similarly, the createMediaStreamSource() interface can be used to take audio from a MediaStream, for example from the computer's microphone or another source.
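For example, a minimal sketch of the microphone variant might look like this (not part of this DEMO; it assumes the user grants microphone permission and reuses the audioContext and analyser created above):

navigator.mediaDevices.getUserMedia({ audio: true }).then(function (stream) {
    // Feed the analyser from the microphone instead of the <audio> element
    var micSrc = audioContext.createMediaStreamSource(stream);
    micSrc.connect(analyser);
    // Usually you would not connect this source to audioContext.destination,
    // otherwise the microphone input would be played back through the speakers
});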

6.audioSrc.connect(analyser);
7.analyser.connect(audioContext.destination);

Statements 6 and 7 connect the source to the analyser and the analyser to the destination of the audio context. audioContext.destination returns an AudioDestinationNode, the interface that represents the final output of an audio graph in a given context, usually the speakers. In other words, the signal now flows from the audio element through the source node and the analyser to the audio output device.

Get the audio data and render the visual rhythm interface:

After creating the environment, we need to grab the relevant audio data and render the visualization on the canvas. The code is as follows:

var array = new Uint8Array(analyser.frequencyBinCount); 
analyser.getByteFrequencyData(array);

analyser.frequencyBinCount is half of fftSize and is usually the number of data values you will use for the visualization. The analyser.getByteFrequencyData(array) call then copies the current frequency-domain data into the Uint8Array. Now for the tricky part of this DEMO: how to render the falling square on top of each bar.

var barWidth = 10;
var gap = 2;
var barNum = Math.round(cWidth / (barWidth + gap)); // how many bars to draw
for (var i = 0; i < barNum; i++) {
    var value = array[i];
    if (topArr.length < barNum) {
        topArr.push(value);
    }
    if (value < topArr[i]) {
        // the stored cap is higher than the current value: lower the cap by 1 each frame (falling effect)
        ctx.fillRect(i * (barWidth + gap), cHeight - (topArr[i] -= 1) - topHeight, barWidth, topHeight);
    } else {
        // new peak: draw the cap just above the bar and remember the new value
        ctx.fillRect(i * (barWidth + gap), cHeight - value - topHeight, barWidth, topHeight);
        topArr[i] = value;
    }
}

We calculate the number of bars to draw from the canvas width and the bar width and gap, and this value determines how many of the audio data points we select. For the top square, we keep an array whose entries are initialized with the audio data values. If the new audio value is larger than the stored one, we draw the square at the new value and update the stored value; conversely, if the new value is smaller, we decrease the stored value a little on every frame, which produces the falling effect of the little square. Finally, we render the bar itself as follows:

var grd = ctx.createLinearGradient(i * barWidth, cHeight - value, i * barWidth, cHeight);
grd.addColorStop(0, "yellow");
grd.addColorStop(0.3, "rgb(255,0,0)");
grd.addColorStop(0.5, "rgb(200,0,0)");
grd.addColorStop(0.7, "rgb(150,20,20)");
grd.addColorStop(1, "rgb(100,0,0)");
ctx.fillStyle = grd;
ctx.fillRect(i * (barWidth + gap), cHeight - value, barWidth, topArr[i]);
ctx.fill();

Here, we added a linear gradient to the color of the bar to make it look slightly nicer.
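All of the drawing above runs once per animation frame. A minimal sketch of that outer loop, assuming the analyser, ctx, cWidth and cHeight variables from the snippets above (topArr and topHeight are the ones used by the bar-drawing code, and the topHeight value here is just an illustrative choice), could look like this:

var topArr = [];   // caps (the little falling squares) above each bar
var topHeight = 2; // cap height in pixels, illustrative value
function render() {
    var array = new Uint8Array(analyser.frequencyBinCount);
    analyser.getByteFrequencyData(array);  // refresh the frequency data
    ctx.clearRect(0, 0, cWidth, cHeight);  // wipe the previous frame
    // ... draw the bars, caps and gradients as shown above ...
    requestAnimationFrame(render);         // schedule the next frame
}
render();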

Switching to waveform mode:

The following call copies the current time-domain data into the array; the rest of the drawing is done with the canvas moveTo and lineTo APIs, which I won't go into in detail.

analyser.getByteTimeDomainData(array);
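For reference, a minimal sketch of drawing that time-domain data with moveTo and lineTo might look like this (again assuming the ctx, cWidth and cHeight variables from the bar-chart code; this is not the DEMO's exact drawing code):

var array = new Uint8Array(analyser.frequencyBinCount);
analyser.getByteTimeDomainData(array);  // values are 0-255, with 128 as the zero line
ctx.clearRect(0, 0, cWidth, cHeight);
ctx.beginPath();
var sliceWidth = cWidth / array.length;
for (var i = 0; i < array.length; i++) {
    var y = (array[i] / 255) * cHeight;  // map 0-255 onto the canvas height
    if (i === 0) {
        ctx.moveTo(0, y);
    } else {
        ctx.lineTo(i * sliceWidth, y);
    }
}
ctx.stroke();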
