Let's look at the waveform-related source code in AudioKit's view file. It's pretty simple.

Take the audio samples and display them in a view. Done.

1. Get the audio sample data

The floating-point data looks like this:

    -0.014434814
    -0.016998291
    -0.0184021
    -0.017547607
    // ...

All the floating-point samples lie between -1 and 1.
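A quick sanity check of that range (just a sketch; the four values above stand in for the full sample array):

    // Sketch: verify the samples stay inside -1 ... 1
    let samples: [Float] = [-0.014434814, -0.016998291, -0.0184021, -0.017547607]
    if let lowest = samples.min(), let highest = samples.max() {
        print("range: \(lowest) ... \(highest)")
        assert(lowest >= -1 && highest <= 1)
    }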

1.1 Get the audio file

    let url = Bundle.main.resourceURL?.appendingPathComponent("Samples/beat.aiff")
    let file = try! AVAudioFile(forReading: url!)
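The force unwraps above will crash if the file is missing; a safer variant looks like this (my sketch, not AudioKit's code):

    // Sketch: load the same file without force unwrapping
    import AVFoundation

    func loadSampleFile() -> AVAudioFile? {
        guard let url = Bundle.main.resourceURL?
            .appendingPathComponent("Samples/beat.aiff") else { return nil }
        return try? AVAudioFile(forReading: url)
    }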

1.2 Get the data of the first channel

    public convenience init?(file: AVAudioFile) {
        let size = Int(file.length)
        self.init(count: size)
        guard let data = file.toFloatChannelData() else { return nil }
        for i in 0 ..< size {
            // Take each sample from the first channel
            self[i] = data[0][i]
        }
    }
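A minimal usage sketch, assuming the initializer above lives on AudioKit's Table class (the text doesn't name the type, so that part is a guess):

    // Sketch: build a table of samples from the file (Table is assumed from AudioKit)
    if let table = Table(file: file) {
        print("loaded \(table.count) samples from the first channel")
    }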
1.2.1 Get the floating-point data of the channels
    public func toFloatChannelData() -> FloatChannelData? {
        guard let pcmBuffer = toAVAudioPCMBuffer(),
              let data = pcmBuffer.toFloatChannelData() else { return nil }
        return data
    }

Convert the audio file to a PCM buffer:

    public func toAVAudioPCMBuffer() -> AVAudioPCMBuffer? {
        guard let buffer = AVAudioPCMBuffer(pcmFormat: processingFormat,
                                            frameCapacity: AVAudioFrameCount(length)) else { return nil }
        do {
            framePosition = 0
            // Read the whole file into the buffer
            try read(into: buffer)
        } catch let error as NSError {
            print(error.localizedDescription)
        }
        return buffer
    }
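For example (a quick sketch using only the method above):

    // Sketch: convert the audio file into a PCM buffer and inspect it
    if let buffer = file.toAVAudioPCMBuffer() {
        print("frames: \(buffer.frameLength), format: \(buffer.format)")
    }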

Get the floating-point data from the PCM buffer:

    public func toFloatChannelData() -> FloatChannelData? {
        guard let pcmFloatChannelData = floatChannelData else { return nil }
        let channelCount = Int(format.channelCount)
        let frameLength = Int(self.frameLength)
        let stride = self.stride
        // [Float](zeros:) is an AudioKit helper that makes a zero-filled array
        var result = Array(repeating: [Float](zeros: frameLength), count: channelCount)
        // Each frame holds one sample per channel:
        // stereo has a left channel and a right channel
        for channel in 0 ..< channelCount {
            // Take this channel's sample from every frame
            for sampleIndex in 0 ..< frameLength {
                result[channel][sampleIndex] = pcmFloatChannelData[channel][sampleIndex * stride]
            }
        }
        return result
    }
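Putting section 1 together, a sketch that uses only the calls shown above:

    // Sketch: file -> per-channel float arrays -> peek at the first samples
    if let channels = file.toFloatChannelData() {
        print("channels: \(channels.count)")              // 2 for a stereo file
        print("first samples: \(channels[0].prefix(4))")  // e.g. -0.0144, -0.0170, ...
    }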

2. Show the audio sample data in a view

Drawing is just connecting the dots, one after another.

Each point is an x coordinate and a y coordinate.

    public override func draw(_ rect: CGRect) {
        let width = Double(frame.width)
        let height = Double(frame.height) / 2.0
        let padding = 0.9
        let bezierPath = UIBezierPath()
        // Move to the starting point
        bezierPath.move(to: CGPoint(x: 0.0,
                                    y: (1.0 - Double(table[0]) / absmax) * height))
        for index in 1 ..< table.count {
            // Work out the coordinates of each point
            let xPoint = Double(index) / Double(table.count) * width
            let yPoint = (1.0 - Double(table[index]) / absmax * padding) * height
            bezierPath.addLine(to: CGPoint(x: xPoint, y: yPoint))
        }
        // Finish at the right edge
        bezierPath.addLine(to: CGPoint(x: Double(frame.width),
                                       y: (1.0 - Double(table[0]) / absmax * padding) * height))
        UIColor.green.setStroke()
        bezierPath.lineWidth = 1
        bezierPath.stroke()
    }
  • The x coordinate is easy to calculate

Given the rectangle, distribute the points evenly across its width:

    let xPoint = Double(index) / Double(table.count) * width

  • Compute the y coordinate

height is half the height of the rectangle:

    let height = Double(frame.height) / 2.0

First, move to the center line:

    1.0 * height

Then flip the y value: we're used to y growing upward, but in iOS the y axis grows downward by default.

    (1.0 - Double(table[index])) * height

Next, enlarge the y value (we'll shrink it back a little with padding afterwards).

absmax is the largest absolute value among the samples.

For example, if all the samples lie between 0.1 and 0.2 and you don't enlarge them, the waveform will look squashed and ugly; dividing by absmax stretches it out.

    (1.0 - Double(table[index]) / absmax) * height

Finally, multiply by padding to leave a little blank space at the top and bottom, so the waveform looks neat:

    (1.0 - Double(table[index]) / absmax * padding) * height
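A worked example of that final formula, with made-up numbers matching the 0.1 to 0.2 case above:

    // Worked example with made-up numbers, not the article's file
    let sample = 0.15        // table[index]
    let absmax = 0.2         // largest absolute sample value
    let padding = 0.9
    let height = 100.0       // half the view height
    let y = (1.0 - sample / absmax * padding) * height
    // = (1.0 - 0.675) * 100 = 32.5, a little above the center line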

2.1 Using the UIView from SwiftUI

Wrap it in a struct that conforms to the UIViewRepresentable protocol:

    struct TableDataView: UIViewRepresentable {
        // TableView, the actual rendered view, is a class
        typealias UIViewType = TableView
        var view: TableView

        func makeUIView(context: Context) -> TableView {
            view.backgroundColor = UIColor.black
            return view
        }

        func updateUIView(_ uiView: TableView, context: Context) {
            //
        }
    }
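A minimal way to use it from SwiftUI (a sketch; how TableView gets its table isn't shown above, so its construction here is assumed):

    // Sketch: embed the wrapped UIView in a SwiftUI hierarchy
    import SwiftUI

    struct ContentView: View {
        var body: some View {
            TableDataView(view: TableView())   // TableView's initializer is assumed here
                .frame(height: 200)
        }
    }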

The waveform of the other channel

For a normal audio file, the left-channel data is much the same as the right-channel data.

Use the other channel's data and look at the corresponding waveform:

    public convenience init?(file: AVAudioFile) {
        let size = Int(file.length)
        self.init(count: size)
        guard let data = file.toFloatChannelData() else { return nil }
        for i in 0 ..< size {
            // This time take each sample from the second channel
            self[i] = data[1][i]
        }
    }

Why does the waveform look like this?

The waveform should be a polyline, so why does it usually look like a solid lump?

In this example, the audio is 8.78 seconds long, the sample rate is 44,100 Hz, and there are two channels.

That gives 387,072 frames, while the rectangle in the picture is only 343 points wide,

so each point of width has to carry about 1,128 line segments,

and they all smear together into a lump.
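The arithmetic behind that, as a quick sketch with the article's numbers:

    // Sketch: how many samples land on each point of width
    let frameCount = 387_072.0   // frames in the 8.78 s file
    let viewWidth = 343.0        // width of the rectangle, in points
    print(frameCount / viewWidth)   // ≈ 1128.5 line segments per point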

github repo