Recording: we’re going to need the microphone

Every app on an iOS device has an Audio Session.

When an app works with audio, it naturally uses the device’s audio hardware.

So the Audio Session is all about managing audio behavior.

iOS manages audio with very fine granularity.

What do you think: should the music playing in the background be mixed with the audio of your app?
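If the answer should be yes, the session’s category options decide it. A minimal sketch, not from the original article, assuming playback-only audio: the `.mixWithOthers` option lets other apps’ background music keep playing alongside yours.

```swift
import AVFoundation

let session = AVAudioSession.sharedInstance()
do {
    // .mixWithOthers: do not interrupt audio from other apps
    try session.setCategory(.playback, options: [.mixWithOthers])
    try session.setActive(true)
} catch {
    print("Audio session error: \(error.localizedDescription)")
}
```

Without `.mixWithOthers`, activating the session silences other apps’ audio, which is the default behavior described below.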

The Audio Session controls audio behavior through its category, the Audio Session Category setting.

The default category, soloAmbient, behaves as follows:

1. Playback is allowed; recording is not.

2. When the silent switch is on, your app is muted: audio plays but makes no sound.

3. When the screen is locked, your app is likewise muted.

4. If other apps are playing audio in the background, they are silenced when your app starts playing audio.

There are more categories, as shown below: ambient, soloAmbient (the default), playback, record, playAndRecord, and multiRoute.

The first thing we need to do is some configuration for audio.

In general, we use the AVFoundation framework to work with audio, so let’s import AVFoundation.

Set up the Audio Session with the category option AVAudioSession.CategoryOptions.defaultToSpeaker, which lets our app record from the built-in microphone and play audio.

Since we are building a recording feature here, the category option changes too.

By default, the category routes audio to the receiver, the small earpiece at the top of the phone, for the scenario where you hold the phone to your ear on a call.

Now we point the audio route to the speaker, at the bottom of the phone next to the microphone.
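As a side note, a sketch of an alternative (assuming the session is already configured): besides the `.defaultToSpeaker` category option, you can also redirect the route at runtime with `overrideOutputAudioPort(_:)`.

```swift
import AVFoundation

let session = AVAudioSession.sharedInstance()
do {
    // Send output to the bottom speaker instead of the earpiece
    try session.overrideOutputAudioPort(.speaker)
    // Pass .none later to fall back to the route the category dictates:
    // try session.overrideOutputAudioPort(.none)
} catch {
    print("Route override error: \(error.localizedDescription)")
}
```

The difference: `.defaultToSpeaker` changes the session’s default route permanently, while the override lasts only until the route changes (for example, when headphones are plugged in).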

// A global variable that records microphone permission
var appHasMicAccess = true
// ...
// Get the shared AVAudioSession instance
let session = AVAudioSession.sharedInstance()
do {
    // Set the category here
    try session.setCategory(AVAudioSession.Category.playAndRecord,
                            options: AVAudioSession.CategoryOptions.defaultToSpeaker)
    try session.setActive(true)
    // Check whether the app has permission to use the microphone
    session.requestRecordPermission({ (isGranted: Bool) in
        if isGranted {
            // To record audio, the user must grant microphone access
            appHasMicAccess = true
        } else {
            appHasMicAccess = false
        }
    })
} catch let error as NSError {
    print("AVAudioSession configuration error: \(error.localizedDescription)")
}

Now, on to recording.

var audioStatus: AudioStatus = AudioStatus.stopped
var audioRecorder: AVAudioRecorder!

func setupRecorder() {
    // getURLforMemo returns a temporary file path for the recording;
    // for its implementation, see the GitHub link below
    let fileURL = getURLforMemo()
    // Describe the recording settings:
    /* linear PCM, an uncompressed data format
       44.1 kHz sample rate, CD-quality
       one channel, i.e. mono */
    let recordSettings = [
        AVFormatIDKey: Int(kAudioFormatLinearPCM),
        AVSampleRateKey: 44100.0,
        AVNumberOfChannelsKey: 1,
        AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
    ] as [String: Any]
    do {
        // Instantiate audioRecorder
        audioRecorder = try AVAudioRecorder(url: fileURL, settings: recordSettings)
        audioRecorder.delegate = self
        audioRecorder.prepareToRecord()
    } catch {
        print("Error creating audio recorder.")
    }
}

// Start recording
func record() {
    startUpdateLoop()
    audioStatus = .recording
    audioRecorder.record()
}

// Stop recording
func stopRecording() {
    recordButton.setBackgroundImage(UIImage(named: "button-record"), for: UIControl.State.normal)
    audioStatus = .stopped
    audioRecorder.stop()
    stopUpdateLoop()
}

When recording ends, update the state through the AVAudioRecorderDelegate protocol.

func audioRecorderDidFinishRecording(_ recorder: AVAudioRecorder, successfully flag: Bool) {
    audioStatus = .stopped
    // In this scenario recording always ends with a manual tap,
    // so there is no need to update the UI here
}

The recording is ready. Let’s play it back.

Play the recording

var audioPlayer: AVAudioPlayer!

// Start playback
func play() {
    // getURLforMemo returns the temporary file path of the recording;
    // for its implementation, see the GitHub link below
    let fileURL = getURLforMemo()
    do {
        audioPlayer = try AVAudioPlayer(contentsOf: fileURL)
        audioPlayer.delegate = self
        // Check that the audio file is not empty before playing it
        if audioPlayer.duration > 0.0 {
            setPlayButtonOn(flag: true)
            audioPlayer.play()
            audioStatus = .playing
            startUpdateLoop()
        }
    } catch {
        print("Error loading audio player")
    }
}

// Stop playback
func stopPlayback() {
    setPlayButtonOn(flag: false)
    audioStatus = .stopped
    audioPlayer.stop()
    stopUpdateLoop()
}

After playback finishes, update the UI through the AVAudioPlayerDelegate protocol.

func audioPlayerDidFinishPlaying(_ player: AVAudioPlayer, successfully flag: Bool) {
    // This is the only place where we learn that playback has finished
    setPlayButtonOn(flag: false)
    audioStatus = .stopped
    stopUpdateLoop()
}

Displaying recording/playback progress in the UI

To display recording/playback progress, we use a timer, because the position changes from moment to moment.

The timer takes three steps.

First, start the timer:

var soundTimer: CFTimeInterval = 0.0
var updateTimer: CADisplayLink!

func startUpdateLoop() {
    if updateTimer != nil {
        updateTimer.invalidate()
    }
    updateTimer = CADisplayLink(target: self, selector: #selector(ViewController.updateLoop))
    updateTimer.preferredFramesPerSecond = 1
    updateTimer.add(to: RunLoop.current, forMode: RunLoop.Mode.common)
}

Second, do the work in the timer callback:

@objc func updateLoop() {
    if audioStatus == .recording {
        // Refresh periodically while recording
        if CFAbsoluteTimeGetCurrent() - soundTimer > 0.5 {
            timeLabel.text = formattedCurrentTime(UInt(audioRecorder.currentTime))
            soundTimer = CFAbsoluteTimeGetCurrent()
        }
    } else if audioStatus == .playing {
        // Refresh periodically while playing
        if CFAbsoluteTimeGetCurrent() - soundTimer > 0.5 {
            timeLabel.text = formattedCurrentTime(UInt(audioPlayer.currentTime))
            soundTimer = CFAbsoluteTimeGetCurrent()
        }
    }
}

Third, destroy the timer.

Call this method whenever updating should stop, for example in the did-finish-playing delegate method, or when the play button is tapped again.

func stopUpdateLoop() {
    updateTimer.invalidate()
    updateTimer = nil
    timeLabel.text = formattedCurrentTime(UInt(0))
}

Metering the volume

AVAudioPlayer has an audio metering feature: while audio plays, metering can measure, for example, the average power of the waveform.

The AVAudioPlayer method averagePower(forChannel:) returns the current level in decibels, from -160 dB to 0 dB, where 0 is loud and -160 is near silence.
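A minimal sketch of the decibel-to-amplitude conversion that a MeterTable-style helper performs (the function name here is illustrative, not from AVFoundation): a linear amplitude is 10^(dB / 20), so -160 dB maps to nearly 0 and 0 dB maps to 1.

```swift
import Foundation

// Convert a decibel reading from averagePower(forChannel:)
// into a linear amplitude in the range 0...1
func linearLevel(fromDecibels dB: Double) -> Double {
    return pow(10.0, dB / 20.0)
}

// 0 dB is full scale (1.0), -20 dB is one tenth (0.1),
// and -160 dB is practically silent
```

MeterTable precomputes values like these into a lookup table so the conversion is cheap enough to call from the timer.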

The waveform looks like this

We make a mouth-opening animation, a simple volume visualization: the louder the volume, the wider the mouth opens. See the GitHub repo at the end of this article.

// Create a struct, MeterTable:
// audio metering returns a float in the range -160 ... 0;
// first convert decibels to an amplitude between 0 and 1.
// The mouth animation has 5 images, so that range is split into 5 levels;
// MeterTable maps the measured level to the matching image.
let meterTable = MeterTable(tableSize: 100)
// ...
// Before playing, enable metering on the player
audioPlayer.isMeteringEnabled = true
// ...
// Map the volume to an image index.
// Call this from the timer's update method; see the GitHub repo for details
func meterLevelsToFrame() -> Int {
    guard let player = audioPlayer else {
        return 1
    }
    player.updateMeters()
    // The player is mono, so read channel 0
    let avgPower = player.averagePower(forChannel: 0)
    let linearLevel = meterTable.valueForPower(power: avgPower)
    // Continue processing the value into an energy level,
    // as detailed in the GitHub repo at the end of this article
    let powerPercentage = Int(round(linearLevel * 100))
    // There are 5 images
    let totalFrames = 5
    let frame = (powerPercentage / totalFrames) + 1
    return min(frame, totalFrames)
}

Audio playback controls: volume, left/right channel pan, looping, playback rate, and so on.

Controlling playback volume

The volume ranges from 0 to 1: 0 is mute, 1 is maximum.

func toSetVolumn(value: Float) {
    guard let player = audioPlayer else {
        return
    }
    // Set the audioPlayer's volume
    player.volume = value
}

Set the left and right channels

The pan ranges from -1 to 1: -1 is fully left, 1 is fully right, and 0 is balanced.

func toSetPan(value: Float) {
    guard let player = audioPlayer else {
        return
    }
    // Set the audioPlayer's pan
    player.pan = value
}

Setting looped playback

numberOfLoops ranges from -1 to Int.max.

A value from 0 to Int.max plays the audio that many additional times; -1 loops indefinitely.

func toSetLoopPlayback(loop: Bool) {
    guard let player = audioPlayer else {
        return
    }
    // Apple wraps it all up; just set the audioPlayer's numberOfLoops
    if loop == true {
        // numberOfLoops = -1 loops endlessly until the audioPlayer stops
        player.numberOfLoops = -1
    } else {
        player.numberOfLoops = 0
    }
}

Setting the Playback rate

The playback rate of the audioPlayer ranges from 0.5 to 2.0

0.5 is half-speed playback, 1.0 is normal playback, 2.0 is double speed playback

// Before playing, enable the audioPlayer's rate control
audioPlayer.enableRate = true

// ...

func toSetRate(value: Float) {
    guard let player = audioPlayer else {
        return
    }
    // Set the audioPlayer's rate
    player.rate = value
}

GitHub link

Sequel:

iOS Audio Hands-on: voice changing, reverb, text-to-speech (TTS), Swift 5, based on AVAudioEngine, and more