#### Summary
Audio and video capture is the first link in a live-broadcast architecture and the source of the video stream
- Video capture actually has several other application scenarios, such as QR code development
Audio and video collection includes two parts:
- Video acquisition
- Audio collection
In iOS development, video and audio can be captured simultaneously, and the APIs are very simple to use
The relevant capture APIs are encapsulated in the AVFoundation framework; import the framework to implement the feature
#### Collection procedure
##### Collection procedure: text description
PS: If you have done QR code development, you should be familiar with the relevant steps (very similar)
Import the framework
- The APIs are mainly in the AVFoundation framework, so import that framework first
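Concretely, the import is a single line at the top of the file:

```swift
import AVFoundation
```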
Create a capture session (AVCaptureSession)
- The session is used to connect the input sources and output sources
- Input sources: the camera & the microphone
- Output sources: the outlets from which the corresponding audio and video data are obtained
- Session: connects the input sources to the output sources
Set the video input source & output source
- Input source (AVCaptureDeviceInput): input from the camera
- Output source (AVCaptureVideoDataOutput): set a delegate and receive the data in the delegate method
- Add the input & output to the session
Set the audio input source & output source
- Input source (AVCaptureDeviceInput): input from the microphone
- Output source (AVCaptureAudioDataOutput): set a delegate and receive the data in the delegate method
- Add the input & output to the session
Add a preview layer (optional)
- If you want the user to see the captured picture, add a preview layer
- The preview layer is not required; capture works without it
Start capturing
- Start capture by calling the startRunning method of the session (AVCaptureSession)
##### Code parsing
- Overall code steps
- Function 1 (Set video input/output)
- Function 2 (Set audio input/output)
- Add preview layer
- Adopt the protocols and implement the delegate method
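The snippets below reference several stored properties on the view controller (`session`, `videoInput`, `videoConnect`, `movieFileOutput`, `previewLayer`) without declaring them. A minimal sketch of the assumed declarations, with names and optionality inferred from the calls that use them (the original post does not show these, so treat this as an assumption):

```swift
import UIKit
import AVFoundation

class ViewController: UIViewController {
    // The capture session; optional so it can be set to nil when scanning stops
    var session: AVCaptureSession? = AVCaptureSession()
    // The current video input, saved so it can be swapped when switching cameras
    var videoInput: AVCaptureDeviceInput?
    // The video connection, used to tell video buffers apart from audio buffers
    var videoConnect: AVCaptureConnection?
    // The file output, kept so recording can be stopped later
    var movieFileOutput: AVCaptureMovieFileOutput?
    // The preview layer, kept so it can be removed when scanning stops
    var previewLayer: AVCaptureVideoPreviewLayer?
}
```

For this sketch to work, `setupPreviewLayer` would also need to assign the created layer to the `previewLayer` property, since `stopScanning` later removes it.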
##### Implementation code
- Overall step code
```swift
// 1. Create a capture session
let session = AVCaptureSession()

// 2. Set the video input & output
setupVideoSource(session: session)

// 3. Set the audio input & output
setupAudioSource(session: session)

// 4. Add the preview layer
setupPreviewLayer(session: session)

// 5. Start scanning
session.startRunning()
```
- Function 1 (Set video input/output)
```swift
// Set the video source (input source & output source)
fileprivate func setupVideoSource(session: AVCaptureSession) {
    // 1. Create the input
    // 1.1. Get all video devices (both front & rear cameras)
    guard let devices = AVCaptureDevice.devices(withMediaType: AVMediaTypeVideo) as? [AVCaptureDevice] else { return }

    // 1.2. Pick out the front camera
    let d = devices.filter({ $0.position == .front }).first

    // 1.3. Create the input device with the front camera
    guard let videoInput = try? AVCaptureDeviceInput(device: d) else { return }

    // 2.1. Create the output source
    let videoOutput = AVCaptureVideoDataOutput()

    // 2.2. Set the delegate and the queue the delegate method runs on (the captured data is delivered in the delegate method)
    let queue = DispatchQueue.global()
    videoOutput.setSampleBufferDelegate(self, queue: queue)

    // 3. Add the input & output to the session
    // 3.1. Add the input source
    if session.canAddInput(videoInput) {
        session.addInput(videoInput)
    }
    // 3.2. Add the output source
    if session.canAddOutput(videoOutput) {
        session.addOutput(videoOutput)
    }

    // 4. Save the video connection
    videoConnect = videoOutput.connection(withMediaType: AVMediaTypeVideo)
}
```
- Function 2 (Set audio input/output)
```swift
// Set the audio source (input source & output source)
fileprivate func setupAudioSource(session: AVCaptureSession) {
    // 1. Create the input
    guard let device = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeAudio) else { return }
    guard let audioInput = try? AVCaptureDeviceInput(device: device) else { return }

    // 2. Create the output source
    let audioOutput = AVCaptureAudioDataOutput()
    let queue = DispatchQueue.global()
    audioOutput.setSampleBufferDelegate(self, queue: queue)

    // 3. Add the input & output to the session
    if session.canAddInput(audioInput) {
        session.addInput(audioInput)
    }
    if session.canAddOutput(audioOutput) {
        session.addOutput(audioOutput)
    }
}
```
- Add the preview layer
```swift
// Add the preview layer
fileprivate func setupPreviewLayer(session: AVCaptureSession) {
    // 1. Create the preview layer
    guard let previewLayer = AVCaptureVideoPreviewLayer(session: session) else { return }

    // 2. Set the layer's properties
    previewLayer.frame = view.bounds

    // 3. Add the layer to the view's layer
    view.layer.insertSublayer(previewLayer, at: 0)
}
```
- Adopt the protocols and implement the delegate method
```swift
extension ViewController : AVCaptureVideoDataOutputSampleBufferDelegate, AVCaptureAudioDataOutputSampleBufferDelegate {
    func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {
        if connection == videoConnect {
            print("Video data")
        } else {
            print("Audio data")
        }
    }
}
```
##### Stop scanning
- For example, when the user is no longer streaming, we need to stop capturing
- Remove the preview layer (it is definitely no longer needed)
- Stop scanning (call the session's stopRunning method)
- Set the session to nil (the object is no longer used, so clear the reference)
```swift
@IBAction func stopScanning() {
    // 1. Remove the preview layer
    previewLayer?.removeFromSuperlayer()

    // 2. Stop scanning
    session?.stopRunning()

    // 3. Set the object to nil
    session = nil
}
```
#### Switch lens & focus & write file
##### Switch lens (front & rear camera)
Switching steps
- Animate the switching process
- Check whether the current camera is the front or the rear one
- Take the opposite camera (if the front camera was in use, use the rear one this time)
- Retrieve the device (AVCaptureDevice) for the new camera
- Create a new input (AVCaptureDeviceInput) from that device
- Remove the old input & add the new input
    - Note: before modifying the session configuration, call beginConfiguration(), and once the changes are complete, call commitConfiguration()
    - session?.beginConfiguration()
    - session?.commitConfiguration()
- Save the new input
The code is as follows:
```swift
@IBAction func switchScene() {
    // 0. Run the flip animation
    let rotationAnim = CATransition()
    rotationAnim.type = "oglFlip"
    rotationAnim.subtype = "fromLeft"
    rotationAnim.duration = 0.5
    view.layer.add(rotationAnim, forKey: nil)

    // 1. Verify that videoInput exists
    guard let videoInput = videoInput else { return }

    // 2. Work out the new position from the current one
    let position : AVCaptureDevicePosition = videoInput.device.position == .front ? .back : .front

    // 3. Create the new input
    guard let devices = AVCaptureDevice.devices(withMediaType: AVMediaTypeVideo) as? [AVCaptureDevice] else { return }
    guard let newDevice = devices.filter({ $0.position == position }).first else { return }
    guard let newVideoInput = try? AVCaptureDeviceInput(device: newDevice) else { return }

    // 4. Remove the old input & add the new input
    session?.beginConfiguration()
    session?.removeInput(videoInput)
    session?.addInput(newVideoInput)
    session?.commitConfiguration()

    // 5. Save the new input
    self.videoInput = newVideoInput
}
```
##### Write file
Procedure for writing the file
- Create an AVCaptureMovieFileOutput object
    - It is used to write the audio & video to a file
- Add the movieFileOutput object to the session as an output
    - Writing to a file is also a form of output
- Set the video stabilization mode
    - If it is not set, problems such as frame jumps may appear in the recorded video
    - It is usually set to automatic
- Start writing
- Stop writing when the recording is complete
The code is as follows:
- Code to create, add, and configure
```swift
// Add the file output
let movieFileOutput = AVCaptureMovieFileOutput()
self.movieFileOutput = movieFileOutput
session.addOutput(movieFileOutput)

// Get the video connection
let connection = movieFileOutput.connection(withMediaType: AVMediaTypeVideo)

// Set the video stabilization mode
connection?.preferredVideoStabilizationMode = .auto

// Start writing the video
movieFileOutput.startRecording(toOutputFileURL: outputFileURL, recordingDelegate: self)
```
- Code to stop writing
```swift
// Stop writing
self.movieFileOutput?.stopRecording()
```
- Listen for the start and end events in the delegate methods
```swift
extension ViewController : AVCaptureFileOutputRecordingDelegate {
    func capture(_ captureOutput: AVCaptureFileOutput!, didStartRecordingToOutputFileAt fileURL: URL!, fromConnections connections: [Any]!) {
        print("Start recording")
    }

    func capture(_ captureOutput: AVCaptureFileOutput!, didFinishRecordingToOutputFileAt outputFileURL: URL!, fromConnections connections: [Any]!, error: Error!) {
        print("Stop recording")
    }
}
```