iOS camera development: pitfalls and lessons learned

For the camera setup, this demo uses GPUImageView as its baseline and implements the basics, with orientation handling done at the underlying OpenGL level. The other features can be implemented by following the reference links at the end. If I have time later I will improve it gradually; if you need something, ask in the comments below, or join our QQ group: 237305299.

PS: In 2016 I was basically engaged in AVFoundation development, and I feel I have a pretty good understanding of this Apple library. There are also many examples on the Internet, but on closer inspection I found that many of them actually have a lot of problems.

I. Overall architecture

Screen rotation is a big problem. Most of the video or live-streaming apps I've observed basically don't support rotating the phone at all. This choice affects the entire architecture: if you drop rotation support, the difficulty is much lower. Sadly, we started out supporting all four orientations.

The recording architecture I used at first was written around AVFoundation's AVAssetWriter, but I later found a big problem: when beauty filtering is involved, recording breaks. So I moved to GPUImageMovieWriter, which has audio recording built in.
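For reference, a minimal sketch of the stock GPUImage wiring for this. This is standard GPUImage usage rather than this post's final pipeline, and the bilateral smoothing filter stands in for a real beauty filter:

```objc
// Stock GPUImage recording chain -- illustrative only; the final architecture
// described below replaces GPUImageVideoCamera with a raw AVFoundation pipeline.
GPUImageVideoCamera *camera =
    [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset1920x1080
                                         cameraPosition:AVCaptureDevicePositionBack];
camera.outputImageOrientation = UIInterfaceOrientationPortrait;

GPUImageBilateralFilter *beautify = [[GPUImageBilateralFilter alloc] init]; // stand-in beauty filter
[camera addTarget:beautify];

NSURL *movieURL = [NSURL fileURLWithPath:
    [NSTemporaryDirectory() stringByAppendingPathComponent:@"movie.m4v"]];
GPUImageMovieWriter *movieWriter =
    [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(1080.0, 1920.0)];
[beautify addTarget:movieWriter];

// The built-in audio path: the camera feeds audio straight into the writer
camera.audioEncodingTarget = movieWriter;

[camera startCameraCapture];
[movieWriter startRecording];
// ... later: [movieWriter finishRecording];
```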

As for GPUImageVideoCamera, if you use it my recommendation is to subclass it and write your own, or at least read its source code; it customizes too many things internally, and it is not convenient for slow motion and the like. So I don't use it. I use AVFoundation's own capture pipeline, plus a bit of OpenGL to handle rotation and so on.

Based on the above, let's sort out the process:
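Roughly, the pipeline is:

1. Capture frames with AVFoundation's own pipeline (an AVCaptureSession delivering CMSampleBuffers).
2. Run the frames through GPUImage/OpenGL for beauty filtering and orientation handling.
3. Hand the result either to the writer for recording, or to VideoToolbox for hardware encoding and RTMP streaming.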

II. Video recording

1. Set video parameters

Here are the core configuration parameters. Use StreamEye to compare your recordings against the system Camera's and see the differences, mainly in quality, sharpness, and so on. The bit rate can be checked on a clip recorded with Apple's own Camera app: open the file and use the Alt+I shortcut to pop up an info panel showing the bit rate. Never use AirDrop to transfer the files to your computer, as that re-encodes them; use Apple's own Image Capture tool on the Mac to pull the original files and take a good look.

The points above were learned by stepping on countless pits in practice. Note that if the parameters are set incorrectly, a 4K recording may not reach 30fps, or it may take up far more space than the system Camera's recordings. Below I provide a reference configuration, based on comparison with the system Camera and some parameters from practice. If there are any mistakes, please let me know in the comments below, or join QQ group 237305299. Thank you.

```objc
NSDictionary *videoCompressionProps = nil;
NSDictionary *videoSettings = nil;
// The session presets are NSString constants, so compare them rather than switch on them
if ([cameraModel.videoResolution isEqualToString:AVCaptureSessionPreset3840x2160]) {
    videoCompressionProps = @{
        AVVideoAverageBitRateKey: @(50 * 1024.0 * 1024),
        AVVideoH264EntropyModeKey: AVVideoH264EntropyModeCABAC,
        AVVideoMaxKeyFrameIntervalKey: @(30),
        AVVideoAllowFrameReorderingKey: @NO,
        AVVideoExpectedSourceFrameRateKey: @30,
    };
} else if ([cameraModel.videoResolution isEqualToString:AVCaptureSessionPreset1920x1080]) {
    videoCompressionProps = @{
        AVVideoAverageBitRateKey: @(18 * 1024.0 * 1024),
        AVVideoH264EntropyModeKey: AVVideoH264EntropyModeCABAC,
    };
} else if ([cameraModel.videoResolution isEqualToString:AVCaptureSessionPreset1280x720]) {
    videoCompressionProps = @{
        AVVideoAverageBitRateKey: @(8 * 1024.0 * 1024),
        AVVideoH264EntropyModeKey: AVVideoH264EntropyModeCABAC,
    };
}
videoSettings = @{
    AVVideoCodecKey: AVVideoCodecH264,
    AVVideoCompressionPropertiesKey: videoCompressionProps,
    AVVideoScalingModeKey: AVVideoScalingModeResizeAspectFill,
    AVVideoWidthKey: @(videoSize.width),
    AVVideoHeightKey: @(videoSize.height),
};
self.writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                      outputSettings:videoSettings];
// The initial transform bakes the current orientation into the file
// (see "Recording orientation" below)
if (cameraModel.devicePosition == DJIIPhone_DevicePositionFront) {
    self.writerInput.transform = [OSMOMediaUtils getAngleTransformFromScreenOritationFront];
} else {
    self.writerInput.transform = [OSMOMediaUtils getAngleTransformFromScreenOritationBack];
}
self.writerInput.expectsMediaDataInRealTime = YES;
```
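The post only shows the video input. For completeness, here is a minimal sketch of a matching audio writer input; the values are my own assumptions, not taken from the post:

```objc
// Assumed audio settings (AAC stereo @ 44.1kHz, 128kbps) -- not from the post
AudioChannelLayout acl = {0};
acl.mChannelLayoutTag = kAudioChannelLayoutTag_Stereo;
NSDictionary *audioSettings = @{
    AVFormatIDKey:         @(kAudioFormatMPEG4AAC),
    AVNumberOfChannelsKey: @2,
    AVSampleRateKey:       @44100.0,
    AVEncoderBitRateKey:   @(128 * 1024),
    AVChannelLayoutKey:    [NSData dataWithBytes:&acl length:sizeof(acl)],
};
AVAssetWriterInput *audioInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                       outputSettings:audioSettings];
audioInput.expectsMediaDataInRealTime = YES;
```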

2. Recording orientation

Set an initial transform on the AVAssetWriter input and feed the CMSampleBuffers from the camera straight into the recording stream, without rotating them on the way. If you don't do it this way there is a problem: tap record, rotate the screen mid-recording, and with all four orientations supported the picture flashes during the rotation. You can try this out with Apple's own Camera app.
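The OSMOMediaUtils helpers used above aren't shown in the post. Here is a hypothetical sketch of what the back-camera one might look like; the angles are my assumption and depend on how you orient the sensor data:

```objc
// Hypothetical body for the helper referenced above (not the author's code).
// The back sensor is natively landscape, so portrait needs a 90-degree rotation;
// the transform is baked into the file once, instead of rotating every buffer.
+ (CGAffineTransform)getAngleTransformFromScreenOritationBack
{
    switch ([UIDevice currentDevice].orientation) {
        case UIDeviceOrientationLandscapeLeft:
            return CGAffineTransformIdentity;
        case UIDeviceOrientationLandscapeRight:
            return CGAffineTransformMakeRotation(M_PI);
        case UIDeviceOrientationPortraitUpsideDown:
            return CGAffineTransformMakeRotation(-M_PI_2);
        default: // portrait
            return CGAffineTransformMakeRotation(M_PI_2);
    }
}
```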

IV. After GPU beautification, using the CVPixelBuffer to encode and live stream

  1. GPU beauty processing

First, the wrong (not recommended) approach:

```objc
CGSize outputSize = {720, 1280};
GPUImageRawDataOutput *rawDataOutput = [[GPUImageRawDataOutput alloc] initWithImageSize:CGSizeMake(outputSize.width, outputSize.height) resultsInBGRAFormat:YES];
[self.beautifyFilter addTarget:rawDataOutput];
```

Add the rawDataOutput as a target, and then fetch the pixel buffer from it for VideoToolbox processing:

```objc
__weak GPUImageRawDataOutput *weakOutput = rawDataOutput;
__weak typeof(self) weakSelf = self;
[rawDataOutput setNewFrameAvailableBlock:^{
    __strong GPUImageRawDataOutput *strongOutput = weakOutput;
    [strongOutput lockFramebufferForReading];
    // The filtered data can be read out here
    GLubyte *outputBytes = [strongOutput rawBytesForImage];
    NSInteger bytesPerRow = [strongOutput bytesPerRowInOutput];
    CVPixelBufferRef pixelBuffer = NULL;
    CVPixelBufferCreateWithBytes(kCFAllocatorDefault, outputSize.width, outputSize.height,
                                 kCVPixelFormatType_32BGRA, outputBytes, bytesPerRow,
                                 nil, nil, nil, &pixelBuffer);
    // The pixel buffer can then be hardware-encoded with VideoToolbox
    // and pushed out over RTMP
    [weakSelf encodeWithCVPixelBufferRef:pixelBuffer];
    [strongOutput unlockFramebufferAfterReading];
    CFRelease(pixelBuffer);
}];
```

Now let's look at why I don't recommend this method: CVPixelBufferCreateWithBytes is very time-consuming here, because a fresh buffer has to be created for every frame. GPUImageFramebuffer already contains a renderTarget (a CVPixelBufferRef); add an accessor to the class to get at that renderTarget, lock it for reading, use it, and unlock it instead.

V. Slow motion

[Slow motion settings](https://developer.apple.com/library/mac/documentation/AVFoundation/Reference/AVCaptureDevice_Class/index.html)

The technical principle here is to record at a frame rate above 60fps, such as 240fps, for the specified duration, and then write the 240fps data out as a 30fps track, which produces the slow-motion effect. The following code selects the fastest format the device supports.

```objc
- (void)configureCameraForHighestFrameRate:(AVCaptureDevice *)device
{
    AVCaptureDeviceFormat *bestFormat = nil;
    AVFrameRateRange *bestFrameRateRange = nil;
    for (AVCaptureDeviceFormat *format in [device formats]) {
        for (AVFrameRateRange *range in format.videoSupportedFrameRateRanges) {
            if (range.maxFrameRate > bestFrameRateRange.maxFrameRate) {
                bestFormat = format;
                bestFrameRateRange = range;
            }
        }
    }
    if (bestFormat) {
        if ([device lockForConfiguration:NULL] == YES) {
            device.activeFormat = bestFormat;
            device.activeVideoMinFrameDuration = bestFrameRateRange.minFrameDuration;
            device.activeVideoMaxFrameDuration = bestFrameRateRange.minFrameDuration;
            [device unlockForConfiguration];
        }
    }
}
```
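That handles the capture side. On the writing side, one way to get the slow-motion effect is to stretch each frame's presentation timestamp before appending it. A minimal sketch of the retiming idea, which is my assumption since the post doesn't show this step:

```objc
#import <CoreMedia/CoreMedia.h>

// Stretch timestamps so a 240fps capture plays back as 30fps (an 8x slow-down).
// firstPTS is the presentation time of the first captured frame.
static CMTime RetimedPresentationTime(CMTime pts, CMTime firstPTS,
                                      Float64 captureFPS, Float64 playbackFPS)
{
    CMTime elapsed   = CMTimeSubtract(pts, firstPTS);
    CMTime stretched = CMTimeMultiplyByFloat64(elapsed, captureFPS / playbackFPS);
    return CMTimeAdd(firstPTS, stretched);
}
```

Each 240fps buffer is then appended with the stretched timestamp, so the written track plays at a normal 30fps but eight times slower than real time.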

VI. Time-lapse photography

Because the system callback arrives 30 times per second, we instead run our own timer and take one picture per second, then use AVAssetWriterInputPixelBufferAdaptor together with AVAssetWriterInput and AVAssetWriter to assemble the pictures into a video.
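A minimal sketch of that assembly; the writer setup mirrors the recording code earlier, and frameIndex and pixelBuffer are illustrative names:

```objc
// Created once, alongside the AVAssetWriter and its video input:
AVAssetWriterInputPixelBufferAdaptor *adaptor =
    [AVAssetWriterInputPixelBufferAdaptor
        assetWriterInputPixelBufferAdaptorWithAssetWriterInput:self.writerInput
                                   sourcePixelBufferAttributes:nil];

// In the once-per-second timer callback, each captured still becomes
// one frame of a 30fps movie:
if (self.writerInput.isReadyForMoreMediaData) {
    CMTime pts = CMTimeMake(self.frameIndex, 30); // frames stamped 1/30s apart
    [adaptor appendPixelBuffer:pixelBuffer withPresentationTime:pts];
    self.frameIndex++;
}
```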

VII. Star-trail shooting

Use one of GPUImage's two-input blend filters to superimpose the frames. Make sure to call useNextFrameForImageCapture on the filter before generating each photo, as in the sketch below.
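A rough sketch of the superimposition. The wiring and the choice of a lighten blend are my assumptions; the post only names a two-input blend filter and useNextFrameForImageCapture:

```objc
// Blend the latest still onto the running composite with a two-input filter.
// A lighten blend keeps the brightest pixel of each pair, which is what
// accumulates star trails over successive frames.
GPUImagePicture *composite = [[GPUImagePicture alloc] initWithImage:compositeSoFar];
GPUImagePicture *latest    = [[GPUImagePicture alloc] initWithImage:latestFrame];
GPUImageLightenBlendFilter *blend = [[GPUImageLightenBlendFilter alloc] init];
[composite addTarget:blend]; // first input
[latest addTarget:blend];    // second input

[blend useNextFrameForImageCapture]; // must be called before rendering, as noted above
[composite processImage];
[latest processImage];
UIImage *merged = [blend imageFromCurrentFramebuffer];
```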

VIII. Reference links

  1. Apple's official demo for taking photos and recording video: developer.apple.com/library/pre…

  2. Apple's comprehensive demo covering photos and video, which I also used as a reference: developer.apple.com/library/pre…

  3. Integrating a beauty feature into a live-streaming app on iOS

  4. GitHub repository for video recording and segmented recording

  5. A GPUImage extension that supports pausing and resuming video recording, and turning the flash on and off: GPUImageExtend

  6. Using AVAssetWriter to record short videos, with the audio recording also handled inside AVAssetWriter