1. Get the video capture device (AVCapture)

```objc
// Get all the video capture devices (cameras).
// Note: defaultDeviceWithMediaType: returns a single device;
// to filter by position we need the full array from devicesWithMediaType:.
NSArray *cameras = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
// Pick out the front-facing camera
NSArray *captureDeviceArray = [cameras filteredArrayUsingPredicate:
    [NSPredicate predicateWithFormat:@"position == %d", AVCaptureDevicePositionFront]];
if (!captureDeviceArray.count) {
    NSLog(@"Failed to get the front camera");
    return;
}
// Wrap the device in an input object
AVCaptureDevice *camera = captureDeviceArray.firstObject;
NSError *error = nil;
self.captureDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
if (error) {
    NSLog(@"Failed to convert AVCaptureDevice to AVCaptureDeviceInput: %@", error.description);
    return;
}
```
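The input created above still has to be attached to a capture session before any frames flow. A minimal sketch, assuming a `captureSession` property that this excerpt does not show:

```objc
// Sketch: attach the device input to a session (captureSession is an assumed property)
self.captureSession = [[AVCaptureSession alloc] init];
if ([self.captureSession canAddInput:self.captureDeviceInput]) {
    [self.captureSession addInput:self.captureDeviceInput];
}
```

Always check `canAddInput:` first; `addInput:` raises an exception if the session cannot accept the input.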

2. Initialize the video output

Initialize the video data output, set the pixel format of the captured frames, and configure the serial queue on which the sample-buffer callbacks are delivered.

```objc
// Create the video data output
self.captureVideoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
// Set the pixel format of the output frames
NSDictionary *videoSetting = [NSDictionary dictionaryWithObjectsAndKeys:
    [NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8PlanarFullRange],
    (id)kCVPixelBufferPixelFormatTypeKey, nil];
self.captureVideoDataOutput.videoSettings = videoSetting;
// Set the output delegate and the serial queue for the data callback
dispatch_queue_t outQueue = dispatch_queue_create("AVCaptureVideoDataOutputQueue", DISPATCH_QUEUE_SERIAL);
[self.captureVideoDataOutput setSampleBufferDelegate:self queue:outQueue];
// Drop frames that arrive while the delegate is still busy processing
self.captureVideoDataOutput.alwaysDiscardsLateVideoFrames = YES;
```
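The frames themselves arrive through the `AVCaptureVideoDataOutputSampleBufferDelegate` protocol, invoked on the serial queue configured above. A minimal sketch of the callback (the processing step is only a placeholder):

```objc
// Called on the output queue once per captured frame
- (void)captureOutput:(AVCaptureOutput *)output
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    // Extract the pixel buffer in the format requested via videoSettings
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (pixelBuffer == NULL) {
        return;
    }
    // Process pixelBuffer here (e.g. hand it to an encoder or renderer)
}
```

Because `alwaysDiscardsLateVideoFrames` is `YES`, any frame that arrives while this method is still running is silently dropped rather than queued.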