-
The problem
The boss watched a photo being taken and asked, "What is that sound?"
"It's the system's built-in shutter sound. It ships with the system."
"Is there any way to get rid of it? It's annoying."
“Let me try.”
-
Train of thought
The road ahead is long, so I start digging through Baidu and the SDK documentation.
The plan: use AVCaptureVideoDataOutput and, in its sample buffer delegate method, convert the CMSampleBufferRef to a UIImage. Grabbing a frame from the video stream takes the "photo" without ever triggering the shutter sound.
-
The code
-
Session setup is not covered here.
-
For the preview layer setup, refer to the earlier posts [AVCapturePhotoOutput for iOS Photo Customization] and [iOS Custom Camera].
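For context, here is a minimal sketch of what that session and preview layer setup might look like. The property names (captureSession, previewLayer) and the preset are assumptions on my part; see those posts for the real version.

```objc
// Hypothetical session / preview layer setup (not covered in this post).
// captureSession and previewLayer are assumed property names.
- (void)setupSession {
    self.captureSession = [[AVCaptureSession alloc] init];
    if ([self.captureSession canSetSessionPreset:AVCaptureSessionPresetHigh]) {
        self.captureSession.sessionPreset = AVCaptureSessionPresetHigh;
    }

    AVCaptureVideoPreviewLayer *previewLayer =
        [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    previewLayer.frame = self.view.bounds;
    [self.view.layer addSublayer:previewLayer];
    self.previewLayer = previewLayer;

    // startRunning blocks, so kick it off away from the main thread
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        [self.captureSession startRunning];
    });
}
```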
-
Get the camera device, wrap it in a device input and add it to the session, then initialize the videoOutput and add it to the session as well.
```objc
AVCaptureDevice *device = [self cameraDevice];
if (!device) {
    NSLog(@"Failed to get the rear camera");
    return;
}
NSError *error = nil;
self.videoInput = [[AVCaptureDeviceInput alloc] initWithDevice:device error:&error];
// Add the device input to the session
if ([self.captureSession canAddInput:self.videoInput]) {
    [self.captureSession addInput:self.videoInput];
}
[self.videoOutput setSampleBufferDelegate:self queue:self.videoQueue];
if ([self.captureSession canAddOutput:self.videoOutput]) {
    [self.captureSession addOutput:self.videoOutput];
}

// Lazily loaded video data output, delivering BGRA frames
- (AVCaptureVideoDataOutput *)videoOutput {
    if (!_videoOutput) {
        _videoOutput = [[AVCaptureVideoDataOutput alloc] init];
        _videoOutput.alwaysDiscardsLateVideoFrames = YES;
        _videoOutput.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                                                 forKey:(id)kCVPixelBufferPixelFormatTypeKey];
    }
    return _videoOutput;
}

// Serial queue for the sample buffer delegate callbacks
- (dispatch_queue_t)videoQueue {
    if (!_videoQueue) {
        _videoQueue = dispatch_queue_create("queue", DISPATCH_QUEUE_SERIAL);
    }
    return _videoQueue;
}
```
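The [self cameraDevice] helper used above is not shown in the original snippet. One plausible implementation on iOS 10+ uses AVCaptureDeviceDiscoverySession; treat this as an assumption, not the author's exact code.

```objc
// Hypothetical implementation of the cameraDevice helper (iOS 10+):
// pick the built-in wide-angle camera on the back of the device.
- (AVCaptureDevice *)cameraDevice {
    AVCaptureDeviceDiscoverySession *discovery =
        [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInWideAngleCamera]
                                                               mediaType:AVMediaTypeVideo
                                                                position:AVCaptureDevicePositionBack];
    return discovery.devices.firstObject;
}
```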
- Implement the AVCaptureVideoDataOutputSampleBufferDelegate callback
```objc
- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    @autoreleasepool {
        if (connection == [self.videoOutput connectionWithMediaType:AVMediaTypeVideo]) {
            // Video frame: convert the sample buffer to a UIImage of the scan area
            @synchronized (self) {
                UIImage *image = [self bufferToImage:sampleBuffer rect:self.scanView.scanRect];
                self.uploadImg = image;
            }
        }
    }
}
```
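Note that this delegate fires on videoQueue, not the main thread, so anything that touches UIKit has to hop back to the main queue. Below is a small sketch of grabbing a single frame on demand, which is what makes the capture silent; the takePhoto flag and didCapturePhoto: handler are assumed names, not from the original post.

```objc
// Inside the delegate: a hypothetical "grab one frame on demand" pattern.
// self.takePhoto and didCapturePhoto: are assumed names.
if (self.takePhoto) {
    self.takePhoto = NO;
    UIImage *photo = [self bufferToImage:sampleBuffer rect:self.scanView.scanRect];
    dispatch_async(dispatch_get_main_queue(), ^{
        [self didCapturePhoto:photo];   // update UI / hand off on the main thread
    });
}
```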
- Convert the CMSampleBufferRef to a UIImage. This method has been adjusted to crop only part of the full frame; tweak the rect mapping for whatever region you actually need.
```objc
- (UIImage *)bufferToImage:(CMSampleBufferRef)sampleBuffer rect:(CGRect)rect {
    // Get a CMSampleBuffer's Core Video image buffer for the media data
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    // Get the base address and the number of bytes per row for the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    // Create a bitmap graphics context with the sample buffer data
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    // Create a Quartz image from the pixel data in the bitmap graphics context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);
    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    // Free up the context and color space
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    // Map the on-screen rect into pixel coordinates of the frame
    // (150 pt is subtracted to match the preview height in this layout)
    CGRect dRect;
    CGSize msize = UIScreen.mainScreen.bounds.size;
    msize.height = msize.height - 150;
    CGFloat x = width * rect.origin.x / msize.width;
    CGFloat y = height * rect.origin.y / msize.height;
    CGFloat w = width * rect.size.width / msize.width;
    CGFloat h = height * rect.size.height / msize.height;
    dRect = CGRectMake(x, y, w, h);
    CGImageRef partRef = CGImageCreateWithImageInRect(quartzImage, dRect);
    // Create an image object from the cropped Quartz image
    UIImage *image = [UIImage imageWithCGImage:partRef];
    // Release the Quartz images
    CGImageRelease(partRef);
    CGImageRelease(quartzImage);
    return image;
}
```
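One caveat: video frames usually arrive in landscape, so the cropped UIImage can come out rotated relative to the preview. Two hedged options, sketched against the connection and variables above:

```objc
// Option 1: ask the video connection for portrait frames (set once, after adding the output).
AVCaptureConnection *conn = [self.videoOutput connectionWithMediaType:AVMediaTypeVideo];
if (conn.isVideoOrientationSupported) {
    conn.videoOrientation = AVCaptureVideoOrientationPortrait;
}

// Option 2: keep the buffer as-is and tag the cropped image with an orientation instead,
// replacing the imageWithCGImage: call in bufferToImage:rect: above.
UIImage *oriented = [UIImage imageWithCGImage:partRef
                                        scale:UIScreen.mainScreen.scale
                                  orientation:UIImageOrientationRight];
```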
-
We have the image. Call it a day. What you do with the image next is up to your business logic.
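For instance, a minimal sketch of that "business" side, assuming the frame just needs to be JPEG-encoded for upload or dropped into the photo library (uploadImageData: is a hypothetical networking hook, not from this post):

```objc
// Hypothetical use of the captured frame; uploadImageData: is an assumed hook.
NSData *jpegData = UIImageJPEGRepresentation(self.uploadImg, 0.8);
[self uploadImageData:jpegData];                                  // hand off to networking code
UIImageWriteToSavedPhotosAlbum(self.uploadImg, nil, nil, NULL);   // or save it locally
```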