iOS Native Scanning: AVCaptureSession, AVCaptureDevice, rectOfInterest, by Xs·H


Recently, an IoT project needed to scan the QR code/barcode on a device (or in its manual) to read the device's information during smart-device network provisioning. The desired result looks roughly like this:

I remembered developing a scanning feature in Account Guard a few years ago, so I wrapped it up (available from QiQRCode) so it could be reused in this project. The package contains two classes, QiCodeManager and QiCodePreviewView. QiCodeManager handles the scanning itself (QR code/barcode recognition and reading, etc.), while QiCodePreviewView handles the scanning UI (scan box, scan line, prompts, etc.). The two classes can be used in a project as follows:

// Initialize the scanning UI
_previewView = [[QiCodePreviewView alloc] initWithFrame:self.view.bounds];
_previewView.autoresizingMask = UIViewAutoresizingFlexibleHeight;
[self.view addSubview:_previewView];

__weak typeof(self) weakSelf = self;
_codeManager = [[QiCodeManager alloc] initWithPreviewView:_previewView completion:^{
    // Start scanning
    [weakSelf.codeManager startScanningWithCallback:^(NSString * _Nonnull code) {
    } autoStop:YES];
}];

QiCodePreviewView uses CAShapeLayer to draw a mask (maskLayer), a scan box (rectLayer), corner marks (cornerLayer) and a scan line (lineLayer). Since this part involves a fair amount of code, this article does not go into it; you can view the source in QiQRCode. QiShare's use of CAShapeLayer is also covered in the iOS Drawing Rounded Corners article.
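To give a rough idea of how the scan line works, here is an illustrative sketch only, not the actual QiCodePreviewView source; the frame, color and duration values are placeholders, and rectFrame is assumed to be the scan box's frame inside the view:

// Illustrative sketch: a scan line drawn with CAShapeLayer and animated with CABasicAnimation.
// `rectFrame` is assumed to be the scan box's frame inside the view (`self`).
CAShapeLayer *lineLayer = [CAShapeLayer layer];
lineLayer.frame = CGRectMake(CGRectGetMinX(rectFrame), CGRectGetMinY(rectFrame), CGRectGetWidth(rectFrame), 2.0);
lineLayer.path = [UIBezierPath bezierPathWithRect:lineLayer.bounds].CGPath;
lineLayer.fillColor = [UIColor greenColor].CGColor;
[self.layer addSublayer:lineLayer];

// Sweep the line from the top of the scan box to the bottom, repeating indefinitely.
CABasicAnimation *scanAnimation = [CABasicAnimation animationWithKeyPath:@"position.y"];
scanAnimation.fromValue = @(CGRectGetMinY(rectFrame));
scanAnimation.toValue = @(CGRectGetMaxY(rectFrame));
scanAnimation.duration = 2.0;
scanAnimation.repeatCount = HUGE_VALF;
[lineLayer addAnimation:scanAnimation forKey:@"scanLine"];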

Next, I will focus on the implementation process of the scan function in QiCodeManager.

1. Recognize (capture) the QR code/barcode

QiCodeManager is an encapsulation of AVCaptureSession and related classes in the AVFoundation framework, targeting iOS 7+. AVCaptureSession is the core class for capturing audio and video data in AVFoundation. Besides AVCaptureSession, we also need AVCaptureDevice, AVCaptureDeviceInput, AVCaptureMetadataOutput and AVCaptureVideoPreviewLayer. The core code is as follows:

// input
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:nil];

// output
AVCaptureMetadataOutput *output = [[AVCaptureMetadataOutput alloc] init];
[output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];

// session
_session = [[AVCaptureSession alloc] init];
_session.sessionPreset = AVCaptureSessionPresetHigh;
if ([_session canAddInput:input]) {
    [_session addInput:input];
}
if ([_session canAddOutput:output]) {
    [_session addOutput:output];
    // metadataObjectTypes can only be set after the output has been added to the session
    output.metadataObjectTypes = @[AVMetadataObjectTypeQRCode, AVMetadataObjectTypeCode128Code, AVMetadataObjectTypeEAN13Code];
}

// previewLayer
AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:_session];
previewLayer.frame = previewView.layer.bounds;
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[previewView.layer insertSublayer:previewLayer atIndex:0];
// AVCaptureMetadataOutputObjectsDelegate
- (void)captureOutput:(AVCaptureOutput *)output didOutputMetadataObjects:(NSArray<__kindof AVMetadataObject *> *)metadataObjects fromConnection:(AVCaptureConnection *)connection {
    
    AVMetadataMachineReadableCodeObject *code = metadataObjects.firstObject;
    if (code.stringValue) { }
}

To explain the code above by working backwards from the goal:
1. We need an AVCaptureVideoPreviewLayer instance, previewLayer, to display the live camera image while a QR code/barcode is being scanned;
2. However, previewLayer must be initialized with an AVCaptureSession instance, session, which coordinates data input and output;
3. So we initialize a session and set its output quality to AVCaptureSessionPresetHigh;
4. The session relies on AVCaptureDeviceInput and AVCaptureMetadataOutput to control data input and output;
5. We initialize an input with an AVCaptureDevice instance, device, specifying the AVMediaTypeVideo media type;
6. We initialize an output, set its delegate and queue, and specify the supported metadata object types (QR codes and barcodes of various formats);
7. We add the input and output to the session and call [session startRunning]; scanning is now possible;
8. Finally, we capture the QR code/barcode data in -captureOutput:didOutputMetadataObjects:fromConnection:.
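One detail the snippet above does not show is actually starting the session, and on iOS the camera can only be used after the user grants permission (NSCameraUsageDescription must also be present in Info.plist). The following is a minimal sketch of how that might look; the permission APIs are standard AVFoundation, but the placement of this code is my own illustration rather than QiCodeManager's actual implementation:

// Ask for camera permission before starting the session.
AVAuthorizationStatus status = [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo];
if (status == AVAuthorizationStatusAuthorized) {
    // -startRunning can block, so it is better called off the main thread.
    dispatch_async(dispatch_get_global_queue(QOS_CLASS_USER_INITIATED, 0), ^{
        [self->_session startRunning];
    });
} else if (status == AVAuthorizationStatusNotDetermined) {
    [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^(BOOL granted) {
        if (granted) {
            dispatch_async(dispatch_get_global_queue(QOS_CLASS_USER_INITIATED, 0), ^{
                [self->_session startRunning];
            });
        }
    }];
}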

At this point, QR codes/barcodes can be recognized anywhere within the previewLayer's area.

2. Specify the area for identifying the QR code/barcode

To restrict QR code/barcode recognition to a specified area of the previewLayer, modify the output's rectOfInterest property. The code is as follows:

// Calculate the rectOfInterest values
CGFloat y = rectFrame.origin.y;
CGFloat x = previewView.bounds.size.width - rectFrame.origin.x - rectFrame.size.width;
CGFloat h = rectFrame.size.height;
CGFloat w = rectFrame.size.width;

CGFloat rectY = y / previewView.bounds.size.height;
CGFloat rectX = x / previewView.bounds.size.width;
CGFloat rectH = h / previewView.bounds.size.height;
CGFloat rectW = w / previewView.bounds.size.width;

// Note the flipped order: (y, x, height, width)
output.rectOfInterest = CGRectMake(rectY, rectX, rectH, rectW);

1. CGRectMake(rectY, rectX, rectH, rectW) above differs from the conventional CGRectMake(x, y, w, h); rectOfInterest can be understood as a flipped CGRect;
2. rectY, rectX, rectH and rectW are not the absolute values of a control or region but their ratios relative to the previewView; the values of y, x, h and w in the formulas above can be checked against the figure below;
3. The default value of rectOfInterest is CGRectMake(.0, .0, 1.0, 1.0), meaning the recognition area covers the whole previewLayer.
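To make the ratio calculation concrete, here is a hypothetical helper (not part of QiQRCode; the method name and the example numbers are mine) that wraps the same math, with a worked example in the comments assuming a 375 x 667 previewView and a 300 x 300 scan box at origin (37.5, 150):

// Hypothetical helper wrapping the calculation above; names and example numbers are illustrative.
+ (CGRect)rectOfInterestForFrame:(CGRect)rectFrame inPreviewView:(UIView *)previewView {
    CGFloat y = CGRectGetMinY(rectFrame);
    CGFloat x = CGRectGetWidth(previewView.bounds) - CGRectGetMinX(rectFrame) - CGRectGetWidth(rectFrame);
    CGFloat h = CGRectGetHeight(rectFrame);
    CGFloat w = CGRectGetWidth(rectFrame);
    // Example: previewView 375x667, rectFrame (37.5, 150, 300, 300)
    //   y = 150, x = 375 - 37.5 - 300 = 37.5, h = 300, w = 300
    //   rectOfInterest = (150/667, 37.5/375, 300/667, 300/375) ≈ (0.225, 0.1, 0.45, 0.8)
    return CGRectMake(y / CGRectGetHeight(previewView.bounds),
                      x / CGRectGetWidth(previewView.bounds),
                      h / CGRectGetHeight(previewView.bounds),
                      w / CGRectGetWidth(previewView.bounds));
}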

PS: iOS actually provides an official API to convert a standard CGRect into a rectOfInterest, but it only takes effect after [session startRunning] and occasionally causes a brief flicker. The code is as follows:

// This line can be used after [session startRunning] to set the scanning area
metadataOutput.rectOfInterest = [previewLayer metadataOutputRectOfInterestForRect:rectFrame];
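A common way to deal with the "only after startRunning" limitation (my own workaround sketch, not something QiQRCode necessarily does) is to listen for AVCaptureSessionDidStartRunningNotification and set rectOfInterest there; removing the observer token when done is omitted here:

// Set rectOfInterest once the session reports that it has started running.
[[NSNotificationCenter defaultCenter] addObserverForName:AVCaptureSessionDidStartRunningNotification
                                                  object:_session
                                                   queue:[NSOperationQueue mainQueue]
                                              usingBlock:^(NSNotification *note) {
    metadataOutput.rectOfInterest = [previewLayer metadataOutputRectOfInterestForRect:rectFrame];
}];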

3. Zoom in on the QR code/barcode

Zooming in on the QR code/barcode is a nice touch when the code is far away from us, as shown below:

This effect is implemented with a two-finger pinch gesture; the code is as follows:

// Add the pinch gesture
UIPinchGestureRecognizer *pinchGesture = [[UIPinchGestureRecognizer alloc] initWithTarget:self action:@selector(pinch:)];
[previewView addGestureRecognizer:pinchGesture];
- (void)pinch:(UIPinchGestureRecognizer *)gesture {
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    CGFloat minZoomFactor = 1.0;
    CGFloat maxZoomFactor = device.activeFormat.videoMaxZoomFactor;
    if (@available(iOS 11.0, *)) {
        minZoomFactor = device.minAvailableVideoZoomFactor;
        maxZoomFactor = device.maxAvailableVideoZoomFactor;
    }

    static CGFloat lastZoomFactor = 1.0; // lastZoomFactor can also be an instance variable
    if (gesture.state == UIGestureRecognizerStateBegan) {
        // Record the previous zoom factor so this gesture's scale is applied on top of it
        lastZoomFactor = device.videoZoomFactor;
    } else if (gesture.state == UIGestureRecognizerStateChanged) {
        CGFloat zoomFactor = lastZoomFactor * gesture.scale;
        zoomFactor = fmaxf(fminf(zoomFactor, maxZoomFactor), minZoomFactor);
        [device lockForConfiguration:nil];   // Lock the device before modifying its properties
        device.videoZoomFactor = zoomFactor; // Modify the device's video zoom factor
        [device unlockForConfiguration];     // Unlock
    }
}

1. Add a pinchGesture to the previewView and set its target and selector;
2. In the selector method, adjust device.videoZoomFactor according to gesture.scale;
3. Lock the device before modifying its properties and unlock it afterwards.
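As a related option (not used in the code above), AVCaptureDevice also offers -rampToVideoZoomFactor:withRate: for a smooth, animated zoom instead of jumping straight to a new videoZoomFactor. A minimal sketch, with an illustrative target factor and rate:

// Smoothly ramp the zoom; the target factor should still be clamped to the device's min/max as above.
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if ([device lockForConfiguration:nil]) {
    [device rampToVideoZoomFactor:2.0 withRate:4.0]; // 2.0 and 4.0 are illustrative values
    [device unlockForConfiguration];
}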

4. Turn on the flashlight in low-light environments

A low-light environment significantly hurts scanning, so monitoring the ambient brightness and giving users the option to turn on the flashlight improves the experience a lot, as shown in the figure below:

The low-light monitoring code is as follows:

- (void)observeLightStatus:(void (^)(BOOL, BOOL))lightObserver {
    _lightObserver = lightObserver;

    AVCaptureVideoDataOutput *lightOutput = [[AVCaptureVideoDataOutput alloc] init];
    [lightOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
    if ([_session canAddOutput:lightOutput]) {
        [_session addOutput:lightOutput];
    }
}

// AVCaptureVideoDataOutputSampleBufferDelegate
- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {

    // Get the ambient brightness value from the sampleBuffer
    CFDictionaryRef metadataDicRef = CMCopyDictionaryOfAttachments(NULL, sampleBuffer, kCMAttachmentMode_ShouldPropagate);
    NSDictionary *metadataDic = (__bridge NSDictionary *)metadataDicRef;
    CFRelease(metadataDicRef);
    NSDictionary *exifDic = metadataDic[(__bridge NSString *)kCGImagePropertyExifDictionary];
    CGFloat brightness = [exifDic[(__bridge NSString *)kCGImagePropertyExifBrightnessValue] floatValue];

    // Prepare the variables used to decide whether to pass the light status through
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    BOOL torchOn = device.torchMode == AVCaptureTorchModeOn;
    BOOL dimmed = brightness < 1.0;
    static BOOL lastDimmed = NO;
    if (_lightObserver) {
        if (!_lightObserverHasCalled) {
            // First callback: always report the current status
            _lightObserver(dimmed, torchOn);
            _lightObserverHasCalled = YES;
            lastDimmed = dimmed;
        } else if (dimmed != lastDimmed) {
            // Subsequent callbacks: only report when the dimmed status changes
            _lightObserver(dimmed, torchOn);
            lastDimmed = dimmed;
        }
    }
}

Low-light monitoring relies on AVCaptureVideoDataOutput and AVCaptureVideoDataOutputSampleBufferDelegate:
1. After initializing lightOutput, an AVCaptureVideoDataOutput instance, set its delegate and add it to the session;
2. Implement the AVCaptureVideoDataOutputSampleBufferDelegate callback -captureOutput:didOutputSampleBuffer:fromConnection:;
3. In the callback, process the sampleBuffer (see the code above) to obtain the ambient brightness value;
4. Based on the brightness value, decide what counts as low light and whether to pass the status through to the business logic.
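For the snippets above to compile, QiCodeManager has to keep the observer block and a "has called" flag around. The following is a rough sketch of the private state they assume; the property names follow the code above, but the real declarations live in the QiQRCode source:

#import <AVFoundation/AVFoundation.h>
#import <ImageIO/ImageIO.h> // kCGImagePropertyExifDictionary / kCGImagePropertyExifBrightnessValue

// Private state assumed by the low-light monitoring code above (illustrative; see QiQRCode for the real declarations).
@interface QiCodeManager () <AVCaptureMetadataOutputObjectsDelegate, AVCaptureVideoDataOutputSampleBufferDelegate>

@property (nonatomic, strong) AVCaptureSession *session;
@property (nonatomic, copy) void (^lightObserver)(BOOL dimmed, BOOL torchOn);
@property (nonatomic, assign) BOOL lightObserverHasCalled;

@end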

Call -observeLightStatus: and implement the block to receive the light status and flashlight status, then adjust the UI accordingly. The code is as follows:

__weak typeof(self) weakSelf = self;
[self observeLightStatus:^(BOOL dimmed, BOOL torchOn) {
    if (dimmed || torchOn) { // The light turns dim, or the flashlight is on
        [weakSelf.previewView stopScanning];
        [weakSelf.previewView showTorchSwitch]; // Show the flashlight switch
    } else { // The light turns bright and the flashlight is off
        [weakSelf.previewView startScanning];
        [weakSelf.previewView hideTorchSwitch]; // Hide the flashlight switch
    }
}];

When the flashlight switch is shown, tapping it toggles the flashlight. The toggle code is as follows:

+ (void)switchTorch:(BOOL)on {
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureTorchMode torchMode = on ? AVCaptureTorchModeOn : AVCaptureTorchModeOff;
    if (device.hasFlash && device.hasTorch && torchMode != device.torchMode) {
        [device lockForConfiguration:nil]; // Lock the device before modifying its properties
        [device setTorchMode:torchMode];   // Modify the device's torch mode
        [device unlockForConfiguration];   // Unlock
    }
}

The flashlight switch (a button) is encapsulated in QiCodePreviewView. Through QiCodePreviewViewDelegate, QiCodeManager receives the switch's tap event in -codeScanningView:didClickedTorchSwitch: and handles the corresponding logic. The code is as follows:

// QiCodePreviewViewDelegate
- (void)codeScanningView:(QiCodePreviewView *)scanningView didClickedTorchSwitch:(UIButton *)switchButton {
    switchButton.selected = !switchButton.selected;
    [QiCodeManager switchTorch:switchButton.selected];
    _lightObserverHasCalled = switchButton.selected;
}

With that, the QR code/barcode scanning feature is complete. In addition, QiCodeManager also encapsulates methods for generating QR codes/barcodes, which will be covered in the next article.


QiQRCode is available from the QiShare open source library on GitHub.


QiShare (Jianshu) QiShare (Juejin) QiShare (Zhihu) QiShare (GitHub) QiShare (CocoaChina) QiShare (StackOverflow) QiShare (WeChat official account)
