Preface
Have you ever wondered whether it is really safe to unlock your phone with your face? If someone impersonates you with a photo or a video of you, can the phone tell that it is not really you in front of the camera? It can. Huawei HMS ML Kit liveness detection technology can accurately distinguish real faces from "fake faces". Whether the attack is a reprinted photo, a replayed video, or a face mask, liveness detection exposes the "fake face" immediately and leaves it nowhere to hide.
Application scenarios
Liveness detection is usually run before face comparison: it first makes sure that a real person is in front of the camera rather than someone holding a photo or wearing a mask, and only then checks whether the current face matches the enrolled face. Liveness detection has a wide range of application scenarios in daily life. For example, when unlocking a mobile phone, it prevents someone from impersonating the owner and leaking personal information.
Likewise, in financial services it can be used during real-name authentication: first confirm that the face in front of the camera is a live one, then compare it with the photo on the ID card to confirm that the person making the transaction is the person on the ID, effectively preventing impostors from causing property losses.
In addition, HMS ML Kit liveness detection supports silent liveness detection, which can decide whether a face is real without requiring any cooperation from the user. Convenient, isn't it? The rest of this post shows how to quickly integrate liveness detection.
Development practice
1. Development preparation
For detailed preparation steps, refer to the Huawei Developer Alliance documentation: developer.huawei.com/consumer/cn… The key development steps are listed below.
1.1 Configure the Maven repository address in the project-level build.gradle
buildscript {
    repositories {
        ...
        maven { url 'https://developer.huawei.com/repo/' }
    }
    dependencies {
        ...
        classpath 'com.huawei.agconnect:agcp:1.3.1.300'
    }
}
allprojects {
    repositories {
        ...
        maven { url 'https://developer.huawei.com/repo/' }
    }
}
1.2 Configure the SDK dependency in the app-level build.gradle
dependencies {
    // Liveness detection SDK.
    implementation 'com.huawei.hms:ml-computer-vision-livenessdetection:2.0.2.300'
}
1.3 Add the following configuration to the header of the app-level build.gradle file
apply plugin: 'com.android.application'
apply plugin: 'com.huawei.agconnect'
1.4 Add the following statement to the AndroidManifest.xml file so that the machine learning model is automatically updated to the device
<meta-data android:name="com.huawei.hms.ml.DEPENDENCY" android:value="livenessdetection" />
1.5 Apply for the camera permission
For the detailed steps of applying for the camera permission, refer to: developer.huawei.com/consumer/cn…
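As a minimal sketch of this step (using the standard Android runtime-permission APIs; the helper class name and request code below are illustrative, not part of the HMS SDK), declaring and requesting the camera permission could look like this:

// In AndroidManifest.xml, declare the permission outside the <application> element:
// <uses-permission android:name="android.permission.CAMERA" />

import android.Manifest;
import android.app.Activity;
import android.content.pm.PackageManager;
import androidx.core.app.ActivityCompat;
import androidx.core.content.ContextCompat;

public class PermissionHelper {
    // Illustrative request code; any app-unique int works.
    public static final int CAMERA_PERMISSION_REQUEST = 1001;

    // Returns true if the camera permission is already granted;
    // otherwise asks the user at runtime and returns false for now.
    public static boolean ensureCameraPermission(Activity activity) {
        if (ContextCompat.checkSelfPermission(activity, Manifest.permission.CAMERA)
                == PackageManager.PERMISSION_GRANTED) {
            return true;
        }
        ActivityCompat.requestPermissions(
                activity,
                new String[]{Manifest.permission.CAMERA},
                CAMERA_PERMISSION_REQUEST);
        return false;
    }
}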
2. Code development
2.1 Create a callback for the liveness detection result to obtain the detection result.
private MLLivenessCapture.Callback callback = new MLLivenessCapture.Callback() {
    @Override
    public void onSuccess(MLLivenessCaptureResult result) {
        // Detection completed; the result indicates whether the face is live or not.
    }

    @Override
    public void onFailure(int errorCode) {
        // Detection did not complete, for example due to a camera exception (CAMERA_ERROR).
        // Add failure handling logic here.
    }
};
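For illustration only, a version of the callback that actually consumes the result might look like the sketch below. It assumes the result exposes an isLive() flag and that the SDK classes live in the package shown in the imports; verify both against the official HMS ML Kit API reference before relying on them.

import android.util.Log;
import com.huawei.hms.mlsdk.livenessdetection.MLLivenessCapture;
import com.huawei.hms.mlsdk.livenessdetection.MLLivenessCaptureResult;

public class LivenessCallbackSample {
    // Illustrative log tag.
    private static final String TAG = "LivenessDemo";

    public final MLLivenessCapture.Callback callback = new MLLivenessCapture.Callback() {
        @Override
        public void onSuccess(MLLivenessCaptureResult result) {
            // isLive() is assumed from the SDK reference; confirm the accessor name in the docs.
            if (result.isLive()) {
                Log.i(TAG, "Live face detected; safe to continue with face comparison.");
            } else {
                Log.w(TAG, "Possible spoof: reprinted photo, video replay, or mask.");
            }
        }

        @Override
        public void onFailure(int errorCode) {
            // For example the CAMERA_ERROR code mentioned above, when the camera cannot be opened.
            Log.e(TAG, "Liveness detection did not complete, error code: " + errorCode);
        }
    };
}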
2.2 Create a liveness detection instance and start detection.
MLLivenessCapture capture = MLLivenessCapture.getInstance();
capture.startDetect(activity, callback);
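Putting the steps together, a rough activity sketch might wire up the permission check (step 1.5), the result callback (step 2.1), and the detection start (step 2.2) as follows. The layout, button id, and PermissionHelper class are illustrative assumptions carried over from the earlier sketches, not part of the SDK, and the import paths should again be checked against the official reference.

import android.os.Bundle;
import android.view.View;
import androidx.appcompat.app.AppCompatActivity;
import com.huawei.hms.mlsdk.livenessdetection.MLLivenessCapture;
import com.huawei.hms.mlsdk.livenessdetection.MLLivenessCaptureResult;

public class LivenessActivity extends AppCompatActivity {

    // The result callback from step 2.1; see that snippet for the onSuccess/onFailure bodies.
    private final MLLivenessCapture.Callback callback = new MLLivenessCapture.Callback() {
        @Override
        public void onSuccess(MLLivenessCaptureResult result) { /* see step 2.1 */ }

        @Override
        public void onFailure(int errorCode) { /* see step 2.1 */ }
    };

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_liveness); // illustrative layout

        // "btn_detect" is an illustrative button id, not something the SDK provides.
        findViewById(R.id.btn_detect).setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                // Only start detection once the camera permission has been granted (step 1.5).
                if (PermissionHelper.ensureCameraPermission(LivenessActivity.this)) {
                    MLLivenessCapture capture = MLLivenessCapture.getInstance();
                    capture.startDetect(LivenessActivity.this, callback);
                }
            }
        });
    }
}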
Demo effect
The following demo shows the detection results of liveness detection when a real face and a face mask are placed in front of the camera. Not a bad effect, is it?
GitHub source code
Github.com/HMS-Core/hm…
For more details about the development guide, see the official website of the Huawei Developer Alliance:
Developer.huawei.com/consumer/cn…
Original link: developer.huawei.com/consumer/cn… Author: Leave the leaves behind