1. Project background

The company needs us to build an app with a face clock-in (attendance punch) feature. Since we do not have strong native Android developers, we chose the cross-platform framework uni-app (most people already know it, so I will not introduce it again; see the uni-app official website). Its drawback is that many complex features cannot be implemented directly, for example the face recognition we want to build on the ArcSoft open platform. So what can we do? There is a way: integrate the ArcSoft SDK with native Android code and package it as a uni-app native plugin, which is the topic of this article. You can also visit the ArcSoft official website to learn more about what the ArcSoft open platform does. Why ArcSoft? I will give just one reason: it is free. Is that reason good enough? The following is the solution provided by the ArcSoft open platform:


Tip: this article is written for beginners. Do not be intimidated, but you do need some basic knowledge of native Android development: enough to understand introductory code, no need to be an expert.

2. Technology stack and SDK used in this article

– ArcSoft Face Recognition SDK V3.0

– Android

– Vue

– uni-app

3. Integration steps

1. Go to the ArcSoft console (you need to log in) and download the face recognition Demo.

Note that you need to create an application first, as shown in the figure below. The Demo is included in the SDK download.

2. Import the Demo into Android Studio. Here is what the Demo looks like:


Note: the project path imported into Android Studio must not contain Chinese characters.


3. If nothing goes wrong, running the project will bring up the following interface, and the ArcSoft Demo is up and running.

If something goes wrong, check the "Errors you may encounter" section of this article.

4. Next, run the uni-app Demo. First go to the uni-app official site and download the native plugin development Demo for the Android platform.

5. Import this Demo into Android Studio. Here is what it looks like:


Note: the project path imported into Android Studio must not contain Chinese characters.

6. Run the project. An "appkey not configured or configured incorrectly" error will appear. For the solution, please refer to the official guide on how to apply for an AppKey.


Note: solving this problem is a little more involved. Please read the official documentation carefully and do not doubt its accuracy.

// Generate the SHA1 during the AppKey application process:
// open CMD in the JDK bin directory (e.g. C:\Program Files\Java\jre1.8.0_291\bin) and run
keytool.exe -list -v -keystore <path to your keystore>

7. After obtaining the AppKey, write it into the meta-data of the AndroidManifest.xml file, and configure the certificate you used when applying for the AppKey into the project. Run the project again; if nothing goes wrong, the following interface will appear, and at this point the uni-app Demo is up and running.

8. The next step is to integrate the two Demos. First, right-click in the uni-app Demo project and create a new Module.

9. Select Android Library and fill in the properties on the right. Keep the Package name as consistent as possible with the one in the ArcSoft Demo; this will save you from fixing some unnecessary errors later.

10. Copy the following contents (including the folders) from the ArcSoft Demo to the same locations in the Module you just created:

libs 
java 
jniLibs
res

11. In the Module's build.gradle, delete the existing dependencies and add the following under dependencies:

compileOnly fileTree(dir: '../app/libs', include: ['uniapp-v8-release.aar'])
implementation 'com.alibaba:fastjson:1.1.46.android'
implementation 'com.squareup.okhttp3:okhttp:4.9.1'
implementation 'com.github.bumptech.glide:glide:4.9.0'
implementation 'io.reactivex.rxjava2:rxjava:2.2.6'
implementation 'io.reactivex.rxjava2:rxandroid:2.1.0'
compileOnly "com.android.support:recyclerview-v7:28.0.0"
compileOnly "com.android.support:support-v4:28.0.0"
compileOnly "com.android.support:appcompat-v7:28.0.0"
implementation 'com.android.support.constraint:constraint-layout:2.0.1'
testImplementation 'junit:junit:4.+'
androidTestImplementation 'com.android.support.test:runner:1.0.2'
androidTestImplementation 'com.android.support.test.espresso:espresso-core:3.0.2'

12. At this point all the preparations are in place. Next we need to create the following three files:

FaceReco_AppProxy.java // Initializes (checks) the dynamic link libraries
FaceReco.java // Activates the ArcSoft SDK
FaceRecoView.java // The face detection view
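These three files map onto the three ways a uni-app native plugin exposes itself in package.json (see step 23): FaceReco_AppProxy is registered as the hooksClass that runs when the app process starts, FaceReco is registered as a module that the JS side can call, and FaceRecoView is registered as a component that can be placed on a page. Minimal sketches of how each class is wired up are given after the corresponding code in the steps below.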

13. Find the FaceAttrPreviewActivity file in the ArcSoft Demo and copy the core face recognition code into the FaceRecoView file. The core code is as follows:

// Initialize the face engine
private void initEngine() {
    faceEngine = new FaceEngine();
    afCode = faceEngine.init(getContext(), DetectMode.ASF_DETECT_MODE_VIDEO, ConfigUtil.getFtOrient(getContext()),
            16, 20, FaceEngine.ASF_FACE_DETECT | FaceEngine.ASF_AGE | FaceEngine.ASF_FACE3DANGLE | FaceEngine.ASF_GENDER | FaceEngine.ASF_LIVENESS);
    Log.i(TAG, "initEngine: init: " + afCode);
    if (afCode != ErrorInfo.MOK) {
        System.out.println(R.string.init_failed + ":" + afCode);
    }
}

// Destroy the face engine
private void unInitEngine() {
    if (afCode == 0) {
        afCode = faceEngine.unInit();
        Log.i(TAG, "unInitEngine: " + afCode);
    }
}

/**
 * Initialize the camera
 */
private void initCamera() {
    DisplayMetrics metrics = new DisplayMetrics();
    Activity activity = (Activity) getContext();
    activity.getWindowManager().getDefaultDisplay().getMetrics(metrics);

    CameraListener cameraListener = new CameraListener() {
        @Override
        public void onCameraOpened(Camera camera, int cameraId, int displayOrientation, boolean isMirror) {
            Log.i(TAG, "onCameraOpened: " + cameraId + " " + displayOrientation + " " + isMirror);
            previewSize = camera.getParameters().getPreviewSize();
            drawHelper = new DrawHelper(previewSize.width, previewSize.height, previewView.getWidth(), previewView.getHeight(),
                    displayOrientation, cameraId, isMirror, false, false);
        }

        @Override
        public void onPreview(byte[] nv21, Camera camera) {
            if (faceRectView != null) {
                faceRectView.clearFaceInfo();
            }
            List<FaceInfo> faceInfoList = new ArrayList<>();
            long start = System.currentTimeMillis();
            int code = faceEngine.detectFaces(nv21, previewSize.width, previewSize.height, FaceEngine.CP_PAF_NV21, faceInfoList);
            if (code == ErrorInfo.MOK && faceInfoList.size() > 0) {
                code = faceEngine.process(nv21, previewSize.width, previewSize.height, FaceEngine.CP_PAF_NV21, faceInfoList, processMask);
                if (code != ErrorInfo.MOK) {
                    return;
                }
            } else {
                return;
            }

            List<AgeInfo> ageInfoList = new ArrayList<>();
            List<GenderInfo> genderInfoList = new ArrayList<>();
            List<Face3DAngle> face3DAngleList = new ArrayList<>();
            List<LivenessInfo> faceLivenessInfoList = new ArrayList<>();
            int ageCode = faceEngine.getAge(ageInfoList);
            int genderCode = faceEngine.getGender(genderInfoList);
            int face3DAngleCode = faceEngine.getFace3DAngle(face3DAngleList);
            int livenessCode = faceEngine.getLiveness(faceLivenessInfoList);

            // If any of the error codes is not ErrorInfo.MOK, return
            if ((ageCode | genderCode | face3DAngleCode | livenessCode) != ErrorInfo.MOK) {
                return;
            }
            System.out.println("Detection succeeded");
            if (faceRectView != null && drawHelper != null) {
                List<DrawInfo> drawInfoList = new ArrayList<>();
                for (int i = 0; i < faceInfoList.size(); i++) {
                    drawInfoList.add(new DrawInfo(drawHelper.adjustRect(faceInfoList.get(i).getRect()),
                            genderInfoList.get(i).getGender(), ageInfoList.get(i).getAge(),
                            faceLivenessInfoList.get(i).getLiveness(), RecognizeColor.COLOR_UNKNOWN, null));
                }
                drawHelper.draw(faceRectView, drawInfoList);
            }
        }

        @Override
        public void onCameraClosed() {
            Log.i(TAG, "onCameraClosed: ");
        }

        @Override
        public void onCameraError(Exception e) {
            Log.i(TAG, "onCameraError: " + e.getMessage());
        }

        @Override
        public void onCameraConfigurationChanged(int cameraID, int displayOrientation) {
            if (drawHelper != null) {
                drawHelper.setCameraDisplayOrientation(displayOrientation);
            }
            Log.i(TAG, "onCameraConfigurationChanged: " + cameraID + " " + displayOrientation);
        }
    };

    cameraHelper = new CameraHelper.Builder()
            .previewViewSize(new Point(previewView.getMeasuredWidth(), previewView.getMeasuredHeight()))
            .rotation(activity.getWindowManager().getDefaultDisplay().getRotation())
            .specificCameraId(rgbCameraId != null ? rgbCameraId : Camera.CameraInfo.CAMERA_FACING_FRONT)
            .isMirror(false)
            .previewOn(previewView)
            .cameraListener(cameraListener)
            .build();
    cameraHelper.init();
    cameraHelper.start();
}
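In the ArcSoft Demo this code lives in an Activity, which starts the engine and camera once the preview view has been laid out and releases them in onDestroy; inside FaceRecoView we need equivalent lifecycle hooks. Below is a minimal sketch of that wiring, assuming FaceRecoView follows the component template shipped with the uni-app plugin Demo. The override names (initComponentHostView, onActivityDestroy), the layout name, and the view ids are assumptions taken from that template and from the ArcSoft Demo, so compare them with the versions you downloaded.

// Sketch only: wiring the copied methods into FaceRecoView's lifecycle.
// FaceRecoView extends the component base class from the uni-app plugin Demo;
// the base class and constructor are omitted here, copy them from that template.

@Override
protected View initComponentHostView(Context context) {
    // Inflate the preview layout copied from the ArcSoft Demo
    // (hypothetical layout/id names, use whatever you copied into res/).
    View root = LayoutInflater.from(context).inflate(R.layout.view_face_reco, null);
    previewView = root.findViewById(R.id.texture_preview);
    faceRectView = root.findViewById(R.id.face_rect_view);
    // Start the engine and camera only after the preview view has been laid out,
    // because initCamera() reads previewView.getMeasuredWidth()/Height().
    previewView.getViewTreeObserver().addOnGlobalLayoutListener(() -> {
        if (faceEngine == null) {
            initEngine();
            initCamera();
        }
    });
    return root;
}

@Override
public void onActivityDestroy() {
    // Release the camera and the engine when the hosting page is destroyed,
    // mirroring onDestroy() in the ArcSoft Demo's Activity.
    if (cameraHelper != null) {
        cameraHelper.release();
        cameraHelper = null;
    }
    unInitEngine();
    super.onActivityDestroy();
}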

14. The face detection view is now integrated into uni-app, but of course it cannot be used yet. Why? Because the other two files are still missing: one to activate the SDK and one to initialize (check) the loading of the dynamic link libraries. These are the two most important steps.

15. First, write the dynamic-link-library initialization (check) code into the FaceReco_AppProxy file:

/**
 * Check whether the required dynamic link libraries can be found.
 *
 * @param libraries   the required dynamic link libraries
 * @param application the application
 * @return whether all dynamic link libraries exist
 */
private boolean checkSoFile(String[] libraries, Application application) {
    ApplicationInfo applicationInfo = application.getApplicationInfo();
    File dir = new File(applicationInfo.nativeLibraryDir);
    System.out.println("Native library path: " + dir.getAbsolutePath());
    File[] files = dir.listFiles();
    if (files == null || files.length == 0) {
        return false;
    }
    List<String> libraryNameList = new ArrayList<>();
    for (File file : files) {
        System.out.println("File name: " + file.getName());
        libraryNameList.add(file.getName());
    }
    boolean exists = true;
    for (String library : libraries) {
        exists &= libraryNameList.contains(library);
    }
    return exists;
}
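checkSoFile is only a helper; it needs to be called when the app process starts, which is what the hooksClass registered in package.json is for. Below is a minimal sketch of the rest of FaceReco_AppProxy, assuming the UniAppHookProxy interface from the uni-app plugin SDK (verify the import path against the Demo you downloaded) and using the .so names listed in the ArcSoft V3.0 Demo:

import android.app.Application;
import android.util.Log;

import io.dcloud.feature.uniapp.UniAppHookProxy; // assumed path, per the uni-app plugin SDK

// Sketch: FaceReco_AppProxy as the plugin's application hook.
public class FaceReco_AppProxy implements UniAppHookProxy {

    // Library names as used in the ArcSoft V3.0 Demo.
    private static final String[] LIBRARIES = new String[]{
            "libarcsoft_face_engine.so", "libarcsoft_face.so", "libarcsoft_image_util.so"
    };

    @Override
    public void onCreate(Application application) {
        // Log whether the .so files copied into jniLibs actually made it into the APK;
        // checkSoFile(...) is the helper shown above, which belongs in this same class.
        Log.i("FaceReco_AppProxy", "so files present: " + checkSoFile(LIBRARIES, application));
    }

    @Override
    public void onSubProcessCreate(Application application) {
        // Nothing to do in sub-processes.
    }

    // ... plus the checkSoFile(...) method from step 15 above ...
}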

16. Then write the SDK activation code into the FaceReco file:

/**
 * Activate the device
 */
private void active() {
    Observable.create(new ObservableOnSubscribe<Integer>() {
        @Override
        public void subscribe(ObservableEmitter<Integer> emitter) {
            RuntimeABI runtimeABI = FaceEngine.getRuntimeABI();
            Log.i(TAG, "subscribe: getRuntimeABI() " + runtimeABI);
            int activeCode = FaceEngine.activeOnline(mUniSDKInstance.getContext(), CommonUtil.getAppId(), CommonUtil.getSdkKey());
            emitter.onNext(activeCode);
        }
    })
            .subscribeOn(Schedulers.io())
            .observeOn(AndroidSchedulers.mainThread())
            .subscribe(new Observer<Integer>() {
                @Override
                public void onSubscribe(Disposable d) {
                }

                @Override
                public void onNext(Integer activeCode) {
                    if (activeCode == ErrorInfo.MOK) {
                        showToast(getString(R.string.active_success));
                        mJsCallback.invokeAndKeepAlive("activation succeeded");
                    } else if (activeCode == ErrorInfo.MERR_ASF_ALREADY_ACTIVATED) {
                        showToast(getString(R.string.already_activated));
                        mJsCallback.invokeAndKeepAlive("device already activated");
                    } else {
                        showToast(getString(R.string.active_failed) + ":" + activeCode);
                        mJsCallback.invokeAndKeepAlive("activation failed, error code: " + activeCode);
                    }

                    ActiveFileInfo activeFileInfo = new ActiveFileInfo();
                    int res = FaceEngine.getActiveFileInfo(mUniSDKInstance.getContext(), activeFileInfo);
                    if (res == ErrorInfo.MOK) {
                        Log.i(TAG, activeFileInfo.toString());
                    }
                }

                @Override
                public void onError(Throwable e) {
                    showToast(e.getMessage());
                }

                @Override
                public void onComplete() {
                }
            });
}
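For the page to be able to trigger activation, active() has to be exposed through a module method that also stores the JS callback used as mJsCallback above. Below is a minimal sketch, assuming FaceReco extends the UniModule class from the uni-app plugin SDK and that @UniJSMethod / UniJSCallback are the annotation and callback types it ships with (verify the import paths against your Demo); the exported method name activeEngine is just an illustrative choice:

import android.widget.Toast;

import io.dcloud.feature.uniapp.annotation.UniJSMethod;   // assumed paths, per the
import io.dcloud.feature.uniapp.bridge.UniJSCallback;     // uni-app plugin SDK
import io.dcloud.feature.uniapp.common.UniModule;

// Sketch: exposing activation to the JS side.
public class FaceReco extends UniModule {

    private static final String TAG = "FaceReco";
    private UniJSCallback mJsCallback;

    // Callable from the page via the module registered as "arc-faceReco".
    @UniJSMethod(uiThread = true)
    public void activeEngine(UniJSCallback callback) {
        this.mJsCallback = callback;
        active(); // the method shown above, which belongs in this same class
    }

    private void showToast(String msg) {
        // Helper used by active(); mUniSDKInstance is inherited from UniModule.
        Toast.makeText(mUniSDKInstance.getContext(), msg, Toast.LENGTH_SHORT).show();
    }

    // ... plus the active() method from step 16 above ...
}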

17. Replace the contents of the AndroidManifest.xml file with the following:

<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.arcsoft.arcfacedemo">

    <uses-permission android:name="android.permission.CAMERA" />
    <uses-permission android:name="android.permission.READ_PHONE_STATE" />
    <uses-permission android:name="android.permission.INTERNET" />
    <uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
    <uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
    <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
</manifest>

18. At this point, the plugin configuration is basically finished. Next, delete these two folders:

activity 
fragment

19. Introduce the plugin into the app project by adding the following to the app project's build.gradle:

implementation project(':arcfacedemo')

20. Run the project. No errors, everything looks perfect, as shown in the picture below. But wait, nothing has changed on screen. Why not? Let's move on!

21. What you just saw is still the uni-app main interface, because so far we have only finished the plugin part. The next step is to make uni-app call our plugin. Normally you would write a UI page first; I will not write the UI here, only explain how to call the plugin. Oh, and by the way, our plugin has not been packaged yet, so let's package it first.

22. Select Build->Rebuild Project in Android Studio.

23. How do you use it? Here is the package.json; with this, I hardly need to say more:

{
    "name": "ArcSoft SDK face detection",
    "id": "arc-face",
    "version": "1.0.0",
    "description": "Face detection plugin based on the ArcSoft SDK; the plugin is maintained long-term, questions welcome (QQ group: 785919513)",
    "_dp_type": "nativePlugin",
    "_dp_nativePlugin": {
        "android": {
            "plugins": [
                {
                    "type": "module",
                    "name": "arc-faceReco",
                    "class": "com.arcsoft.arcfacedemo.FaceReco"
                },
                {
                    "type": "component",
                    "name": "arc-faceRecoView",
                    "class": "com.arcsoft.arcfacedemo.FaceRecoView"
                }
            ],
            "hooksClass": "com.arcsoft.arcfacedemo.FaceReco_AppProxy",
            "integrateType": "aar",
            "abis": [
                "armeabi-v7a",
                "arm64-v8a"
            ],
            "minSdkVersion": 23
        }
    }
}
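On the uni-app side, the names registered above are what the page uses: the module can be obtained with uni.requireNativePlugin('arc-faceReco'), and the view component is written as an <arc-faceRecoView> tag in an .nvue page. The exact calling conventions are described in the uni-app native plugin documentation; as noted in step 21, a full UI example is out of scope here.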

24. That covers the whole process of building the plugin.

25. Finally, the source code is attached: source code link.

4. Errors you may encounter

What can I say! Most of the errors you encounter will be environment problems or business problems that have to be handled case by case, so let me just go over some of the problems I ran into while integrating.

1. Dynamic link library (.so file) not found

Solution: you forgot to copy the .so files; copy them into the Module's jniLibs directory as described in step 10.

2. (I forgot the exact error message; I will add it later.)

Solution: when creating the Module, select Android Library, not Phone & Tablet.

3. (I forgot the exact error message; I will add it later.)

Solution: make sure the project path contains no Chinese characters.

5. Conclusion

To learn more about face recognition products, visit the ArcSoft Vision open platform.