This article covers connecting a USB camera to an Android device for basic preview, photo capture, and video recording. I believe some readers have similar needs in their own work.
USB cameras are indeed widely used on Android devices. My previous company built in-vehicle products, and as we all know, more and more cameras are being installed in cars; the real-time information they capture helps us drive more safely. Besides the more common MIPI cameras, many of these are now USB cameras.
Beyond automotive products, fields such as security, medical devices, surveillance, and video applications in general also make heavy use of USB cameras.
Have you ever used or encountered a UVC camera before? Either way, I believe that after reading this article you will take away something new.
This article is organized around the following points:
1. What is UVC?
2. The UVCCamera open source project
3. Compiling and integrating the open source project
4. A small demo change: getting the real-time YUV stream while recording
5. Problems encountered and solutions
1. What is UVC?
UVC stands for USB Video Class, a protocol standard defined specifically for USB video capture devices. It was jointly drafted by Microsoft and several other device vendors and has become one of the USB.org standards.
Mainstream operating systems now ship with UVC device drivers, so hardware that conforms to the UVC specification can be used on the host without installing any extra driver. And yes, Android already supports UVC devices.
Summary: at this point you should have the idea that UVC is a protocol, and different devices may support different protocols. If we want to support a USB camera on an Android device, it has to be a UVC camera.
2. The UVCCamera open source project
Github.com/saki4510t/U…
If you search online for UVC-camera-related articles and projects today, it is no exaggeration to say that most of them are modifications of the open source project above. The project really is excellent: the classes are well encapsulated, the code logic is clear, and it is very convenient to use. The basic functions of camera preview, photo capture, and recording are already implemented, making it a fairly complete project.
Pull the code locally with git and import it into Android Studio. GitHub may be intermittently inaccessible in China; if so, you can also try searching for this project on Gitee.
The directory structure of the whole project is shown in the figure below. During the import you will hit some errors, mainly Gradle version problems; these import errors, along with the other problems I encountered and how to solve them, are explained in detail later in this article.
Besides the SDK library source, the author also provides the sample projects usbCameraTest0 through usbCameraTest8. Their functions are described as follows:
1) usbCameraTest0 shows how to start/stop the preview using a SurfaceView.
2) usbCameraTest shows how to start/stop the preview. This is almost the same as usbCameraTest0, but uses a custom TextureView instead of a SurfaceView to display the camera image.
3) usbCameraTest2 demonstrates how to record video from a UVC camera (no audio) to an .mp4 file using a MediaCodec encoder. This example requires API >= 18 because MediaMuxer only supports API >= 18.
4) usbCameraTest3 demonstrates how to record audio (from the internal microphone) and video (from a UVC camera) to an .mp4 file. It also shows several ways to capture still images. This example is probably best suited as the foundation for your own application.
5) usbCameraTest4 shows how to access the UVC camera and save the video images from a background service. This is one of the most complex examples, as it requires IPC via AIDL.
6) usbCameraTest5 is almost identical to usbCameraTest3, but uses the IFrameCallback interface to save the video images instead of the Surface input of the MediaCodec encoder. In most cases you should not use IFrameCallback to save images, because it is much slower than using a Surface. However, IFrameCallback is useful if you want to get the video frame data and process it yourself, or pass it as byte buffers to external libraries.
7) usbCameraTest6 shows how to split the video image onto multiple Surfaces; you can see the video images side by side in this app. It also shows how to render images using EGL. If you want to display video images with visual effects/filters applied, this example might be helpful.
8) usbCameraTest7 shows how to use two cameras and display the video images from each. This is still experimental and there may be some problems.
9) usbCameraTest8 shows how to set/get UVC controls. Currently only brightness and contrast are supported.
The code logic of these demos is very clear, and you can study whichever one matches your needs. They cover the basic functions of previewing, recording, and taking photos. As for brightness and contrast adjustment, the result may depend on the camera: when I verified it locally it did not actually work. If anyone gets it working later, please leave me a message so we can compare notes.
usbCameraTest7 is a two-camera demo; if you need multi-camera support, you can refer to its logic. I remember an internal pre-research project on Qualcomm's 8953 at my previous automotive job that hung six cameras off one board, and the driver colleagues spent a lot of time just bringing them up. Thinking about it now, had those been UVC cameras, a simple demo based on this open source project could have been put together very quickly.
3. Compiling and integrating the UVCCamera open source project
The core code of UVCCamera is in the libuvccamera module.
To integrate this project into our own project we need two things: the .so libraries, and Java SDK source code we can call.
From the screenshot above we can see clearly that the code mainly contains JNI and Java parts. Compiling the JNI code gives us the .so libraries we need; the Java code can either be packaged as an AAR, or copied wholesale into our project directory as a library module.
1) Compiling the .so libraries
Compiling the .so libraries is very convenient nowadays. As shown in the figure below, change into the jni directory and run ndk-build directly in the Terminal pane of Android Studio to generate the .so files we need.
One thing to note is whether we need the 32-bit or 64-bit libraries. This is controlled in Application.mk (I have circled its location in the screenshot above): for 32-bit, set APP_ABI to armeabi-v7a; for 64-bit, to arm64-v8a; and similarly for other platforms.
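As a sketch, the relevant line of Application.mk might look like this (treat the values as examples; the rest of the file varies by project version):

```makefile
# Application.mk (example): pick the ABIs to build.
# 32-bit ARM:
APP_ABI := armeabi-v7a
# Or, for 64-bit ARM, use instead:
# APP_ABI := arm64-v8a
# Multiple ABIs can also be listed, e.g.:
# APP_ABI := armeabi-v7a arm64-v8a
```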
2) Packaging the AAR
For our project to use this open source code, we need callable Java code. The approach I took is to package the core code of UVCCamera (i.e., without the demos) into AARs and then reference the packaged AARs from my own project. Packaging an AAR is easy and can be done entirely within Android Studio; see the picture first.
From the screenshot above we can see that two modules need to be packaged: libuvccamera and usbCameraCommon. Follow the sequence shown in the screenshot, from 1 to 3: first click Gradle on the right-hand side of the Android Studio window, then in the panel that opens, double-click assembleRelease. If no error is reported, you will find the .aar file under the module's build output path.
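Equivalently, the same packaging can be done from the command line (module names as in the project layout above; the exact output path may differ by Gradle version):

```
# Build release AARs for the two library modules
./gradlew :libuvccamera:assembleRelease :usbCameraCommon:assembleRelease
# The AARs typically end up under <module>/build/outputs/aar/
```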
Finally, copy the generated .so files and .aar files into the libs directory of our own project and import them for use.
3) Integrating the UVCCamera SDK into your own project
Through the steps above we have successfully produced the .so files and AAR files. The following figure shows how the generated files are imported into our own project.
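As a sketch, the app module's build.gradle might reference the copied files like this (the AAR file names such as `libuvccamera-release` are assumptions on my part; match them to the AARs you actually generated):

```groovy
android {
    sourceSets {
        main {
            // Pick up the copied .so files (libs/armeabi-v7a/*.so, etc.)
            jniLibs.srcDirs = ['libs']
        }
    }
}

repositories {
    // Let Gradle resolve local AARs from the libs directory
    flatDir { dirs 'libs' }
}

dependencies {
    implementation(name: 'libuvccamera-release', ext: 'aar')
    implementation(name: 'usbCameraCommon-release', ext: 'aar')
}
```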
4. A small demo change: getting the real-time YUV stream while recording
I have written an article about UVC cameras before; the address is below, together with the demo address, for anyone interested.
Besides the basic preview, photo, and recording functions, this demo adds an interface that returns the real-time YUV stream, according to my own needs. If you need the real-time video stream for something like face recognition or uploading to a backend, I believe it can help you.
Android Usb Camera in an Article
www.jianshu.com/p/35124f098…
The demo address:
Github.com/yorkZJC/Uvc…
In my demo, if you need to change the resolution, you can simply do it in the myConstants.java file shown below.
The following screenshot shows the interface for the YUV stream callback.
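The library delivers each frame to an IFrameCallback as a ByteBuffer. A small helper like the sketch below (the class and method names are mine, not part of the SDK) is handy for sizing and copying NV21 frames before handing them to face-recognition or upload code:

```java
import java.nio.ByteBuffer;

class YuvFrameUtil {
    // Expected byte count of one NV21 (YUV420SP) frame: a full-resolution
    // Y plane plus an interleaved VU plane at quarter resolution.
    static int nv21FrameSize(int width, int height) {
        return width * height + 2 * (width / 2) * (height / 2);
    }

    // Copy the callback's ByteBuffer into a byte[] so the frame can be
    // handed to another thread; the buffer itself is recycled after
    // onFrame returns, so it must not be kept.
    static byte[] copyFrame(ByteBuffer frame, byte[] reuse) {
        int n = frame.remaining();
        byte[] out = (reuse != null && reuse.length == n) ? reuse : new byte[n];
        frame.get(out, 0, n);
        return out;
    }
}
```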
5. Problems encountered and solutions
1) SDK and NDK configuration
In the first step, we need to configure the SDK and NKD, I believe many students can configure the SDK. In addition, we need to use NDK-build to compile the SO library, so the NDK must be configured well. My local NDK version is R17, I believe the NDK version is not very important.
There are two ways to configure NDK. You can modify the NDK directly in the local.properties file or select our local NKD path in the view interface and Project Structure. The following screenshots correspond to the two different modification methods. You can use either of them.
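For the first method, local.properties might look like this (the paths below are examples from my machine; substitute your own):

```properties
# local.properties -- machine-specific, do not commit to version control
sdk.dir=D\:\\APPS\\sdk
ndk.dir=D\:\\APPS\\sdk\\android-ndk-r17b
```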
2) Importing into Android Studio: Gradle version configuration
Here are the problems I encountered; follow my modifications and I believe everyone can get the project running successfully.
【 error 1 】
```
Caused by: org.apache.http.conn.HttpHostConnectException: Connect to maven.google.com:443 [maven.google.com/142.250.204.46] failed: Connection timed out: connect
```
【 error 2 】
```
ERROR: The minSdk version should not be declared in the android manifest file. You can move the version from the manifest to the defaultConfig in the build.gradle file.
Remove minSdkVersion and sync project
Affected Modules: libuvccamera
```
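The fix for this one is exactly what the message says: delete the minSdkVersion attribute from the module's AndroidManifest.xml and declare it in build.gradle instead. A sketch (the version numbers are examples; use whatever was removed from the manifest):

```groovy
android {
    defaultConfig {
        minSdkVersion 14   // example value; use the one removed from the manifest
        targetSdkVersion 27 // example value
    }
}
```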
【 error 3 】
```
* What went wrong:
Execution failed for task ':libuvccamera:ndkBuild'.
> A problem occurred starting process 'command 'null/ndk-build.cmd''
```
The "null" in the command path means the NDK location is not set; configure the NDK path as described in step 1) above.
【 error 4 】
```
Android NDK: The armeabi ABI is no longer supported. Use armeabi-v7a.
Android NDK: NDK Application 'local' targets unknown ABI(s): armeabi mips
D:/APPS/sdk/android-ndk-r17b/build/../build/core/setup-app.mk:79: *** Android NDK: Aborting . Stop.
```
This is the Application.mk ABI problem mentioned earlier: newer NDKs no longer support armeabi or mips, so set APP_ABI to armeabi-v7a or arm64-v8a.
【 error 5 】
```
2021-06-11 10:08:11.386 3105-3105/? E/AndroidRuntime: FATAL EXCEPTION: main
Process: com.serenegiant.usbcameratest0, PID: 3105
java.lang.RuntimeException: Unable to start activity ComponentInfo{com.serenegiant.usbcameratest0/com.serenegiant.usbcameratest0.MainActivity}: java.lang.IllegalStateException: You need to use a Theme.AppCompat theme (or descendant) with this activity.
    at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:3432)
```
I pulled the code locally via git, so every local change can be tracked with git. As for the compile errors, let's look at what I modified altogether.
In the screenshot above we can see five changes in total, including:
1. build.gradle in the project root directory;
2. libuvccamera/build.gradle;
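For error 1 (maven.google.com timing out), a common workaround in Mainland China is to point the repositories in the root build.gradle at a reachable mirror. The Aliyun URLs below are an assumption on my part, not something from the original project; any reachable mirror of google()/jcenter() works:

```groovy
buildscript {
    repositories {
        // Hypothetical mirror choice, reachable from Mainland China
        maven { url 'https://maven.aliyun.com/repository/google' }
        maven { url 'https://maven.aliyun.com/repository/public' }
    }
}
allprojects {
    repositories {
        maven { url 'https://maven.aliyun.com/repository/google' }
        maven { url 'https://maven.aliyun.com/repository/public' }
    }
}
```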
3) The application crashes and exits when the USB camera is unplugged
The original library has a bug: when the USB camera is unplugged while in use, the .so library crashes, causing the application to exit abnormally.
This problem has already been solved by others online; I will post the modification here, which I have also personally verified.
```diff
diff --git a/libuvccamera/src/main/jni/libusb/libusb/os/android_usbfs.c b/libuvccamera/src/main/jni/libusb/libusb/os/android_usbfs.c
index 8626595..c4842c4 100644
--- a/libuvccamera/src/main/jni/libusb/libusb/os/android_usbfs.c
+++ b/libuvccamera/src/main/jni/libusb/libusb/os/android_usbfs.c
@@ -2726,6 +2726,12 @@ static int handle_iso_completion(struct libusb_device_handle *handle,
 	// XXX add
 	usbi_mutex_lock(&itransfer->lock);
 	for (i = 0; i < num_urbs; i++) {
+		//+Add by york.zhou on 2021.05.19, fix issue app crash on remove usb device
+		if (tpriv->iso_urbs == NULL) {
+			break;
+		}
+		//-Add by york.zhou on 2021.05.19, fix issue app crash on remove usb device
 		if (urb == tpriv->iso_urbs[i]) {
 			urb_idx = i + 1;
 			break;
diff --git a/libuvccamera/src/main/jni/libuvc/src/stream.c b/libuvccamera/src/main/jni/libuvc/src/stream.c
index 8a1e90a..b7cedcc 100644
--- a/libuvccamera/src/main/jni/libuvc/src/stream.c
+++ b/libuvccamera/src/main/jni/libuvc/src/stream.c
@@ static void _uvc_delete_transfer(struct libusb_transfer *transfer) {
 		libusb_cancel_transfer(strmh->transfers[i]);
 		// XXX 20141112 add
 		UVC_DEBUG("freeing transfer %d (%p)", i, transfer);
 		free(transfer->buffer);
-		libusb_free_transfer(transfer);
+		//+Add by york.zhou on 2021.05.19, fix remove usb devices, app crash
+		//libusb_free_transfer(transfer);
 		strmh->transfers[i] = NULL;
 		break;
```
4) Some USB cameras are not recognized
Some readers may find that certain USB cameras are not recognized. One prerequisite: first make sure the camera works properly when plugged into a computer, and only fails on our device.
When this happens, capture a complete logcat log and search it globally for "subclass". Then configure the subclass information in device_filter.xml under the xml directory, in the format shown in the following screenshot.
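As a sketch, an added entry in device_filter.xml might look like the following (the class/subclass values here are placeholders, not real values; use the ones found in your own logcat, and match the attribute names used by the existing entries in the file):

```xml
<?xml version="1.0" encoding="utf-8"?>
<resources>
    <!-- Placeholder values: replace with the class/subclass reported in
         your logcat for the unrecognized camera -->
    <usb-device class="239" subclass="2" />
</resources>
```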
That's all for UVCCamera. Thank you for reading. You are also welcome to follow my WeChat official account, Xiaochi Notes, and communicate with me there.