Lifelong learning with you: this is Android Programmer.

This article covers some of the basics of Android camera development. Here’s what you’ll learn:

1. A brief introduction to the overall architecture of Android Camera
2. A brief introduction of the Camera App layer
3. A brief introduction of the Camera Framework layer
4. The Camera HAL3 subsystem
5. Key points still to be sorted out

I. A brief introduction to the overall architecture of Android Camera

Since Android 8.0, most devices have used the Camera API2 + HAL3 architecture. Let me first borrow a diagram from Google: read through the code and then look at this picture, and it really is very clear, simple, and to the point. Original: source.android.google.cn/devices/cam…

1.1 Android Camera basic layering

As can be seen from the figure above, the camera software in an Android phone has roughly four layers:

1. Application layer: application developers simply invoke the interfaces provided by AOSP, which are the common interfaces of Android camera applications. These interfaces operate and transfer data through Binder calls to the camera service in the Framework layer;

2. Framework layer: the camera framework service in frameworks/av/services/camera/libcameraservice/CameraService.cpp plays the essential role here; it interacts with the application above and with the HAL below;

3. HAL layer: the hardware abstraction layer. Android defines the protocol and interfaces for communication between the Framework services and the HAL layer; how the HAL is realized is up to each vendor, for example Qualcomm’s old MM-Camera architecture, the newer CamX architecture, and MTK’s HAL3 architecture since the MTK P series;

4. Driver layer: data is processed from the hardware into the Driver layer; the driver receives commands and data from the HAL layer and transfers sensor data up to the HAL. Of course, different sensor chips have different drivers.

As for why things are divided into so many layers: in general, to draw clear boundaries between components and to make upgrades easy (see any analysis of the Android O Treble architecture).

Android needs to adapt to the different camera configurations and sensors of each phone manufacturer. No matter how the chip changes, the Framework and the layers above it do not need to change, and apps can run smoothly on devices with all kinds of configurations. The HAL layer interfaces are likewise defined by Android; each vendor implements a HAL suitable for its own platform.

1.2 Android Camera general workflow

In the figure, the green boxes show the operations that application developers need to perform, the blue boxes show the APIs provided by AOSP, the yellow boxes show native Framework services, and the purple boxes show HAL layer services. The steps:

  1. MainActivity typically uses SurfaceView, SurfaceTexture + TextureView, or GLSurfaceView as the control for displaying the preview; all of them contain a Surface, a separate container for receiving camera data.

  2. In onCreate, MainActivity calls the API to ask the Framework native service CameraServer to connect to the HAL and then power on the camera sensor (a code sketch of steps 2 to 5 follows after this list).

  3. When openCamera succeeds, CameraServer notifies the App through a callback; in onOpened (or a similar callback) the App then starts something like startPreview. A CameraCaptureSession is created, and during its creation configureStream is called into CameraServer. The configureStream parameters contain references to the Surfaces from step 1, which amounts to handing the Surface containers to CameraServer. CameraServer wraps each Surface into a stream, passes it through HIDL to the HAL, and the HAL performs its configureStreams operation.

  4. When configureStream succeeds, CameraServer calls back to notify the App of the success, and the App then calls the AOSP setRepeatingRequest interface on CameraServer. CameraServer initializes an infinite-loop thread that waits to receive requests.

  5. CameraServer submits the request to the HAL for processing. After the HAL produces a result, the buffer holding the result of the request is filled into the container the App provided: for the setRepeatingRequest of a preview, the buffer is handed to the preview Surface; if it is a capture request, the Surface container of the ImageReader receives the buffer.

  6. A Surface is essentially the user and wrapper of a BufferQueue. When a Surface container that the App handed to CameraServer is filled with buffers, the BufferQueue mechanism notifies the application. At this point the App takes the contents out of its container and consumes them: content in the preview control’s Surface is handed by the View to SurfaceFlinger for composition and final display, namely the preview; if the Surface inside an ImageReader is filled, the App takes the buffer out and saves it as an image file.

  7. For video recording, see the reference: [Android][MediaRecorder] Android MediaRecorder framework concise overview.
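To make steps 2 to 5 concrete, here is a minimal sketch of the app-side flow using the public Camera2 API. Error handling, permission checks, and background-thread setup are omitted, and names like PreviewHelper and previewSurface are illustrative only, not part of the original article:

// Minimal sketch of steps 2 to 5 with the Camera2 API.
import android.content.Context;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraManager;
import android.hardware.camera2.CaptureRequest;
import android.os.Handler;
import android.view.Surface;
import java.util.Arrays;

public class PreviewHelper {
    public void startPreview(Context context, String cameraId,
                             Surface previewSurface, Handler handler) throws Exception {
        CameraManager manager =
                (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
        // Step 2: ask CameraServer (over Binder) to open the camera;
        // the HAL powers on the sensor and onOpened() fires when ready.
        manager.openCamera(cameraId, new CameraDevice.StateCallback() {
            @Override
            public void onOpened(CameraDevice camera) {
                try {
                    // Step 3: configureStream, the Surfaces listed here are
                    // wrapped into streams by CameraServer and sent to the HAL.
                    camera.createCaptureSession(Arrays.asList(previewSurface),
                            new CameraCaptureSession.StateCallback() {
                        @Override
                        public void onConfigured(CameraCaptureSession session) {
                            try {
                                // Steps 4 and 5: the repeating request drives the
                                // preview; CameraServer keeps submitting it to the
                                // HAL and fills result buffers into previewSurface.
                                CaptureRequest.Builder builder = camera
                                        .createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
                                builder.addTarget(previewSurface);
                                session.setRepeatingRequest(builder.build(), null, handler);
                            } catch (Exception e) {
                                e.printStackTrace();
                            }
                        }
                        @Override
                        public void onConfigureFailed(CameraCaptureSession session) {}
                    }, handler);
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }
            @Override
            public void onDisconnected(CameraDevice camera) { camera.close(); }
            @Override
            public void onError(CameraDevice camera, int error) { camera.close(); }
        }, handler);
    }
}

Once setRepeatingRequest returns, preview frames keep flowing into previewSurface until the session is stopped or closed, which is exactly the loop described in steps 5 and 6.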

Here’s another simple picture:

II. Brief introduction of the Camera App layer

The application layer is the focus of application developers. It mainly uses the application components provided by AOSP to build a camera application visible to users. The main interfaces and key points are covered in the Android developer documentation (guide/camera).

All that application-layer developers need to do is open the camera through the APIs provided by AOSP, set basic camera parameters, send request commands, and display the received data in the application interface or save it to storage.

Application developers don’t have to worry about how many cameras there are on the phone, what brand they are, how they are combined, or which camera is on or off in a particular mode. They use the interfaces provided by AOSP, which call through AIDL/Binder into the CameraServer process to fetch data from the Framework layer.
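As a small illustration of that abstraction, here is a hedged sketch: through CameraManager the App only ever sees opaque camera IDs and queryable characteristics, never vendor-specific details (the class name CameraLister is mine):

// Enumerate cameras purely through the AOSP-provided interfaces.
import android.content.Context;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraManager;
import android.util.Log;

public final class CameraLister {
    private CameraLister() {}

    public static void listCameras(Context context) throws Exception {
        CameraManager manager =
                (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
        for (String id : manager.getCameraIdList()) {
            CameraCharacteristics chars = manager.getCameraCharacteristics(id);
            // LENS_FACING_FRONT = 0, LENS_FACING_BACK = 1, LENS_FACING_EXTERNAL = 2
            Integer facing = chars.get(CameraCharacteristics.LENS_FACING);
            Log.d("CameraLister", "camera " + id + " facing " + facing);
        }
    }
}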

The basic process is as follows:

  1. openCamera: power on the sensor.
  2. configureStream: hand the Surface containers of controls such as GLSurfaceView and ImageReader over to CameraServer.
  3. Request: use setRepeatingRequest for preview and capture for taking a picture; essentially both set a request on CameraServer.
  4. CameraServer fills the buffer of each request into the corresponding Surface container; the BufferQueue then calls back to the corresponding Surface control’s callback handler, and the App displays the preview or saves the image from the data now present in the Surface.

The main pieces are a preview control and a photo-save control; a small sketch of the photo-save path follows. For video recording, see [Android][MediaRecorder] Android MediaRecorder framework concise overview.
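A hedged sketch of the photo-save path, assuming a JPEG ImageReader. The class name StillCaptureHelper and the output path are illustrative only; real code also needs storage permissions and null checks:

// The ImageReader's Surface is one of the containers handed over during
// configureStream; when CameraServer fills it, the listener fires and the
// App drains the buffer and saves the JPEG.
import android.graphics.ImageFormat;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CaptureRequest;
import android.media.Image;
import android.media.ImageReader;
import android.os.Handler;
import java.io.FileOutputStream;
import java.nio.ByteBuffer;

public class StillCaptureHelper {
    private final ImageReader reader;

    public StillCaptureHelper(int width, int height, Handler handler) {
        reader = ImageReader.newInstance(width, height, ImageFormat.JPEG, 2);
        reader.setOnImageAvailableListener(r -> {
            // BufferQueue signalled: take the buffer out of the container
            // and consume it, here by writing the JPEG bytes to a file.
            try (Image image = r.acquireNextImage();
                 FileOutputStream out = new FileOutputStream("/sdcard/DCIM/capture.jpg")) {
                ByteBuffer buf = image.getPlanes()[0].getBuffer();
                byte[] bytes = new byte[buf.remaining()];
                buf.get(bytes);
                out.write(bytes);
            } catch (Exception e) {
                e.printStackTrace();
            }
        }, handler);
    }

    // Targets the ImageReader's Surface, so the capture result lands there.
    public void takePicture(CameraDevice camera, CameraCaptureSession session,
                            Handler handler) throws Exception {
        CaptureRequest.Builder builder =
                camera.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
        builder.addTarget(reader.getSurface());
        session.capture(builder.build(), null, handler);
    }
}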

III. Brief introduction of the Camera Framework layer

The Camera Framework layer is the CameraServer service implementation. CameraServer is a native service whose code lives in frameworks/av/services/camera/libcameraservice/. CameraServer provides the AOSP interface services for applications and interacts directly with the HAL. Generally speaking, the probability of a problem lying in CameraServer is very low; most problems are in the App layer or the HAL layer. The main architecture of CameraServer, shown in the first figure, is mostly Android’s own business logic.

3.1 CameraServer Initialization

frameworks/av/camera/cameraserver/cameraserver.rc

# Started by init at boot as part of the "main" class.
service cameraserver /system/bin/cameraserver
    class main
    # Runs under the dedicated cameraserver uid, with the groups it needs.
    user cameraserver
    group audio camera input drmrpc
    # Real-time I/O priority; the pid is placed in the camera-daemon cpuset
    # and top-app stune group, and real-time scheduling is capped at 10.
    ioprio rt 4
    writepid /dev/cpuset/camera-daemon/tasks /dev/stune/top-app/tasks
    rlimit rtprio 10 10


CameraServer is started by init at boot.

The detailed process is as follows:

3.2 How the App invokes CameraServer operations

The simple process is as follows:

The detailed process is as follows:

3.2.1 openCamera

3.2.2 configureStream

3.2.3 preview and capture request

3.2.4 flush and close
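The sequence diagrams for these four flows are not reproduced here. As a rough app-side illustration of what the flush and close step triggers, a hedged sketch (the class name ShutdownHelper is mine):

// Each call below travels over Binder into CameraServer and down to the HAL.
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraDevice;

public final class ShutdownHelper {
    private ShutdownHelper() {}

    public static void stopAndClose(CameraCaptureSession session, CameraDevice camera) {
        try {
            session.stopRepeating();   // stop the repeating preview request loop
            session.abortCaptures();   // flush requests still in flight
        } catch (Exception e) {
            e.printStackTrace();
        }
        session.close();               // tear down the configured streams
        camera.close();                // release the device; the sensor powers down
    }
}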

IV. The Camera HAL3 subsystem

Android’s official explanation of the HAL subsystem: Android’s camera hardware abstraction layer (HAL) connects the higher-level camera framework APIs in android.hardware.camera2 to the underlying camera driver and hardware. Android 8.0 introduced Treble, which switched the Camera HAL API to a stable interface defined by the HAL Interface Description Language (HIDL). Borrowing another diagram:

1. The application sends requests to the camera subsystem. A request includes the resolution and pixel format; manual sensor, lens, and flash controls; 3A operating modes; RAW-to-YUV processing controls; and statistics generation. Multiple requests can be in flight at once, and submitting a request does not block. Requests are always processed in the order they are received.

2. As the figure shows, a request carries its data container, the Surface. The Surface is handed to the Framework CameraServer, wrapped into a Camera3OutputStream instance, packaged within a CameraCaptureSession as a HAL request, and handed over to the HAL. When the HAL has processed the data it sends it back to CameraServer (CaptureResult notifications to the Framework), which then takes the data from the HAL and puts it into the stream’s Surface container. These Surfaces come from the application-layer controls that wrap them, and that is how the App obtains data from the camera subsystem.

3. HAL3 implements event and data transfer based on CaptureRequest and CaptureResult; each request corresponds to exactly one result (see the sketch after this list).

4. Of course, these are the Android-native HAL3 definitions; the interface is simply put there, and each chip vendor implements it differently. The ones commonly encountered are Qualcomm’s MM-Camera and CamX and MediaTek’s MTKCam HAL3; their implementation flows will be sorted out in follow-up articles.
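To illustrate the one-request-one-result model from the app’s point of view, a small hedged sketch (ResultLogger is an illustrative name; pass an instance as the callback argument of setRepeatingRequest or capture):

// onCaptureCompleted delivers the TotalCaptureResult matched to the exact
// CaptureRequest that produced it: one request, one result.
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CaptureRequest;
import android.hardware.camera2.CaptureResult;
import android.hardware.camera2.TotalCaptureResult;
import android.util.Log;

public class ResultLogger extends CameraCaptureSession.CaptureCallback {
    @Override
    public void onCaptureCompleted(CameraCaptureSession session,
                                   CaptureRequest request,
                                   TotalCaptureResult result) {
        // Metadata such as the sensor timestamp rides along with the result.
        Long timestamp = result.get(CaptureResult.SENSOR_TIMESTAMP);
        Log.d("ResultLogger", "result for frame " + result.getFrameNumber()
                + " timestamp " + timestamp);
    }
}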

The HAL3 interface is defined at androidxref.com/9.0.0_r3/xr…

Key points still to be sorted out (ongoing):

  1. For now this article summarizes the Camera App and Framework code; the HAL layer will be covered separately for Qcom and MTK, with code walkthroughs and architecture summaries
  2. Event-driven (Request) and data (Buffer) transfer in the Android camera, buffer management, etc.
  3. Camera low-level modules, such as ISP, IPE, JPEG, etc.
  4. Android Camera development and debugging methods

Original link: blog.csdn.net/TaylorPotte…


That’s all for this article. If anything here is wrong, your suggestions and corrections are welcome. Meanwhile, I look forward to your follow. Thank you for reading!