Two ways to develop camera applications in Android

Android provides two ways to use the device camera to implement photo capture:
– One is to call the system camera component directly with an Intent. This approach is fast and convenient, and suits scenarios that simply need a photo, such as uploading a picture to an album, Weibo, or WeChat Moments.
– The other is to use the Camera API to build a custom camera. This approach suits scenarios that need a customized camera UI or special camera features, such as cropping, filters, stickers, emoticons, or location tags.
This article focuses on how to use the Camera API to build a custom camera.

1.1 Invoking the Camera Component with an Intent

If all your app needs is to launch a camera, take a picture, and get the result back, the practical advice is not to implement the photo module yourself. Use an Intent to call the system camera (or any third-party app that can take pictures) and retrieve the returned photo data. Create an Intent with one of two capture actions:

  • MediaStore.ACTION_IMAGE_CAPTURE: take a photo;
  • MediaStore.ACTION_VIDEO_CAPTURE: record a video;

Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);   // or MediaStore.ACTION_VIDEO_CAPTURE

The generic startActivityForResult()/onActivityResult() flow is not covered here; instead, let's look at the Intent extras used when taking photos.

First, set where the captured data should be written:

intent.putExtra(MediaStore.EXTRA_OUTPUT, your-store-uri);

The MediaStore.EXTRA_OUTPUT extra specifies where the photo/video should be stored after capture. You can use Android's default public directory for photos:

Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_PICTURES)
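
For example, a minimal sketch (assumed to live inside an Activity, e.g. in a click handler) of building an output file under that public Pictures directory and launching the capture; the file name is just an illustration:

    // Build an output file under the public Pictures directory.
    File picturesDir = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_PICTURES);
    File photoFile = new File(picturesDir, "IMG_" + System.currentTimeMillis() + ".jpg");

    // Launch the system camera and tell it where to write the full-size photo.
    Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
    intent.putExtra(MediaStore.EXTRA_OUTPUT, Uri.fromFile(photoFile));
    startActivityForResult(intent, 300);

The complete example activity below puts these pieces together: getting a thumbnail back, saving a full-size photo to a file, and picking an image from the album.
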
public class MainActivity extends Activity {

    private ImageView show_iv;

    private String mFilePath = "";

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        show_iv = (ImageView) findViewById(R.id.show_iv);
    }

    public void onClick(View view) {
        Intent intent = new Intent();
        // 1. Call the system camera directly; nothing is returned
        // intent.setAction("android.media.action.STILL_IMAGE_CAMERA");
        // startActivity(intent);

        // 2. Call the system camera with a result, but the result is only a thumbnail
        // intent.setAction(MediaStore.ACTION_IMAGE_CAPTURE);
        // startActivityForResult(intent, 100);

        // 3. Call the system camera and save the full-size photo to a file
        mFilePath = Environment.getExternalStorageDirectory().getAbsolutePath()
                + "/picture.png";
        intent.setAction(MediaStore.ACTION_IMAGE_CAPTURE);
        Uri uri = Uri.fromFile(new File(mFilePath));
        // Specify the output path
        intent.putExtra(MediaStore.EXTRA_OUTPUT, uri);
        startActivityForResult(intent, 300);

        // 4. Open the system album
        // intent.setAction(Intent.ACTION_PICK);
        // intent.setType("image/*");
        // startActivityForResult(intent, 500);
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        // Thumbnail returned in the Intent extras
        if (requestCode == 100) {
            if (data != null) {
                Bitmap bitmap = (Bitmap) data.getExtras().get("data");
                if (bitmap != null) {
                    show_iv.setImageBitmap(bitmap);
                }
            }
        }
        // Full-size photo read back from the file we specified
        if (requestCode == 300) {
            FileInputStream inputStream = null;
            try {
                inputStream = new FileInputStream(mFilePath);
                Bitmap bitmap = BitmapFactory.decodeStream(inputStream);
                if (bitmap != null) {
                    show_iv.setImageBitmap(rotateBitmap(bitmap));
                }
            } catch (FileNotFoundException e) {
                e.printStackTrace();
            } finally {
                if (inputStream != null) {
                    try {
                        inputStream.close();
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                }
            }
        }
        // Photo picked from the album
        if (requestCode == 500) {
            Uri uri = data.getData();
            try {
                Bitmap bitmap = BitmapFactory.decodeStream(
                        getContentResolver().openInputStream(uri));
                show_iv.setImageBitmap(bitmap);
            } catch (FileNotFoundException e) {
                e.printStackTrace();
            }
        }
    }

    // Some Samsung models rotate the photo 90 degrees, so rotate it back
    public Bitmap rotateBitmap(Bitmap bitmap) {
        Matrix matrix = new Matrix();
        matrix.postRotate(90);
        return Bitmap.createBitmap(bitmap, 0, 0, bitmap.getWidth(),
                bitmap.getHeight(), matrix, true);
    }
}

Notes:

On Android 6.0 and above the CAMERA and storage permissions also have to be requested at runtime, and on Android 7.0 and above passing a file:// Uri to the camera app (as Uri.fromFile() does in the code above) throws a FileUriExposedException, so a FileProvider-backed content:// Uri is needed.
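
For reference, here is a rough sketch of the Android 7.0+ adjustment for the EXTRA_OUTPUT Uri. The authority string "com.example.fileprovider" and the matching <provider> declaration in AndroidManifest.xml are assumptions made purely for illustration and must match your own project:

    // Sketch: on Android 7.0+ (API 24), wrap the output file in a content:// Uri
    // obtained from FileProvider instead of using Uri.fromFile() directly.
    // "com.example.fileprovider" is a placeholder authority.
    File photoFile = new File(mFilePath);
    Uri outputUri;
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.N) {
        outputUri = FileProvider.getUriForFile(this, "com.example.fileprovider", photoFile);
    } else {
        outputUri = Uri.fromFile(photoFile);
    }
    intent.putExtra(MediaStore.EXTRA_OUTPUT, outputUri);
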

1.2 Use camera API for camera development

1.2.1 Camera API key class analysis

Camera: the primary class for managing and operating camera resources. It provides a complete camera interface: switching between cameras, setting preview and picture sizes, configuring aperture, exposure, focus and other related parameters, and obtaining preview and capture frame data. Its main methods are as follows:

  • open(): gets a Camera instance.
    /**
     * Creates a new Camera object to access a particular hardware camera. If
     * the same camera is opened by other applications, this will throw a
     * RuntimeException.
     *
     * <p>You must call {@link #release()} when you are done using the camera,
     * otherwise it will remain locked and unavailable to other applications.
     *
     * <p>Your application should only have one Camera object active at a time
     * for a particular hardware camera.
     *
     * <p>Callbacks from other methods are delivered to the event loop of the
     * thread which called open(). If this thread has no event loop, then
     * callbacks are delivered to the main application event loop. If there is
     * no main application event loop, callbacks are not delivered.
     *
     * <p class="caution"><b>Caution:</b> On some devices, this method may take
     * a long time to complete. It is best to call this method from a worker
     * thread (possibly using {@link android.os.AsyncTask}) to avoid blocking
     * the main application UI thread.
     *
     * @param cameraId the hardware camera to access, between 0 and
     *     {@link #getNumberOfCameras()}-1.
     * @return a new Camera object, connected, locked and ready for use.
     *     A RuntimeException is thrown if opening the camera fails (for
     *     example, if the camera is in use by another process).
     */
    public static Camera open(int cameraId) {
        return new Camera(cameraId);
    }

    /**
     * Creates a new Camera object to access the first back-facing camera on
     * the device. If the device does not have a back-facing camera, this
     * returns null.
     * @see #open(int)
     */
    public static Camera open() {
        int numberOfCameras = getNumberOfCameras();
        CameraInfo cameraInfo = new CameraInfo();
        for (int i = 0; i < numberOfCameras; i++) {
            getCameraInfo(i, cameraInfo);
            if (cameraInfo.facing == CameraInfo.CAMERA_FACING_BACK) {
                return new Camera(i);
            }
        }
        return null;
    }
  • setPreviewDisplay(SurfaceHolder): binds the surface on which the preview image is drawn. A Surface is a handle to the raw buffer of a screen window; through it you can obtain the window's canvas and draw a view onto the screen. The Camera is connected to the Surface through a SurfaceHolder; once connected, the preview frames produced by the camera can be displayed on screen through the Surface.
    /**
     * Sets the {@link Surface} to be used for live preview.
     * Either a surface or surface texture is necessary for preview, and
     * preview is necessary to take pictures. The same surface can be re-set
     * without harm. Setting a preview surface will un-set any preview surface
     * texture that was set via {@link #setPreviewTexture}.
     *
     * <p>The {@link SurfaceHolder} must already contain a surface when this
     * method is called. If you are using {@link android.view.SurfaceView},
     * you will need to register a {@link SurfaceHolder.Callback} with
     * {@link SurfaceHolder#addCallback(SurfaceHolder.Callback)} and wait for
     * {@link SurfaceHolder.Callback#surfaceCreated(SurfaceHolder)} before
     * calling setPreviewDisplay() or starting preview.
     *
     * <p>This method must be called before {@link #startPreview()}. The one
     * exception is that if the preview surface is not set (or set to null)
     * before startPreview() is called, then this method may be called once
     * with a non-null parameter to set the preview surface. (This allows
     * camera setup and surface creation to happen in parallel, saving time.)
     * The preview surface may not otherwise change while preview is running.
     *
     * @param holder containing the Surface on which to place the preview,
     *     or null to remove the preview surface. Throws IOException if the
     *     method fails (for example, if the surface is unavailable or
     *     unsuitable).
     */
    public final void setPreviewDisplay(SurfaceHolder holder) throws IOException {
        if (holder != null) {
            setPreviewSurface(holder.getSurface());
        } else {
            setPreviewSurface((Surface)null);
        }
    }
  • setParameters(Parameters): sets camera parameters, including front/rear camera selection, flash mode, focus mode, preview and picture size, and so on.
    /**
     * Changes the settings for this Camera service.
     *
     * @param params the Parameters to use for this Camera service.
     *     A RuntimeException is thrown if any parameter is invalid or not supported.
     * @see #getParameters()
     */
    public void setParameters(Parameters params) {
        // If using preview allocations, don't allow preview size changes
        if (mUsingPreviewAllocation) {
            Size newPreviewSize = params.getPreviewSize();
            Size currentPreviewSize = getParameters().getPreviewSize();
            if (newPreviewSize.width != currentPreviewSize.width
                    || newPreviewSize.height != currentPreviewSize.height) {
                throw new IllegalStateException("Cannot change preview size" +
                        " while a preview allocation is configured.");
            }
        }

        native_setParameters(params.flatten());
    }
  • startPreview(): starts the preview, drawing the preview frames from the camera hardware onto the bound surface.

    /**
     * Starts capturing and drawing preview frames to the screen.
     * Preview will not actually start until a surface is supplied with
     * {@link #setPreviewDisplay(SurfaceHolder)} or
     * {@link #setPreviewTexture(SurfaceTexture)}.
     *
     * <p>If {@link #setPreviewCallback(Camera.PreviewCallback)},
     * {@link #setOneShotPreviewCallback(Camera.PreviewCallback)}, or
     * {@link #setPreviewCallbackWithBuffer(Camera.PreviewCallback)} were
     * called, {@link Camera.PreviewCallback#onPreviewFrame(byte[], Camera)}
     * will be called when preview data becomes available.
     */
    public native final void startPreview();
  • takePicture(Camera.ShutterCallback shutter, Camera.PictureCallback raw, Camera.PictureCallback jpeg): takes a photo. It accepts three callback parameters: shutter is invoked when the shutter fires, raw delivers the raw photo data, and jpeg delivers the photo data compressed into JPG format.
    /**
     * Triggers an asynchronous image capture. The camera service will
     * initiate a series of callbacks to the application as the image capture
     * progresses. The shutter callback occurs after the image is captured.
     * This can be used to trigger a sound to let the user know that the image
     * has been captured. The raw callback occurs when the raw image data is
     * available (NOTE: the data will be null if there is no raw image
     * callback buffer available or the buffer is not large enough to hold
     * the raw image). The postview callback occurs when a scaled, fully
     * processed postview image is available (NOTE: not all hardware supports
     * this). The jpeg callback occurs when the compressed image is available.
     * If the application does not need a particular callback, a null can be
     * passed instead of a callback method.
     *
     * <p>This method is only valid when preview is active (after
     * {@link #startPreview()}). Preview will be stopped after the image is
     * taken; callers must call {@link #startPreview()} again if they want to
     * re-start preview or take more pictures. This should not be called
     * between {@link android.media.MediaRecorder#start()} and
     * {@link android.media.MediaRecorder#stop()}.
     *
     * <p>After calling this method, you must not call {@link #startPreview()}
     * or take another picture until the JPEG callback has returned.
     *
     * @param shutter  the callback for the image capture moment, or null
     * @param raw      the callback for raw (uncompressed) image data, or null
     * @param postview the callback with postview image data, may be null
     * @param jpeg     the callback for JPEG image data, or null
     */
    public final void takePicture(ShutterCallback shutter, PictureCallback raw,
            PictureCallback postview, PictureCallback jpeg) {
        mShutterCallback = shutter;
        mRawImageCallback = raw;
        mPostviewCallback = postview;
        mJpegCallback = jpeg;

        // If callback is not set, do not send me callbacks.
        int msgType = 0;
        if (mShutterCallback != null) {
            msgType |= CAMERA_MSG_SHUTTER;
        }
        if (mRawImageCallback != null) {
            msgType |= CAMERA_MSG_RAW_IMAGE;
        }
        if (mPostviewCallback != null) {
            msgType |= CAMERA_MSG_POSTVIEW_FRAME;
        }
        if (mJpegCallback != null) {
            msgType |= CAMERA_MSG_COMPRESSED_IMAGE;
        }

        native_takePicture(msgType);
        mFaceDetectionRunning = false;
    }
  • stopPreview(): stops the preview, shutting down the camera's underlying frame delivery and the drawing on the surface.
    /**
     * Stops capturing and drawing preview frames to the surface, and
     * resets the camera for a future call to {@link #startPreview()}.
     */
    public final void stopPreview() {
        _stopPreview();
        mFaceDetectionRunning = false;

        mShutterCallback = null;
        mRawImageCallback = null;
        mPostviewCallback = null;
        mJpegCallback = null;
        synchronized (mAutoFocusCallbackLock) {
            mAutoFocusCallback = null;
        }
        mAutoFocusMoveCallback = null;
    }

    private native final void _stopPreview();
  • release(): releases the Camera instance.

    /**
     * Disconnects and releases the Camera object resources.
     *
     * <p>You must call this as soon as you're done with the Camera object.
     */
    public final void release() {
        native_release();
        mFaceDetectionRunning = false;
    }

    private native final void native_release();

SurfaceView: I’ve written about this before

  • SurfaceView: the class used to draw the camera preview image and give the user a real-time preview. Ordinary views and their subclasses all share one surface, and all drawing must happen on the UI thread. SurfaceView is a special view: it does not share a surface with ordinary views but holds an independent surface internally, and it manages that surface's format, size and display position. Because the UI thread also handles other interaction logic, view updates there cannot guarantee speed or frame rate, whereas a SurfaceView, holding its own surface, can be drawn from a separate thread and therefore achieve a higher frame rate. A custom camera preview needs fast updates and a high frame rate, so SurfaceView is a good fit.

  • SurfaceHolder: an abstract interface for controlling the surface. It can control the surface's size and format, modify the surface's pixels, and monitor changes to the surface. Its typical use is inside SurfaceView: a SurfaceView obtains a SurfaceHolder instance via getHolder(), and the SurfaceHolder manages and listens to the state of the surface.

  • The SurfaceHolder.Callback interface is responsible for listening to surface state changes and has three methods (a minimal skeleton follows the list):

    • surfaceCreated(SurfaceHolder holder): called immediately after the surface is created. When developing a custom camera, you can override this method to call Camera.open() and Camera.setPreviewDisplay(), acquiring the camera resource and connecting the camera to the surface.
    • surfaceChanged(SurfaceHolder holder, int format, int width, int height): called when the surface format or size changes. When developing a custom camera, you can override this method to start the preview with Camera.startPreview(), so that the camera's preview frames are delivered to the surface and the preview image is displayed in real time.
    • surfaceDestroyed(SurfaceHolder holder): called just before the surface is destroyed. When developing a custom camera, you can override this method to stop the preview and free the camera resource by calling Camera.stopPreview() and Camera.release().
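
Putting the three callbacks together, a minimal skeleton of such a preview view might look like the sketch below (simplified for illustration; error handling and parameter configuration are mostly omitted):

    // Minimal skeleton tying the camera lifecycle to SurfaceHolder.Callback.
    public class CameraPreview extends SurfaceView implements SurfaceHolder.Callback {
        private Camera mCamera;

        public CameraPreview(Context context) {
            super(context);
            getHolder().addCallback(this);
        }

        @Override
        public void surfaceCreated(SurfaceHolder holder) {
            try {
                mCamera = Camera.open();                // acquire the camera
                mCamera.setPreviewDisplay(holder);      // bind it to this surface
            } catch (IOException e) {
                e.printStackTrace();
            }
        }

        @Override
        public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
            mCamera.startPreview();                     // start (or restart) drawing preview frames
        }

        @Override
        public void surfaceDestroyed(SurfaceHolder holder) {
            mCamera.stopPreview();                      // stop the preview
            mCamera.release();                          // free the camera for other apps
            mCamera = null;
        }
    }
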

1.2.2 Camera development process

Declare camera permissions in AndroidManifest.xml

The first step in development is to declare the camera permission in the AndroidManifest.xml file:

<uses-permission android:name="android.permission.CAMERA" />

Forgetting to declare this permission is a common mistake, and the app may crash at runtime. You should also add the following two feature declarations:

<uses-feature android:name="android.hardware.camera" android:required="true"/>
<uses-feature android:name="android.hardware.camera.autofocus" android:required="false"/>

The required attribute indicates whether the feature is mandatory. The configuration above requires the device to have a camera but can do without autofocus.

These two declarations are optional and are used by app stores (such as Google Play) to filter out devices that do not have a camera or do not support autofocus.

If you want to save photos to external storage, you also need to declare the storage write permission:

<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />

2. Turn on the camera

Consumer devices such as phones and tablets now ship with two cameras as standard, and some have more; the Huawei P9, for example, has dual rear cameras. To be honest, I am curious what developing a dual-camera app would be like. Before opening a camera device, find out how many cameras the current device has; if your requirements include switching between front and rear cameras, check the camera count to confirm that the device actually has both.

int cameras = Camera.getNumberOfCameras();


This call returns the number of cameras as a non-negative integer; valid camera IDs run from 0 to cameras - 1. For example, on a phone with front and rear cameras, a return value of 2 means the first camera has cameraId 0, which usually corresponds to the main rear camera, and the second camera has cameraId 1, which usually corresponds to the front-facing selfie camera.
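
If you would rather not rely on that convention, you can query each camera's facing explicitly with Camera.getCameraInfo(). A small sketch of such a helper (the method name findCameraId is arbitrary):

    // Returns the id of the first camera with the requested facing
    // (Camera.CameraInfo.CAMERA_FACING_BACK or CAMERA_FACING_FRONT), or -1 if none.
    public static int findCameraId(int facing) {
        int numberOfCameras = Camera.getNumberOfCameras();
        Camera.CameraInfo info = new Camera.CameraInfo();
        for (int cameraId = 0; cameraId < numberOfCameras; cameraId++) {
            Camera.getCameraInfo(cameraId, info);
            if (info.facing == facing) {
                return cameraId;
            }
        }
        return -1;
    }
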

The camera is a hardware device resource and must be opened before it can be used, via Camera.open(cameraId). Refer to the following code:

public static Camera openCamera(int cameraId) {
    try {
        return Camera.open(cameraId);
    } catch (Exception e) {
        return null;
    }
}

Note

Opening the camera device may fail, so always check whether the open call succeeded. There are two common causes of failure: first, the device the app is installed on has no camera at all, such as some tablets or special-purpose Android devices; second, the camera with that cameraId is already in use, perhaps by another app recording video in the background.

3. Set camera parameters

After opening the camera device, you get a Camera object and exclusive access to the camera resource. The Camera.getParameters() call returns the current default configuration parameters of the camera. Here are a few of the more commonly used ones:

Flash mode. The current camera's flash configuration can be read with Parameters.getFlashMode():

  • Camera.Parameters.FLASH_MODE_AUTO: automatic mode, the flash fires automatically in low light;
  • Camera.Parameters.FLASH_MODE_OFF: flash off;
  • Camera.Parameters.FLASH_MODE_ON: flash always on;
  • Camera.Parameters.FLASH_MODE_RED_EYE: red-eye reduction mode.

Focus mode. It can be read with Parameters.getFocusMode():

  • Camera.Parameters.FOCUS_MODE_AUTO: auto focus, the go-to mode for beginners;
  • Camera.Parameters.FOCUS_MODE_FIXED: fixed focus, the seasoned shooter's mode;
  • Camera.Parameters.FOCUS_MODE_EDOF: extended depth of field, a favorite of artsy photographers;
  • Camera.Parameters.FOCUS_MODE_INFINITY: infinity focus, for shooting large landscapes;
  • Camera.Parameters.FOCUS_MODE_MACRO: macro focus, for close-ups of flowers, grass and ants;

Scene mode. It can be read with Parameters.getSceneMode(); a combined sketch of applying these parameters follows the list below:

  • Camera.Parameters.SCENE_MODE_BARCODE: barcode scanning; the NextQRCode project detects and sets this scene;
  • Camera.Parameters.SCENE_MODE_ACTION: for photographing fast-moving subjects such as athletes or cars;
  • Camera.Parameters.SCENE_MODE_AUTO: let the camera pick the scene automatically;
  • Camera.Parameters.SCENE_MODE_HDR: high dynamic range, often used for scenes with both bright and dark areas such as sunset clouds;
  • Camera.Parameters.SCENE_MODE_NIGHT: night scenes;
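
As a combined sketch, here is one way to apply these parameters to an opened Camera instance (named camera here). The supported-value checks are there because not every device supports every mode, and the changes only take effect once they are written back with setParameters():

    Camera.Parameters params = camera.getParameters();

    // Flash: skip silently if the mode is not supported on this device.
    List<String> flashModes = params.getSupportedFlashModes();
    if (flashModes != null && flashModes.contains(Camera.Parameters.FLASH_MODE_AUTO)) {
        params.setFlashMode(Camera.Parameters.FLASH_MODE_AUTO);
    }

    // Focus mode.
    List<String> focusModes = params.getSupportedFocusModes();
    if (focusModes != null && focusModes.contains(Camera.Parameters.FOCUS_MODE_AUTO)) {
        params.setFocusMode(Camera.Parameters.FOCUS_MODE_AUTO);
    }

    // Scene mode, e.g. barcode scanning as used by NextQRCode.
    List<String> sceneModes = params.getSupportedSceneModes();
    if (sceneModes != null && sceneModes.contains(Camera.Parameters.SCENE_MODE_BARCODE)) {
        params.setSceneMode(Camera.Parameters.SCENE_MODE_BARCODE);
    }

    // Write the changes back; without this call nothing takes effect.
    camera.setParameters(params);
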

4. Set the camera preview direction

The camera preview must be set to the correct display orientation, otherwise the preview image will appear rotated or distorted. If you need to know the device's current screen orientation, you can read it from Resources.getConfiguration().orientation; Android has two screen orientations, portrait and landscape, whose values are ORIENTATION_PORTRAIT and ORIENTATION_LANDSCAPE respectively. The preview orientation is set with Camera.setDisplayOrientation(int), whose parameter is an angle in degrees and must be one of 0, 90, 180 or 270; the default is 0, which corresponds to the camera sensor's natural landscape orientation. Remember that only these four values are accepted; passing any other angle is an error.

If you want the preview to follow the device orientation, so that the top of the preview image always points up:

public static void followScreenOrientation(Context context, Camera camera) {
    final int orientation = context.getResources().getConfiguration().orientation;
    if (orientation == Configuration.ORIENTATION_LANDSCAPE) {
        camera.setDisplayOrientation(180);
    } else if (orientation == Configuration.ORIENTATION_PORTRAIT) {
        camera.setDisplayOrientation(90);
    }
}
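
Note that this simple mapping ignores the camera sensor's own mounting orientation and the mirroring of front-facing cameras. A more complete variant, adapted from the approach described in the Camera.setDisplayOrientation() reference documentation, looks roughly like this:

    public static void setCameraDisplayOrientation(Activity activity, int cameraId, Camera camera) {
        Camera.CameraInfo info = new Camera.CameraInfo();
        Camera.getCameraInfo(cameraId, info);
        int rotation = activity.getWindowManager().getDefaultDisplay().getRotation();
        int degrees = 0;
        switch (rotation) {
            case Surface.ROTATION_0:   degrees = 0;   break;
            case Surface.ROTATION_90:  degrees = 90;  break;
            case Surface.ROTATION_180: degrees = 180; break;
            case Surface.ROTATION_270: degrees = 270; break;
        }
        int result;
        if (info.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
            result = (info.orientation + degrees) % 360;
            result = (360 - result) % 360;   // compensate for the front-camera mirror
        } else {
            result = (info.orientation - degrees + 360) % 360;
        }
        camera.setDisplayOrientation(result);
    }
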

5. Preview View and take photos

We usually use a SurfaceView as the camera preview view, but a SurfaceTexture (via setPreviewTexture()) can also be used. Get the SurfaceHolder from the SurfaceView and bind the preview with setPreviewDisplay(). Once the preview view is set up, keep the following points in mind:

  • Call the startPreview() method to launch the preview, otherwise the preview View will display nothing;
  • The photo operation needs to be called after the startPreview() method is executed;
  • After each photo is taken, the preview View stops previewing. So for continuous shots, you need to call startPreview() again to restore the preview;

The Camera is bound to a SurfaceHolder, and the surface's state changes are observed through SurfaceHolder.Callback, so we can implement the camera preview by extending SurfaceView. The LiveCameraView class in the NextQRCode project implements the whole processing flow needed for camera preview. It is a very simple class; its full source is below:

public class LiveCameraView extends SurfaceView implements SurfaceHolder.Callback {
    private final static String TAG = LiveCameraView.class.getSimpleName();
    private Camera mCamera;
    private SurfaceHolder mSurfaceHolder;

    public LiveCameraView(Context context, AttributeSet attrs, int defStyleAttr) {
        super(context, attrs, defStyleAttr);
        mSurfaceHolder = this.getHolder();
        mSurfaceHolder.addCallback(this);
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        Log.d(TAG, "Start preview display[SURFACE-CREATED]");
        startPreviewDisplay(holder);
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
        if (mSurfaceHolder.getSurface() == null) {
            return;
        }
        Cameras.followScreenOrientation(getContext(), mCamera);
        Log.d(TAG, "Restart preview display[SURFACE-CHANGED]");
        stopPreviewDisplay();
        startPreviewDisplay(mSurfaceHolder);
    }

    public void setCamera(Camera camera) {
        mCamera = camera;
        final Camera.Parameters params = mCamera.getParameters();
        params.setFocusMode(Camera.Parameters.FOCUS_MODE_AUTO);
        params.setSceneMode(Camera.Parameters.SCENE_MODE_BARCODE);
        // Write the changed parameters back so they take effect
        mCamera.setParameters(params);
    }

    private void startPreviewDisplay(SurfaceHolder holder) {
        checkCamera();
        try {
            mCamera.setPreviewDisplay(holder);
            mCamera.startPreview();
        } catch (IOException e) {
            Log.e(TAG, "Error while START preview for camera", e);
        }
    }

    private void stopPreviewDisplay() {
        checkCamera();
        try {
            mCamera.stopPreview();
        } catch (Exception e) {
            Log.e(TAG, "Error while STOP preview for camera", e);
        }
    }

    private void checkCamera() {
        if (mCamera == null) {
            throw new IllegalStateException(
                    "Camera must be set when start/stop preview, call <setCamera(Camera)> to set");
        }
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        Log.d(TAG, "Stop preview display[SURFACE-DESTROYED]");
        stopPreviewDisplay();
    }
}

As you can see from the code above, the core of LiveCameraView is the SurfaceHolder.Callback: the preview is started and stopped in the create/destroy callbacks. The class uses the view lifecycle callbacks to manage the preview lifecycle automatically (a sketch of the Activity-side wiring follows the list):

  • Preview automatically opens when the SurfaceView is created.
  • Close preview when SurfaceView is destroyed
  • Reset the preview when the SurfaceView size is changed
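
Here is a rough sketch of that Activity-side wiring (illustrative only, not taken from the NextQRCode sources): open the camera when the Activity becomes visible, hand it to the preview view, and release it when leaving. The ScanActivity name and the assumption that the openCamera() helper shown earlier lives in a Cameras utility class are mine:

    public class ScanActivity extends Activity {
        // mLiveCameraView is assumed to be looked up from the layout in onCreate() (omitted).
        private LiveCameraView mLiveCameraView;
        private Camera mCamera;

        @Override
        protected void onResume() {
            super.onResume();
            mCamera = Cameras.openCamera(0);   // 0 is usually the rear camera
            if (mCamera != null) {
                mLiveCameraView.setCamera(mCamera);
            }
        }

        @Override
        protected void onPause() {
            super.onPause();
            if (mCamera != null) {
                mCamera.stopPreview();
                mCamera.release();             // give the camera back to the system
                mCamera = null;
            }
        }
    }
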

The preview view also needs to pay attention to the size of the preview output: the camera only supports a limited set of output sizes. I will cover size selection in a later update.
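
Until then, the key point is that the preview size you set must be one of the sizes reported by Parameters.getSupportedPreviewSizes(). A rough sketch of picking the supported size closest in area to the desired dimensions (one simple heuristic among many):

    public static Camera.Size chooseBestPreviewSize(Camera.Parameters params,
                                                    int desiredWidth, int desiredHeight) {
        Camera.Size best = null;
        long bestDiff = Long.MAX_VALUE;
        for (Camera.Size size : params.getSupportedPreviewSizes()) {
            long diff = Math.abs((long) size.width * size.height
                    - (long) desiredWidth * desiredHeight);
            if (diff < bestDiff) {
                bestDiff = diff;
                best = size;
            }
        }
        return best;
        // then: params.setPreviewSize(best.width, best.height);
    }
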

With the preview running, a photo can be taken with Camera.takePicture(), and the returned photo data is retrieved through callback interfaces. takePicture() can deliver three kinds of photo data:

  • The first, a ShutterCallback, is invoked at the moment of capture and is usually used to play a shutter sound effect;
  • The second, a PictureCallback, returns the uncompressed RAW photo data;
  • The third, another PictureCallback, returns the compressed JPEG photo data;

We use the third parameter: the image quality of JPEG photos is good enough for QR code recognition. In the NextQRCode project, ZXing recognizes QR codes from a Bitmap, and the returned byte array can easily be converted into a Bitmap with BitmapFactory.


public abstract class BitmapCallback implements Camera.PictureCallback {
    @Override
    public void onPictureTaken(byte[] data, Camera camera) {
        onPictureTaken(BitmapFactory.decodeByteArray(data, 0, data.length));
    }
    public abstract void onPictureTaken(Bitmap bitmap);
}

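A sketch of how this callback might be used: pass it as the jpeg (third) parameter of takePicture(), and restart the preview afterwards, since taking a picture stops it. The handleBitmap() call is a hypothetical placeholder for whatever consumes the image (ZXing decoding, display, and so on):

    // Take a JPEG photo with the BitmapCallback above and restart the preview.
    mCamera.takePicture(null, null, new BitmapCallback() {
        @Override
        public void onPictureTaken(Bitmap bitmap) {
            handleBitmap(bitmap);        // hypothetical handler for the decoded image
            // Preview stops after takePicture(); restart it for the next shot.
            mCamera.startPreview();
        }
    });
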

6. Release the camera

Opening a camera device means your app has exclusive access to it and no other app can use it. So when you no longer need the camera, remember to call release() to free the device; opening it again later does not cost much. It is usual to release the camera device right after calling stopPreview().

Additional utility code

  • 1. Determine whether the mobile device has a camera
public static boolean hasCameraDevice(Context ctx) {
    return ctx.getPackageManager()
            .hasSystemFeature(PackageManager.FEATURE_CAMERA);
}

  • 2. Determine whether autofocus is supported
public static boolean isAutoFocusSupported(Camera.Parameters params) {
   List<String> modes = params.getSupportedFocusModes();
   return modes.contains(Camera.Parameters.FOCUS_MODE_AUTO);
}
