Description of a video stream transmission protocol for a law enforcement camera

Note:

  • The law enforcement device mentioned in this article can be understood as an Android smartphone
  • The law enforcement software mentioned in this article can be understood as third-party software running on that Android phone

What you will learn from this article

  • AIDL technology
  • Cross-process memory sharing with MemoryFile
  • Camera video capture and playback
  • Basics of floating window technology

Introduction

Because of project requirements, the law enforcement software must be able to use the device's Camera resources even while the device is recording video locally. Since Android does not allow multiple applications to use the Camera at the same time, a memory-sharing sub-stream transmission protocol was developed: when the law enforcement software needs the video stream, it asks the law enforcement device to write YUV frames into a MemoryFile, and the law enforcement software then reads the YUV frames out of that shared memory at regular intervals.
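For orientation, the constants referenced throughout the following steps (preview size, shared-memory name and size, frame buffer size) can be sketched roughly as below. The constant names come from the snippets later in the article; the concrete values and the NV21 frame-size assumption are mine, not taken from the original code.

    // A hedged sketch of the assumed shared-memory constants.
    // Assumption: preview frames are NV21 (the default Camera preview format),
    // so one frame occupies width * height * 3 / 2 bytes.
    public final class Constants {
        public static final String MEMORY_FILE_NAME = "camera_substream"; // name is an assumption
        public static int PREVIEWWIDTH = 1280;   // overwritten from the client's request (step 2)
        public static int PREVIEWHEIGHT = 720;   // overwritten from the client's request (step 2)
        // One NV21 frame
        public static final int BUFFER_SIZE = 1280 * 720 * 3 / 2;
        // Frame plus the flag byte described in step 7 (exact sizing is an assumption)
        public static final int MEMORY_SIZE = BUFFER_SIZE + 1;
    }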

Full source code

Flow chart

Screenshots

  • With the local camera disabled

  • With the local camera enabled

Process overview and sample code

  1. The server on the law enforcement device receives the client's instruction to start writing the video stream into shared memory

        // Handle the sub-stream request sent by the client
        private void onHandleAction(Context context, Intent intent) {
            switch (intent.getAction()) {
                // The client requests a sub-stream
                case Constants.ACTION_CAMERE_CORE_SHOW:
                    // If the video stream is already being sent, there is no need to start it again
                    if (!MemoryFileServiceManager.getInsta(context).isSendVideoFrame())
                        MemoryFileServiceManager.getInsta(context).setSendVideoFrame(true, intent);
                    break;
            }
        }
  2. The server reads the parameters for enabling the video stream (a sketch of the client-side request follows the snippet below)

        private void sendVideoFrame(Intent intent) {
            if (intent != null && intent.getExtras() != null) {
                Bundle extras = intent.getExtras();
                // Get the preview width
                Constants.PREVIEWWIDTH = extras.getInt(Constants.Config.PREVIEW_WIDTH, 1280);
                // Get the preview height
                Constants.PREVIEWHEIGHT = extras.getInt(Constants.Config.PREVIEW_HEIGHT, 720);
                // Package name of the peer service process that needs to be bound
                Constants.BIND_OTHER_SERVICE_PCK = extras.getString(Constants.Config.BIND_OTHER_SERVICE_PCK, "");
                // Full class name of the peer service that needs to be bound
                Constants.BIND_OTHER_SERVICE_CLASS = extras.getString(Constants.Config.BIND_OTHER_SERVICE_CLASS, "");
                // Which camera to open: 0 = back, 1 = front
                Constants.CAMERA_ID = extras.getInt(Constants.Config.CAMERA_ID, 0);
            }
        }
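     For completeness, the client side might issue the request roughly as follows. The extras keys mirror the ones read above; the delivery mechanism (a plain broadcast here) and the package/class values are assumptions for illustration, since the article does not show the client side.

        // Hedged sketch of the client-side request; names not shown in the article are hypothetical
        Intent request = new Intent(Constants.ACTION_CAMERE_CORE_SHOW);
        Bundle extras = new Bundle();
        extras.putInt(Constants.Config.PREVIEW_WIDTH, 1280);
        extras.putInt(Constants.Config.PREVIEW_HEIGHT, 720);
        // Package and class of the client service that will receive the ParcelFileDescriptor (steps 5 and 6)
        extras.putString(Constants.Config.BIND_OTHER_SERVICE_PCK, "com.t01.sharevideostream");
        extras.putString(Constants.Config.BIND_OTHER_SERVICE_CLASS, "com.t01.sharevideostream.service.CameraCoreService"); // hypothetical class
        extras.putInt(Constants.Config.CAMERA_ID, 0); // 0: back camera, 1: front camera
        request.putExtras(extras);
        context.sendBroadcast(request);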
  3. The server checks whether the camera is already open; if it is, there is no need to open it again (a sketch of openCamera() follows the snippet below)

            // Open the camera only if it is not already open
            if (mCamera == null)
                openCamera();
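     openCamera() itself is not shown in the article. A minimal sketch using the legacy android.hardware.Camera API might look like this, feeding preview frames into the mYUVQueue that step 7 consumes; the dummy SurfaceTexture and everything except mCamera, mYUVQueue and the Constants are assumptions (requires android.hardware.Camera, android.graphics.SurfaceTexture and android.graphics.ImageFormat).

        // Hedged sketch of openCamera()
        private void openCamera() {
            try {
                mCamera = Camera.open(Constants.CAMERA_ID);
                Camera.Parameters params = mCamera.getParameters();
                params.setPreviewSize(Constants.PREVIEWWIDTH, Constants.PREVIEWHEIGHT);
                params.setPreviewFormat(ImageFormat.NV21);
                mCamera.setParameters(params);
                // A dummy SurfaceTexture so the preview can run without a visible surface
                mCamera.setPreviewTexture(new SurfaceTexture(0));
                mCamera.setPreviewCallback(new Camera.PreviewCallback() {
                    @Override
                    public void onPreviewFrame(byte[] data, Camera camera) {
                        // Queue the YUV frame for the writer in step 7
                        mYUVQueue.offer(data);
                    }
                });
                mCamera.startPreview();
            } catch (IOException e) {
                Log.e(TAG, e.getMessage());
            }
        }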
  4. The server initializes a block of shared memory for writing the YUV video stream (sketches of initMemoryFile() and the file-descriptor helper follow the snippet below).

     mMemoryFile = initMemoryFile(Constants.MEMORY_FILE_NAME, Constants.MEMORY_SIZE);
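     Neither initMemoryFile() nor the MemoryFileHelper.getParcelFileDescriptor() used later in step 6 is shown in the article. Both can be sketched as below; because MemoryFile does not expose its FileDescriptor publicly, the helper typically goes through reflection and ParcelFileDescriptor.dup(). Treat this as an assumption about the original implementation.

        // Hedged sketch: create the shared memory region
        private MemoryFile initMemoryFile(String name, int size) {
            try {
                return new MemoryFile(name, size);
            } catch (IOException e) {
                Log.e(TAG, e.getMessage());
                return null;
            }
        }

        // Hedged sketch of MemoryFileHelper.getParcelFileDescriptor();
        // MemoryFile#getFileDescriptor() is a hidden API, hence the reflection
        public static ParcelFileDescriptor getParcelFileDescriptor(MemoryFile memoryFile) {
            try {
                Method method = MemoryFile.class.getDeclaredMethod("getFileDescriptor");
                method.setAccessible(true);
                FileDescriptor fd = (FileDescriptor) method.invoke(memoryFile);
                return ParcelFileDescriptor.dup(fd);
            } catch (Exception e) {
                return null;
            }
        }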
  5. Bind the peer service so that the file descriptor can be handed to it (the ServiceConnection skeleton follows the snippet below)

        /** Bind the peer service in order to provide the file descriptor */
        private void bindOtherService() {
            try {
                if (TextUtils.isEmpty(Constants.BIND_OTHER_SERVICE_PCK) || TextUtils.isEmpty(Constants.BIND_OTHER_SERVICE_CLASS))
                    throw new NullPointerException("PCK or CLSS is null ?");
                Intent intent = new Intent();
                ComponentName cmp = new ComponentName(Constants.BIND_OTHER_SERVICE_PCK, Constants.BIND_OTHER_SERVICE_CLASS);
                intent.setComponent(cmp);
                context.bindService(intent, mCameraServiceConnection, Context.BIND_AUTO_CREATE);
            } catch (Exception e) {
                Log.e(TAG, e.getMessage());
            }
        }
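     The mCameraServiceConnection passed to bindService() is not shown; its skeleton would be roughly the following, with onServiceConnected() holding the code from step 6.

        // Hedged sketch of the ServiceConnection used above
        private final ServiceConnection mCameraServiceConnection = new ServiceConnection() {
            @Override
            public void onServiceConnected(ComponentName name, IBinder binder) {
                // See step 6: convert the binder to the AIDL interface and hand over the ParcelFileDescriptor
            }

            @Override
            public void onServiceDisconnected(ComponentName name) {
                mCameraService = null;
            }
        };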
  6. After the peer service is bound successfully, the file descriptor (ParcelFileDescriptor) is passed to it through the AIDL interface (the assumed interface definition follows the snippet below)

                mCameraService = ICameraCoreService.Stub.asInterface(binder);
                if (mMemoryFile != null) {
                    try {
                        // Obtain the file descriptor via reflection
                        mParcelFileDescriptor = MemoryFileHelper.getParcelFileDescriptor(mMemoryFile);
                        if (mParcelFileDescriptor != null) {
                            mCameraService.addExportMemoryFile(mParcelFileDescriptor, Constants.PREVIEWWIDTH, Constants.PREVIEWHEIGHT, Constants.MEMORY_SIZE);
                        }
                    } catch (Exception e) {
                        Log.e(TAG, e.getMessage());
                    }
                }
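     ICameraCoreService is the AIDL interface implemented by the peer (client) service. The article does not include the .aidl file; inferring from the call above, it would look roughly like this. The package and parameter names are guesses.

        // ICameraCoreService.aidl -- hedged reconstruction from the call in step 6
        package com.t01.sharevideostream;

        import android.os.ParcelFileDescriptor;

        interface ICameraCoreService {
            // Hand the shared-memory file descriptor and the frame geometry to the peer process
            void addExportMemoryFile(in ParcelFileDescriptor fd, int width, int height, int memorySize);
        }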
  7. Send the data. byte[0] of the shared memory is a flag: 0 means the server may write a YUV frame into the memory; 1 means a frame is available for the client to read (a sketch of the client-side read loop follows the snippet below).

        /**
         * Read the flag bit and, if allowed, write a video frame into shared memory
         * @param memoryFile the shared memory to write into
         */
        public void writeBytes(MemoryFile memoryFile) {
            try {
                if (mYUVQueue.size() > 0) {
                    BufferBean mBufferBean = new BufferBean(Constants.BUFFER_SIZE);
                    // Read the flag byte
                    memoryFile.readBytes(mBufferBean.isCanRead, 0, 0, 1);
                    // When the first byte is 0, the client has read the previous frame and a new one can be written
                    if (mBufferBean.isCanRead[0] == 0) {
                        // Take a frame from the queue
                        byte[] video = mYUVQueue.poll();
                        if (video != null)
                            // Write the frame into shared memory
                            memoryFile.writeBytes(video, 0, 0, video.length);
                        // Set the flag and wait for the client to read the frame
                        mBufferBean.isCanRead[0] = 1;
                        memoryFile.writeBytes(mBufferBean.isCanRead, 0, 0, 1);
                    } else {
                        Log.d(TAG, "readShareBufferMsg isCanRead:" + mBufferBean.isCanRead[0] + "; length:" + mBufferBean.mBuffer.length);
                    }
                }
            } catch (IOException e) {
                e.printStackTrace();
                sendBroadcast(Constants.ACTION_FEEDBACK, e.getMessage());
            }
        }
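     On the client side (not shown in the article), the complementary reader would poll the flag byte at a fixed interval, copy a frame out when the flag is 1, and then reset the flag to 0 so the server can write the next frame. A rough sketch, mirroring the offsets used by writeBytes() above; clientMemoryFile (a MemoryFile rebuilt from the received ParcelFileDescriptor, e.g. via reflection) and the onYuvFrame() callback are hypothetical.

        // Hedged sketch of the client-side reader, called periodically (e.g. from a Handler)
        private void readVideoFrame(MemoryFile clientMemoryFile) {
            byte[] flag = new byte[1];
            byte[] frame = new byte[Constants.BUFFER_SIZE];
            try {
                // Read the flag byte at offset 0
                clientMemoryFile.readBytes(flag, 0, 0, 1);
                if (flag[0] == 1) {
                    // A frame is available: copy it out (same offsets as the server-side writeBytes)
                    clientMemoryFile.readBytes(frame, 0, 0, frame.length);
                    onYuvFrame(frame); // hypothetical callback: hand the YUV frame to the decoder/renderer
                    // Reset the flag so the server may write the next frame
                    flag[0] = 0;
                    clientMemoryFile.writeBytes(flag, 0, 0, 1);
                }
            } catch (IOException e) {
                Log.e(TAG, e.getMessage());
            }
        }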
  8. Success, failure, and other error messages are reported back to the client (a sketch of the receiving side follows the snippet below)

        // Report back to the client
        public void sendBroadcast(String action, String content) {
            Intent intent = new Intent();
            intent.setAction(action);
            ComponentName componentName = new ComponentName("com.t01.sharevideostream", "com.t01.sharevideostream.revices.FeedBackReceiver");
            intent.setComponent(componentName);
            Bundle extras = new Bundle();
            extras.putString(Constants.ACTION_FEEDBACK_CONTENT, content);
            intent.putExtras(extras);
            context.sendBroadcast(intent);
        }
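     The FeedBackReceiver targeted above belongs to the client application and must be declared in its manifest; it is not shown in the article, but a minimal sketch could be:

        // Hedged sketch of the client-side receiver (package/class taken from the ComponentName above)
        public class FeedBackReceiver extends BroadcastReceiver {
            @Override
            public void onReceive(Context context, Intent intent) {
                String content = intent.getStringExtra(Constants.ACTION_FEEDBACK_CONTENT);
                Log.d("FeedBackReceiver", "action=" + intent.getAction() + ", content=" + content);
                // React to the feedback here, e.g. surface an error or retry the sub-stream request
            }
        }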