We know that the Camera returns YUV data and AudioRecord returns PCM data. This raw data needs to be encoded (compressed), and for audio/video encoding on Android there is no getting around one thing: MediaCodec.

MediaCodec

PS: MediaCodec can be used both to encode and to decode audio/video.

A brief introduction to MediaCodec

The MediaCodec class provides access to low-level media codecs, i.e. the encoder/decoder components. It is part of Android's low-level multimedia support infrastructure (usually used together with MediaExtractor, MediaSync, MediaMuxer, MediaCrypto, MediaDrm, Image, Surface, and AudioTrack). A full description can be found in the official MediaCodec introduction.

Broadly speaking, a codec processes input data to generate output data. It processes data asynchronously and uses a set of input and output buffers. In the simple case, you request (or receive) an empty input buffer, fill it with data, and send it to the codec for processing. The codec consumes the data and transforms it into one of its empty output buffers. Finally, you request (or receive) a filled output buffer, use its contents, and release it back to the codec.
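To make this concrete, here is a minimal sketch of one pass of that buffer-queue dance using the synchronous API (codec is assumed to be an already configured and started MediaCodec; rawData, ptsUs, and timeoutUs are placeholders; the full encoder later in this article follows the same pattern):

import android.media.MediaCodec;
import java.nio.ByteBuffer;

// One pass of the buffer-queue model; error handling and the
// INFO_* return codes are omitted for brevity.
static void onePass(MediaCodec codec, byte[] rawData, long ptsUs, long timeoutUs) {
    int inIndex = codec.dequeueInputBuffer(timeoutUs);            // request an empty input buffer
    if (inIndex >= 0) {
        ByteBuffer in = codec.getInputBuffer(inIndex);
        in.clear();
        in.put(rawData);                                          // fill it with raw data
        codec.queueInputBuffer(inIndex, 0, rawData.length, ptsUs, 0); // send it for processing
    }
    MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
    int outIndex = codec.dequeueOutputBuffer(info, timeoutUs);    // request a filled output buffer
    if (outIndex >= 0) {
        ByteBuffer out = codec.getOutputBuffer(outIndex);         // use its contents
        // ... consume 'out' (e.g. write to a muxer or the network) ...
        codec.releaseOutputBuffer(outIndex, false);               // hand the buffer back to the codec
    }
}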

PS: Readers familiar with the producer-consumer model should find MediaCodec's operating model easy to understand.

Below is a simple class diagram for MediaCodec

MediaCodec state machine

During its lifetime, a codec is conceptually in one of three states: Stopped, Executing, or Released. The Stopped state is actually an aggregate of three states: Uninitialized, Configured, and Error. The Executing state likewise consists of three sub-states: Flushed, Running, and End-of-Stream.

When a codec is created via one of the factory methods, it is in the Uninitialized state. First, you need to call configure(…) to bring it to the Configured state, then call start() to move it to the Executing state. In this state you can process data through the buffer queue operations described above.

Flushed, Running, and End-of-Stream are the three sub-states of the Executing state. Immediately after start(), the codec is in the Flushed sub-state, where it holds all of the buffers. As soon as the first input buffer is dequeued, the codec moves to the Running sub-state, where it spends most of its life. When you queue an input buffer with the end-of-stream flag, the codec transitions to the End-of-Stream sub-state. In this state the codec no longer accepts further input buffers, but it still generates output buffers until the end of stream is reached on the output side. While in the Executing state, you can return to the Flushed sub-state at any time by calling flush().

Calling stop() returns the codec to the Uninitialized state, after which it can be configured again. When you are done with a codec, you must release it by calling release().

In rare cases the codec may encounter an error and move to the Error state. This is communicated via an invalid return value from a queuing operation, or sometimes via an exception. Calling reset() makes the codec usable again: you can call it from any state, and it moves the codec back to the Uninitialized state. Otherwise, call release() to move it to the terminal Released state.
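Tying the transitions together, a minimal lifecycle sketch might look like this (encoder case; format is assumed to be a fully populated MediaFormat, as shown in the next section):

import android.media.MediaCodec;
import android.media.MediaFormat;
import java.io.IOException;

// Minimal sketch of the state transitions described above.
static void lifecycleSketch(MediaFormat format) throws IOException {
    MediaCodec codec = MediaCodec.createEncoderByType("video/avc");           // Uninitialized
    codec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);    // Configured
    codec.start();                 // Executing (Flushed until the first input buffer is dequeued)
    // ... queue input / dequeue output buffers here (Running) ...
    codec.flush();                 // back to the Flushed sub-state, still Executing
    codec.stop();                  // back to Uninitialized; may be configured again
    codec.release();               // terminal Released state
    // on an error, reset() would return the codec to Uninitialized instead
}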

PS: MediaCodec supports two data-processing modes, synchronous and asynchronous. We analyze them one by one below.

MediaCodec Synchronous mode

On to the code:

public H264MediaCodecEncoder(int width, int height) {
    // Set the MediaFormat parameters
    MediaFormat mediaFormat = MediaFormat.createVideoFormat(MIMETYPE_VIDEO_AVC, width, height);
    mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT,
            MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Flexible);
    mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, width * height * 5);
    mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 30); // FPS
    mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
    try {
        // Create an encoder instance by MIME type
        mMediaCodec = MediaCodec.createEncoderByType(MIMETYPE_VIDEO_AVC);
        mMediaCodec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        // Call start()
        mMediaCodec.start();
    } catch (Exception e) {
        e.printStackTrace();
    }
}

Call putData to add raw YUV data to the queue

public void putData(byte[] buffer) {
    if (yuv420Queue.size() >= 10) {
        yuv420Queue.poll();
    }
    yuv420Queue.add(buffer);
}

public void startEncoder() {
    isRunning = true;
    ExecutorService executorService = Executors.newSingleThreadExecutor();
    executorService.execute(new Runnable() {
        @Override
        public void run() {
            byte[] input = null;
            while (isRunning) {
                if (yuv420Queue.size() > 0) {
                    // Fetch data from the queue
                    input = yuv420Queue.poll();
                }
                if (input != null) {
                    try {
                        // [1] dequeueInputBuffer
                        int inputBufferIndex = mMediaCodec.dequeueInputBuffer(TIMEOUT_S);
                        if (inputBufferIndex >= 0) {
                            // [2] getInputBuffer
                            ByteBuffer inputBuffer = null;
                            if (android.os.Build.VERSION.SDK_INT >= android.os.Build.VERSION_CODES.LOLLIPOP) {
                                inputBuffer = mMediaCodec.getInputBuffer(inputBufferIndex);
                            } else {
                                inputBuffer = mMediaCodec.getInputBuffers()[inputBufferIndex];
                            }
                            inputBuffer.clear();
                            inputBuffer.put(input);
                            // [3] queueInputBuffer
                            mMediaCodec.queueInputBuffer(inputBufferIndex, 0, input.length, getPTSUs(), 0);
                        }
                        MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
                        // [4] dequeueOutputBuffer
                        int outputBufferIndex = mMediaCodec.dequeueOutputBuffer(bufferInfo, TIMEOUT_S);
                        if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                            MediaFormat newFormat = mMediaCodec.getOutputFormat();
                            if (null != mEncoderCallback) {
                                mEncoderCallback.outputMediaFormatChanged(H264_ENCODER, newFormat);
                            }
                            if (mMuxer != null) {
                                if (mMuxerStarted) {
                                    throw new RuntimeException("format changed twice");
                                }
                                // now that we have the Magic Goodies, start the muxer
                                mTrackIndex = mMuxer.addTrack(newFormat);
                                mMuxer.start();
                                mMuxerStarted = true;
                            }
                        }
                        while (outputBufferIndex >= 0) {
                            // [5] getOutputBuffer
                            ByteBuffer outputBuffer = null;
                            if (android.os.Build.VERSION.SDK_INT >= android.os.Build.VERSION_CODES.LOLLIPOP) {
                                outputBuffer = mMediaCodec.getOutputBuffer(outputBufferIndex);
                            } else {
                                outputBuffer = mMediaCodec.getOutputBuffers()[outputBufferIndex];
                            }
                            if (bufferInfo.flags == MediaCodec.BUFFER_FLAG_CODEC_CONFIG) {
                                bufferInfo.size = 0;
                            }
                            if (bufferInfo.size > 0) {
                                // adjust the ByteBuffer values to match BufferInfo (not needed?)
                                outputBuffer.position(bufferInfo.offset);
                                outputBuffer.limit(bufferInfo.offset + bufferInfo.size);
                                // write encoded data to muxer (need to adjust presentationTimeUs)
                                bufferInfo.presentationTimeUs = getPTSUs();
                                if (mEncoderCallback != null) {
                                    // callback
                                    mEncoderCallback.onEncodeOutput(H264_ENCODER, outputBuffer, bufferInfo);
                                }
                                prevOutputPTSUs = bufferInfo.presentationTimeUs;
                                if (mMuxer != null) {
                                    if (!mMuxerStarted) {
                                        throw new RuntimeException("muxer hasn't started");
                                    }
                                    mMuxer.writeSampleData(mTrackIndex, outputBuffer, bufferInfo);
                                }
                            }
                            mMediaCodec.releaseOutputBuffer(outputBufferIndex, false);
                            bufferInfo = new MediaCodec.BufferInfo();
                            outputBufferIndex = mMediaCodec.dequeueOutputBuffer(bufferInfo, TIMEOUT_S);
                        }
                    } catch (Throwable throwable) {
                        throwable.printStackTrace();
                    }
                } else {
                    try {
                        Thread.sleep(500);
                    } catch (InterruptedException e) {
                        e.printStackTrace();
                    }
                }
            }
        }
    });
}

We use an ArrayBlockingQueue<byte[]> (yuv420Queue = new ArrayBlockingQueue<>(10);) to receive the byte[] YUV data from the Camera callback, and start a new thread that keeps pulling data from the buffer queue yuv420Queue and hands it to MediaCodec for encoding. The encoded output format is determined by mMediaCodec = MediaCodec.createEncoderByType(MIMETYPE_VIDEO_AVC);, so the output here is H264, the most widely used video format.
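For context, the producer side might be wired up roughly like this (a hypothetical sketch using the legacy android.hardware.Camera preview callback; the actual Demo wiring and any NV21-to-YUV420 conversion are omitted):

import android.hardware.Camera;

// Hypothetical glue code: each preview frame is pushed into the encoder's queue.
void startCameraEncoding() {
    final H264MediaCodecEncoder encoder = new H264MediaCodecEncoder(1280, 720);
    Camera camera = Camera.open();
    camera.setPreviewCallback(new Camera.PreviewCallback() {
        @Override
        public void onPreviewFrame(byte[] data, Camera camera) {
            // non-blocking: putData() drops the oldest frame when the queue is full
            encoder.putData(data);
        }
    });
    camera.startPreview();
    encoder.startEncoder();
}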

See H264MediaCodecEncoder for the complete code

MediaCodec Asynchronous mode

@RequiresApi(api = Build.VERSION_CODES.LOLLIPOP)
public H264MediaCodecAsyncEncoder(int width, int height) {
    MediaFormat mediaFormat = MediaFormat.createVideoFormat(MIMETYPE_VIDEO_AVC, width, height);
    mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT,
            MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Flexible);
    mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, width * height * 5);
    mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 30); // FPS
    mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
    try {
        mMediaCodec = MediaCodec.createEncoderByType(MIMETYPE_VIDEO_AVC);
        mMediaCodec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        mMediaCodec.setCallback(new MediaCodec.Callback() {
            /**
             * Called when an input buffer becomes available.
             *
             * @param codec The MediaCodec object.
             * @param index The index of the available input buffer.
             */
            @Override
            public void onInputBufferAvailable(@NonNull MediaCodec codec, int index) {
                Log.i("MFB", "onInputBufferAvailable:" + index);
                byte[] input = null;
                if (isRunning) {
                    if (yuv420Queue.size() > 0) {
                        input = yuv420Queue.poll();
                    }
                    if (input != null) {
                        ByteBuffer inputBuffer = codec.getInputBuffer(index);
                        inputBuffer.clear();
                        inputBuffer.put(input);
                        codec.queueInputBuffer(index, 0, input.length, getPTSUs(), 0);
                    }
                }
            }

            /**
             * Called when an output buffer becomes available.
             *
             * @param codec The MediaCodec object.
             * @param index The index of the available output buffer.
             * @param info  Info regarding the available output buffer {@link MediaCodec.BufferInfo}.
             */
            @Override
            public void onOutputBufferAvailable(@NonNull MediaCodec codec, int index,
                                                @NonNull MediaCodec.BufferInfo info) {
                Log.i("MFB", "onOutputBufferAvailable:" + index);
                ByteBuffer outputBuffer = codec.getOutputBuffer(index);
                if (info.flags == MediaCodec.BUFFER_FLAG_CODEC_CONFIG) {
                    info.size = 0;
                }
                if (info.size > 0) {
                    // adjust the ByteBuffer values to match BufferInfo (not needed?)
                    outputBuffer.position(info.offset);
                    outputBuffer.limit(info.offset + info.size);
                    // write encoded data to muxer (need to adjust presentationTimeUs)
                    info.presentationTimeUs = getPTSUs();
                    if (mEncoderCallback != null) {
                        // callback
                        mEncoderCallback.onEncodeOutput(H264_ENCODER, outputBuffer, info);
                    }
                    prevOutputPTSUs = info.presentationTimeUs;
                    if (mMuxer != null) {
                        if (!mMuxerStarted) {
                            throw new RuntimeException("muxer hasn't started");
                        }
                        mMuxer.writeSampleData(mTrackIndex, outputBuffer, info);
                    }
                }
                codec.releaseOutputBuffer(index, false);
            }

            @Override
            public void onError(@NonNull MediaCodec codec, @NonNull MediaCodec.CodecException e) {
            }

            /**
             * Called when the output format has changed.
             *
             * @param codec  The MediaCodec object.
             * @param format The new output format.
             */
            @Override
            public void onOutputFormatChanged(@NonNull MediaCodec codec, @NonNull MediaFormat format) {
                if (null != mEncoderCallback) {
                    mEncoderCallback.outputMediaFormatChanged(H264_ENCODER, format);
                }
                if (mMuxer != null) {
                    if (mMuxerStarted) {
                        throw new RuntimeException("format changed twice");
                    }
                    // now that we have the Magic Goodies, start the muxer
                    mTrackIndex = mMuxer.addTrack(format);
                    mMuxer.start();
                    mMuxerStarted = true;
                }
            }
        });
        mMediaCodec.start();
    } catch (Exception e) {
        e.printStackTrace();
    }
}

See H264MediaCodecAsyncEncoder for the complete code
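Whichever mode you use, remember the state machine from earlier: when encoding finishes, the codec must be stopped and released. A hypothetical stopEncoder() for either encoder class, sketched against the fields used above (the Demo may handle teardown differently), could be:

public void stopEncoder() {
    isRunning = false;          // let the worker loop / callbacks stop feeding input
    try {
        mMediaCodec.stop();     // back to Uninitialized
        mMediaCodec.release();  // terminal Released state
    } catch (IllegalStateException e) {
        e.printStackTrace();
    }
    if (mMuxer != null && mMuxerStarted) {
        mMuxer.stop();
        mMuxer.release();
        mMuxerStarted = false;
    }
}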

MediaCodec summary

MediaCodec does the audio/video encoding and decoding work (a process some articles also call hardware encoding/decoding). You create an audio or video encoder by passing the corresponding MIME type to MediaCodec.createEncoderByType(MIMETYPE_VIDEO_AVC); similarly, MediaCodec.createDecoderByType(MIMETYPE_VIDEO_AVC) creates an audio or video decoder. MediaFormat is used to specify the various parameters required for encoding and decoding.
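The decoding direction is symmetric. A minimal sketch of creating an H.264 decoder (assuming the SPS/PPS (csd-0/csd-1) come from elsewhere, e.g. a MediaExtractor, and output is rendered to a Surface):

import android.media.MediaCodec;
import android.media.MediaFormat;
import android.view.Surface;
import java.io.IOException;

// Sketch: create and configure an H.264 *decoder*; the buffer-queue loop is
// the same pattern as the encoder, with raw H.264 in and YUV/Surface frames out.
static MediaCodec createH264Decoder(int width, int height, Surface surface) throws IOException {
    MediaFormat format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, width, height);
    // csd-0 (SPS) and csd-1 (PPS) would normally be set here, e.g. from MediaExtractor
    MediaCodec decoder = MediaCodec.createDecoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
    decoder.configure(format, surface, null, 0); // flags = 0: decoding, not CONFIGURE_FLAG_ENCODE
    decoder.start();
    return decoder;
}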

Summary

This article has given a detailed analysis of MediaCodec, which you can try out in practice with the blog's Demo.

Demo address: detailed Demo