Preface
This article was migrated from my Jianshu post. Original address: www.jianshu.com/p/1ff123409…
Because the project needs to do some processing on decoded YUV420P data, the frames were previously obtained with FFmpeg software decoding. But as image resolution increased, FFmpeg's efficiency began to hurt the software experience, so Android's MediaCodec hardware decoding is used instead to improve efficiency.
Overview
Refer to MediaCodec’s official documentation:
In broad terms, a codec processes input data to generate output data. It processes data asynchronously and uses a set of input and output buffers. At a simplistic level, you request (or receive) an empty input buffer, fill it up with data and send it to the codec for processing. The codec uses up the data and transforms it into one of its empty output buffers. Finally, you request (or receive) a filled output buffer, consume its contents and release it back to the codec.
This means that MediaCodec processes data asynchronously using a set of input and output buffers. The developer requests an empty input buffer, fills it with data, and submits it back to the codec. The codec consumes the input data and writes the result into one of its empty output buffers. Once the developer has finished using the contents of a filled output buffer, it is released back to the codec.
Decoding code
Initialization
private static final long DEFAULT_TIMEOUT_US = 1000 * 10;
private static final String MIME_TYPE = "video/avc";
private static final int VIDEO_WIDTH = 1520;
private static final int VIDEO_HEIGHT = 1520;

private MediaCodec mCodec;
private MediaCodec.BufferInfo bufferInfo;

public void initCodec() {
    try {
        mCodec = MediaCodec.createDecoderByType(MIME_TYPE);
    } catch (IOException e) {
        e.printStackTrace();
    }
    bufferInfo = new MediaCodec.BufferInfo();
    MediaFormat mediaFormat = MediaFormat.createVideoFormat(MIME_TYPE, VIDEO_WIDTH, VIDEO_HEIGHT);
    mCodec.configure(mediaFormat, null, null, 0);
    mCodec.start();
}

public void release() {
    if (null != mCodec) {
        mCodec.stop();
        mCodec.release();
        mCodec = null;
    }
}
Decoding
public void decode(byte[] h264Data) {
    int inputBufferIndex = mCodec.dequeueInputBuffer(DEFAULT_TIMEOUT_US);
    if (inputBufferIndex >= 0) {
        ByteBuffer inputBuffer;
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP) {
            inputBuffer = mCodec.getInputBuffer(inputBufferIndex);
        } else {
            inputBuffer = mCodec.getInputBuffers()[inputBufferIndex];
        }
        if (inputBuffer != null) {
            inputBuffer.clear();
            inputBuffer.put(h264Data, 0, h264Data.length);
            mCodec.queueInputBuffer(inputBufferIndex, 0, h264Data.length, 0, 0);
        }
    }
    int outputBufferIndex = mCodec.dequeueOutputBuffer(bufferInfo, DEFAULT_TIMEOUT_US);
    ByteBuffer outputBuffer;
    while (outputBufferIndex >= 0) {
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP) {
            outputBuffer = mCodec.getOutputBuffer(outputBufferIndex);
        } else {
            outputBuffer = mCodec.getOutputBuffers()[outputBufferIndex];
        }
        if (outputBuffer != null) {
            outputBuffer.position(0);
            outputBuffer.limit(bufferInfo.offset + bufferInfo.size);
            byte[] yuvData = new byte[outputBuffer.remaining()];
            outputBuffer.get(yuvData);
            if (null != onDecodeCallback) {
                onDecodeCallback.onFrame(yuvData);
            }
            outputBuffer.clear();
            mCodec.releaseOutputBuffer(outputBufferIndex, false);
        }
        outputBufferIndex = mCodec.dequeueOutputBuffer(bufferInfo, DEFAULT_TIMEOUT_US);
    }
}
Decode callback interface
public interface OnDecoderCallback {
void onFrame(byte[] yuvData);
}
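For completeness, here is a minimal usage sketch. It assumes the snippets above live in a wrapper class (hypothetically called VideoDecoder here) that exposes a setter for the callback; the class and setter names are illustrative, not from the original code:

// Hypothetical wrapper around initCodec()/decode()/release(); names are illustrative.
VideoDecoder decoder = new VideoDecoder();
decoder.initCodec();
decoder.setOnDecoderCallback(new OnDecoderCallback() {
    @Override
    public void onFrame(byte[] yuvData) {
        // Hand the decoded YUV frame to the image-processing pipeline here.
    }
});
// For each encoded H.264 frame received from the stream:
// decoder.decode(h264Frame);
// When finished:
decoder.release();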
There are a few caveats:
- Use dequeueInputBuffer(long timeoutUs) to request an input buffer; timeoutUs is the wait time in microseconds, and passing -1 means wait indefinitely. The returned integer is the index of the requested input buffer.
- getInputBuffers() returns the array of input buffers, and the index obtained above picks out the currently requested one. Clear the buffer before filling it to avoid leftover data from a previous frame.
- Similarly, dequeueOutputBuffer(BufferInfo info, long timeoutUs) requests the index of an output buffer that holds decoded data; bufferInfo stores information about that output buffer (offset, size, and so on). Negative return values are status codes rather than buffer indices; see the sketch after this list.
- releaseOutputBuffer(int index, boolean render) returns the output buffer to the codec. If a valid Surface was specified when configuring the codec, setting render to true sends the buffer to that Surface for rendering first; here we only want the raw YUV data, so render is false.
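The decode() method above only loops while the returned index is non-negative. dequeueOutputBuffer can also return negative status codes instead of a buffer index; a minimal sketch of how they could be handled (not part of the original code):

// Sketch: handling the negative status codes dequeueOutputBuffer may return.
int outIndex = mCodec.dequeueOutputBuffer(bufferInfo, DEFAULT_TIMEOUT_US);
if (outIndex >= 0) {
    // A filled output buffer is available; consume and release it as in decode().
} else if (outIndex == MediaCodec.INFO_TRY_AGAIN_LATER) {
    // No output ready within timeoutUs; try again on the next call.
} else if (outIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
    // The output format (including the color format) changed; re-read it via mCodec.getOutputFormat().
} else if (outIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
    // Only relevant before API 21: the array from getOutputBuffers() must be fetched again.
}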
Problems encountered during use
During actual testing I found that MediaCodec on Android devices from different manufacturers decodes into different YUV formats. For example, MediaCodec on my test machine (a tablet of an unknown brand) decodes into standard YUV420P, while on another test machine (a Huawei Honor Note 8) the decoded output is in NV12 format.
Referring to the article "Android: MediaCodec video file hardware decoding, efficiently obtaining YUV format frames, quickly saving JPEG images", I learned that API 21 added the COLOR_FormatYUV420Flexible format, which all MediaCodec hardware decoders support. It is not one specific YUV420 layout; it covers COLOR_FormatYUV411Planar, COLOR_FormatYUV411PackedPlanar, COLOR_FormatYUV420Planar, COLOR_FormatYUV420PackedPlanar, COLOR_FormatYUV420SemiPlanar and COLOR_FormatYUV420PackedSemiPlanar, so you can be sure that the decoded frame format is one of these. As you can see from the MediaCodecInfo source code, these formats were deprecated when YUV420Flexible was introduced in API 21.
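If you want to see which of these formats the decoders on a particular device actually report, you can query their CodecCapabilities. A small sketch, assuming API 21+ (MIME_TYPE is the "video/avc" constant defined earlier):

// Sketch: log the color formats reported by the device's AVC decoders (API 21+).
private void dumpDecoderColorFormats() {
    MediaCodecList codecList = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
    for (MediaCodecInfo info : codecList.getCodecInfos()) {
        if (info.isEncoder()) {
            continue; // only decoders are of interest here
        }
        for (String type : info.getSupportedTypes()) {
            if (!MIME_TYPE.equals(type)) {
                continue;
            }
            MediaCodecInfo.CodecCapabilities caps = info.getCapabilitiesForType(type);
            for (int colorFormat : caps.colorFormats) {
                Log.d("ColorFormat", info.getName() + ": 0x" + Integer.toHexString(colorFormat));
            }
        }
    }
}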
Specify frame format
The frame format only needs to be specified before MediaCodec is configured; the initCodec method above is updated as follows:
MediaFormat mediaFormat = MediaFormat.createVideoFormat(MIME_TYPE, width, height);
mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Flexible);
mCodec.configure(mediaFormat, null, null, 0);
mCodec.start();
Perhaps because my test devices happen to be well behaved, I achieved the same result even without specifying the frame format.
Obtaining YUV420P frames
Now that the decoded frame format is narrowed down to the formats mentioned above, we are only one step away from standard YUV420P frames.
You can use mCodec.getOutputFormat().getInteger(MediaFormat.KEY_COLOR_FORMAT) to get the decoded frame's color format. The value will be one of COLOR_FormatYUV411Planar, COLOR_FormatYUV411PackedPlanar, COLOR_FormatYUV420Planar, COLOR_FormatYUV420PackedPlanar, COLOR_FormatYUV420SemiPlanar and COLOR_FormatYUV420PackedSemiPlanar, so you only need to convert the corresponding type to standard YUV420P data:
MediaFormat mediaFormat = mCodec.getOutputFormat();
switch (mediaFormat.getInteger(MediaFormat.KEY_COLOR_FORMAT)) {
    case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV411Planar:
        break;
    case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV411PackedPlanar:
        break;
    case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedPlanar:
        break;
    case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar:
        // NV12-style semi-planar data: convert to planar YUV420P
        yuvData = yuv420spToYuv420P(yuvData, mediaFormat.getInteger(MediaFormat.KEY_WIDTH), mediaFormat.getInteger(MediaFormat.KEY_HEIGHT));
        break;
    case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedSemiPlanar:
        break;
    case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar:
        // Already planar YUV420P, no conversion needed
    default:
        break;
}
The yuv420spToYuv420P (NV12 to YUV420P) conversion method is attached:
private static byte[] yuv420spToYuv420P(byte[] yuv420spData, int width, int height) {
    byte[] yuv420pData = new byte[width * height * 3 / 2];
    int ySize = width * height;
    // The Y plane is identical in both layouts
    System.arraycopy(yuv420spData, 0, yuv420pData, 0, ySize);
    // De-interleave the UV plane: NV12 stores UVUV..., YUV420P stores all U then all V
    for (int j = 0, i = 0; j < ySize / 2; j += 2, i++) {
        yuv420pData[ySize + i] = yuv420spData[ySize + j];              // U
        yuv420pData[ySize * 5 / 4 + i] = yuv420spData[ySize + j + 1];  // V
    }
    return yuv420pData;
}
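As a quick sanity check of the conversion, consider a hypothetical 4x2 frame (the byte values are arbitrary):

// Hypothetical 4x2 NV12 frame: 8 Y bytes followed by interleaved U/V bytes.
byte[] nv12 = {
        1, 2, 3, 4, 5, 6, 7, 8,   // Y plane
        20, 30, 21, 31            // U0, V0, U1, V1
};
byte[] i420 = yuv420spToYuv420P(nv12, 4, 2);
// Expected YUV420P layout: Y plane, then all U, then all V:
// {1, 2, 3, 4, 5, 6, 7, 8, 20, 21, 30, 31}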
One last word
This article shows the Java-level conversion approach, but in practice it is recommended to do the conversion in the native layer, or to make the downstream processing directly compatible with the different YUV formats, because this extra conversion step has a significant impact on efficiency.
Decoding with MediaCodec is indeed much faster. However, on low-end devices (such as my unknown-brand tablet) hardware decoding actually performed noticeably worse than software decoding. Many comments online say hardware decoding is full of pitfalls, which is fair, but even with the pitfalls, this decoding speed is hard to resist.