This article briefly introduces decoding and drawing in an Android RTMP/RTSP player.

Decoding

When it comes to decoding, there are software and hardware decoders. Some companies even consider hardware decoding universal enough and are gradually abandoning software decoding. If device compatibility is a concern, supporting both software and hardware decoding is a good choice. Accordingly, the daniu live SDK classifies decoding as follows:

1. Software decoding: after decoding, the raw data is available for subsequent operations such as data callbacks and snapshots.

2. Hardware decoding: after decoding, the raw data is available for subsequent operations such as data callbacks and snapshots.

3. Hardware decoding (surface mode): render directly to the configured surface; snapshots and data callbacks on the decoded data are not available.

You may wonder: why support mode 3 when mode 2 exists? What are the respective advantages of modes 2 and 3?

In mode 3, hardware decoding renders directly to a surface. Most chips support this well, so decoding compatibility is better; it also avoids data copies and lowers resource usage. The drawback is that the decoded raw data cannot be accessed, making it more of a black-box operation. Mode 2 balances the lower resource usage of hardware decoding (relative to software decoding) with access to the raw data (for example, post-processing the decoded YUV/RGB data). Compared with mode 3, its decoding compatibility is slightly worse, but data handling is more flexible.

Related interfaces:

	/**
	 * Set Video H.264 HW decoder
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * @param isHWDecoder: 0: software decoder; 1: hardware decoder.
	 *
	 * @return {0} if successful
	 */
	public native int SetSmartPlayerVideoHWDecoder(long handle, int isHWDecoder);

	/**
	 * Set Video H.265(HEVC) HW decoder
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * @param isHevcHWDecoder: 0: software decoder; 1: hardware decoder.
	 *
	 * @return {0} if successful
	 */
	public native int SetSmartPlayerVideoHevcHWDecoder(long handle, int isHevcHWDecoder);

	/**
	 * Set Surface View.
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * @param surface: surface view
	 *
	 * <pre> NOTE: if not set or set surface with null, it will playback audio only. </pre>
	 *
	 * @return {0} if successful
	 */
	public native int SmartPlayerSetSurface(long handle, Object surface);

Since not all devices support hardware decoding, the design idea of the daniu live SDK is to probe hardware decoding first and switch directly to software decoding when hardware decoding is not supported.
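This probe-then-fallback logic can be sketched as follows. This is a minimal illustration, not SDK code; `HardwareProbe` and `tryStartHardwareDecoder` are hypothetical stand-ins for the SDK's internal hardware-decoder test:

```java
public class DecoderSelector {
    public static final int MODE_SOFTWARE = 0;
    public static final int MODE_HARDWARE = 1;

    /** Hypothetical stand-in for the SDK's internal hardware-decoder probe. */
    public interface HardwareProbe {
        boolean tryStartHardwareDecoder();
    }

    /**
     * Probe hardware decoding first; fall back to software decoding
     * when the probe reports failure or throws.
     */
    public static int selectMode(HardwareProbe probe) {
        try {
            if (probe.tryStartHardwareDecoder()) {
                return MODE_HARDWARE;
            }
        } catch (RuntimeException e) {
            // Some devices throw instead of returning false; treat as unsupported.
        }
        return MODE_SOFTWARE;
    }
}
```

On a device whose hardware decoder fails to initialize (or throws), `selectMode` returns `MODE_SOFTWARE`, which matches the fallback behavior described above.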

Drawing

The RTMP/RTSP player in the daniu live SDK supports two drawing modes: a plain SurfaceView and a GLSurfaceView. The plain SurfaceView has better compatibility, while the GLSurfaceView renders more finely. In addition, plain SurfaceView mode also supports some anti-aliasing parameter settings. Both modes provide a fill-mode option for the video frame (whether or not to scale proportionally). The specific interface design is as follows:

	/**
	 * Set the fill mode of the video frame, e.g. fill the whole view or fill the view proportionally. If not set, the whole view is filled by default.
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * @param render_scale_mode: 0: fills the entire view; 1: fills the view proportionally. The default is 0.
	 *
	 * @return {0} if successful
	 */
	public native int SmartPlayerSetRenderScaleMode(long handle, int render_scale_mode);

	/**
	 * Set the render format in SurfaceView mode (when the second parameter of NTRenderer.CreateRenderer() is false).
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * @param format: 0: RGB565 format (the default if not set); 1: ARGB8888 format.
	 *
	 * @return {0} if successful
	 */
	public native int SmartPlayerSetSurfaceRenderFormat(long handle, int format);

	/**
	 * Set anti-aliasing in SurfaceView mode (when the second parameter of NTRenderer.CreateRenderer() is false). Note: enabling anti-aliasing may affect video performance, so use it with care.
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * @param isEnableAntiAlias: 0: anti-aliasing disabled (the default if not set); 1: anti-aliasing enabled.
	 *
	 * @return {0} if successful
	 */
	public native int SmartPlayerSetSurfaceAntiAlias(long handle, int isEnableAntiAlias);

For audio output, AudioTrack and OpenSL ES are both options. For generality, AudioTrack mode can be chosen; better still, provide an option so users can choose for themselves:

	/**
	 * Set AudioOutput Type
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * @param use_audiotrack:
	 *
	 * <pre> NOTE: if use_audiotrack with 0: it will use auto-select output devices; if with 1: will use audio-track mode. </pre>
	 *
	 * @return {0} if successful
	 */
	public native int SmartPlayerSetAudioOutputType(long handle, int use_audiotrack);

Video flip/rotation

	/**
	 * Set video vertical flip
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * @param is_flip: 0: no flip; 1: flip.
	 *
	 * @return {0} if successful
	 */
	public native int SmartPlayerSetFlipVertical(long handle, int is_flip);

	/**
	 * Set video horizontal flip
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * @param is_flip: 0: no flip; 1: flip.
	 *
	 * @return {0} if successful
	 */
	public native int SmartPlayerSetFlipHorizontal(long handle, int is_flip);

	/**
	 * Set clockwise rotation. Note that any angle other than 0 degrees consumes extra performance.
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * @param degress: supports 0, 90, 180 and 270 degree rotation.
	 *
	 * @return {0} if successful
	 */
	public native int SmartPlayerSetRotation(long handle, int degress);

Callback of raw data after decoding

In some scenarios, developers need to process the decoded YUV/RGB or PCM data. This requires designing an interface model for decoded-data callbacks:

	/**
	 * Set External Render
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * @param external_render: External Render
	 *
	 * @return {0} if successful
	 */
	public native int SmartPlayerSetExternalRender(long handle, Object external_render);

	/**
	 * Set External Audio Output
	 *
	 * @param handle: return value from SmartPlayerOpen()
	 *
	 * @param external_audio_output:  External Audio Output
	 *
	 * @return {0} if successful
	 */
	public native int SmartPlayerSetExternalAudioOutput(long handle, Object external_audio_output);

Specific call example:

//libPlayer.SmartPlayerSetExternalRender(playerHandle, new RGBAExternalRender());
//libPlayer.SmartPlayerSetExternalRender(playerHandle, new I420ExternalRender());

Obtain the raw data and perform further operations on it (such as face recognition):

    class RGBAExternalRender implements NTExternalRender {
        // public static final int NT_FRAME_FORMAT_RGBA = 1;
        // public static final int NT_FRAME_FORMAT_ABGR = 2;
        // public static final int NT_FRAME_FORMAT_I420 = 3;

        private int width_ = 0;
        private int height_ = 0;
        private int row_bytes_ = 0;
        private ByteBuffer rgba_buffer_ = null;

        @Override
        public int getNTFrameFormat() {
            Log.i(TAG, "RGBAExternalRender::getNTFrameFormat return "
                    + NT_FRAME_FORMAT_RGBA);
            return NT_FRAME_FORMAT_RGBA;
        }

        @Override
        public void onNTFrameSizeChanged(int width, int height) {
            width_ = width;
            height_ = height;

            row_bytes_ = width_ * 4;

            Log.i(TAG, "RGBAExternalRender::onNTFrameSizeChanged width_:"
                    + width_ + " height_:" + height_);

            rgba_buffer_ = ByteBuffer.allocateDirect(row_bytes_ * height_);
        }

        @Override
        public ByteBuffer getNTPlaneByteBuffer(int index) {
            if (index == 0) {
                return rgba_buffer_;
            } else {
                Log.e(TAG,
                        "RGBAExternalRender::getNTPlaneByteBuffer index error:"
                                + index);
                return null;
            }
        }

        @Override
        public int getNTPlanePerRowBytes(int index) {
            if (index == 0) {
                return row_bytes_;
            } else {
                Log.e(TAG,
                        "RGBAExternalRender::getNTPlanePerRowBytes index error:"
                                + index);
                return 0;
            }
        }

        public void onNTRenderFrame(int width, int height, long timestamp) {
            if (rgba_buffer_ == null)
                return;

            rgba_buffer_.rewind();

            // copy buffer

            // test
            // byte[] test_buffer = new byte[16];
            // rgba_buffer_.get(test_buffer);

            Log.i(TAG, "RGBAExternalRender:onNTRenderFrame w=" + width + " h="
                    + height + " timestamp=" + timestamp);

            // Log.i(TAG, "RGBAExternalRender:onNTRenderFrame rgba:" +
            // bytesToHexString(test_buffer));
        }
    }

    class I420ExternalRender implements NTExternalRender {
        // public static final int NT_FRAME_FORMAT_RGBA = 1;
        // public static final int NT_FRAME_FORMAT_ABGR = 2;
        // public static final int NT_FRAME_FORMAT_I420 = 3;

        private int width_ = 0;
        private int height_ = 0;

        private int y_row_bytes_ = 0;
        private int u_row_bytes_ = 0;
        private int v_row_bytes_ = 0;

        private ByteBuffer y_buffer_ = null;
        private ByteBuffer u_buffer_ = null;
        private ByteBuffer v_buffer_ = null;

        @Override
        public int getNTFrameFormat() {
            Log.i(TAG, "I420ExternalRender::getNTFrameFormat return "
                    + NT_FRAME_FORMAT_I420);
            return NT_FRAME_FORMAT_I420;
        }

        @Override
        public void onNTFrameSizeChanged(int width, int height) {
            width_ = width;
            height_ = height;

            y_row_bytes_ = (width_ + 15) & (~15);
            u_row_bytes_ = ((width_ + 1) / 2 + 15) & (~15);
            v_row_bytes_ = ((width_ + 1) / 2 + 15) & (~15);

            y_buffer_ = ByteBuffer.allocateDirect(y_row_bytes_ * height_);
            u_buffer_ = ByteBuffer.allocateDirect(u_row_bytes_
                    * ((height_ + 1) / 2));
            v_buffer_ = ByteBuffer.allocateDirect(v_row_bytes_
                    * ((height_ + 1) / 2));

            Log.i(TAG, "I420ExternalRender::onNTFrameSizeChanged width_="
                    + width_ + " height_=" + height_ + " y_row_bytes_="
                    + y_row_bytes_ + " u_row_bytes_=" + u_row_bytes_
                    + " v_row_bytes_=" + v_row_bytes_);
        }

        @Override
        public ByteBuffer getNTPlaneByteBuffer(int index) {
            if (index == 0) {
                return y_buffer_;
            } else if (index == 1) {
                return u_buffer_;
            } else if (index == 2) {
                return v_buffer_;
            } else {
                Log.e(TAG, "I420ExternalRender::getNTPlaneByteBuffer index error:" + index);
                return null;
            }
        }

        @Override
        public int getNTPlanePerRowBytes(int index) {
            if (index == 0) {
                return y_row_bytes_;
            } else if (index == 1) {
                return u_row_bytes_;
            } else if (index == 2) {
                return v_row_bytes_;
            } else {
                Log.e(TAG, "I420ExternalRender::getNTPlanePerRowBytes index error:" + index);
                return 0;
            }
        }

        public void onNTRenderFrame(int width, int height, long timestamp) {
            if (y_buffer_ == null)
                return;

            if (u_buffer_ == null)
                return;

            if (v_buffer_ == null)
                return;


            y_buffer_.rewind();

            u_buffer_.rewind();

            v_buffer_.rewind();

    		/* A commented-out sample originally followed here (truncated in the source):
    		   it saved one frame as an NV21 image by copying the Y plane and
    		   interleaving the U/V planes row by row. */


            Log.i(TAG, "I420ExternalRender::onNTRenderFrame w=" + width + " h=" + height + " timestamp=" + timestamp);

            // copy buffer

            // test
            // byte[] test_buffer = new byte[16];
            // y_buffer_.get(test_buffer);

            // Log.i(TAG, "I420ExternalRender::onNTRenderFrame y data:" + bytesToHexString(test_buffer));

            // u_buffer_.get(test_buffer);
            // Log.i(TAG, "I420ExternalRender::onNTRenderFrame u data:" + bytesToHexString(test_buffer));

            // v_buffer_.get(test_buffer);
            // Log.i(TAG, "I420ExternalRender::onNTRenderFrame v data:" + bytesToHexString(test_buffer));
        }
    }
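A side note on the `I420ExternalRender` above: the `(width_ + 15) & (~15)` expressions round each plane's row stride up to a multiple of 16, so a row may contain padding beyond the visible width. If the decoded I420 frame needs to be fed to a face-recognition library that expects NV21, the planes can be repacked as sketched below. This is an illustrative helper, not part of the SDK; it assumes the meaningful pixels occupy the first `width`/`chromaW` bytes of each (possibly padded) row:

```java
public class I420ToNv21 {
    /** Round up to the next multiple of 16, mirroring (x + 15) & (~15) in the render callback. */
    public static int align16(int x) {
        return (x + 15) & (~15);
    }

    /**
     * Repack separate Y/U/V planes (each with its own row stride) into an
     * NV21 buffer: a tightly packed Y plane followed by interleaved V,U pairs.
     */
    public static byte[] toNv21(byte[] y, int yStride,
                                byte[] u, int uStride,
                                byte[] v, int vStride,
                                int width, int height) {
        int chromaW = (width + 1) / 2;
        int chromaH = (height + 1) / 2;
        byte[] nv21 = new byte[width * height + 2 * chromaW * chromaH];

        // Copy the Y plane row by row, dropping any stride padding.
        for (int row = 0; row < height; ++row) {
            System.arraycopy(y, row * yStride, nv21, row * width, width);
        }
        // NV21 stores chroma as V,U,V,U,...
        int dst = width * height;
        for (int row = 0; row < chromaH; ++row) {
            for (int col = 0; col < chromaW; ++col) {
                nv21[dst++] = v[row * vStride + col];
                nv21[dst++] = u[row * uStride + col];
            }
        }
        return nv21;
    }
}
```

Passing `y_row_bytes_`, `u_row_bytes_` and `v_row_bytes_` as the stride arguments inside `onNTRenderFrame` would produce a packed NV21 frame suitable for further processing.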

Conclusion

These are some considerations on the decoding and drawing parts of an RTMP/RTSP player for the Android platform, offered as a reference for interested developers.