Contents

  1. Basic knowledge
  2. Use GLSurfaceView to play decoded YUV data
  3. Problems encountered
  4. References
  5. Takeaways

1. Basic knowledge

1.1 YUV and RGB

A video is a sequence of images (frames). Common color spaces for representing an image include YUV, RGB, HSV, and so on. FFmpeg decodes video into YUV data, while OpenGL ES renders RGB data, so we first need to convert YUV to RGB. The corresponding conversion is:

rgb = mat3(1.0,     1.0,      1.0,
           0.0,     -0.39465, 2.03211,
           1.13983, -0.5806,  0.0) * yuv;
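Expanded component by component, the matrix product works out as below. This is only a minimal Java sketch of the arithmetic (it assumes U and V have already been centered by subtracting 0.5, as the fragment shader later does):

// Expands rgb = mat3(...) * yuv from above; u and v are the chroma samples minus 0.5.
static float[] yuvToRgb(float y, float u, float v) {
    float r = y + 1.13983f * v;
    float g = y - 0.39465f * u - 0.5806f * v;
    float b = y + 2.03211f * u;
    return new float[]{r, g, b};
}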

1.2 OpenGL ES Basics

In the OpenGL ES part of this series we covered the basic rendering flow, GLSL syntax, drawing various shapes, and matrix transformations. Anything unclear or forgotten can be reviewed in the earlier posts; OpenGL ES is a big topic and we will do much more with it later:

OpenGL ES basic concepts

(8) GLSL and Shader rendering process

OpenGL ES draws plane graphics

Audio and video development journey (10) GLSurfaceView source code parsing &EGL environment

OpenGL ES Matrix Transformation and coordinate system

OpenGL ES textures

2. Use GLSurfaceView to play decoded YUV data

In the previous articles we decoded the video stream into raw YUV and verified it with YUVPlayer and ffplay on PC. In this section we use the GLSurfaceView provided by Android to render the video. Since GLSurfaceView already comes with an EGL render thread, we use it first to get familiar with the rendering process.
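Before the shaders and the renderer, it helps to see how the pieces are wired together. A minimal sketch, assuming a GLSurfaceView created in code and the MyRender class defined below (the Activity name and layout details are assumptions, not from the original project):

import android.app.Activity;
import android.opengl.GLSurfaceView;
import android.os.Bundle;

public class PlayerActivity extends Activity {
    private GLSurfaceView glSurfaceView;
    private MyRender myRender;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        glSurfaceView = new GLSurfaceView(this);
        // request an OpenGL ES 2.0 context before setting the renderer
        glSurfaceView.setEGLContextClientVersion(2);
        myRender = new MyRender(getResources());
        glSurfaceView.setRenderer(myRender);
        // draw only when requestRender() is called (see the black-screen issue below)
        glSurfaceView.setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);
        setContentView(glSurfaceView);
    }
}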

First we write the vertex shader and the fragment shader.

Vertex shader:

//#version 120

attribute vec4 aPosition;
attribute vec2 aTextureCoord;

varying vec2 vTextureCoord;

void main() {
    gl_Position = aPosition;
    vTextureCoord = aTextureCoord;
}

Fragment shader:

//#version 120
precision mediump float;

varying vec2 vTextureCoord;

uniform sampler2D samplerY;
uniform sampler2D samplerU;
uniform sampler2D samplerV;

void main() {
    vec3 yuv;
    vec3 rgb;
    yuv.r = texture2D(samplerY, vTextureCoord).g;
    yuv.g = texture2D(samplerU, vTextureCoord).g - 0.5;
    yuv.b = texture2D(samplerV, vTextureCoord).g - 0.5;
    rgb = mat3(1.0,     1.0,      1.0,
               0.0,     -0.39465, 2.03211,
               1.13983, -0.5806,  0.0) * yuv;
    gl_FragColor = vec4(rgb, 1.0);
}

The renderer code is shown below. It is a fairly standard setup; if anything is unclear, you can look back at the OpenGL series.

package android.spport.mylibrary2;

import android.content.res.Resources;
import android.opengl.GLES20;
import android.opengl.GLSurfaceView;
import android.util.Log;

import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

public class MyRender implements GLSurfaceView.Renderer {
    private Resources resources;
    private int program;

    private float verCoords[] = {
            -1f, -1f, //LB
            1f, -1f,  //RB
            -1f, 1f,  //LT
            1f, 1f    //RT
    };
    private float textureCoords[] = {
            0f, 1f, //LT
            1f, 1f, //RT
            0f, 0f, //LB
            1f, 0f  //RB
    };

    private final int BYTES_PER_FLOAT = 4;

    private int aPositionLocation;
    private int aTextureCoordLocation;
    private int samplerYLocation;
    private int samplerULocation;
    private int samplerVLocation;

    private FloatBuffer verCoorFB;
    private FloatBuffer textureCoorFB;

    private int[] textureIds;

    public MyRender(Resources resources) {
        this.resources = resources;
    }

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        GLES20.glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
        String vertexShader = ShaderHelper.loadAsset(resources, "vertex_shader.glsl");
        String fragShader = ShaderHelper.loadAsset(resources, "frag_shader.glsl");
        program = ShaderHelper.loadProgram(vertexShader, fragShader);

        aPositionLocation = GLES20.glGetAttribLocation(program, "aPosition");
        aTextureCoordLocation = GLES20.glGetAttribLocation(program, "aTextureCoord");
        samplerYLocation = GLES20.glGetUniformLocation(program, "samplerY");
        samplerULocation = GLES20.glGetUniformLocation(program, "samplerU");
        samplerVLocation = GLES20.glGetUniformLocation(program, "samplerV");

        verCoorFB = ByteBuffer.allocateDirect(verCoords.length * BYTES_PER_FLOAT)
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer()
                .put(verCoords);
        verCoorFB.position(0);

        textureCoorFB = ByteBuffer.allocateDirect(textureCoords.length * BYTES_PER_FLOAT)
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer()
                .put(textureCoords);
        textureCoorFB.position(0);

        // create three textures, one each for the Y, U and V planes
        textureIds = new int[3];
        GLES20.glGenTextures(3, textureIds, 0);
        for (int i = 0; i < 3; i++) {
            GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureIds[i]);
            GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_REPEAT);
            GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_REPEAT);
            GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
            GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        }
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        GLES20.glViewport(0, 0, width, height);
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        Log.i("MyRender", "onDrawFrame: width=" + width + " height=" + height);
        if (width > 0 && height > 0 && y != null && u != null && v != null) {
            GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
            GLES20.glUseProgram(program);

            GLES20.glEnableVertexAttribArray(aPositionLocation);
            GLES20.glVertexAttribPointer(aPositionLocation, 2, GLES20.GL_FLOAT, false, 2 * BYTES_PER_FLOAT, verCoorFB);
            GLES20.glEnableVertexAttribArray(aTextureCoordLocation);
            GLES20.glVertexAttribPointer(aTextureCoordLocation, 2, GLES20.GL_FLOAT, false, 2 * BYTES_PER_FLOAT, textureCoorFB);

            GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
            GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureIds[0]);
            GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_LUMINANCE, width, height, 0, GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, y);

            GLES20.glActiveTexture(GLES20.GL_TEXTURE1);
            GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureIds[1]);
            GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_LUMINANCE, width / 2, height / 2, 0, GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, u);

            GLES20.glActiveTexture(GLES20.GL_TEXTURE2);
            GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureIds[2]);
            GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_LUMINANCE, width / 2, height / 2, 0, GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, v);

            GLES20.glUniform1i(samplerYLocation, 0);
            GLES20.glUniform1i(samplerULocation, 1);
            GLES20.glUniform1i(samplerVLocation, 2);

            y.clear();
            y = null;
            u.clear();
            u = null;
            v.clear();
            v = null;

            GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
            GLES20.glDisableVertexAttribArray(aPositionLocation);
            GLES20.glDisableVertexAttribArray(aTextureCoordLocation);
        }
    }

    private int width;
    private int height;
    private ByteBuffer y;
    private ByteBuffer u;
    private ByteBuffer v;

    public void setYUVRenderData(int width, int height, byte[] y, byte[] u, byte[] v) {
        this.width = width;
        this.height = height;
        this.y = ByteBuffer.wrap(y);
        this.u = ByteBuffer.wrap(u);
        this.v = ByteBuffer.wrap(v);
    }
}
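The renderer relies on a small ShaderHelper class for loading assets and building the GL program; that class is not shown in the original, so the following is only a minimal sketch of what loadAsset and loadProgram might look like (the real project's implementation may differ; no compile/link error checking):

package android.spport.mylibrary2;

import android.content.res.Resources;
import android.opengl.GLES20;

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class ShaderHelper {
    // Read a shader source file from the assets folder into a String
    public static String loadAsset(Resources resources, String name) {
        try (InputStream is = resources.getAssets().open(name);
             ByteArrayOutputStream bos = new ByteArrayOutputStream()) {
            byte[] buffer = new byte[1024];
            int len;
            while ((len = is.read(buffer)) != -1) {
                bos.write(buffer, 0, len);
            }
            return bos.toString("UTF-8");
        } catch (IOException e) {
            throw new RuntimeException("Failed to load shader asset: " + name, e);
        }
    }

    // Compile both shaders and link them into a program
    public static int loadProgram(String vertexSource, String fragmentSource) {
        int vertexShader = compileShader(GLES20.GL_VERTEX_SHADER, vertexSource);
        int fragmentShader = compileShader(GLES20.GL_FRAGMENT_SHADER, fragmentSource);
        int program = GLES20.glCreateProgram();
        GLES20.glAttachShader(program, vertexShader);
        GLES20.glAttachShader(program, fragmentShader);
        GLES20.glLinkProgram(program);
        return program;
    }

    private static int compileShader(int type, String source) {
        int shader = GLES20.glCreateShader(type);
        GLES20.glShaderSource(shader, source);
        GLES20.glCompileShader(shader);
        return shader;
    }
}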

On the native side, each decoded frame's YUV data is passed back to the Java layer through a JNI callback and then rendered with GLSurfaceView:

extern "C" { #include "include/libavcodec/avcodec.h" #include "include/libavformat/avformat.h" #include "include/log.h" #include <libswscale/swscale.h> #include <libavutil/imgutils.h> #include <libswresample/swresample.h> #include <SLES/OpenSLES.h> #include <SLES/OpenSLES_Android.h> #include <libavutil/time.h> } jmethodID onCallYuvData; jobject jcallJavaobj; JavaVM* javaVM; extern "C" JNIEXPORT void JNICALL Java_android_spport_mylibrary2_Demo_initYUVNativeMethod(JNIEnv *env, jobject thiz) { // jcallJavaobj = thiz; jcallJavaobj = env->NewGlobalRef(thiz); env->GetJavaVM(&javaVM); onCallYuvData = env->GetMethodID(env->GetObjectClass(thiz), "onCallYUVData", "(II[B[B[B)V"); } extern "C" JNIEXPORT jint JNICALL Java_android_spport_mylibrary2_Demo_decodeVideo(JNIEnv *env, Jobject thiz, JString inputPath, jString outPath) {... If (onCallYuvData!=NULL) {jbyteArray yData = env->NewByteArray(y_size); jbyteArray uData = env->NewByteArray(y_size/4); jbyteArray vData = env->NewByteArray(y_size/4); env->SetByteArrayRegion(yData, 0, y_size, reinterpret_cast<const jbyte *>(pFrameYUV->data[0])); env->SetByteArrayRegion(uData, 0, y_size/4, reinterpret_cast<const jbyte *>(pFrameYUV->data[1])); env->SetByteArrayRegion(vData, 0, y_size/4, reinterpret_cast<const jbyte *>(pFrameYUV->data[2])); // env->SetByteArrayRegion(vData, 0, y_size/4, reinterpret_cast<const jbyte *>(pFrameYUV->data[1])); // env->SetByteArrayRegion(uData, 0, y_size/4, reinterpret_cast<const jbyte *>(pFrameYUV->data[2])); LOGI("native onCallYuvData widith=%d",pCodecParameters->width); Env ->NewGlobalRef(thiz); Otherwise, a wild pointer exception will occur env->CallVoidMethod(jcallJavaobj,onCallYuvData,pCodecParameters->width,pCodecParameters->height,yData,uData,vData); Env ->DeleteLocalRef(yData); env->DeleteLocalRef(uData); env->DeleteLocalRef(vData); Av_usleep (1000 * 50);}...}; av_usleep(1000 * 50);Copy the code

With this in place, video playback works.

Passing the decoded YUV to the Java layer through a JNI callback for rendering has a cost. Can we complete the rendering directly in the C++ layer? Of course: OpenGL ES provides both Java and Native APIs, so rendering can be done entirely in the native layer. However, the native layer has no ready-made EGL environment like the one GLSurfaceView encapsulates, so we need to create a GL render thread ourselves. We will practice this later: a decoding thread plus a rendering thread in the native layer, using OpenSL ES to play the audio and OpenGL ES to render the video.

The code has been uploaded to GitHub: github.com/ayyb1988/ff… Welcome to exchange ideas, learn and grow together.

3. Problems encountered

  1. A JNI DETECTED ERROR IN APPLICATION exception occurred at runtime

    2021-03-10 06:57:35.963 5247-5247/? A/DEBUG: Abort message: 'JNI DETECTED ERROR IN APPLICATION: use of invalid jobject 0x7fcea23564 from int android.spport.mylibrary2.Demo.decodeVideo(java.lang.String, java.lang.String)'
    ...
    2021-03-10 06:57:36.071 5247-5247/? A/DEBUG: #00 pc 00000000000830f0 /apex/com.android.runtime/lib64/bionic/libc.so (abort+160) (BuildId: e55e6e4c631509598633769798683023)
    ...
    2021-03-10 06:57:36.072 5247-5247/? A/DEBUG: #08 pc 000000000036771c /apex/com.android.runtime/lib64/libart.so (art::(anonymous namespace)::ScopedCheck::Check(art::ScopedObjectAccess&, bool, char const*, art::(anonymous namespace)::JniValueType*)+652) (BuildId: d700c52998d7d76cb39e2001d670e654)
    2021-03-10 06:57:36.072 5247-5247/? A/DEBUG: #09 pc 000000000036c76c /apex/com.android.runtime/lib64/libart.so (art::(anonymous namespace)::CheckJNI::CheckCallArgs(art::ScopedObjectAccess&, art::(anonymous namespace)::ScopedCheck&, _JNIEnv*, _jobject*, _jclass*, _jmethodID*, art::InvokeType, art::(anonymous namespace)::VarArgs const*)+132) (BuildId: d700c52998d7d76cb39e2001d670e654)

Cause:

The jobject was stored in a global variable without creating a global reference. Once the native function returns, that local reference is no longer valid, so the stored jobject points to an invalid address.

Solution:

// jcallJavaobj = thiz;
jcallJavaobj = env->NewGlobalRef(thiz);

2. With RENDERMODE_WHEN_DIRTY the screen stayed black; the logs showed that requestRender was called, but onDrawFrame was never triggered.

This is a timing issue: right after the GLSurfaceView is inflated, its EGL environment is not ready yet, so decoding and rendering are started with a short post delay:

    glSurfaceView.postDelayed(new Runnable() {
            @Override
            public void run() {
                demo.initYUVNativeMethod();
                demo.decodeVideo(folderurl+"/input.mp4", externalFilesDir+"/output7.yuv");
            }
        },300);

3. The rendered video is upside down

// before
private float verCoords[] = {
        1.0f, -1.0f,  //RB
        -1.0f, -1.0f, //LB
        1.0f, 1.0f,   //RT
        -1.0f, 1.0f   //LT
};
private float textureCoords[] = {
        1.0f, 0.0f, //RB
        0.0f, 0.0f, //LB
        1.0f, 1.0f, //RT
        0.0f, 1.0f  //LT
};

// after
private float verCoords[] = {
        -1f, -1f, //LB
        1f, -1f,  //RB
        -1f, 1f,  //LT
        1f, 1f    //RT
};
private float textureCoords[] = {
        0f, 1f, //LT
        1f, 1f, //RT
        0f, 0f, //LB
        1f, 0f  //RB
};

Reason: the texture and vertex coordinates in OpenGL have the y-axis pointing up, while the Android screen's y-axis points down. So when mapping from OpenGL to the screen, the coordinates need to be flipped vertically.

4. The rendered video skipped frames. Log inspection showed that decoding was running faster than rendering, so frames that had not yet been rendered were overwritten by newly decoded data. Since the decoding and rendering threads have not yet been separated and synchronized through a decode buffer, a simple delay is used for now: a 50 ms sleep after decoding each packet.

 av_usleep(1000 * 50);
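As noted above, the proper fix is to separate the decoding and rendering threads and put a decode buffer between them. A minimal sketch of that idea, using a small bounded queue (the class and field names here are hypothetical, not from the project):

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Decoder thread calls put(); the renderer calls take() before drawing.
// put() blocks when the queue is full, so decoding cannot outrun rendering.
public class YuvFrameQueue {
    public static class YuvFrame {
        public final int width, height;
        public final byte[] y, u, v;
        public YuvFrame(int width, int height, byte[] y, byte[] u, byte[] v) {
            this.width = width; this.height = height;
            this.y = y; this.u = u; this.v = v;
        }
    }

    private final BlockingQueue<YuvFrame> queue = new ArrayBlockingQueue<>(3);

    public void put(YuvFrame frame) throws InterruptedException {
        queue.put(frame);
    }

    public YuvFrame take() throws InterruptedException {
        return queue.take();
    }
}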

5. Playing the decoded YUV file with ffplay on the PC looked normal, but the picture rendered on the phone was wrong, so the problem had to be in the rendering. Checking the render code showed that the width and height of the V texture were not set correctly:

// before: the V texture was created with the full width and height
GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_LUMINANCE, width, height, 0, GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, v);
// after: the V plane is only width/2 x height/2
GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_LUMINANCE, width / 2, height / 2, 0, GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, v);

4. References

  1. (8) Master basic knowledge of video and render YUV data with OpenGL ES 2.0
  2. YUV <-> RGB conversion algorithms
  3. Render YUV video based on OpenGl on Android platform
  4. Render YUV textures with OpenGL ES
  5. JNI DETECTED ERROR IN APPLICATION resolve records

5. Takeaways

  1. Reviewed YUV and RGB basics
  2. Used GLSurfaceView to render decoded YUV video data
  3. Solved problems such as frame skipping caused by decoding and rendering being out of sync, a green screen when rendering, and an upside-down picture

Thank you for reading

The next few posts were originally planned to cover native-layer rendering, audio/video synchronization, encoding, variable-speed playback, RTMP push/pull streaming, and so on. But recently I have become a little restless: there is so much to learn, not only audio and video but also all kinds of advanced Android topics. I wanted to spread my energy and cover everything, but energy is limited, and sometimes you have to focus like a laser to get things done. My work now involves a new area, and studying something unrelated in my spare time would not reinforce it; that would be a kind of avoidance, and when you get stuck you should face it and deal with it. OpenGL is used a lot in my current work and much of it still needs learning and practice, so to make work and study reinforce each other I have decided to pause the FFmpeg series and focus on OpenGL ES rendering next. Adjust the priorities and the order. FFmpeg, see you later.

Next we will learn and practice FBO. Welcome to follow the public account "Audio and Video Development Journey" and learn and grow together.

Welcome to exchange ideas and communicate.