To know a class is to make a friend; to read a source file is to sit in on a master's session; to read a frame of code is to witness a passage of thought; to write a program is to create a life; to make a Git commit is to record a moment of growth. Life may not be perfect, but it can be sublime. — Zhang Fengjieteilie


SurfaceView

SurfaceView shows up in video, camera, game, Flutter, and other scenarios that demand high-performance rendering. If you want to do high-performance rendering on Android, SurfaceView is a hurdle you must clear.

This article gives you a minimal, working sense of SurfaceView through the following points:

[1]. Camera preview with SurfaceView
[2]. Camera2 preview with SurfaceView
[3]. GLSurfaceView and OpenGL
[4]. Using OpenGL with the camera
[5]. Using OpenGL in video playback
[6]. The connection between Flutter and SurfaceView


1. Using SurfaceView to enable Camera preview

SurfaceView relies on the SurfaceHolder class; the two are inseparable. Camera's setPreviewDisplay method takes a SurfaceHolder.

The Surface behind the holder is not created immediately, so you register a callback listener to be notified when it is created, changed, or destroyed, and to act accordingly. The listener interface is SurfaceHolder.Callback; for convenience the view can implement it directly, or you can create a separate class.

For detailed usage, see: Android Multimedia — Camera-related operations.

public class CameraSurfaceView extends SurfaceView implements SurfaceHolder.Callback {
    private Camera camera;

    public CameraSurfaceView(Context context) {
        this(context,null);
    }

    public CameraSurfaceView(Context context, AttributeSet attrs) {
        this(context, attrs,0);
    }

    public CameraSurfaceView(Context context, AttributeSet attrs, int defStyleAttr) {
        super(context, attrs, defStyleAttr);
        getHolder().addCallback(this);// Add a callback to the SurfaceHolder of the SurfaceView
    }

    // ----------------- Overridden SurfaceHolder.Callback methods -----------------
    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        camera = Camera.open();
        camera.setDisplayOrientation(90);
        try {
            camera.setPreviewDisplay(holder); // Camera + SurfaceHolder
            camera.startPreview();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {}

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        camera.release(); // Release resources
    }
}
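To see it on screen, the view just needs to be attached to an Activity. A minimal sketch, assuming the CAMERA permission has already been granted at runtime (the Activity name here is hypothetical):

public class CameraActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Requires <uses-permission android:name="android.permission.CAMERA"/> in the manifest.
        // The preview starts as soon as the Surface is created (see surfaceCreated above).
        setContentView(new CameraSurfaceView(this));
    }
}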

2. Using SurfaceView with Camera2

Camera2 does not mean there is a Camera2 class; it refers to the android.hardware.camera2 package of the camera framework. It is considerably more complicated to use,

but simplicity has its limits and complexity has its value. Its display core, too, needs a SurfaceHolder.

For detailed usage, see: Android Multimedia — Camera2-related operations.

public class Camera2SurfaceView extends SurfaceView implements SurfaceHolder.Callback {
    private Handler mainHandler;
    private String mCameraID;
    private CameraManager mCameraManager;
    private CameraDevice mCameraDevice;// Camera equipment
    private CameraCaptureSession mCameraCaptureSession;
    private Handler childHandler;

    private CameraDevice.StateCallback mStateCallback;

    private Semaphore mCameraOpenCloseLock = new Semaphore(1);// To prevent the application from exiting before closing the camera

    public Camera2SurfaceView(Context context) {
        this(context,null);
    }

    public Camera2SurfaceView(Context context, AttributeSet attrs) {
        this(context, attrs,0);
    }

    public Camera2SurfaceView(Context context, AttributeSet attrs, int defStyleAttr) {
        super(context, attrs, defStyleAttr);
        getHolder().addCallback(this);// Add a callback to the SurfaceHolder of the SurfaceView
    }

    // ----------------- Overridden SurfaceHolder.Callback methods -----------------
    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        initHandler();// Initialize the thread handler
        initCamera();// Initialize the camera
        try {
            if (ActivityCompat.checkSelfPermission(getContext(), Manifest.permission.CAMERA)
                    != PackageManager.PERMISSION_GRANTED) {
                return;
            }
            mCameraManager.openCamera(mCameraID, mStateCallback, mainHandler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {}

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        mCameraDevice.close(); // Release resources
    }

    private void initCamera() {
        mCameraID = "" + CameraCharacteristics.LENS_FACING_FRONT; // LENS_FACING_FRONT == 0; on most devices camera ID "0" is the rear camera
        // Get the camera manager
        mCameraManager = (CameraManager) getContext().getSystemService(Context.CAMERA_SERVICE);

        mStateCallback = new CameraDevice.StateCallback() {
            @Override
            public void onOpened(@NonNull CameraDevice camera) {
                mCameraOpenCloseLock.release();
                mCameraDevice = camera;
                startPreview();
            }
            @Override
            public void onDisconnected(@NonNull CameraDevice camera) {
                mCameraOpenCloseLock.release();
                mCameraDevice.close();
            }
            @Override
            public void onError(@NonNull CameraDevice camera, int error) {
                mCameraOpenCloseLock.release();
                mCameraDevice.close();
            }
        };
    }

    private void initHandler() {
        HandlerThread handlerThread = new HandlerThread("Camera2");
        handlerThread.start();
        mainHandler = new Handler(getMainLooper());// Main thread Handler
        childHandler = new Handler(handlerThread.getLooper());// Child thread Handler
    }

    /** Start the preview */
    private void startPreview() {
        try {
            // Create capturerequest.Builder for preview
            final CaptureRequest.Builder reqBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
            // Use the SurfaceView surface as the target for CaptureRequest.Builder
            reqBuilder.addTarget(getHolder().getSurface());

            // Create a CameraCaptureSession, which manages preview requests and capture requests
            CameraCaptureSession.StateCallback stateCallback = new CameraCaptureSession.StateCallback() {
                        @Override
                        public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
                            if (null == mCameraDevice) return;
                            mCameraCaptureSession = cameraCaptureSession; // When the camera is ready, start displaying the preview
                            try {// Display preview
                                mCameraCaptureSession.setRepeatingRequest(reqBuilder.build(), null, childHandler);
                            } catch (CameraAccessException e) {
                                e.printStackTrace();
                            }
                        }

                        @Override
                        public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {}
                    };
            mCameraDevice.createCaptureSession(Collections.singletonList(getHolder().getSurface()),
                    stateCallback, childHandler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }
}
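One detail worth noticing: the sample above only ever calls release() on mCameraOpenCloseLock, so the semaphore does not actually guard anything yet. For it to prevent the app from exiting while the camera is still opening or closing, you would acquire it around those operations. A minimal sketch following the pattern used in Google's Camera2 samples (method names here are hypothetical; needs java.util.concurrent.TimeUnit):

    private void openCameraGuarded() {
        try {
            // Block briefly until any in-flight open/close has finished
            if (!mCameraOpenCloseLock.tryAcquire(2500, TimeUnit.MILLISECONDS)) {
                throw new RuntimeException("Timed out waiting to lock camera opening.");
            }
            mCameraManager.openCamera(mCameraID, mStateCallback, mainHandler); // released in the callbacks
        } catch (CameraAccessException | InterruptedException e) {
            e.printStackTrace();
        }
    }

    private void closeCameraGuarded() {
        try {
            mCameraOpenCloseLock.acquire();
            if (mCameraDevice != null) {
                mCameraDevice.close();
                mCameraDevice = null;
            }
        } catch (InterruptedException e) {
            e.printStackTrace();
        } finally {
            mCameraOpenCloseLock.release();
        }
    }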

3. Using GLSurfaceView for OpenGL

GLSurfaceView, as a subclass of SurfaceView, opens a door called OpenGL.

It implements the SurfaceHolder.Callback2 interface, and you need to pass it a GLSurfaceView.Renderer.

public class TriangleGLView extends GLSurfaceView implements GLSurfaceView.Renderer {
    private  Triangle mTriangle;
    
    public TriangleGLView(Context context) {
        this(context, null);
    }

    public TriangleGLView(Context context, AttributeSet attrs) {
        super(context, attrs);
        setEGLContextClientVersion(2);// Set OpenGL ES 2.0 context
        setRenderer(this);// Set the renderer
    }

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        mTriangle=new Triangle();
        GLES20.glClearColor(1.0f, 0.0f, 0.0f, 1.0f); // rgba
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        GLES20.glViewport(0, 0, width, height); // GL viewport
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        // Clear the color buffer and the depth buffer
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
        mTriangle.draw();
    }
}

Drawing in OpenGL is fairly daunting, so here is the simplest case: drawing a triangle.

If you are interested, you can take a look at the author's OpenGL articles; after reading them carefully, you can basically get started: Android Multimedia GL-ES Episode 1 — The Brave Gather; Android Multimedia GL-ES Episode 2 — Puzzle Cube; Android Multimedia GLES2 Episode 3 — The Light of the Flame; Android Multimedia GLES2 Episode 5 — The Light of the Universe; Android Multimedia GLES2 Episode 6 — The Platform of the Nine Layers.

public class Triangle {
    private FloatBuffer vertexBuffer;// Vertex buffer
    private final String vertexShaderCode =// Vertex shader code
            "attribute vec4 vPosition;" +
                    "void main() {" +
                    " gl_Position = vPosition;" +
                    "}";
    private final String fragmentShaderCode =// Fragment shader code
            "precision mediump float;" +
                    "uniform vec4 vColor;" +
                    "void main() {" +
                    " gl_FragColor = vColor;" +
                    "}";
    private final int mProgram;
    private int mPositionHandle;// Position handle
    private int mColorHandle;// Color handle
    private final int vertexCount = sCoo.length / COORDS_PER_VERTEX; // Number of vertices
    private final int vertexStride = COORDS_PER_VERTEX * 4; // 3 * 4 = 12 bytes per vertex
    // Number of coordinates per vertex in the array
    static final int COORDS_PER_VERTEX = 3;
    static float sCoo[] = {   // In counterclockwise order
            0.0f, 0.0f, 0.0f,    // top
            -1.0f, -1.0f, 0.0f,  // bottom left
            1.0f, -1.0f, 0.0f    // bottom right
    };
    // Color: rgba
    float color[] = {0.63671875f, 0.76953125f, 0.22265625f, 1.0f};

    public Triangle() {
        // Initialize the vertex byte buffer
        ByteBuffer bb = ByteBuffer.allocateDirect(sCoo.length * 4);// Each float: number of coordinates * 4 bytes
        bb.order(ByteOrder.nativeOrder());// Use the byte order of the native hardware device
        vertexBuffer = bb.asFloatBuffer();// Create a floating-point buffer from a byte buffer
        vertexBuffer.put(sCoo);// Add the coordinates to the FloatBuffer
        vertexBuffer.position(0);// Set the buffer to read the first coordinate
        int vertexShader = loadShader(
                GLES20.GL_VERTEX_SHADER,// Vertex shader
                vertexShaderCode);
        int fragmentShader = loadShader(
                GLES20.GL_FRAGMENT_SHADER,// Fragment shader
                fragmentShaderCode);
        mProgram = GLES20.glCreateProgram();// Create an empty OpenGL ES program
        GLES20.glAttachShader(mProgram, vertexShader);// Attach the vertex shader
        GLES20.glAttachShader(mProgram, fragmentShader);// Attach the fragment shader
        GLES20.glLinkProgram(mProgram);// Link into an executable OpenGL ES program
    }

    private int loadShader(int type, String shaderCode) {
        int shader = GLES20.glCreateShader(type);// Create a shader
        GLES20.glShaderSource(shader, shaderCode);// Add shader source code
        GLES20.glCompileShader(shader);// Compile the shader
        return shader;
    }

    public void draw() {
        // Add the program to the OpenGL ES environment
        GLES20.glUseProgram(mProgram);
        // Gets a handle to the vPosition member of the vertex shader
        mPositionHandle = GLES20.glGetAttribLocation(mProgram, "vPosition");
        // Enable handles for triangle vertices
        GLES20.glEnableVertexAttribArray(mPositionHandle);
        // Prepare the triangle coordinate data
        GLES20.glVertexAttribPointer(
                mPositionHandle, COORDS_PER_VERTEX,
                GLES20.GL_FLOAT, false,
                vertexStride, vertexBuffer);
        // Get a handle to the vColor member of the fragment shader
        mColorHandle = GLES20.glGetUniformLocation(mProgram, "vColor");
        // Set the color for the triangle
        GLES20.glUniform4fv(mColorHandle, 1, color, 0);
        // Draw a triangle
        GLES20.glDrawArrays(GLES20.GL_TRIANGLES, 0, vertexCount);
        // Disable the vertex array
        GLES20.glDisableVertexAttribArray(mPositionHandle);
    }
}
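One caveat: loadShader above never checks whether compilation succeeded, so a typo in the shader source fails silently. A minimal sketch of a checked variant (the method name and log tag are arbitrary):

    private int loadShaderChecked(int type, String shaderCode) {
        int shader = GLES20.glCreateShader(type);
        GLES20.glShaderSource(shader, shaderCode);
        GLES20.glCompileShader(shader);
        int[] status = new int[1];
        GLES20.glGetShaderiv(shader, GLES20.GL_COMPILE_STATUS, status, 0);
        if (status[0] == 0) { // 0 means compilation failed
            Log.e("Triangle", "Shader compile error: " + GLES20.glGetShaderInfoLog(shader));
            GLES20.glDeleteShader(shader);
            return 0;
        }
        return shader;
    }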

4. Using OpenGL with the camera

Now, to recap: the camera needs a Surface to draw on, and GLSurfaceView is a SurfaceView. But good things take time; it is not as simple as you might imagine…

In CameraGLView, create an OpenGL texture, bind it to a SurfaceTexture, and then create a Surface from that SurfaceTexture to feed the camera.

public class CameraGLView extends GLSurfaceView  implements GLSurfaceView.Renderer {

    private CameraDrawer cameraDrawer;
    public SurfaceTexture surfaceTexture;
    private int[] textureId = new int[1];

    // ---------------------------- Camera operations ----------------------------
    private Handler mainHandler;
    private Handler childHandler;
    private String mCameraID;
    private CameraManager mCameraManager;
    private CameraDevice mCameraDevice;// Camera equipment
    private CameraCaptureSession mCameraCaptureSession;

    private CameraDevice.StateCallback mStateCallback;
    private Size mVideoSize;
    private Semaphore mCameraOpenCloseLock = new Semaphore(1);// To prevent the application from exiting before closing the camera
    private Surface surface;


    public CameraGLView(Context context) {
        this(context,null);
    }

    public CameraGLView(Context context, AttributeSet attrs) {
        super(context, attrs);
        setEGLContextClientVersion(3);// Set OpenGL ES 3.0 context
        setRenderer(this);// Set the renderer
    }

    private void initHandler() {
        HandlerThread handlerThread = new HandlerThread("Camera2");
        handlerThread.start();
        mainHandler = new Handler(getMainLooper());// Main thread Handler
        childHandler = new Handler(handlerThread.getLooper());// Child thread Handler
    }

    private void initCamera() {
        mCameraID = "" + CameraCharacteristics.LENS_FACING_FRONT; // LENS_FACING_FRONT == 0; on most devices camera ID "0" is the rear camera
        // Get the camera manager
        mCameraManager = (CameraManager) getContext().getSystemService(Context.CAMERA_SERVICE);
        mVideoSize=getCameraOutputSizes(mCameraManager,mCameraID,SurfaceTexture.class).get(0);

        mStateCallback = new CameraDevice.StateCallback() {
            @Override
            public void onOpened(@NonNull CameraDevice camera) {
                mCameraOpenCloseLock.release();
                mCameraDevice = camera;
                startPreview();
            }
            @Override
            public void onDisconnected(@NonNull CameraDevice camera) {
                mCameraOpenCloseLock.release();
                mCameraDevice.close();
            }
            @Override
            public void onError(@NonNull CameraDevice camera, int error) {
                mCameraOpenCloseLock.release();
                mCameraDevice.close();
            }
        };
    }

    /** Get the output sizes of the given camera for the given output class, sorted in descending order */
    public List<Size> getCameraOutputSizes(CameraManager cameraManager, String cameraId, Class clz){
        try {
            CameraCharacteristics characteristics = cameraManager.getCameraCharacteristics(cameraId);
            StreamConfigurationMap configs = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
            List<Size> sizes = Arrays.asList(configs.getOutputSizes(clz));
            Collections.sort(sizes, (o1, o2) -> o1.getWidth() * o1.getHeight() - o2.getWidth() * o2.getHeight());
            Collections.reverse(sizes);
            return sizes;
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
        return null;
    }
    /** Start the preview */
    private void startPreview() {
        surfaceTexture.setDefaultBufferSize(mVideoSize.getWidth(), mVideoSize.getHeight());
        surfaceTexture.setOnFrameAvailableListener(surfaceTexture -> requestRender());
        surface = new Surface(surfaceTexture);

        try {
            // Create capturerequest.Builder for preview
            final CaptureRequest.Builder reqBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
            // Use the Surface created from the SurfaceTexture as the target of the CaptureRequest.Builder
            reqBuilder.addTarget(surface);
            // Create a CameraCaptureSession, which manages preview requests and capture requests
            CameraCaptureSession.StateCallback stateCallback = new CameraCaptureSession.StateCallback() {
                @Override
                public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
                    if (null == mCameraDevice) return;
                    mCameraCaptureSession = cameraCaptureSession; // When the camera is ready, start displaying the preview
                    try { // Display the preview
                        mCameraCaptureSession.setRepeatingRequest(reqBuilder.build(), null, childHandler);
                    } catch (CameraAccessException e) {
                        e.printStackTrace();
                    }
                }

                @Override
                public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {}
            };
            mCameraDevice.createCaptureSession(Collections.singletonList(surface), stateCallback, childHandler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        cameraDrawer=new CameraDrawer(getContext());
        // Create a texture object
        GLES30.glGenTextures(textureId.length, textureId, 0);
        // Bind the texture object to the SurfaceTexture
        surfaceTexture = new SurfaceTexture(textureId[0]);

        initHandler();// Initialize the thread handler
        initCamera();// Initialize the camera
        try {
            if (ActivityCompat.checkSelfPermission(getContext(), Manifest.permission.CAMERA)
                    != PackageManager.PERMISSION_GRANTED) {
                return;
            }
            mCameraManager.openCamera(mCameraID, mStateCallback, mainHandler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        GLES30.glViewport(0, 0, width, height); // GL viewport
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        surfaceTexture.updateTexImage(); // Fetch the latest camera frame into the texture
        GLES30.glClear(GLES30.GL_COLOR_BUFFER_BIT);
        cameraDrawer.draw(textureId[0]);
    }
}
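A side note: setOnFrameAvailableListener calls requestRender(), but requestRender() only has an effect in on-demand mode; by default a GLSurfaceView redraws continuously. To actually render exactly one frame per camera frame, you would add one line after setRenderer, a small tweak that is not in the code above:

        setEGLContextClientVersion(3); // Set the OpenGL ES 3.0 context
        setRenderer(this);             // Set the renderer
        // Only draw when requestRender() is called, i.e. when a camera frame arrives
        setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);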

The texture is drawn by the CameraDrawer class, which works much like the triangle drawing above and is colored by a pair of shaders.

Fragment shader: camera.fsh

#version 300 es
#extension GL_OES_EGL_image_external_essl3 : require
precision mediump float;

in vec2 vTexCoord;
out vec4 outColor;
uniform samplerExternalOES sTexture;

void main(){
    outColor = texture(sTexture, vTexCoord);
}

Vertex shader: camera.vsh

#version 300 es
in vec4 aPosition;
in vec2 aTexCoord;

out vec2 vTexCoord;

void main(){
    gl_Position = aPosition;
    vTexCoord = aTexCoord;
}

The drawer: CameraDrawer

public class CameraDrawer {
    // These names must match the attribute/uniform names in camera.vsh and camera.fsh
    private static final String VERTEX_ATTRIB_POSITION = "aPosition";
    private static final int VERTEX_ATTRIB_POSITION_SIZE = 3;
    private static final String VERTEX_ATTRIB_TEXTURE_POSITION = "aTexCoord";
    private static final int VERTEX_ATTRIB_TEXTURE_POSITION_SIZE = 2;
    private static final String UNIFORM_TEXTURE = "sTexture";

    // A full-screen quad
    private float[] vertex = {
            -1f,  1f, 0.0f, // top left
            -1f, -1f, 0.0f, // bottom left
             1f, -1f, 0.0f, // bottom right
             1f,  1f, 0.0f  // top right
    };
    // Texture coordinates
    public float[] textureCoord = {
            0.0f, 1.0f,
            1.0f, 1.0f,
            1.0f, 0.0f,
            0.0f, 0.0f
    };
    private FloatBuffer vertexBuffer;
    private FloatBuffer textureCoordBuffer;
    private int program;
    private Context context;

    public CameraDrawer(Context context) {
        this.context = context;
        initVertexAttrib(); // Initialize the vertex data
        program = GLUtil.loadAndInitProgram(this.context, "camera.vsh", "camera.fsh");
        GLES30.glClearColor(1.0f, 1.0f, 1.0f, 0.0f);
    }

    private void initVertexAttrib() {
        textureCoordBuffer = GLUtil.getFloatBuffer(textureCoord);
        vertexBuffer = GLUtil.getFloatBuffer(vertex);
    }

    public void draw(int textureId) {
        GLES30.glUseProgram(program);
        // Initialize the handles
        int vertexLoc = GLES30.glGetAttribLocation(program, VERTEX_ATTRIB_POSITION);
        int textureLoc = GLES30.glGetAttribLocation(program, VERTEX_ATTRIB_TEXTURE_POSITION);
        GLES30.glEnableVertexAttribArray(vertexLoc);
        GLES30.glEnableVertexAttribArray(textureLoc);
        GLES30.glVertexAttribPointer(vertexLoc, VERTEX_ATTRIB_POSITION_SIZE,
                GLES30.GL_FLOAT, false, 0, vertexBuffer);
        GLES30.glVertexAttribPointer(textureLoc, VERTEX_ATTRIB_TEXTURE_POSITION_SIZE,
                GLES30.GL_FLOAT, false, 0, textureCoordBuffer);
        // Bind the external OES texture that receives the camera frames
        GLES30.glActiveTexture(GLES30.GL_TEXTURE0);
        GLES30.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId);
        GLES30.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES30.GL_TEXTURE_MIN_FILTER, GLES30.GL_NEAREST);
        GLES30.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES30.GL_TEXTURE_MAG_FILTER, GLES30.GL_LINEAR);
        GLES30.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES30.GL_TEXTURE_WRAP_S, GLES30.GL_CLAMP_TO_EDGE);
        GLES30.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES30.GL_TEXTURE_WRAP_T, GLES30.GL_CLAMP_TO_EDGE);
        int uTextureLoc = GLES30.glGetUniformLocation(program, UNIFORM_TEXTURE);
        GLES30.glUniform1i(uTextureLoc, 0);
        // Draw the quad
        GLES30.glDrawArrays(GLES30.GL_TRIANGLE_FAN, 0, vertex.length / 3);
        // Disable the vertex arrays
        GLES30.glDisableVertexAttribArray(vertexLoc);
        GLES30.glDisableVertexAttribArray(textureLoc);
    }
}
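CameraDrawer relies on a small GLUtil helper that this article does not show. Based on the two calls used above, here is a minimal sketch of what it might look like; only the public method names come from the code, the implementation (including the readAsset helper) is assumed:

public class GLUtil {
    /** Wrap a float[] in a native-order direct FloatBuffer, rewound to position 0 */
    public static FloatBuffer getFloatBuffer(float[] data) {
        FloatBuffer buffer = ByteBuffer.allocateDirect(data.length * 4)
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer()
                .put(data);
        buffer.position(0);
        return buffer;
    }

    /** Load two shader sources from assets, compile them, and link a program */
    public static int loadAndInitProgram(Context context, String vshAsset, String fshAsset) {
        int vertexShader = compileShader(GLES30.GL_VERTEX_SHADER, readAsset(context, vshAsset));
        int fragmentShader = compileShader(GLES30.GL_FRAGMENT_SHADER, readAsset(context, fshAsset));
        int program = GLES30.glCreateProgram();
        GLES30.glAttachShader(program, vertexShader);
        GLES30.glAttachShader(program, fragmentShader);
        GLES30.glLinkProgram(program);
        return program;
    }

    private static int compileShader(int type, String source) {
        int shader = GLES30.glCreateShader(type);
        GLES30.glShaderSource(shader, source);
        GLES30.glCompileShader(shader);
        return shader;
    }

    private static String readAsset(Context context, String name) {
        StringBuilder sb = new StringBuilder();
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(context.getAssets().open(name)))) {
            String line;
            while ((line = reader.readLine()) != null) {
                sb.append(line).append('\n');
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
        return sb.toString();
    }
}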

If you know nothing about OpenGL, you will look at this result and say: "Damn, all this trouble just for a preview? Bye-bye. I'm done learning. Goodbye."

Yes, it is a mess, and it gets worse. But here is the thing: where you shy away, others dig in. That is the gap between people. What puzzles me most are those who are afraid of trouble yet ask everywhere for learning methods. As long as you are not afraid of trouble, are willing to drill into problems, read the source, and debug, what can stop you? Things are neither hard nor easy in themselves: to those who work at them they are easy, to those who avoid them they are hard.

OpenGL opens the door to a wide range of shader operations: filters, textures, coloring, transformations… You could even say: give me a shader, and I can create a world for you.
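For a taste of that power, turning the preview gray only takes a change to camera.fsh; the vertex shader and all the Java code stay untouched. A minimal sketch, using the usual BT.601 luminance weights (this variant is mine, not from the original project):

#version 300 es
#extension GL_OES_EGL_image_external_essl3 : require
precision mediump float;

in vec2 vTexCoord;
out vec4 outColor;
uniform samplerExternalOES sTexture;

void main(){
    vec4 color = texture(sTexture, vTexCoord);
    // Weighted average: the eye is most sensitive to green
    float gray = dot(color.rgb, vec3(0.299, 0.587, 0.114));
    outColor = vec4(vec3(gray), color.a);
}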


5. Use OpenGL in video playback

If you know anything about video playback, you know MediaPlayer and Surface can work in concert.
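In the plainest form, with no OpenGL at all, you simply hand the holder of a SurfaceView to the player. A minimal sketch inside a SurfaceHolder.Callback, reusing the example video path from the code below:

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        MediaPlayer player = new MediaPlayer();
        try {
            player.setDataSource("/sdcard/toly/sh.mp4"); // Example path
            player.setDisplay(holder); // MediaPlayer + SurfaceHolder, the same pairing as Camera
            player.prepare();
            player.start();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }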

So, similarly, you can combine video playback with OpenGL and use shaders to change the world. In GLVideoView, bind a texture to a SurfaceTexture and generate a Surface from it for MediaPlayer to render the video into. For MediaPlayer details, see: Android Multimedia — a video player based on MediaPlayer.

public class GLVideoView extends GLSurfaceView implements GLSurfaceView.Renderer,
        SurfaceTexture.OnFrameAvailableListener, MediaPlayer.OnVideoSizeChangedListener {

    private float[] sTMatrix = new float[16];
    private final float[] projectionMatrix = new float[16];
    private SurfaceTexture surfaceTexture;
    private MediaPlayer mediaPlayer;
    private VideoDrawer videoDrawer;
    private int textureId;
    private boolean updateSurface;
    private boolean playerPrepared;
    private int screenWidth, screenHeight;

    public GLVideoView(Context context) {
        super(context);
    }

    public GLVideoView(Context context, AttributeSet attrs) {
        super(context, attrs);
        setEGLContextClientVersion(2); // Set the OpenGL ES 2.0 context
        setRenderer(this); // Set the renderer
        initPlayer();
    }

    private void initPlayer() {
        mediaPlayer = new MediaPlayer();
        try {
            mediaPlayer.setDataSource(getContext(), Uri.parse("/sdcard/toly/sh.mp4"));
        } catch (IOException e) {
            e.printStackTrace();
        }
        mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
        mediaPlayer.setLooping(true);
        mediaPlayer.setOnVideoSizeChangedListener(this);
    }

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        videoDrawer = new VideoDrawer(getContext());
        playerPrepared = false;
        synchronized (this) {
            updateSurface = false;
        }
        int[] textures = new int[1];
        GLES20.glGenTextures(1, textures, 0);
        textureId = textures[0];
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId);
        surfaceTexture = new SurfaceTexture(textureId);
        surfaceTexture.setOnFrameAvailableListener(this);
        Surface surface = new Surface(surfaceTexture);
        mediaPlayer.setSurface(surface);
        surface.release();
        if (!playerPrepared) {
            try {
                mediaPlayer.prepare();
                playerPrepared = true;
            } catch (IOException t) {
                // Ignored in this demo
            }
            mediaPlayer.start();
            playerPrepared = true;
        }
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        screenWidth = width;
        screenHeight = height;
        GLES20.glViewport(0, 0, screenWidth, screenHeight);
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        synchronized (this) {
            if (updateSurface) {
                surfaceTexture.updateTexImage();
                surfaceTexture.getTransformMatrix(sTMatrix);
                updateSurface = false;
            }
        }
        videoDrawer.draw(textureId, projectionMatrix, sTMatrix);
    }

    @Override
    public void onFrameAvailable(SurfaceTexture surfaceTexture) {
        updateSurface = true;
    }

    @Override
    public void onVideoSizeChanged(MediaPlayer mp, int width, int height) {
        // Use the video size only for the projection; the view size is tracked in onSurfaceChanged
        updateProjection(width, height);
    }

    private void updateProjection(int videoWidth, int videoHeight) {
        float screenRatio = (float) screenWidth / screenHeight;
        float videoRatio = (float) videoWidth / videoHeight;
        if (videoRatio > screenRatio) {
            Matrix.orthoM(projectionMatrix, 0,
                    -1f, 1f, -videoRatio / screenRatio, videoRatio / screenRatio, -1f, 1f);
        } else {
            Matrix.orthoM(projectionMatrix, 0,
                    -screenRatio / videoRatio, screenRatio / videoRatio, -1f, 1f, -1f, 1f);
        }
    }
}

The shaders:

---->[video.fsh]----
#extension GL_OES_EGL_image_external : require
precision mediump float;
varying vec2 vTexCoord;
uniform samplerExternalOES sTexture;

void main() {
    vec3 color = texture2D(sTexture, vTexCoord).rgb;
    float threshold = 0.8; // Binarization threshold
    float mean = (color.r + color.g + color.b) / 3.0;
    color.r = color.g = color.b = mean >= threshold ? 1.0 : 0.0;
    gl_FragColor = vec4(1.0, color); // Red channel fixed at 1
}

---->[video.vsh]----
attribute vec4 aPosition; // Vertex position
attribute vec4 aTexCoord; // Texture coordinates
varying vec2 vTexCoord;
uniform mat4 uMatrix;
uniform mat4 uSTMatrix;

void main() {
    vTexCoord = (uSTMatrix * aTexCoord).xy;
    gl_Position = uMatrix * aPosition;
}

Then VideoDrawer handles the coloring and drawing:

public class VideoDrawer {
    private Context context;
    private int aPositionLocation;
    private int programId;
    private FloatBuffer vertexBuffer;
    private final float[] vertexData = {
            1f, -1f, 0f,
            -1f, -1f, 0f,
            1f, 1f, 0f,
            -1f, 1f, 0f
    };
    private int uMatrixLocation;
    private final float[] textureVertexData = {
            1f, 0f,
            0f, 0f,
            1f, 1f,
            0f, 1f
    };
    private FloatBuffer textureVertexBuffer;
    private int uTextureSamplerLocation;
    private int aTextureCoordLocation;
    private int uSTMMatrixHandle;

    public VideoDrawer(Context context) {
        this.context = context;
        vertexBuffer = ByteBuffer.allocateDirect(vertexData.length * 4)
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer()
                .put(vertexData);
        vertexBuffer.position(0);
        textureVertexBuffer = ByteBuffer.allocateDirect(textureVertexData.length * 4)
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer()
                .put(textureVertexData);
        textureVertexBuffer.position(0);
        programId = GLUtil.loadAndInitProgram(context, "video.vsh", "video.fsh");
        aPositionLocation = GLES20.glGetAttribLocation(programId, "aPosition");
        uMatrixLocation = GLES20.glGetUniformLocation(programId, "uMatrix");
        uSTMMatrixHandle = GLES20.glGetUniformLocation(programId, "uSTMatrix");
        uTextureSamplerLocation = GLES20.glGetUniformLocation(programId, "sTexture");
        aTextureCoordLocation = GLES20.glGetAttribLocation(programId, "aTexCoord");
    }

    public void draw(int textureId, float[] projectionMatrix, float[] sTMatrix) {
        GLES20.glClear(GLES20.GL_DEPTH_BUFFER_BIT | GLES20.GL_COLOR_BUFFER_BIT);
        GLES20.glUseProgram(programId);
        GLES20.glUniformMatrix4fv(uMatrixLocation, 1, false, projectionMatrix, 0);
        GLES20.glUniformMatrix4fv(uSTMMatrixHandle, 1, false, sTMatrix, 0);
        vertexBuffer.position(0);
        GLES20.glEnableVertexAttribArray(aPositionLocation);
        GLES20.glVertexAttribPointer(aPositionLocation, 3, GLES20.GL_FLOAT, false, 12, vertexBuffer);
        textureVertexBuffer.position(0);
        GLES20.glEnableVertexAttribArray(aTextureCoordLocation);
        GLES20.glVertexAttribPointer(aTextureCoordLocation, 2, GLES20.GL_FLOAT, false, 8, textureVertexBuffer);
        GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
        GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId);
        GLES20.glUniform1i(uTextureSamplerLocation, 0);
        GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
    }
}
6. Flutter and SurfaceView

If you know anything about how Flutter is implemented, you are probably familiar with FlutterView. On Android,

all Flutter views are drawn inside the FlutterView, which derives from SurfaceView. That alone shows how powerful SurfaceView is.

public class FlutterView extends SurfaceView 
                implements BinaryMessenger, TextureRegistry {

Next, let's look at how FlutterView wires itself to the SurfaceView machinery.

The mSurfaceCallback (covering surfaceCreated, surfaceChanged, and surfaceDestroyed) and nextTextureId are created directly in the constructor:

public class FlutterView extends SurfaceView implements BinaryMessenger.TextureRegistry {
    private static final String TAG = "FlutterView";
    // ...
    private final Callback mSurfaceCallback;
    // ...
    private final AtomicLong nextTextureId;

    // In the constructor
    this.mSurfaceCallback = new Callback() {
                public void surfaceCreated(SurfaceHolder holder) {
                    FlutterView.this.assertAttached();
                    FlutterView.this.mNativeView.getFlutterJNI().onSurfaceCreated(holder.getSurface());
                }

                public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
                    FlutterView.this.assertAttached();
                    FlutterView.this.mNativeView.getFlutterJNI().onSurfaceChanged(width, height);
                }

                public void surfaceDestroyed(SurfaceHolder holder) {
                    FlutterView.this.assertAttached();
                    FlutterView.this.mNativeView.getFlutterJNI().onSurfaceDestroyed();
                }
            };
    this.getHolder().addCallback(this.mSurfaceCallback);

The mSurfaceCallback is also removed on both detach and destroy:

    public FlutterNativeView detach() {
        if (!this.isAttached()) {
            return null;
        } else {
            this.getHolder().removeCallback(this.mSurfaceCallback);
            this.mNativeView.detachFromFlutterView();
            FlutterNativeView view = this.mNativeView;
            this.mNativeView = null;
            return view;
        }
    }

    public void destroy() {
        if (this.isAttached()) {
            this.getHolder().removeCallback(this.mSurfaceCallback);
            this.mNativeView.destroy();
            this.mNativeView = null;
        }
    }

  • SurfaceTexture instance object

It is built with the inner class SurfaceTextureRegistryEntry, and setOnFrameAvailableListener, which appeared above, shows up again here.

So it is worth getting familiar with; at the very least, when you see it, you will know what it is doing.

public SurfaceTextureEntry createSurfaceTexture() {
    SurfaceTexture surfaceTexture = new SurfaceTexture(0);
    surfaceTexture.detachFromGLContext();
    FlutterView.SurfaceTextureRegistryEntry entry = new FlutterView.SurfaceTextureRegistryEntry(this.nextTextureId.getAndIncrement(), surfaceTexture);
    this.mNativeView.getFlutterJNI().registerTexture(entry.id(), surfaceTexture);
    return entry;
}

final class SurfaceTextureRegistryEntry implements SurfaceTextureEntry {
    private final long id;
    private final SurfaceTexture surfaceTexture;
    private boolean released;
    private OnFrameAvailableListener onFrameListener = new OnFrameAvailableListener() {
        public void onFrameAvailable(SurfaceTexture texture) {
            if (!SurfaceTextureRegistryEntry.this.released && FlutterView.this.mNativeView != null) {
                FlutterView.this.mNativeView.getFlutterJNI().markTextureFrameAvailable(SurfaceTextureRegistryEntry.this.id);
            }
        }
    };

    SurfaceTextureRegistryEntry(long id, SurfaceTexture surfaceTexture) {
        this.id = id;
        this.surfaceTexture = surfaceTexture;
        if (VERSION.SDK_INT >= 21) {
            this.surfaceTexture.setOnFrameAvailableListener(this.onFrameListener, new Handler());
        } else {
            this.surfaceTexture.setOnFrameAvailableListener(this.onFrameListener);
        }
    }

    public SurfaceTexture surfaceTexture() {
        return this.surfaceTexture;
    }
    public long id() {
        return this.id;
    }
}

The most important job of the SurfaceTexture here is texture registration, which goes through FlutterJNI's nativeRegisterTexture method.

---->[io.flutter.embedding.engine.FlutterJNI#registerTexture]----
    @UiThread
    public void registerTexture(long textureId, @NonNull SurfaceTexture surfaceTexture) {
        this.ensureRunningOnMainThread();
        this.ensureAttachedToNative();
        this.nativeRegisterTexture(this.nativePlatformViewId, textureId, surfaceTexture);
    }

---->[io.flutter.embedding.engine.FlutterJNI#nativeRegisterTexture]----
private native void nativeRegisterTexture(long var1, long var3, @NonNull SurfaceTexture var5);

So where is the C++ implementation of nativeRegisterTexture?

To view the C++ code behind FlutterJNI, download the Flutter engine from GitHub. Location: engine-master/shell/platform/android/platform_view_android_jni.h

static void RegisterTexture(JNIEnv* env, jobject jcaller, jlong shell_holder, jlong texture_id, jobject surface_texture) {
  ANDROID_SHELL_HOLDER->GetPlatformView()->RegisterExternalTexture(
      static_cast<int64_t>(texture_id),                        //
      fml::jni::JavaObjectWeakGlobalRef(env, surface_texture)  //
  );
}

bool RegisterApi(JNIEnv* env) {
  static const JNINativeMethod flutter_jni_methods[] = {
     {
          .name = "nativeRegisterTexture",
          .signature = "(JJLandroid/graphics/SurfaceTexture;)V",
          .fnPtr = reinterpret_cast<void*>(&RegisterTexture),
      },
      // ...

  if (env->RegisterNatives(g_flutter_jni_class->obj(), flutter_jni_methods,
                           fml::size(flutter_jni_methods)) != 0) {
    FML_LOG(ERROR) << "Failed to RegisterNatives with FlutterJNI";
    return false;
  }

And so, by accident, we learned how JNI methods are registered… a nice bonus. What is a good way to learn? Look, think, and knowledge will meet you halfway.

bool PlatformViewAndroid::Register(JNIEnv* env) {
  if (env == nullptr) {
    FML_LOG(ERROR) << "No JNIEnv provided";
    return false;
  }

  // ...
  // FindClass looks up the Java class of FlutterJNI
  g_flutter_jni_class = new fml::jni::ScopedJavaGlobalRef<jclass>(
      env, env->FindClass("io/flutter/embedding/engine/FlutterJNI"));
  if (g_flutter_jni_class->is_null()) {
    FML_LOG(ERROR) << "Failed to find FlutterJNI Class.";
    return false;
  }

  // ...
  return RegisterApi(env);
}

I will stop digging here; there will be a chance to dig a dedicated hole another time. At this point you should have a reasonable feel for SurfaceView.

My task ends here, and I will pass the torch to the next article: Android drawing mechanism and Surface family source code analysis.

That article is the best explanation of the Surface family I have seen so far; it is worth reading from start to finish, even worth committing to memory.


Well, that is it for this article. The road through the jianghu is long; see you around.

I am Zhang Fengjieteilie. If there is anything you would like to discuss, please leave a comment, or add my WeChat: zdl1994328