Preface

I've been busy with Flutter lately; most of my free time has gone into FlutterUnit and singing Flutter's praises in the Flutter group, so the OpenGLES 3.0 series stalled for quite a while. Now that things have calmed down, I want to pick it back up with an installment on video effects. A friendly reminder: this article contains many large GIFs, so mind your data usage.

  • [Multimedia] OpenGLES3.0 video effects – Introduction
  • [OpenGLES3.0] Episode 1 mainline – Opening the door to a new world
  • [OpenGLES3.0] Episode 2 mainline – Drawing surfaces and image maps
  • [OpenGLES3.0] Episode 3 mainline – Shaders and image effects

As mentioned earlier, OpenGLES can apply effects to texture maps through fragment shaders. The same holds for video, such as the red effect below: MediaPlayer continuously updates the video texture, OpenGLES draws it, and in between the fragment shader can manipulate the texture to achieve all kinds of effects.

  • Color effects, by controlling the output color of the fragment shader
  • Position effects, by manipulating the texture coordinates in the fragment shader
  • Dynamic effects, by feeding in parameters that change over time

One. Preparation

Github address: github.com/toly1994328…

1. Prepare resources

To implement video effects you obviously need a video, and this article does not cover runtime permissions, so the video file is kept in a directory the app can access. Handle this however you like, as long as the video resource is readable.


2. Full-screen landscape processing

Perform the full-screen landscape setup in MainActivity#onCreate.

public class MainActivity extends AppCompatActivity {
    private GLVideoView videoView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        fullScreenLandSpace();
        // TODO set the content View
    }

    // Immersive landscape: hide the title bar and status bar
    private void fullScreenLandSpace() {
        // Check whether the SDK version is >= 21
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP) {
            Window window = getWindow();
            // Full screen, hide the status bar
            window.setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN, WindowManager.LayoutParams.FLAG_FULLSCREEN);
            window.getDecorView().setSystemUiVisibility(
                    View.SYSTEM_UI_FLAG_LAYOUT_FULLSCREEN | View.SYSTEM_UI_FLAG_LAYOUT_HIDE_NAVIGATION);
            window.setNavigationBarColor(Color.TRANSPARENT); // Make the navigation bar transparent
        }

        // If the ActionBar exists, hide it
        ActionBar actionBar = getSupportActionBar();
        if (actionBar != null) {
            actionBar.hide();
        }

        // If not already landscape, request landscape
        if (getResources().getConfiguration().orientation == Configuration.ORIENTATION_PORTRAIT) {
            setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);
        }
    }
}

3. Check the OpenGLES version and set the View

checkSupportOpenGLES30 checks whether the device supports OpenGLES 3.0. If it does, create a GLVideoView and pass it to setContentView to display it.

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    fullScreenLandSpace();
    if (checkSupportOpenGLES30()) {
        videoView = new GLVideoView(this);
        setContentView(videoView);
    } else {
        Log.e("MainActivity", "OpenGL ES 3.0 is not supported on this device!");
        finish();
    }
}

private boolean checkSupportOpenGLES30() {
    ActivityManager am = (ActivityManager) getSystemService(Context.ACTIVITY_SERVICE);
    if (am != null) {
        ConfigurationInfo info = am.getDeviceConfigurationInfo();
        return (info.reqGlEsVersion >= 0x30000);
    }
    return false;
}

Two. Hooking video playback into OpenGLES

1. The view: GLVideoView

GLVideoView extends GLSurfaceView, which is essentially a View.

public class GLVideoView extends GLSurfaceView {
    VideoRender render;

    public GLVideoView(Context context) {
        super(context);
        // Use an OpenGL ES 3.0 context
        setEGLContextClientVersion(3);

        // Video path
        String videoPath = "/data/data/com.toly1994.opengl_video/cache/sh.mp4";
        File video = new File(videoPath);
        // Initialize VideoRender
        render = new VideoRender(getContext(), video);
        // Set the renderer
        setRenderer(render);
    }
}

2. The renderer: VideoRender class definition

VideoRender implements the GLSurfaceView.Renderer interface to handle OpenGL render callbacks. Since it uses MediaPlayer and needs to monitor the video size, it also implements OnVideoSizeChangedListener. And because SurfaceTexture notifies its listener whenever a new stream frame becomes available, VideoRender implements OnFrameAvailableListener as well.

public class VideoRender implements
        GLSurfaceView.Renderer,                  // OpenGL render callbacks
        SurfaceTexture.OnFrameAvailableListener, // New-frame notifications
        MediaPlayer.OnVideoSizeChangedListener   // Video size changes
{

    private final Context context;   // Context
    private final File video;        // Video file
    private VideoDrawer videoDrawer; // Drawer
    private int viewWidth, viewHeight, videoWidth, videoHeight; // View and video sizes

    private MediaPlayer mediaPlayer;        // Video player
    private SurfaceTexture surfaceTexture;  // Surface texture
    private volatile boolean updateSurface; // Whether the surface texture needs updating
    private int textureId;                  // Texture id

    public VideoRender(Context context, File video) {
        this.context = context;
        this.video = video;
    }

3. What the three interface callbacks do
  • GLSurfaceView.Renderer has three callbacks, all of which run on the GLThread child thread:
  • onSurfaceCreated: fired when the surface is created or recreated; generally used for resource initialization;
  • onSurfaceChanged: fired when the surface size changes; used for setting transformation matrices;
  • onDrawFrame: fired on every frame; used for drawing.

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        Log.e("VideoRender", "onSurfaceCreated: " + Thread.currentThread().getName());
        videoDrawer = new VideoDrawer(context);
        initMediaPlayer();   // Initialize MediaPlayer
        mediaPlayer.start(); // Start playback
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        Log.e("VideoRender", "Thread name: " + Thread.currentThread().getName() +
                "-----onSurfaceChanged: (" + width + "," + height + ")"
        );
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        Log.e("VideoRender", "onDrawFrame: " + Thread.currentThread().getName());
    }
}

  • OnVideoSizeChangedListener has one callback, onVideoSizeChanged, which runs on the main thread; the video dimensions can be read there.
  • OnFrameAvailableListener has one callback, onFrameAvailable, triggered on the main thread whenever a new stream frame is available; there the texture-update flag can be set to true.
@Override
public void onVideoSizeChanged(MediaPlayer mp, int width, int height) {
    Log.e("VideoRender", "Thread name: " + Thread.currentThread().getName() +
            "-----onVideoSizeChanged: (" + width + "," + height + ")"
    );
}

@Override
public void onFrameAvailable(SurfaceTexture surfaceTexture) {
    Log.e("VideoRender", "onFrameAvailable: " + Thread.currentThread().getName());
}

4. Initializing the MediaPlayer

initMediaPlayer is called in onSurfaceCreated. It creates the MediaPlayer object and sets the video resource, audio stream type, looping, and size listener. The important part is binding the texture: generate a texture id, create the SurfaceTexture and Surface objects from it, and hand the Surface to MediaPlayer.

private void initMediaPlayer() {
    // Create the MediaPlayer object
    mediaPlayer = new MediaPlayer();
    try {
        // Set the video resource
        mediaPlayer.setDataSource(context, Uri.fromFile(video));
    } catch (IOException e) {
        e.printStackTrace();
    }
    // Set the audio stream type
    mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
    // Loop playback
    mediaPlayer.setLooping(true);
    // Listen for video size changes
    mediaPlayer.setOnVideoSizeChangedListener(this);

    // Create the surface texture
    int[] textures = new int[1];
    GLES30.glGenTextures(1, textures, 0);
    textureId = textures[0];
    // Bind the texture id
    GLES30.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId);
    surfaceTexture = new SurfaceTexture(textureId);
    surfaceTexture.setOnFrameAvailableListener(this);
    Surface surface = new Surface(surfaceTexture);

    // Hand the surface to MediaPlayer
    mediaPlayer.setSurface(surface);
    surface.release();
    try {
        mediaPlayer.prepare();
    } catch (IOException t) {
        Log.e("Prepare ERROR", "onSurfaceCreated: ");
    }
}

5. Video playback size

Here Matrix.orthoM initializes an orthographic projection matrix. By comparing the video's aspect ratio with the view's, you could configure the projection so the picture is not stretched; that is not done here. Parameters 3 to 7 of Matrix.orthoM control the proportions of the video on screen, and -1, 1, -1, 1 below means fill the entire view.

private final float[] projectionMatrix = new float[16];

@Override
public void onSurfaceChanged(GL10 gl, int width, int height) {
    viewWidth = width;
    viewHeight = height;
    updateProjection();
    GLES30.glViewport(0, 0, viewWidth, viewHeight);
}

@Override
public void onVideoSizeChanged(MediaPlayer mp, int width, int height) {
    videoWidth = width;
    videoHeight = height;
}

private void updateProjection() {
    float screenRatio = (float) viewWidth / viewHeight;
    float videoRatio = (float) videoWidth / videoHeight;
    // Orthographic projection matrix: fill the whole view
    Matrix.orthoM(projectionMatrix, 0,
            -1, 1, -1, 1,
            -1, 1);
}
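As mentioned, parameters 3 to 7 of Matrix.orthoM can be derived from screenRatio and videoRatio to keep the video from being stretched. One possible sketch of that computation (the OrthoBounds helper and its letterbox/pillarbox policy are my own illustration, not part of the project's code):

```java
// Hypothetical helper: computes the left/right/bottom/top arguments for
// Matrix.orthoM so the video keeps its aspect ratio instead of filling the view.
public class OrthoBounds {
    // Returns {left, right, bottom, top}
    public static float[] fit(int viewWidth, int viewHeight, int videoWidth, int videoHeight) {
        float screenRatio = (float) viewWidth / viewHeight;
        float videoRatio = (float) videoWidth / videoHeight;
        if (videoRatio > screenRatio) {
            // Video is wider than the view: widen the vertical extent (letterbox)
            float r = videoRatio / screenRatio;
            return new float[]{-1f, 1f, -r, r};
        } else {
            // Video is taller than the view: widen the horizontal extent (pillarbox)
            float r = screenRatio / videoRatio;
            return new float[]{-r, r, -1f, 1f};
        }
    }

    public static void main(String[] args) {
        // A 1920x1080 view showing a 1080x2264 portrait video gets pillarboxed
        float[] b = fit(1920, 1080, 1080, 2264);
        System.out.println(b[0] + ", " + b[1] + ", " + b[2] + ", " + b[3]);
    }
}
```

updateProjection could then pass these four values as parameters 3 to 6 of Matrix.orthoM.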

6. Draw video textures

To keep VideoRender from getting cluttered, a VideoDrawer class handles the drawing-related resource preparation and the draw pass. The constructor loads the shader code, initializes the program, and creates the vertex and texture-coordinate buffers. The more boilerplate steps are wrapped in BufferUtils and LoadUtils; see the source code for details.

public class VideoDrawer {

    private FloatBuffer vertexBuffer;
    private FloatBuffer textureVertexBuffer;

    private final float[] vertexData = {
            1f, -1f, 0f,
            -1f, -1f, 0f,
            1f, 1f, 0f,
            -1f, 1f, 0f
    };

    private final float[] textureVertexData = {
            1f, 0f,
            0f, 0f,
            1f, 1f,
            0f, 1f
    };

    private final int aPositionLocation = 0;
    private final int aTextureCoordLocation = 1;
    private final int uMatrixLocation = 2;
    private final int uSTMMatrixLocation = 3;
    private final int uSTextureLocation = 4;

    private int programId;

    public VideoDrawer(Context context) {
        vertexBuffer = BufferUtils.getFloatBuffer(vertexData);
        textureVertexBuffer = BufferUtils.getFloatBuffer(textureVertexData);
        programId = LoadUtils.initProgram(context, "video.vsh", "red_video.fsh");
    }

    public void draw(int textureId, float[] projectionMatrix, float[] sTMatrix) {
        GLES30.glClear(GLES30.GL_DEPTH_BUFFER_BIT | GLES30.GL_COLOR_BUFFER_BIT);

        GLES30.glUseProgram(programId);
        GLES30.glUniformMatrix4fv(uMatrixLocation, 1, false, projectionMatrix, 0);
        GLES30.glUniformMatrix4fv(uSTMMatrixLocation, 1, false, sTMatrix, 0);

        GLES30.glEnableVertexAttribArray(aPositionLocation);
        GLES30.glVertexAttribPointer(aPositionLocation, 3, GLES30.GL_FLOAT, false, 12, vertexBuffer);

        GLES30.glEnableVertexAttribArray(aTextureCoordLocation);
        GLES30.glVertexAttribPointer(aTextureCoordLocation, 2, GLES30.GL_FLOAT, false, 8, textureVertexBuffer);
        GLES30.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES30.GL_TEXTURE_MAG_FILTER, GLES30.GL_LINEAR);

        GLES30.glActiveTexture(GLES30.GL_TEXTURE0);
        GLES30.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId);
        GLES30.glUniform1i(uSTextureLocation, 0);
        GLES30.glDrawArrays(GLES30.GL_TRIANGLE_STRIP, 0, 4);
    }
}

In OpenGLES 2.0 we had to query the handle of each variable; in OpenGLES 3.0 we can specify it with layout (location = X), which is more convenient to use. Below is the vertex shader video.vsh.

#version 300 es
layout (location = 0) in vec4 vPosition;      // Vertex position
layout (location = 1) in vec4 vTexCoord;      // Texture coordinates
layout (location = 2) uniform mat4 uMatrix;   // Vertex transformation matrix
layout (location = 3) uniform mat4 uSTMatrix; // Texture transformation matrix

out vec2 texCoo2Frag;

void main() {
    texCoo2Frag = (uSTMatrix * vTexCoord).xy;
    gl_Position = uMatrix * vPosition;
}

Below is the fragment shader video.fsh. Displaying a samplerExternalOES texture requires #extension GL_OES_EGL_image_external_essl3; outColor is set by sampling the texture at the interpolated texture coordinates.

#version 300 es
#extension GL_OES_EGL_image_external_essl3 : require
precision highp float;

in vec2 texCoo2Frag;
out vec4 outColor;

layout (location = 4) uniform samplerExternalOES sTexture;

void main() {
    outColor = texture(sTexture, texCoo2Frag);
}

7. Drawing and texture updates

As the earlier log screenshot shows, onDrawFrame and onFrameAvailable do not run on the same thread. When onFrameAvailable fires, a new stream frame is available and the texture can be updated. Since two threads modify the same shared variable, there is a thread-safety concern, which is why synchronized is used. With this in place, the video plays normally.

@Override
public void onDrawFrame(GL10 gl) {
    synchronized (this) {
        if (updateSurface) {
            surfaceTexture.updateTexImage();
            surfaceTexture.getTransformMatrix(sTMatrix);
            updateSurface = false;
        }
    }
    videoDrawer.draw(textureId, projectionMatrix, sTMatrix);
}

@Override
public void onFrameAvailable(SurfaceTexture surfaceTexture) {
    updateSurface = true;
}

Three. Fragment shader color effects

Although the above looks like a lot, the process is fairly fixed; the real focus is the use of the fragment shader. Everything we see is color, and the fragment shader's job is to manipulate the color at each position.

1. Red treatment

vec3 color = texture(sTexture, texCoo2Frag).rgb gives the three-component RGB color vector. When the average of a pixel's RGB is greater than the threshold, RGB is set to 1 (white); otherwise G and B are set to 0 (red). This produces the red-and-white effect, and different threshold values control how much red comes through. The threshold could also be passed in as a parameter.

#version 300 es
#extension GL_OES_EGL_image_external_essl3 : require
precision highp float;

in vec2 texCoo2Frag;
out vec4 outColor;

layout (location = 4) uniform samplerExternalOES sTexture;

void main() {
    vec3 color = texture(sTexture, texCoo2Frag).rgb;
    float threshold = 0.7; // Threshold
    float mean = (color.r + color.g + color.b) / 3.0;
    color.g = color.b = mean >= threshold ? 1.0 : 0.0;
    outColor = vec4(1,color.gb,1);
}
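The per-pixel threshold logic above is easy to check on the CPU. A pure-Java sketch of the same rule, operating on normalized RGB values in [0, 1] (the RedFilter class name is my own, not from the project):

```java
// CPU-side sketch of the red/white threshold logic from red_video.fsh
public class RedFilter {
    static final float THRESHOLD = 0.7f;

    // Returns the output {r, g, b} for one input pixel
    public static float[] apply(float r, float g, float b) {
        float mean = (r + g + b) / 3.0f;
        float gb = mean >= THRESHOLD ? 1.0f : 0.0f;
        return new float[]{1.0f, gb, gb}; // bright areas become white, the rest red
    }

    public static void main(String[] args) {
        System.out.println(java.util.Arrays.toString(apply(0.9f, 0.8f, 0.7f))); // bright pixel: white
        System.out.println(java.util.Arrays.toString(apply(0.2f, 0.3f, 0.1f))); // dark pixel: red
    }
}
```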

The blue effect can also be achieved by fixing the blue channel. This should give you a simple idea of what shaders can do.

#version 300 es
#extension GL_OES_EGL_image_external_essl3 : require
precision highp float;

in vec2 texCoo2Frag;
out vec4 outColor;

layout (location = 4) uniform samplerExternalOES sTexture;

void main() {
    vec3 color = texture(sTexture, texCoo2Frag).rgb;
    float threshold = 0.7; // Threshold
    float mean = (color.r + color.g + color.b) / 3.0;
    color.r = color.g = mean >= threshold ? 1.0 : 0.0;
    outColor = vec4(color.rg, 1.0, 1.0);
}

2. Negative effect

Subtracting each RGB channel from 1 gives the negative effect. Renderer: view/VideoDrawer.java; vertex shader: video.vsh; fragment shader: negative_video.fsh

#version 300 es
#extension GL_OES_EGL_image_external_essl3 : require
precision highp float;

in vec2 texCoo2Frag;
out vec4 outColor;

layout (location = 4) uniform samplerExternalOES sTexture;

void main() {
    vec4 color= texture(sTexture, texCoo2Frag);
    float r = 1.0 - color.r;
    float g = 1.0 - color.g;
    float b = 1.0 - color.b;
    outColor = vec4(r, g, b, 1.0);
}

3. Gray effect

Setting all three RGB channels to the G value produces a grayscale effect. Renderer: view/VideoDrawer.java; vertex shader: video.vsh; fragment shader: grey_video.fsh

#version 300 es
#extension GL_OES_EGL_image_external_essl3 : require
precision highp float;

in vec2 texCoo2Frag;
out vec4 outColor;

layout (location = 4) uniform samplerExternalOES sTexture;

void main() {
    vec4 color = texture(sTexture, texCoo2Frag);
    outColor = vec4(color.g, color.g, color.g, 1.0);
}

4. Nostalgic effect

Renderer: view/VideoDrawer.java; vertex shader: video.vsh; fragment shader: nostalgic_video.fsh

#version 300 es
#extension GL_OES_EGL_image_external_essl3 : require
precision highp float;

in vec2 texCoo2Frag;
out vec4 outColor;

layout (location = 4) uniform samplerExternalOES sTexture;

void main() {
    vec4 color = texture(sTexture, texCoo2Frag);
    float r = color.r;
    float g = color.g;
    float b = color.b;

    // Compute each output from the original channels; overwriting r
    // before computing g and b would skew the result
    float nr = 0.393 * r + 0.769 * g + 0.189 * b;
    float ng = 0.349 * r + 0.686 * g + 0.168 * b;
    float nb = 0.272 * r + 0.534 * g + 0.131 * b;
    outColor = vec4(nr, ng, nb, 1.0);
}
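This sepia matrix is worth checking numerically: note that each output channel must be computed from the original r/g/b values, and that the weighted sums can exceed 1.0, so they are clamped here (the GPU clamps the shader output automatically; the SepiaFilter class name is my own illustration):

```java
// Java sketch of the sepia ("nostalgic") matrix from nostalgic_video.fsh
public class SepiaFilter {
    public static float[] apply(float r, float g, float b) {
        // Weighted sums over the ORIGINAL channels, clamped to 1.0
        float nr = Math.min(1.0f, 0.393f * r + 0.769f * g + 0.189f * b);
        float ng = Math.min(1.0f, 0.349f * r + 0.686f * g + 0.168f * b);
        float nb = Math.min(1.0f, 0.272f * r + 0.534f * g + 0.131f * b);
        return new float[]{nr, ng, nb};
    }

    public static void main(String[] args) {
        float[] p = apply(0.5f, 0.5f, 0.5f); // mid gray turns warm: r > g > b
        System.out.printf("%.4f %.4f %.4f%n", p[0], p[1], p[2]);
    }
}
```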

5. Time flies

Renderer: view/VideoDrawer.java; vertex shader: video.vsh; fragment shader: year_video.fsh

#version 300 es
#extension GL_OES_EGL_image_external_essl3 : require
precision highp float;

in vec2 texCoo2Frag;
out vec4 outColor;

layout (location = 4) uniform samplerExternalOES sTexture;

void main(){
    float arg = 1.5;

    vec4 color= texture(sTexture, texCoo2Frag);
    float r = color.r;
    float g = color.g;
    float b = color.b;
    b = sqrt(b)*arg;
  
    if (b>1.0) b = 1.0;

    outColor = vec4(r, g, b, 1.0);
}

In plain terms, this is all playing with color. These color algorithms work on any platform, framework, or system: as long as you can get an image's RGB channel values, you can apply them for special effects. So for color effects the key is everyday accumulation and an understanding of color; look around and experiment more.


Four. Fragment shader position effects

In addition to playing with colors, we can apply effects to the incoming video texture through its texture coordinates, such as mirror, split-screen, and mosaic effects.

1. Mirror effect

Renderer: view/VideoDrawer.java; vertex shader: video.vsh; fragment shader: mirror_video.fsh

Let's start with a simple effect based on the texture-coordinate position. The upper-left corner of the texture is (0,0), and coordinates run up to 1. The logic below is: when x is greater than 0.5, x takes the value 1 - x, so pixels on the right mirror the pixels on the left, producing the mirror effect shown below.

#version 300 es
#extension GL_OES_EGL_image_external_essl3 : require
precision highp float;

in vec2 texCoo2Frag;
out vec4 outColor;

layout (location = 4) uniform samplerExternalOES sTexture;

void main() {
    vec2 pos = texCoo2Frag;
    if (pos.x > 0.5) {
        pos.x = 1.0 - pos.x;
    }
    outColor = texture(sTexture, pos);
}
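The remapping is small enough to state in one line of Java (MirrorMap is an illustrative name of mine): x values past the midpoint are reflected back onto the left half, so both halves sample the same columns.

```java
// The mirror mapping from mirror_video.fsh, per x coordinate
public class MirrorMap {
    public static float mirrorX(float x) {
        return x > 0.5f ? 1.0f - x : x;
    }

    public static void main(String[] args) {
        System.out.println(mirrorX(0.8f)); // samples the same column as x = 0.2
        System.out.println(mirrorX(0.3f)); // left half is untouched
    }
}
```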

2. Split-screen effect

Using scaling and coordinate offsets, four copies of the video can be tiled on one screen; of course you could choose two, six, eight... Renderer: view/VideoDrawer.java; vertex shader: video.vsh; fragment shader: fenJing.fsh

#version 300 es
#extension GL_OES_EGL_image_external_essl3 : require
precision highp float;

in vec2 texCoo2Frag;
out vec4 outColor;

layout (location = 4) uniform samplerExternalOES sTexture;

void main(){
    vec2 pos = texCoo2Frag.xy;
    if (pos.x <= 0.5 && pos.y <= 0.5) {       // Top left
        pos.x = pos.x * 2.0;
        pos.y = pos.y * 2.0;
    } else if (pos.x > 0.5 && pos.y < 0.5) {  // Top right
        pos.x = (pos.x - 0.5) * 2.0;
        pos.y = pos.y * 2.0;
    } else if (pos.y > 0.5 && pos.x < 0.5) {  // Bottom left
        pos.y = (pos.y - 0.5) * 2.0;
        pos.x = pos.x * 2.0;
    } else if (pos.y > 0.5 && pos.x > 0.5) {  // Bottom right
        pos.y = (pos.y - 0.5) * 2.0;
        pos.x = (pos.x - 0.5) * 2.0;
    }
    outColor = texture(sTexture, pos);
}
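The four branches collapse to one per-axis rule: each half of [0, 1] is stretched back over the whole range. A Java sketch of that remapping (SplitMap is my own name for it):

```java
// Sketch of the 2x2 split-screen remap from fenJing.fsh: each quadrant of
// the output maps back onto the full [0,1]x[0,1] texture.
public class SplitMap {
    // Returns the remapped {x, y}
    public static float[] remap(float x, float y) {
        float nx = x <= 0.5f ? x * 2.0f : (x - 0.5f) * 2.0f;
        float ny = y <= 0.5f ? y * 2.0f : (y - 0.5f) * 2.0f;
        return new float[]{nx, ny};
    }

    public static void main(String[] args) {
        float[] p = remap(0.75f, 0.25f); // center of the top-right quadrant
        System.out.println(p[0] + ", " + p[1]);
    }
}
```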

You can also give each sub-screen a different effect, for example applying the earlier color effects per quadrant. If you're bored, you can subdivide the split screens even further... Renderer: view/VideoDrawer.java; vertex shader: video.vsh; fragment shader: splite.fsh

#version 300 es
#extension GL_OES_EGL_image_external_essl3 : require
precision highp float;

in vec2 texCoo2Frag;
out vec4 outColor;

layout (location = 4) uniform samplerExternalOES sTexture;

void main(){
    vec2 pos = texCoo2Frag.xy;
    vec4 result;

    if (pos.x <= 0.5 && pos.y <= 0.5) {       // Top left: grayscale
        pos.x = pos.x * 2.0;
        pos.y = pos.y * 2.0;
        vec4 color = texture(sTexture, pos);
        result = vec4(color.g, color.g, color.g, 1.0);
    } else if (pos.x > 0.5 && pos.y < 0.5) {  // Top right: "time flies"
        pos.x = (pos.x - 0.5) * 2.0;
        pos.y = pos.y * 2.0;
        vec4 color = texture(sTexture, pos);
        float arg = 1.5;
        float r = color.r;
        float g = color.g;
        float b = color.b;
        b = sqrt(b) * arg;
        if (b > 1.0) b = 1.0;
        result = vec4(r, g, b, 1.0);
    } else if (pos.y > 0.5 && pos.x < 0.5) {  // Bottom left: nostalgic
        pos.y = (pos.y - 0.5) * 2.0;
        pos.x = pos.x * 2.0;
        vec4 color = texture(sTexture, pos);
        float r = color.r;
        float g = color.g;
        float b = color.b;
        float nr = 0.393 * r + 0.769 * g + 0.189 * b;
        float ng = 0.349 * r + 0.686 * g + 0.168 * b;
        float nb = 0.272 * r + 0.534 * g + 0.131 * b;
        result = vec4(nr, ng, nb, 1.0);
    } else if (pos.y > 0.5 && pos.x > 0.5) {  // Bottom right: nostalgic with r/b swapped
        pos.y = (pos.y - 0.5) * 2.0;
        pos.x = (pos.x - 0.5) * 2.0;
        vec4 color = texture(sTexture, pos);
        float r = color.r;
        float g = color.g;
        float b = color.b;
        float nb = 0.393 * r + 0.769 * g + 0.189 * b;
        float ng = 0.349 * r + 0.686 * g + 0.168 * b;
        float nr = 0.272 * r + 0.534 * g + 0.131 * b;
        result = vec4(nr, ng, nb, 1.0);
    }
    outColor = result;
}

3. Mosaic effect

Renderer: view/VideoDrawer.java; vertex shader: video.vsh; fragment shader: mask_rect.fsh

Start with the simple square mosaic. Here 2264.0/1080.0 is the aspect ratio of the picture, hard-coded for now; it could be passed in from outside. cellX and cellY control the width and height of the small rectangles, and count controls how many there are: the bigger the count, the denser the mosaic.

#version 300 es
#extension GL_OES_EGL_image_external_essl3 : require
precision highp float;

in vec2 texCoo2Frag;
out vec4 outColor;

layout (location = 4) uniform samplerExternalOES sTexture;

void main(){
    float rate= 2264.0 / 1080.0;
    float cellX= 1.0;
    float cellY= 1.0;
    float count = 80.0;

    vec2 pos = texCoo2Frag;
    pos.x = pos.x*count;
    pos.y = pos.y*count/rate;

    pos = vec2(floor(pos.x/cellX)*cellX/count, floor(pos.y/cellY)*cellY/(count/rate))+ 0.5/count*vec2(cellX, cellY);
    outColor = texture(sTexture, pos);
}
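Ignoring the aspect-ratio correction and taking cellX = cellY = 1, the sampling coordinate reduces to snapping each axis to its cell center: every pixel inside a grid cell samples the same point. A pure-Java sketch of the x axis (the MosaicMap name is my own illustration):

```java
// CPU-side version of the square-mosaic sampling coordinate from mask_rect.fsh
public class MosaicMap {
    // Returns the x coordinate of the cell center that input x snaps to
    public static float cellCenterX(float x, float count) {
        return (float) Math.floor(x * count) / count + 0.5f / count;
    }

    public static void main(String[] args) {
        // With count = 80, cells are 1/80 = 0.0125 wide
        System.out.println(cellCenterX(0.005f, 80f)); // inside the first cell
        System.out.println(cellCenterX(0.02f, 80f));  // inside the second cell
    }
}
```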

In addition to squares, round mosaics are also possible. Renderer: view/VideoDrawer.java; vertex shader: video.vsh; fragment shader: ball_mask.fsh

#version 300 es
#extension GL_OES_EGL_image_external_essl3 : require
precision highp float;

in vec2 texCoo2Frag;
out vec4 outColor;

layout (location = 4) uniform samplerExternalOES sTexture;

void main(){
    float rate= 2264.0 / 1080.0;
    float cellX= 3.0;
    float cellY= 3.0;
    float rowCount=300.0;

    vec2 sizeFmt=vec2(rowCount, rowCount/rate);
    vec2 sizeMsk=vec2(cellX, cellY);
    vec2 posFmt = vec2(texCoo2Frag.x*sizeFmt.x, texCoo2Frag.y*sizeFmt.y);
    vec2 posMsk = vec2(floor(posFmt.x/sizeMsk.x)*sizeMsk.x, floor(posFmt.y/sizeMsk.y)*sizeMsk.y)+ 0.5*sizeMsk;
    float del = length(posMsk - posFmt);
    vec2 UVMosaic = vec2(posMsk.x/sizeFmt.x, posMsk.y/sizeFmt.y);

    vec4 result;
    if (del < cellX / 2.0)
        result = texture(sTexture, UVMosaic);
    else
        result = vec4(1.0, 1.0, 1.0, 0.0);
    outColor = result;
}

A hexagonal mosaic is also possible. Renderer: view/VideoDrawer.java; vertex shader: video.vsh; fragment shader: video_mask.fsh

#version 300 es
#extension GL_OES_EGL_image_external_essl3 : require
precision highp float;

in vec2 texCoo2Frag;
out vec4 outColor;

layout (location = 4) uniform samplerExternalOES sTexture;

// Hexagon size (side length)
const float mosaicSize = 0.01;

void main (void)
{
  float rate = 2264.0 / 1080.0;
  float length = mosaicSize;
  float TR = 0.866025;

  // Texture coordinate values
  float x = texCoo2Frag.x;
  float y = texCoo2Frag.y;

  // Convert to grid coordinates
  int wx = int(x / 1.5 / length);
  int wy = int(y / TR / length);
  vec2 v1, v2, vn;

  // Check whether the grid cell is in an odd or even row and column;
  // the parity decides which two corners are the candidate centers
  if (wx / 2 * 2 == wx) {
    if (wy / 2 * 2 == wy) {
      // Corners (0, 0) and (1, 1)
      v1 = vec2(length * 1.5 * float(wx), length * TR * float(wy));
      v2 = vec2(length * 1.5 * float(wx + 1), length * TR * float(wy + 1));
    } else {
      // Corners (0, 1) and (1, 0)
      v1 = vec2(length * 1.5 * float(wx), length * TR * float(wy + 1));
      v2 = vec2(length * 1.5 * float(wx + 1), length * TR * float(wy));
    }
  } else {
    if (wy / 2 * 2 == wy) {
      // Corners (0, 1) and (1, 0)
      v1 = vec2(length * 1.5 * float(wx), length * TR * float(wy + 1));
      v2 = vec2(length * 1.5 * float(wx + 1), length * TR * float(wy));
    } else {
      // Corners (0, 0) and (1, 1)
      v1 = vec2(length * 1.5 * float(wx), length * TR * float(wy));
      v2 = vec2(length * 1.5 * float(wx + 1), length * TR * float(wy + 1));
    }
  }

  // Distance to each candidate
  float s1 = sqrt(pow(v1.x - x, 2.0) + pow(v1.y - y, 2.0));
  float s2 = sqrt(pow(v2.x - x, 2.0) + pow(v2.y - y, 2.0));

  // Sample at the nearer corner
  if (s1 < s2) {
    vn = v1;
  } else {
    vn = v2;
  }
  vec4 color = texture(sTexture, vn);
  outColor = color;
}
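The core of the hexagon shader, choosing two candidate grid corners by row/column parity and snapping to the nearer one, can be ported to Java to make it easier to step through (the HexMosaic class is my own illustration of the same logic):

```java
// Java port of the hexagon-mosaic center selection from video_mask.fsh
public class HexMosaic {
    static final float TR = 0.866025f; // sqrt(3) / 2

    // Returns the {x, y} grid corner this texture coordinate snaps to
    public static float[] center(float x, float y, float len) {
        int wx = (int) (x / 1.5f / len);
        int wy = (int) (y / TR / len);
        boolean evenX = wx % 2 == 0, evenY = wy % 2 == 0;
        float[] v1, v2;
        if (evenX == evenY) { // candidates at cell corners (0,0) and (1,1)
            v1 = new float[]{len * 1.5f * wx, len * TR * wy};
            v2 = new float[]{len * 1.5f * (wx + 1), len * TR * (wy + 1)};
        } else {              // candidates at cell corners (0,1) and (1,0)
            v1 = new float[]{len * 1.5f * wx, len * TR * (wy + 1)};
            v2 = new float[]{len * 1.5f * (wx + 1), len * TR * wy};
        }
        float d1 = (float) Math.hypot(v1[0] - x, v1[1] - y);
        float d2 = (float) Math.hypot(v2[0] - x, v2[1] - y);
        return d1 < d2 ? v1 : v2;
    }

    public static void main(String[] args) {
        float[] c = center(0.001f, 0.001f, 0.01f); // snaps to the origin corner
        System.out.println(c[0] + ", " + c[1]);
    }
}
```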

Five. Fragment shader dynamic effects

The previous effects hard-coded their variables; in fact many values can be passed in from outside to produce dynamic effects, such as the out-of-body, glitch, and jitter effects below. This also illustrates how to pass parameters into shaders from the outside.

1. Out-of-body effect

Renderer: view/VideoDrawerPlus.java; vertex shader: video.vsh; fragment shader: gost.fsh

The progress of the diffusion is controlled by the uProgress uniform; all that remains is to change it dynamically while drawing.

#version 300 es
#extension GL_OES_EGL_image_external_essl3 : require
precision highp float;

in vec2 texCoo2Frag;
out vec4 outColor;

layout (location = 4) uniform samplerExternalOES sTexture;

// Progress
layout (location = 5) uniform float uProgress;

void main (void) {
  // Cycle duration
  float duration = 0.7;
  // The maximum opacity of the second layer
  float maxAlpha = 0.4;
  // The maximum zoom ratio of the second layer
  float maxScale = 1.8;

  // Progress within the cycle
  float progress = mod(uProgress, duration) / duration; // 0 ~ 1
  // Current transparency
  float alpha = maxAlpha * (1.0 - progress);
  // The current zoom scale
  float scale = 1.0 + (maxScale - 1.0) * progress;

  // Get the corresponding x and y coordinates according to the zoom ratio
  float weakX = 0.5 + (texCoo2Frag.x - 0.5) / scale;
  float weakY = 0.5 + (texCoo2Frag.y - 0.5) / scale;
  // New layer texture coordinates
  vec2 weakTextureCoords = vec2(weakX, weakY);

  // The texture pixel value corresponding to the texture coordinates of the new layer
  vec4 weakMask = texture(sTexture, weakTextureCoords);

  vec4 mask = texture(sTexture, texCoo2Frag);

  // Blend the texture pixel values to get the actual color after blending
  outColor = mask * (1.0 - alpha) + weakMask * alpha;
}

To avoid confusion, a separate class com/toly1994/opengl_video/view/VideoDrawerPlus.java is used. All it needs to do is define uProgressLocation, update progress in draw, and set it with glUniform1f.

private final int uProgressLocation = 5;
private float progress = 0.0f;

public void draw(int textureId, float[] projectionMatrix, float[] sTMatrix) {
    progress += 0.02f;
    GLES30.glClear(GLES30.GL_DEPTH_BUFFER_BIT | GLES30.GL_COLOR_BUFFER_BIT);
    GLES30.glUseProgram(programId);
    GLES30.glUniform1f(uProgressLocation, progress);
    // ... the rest is the same as before
}
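The shaders turn this ever-growing progress value into a repeating 0 to 1 phase via mod(uProgress, duration) / duration. A small Java equivalent of that mapping (the Cycle class name is mine; Java's % matches GLSL's mod here because progress never goes negative):

```java
// How an ever-increasing progress value maps to a repeating 0..1 phase
public class Cycle {
    public static float phase(float uProgress, float duration) {
        return (uProgress % duration) / duration;
    }

    public static void main(String[] args) {
        System.out.println(phase(1.5f, 0.7f));  // 1.5 = 2 * 0.7 + 0.1, phase ~0.143
        System.out.println(phase(0.35f, 0.7f)); // halfway through a cycle
    }
}
```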

2. Glitch effect

Renderer: view/VideoDrawerPlus.java; vertex shader: video.vsh; fragment shader: video_ci.fsh

#version 300 es
#extension GL_OES_EGL_image_external_essl3 : require
precision highp float;

in vec2 texCoo2Frag;
out vec4 outColor;

layout (location = 4) uniform samplerExternalOES sTexture;

// Progress
layout (location = 5) uniform float uProgress;

const float PI = 3.14159265;
const float uD = 80.0;
const float uR = 0.5;

// Returns a pseudo-random number derived from n
float rand(float n) {
  // fract(x) returns the fractional part of x
  return fract(sin(n) * 43758.5453123);
}

void main (void) {
  // Maximum jitter
  float maxJitter = 0.2;
  float duration = 0.4;
  // Red color offset
  float colorROffset = 0.01;
  // Blue color offset
  float colorBOffset = 0.025;

  // Time of the current cycle
  float time = mod(uProgress, duration * 2.0);
  // Current amplitude 0.0 ~ 1.0
  float amplitude = max(sin(uProgress * (PI / duration)), 0.0);

  // The y value of the current coordinate is randomly offset from -1 to 1
  float jitter = rand(texCoo2Frag.y) * 2.0 - 1.0;
  // Determine whether the current coordinates need to be offset
  bool needOffset = abs(jitter) < maxJitter * amplitude;

  // Get the texture x value and determine how much it should be offset in the x direction based on whether it is greater than a certain threshold
  float textureX = texCoo2Frag.x + (needOffset ? jitter : (jitter * amplitude * 0.006));
  // Texture coordinates after tearing in x direction
  vec2 textureCoords = vec2(textureX, texCoo2Frag.y);

  // Color offset 3 groups of colors
  vec4 mask = texture(sTexture, textureCoords);
  vec4 maskR = texture(sTexture, textureCoords + vec2(colorROffset * amplitude, 0.0));
  vec4 maskB = texture(sTexture, textureCoords + vec2(colorBOffset * amplitude, 0.0));

  // Finally get the final color based on three different sets of texture coordinate values
  outColor = vec4(maskR.r, mask.g, maskB.b, mask.a);
}
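The rand one-liner above is the classic GLSL pseudo-random hash. A Java port (ShaderRand is my own name for it) shows the two properties the effect relies on: the result is deterministic per seed, so each scanline y keeps a stable jitter within a frame, and it always lands in [0, 1):

```java
// Java port of the shader hash: rand(n) = fract(sin(n) * 43758.5453123)
public class ShaderRand {
    public static float rand(float n) {
        double v = Math.sin(n) * 43758.5453123;
        return (float) (v - Math.floor(v)); // fract(): keep the fractional part
    }

    public static void main(String[] args) {
        float j = rand(0.37f) * 2.0f - 1.0f; // jitter in [-1, 1) as in the shader
        System.out.println(j);
    }
}
```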

3. Dispersion effect

Renderer: view/VideoDrawerPlus.java; vertex shader: video.vsh; fragment shader: video_offset.fsh

#version 300 es
#extension GL_OES_EGL_image_external_essl3 : require
precision highp float;

in vec2 texCoo2Frag;
out vec4 outColor;

layout (location = 4) uniform samplerExternalOES sTexture;

// Progress
layout (location = 5) uniform float uProgress;

void main (void) {
  // Cycle duration
  float duration = 0.7;
  // The maximum opacity of the second layer
  float maxAlpha = 0.4;
  // The maximum zoom ratio of the second layer
  float maxScale = 1.8;

  // Progress within the cycle
  float progress = mod(uProgress, duration) / duration; // 0 ~ 1
  // Current transparency
  float alpha = maxAlpha * (1.0 - progress);
  // The current zoom scale
  float scale = 1.0 + (maxScale - 1.0) * progress;

  // Get the corresponding x and y coordinates according to the zoom ratio
  float weakX = 0.5 + (texCoo2Frag.x - 0.5) / scale;
  float weakY = 0.5 + (texCoo2Frag.y - 0.5) / scale;
  // New layer texture coordinates
  vec2 weakTextureCoords = vec2(weakX, weakY);

  // The texture pixel value corresponding to the texture coordinates of the new layer
  vec4 weakMask = texture(sTexture, weakTextureCoords);

  vec4 mask = texture(sTexture, texCoo2Frag);

  // Blend the texture pixel values to get the actual color after blending
  outColor = mask * (1.0 - alpha) + weakMask * alpha;
}

4. Jitter effect

Renderer: view/VideoDrawerPlus.java; vertex shader: video_scale.vsh; fragment shader: video_offset.fsh

Jitter comes from continuously scaling the transformation in the vertex shader; the fragment shader can apply an effect at the same time. Here jitter is combined with the dispersion effect.

---->[video_scale.vsh]----
#version 300 es
layout (location = 0) in vec4 vPosition;// Vertex position
layout (location = 1) in vec4 vTexCoord;// Texture coordinates
layout (location = 2) uniform mat4 uMatrix;
layout (location = 3) uniform mat4 uSTMatrix;
// The current time
layout (location = 5) uniform float uProgress;

out vec2 texCoo2Frag;
const float PI = 3.1415926;
void main() {
    
    // Cycle duration
    float duration = 0.6;
    // The maximum extra scale
    float maxAmplitude = 0.3;

    // Similar to residuals, represents the time value in the current cycle
    float time = mod(uProgress, duration);
    // Get the current magnification value according to the position in the cycle
    float amplitude = 1.0 + maxAmplitude * abs(sin(time * (PI / duration)));
    // The current vertex is converted to the screen coordinate position
    gl_Position = uMatrix*vec4(vPosition.x * amplitude, vPosition.y * amplitude, vPosition.zw);
    texCoo2Frag = (uSTMatrix * vTexCoord).xy;
}
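The pulsing scale in video_scale.vsh is 1.0 + maxAmplitude * |sin(time * PI / duration)|, which swings between 1.0 and 1.0 + maxAmplitude once per cycle. A Java sketch of that curve (JitterScale is an illustrative name):

```java
// The scale factor driving the jitter effect in video_scale.vsh
public class JitterScale {
    public static float amplitude(float time, float duration, float maxAmplitude) {
        return 1.0f + maxAmplitude * (float) Math.abs(Math.sin(time * (Math.PI / duration)));
    }

    public static void main(String[] args) {
        System.out.println(amplitude(0.0f, 0.6f, 0.3f)); // at cycle start: 1.0
        System.out.println(amplitude(0.3f, 0.6f, 0.3f)); // mid-cycle peak, near 1.3
    }
}
```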

5. Distortion effect

Renderer: view/VideoDrawerPlus.java; vertex shader: video.vsh; fragment shader: video_rotate.fsh

#version 300 es
#extension GL_OES_EGL_image_external_essl3 : require
precision highp float;

in vec2 texCoo2Frag;
out vec4 outColor;

layout (location = 4) uniform samplerExternalOES sTexture;
layout (location = 5) uniform float uProgress;

const float PI = 3.14159265;
const float uD = 80.0;
const float uR = 1.0;

void main()
{
  float rate = 2264.0 / 1080.0;
  ivec2 ires = ivec2(128, 128);
  float res = float(ires.s);
  // Cycle duration
  float duration = 3.0;
  vec2 st = texCoo2Frag;
  float radius = res * uR;
  // Progress within the cycle
  float progress = mod(uProgress, duration) / duration; // 0 ~ 1
  vec2 xy = res * st;

  vec2 dxy = xy - vec2(res/2., res/2.);
  float r = length(dxy);

  // The twist angle falls off toward the edge of the circle
  float beta = atan(dxy.y, dxy.x) + radians(uD) * 2.0 * (-(r/radius)*(r/radius) + 1.0);

  vec2 xy1 = xy;
  if(r<=radius) {
    xy1 = res/2. + r*vec2(cos(beta), sin(beta))*progress;
  }
  st = xy1/res;

  vec3 irgb = texture(sTexture, st).rgb;

  outColor = vec4( irgb, 1.0 );
}

In general, special effects come down to manipulating texture positions, vertex positions, and fragment colors. Many of these shaders were collected from elsewhere, and some of the longer ones I don't fully understand yet; I'll study them more closely later. These shaders are a tribute to the pioneers' thinking about transforming the world and their devotion to visual presentation. I hope those who come after can design even more interesting shaders.

@Zhang Fengjie Tele, 2020.12.08. Reproduction without permission is not allowed. My official account: King of Programming. Contact me: email [email protected]; WeChat zdl1994328. ~ END ~