This article was first published on the WeChat official account Byteflow.
Camera basic filters
In this section, we will use GLSL (OpenGL Shading Language) to implement a variety of basic filters, each based on a different shader.
Some frequently used GLSL built-in functions
Built-in function | Description |
---|---|
float distance(genType p0, genType p1) | Returns the distance between p0 and p1 |
float length(genType x) | Returns the length of vector x |
genType floor(genType x) | Returns the largest integer value less than or equal to x |
genType ceil(genType x) | Returns the smallest integer value greater than or equal to x |
genType mod(genType x, float y) | Returns x - y * floor(x / y), i.e. the modulo (%) operation |
float dot(genType x, genType y) | Returns the dot product of vectors x and y |
vec3 cross(vec3 x, vec3 y) | Returns the cross product of vectors x and y |
genType normalize(genType x) | Returns a vector with the same direction as x and a length of 1 |
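To make the table concrete, here is a small self-contained fragment shader (an illustration only, not one of the filters below) that exercises distance, floor and mod. The uniforms u_center and u_radius are made up for this example.

#version 100
// Illustration only: an 8 x 8 checkerboard clipped to a circle
precision highp float;
varying vec2 v_texcoord;
uniform vec2 u_center;  // hypothetical circle center in texture coordinates
uniform float u_radius; // hypothetical circle radius
void main()
{
// distance(): how far the current fragment is from u_center
float d = distance(v_texcoord, u_center);
// floor() + mod(): build an 8 x 8 checkerboard pattern
float cell = mod(floor(v_texcoord.x * 8.0) + floor(v_texcoord.y * 8.0), 2.0);
// inside the circle show the checkerboard, outside show black
if (d < u_radius)
gl_FragColor = vec4(vec3(cell), 1.0);
else
gl_FragColor = vec4(0.0, 0.0, 0.0, 1.0);
}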
Dynamic grid
The dynamic grid filter divides the texture into multiple grid cells and then dynamically changes the width of the grid lines based on an offset. mod and floor are GLSL built-in functions that perform modulo and round-down operations, respectively. Note that texture coordinates should be converted to image coordinates before the calculation, so that the grid cells are not stretched.
// Dynamic mesh fragment shader
#version 100
precision highp float;
varying vec2 v_texcoord;
uniform lowp sampler2D s_textureY;
uniform lowp sampler2D s_textureU;
uniform lowp sampler2D s_textureV;
uniform float u_offset; // the offset
uniform vec2 texSize; // texture size
vec4 YuvToRgb(vec2 uv) {
float y, u, v, r, g, b;
y = texture2D(s_textureY, uv).r;
u = texture2D(s_textureU, uv).r;
v = texture2D(s_textureV, uv).r;
u = u - 0.5;
v = v - 0.5;
r = y + 1.403 * v;
g = y - 0.344 * u - 0.714 * v;
b = y + 1.770 * u;
return vec4(r, g, b, 1.0);
}
void main()
{
vec2 imgTexCoord = v_texcoord * texSize;// Convert texture coordinates to image coordinates
float sideLength = texSize.y / 6.0;// The edge length of the grid
float maxOffset = 0.15 * sideLength;// Set the maximum grid line width
float x = mod(imgTexCoord.x, floor(sideLength));
float y = mod(imgTexCoord.y, floor(sideLength));
float offset = u_offset * maxOffset;
if(offset <= x
&& x <= sideLength - offset
&& offset <= y
&& y <= sideLength - offset)
{
gl_FragColor = YuvToRgb(v_texcoord);
}
else
{
gl_FragColor = vec4(1.0, 1.0, 1.0, 1.0);
}
}
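As a side note, the same grid test can also be written without an if/else by using the step() built-in. The following is a sketch of an alternative main(), assuming the same uniforms and the YuvToRgb helper shown above; it is not part of the original shader.

// Branch-free variant of the grid test (same uniforms and YuvToRgb as above)
void main()
{
vec2 imgTexCoord = v_texcoord * texSize;
float sideLength = texSize.y / 6.0;
float maxOffset = 0.15 * sideLength;
float x = mod(imgTexCoord.x, floor(sideLength));
float y = mod(imgTexCoord.y, floor(sideLength));
float offset = u_offset * maxOffset;
// step(edge, v) returns 1.0 when v >= edge and 0.0 otherwise,
// so "inside" is 1.0 inside a cell and 0.0 on a grid line
float inside = step(offset, x) * step(x, sideLength - offset)
* step(offset, y) * step(y, sideLength - offset);
// mix() blends between the line color (white) and the sampled video
gl_FragColor = mix(vec4(1.0), YuvToRgb(v_texcoord), inside);
}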
Split screen
The split-screen filter downsamples (shrinks) the whole texture into several specified regions, so that the complete image is displayed multiple times across those regions.
// Split-screen fragment shader
#version 100
precision highp float;
varying vec2 v_texcoord;
uniform lowp sampler2D s_textureY;
uniform lowp sampler2D s_textureU;
uniform lowp sampler2D s_textureV;
vec4 YuvToRgb(vec2 uv) {
float y, u, v, r, g, b;
y = texture2D(s_textureY, uv).r;
u = texture2D(s_textureU, uv).r;
v = texture2D(s_textureV, uv).r;
u = u - 0.5;
v = v - 0.5;
r = y + 1.403 * v;
g = y - 0.344 * u - 0.714 * v;
b = y + 1.770 * u;
return vec4(r, g, b, 1.0);
}
void main()
{
vec2 newTexCoord = v_texcoord;
if(newTexCoord.x < 0.5)
{
newTexCoord.x = newTexCoord.x * 2.0;
}
else
{
newTexCoord.x = (newTexCoord.x - 0.5) * 2.0;
}
if(newTexCoord.y < 0.5)
{
newTexCoord.y = newTexCoord.y * 2.0;
}
else
{
newTexCoord.y = (newTexCoord.y - 0.5) * 2.0;
}
gl_FragColor = YuvToRgb(newTexCoord);
}
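The 2 x 2 layout above can be generalized. Below is a minimal sketch (not in the original project) of a main() that produces an N x N split screen; u_rows is a hypothetical extra uniform, and the other declarations and YuvToRgb are assumed to be the same as above.

// Hypothetical N x N split screen (u_rows is not part of the original shader)
uniform float u_rows; // e.g. 3.0 for a 3 x 3 layout
void main()
{
// Scaling the coordinate and keeping only the fractional part wraps it
// back into [0, 1), so the full image repeats u_rows times per axis.
vec2 newTexCoord = fract(v_texcoord * u_rows);
gl_FragColor = YuvToRgb(newTexCoord);
}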
Scaling circle
The scaling-circle effect relies mainly on an offset to dynamically change the circle's radius: the texture is sampled and displayed inside the radius, and a fixed color (such as white) is returned outside it. distance is another GLSL built-in function, used to compute the distance between two points. Again, texture coordinates should be converted to image coordinates before the calculation; otherwise (when the image width and height differ) an ellipse is drawn instead of a circle. Think about why that is.
// Scaling circle fragment shader
#version 100
precision highp float;
varying vec2 v_texcoord;
uniform lowp sampler2D s_textureY;
uniform lowp sampler2D s_textureU;
uniform lowp sampler2D s_textureV;
uniform float u_offset;
uniform vec2 texSize;
vec4 YuvToRgb(vec2 uv) {
float y, u, v, r, g, b;
y = texture2D(s_textureY, uv).r;
u = texture2D(s_textureU, uv).r;
v = texture2D(s_textureV, uv).r;
u = u - 0.5;
v = v - 0.5;
r = y + 1.403 * v;
g = y - 0.344 * u - 0.714 * v;
b = y + 1.770 * u;
return vec4(r, g, b, 1.0);
}
void main()
{
vec2 imgTex = v_texcoord * texSize;// Convert texture coordinates to image coordinates
float r = (u_offset + 0.208 ) * texSize.x;
if(distance(imgTex, vec2(texSize.x / 2.0, texSize.y / 2.0)) < r)
{
gl_FragColor = YuvToRgb(v_texcoord);
}
else
{
gl_FragColor = vec4(1.0, 1.0, 1.0, 1.0);
}
}
The reason texture coordinates must be converted to image coordinates before the calculation is that both the horizontal and vertical texture coordinates range over [0, 1]. Numerically the texture has the same length in both directions, but when OpenGL samples it the image's width-to-height ratio is usually not 1, so the same coordinate value corresponds to different physical distances along each axis. As a result, the shape that was intended to be a circle is actually drawn as an ellipse.
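An equivalent way to handle this (a sketch, not from the original code) is to stay in normalized coordinates but scale the x axis by the aspect ratio, so that equal coordinate distances correspond to equal distances on the image. The uniforms and YuvToRgb are assumed to be the same as in the shader above.

// Aspect-ratio-corrected variant of the circle test (same uniforms as above)
void main()
{
float aspect = texSize.x / texSize.y;
// Stretch the x axis so one unit covers the same physical distance
// in both directions (distances are now measured in "texture heights").
vec2 uv = vec2(v_texcoord.x * aspect, v_texcoord.y);
vec2 center = vec2(0.5 * aspect, 0.5);
float r = (u_offset + 0.208) * aspect; // same radius as above, in height units
if (distance(uv, center) < r)
gl_FragColor = YuvToRgb(v_texcoord);
else
gl_FragColor = vec4(1.0, 1.0, 1.0, 1.0);
}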
Implementation code path: OpenGLCamera2