I. Introduction to the camera
In the article Android OpenGL basics (three, draw Bitmap texture), we briefly introduced how to draw a picture onto a quadrilateral. This article describes how to use GLSurfaceView to preview the camera. Unlike a single image, the camera is a texture whose content constantly changes. First, a brief introduction to several common camera APIs:
1.1 Declaring Camera Permissions
If your app needs to use the camera, declare the permission in AndroidManifest.xml:
<uses-permission android:name="android.permission.CAMERA" />
1.2 Checking camera Permissions
There are two types of Android permissions:
- Install-time permissions: for example, normal permissions and signature permissions are granted automatically when the application is installed.
- Runtime permissions: on Android 6.0 (API level 23) or later, you must request these permissions at runtime.
Check whether you have the camera permission as follows:
// Method 1: Call the API provided by the Activity in the Activity
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) {
this.checkSelfPermission(Manifest.permission.CAMERA)
}
// Method 2: The API provided by AndroidX
ContextCompat.checkSelfPermission(this,
Manifest.permission.CAMERA) == PackageManager.PERMISSION_GRANTED
1.3 Requesting Camera Permission
Request camera permission as follows:
// Method 1: call the API provided by the Activity in the Activity
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) {
val permissions = arrayOf(Manifest.permission.CAMERA)
this.requestPermissions(permissions, PERMISSION_REQUEST_CODE)
}
// Method 2: The API provided by AndroidX
ActivityCompat.requestPermissions(this, arrayOf(Manifest.permission.CAMERA), 1)
1.4 Common Methods
Here are some common methods of Camera:
public class Camera1Utils {
    private Camera camera;

    /** Open the camera */
    public void openCamera() {
        // Open the camera
        camera = Camera.open();
        Camera.Parameters parameters = camera.getParameters();
        // Continuous auto focus
        parameters.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE);
        camera.setParameters(parameters);
        // Start the camera preview
        camera.startPreview();
    }

    /** Stop the preview and release the camera */
    public void stopCamera() {
        if (camera != null) {
            camera.stopPreview();
            camera.release();
        }
    }

    /** Use a SurfaceHolder to receive the camera preview data */
    public void setPreviewDisplay(SurfaceHolder surfaceHolder) {
        try {
            // Pass the camera preview to the SurfaceHolder
            camera.setPreviewDisplay(surfaceHolder);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    /** Use a SurfaceTexture to receive the camera preview data */
    public void setPreviewTexture(SurfaceTexture surfaceTexture) {
        try {
            // Pass the camera preview data to the SurfaceTexture
            camera.setPreviewTexture(surfaceTexture);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
II. Camera preview
To see the camera preview on screen, you need to pass the camera data to a View for display after opening the camera. The most common way is to use a SurfaceView to display the camera output in real time.
2.1 SurfaceView
After the SurfaceView is created successfully, the camera data can be passed to the SurfaceView’s SurfaceHolder to display the camera image in the SurfaceView.
class CameraPreview(
    context: Context,
    private val mCamera: Camera
) : SurfaceView(context), SurfaceHolder.Callback {

    override fun surfaceCreated(holder: SurfaceHolder) {
        // After the Surface is created, pass the camera data to the SurfaceView's SurfaceHolder
        mCamera.apply {
            try {
                setPreviewDisplay(holder)
                startPreview()
            } catch (e: IOException) {
                Log.d(TAG, "Error setting camera preview: ${e.message}")
            }
        }
    }
}
2.2 GLSurfaceView
The GLSurfaceView class provides helper classes that manage the EGL context, communicate between threads, and interact with the activity lifecycle. A GLSurfaceView cannot be associated directly with the camera data; a SurfaceTexture is required in between. Once the camera is open, you pass the camera data to the SurfaceTexture, and the camera texture is then drawn into the GLSurfaceView. This article focuses on this approach, which is detailed in Section III.
III. OpenGL camera preview
This approach involves three components, introduced in turn below: SurfaceTexture, GLSurfaceView, and GLSurfaceView.Renderer.
3.1 SurfaceTexture
SurfaceTexture is used to capture camera preview data after the camera is started.
public class SurfaceTexture {
    /**
     * Register an OnFrameAvailableListener callback;
     * its onFrameAvailable method is called when new data is available to the SurfaceTexture
     */
    public void setOnFrameAvailableListener(SurfaceTexture.OnFrameAvailableListener listener) {
        setOnFrameAvailableListener(listener, null);
    }

    /** Update the texture image to the most recent frame from the image stream */
    public void updateTexImage() {
        nativeUpdateTexImage();
    }
}
3.2 GLSurfaceView
In the GLSurfaceView, when the SurfaceTexture has new data (onFrameAvailable), call requestRender() to trigger a re-render (the onDrawFrame() method):
class MyGLSurfaceView(context: Context?, attrs: AttributeSet?) :
    GLSurfaceView(context, attrs), SurfaceTexture.OnFrameAvailableListener {

    private val renderer: MyGLRenderer

    init {
        setEGLContextClientVersion(2)
        renderer = MyGLRenderer(this)
        setRenderer(renderer)
        renderMode = RENDERMODE_WHEN_DIRTY
    }

    override fun onFrameAvailable(surfaceTexture: SurfaceTexture?) {
        // renderMode is set to RENDERMODE_WHEN_DIRTY;
        // onFrameAvailable() is called when the camera passes new data to the SurfaceTexture,
        // so call requestRender() here to trigger a render update of the Surface
        requestRender()
    }
}
3.3 GLSurfaceView.Renderer
The GLSurfaceView.Renderer works simply: after onSurfaceCreated, it opens the camera and passes the camera preview data to the SurfaceTexture. The SurfaceTexture's listener is set to the GLSurfaceView, and the main drawing work is done in our custom CameraDrawer class:
class MyGLRenderer(private val frameAvailableListener: SurfaceTexture.OnFrameAvailableListener) :
    GLSurfaceView.Renderer {

    private lateinit var cameraDrawer: CameraDrawer
    private val cameraManager: Camera1Utils = Camera1Utils()

    override fun onSurfaceCreated(gl: GL10?, config: EGLConfig?) {
        GLES20.glClearColor(0.0f, 0.0f, 0.0f, 1.0f)
        cameraDrawer = CameraDrawer()
        cameraDrawer.getSurfaceTexture().setOnFrameAvailableListener(frameAvailableListener)
        cameraManager.openCamera()
        cameraManager.setPreviewTexture(cameraDrawer.getSurfaceTexture())
    }

    override fun onSurfaceChanged(gl: GL10?, width: Int, height: Int) {
        GLES20.glViewport(0, 0, width, height)
    }

    override fun onDrawFrame(gl: GL10?) {
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT)
        cameraDrawer.getSurfaceTexture().updateTexImage()
        cameraDrawer.draw()
    }
}
3.4 Drawing camera textures
Let’s take a look at how our custom CameraDrawer class draws the camera preview.
3.4.1 Creating a Texture
Create the texture using the OpenGLUtils tool class introduced in Section 1.2 of Android OpenGL basics. The difference is that the camera texture type is GLES11Ext.GL_TEXTURE_EXTERNAL_OES:
val texture = OpenGLUtils.createTextures(
GLES11Ext.GL_TEXTURE_EXTERNAL_OES, 1,
GLES20.GL_NEAREST, GLES20.GL_LINEAR,
GLES20.GL_CLAMP_TO_EDGE, GLES20.GL_CLAMP_TO_EDGE
)
3.4.2 Camera Texture GLSL
Unlike an ordinary 2D texture, the camera texture is sampled in the fragment shader through uniform samplerExternalOES s_texture rather than a sampler2D, which requires enabling the GL_OES_EGL_image_external extension:
/** Vertex shader code */
private val vertexShaderCode = """
    attribute vec4 vPosition;
    attribute vec2 inputTextureCoordinate;
    varying vec2 textureCoordinate;
    void main() {
        gl_Position = vPosition;
        textureCoordinate = inputTextureCoordinate;
    }
"""
/** Fragment shader code */
private val fragmentShaderCode = """
    #extension GL_OES_EGL_image_external : require
    precision mediump float;
    varying vec2 textureCoordinate;
    uniform samplerExternalOES s_texture;
    void main() {
        gl_FragColor = texture2D(s_texture, textureCoordinate);
    }
"""
3.4.3 Camera texture vertex coordinates
Previewing the camera in OpenGL is really just drawing the camera texture onto a quadrilateral. The difference from the 2D image texture in Android OpenGL basics (three, draw Bitmap texture) is that the camera data starts out in landscape orientation, with (0, 0) at the top-left corner. So to make the camera match the portrait preview we want, the texture coordinates corresponding to the vertices must be set to:
// Coordinates of the quadrilateral vertices
private var squareCoords = floatArrayOf(
    -1f, 1f, 0.0f,   // top left
    -1f, -1f, 0.0f,  // bottom left
    1f, -1f, 0.0f,   // bottom right
    1f, 1f, 0.0f     // top right
)
// Texture coordinates corresponding to the vertices
private var textureVertices = floatArrayOf(
    0f, 1f,  // top left
    1f, 1f,  // bottom left
    1f, 0f,  // bottom right
    0f, 0f   // top right
)
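Before OpenGL can read these coordinate arrays, they must be copied into direct, native-order NIO buffers, which is exactly what the CameraDrawer class does in the next section. A minimal plain-Java sketch of that packing (the class and method names are illustrative, not part of the original code):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

public class BufferDemo {
    /** Wrap a float array in a direct, native-order FloatBuffer that OpenGL can read. */
    static FloatBuffer toFloatBuffer(float[] data) {
        FloatBuffer buffer = ByteBuffer
                .allocateDirect(data.length * 4)   // 4 bytes per float
                .order(ByteOrder.nativeOrder())    // match the native/GPU byte order
                .asFloatBuffer();
        buffer.put(data);
        buffer.position(0);                        // rewind so GL reads from the start
        return buffer;
    }

    public static void main(String[] args) {
        float[] textureVertices = {0f, 1f, 1f, 1f, 1f, 0f, 0f, 0f};
        FloatBuffer buf = toFloatBuffer(textureVertices);
        System.out.println(buf.capacity());  // 8
        System.out.println(buf.position());  // 0
    }
}
```

Resetting the position to 0 after put() matters: glVertexAttribPointer reads from the buffer's current position, so a non-rewound buffer would feed OpenGL no data.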
3.4.4 Summary
With the above changes, the actual draw() method is exactly the same as in Android OpenGL basics (three, draw Bitmap texture). The complete code is as follows:
class CameraDrawer {
    /** Vertex shader code */
    private val vertexShaderCode = """
        attribute vec4 vPosition;
        attribute vec2 inputTextureCoordinate;
        varying vec2 textureCoordinate;
        void main() {
            gl_Position = vPosition;
            textureCoordinate = inputTextureCoordinate;
        }
    """

    /** Fragment shader code */
    private val fragmentShaderCode = """
        #extension GL_OES_EGL_image_external : require
        precision mediump float;
        varying vec2 textureCoordinate;
        uniform samplerExternalOES s_texture;
        void main() {
            gl_FragColor = texture2D(s_texture, textureCoordinate);
        }
    """

    /** Shader program ID reference */
    private var mProgram = 0

    /** Camera preview SurfaceTexture */
    private var cameraSurfaceTexture: SurfaceTexture

    // Coordinates of the quadrilateral vertices
    private var squareCoords = floatArrayOf(
        -1f, 1f, 0.0f,   // top left
        -1f, -1f, 0.0f,  // bottom left
        1f, -1f, 0.0f,   // bottom right
        1f, 1f, 0.0f     // top right
    )

    // Texture coordinates corresponding to the vertices
    private var textureVertices = floatArrayOf(
        0f, 1f,  // top left
        1f, 1f,  // bottom left
        1f, 0f,  // bottom right
        0f, 0f   // top right
    )

    // Buffer for the four vertex coordinates
    private val vertexBuffer: FloatBuffer =
        ByteBuffer.allocateDirect(squareCoords.size * 4).order(ByteOrder.nativeOrder())
            .asFloatBuffer().apply {
                put(squareCoords)
                position(0)
            }

    // Drawing order of the four vertices (two triangles)
    private val drawOrder = shortArrayOf(0, 1, 2, 0, 2, 3)

    // Buffer for the drawing-order array
    private val drawListBuffer: ShortBuffer =
        ByteBuffer.allocateDirect(drawOrder.size * 2).order(ByteOrder.nativeOrder())
            .asShortBuffer().apply {
                put(drawOrder)
                position(0)
            }

    // Buffer for the four texture coordinates
    private val textureVerticesBuffer: FloatBuffer =
        ByteBuffer.allocateDirect(textureVertices.size * 4).order(ByteOrder.nativeOrder())
            .asFloatBuffer().apply {
                put(textureVertices)
                position(0)
            }

    private var textureID = 0

    // Number of coordinates per vertex
    private val COORDS_PER_VERTEX = 3

    // Number of coordinates per texture vertex
    private val COORDS_PER_TEXTURE_VERTEX = 2

    private val vertexStride: Int = COORDS_PER_VERTEX * 4
    private val textVertexStride: Int = COORDS_PER_TEXTURE_VERTEX * 4

    init {
        // Compile the vertex shader and the fragment shader
        val vertexShader: Int = loadShader(GLES20.GL_VERTEX_SHADER, vertexShaderCode)
        val fragmentShader: Int = loadShader(GLES20.GL_FRAGMENT_SHADER, fragmentShaderCode)
        // glCreateProgram creates a shader program and returns an ID reference to the new program object
        mProgram = GLES20.glCreateProgram().also {
            // Attach the vertex shader to the program object
            GLES20.glAttachShader(it, vertexShader)
            // Attach the fragment shader to the program object
            GLES20.glAttachShader(it, fragmentShader)
            // Link and create an executable OpenGL ES program object
            GLES20.glLinkProgram(it)
        }
        val texture = OpenGLUtils.createTextures(
            GLES11Ext.GL_TEXTURE_EXTERNAL_OES, 1,
            GLES20.GL_NEAREST, GLES20.GL_LINEAR,
            GLES20.GL_CLAMP_TO_EDGE, GLES20.GL_CLAMP_TO_EDGE
        )
        textureID = texture[0]
        cameraSurfaceTexture = SurfaceTexture(textureID)
    }

    fun getSurfaceTexture(): SurfaceTexture {
        return cameraSurfaceTexture
    }

    fun draw() {
        // Add the program to the OpenGL ES environment
        GLES20.glUseProgram(mProgram)
        // Get the vPosition variable in the vertex shader (available because the shader was compiled earlier); it is identified by a unique ID
        val position = GLES20.glGetAttribLocation(mProgram, "vPosition")
        // Enable the vertex position attribute
        GLES20.glEnableVertexAttribArray(position)
        // Pass the vertex data to the vPosition variable referenced by position
        GLES20.glVertexAttribPointer(
            position, COORDS_PER_VERTEX, GLES20.GL_FLOAT,
            false, vertexStride, vertexBuffer
        )
        // Activate texture unit 0
        GLES20.glActiveTexture(GLES20.GL_TEXTURE0)
        // Bind the camera texture; note the target is GL_TEXTURE_EXTERNAL_OES, not GL_TEXTURE_2D
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureID)
        // Get the inputTextureCoordinate variable (texture coordinates) in the vertex shader; it is identified by a unique ID
        val textureCoordinate = GLES20.glGetAttribLocation(mProgram, "inputTextureCoordinate")
        // Enable the texture coordinate attribute
        GLES20.glEnableVertexAttribArray(textureCoordinate)
        // Pass the texture coordinate data to the inputTextureCoordinate variable
        GLES20.glVertexAttribPointer(
            textureCoordinate, COORDS_PER_TEXTURE_VERTEX, GLES20.GL_FLOAT,
            false, textVertexStride, textureVerticesBuffer
        )
        // Draw the quadrilateral in the order specified by drawListBuffer
        GLES20.glDrawElements(
            GLES20.GL_TRIANGLE_STRIP, drawOrder.size,
            GLES20.GL_UNSIGNED_SHORT, drawListBuffer
        )
        // Disable the vertex attributes when done
        GLES20.glDisableVertexAttribArray(position)
        GLES20.glDisableVertexAttribArray(textureCoordinate)
    }

    private fun loadShader(type: Int, shaderCode: String): Int {
        // glCreateShader creates a vertex or fragment shader and returns an ID reference to it
        val shader = GLES20.glCreateShader(type)
        // Attach the source code to the shader, then compile it
        GLES20.glShaderSource(shader, shaderCode)
        GLES20.glCompileShader(shader)
        return shader
    }
}
IV. Camera filters
Similar to the image post-processing described in Android OpenGL basics (four, image post-processing), you can modify the fragment shader code to implement camera filters in the preview.
4.1 Grayscale filter
/** Fragment shader code */
private val fragmentShaderCode = """
    #extension GL_OES_EGL_image_external : require
    precision mediump float;
    varying vec2 textureCoordinate;
    uniform samplerExternalOES s_texture;
    void main() {
        gl_FragColor = texture2D(s_texture, textureCoordinate);
        // Weighted (luminance-preserving) grayscale
        float average = 0.2126 * gl_FragColor.r + 0.7152 * gl_FragColor.g + 0.0722 * gl_FragColor.b;
        gl_FragColor = vec4(average, average, average, 1.0);
    }
"""
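The weights 0.2126 / 0.7152 / 0.0722 are the Rec. 709 luma coefficients: green contributes most to perceived brightness, blue least, and the three weights sum to 1 so a white pixel stays white. A minimal plain-Java sketch of the same computation (the class and method names are illustrative):

```java
public class GrayscaleDemo {
    /** Rec. 709 luma, as computed per-fragment in the shader above. */
    static float luminance(float r, float g, float b) {
        return 0.2126f * r + 0.7152f * g + 0.0722f * b;
    }

    public static void main(String[] args) {
        System.out.println(luminance(1f, 1f, 1f));  // pure white -> ~1.0
        System.out.println(luminance(0f, 1f, 0f));  // pure green is far brighter...
        System.out.println(luminance(0f, 0f, 1f));  // ...than pure blue
    }
}
```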
4.2 Edge detection filter
/** Fragment shader code */
private val fragmentShaderCode = """
    #extension GL_OES_EGL_image_external : require
    precision mediump float;
    varying vec2 textureCoordinate;
    uniform samplerExternalOES s_texture;
    const float offset = 1.0 / 300.0;
    void main() {
        // Offsets of the 3x3 sampling positions around the current fragment
        vec2 offsets[9];
        offsets[0] = vec2(-offset, offset);   // top left
        offsets[1] = vec2(0.0, offset);       // top center
        offsets[2] = vec2(offset, offset);    // top right
        offsets[3] = vec2(-offset, 0.0);      // left
        offsets[4] = vec2(0.0, 0.0);          // center
        offsets[5] = vec2(offset, 0.0);       // right
        offsets[6] = vec2(-offset, -offset);  // bottom left
        offsets[7] = vec2(0.0, -offset);      // bottom center
        offsets[8] = vec2(offset, -offset);   // bottom right
        // Edge-detection convolution kernel
        float kernel[9];
        kernel[0] = 1.0; kernel[1] = 1.0; kernel[2] = 1.0;
        kernel[3] = 1.0; kernel[4] = -8.0; kernel[5] = 1.0;
        kernel[6] = 1.0; kernel[7] = 1.0; kernel[8] = 1.0;
        // Sample the 9 surrounding texels
        vec3 sampleTex[9];
        for (int i = 0; i < 9; i++) {
            sampleTex[i] = vec3(texture2D(s_texture, textureCoordinate.xy + offsets[i]));
        }
        // Weighted sum of the samples and the kernel
        vec3 col = vec3(0.0);
        for (int i = 0; i < 9; i++) {
            col += sampleTex[i] * kernel[i];
        }
        gl_FragColor = vec4(col, 1.0);
    }
"""
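The filter works because the kernel weights sum to zero: in a flat region the weighted samples cancel out (black output), while across an edge they do not, leaving a bright line. A minimal plain-Java sketch of the same 3x3 convolution on a tiny grayscale image (names are illustrative; coordinate clamping stands in for GL_CLAMP_TO_EDGE):

```java
public class ConvolutionDemo {
    /**
     * Apply a 3x3 kernel at pixel (x, y) of a grayscale image,
     * clamping sample coordinates at the image border.
     */
    static float convolve(float[][] img, int x, int y, float[] kernel) {
        int h = img.length, w = img[0].length;
        float sum = 0f;
        for (int ky = -1; ky <= 1; ky++) {
            for (int kx = -1; kx <= 1; kx++) {
                int sx = Math.min(Math.max(x + kx, 0), w - 1);
                int sy = Math.min(Math.max(y + ky, 0), h - 1);
                sum += img[sy][sx] * kernel[(ky + 1) * 3 + (kx + 1)];
            }
        }
        return sum;
    }

    public static void main(String[] args) {
        // An edge-detection kernel: neighbours 1, centre -8, summing to zero
        float[] kernel = {1, 1, 1, 1, -8, 1, 1, 1, 1};
        // Flat region: responses cancel, output is zero
        float[][] flat = {{0.5f, 0.5f, 0.5f}, {0.5f, 0.5f, 0.5f}, {0.5f, 0.5f, 0.5f}};
        System.out.println(convolve(flat, 1, 1, kernel));  // 0.0
        // Isolated bright pixel: strong response
        float[][] dot = {{0f, 0f, 0f}, {0f, 1f, 0f}, {0f, 0f, 0f}};
        System.out.println(convolve(dot, 1, 1, kernel));   // -8.0
    }
}
```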
For other types of filters, the fragment shader code can be modified similarly. Below are the effects of the camera's original image and several other filters:
The End
Please follow me to unlock more skills: BC's juejin homepage ~ 💐 BC's CSDN homepage ~ 💐💐
References:
- Android OpenGL developer documentation: developer.android.com/guide/topic…
- OpenGL learning materials: learnopengl-cn.github.io
- Camera development documentation: developer.android.com/guide/topic…
- SurfaceView developer documentation: source.android.com/devices/gra…
- GPUImage: github.com/cats-oss/an…
- Android OpenGL basics (one, draw a triangle and quadrilateral): juejin.cn/post/707675…
- Android OpenGL basics (two, coordinate system): juejin.cn/post/707713…
- Android OpenGL basics (three, draw Bitmap texture): juejin.cn/post/707967…
- Android OpenGL basics (four, image post-processing): juejin.cn/post/708073…
- Android OpenGL column: juejin.cn/column/7076…