Author: cain_huang
Source: https://www.jianshu.com/p/c127387cd504
To draw a sphere, we first need to know how its vertices (and texture coordinates) are built.
A sphere is usually approximated by splitting its surface into many small planar patches and then building each patch out of triangles.
The more patches we use, the closer the result looks to a true sphere.
The vertices of the sphere are connected to one another as shown below:
Figure: how the sphere vertices are connected
Now that we know how the vertices are connected, how do we calculate their positions?
As shown below:
Figure: deriving the 3D coordinates of a point on the sphere
From basic trigonometry we get the coordinates of a point on a sphere of radius R, where a is the angle of elevation above the XZ plane (the "latitude") and b is the angle around the Y axis measured from the positive Z axis (the "longitude"):

x0 = R * cos(a) * sin(b);
y0 = R * sin(a);
z0 = R * cos(a) * cos(b);
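As a quick sanity check (a small sketch of our own, not part of the original article), plugging simple angles into the formula gives the points you would expect:

// Sanity check of the sphere formula (illustrative only).
double R = 1.0;
double a = 0, b = 0;                        // on the equator, facing +Z
double x = R * Math.cos(a) * Math.sin(b);   // 0
double y = R * Math.sin(a);                 // 0
double z = R * Math.cos(a) * Math.cos(b);   // 1  -> the point (0, 0, 1)
// With a = Math.PI / 2 (90 degrees) the same formula gives (0, 1, 0), the top of the sphere.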
This gives the position of a single vertex on the sphere. Stepping both angles by a fixed increment, we can generate a grid of vertices and connect them to form the sphere we need. Note that the code below measures the vertical angle vAngle from the positive Z axis (a polar angle) rather than from the equator, so sin and cos swap roles compared with the formula above.
The vertex calculation is as follows:
for (double vAngle = 0; vAngle < Math.PI; vAngle = vAngle + angleSpan) { // vertical
    for (double hAngle = 0; hAngle < 2 * Math.PI; hAngle = hAngle + angleSpan) { // horizontal
        float x0 = (float) (radius * Math.sin(vAngle) * Math.cos(hAngle));
        float y0 = (float) (radius * Math.sin(vAngle) * Math.sin(hAngle));
        float z0 = (float) (radius * Math.cos(vAngle));

        float x1 = (float) (radius * Math.sin(vAngle) * Math.cos(hAngle + angleSpan));
        float y1 = (float) (radius * Math.sin(vAngle) * Math.sin(hAngle + angleSpan));
        float z1 = (float) (radius * Math.cos(vAngle));

        float x2 = (float) (radius * Math.sin(vAngle + angleSpan) * Math.cos(hAngle + angleSpan));
        float y2 = (float) (radius * Math.sin(vAngle + angleSpan) * Math.sin(hAngle + angleSpan));
        float z2 = (float) (radius * Math.cos(vAngle + angleSpan));

        float x3 = (float) (radius * Math.sin(vAngle + angleSpan) * Math.cos(hAngle));
        float y3 = (float) (radius * Math.sin(vAngle + angleSpan) * Math.sin(hAngle));
        float z3 = (float) (radius * Math.cos(vAngle + angleSpan));

        // first triangle of the patch: v1, v3, v0
        vertex.add(x1); vertex.add(y1); vertex.add(z1);
        vertex.add(x3); vertex.add(y3); vertex.add(z3);
        vertex.add(x0); vertex.add(y0); vertex.add(z0);

        // second triangle of the patch: v1, v2, v3
        vertex.add(x1); vertex.add(y1); vertex.add(z1);
        vertex.add(x2); vertex.add(y2); vertex.add(z2);
        vertex.add(x3); vertex.add(y3); vertex.add(z3);
    }
}
The loop above steps across the sphere horizontally and vertically at a fixed angular interval. Each step produces a small quad with corners v0, v1, v2 and v3, which is split into two triangles (v1, v3, v0) and (v1, v2, v3); stitching all of these patches together gives the sphere we need. A rough estimate of how much vertex data this generates is sketched below.
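As a back-of-the-envelope sketch (our own numbers, not from the original article), with angleSpan = Math.PI / 90 (2 degrees) the loops run 90 x 180 times, so:

// Rough estimate of the vertex data produced by the loop above (illustrative only).
int verticalSteps = 90;                       // Math.PI / angleSpan
int horizontalSteps = 180;                    // 2 * Math.PI / angleSpan
int quads = verticalSteps * horizontalSteps;  // 16,200 patches
int vertices = quads * 6;                     // 97,200 vertices (2 triangles per patch)
int floats = vertices * 3;                    // 291,600 floats (x, y, z per vertex)
int bytes = floats * 4;                       // about 1.1 MB of vertex data

A larger angleSpan gives a coarser but cheaper sphere; a smaller one gives a smoother but heavier mesh.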
Ok, let’s look at the implementation:
- Custom GLSurfaceView:
public class SphereSurfaceView extends GLSurfaceView {

    private SphereRender mSphereRender;

    public SphereSurfaceView(Context context) {
        super(context);
        init(context);
    }

    public SphereSurfaceView(Context context, AttributeSet attrs) {
        super(context, attrs);
        init(context);
    }

    private void init(Context context) {
        mSphereRender = new SphereRender(context);
        setEGLContextClientVersion(2);
        setRenderer(mSphereRender);
        setRenderMode(GLSurfaceView.RENDERMODE_CONTINUOUSLY);
        setOnTouchListener(new OnTouchListener() {
            @Override
            public boolean onTouch(View v, MotionEvent event) {
                switch (event.getAction()) {
                    case MotionEvent.ACTION_DOWN:
                        // rotate 20 degrees around the Y axis
                        mSphereRender.rotate(20f, 0, 1, 0);
                        break;
                }
                return true;
            }
        });
    }
}
When you tap the GLSurfaceView, the sphere rotates 20 degrees around the Y axis.
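One thing worth noting (a sketch of our own, not part of the original article): rotate() runs on the UI thread, while the GL thread reads mModelMatrix every frame in onDrawFrame(). To keep the matrix update on the GL thread, the listener in init() could post it with queueEvent instead:

// Sketch: post the model-matrix update to the GL thread so it cannot race with drawing.
setOnTouchListener(new OnTouchListener() {
    @Override
    public boolean onTouch(View v, MotionEvent event) {
        if (event.getAction() == MotionEvent.ACTION_DOWN) {
            queueEvent(new Runnable() {
                @Override
                public void run() {
                    mSphereRender.rotate(20f, 0, 1, 0);
                }
            });
        }
        return true;
    }
});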
- Let’s look at the Render implementation:
public class SphereRender implements GLSurfaceView.Renderer {

    // view matrix
    private float[] mViewMatrix = new float[16];
    // model matrix
    private float[] mModelMatrix = new float[16];
    // projection matrix
    private float[] mProjectionMatrix = new float[16];
    // combined transform (MVP) matrix
    private float[] mMVPMatrix = new float[16];

    private SphereFilter mSphereFilter;
    private Context mContext;

    public SphereRender(Context context) {
        mContext = context;
    }

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        // set the clear color
        GLES20.glClearColor(0.5f, 0.5f, 0.5f, 1.0f);
        // enable depth testing
        GLES20.glEnable(GLES20.GL_DEPTH_TEST);
        // enable back-face culling
        GLES20.glEnable(GLES20.GL_CULL_FACE);
        // create the SphereFilter entity
        mSphereFilter = new SphereFilter();
        // initialize the matrices
        initMatrix();
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        GLES20.glViewport(0, 0, width, height);
        float ratio = (float) width / height;
        // projection matrix
        Matrix.frustumM(mProjectionMatrix, 0, -ratio, ratio, -1, 1, 20, 100);
        // view matrix
        Matrix.setLookAtM(mViewMatrix, 0,
                0f, 0f, 30f,     // camera position
                0f, 0f, 0f,      // target position
                0f, 1.0f, 0.0f); // camera up direction
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        // clear the color buffer and the depth buffer
        GLES20.glClear(GLES20.GL_DEPTH_BUFFER_BIT | GLES20.GL_COLOR_BUFFER_BIT);
        // calculate the combined transform matrix and draw the sphere
        calculateMatrix();
        mSphereFilter.setMVPMatrix(mMVPMatrix);
        mSphereFilter.drawSphere();
    }

    /**
     * Initialize the matrices.
     */
    private void initMatrix() {
        Matrix.setIdentityM(mViewMatrix, 0);
        Matrix.setIdentityM(mModelMatrix, 0);
        Matrix.setIdentityM(mProjectionMatrix, 0);
        Matrix.setIdentityM(mMVPMatrix, 0);
    }

    /**
     * Calculate the combined transform matrix.
     */
    private void calculateMatrix() {
        Matrix.multiplyMM(mMVPMatrix, 0, mViewMatrix, 0, mModelMatrix, 0);
        Matrix.multiplyMM(mMVPMatrix, 0, mProjectionMatrix, 0, mMVPMatrix, 0);
    }

    /**
     * Rotate the model matrix.
     * @param angle
     * @param x
     * @param y
     * @param z
     */
    public void rotate(float angle, float x, float y, float z) {
        Matrix.rotateM(mModelMatrix, 0, angle, x, y, z);
    }
}
The Render mainly enables depth testing, sets up the view, projection and model matrices, computes the combined transform matrix on each frame, passes it to the sphere-drawing entity, and then draws the sphere.
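One detail worth flagging (our note, not from the original article): calculateMatrix() passes mMVPMatrix as both the result and an operand of the second Matrix.multiplyMM call. The Android documentation states that the result values are undefined when the result array overlaps an input, so a safer variant uses a temporary array (the field names match the Render above):

// Sketch: avoid passing the same array as both output and input to multiplyMM.
private final float[] mTempMatrix = new float[16];

private void calculateMatrix() {
    // temp = view * model
    Matrix.multiplyMM(mTempMatrix, 0, mViewMatrix, 0, mModelMatrix, 0);
    // mvp = projection * view * model
    Matrix.multiplyMM(mMVPMatrix, 0, mProjectionMatrix, 0, mTempMatrix, 0);
}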
- The sphere-drawing entity is implemented as follows:
public class SphereFilter {

    private static final String VERTEX_SHADER =
            "uniform mat4 u_Matrix;       // final transform matrix\n" +
            "attribute vec4 a_Position;   // vertex position\n" +
            "varying vec4 vPosition;      // vertex position passed to the fragment shader\n" +
            "void main() {\n" +
            "    gl_Position = u_Matrix * a_Position;\n" +
            "    vPosition = a_Position;\n" +
            "}";

    private static final String FRAGMENT_SHADER =
            "precision mediump float;\n" +
            "varying vec4 vPosition;\n" +
            "void main() {\n" +
            "    float uR = 0.6;              // radius of the ball\n" +
            "    vec4 color;\n" +
            "    float n = 8.0;               // split the range into n cells per axis\n" +
            "    float span = 2.0 * uR / n;   // side length of each cell\n" +
            "    // compute the row, column and layer of this fragment\n" +
            "    int i = int((vPosition.x + uR) / span);\n" +
            "    int j = int((vPosition.y + uR) / span);\n" +
            "    int k = int((vPosition.z + uR) / span);\n" +
            "    int colorType = int(mod(float(i + j + k), 2.0));\n" +
            "    if (colorType == 1) {        // odd cells are green\n" +
            "        color = vec4(0.2, 1.0, 0.129, 0);\n" +
            "    } else {                     // even cells are white\n" +
            "        color = vec4(1.0, 1.0, 1.0, 0);\n" +
            "    }\n" +
            "    // pass the color to the fragment\n" +
            "    gl_FragColor = color;\n" +
            "}";

    // radius of the sphere
    private float radius = 1.0f;
    // angular step between vertices
    private final double angleSpan = Math.PI / 90f;
    // vertex buffer
    private FloatBuffer mVertexBuffer;
    // number of vertices
    private int mVertexCount = 0;
    // each float takes 4 bytes
    private static final int BYTES_PER_FLOAT = 4;
    // coordinates per vertex (x, y, z)
    private static final int COORDS_PER_VERTEX = 3;

    private int mProgramHandle;
    private int muMatrixHandle;
    private int maPositionHandle;

    private float[] mMVPMatrix = new float[16];

    public SphereFilter() {
        initSphereVertex();
        createProgram();
        Matrix.setIdentityM(mMVPMatrix, 0);
    }

    /**
     * Initialize the sphere vertices.
     */
    public void initSphereVertex() {
        ArrayList<Float> vertex = new ArrayList<Float>();
        for (double vAngle = 0; vAngle < Math.PI; vAngle = vAngle + angleSpan) { // vertical
            for (double hAngle = 0; hAngle < 2 * Math.PI; hAngle = hAngle + angleSpan) { // horizontal
                float x0 = (float) (radius * Math.sin(vAngle) * Math.cos(hAngle));
                float y0 = (float) (radius * Math.sin(vAngle) * Math.sin(hAngle));
                float z0 = (float) (radius * Math.cos(vAngle));

                float x1 = (float) (radius * Math.sin(vAngle) * Math.cos(hAngle + angleSpan));
                float y1 = (float) (radius * Math.sin(vAngle) * Math.sin(hAngle + angleSpan));
                float z1 = (float) (radius * Math.cos(vAngle));

                float x2 = (float) (radius * Math.sin(vAngle + angleSpan) * Math.cos(hAngle + angleSpan));
                float y2 = (float) (radius * Math.sin(vAngle + angleSpan) * Math.sin(hAngle + angleSpan));
                float z2 = (float) (radius * Math.cos(vAngle + angleSpan));

                float x3 = (float) (radius * Math.sin(vAngle + angleSpan) * Math.cos(hAngle));
                float y3 = (float) (radius * Math.sin(vAngle + angleSpan) * Math.sin(hAngle));
                float z3 = (float) (radius * Math.cos(vAngle + angleSpan));

                // first triangle of the patch: v1, v3, v0
                vertex.add(x1); vertex.add(y1); vertex.add(z1);
                vertex.add(x3); vertex.add(y3); vertex.add(z3);
                vertex.add(x0); vertex.add(y0); vertex.add(z0);

                // second triangle of the patch: v1, v2, v3
                vertex.add(x1); vertex.add(y1); vertex.add(z1);
                vertex.add(x2); vertex.add(y2); vertex.add(z2);
                vertex.add(x3); vertex.add(y3); vertex.add(z3);
            }
        }
        mVertexCount = vertex.size() / COORDS_PER_VERTEX;
        float vertices[] = new float[vertex.size()];
        for (int i = 0; i < vertex.size(); i++) {
            vertices[i] = vertex.get(i);
        }
        mVertexBuffer = GlUtil.createFloatBuffer(vertices);
    }

    /**
     * Create the program and look up the attribute/uniform locations.
     */
    private void createProgram() {
        mProgramHandle = GlUtil.createProgram(VERTEX_SHADER, FRAGMENT_SHADER);
        maPositionHandle = GLES20.glGetAttribLocation(mProgramHandle, "a_Position");
        muMatrixHandle = GLES20.glGetUniformLocation(mProgramHandle, "u_Matrix");
    }

    /**
     * Draw the sphere.
     */
    public void drawSphere() {
        GLES20.glUseProgram(mProgramHandle);
        GLES20.glVertexAttribPointer(maPositionHandle, COORDS_PER_VERTEX,
                GLES20.GL_FLOAT, false, 0, mVertexBuffer);
        GLES20.glEnableVertexAttribArray(maPositionHandle);
        GLES20.glUniformMatrix4fv(muMatrixHandle, 1, false, mMVPMatrix, 0);
        GLES20.glDrawArrays(GLES20.GL_TRIANGLES, 0, mVertexCount);
        GLES20.glDisableVertexAttribArray(maPositionHandle);
        GLES20.glUseProgram(0);
    }

    /**
     * Set the combined transform matrix.
     */
    public void setMVPMatrix(float[] matrix) {
        mMVPMatrix = matrix;
    }
}
The implementation mainly creates the program, binds the a_Position attribute and the u_Matrix uniform, sets the combined transform matrix, and then draws the sphere with glDrawArrays.
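The article does not show any cleanup. As a small sketch of our own (it assumes the mProgramHandle field above and is not part of the original code), releasing the program when the filter is no longer needed could look like this:

// Sketch: free the GL program owned by SphereFilter (not in the original code).
public void release() {
    if (mProgramHandle != 0) {
        GLES20.glDeleteProgram(mProgramHandle);
        mProgramHandle = 0;
    }
}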
- Finally, the GlUtil helper class wraps the methods for creating programs, creating mipmap textures and so on:
public class GlUtil {

    public static final String TAG = "GlUtil";

    // initialization failed
    public static final int GL_NOT_INIT = -1;

    // identity matrix
    public static final float[] IDENTITY_MATRIX;
    static {
        IDENTITY_MATRIX = new float[16];
        Matrix.setIdentityM(IDENTITY_MATRIX, 0);
    }

    private static final int SIZEOF_FLOAT = 4;

    private GlUtil() {}

    /**
     * Create a program.
     * @param vertexSource
     * @param fragmentSource
     * @return
     */
    public static int createProgram(String vertexSource, String fragmentSource) {
        int vertexShader = loadShader(GLES30.GL_VERTEX_SHADER, vertexSource);
        if (vertexShader == 0) {
            return 0;
        }
        int pixelShader = loadShader(GLES30.GL_FRAGMENT_SHADER, fragmentSource);
        if (pixelShader == 0) {
            return 0;
        }
        int program = GLES30.glCreateProgram();
        checkGlError("glCreateProgram");
        if (program == 0) {
            Log.e(TAG, "Could not create program");
        }
        GLES30.glAttachShader(program, vertexShader);
        checkGlError("glAttachShader");
        GLES30.glAttachShader(program, pixelShader);
        checkGlError("glAttachShader");
        GLES30.glLinkProgram(program);
        int[] linkStatus = new int[1];
        GLES30.glGetProgramiv(program, GLES30.GL_LINK_STATUS, linkStatus, 0);
        if (linkStatus[0] != GLES30.GL_TRUE) {
            Log.e(TAG, "Could not link program: ");
            Log.e(TAG, GLES30.glGetProgramInfoLog(program));
            GLES30.glDeleteProgram(program);
            program = 0;
        }
        return program;
    }

    /**
     * Load a shader.
     * @param shaderType
     * @param source
     * @return
     */
    public static int loadShader(int shaderType, String source) {
        int shader = GLES30.glCreateShader(shaderType);
        checkGlError("glCreateShader type=" + shaderType);
        GLES30.glShaderSource(shader, source);
        GLES30.glCompileShader(shader);
        int[] compiled = new int[1];
        GLES30.glGetShaderiv(shader, GLES30.GL_COMPILE_STATUS, compiled, 0);
        if (compiled[0] == 0) {
            Log.e(TAG, "Could not compile shader " + shaderType + ":");
            Log.e(TAG, " " + GLES30.glGetShaderInfoLog(shader));
            GLES30.glDeleteShader(shader);
            shader = 0;
        }
        return shader;
    }

    /**
     * Check for GL errors.
     * @param op
     */
    public static void checkGlError(String op) {
        int error = GLES30.glGetError();
        if (error != GLES30.GL_NO_ERROR) {
            String msg = op + ": glError 0x" + Integer.toHexString(error);
            Log.e(TAG, msg);
            throw new RuntimeException(msg);
        }
    }

    /**
     * Create a FloatBuffer.
     * @param coords
     * @return
     */
    public static FloatBuffer createFloatBuffer(float[] coords) {
        ByteBuffer bb = ByteBuffer.allocateDirect(coords.length * SIZEOF_FLOAT);
        bb.order(ByteOrder.nativeOrder());
        FloatBuffer fb = bb.asFloatBuffer();
        fb.put(coords);
        fb.position(0);
        return fb;
    }

    /**
     * Create a FloatBuffer.
     * @param data
     * @return
     */
    public static FloatBuffer createFloatBuffer(ArrayList<Float> data) {
        float[] coords = new float[data.size()];
        for (int i = 0; i < coords.length; i++) {
            coords[i] = data.get(i);
        }
        return createFloatBuffer(coords);
    }

    /**
     * Create a texture object.
     * @param textureType
     * @return
     */
    public static int createTextureObject(int textureType) {
        int[] textures = new int[1];
        GLES30.glGenTextures(1, textures, 0);
        GlUtil.checkGlError("glGenTextures");
        int textureId = textures[0];
        GLES30.glBindTexture(textureType, textureId);
        GlUtil.checkGlError("glBindTexture " + textureId);
        GLES30.glTexParameterf(textureType, GLES30.GL_TEXTURE_MIN_FILTER, GLES30.GL_NEAREST);
        GLES30.glTexParameterf(textureType, GLES30.GL_TEXTURE_MAG_FILTER, GLES30.GL_LINEAR);
        GLES30.glTexParameterf(textureType, GLES30.GL_TEXTURE_WRAP_S, GLES30.GL_CLAMP_TO_EDGE);
        GLES30.glTexParameterf(textureType, GLES30.GL_TEXTURE_WRAP_T, GLES30.GL_CLAMP_TO_EDGE);
        GlUtil.checkGlError("glTexParameter");
        return textureId;
    }

    /**
     * Create a sampler2D framebuffer and its texture attachment.
     * @param frameBuffer
     * @param frameBufferTex
     * @param width
     * @param height
     */
    public static void createSampler2DFrameBuff(int[] frameBuffer, int[] frameBufferTex,
                                                int width, int height) {
        GLES30.glGenFramebuffers(1, frameBuffer, 0);
        GLES30.glGenTextures(1, frameBufferTex, 0);
        GLES30.glBindTexture(GLES30.GL_TEXTURE_2D, frameBufferTex[0]);
        GLES30.glTexImage2D(GLES30.GL_TEXTURE_2D, 0, GLES30.GL_RGBA, width, height, 0,
                GLES30.GL_RGBA, GLES30.GL_UNSIGNED_BYTE, null);
        GLES30.glTexParameterf(GLES30.GL_TEXTURE_2D, GLES30.GL_TEXTURE_MAG_FILTER, GLES30.GL_LINEAR);
        GLES30.glTexParameterf(GLES30.GL_TEXTURE_2D, GLES30.GL_TEXTURE_MIN_FILTER, GLES30.GL_LINEAR);
        GLES30.glTexParameterf(GLES30.GL_TEXTURE_2D, GLES30.GL_TEXTURE_WRAP_S, GLES30.GL_CLAMP_TO_EDGE);
        GLES30.glTexParameterf(GLES30.GL_TEXTURE_2D, GLES30.GL_TEXTURE_WRAP_T, GLES30.GL_CLAMP_TO_EDGE);
        GLES30.glBindFramebuffer(GLES30.GL_FRAMEBUFFER, frameBuffer[0]);
        GLES30.glFramebufferTexture2D(GLES30.GL_FRAMEBUFFER, GLES30.GL_COLOR_ATTACHMENT0,
                GLES30.GL_TEXTURE_2D, frameBufferTex[0], 0);
        GLES30.glBindTexture(GLES30.GL_TEXTURE_2D, 0);
        GLES30.glBindFramebuffer(GLES30.GL_FRAMEBUFFER, 0);
        checkGlError("createCamFrameBuff");
    }

    /**
     * Create a texture from a bitmap.
     * @param bitmap the bitmap image
     * @return
     */
    public static int createTexture(Bitmap bitmap) {
        int[] texture = new int[1];
        if (bitmap != null && !bitmap.isRecycled()) {
            // generate the texture
            GLES30.glGenTextures(1, texture, 0);
            checkGlError("glGenTexture");
            // bind the texture
            GLES30.glBindTexture(GLES30.GL_TEXTURE_2D, texture[0]);
            // minification filter: use the color of the single texel closest to the texture coordinate
            GLES30.glTexParameterf(GLES30.GL_TEXTURE_2D, GLES30.GL_TEXTURE_MIN_FILTER, GLES30.GL_NEAREST);
            // magnification filter: blend the closest texels with a weighted average
            GLES30.glTexParameterf(GLES30.GL_TEXTURE_2D, GLES30.GL_TEXTURE_MAG_FILTER, GLES30.GL_LINEAR);
            // wrap direction S: clamp texture coordinates to [1/2n, 1 - 1/2n] so they never blend with the border
            GLES30.glTexParameterf(GLES30.GL_TEXTURE_2D, GLES30.GL_TEXTURE_WRAP_S, GLES30.GL_CLAMP_TO_EDGE);
            // wrap direction T: clamp texture coordinates to [1/2n, 1 - 1/2n] so they never blend with the border
            GLES30.glTexParameterf(GLES30.GL_TEXTURE_2D, GLES30.GL_TEXTURE_WRAP_T, GLES30.GL_CLAMP_TO_EDGE);
            // generate a 2D texture from the bitmap with the parameters above
            GLUtils.texImage2D(GLES30.GL_TEXTURE_2D, 0, bitmap, 0);
            return texture[0];
        }
        return 0;
    }

    /**
     * Create an OES (external) texture.
     * @return
     */
    public static int createOESTexture() {
        int[] texture = new int[1];
        GLES20.glGenTextures(1, texture, 0);
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, texture[0]);
        GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_LINEAR);
        GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_LINEAR);
        GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GL10.GL_TEXTURE_WRAP_S, GL10.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GL10.GL_TEXTURE_WRAP_T, GL10.GL_CLAMP_TO_EDGE);
        return texture[0];
    }

    /**
     * Load a mipmap texture from the assets folder.
     * @param context
     * @param name
     * @return
     */
    public static int loadMipmapTextureFromAssets(Context context, String name) {
        int[] textureHandle = new int[1];
        GLES30.glGenTextures(1, textureHandle, 0);
        if (textureHandle[0] != 0) {
            Bitmap bitmap = getImageFromAssetsFile(context, name);
            GLES30.glBindTexture(GLES30.GL_TEXTURE_2D, textureHandle[0]);
            GLES30.glTexParameterf(GLES30.GL_TEXTURE_2D, GLES30.GL_TEXTURE_MAG_FILTER, GLES30.GL_LINEAR);
            GLES30.glTexParameterf(GLES30.GL_TEXTURE_2D, GLES30.GL_TEXTURE_MIN_FILTER, GLES30.GL_LINEAR);
            GLES30.glTexParameterf(GLES30.GL_TEXTURE_2D, GLES30.GL_TEXTURE_WRAP_S, GLES30.GL_CLAMP_TO_EDGE);
            GLES30.glTexParameterf(GLES30.GL_TEXTURE_2D, GLES30.GL_TEXTURE_WRAP_T, GLES30.GL_CLAMP_TO_EDGE);
            GLUtils.texImage2D(GLES30.GL_TEXTURE_2D, 0, bitmap, 0);
            bitmap.recycle();
        }
        if (textureHandle[0] == 0) {
            throw new RuntimeException("Error loading texture.");
        }
        return textureHandle[0];
    }

    /**
     * Load an image from the assets folder.
     * @param context
     * @param fileName
     * @return
     */
    public static Bitmap getImageFromAssetsFile(Context context, String fileName) {
        Bitmap bitmap = null;
        AssetManager manager = context.getResources().getAssets();
        try {
            InputStream is = manager.open(fileName);
            bitmap = BitmapFactory.decodeStream(is);
            is.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
        return bitmap;
    }
}
At this point, running the project shows the green-and-white checkerboard sphere.
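For completeness, here is a minimal sketch (our own, not from the original article) of an Activity hosting the SphereSurfaceView; the class name is a placeholder:

// Hypothetical host Activity; only SphereSurfaceView comes from the article.
public class SphereActivity extends Activity {

    private SphereSurfaceView mSurfaceView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        mSurfaceView = new SphereSurfaceView(this);
        setContentView(mSurfaceView);
    }

    @Override
    protected void onResume() {
        super.onResume();
        mSurfaceView.onResume();  // resume the GL thread
    }

    @Override
    protected void onPause() {
        super.onPause();
        mSurfaceView.onPause();   // pause the GL thread
    }
}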