In the two years since graduating I have been working at a map-related company. Although my background is not in GIS, I am familiar with maps. Recently, while learning WebGL, I used MapboxGL to build a demo of WebGL 3D data rendering.
First, let's take a look at MapboxGL's structure, from GLSL up to layers:
That is about all I will say about MapboxGL itself. What follows is the process of processing and rendering 3D data in WebGL and adding a satellite image texture (there is too much code to show in full, so only the key steps appear below):
Step 1: Fetch the data shards to render (in tile form)
// Serialize the tile address and compute the xyz coordinates of the data tile
let url = normalizeURL(
    tile.coord.url(this.tiles, null, this.scheme),
    this.url,
    this.tileSize
);
...
tile.request = ajax.getArrayBuffer(url, done.bind(this));
...
tile.pixelObj = pixelObj; // the processed data
...
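The pixelObj above holds the decoded elevation grid; the article omits the done callback that produces it. Since the draw function later on is called drawLerc3D, the tiles are presumably LERC-encoded, so decoding could look roughly like the sketch below (this uses Esri's lerc decoder; the helper name decodeDemTile and the exact wiring are my own assumptions, not code from the demo):

const Lerc = require('lerc'); // Esri's LERC decoder (npm package "lerc")

// Hypothetical helper: turn the fetched ArrayBuffer into the pixelObj used in Step 2
function decodeDemTile(arrayBuffer) {
    // Lerc.decode returns { width, height, pixels: [Float32Array], mask, ... }
    const pixelObj = Lerc.decode(arrayBuffer);
    return pixelObj; // pixelObj.pixels[0] is the elevation grid read by the painter
}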
Step 2: Once the data arrives, have the Painter draw the data tile in render:
const divisions = 257;
let vertexPositionData = new Float32Array(divisions * divisions * 3);
const pixels = pixelObj.pixels[0];
if (coord.vertexPositionData) {
    console.log('cache', 'coord');
    vertexPositionData = coord.vertexPositionData;
} else {
    console.time('vertex');
    // Full data volume
    for (let i = 0; i < divisions; ++i) {
        for (let j = 0; j < divisions; ++j) {
            const bufferLength = (i * divisions + j) * 3;
            let dem = parseInt(pixels[bufferLength / 3]);
            if (!dem || dem === 3) {
                // Give invalid data a default value (PS: the quality of this DEM elevation data is not high)
                dem = 1000;
            }
            vertexPositionData[bufferLength] = j * SCALE;
            vertexPositionData[bufferLength + 1] = i * SCALE * 1;
            vertexPositionData[bufferLength + 2] = dem;
        }
    }
    console.timeEnd('vertex');
    coord.vertexPositionData = vertexPositionData;
}
const indexData = getIndex(divisions);
const FSIZE = vertexPositionData.BYTES_PER_ELEMENT;

// Set vertex positions
const positionBuffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, positionBuffer);
gl.bufferData(gl.ARRAY_BUFFER, vertexPositionData, gl.STATIC_DRAW);
const aPosiLoc = gl.getAttribLocation(gl.program, "a_Position");
gl.vertexAttribPointer(aPosiLoc, 3, gl.FLOAT, false, FSIZE * 3, 0);
gl.enableVertexAttribArray(aPosiLoc);

// Set indices
const indexBuffer = gl.createBuffer();
gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, indexBuffer);
gl.bufferData(gl.ELEMENT_ARRAY_BUFFER, indexData, gl.STATIC_DRAW);
// https://stackoverflow.com/questions/28324162/webgl-element-array-buffers-not-working
gl.getExtension('OES_element_index_uint');
gl.drawElements(gl.TRIANGLES, indexData.length, gl.UNSIGNED_INT, 0);
...

// Index generation. WebGL can draw in two ways, drawElements or drawArrays; we use the former here
function getIndex(divisions) {
    if (drawLerc3D.indexData) {
        return drawLerc3D.indexData;
    }
    console.time('Get index');
    const indexData = [];
    // This is the full-resolution version
    // for (let row = 0; row < divisions - 1; ++row) {
    //     for (let i = 0; i < divisions; ++i) {
    //         const base = row * divisions + i;
    //         if (i < divisions - 1) {
    //             indexData.push(base);
    //             indexData.push(base + 1);
    //             indexData.push(base + divisions);
    //             indexData.push(base + 1);
    //             indexData.push(base + divisions);
    //             indexData.push(base + divisions + 1);
    //         }
    //     }
    // }
    // This uses half of the data (PS: an optimization that sacrifices some precision)
    for (let row = 0; row < divisions - 2; row += 2) {
        for (let i = 0; i < divisions; i += 2) {
            const base = row * divisions + i;
            if (i < divisions - 2) {
                indexData.push(base);
                indexData.push(base + 2);
                indexData.push(base + divisions * 2);
                indexData.push(base + 2);
                indexData.push(base + divisions * 2);
                indexData.push(base + divisions * 2 + 2);
            }
        }
    }
    console.timeEnd('Get index');
    drawLerc3D.indexData = new Uint32Array(indexData);
    return drawLerc3D.indexData;
}
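A side note on the OES_element_index_uint extension requested above: a 257 x 257 grid has 66,049 vertices, more than the 65,535 a 16-bit index can address, which is why the indices are stored in a Uint32Array and drawn with gl.UNSIGNED_INT. A quick standalone check of the numbers involved (illustrative only, not part of the demo):

const divisions = 257;
console.log(divisions * divisions); // 66049 vertices > 65535, so Uint16 indices would overflow

// The half-resolution loop above walks a 128 x 128 grid of quads, two triangles each
const quads = ((divisions - 1) / 2) * ((divisions - 1) / 2); // 16384
console.log(quads * 2 * 3); // 98304 indices per tile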
Step 3: Write the GLSL, mapping different elevations to different color values on the GPU during rendering
Vertex shader:
// Transformation matrix
uniform mat4 u_matrix;
// Vertex position (x, y, elevation)
attribute vec3 a_Position;
// Texture coordinates
attribute vec2 a_texCoord;
varying vec2 v_texCoord;
// Elevation data
varying float dem;

void main() {
    dem = a_Position.z;
    gl_Position = u_matrix * vec4(a_Position.x, a_Position.y, dem * 32.0, 1.0);
    v_texCoord = a_texCoord;
}
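The u_matrix used by this vertex shader has to be supplied from the JS side each frame; in a MapboxGL painter this is typically the tile's projection matrix (for example coord.posMatrix). A minimal sketch of passing it, assuming gl.program is the linked shader program as in the Step 2 code:

gl.useProgram(gl.program);
// Hand the tile's projection matrix to the vertex shader
const uMatrixLoc = gl.getUniformLocation(gl.program, 'u_matrix');
gl.uniformMatrix4fv(uMatrixLoc, false, coord.posMatrix);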
Fragment shader:
precision lowp float;
// uniform float u_brightness_low;
// uniform float u_brightness_high;
// Color (unused, kept commented out)
// varying vec3 v_Color;
// Elevation data
varying float dem;
// Texture
uniform sampler2D u_image;
varying vec2 v_texCoord;

// Select a different color according to the elevation
vec4 getColor() {
    const int COLORS_SIZE = 11;
    vec3 colors[COLORS_SIZE];
    // Normalize dem
    float n_dem = -2.0 * (dem / 6000.0 - 0.5);
    const float MINDEM = -1.0;
    const float MAXDEM = 1.0;
    const float STEP = (MAXDEM - MINDEM) / float(COLORS_SIZE - 1);
    int index = int(ceil((n_dem - MINDEM) / STEP));
    colors[10] = vec3(0.3686274509803922, 0.30980392156862746, 0.6352941176470588);
    colors[9] = vec3(0.19607843137254902, 0.5333333333333333, 0.7411764705882353);
    colors[8] = vec3(0.4, 0.7607843137254902, 0.6470588235294118);
    colors[7] = vec3(0.6705882352941176, 0.8666666666666667, 0.6431372549019608);
    colors[6] = vec3(0.9019607843137255, 0.9607843137254902, 0.596078431372549);
    colors[5] = vec3(1.0, 1.0, 0.7490196078431373);
    colors[4] = vec3(0.996078431372549, 0.8784313725490196, 0.5450980392156862);
    colors[3] = vec3(0.9921568627450981, 0.6823529411764706, 0.3803921568627451);
    colors[2] = vec3(0.9568627450980393, 0.42745098039215684, 0.2627450980392157);
    colors[1] = vec3(0.8352941176470589, 0.24313725490196078, 0.30980392156862746);
    colors[0] = vec3(0.6196078431372549, 0.00392156862745098, 0.25882352941176473);
    if (index > 10) {
        return vec4(0.3, 0.3, 0.9, 0.5);
    }
    if (index < 0) {
        index = 0;
    }
    for (int i = 0; i < COLORS_SIZE; i++) {
        if (i == index) return vec4(colors[i], 1.0);
    }
}

void main() {
    // Render the DEM data with color; choose one of color or texture
    gl_FragColor = getColor();
    // Render with the texture (satellite imagery)
    gl_FragColor = texture2D(u_image, v_texCoord / 256.0 / 32.0);
}
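For the texture branch of the fragment shader, the satellite tile fetched from rasterUrl has to be uploaded to u_image and per-vertex coordinates have to be fed to a_texCoord. A minimal sketch of that setup, assuming image is the loaded satellite tile (an Image or ImageBitmap) and texCoordData is a Float32Array of per-vertex texture coordinates built alongside the position grid (both names are assumptions, not from the demo):

// Upload the satellite image to texture unit 0, which u_image samples from
const texture = gl.createTexture();
gl.activeTexture(gl.TEXTURE0);
gl.bindTexture(gl.TEXTURE_2D, texture);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);
gl.uniform1i(gl.getUniformLocation(gl.program, 'u_image'), 0);

// Per-vertex texture coordinates for a_texCoord
const texCoordBuffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, texCoordBuffer);
gl.bufferData(gl.ARRAY_BUFFER, texCoordData, gl.STATIC_DRAW);
const aTexCoordLoc = gl.getAttribLocation(gl.program, 'a_texCoord');
gl.vertexAttribPointer(aTexCoordLoc, 2, gl.FLOAT, false, 0, 0);
gl.enableVertexAttribArray(aTexCoordLoc);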
Finally: use our own Source and Layer definitions in MapboxGL
map.addSource('DEMImgSource', {
    // Elevation data
    "type": "DEM3D",
    "tiles": [
        'http://xxx.xxx.xxx.xxx/{x}/{y}/{z}',
    ],
    "tileSize": 512,
    // Google tile address used for the texture map
    "rasterUrl": 'http://www.google.cn/maps/vt?lyrs=s@189&gl=cn&x={x}&y={y}&z={z}',
    // Autonavi's
    // "rasterUrl": 'https://webst04.is.autonavi.com/appmaptile?style=6&x={x}&y={y}&z={z}'
});
map.addLayer({
    // Layer
    'id': 'DEMlayer',
    'type': 'DEM3D',
    'source': 'DEMImgSource'
});
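Note that stock MapboxGL has no built-in 'DEM3D' source or layer type, so the snippet above only works once custom Source and Layer classes are wired into the library (in this demo that means extending the mapbox-gl code itself; mapbox-gl also exposes an experimental map.addSourceType for custom sources). A rough, hypothetical skeleton of what such a source class might look like (names and fields are illustrative, not mapbox-gl's API):

// Hypothetical skeleton only; a real custom source implements a larger interface
// (loadTile, abortTile, unloadTile, serialize, event handling, ...)
class DEM3DSource {
    constructor(id, options) {
        this.id = id;
        this.type = 'DEM3D';
        this.tiles = options.tiles;          // elevation tile URL template
        this.tileSize = options.tileSize || 512;
        this.rasterUrl = options.rasterUrl;  // satellite tile URL template for the texture
    }
    loadTile(tile, callback) {
        // Step 1 from above lives here: build the tile URL, fetch the ArrayBuffer,
        // decode it into tile.pixelObj, then call back so the painter can draw it
    }
}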
Final rendering result (color rendering):
Texture rendering effect:
I have to say, the color rendering looks the more striking of the two ~
Some major companies have also been exploring WebGL and have made real progress in this direction: Autonavi Loca: lbs.amap.com/api/javascr…
Baidu Echarts: echarts.baidu.com/examples/in…
UBER: deck.gl/
As for the future of WebGL, I personally think it has great value in data visualization, high-precision maps (autonomous driving), and other areas.
This is my first article, so some things may not be explained clearly; criticism and feedback are welcome ~