Introduction: Why is the heatmap jittering?
deck.gl is Uber’s open-source geographic data rendering framework. When we used deck.gl to draw heatmaps, the heatmap layer visibly shook as the map was zoomed in, and the heat aggregation results were also wrong. The demo below shows the phenomenon: the yellow layer is the heatmap layer, and the black dots are the original data points. Clearly, as the map keeps zooming in, the points in the heatmap no longer line up with the original data points, and they jitter constantly.
Demo source code
The official demo didn’t have this problem, so what went wrong?
If there is any difference between the two demos, it is the data. The test demo uses indoor-map-level data, while the official demo uses city-level data. Indoor map coordinates only differ from one another in the fifth or sixth decimal place, whereas city-level coordinates differ in the first or second decimal place, so the jitter is most likely caused by loss of coordinate precision.
Following this conjecture, there are indeed many articles online about problems caused by WebGL precision loss; it appears to be a common issue. So how is it solved? Let’s start from the data and see how different geographic rendering frameworks handle it.
Background
Web Mercator projection
Because the Earth is round, displaying a map on a plane requires a projection. The Mercator projection, also known as the “conformal cylindrical projection”, preserves angles, which keeps local shapes undistorted and keeps directions and relative positions correct; the underlying theory is beyond the scope of this article. The Web Mercator projection further simplifies the Earth’s ellipsoid to a sphere. Its coordinate transformation formula (longitude $\lambda$ and latitude $\varphi$ in radians, up to a constant world-size scale factor) is:

$$x = \lambda, \qquad y = \ln\tan\left(\frac{\pi}{4} + \frac{\varphi}{2}\right)$$
As the formula shows, converting latitude to the Y coordinate is nonlinear: it requires trigonometric and logarithmic operations, and it must be evaluated for every coordinate in every rendered frame, which adds up to significant computational overhead.
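For reference, the following is a minimal JS sketch of this projection, normalized to a unit-square world (the function name and the normalization are illustrative choices, not any framework’s actual API):

```js
// Web Mercator forward projection on a unit sphere. Multiply the result by
// tileSize * 2^zoom to get pixel coordinates at a given zoom level.
function lngLatToWorld([lng, lat]) {
  const lambda = (lng * Math.PI) / 180;
  const phi = (lat * Math.PI) / 180;
  const x = (lambda + Math.PI) / (2 * Math.PI);
  const y = (Math.PI - Math.log(Math.tan(Math.PI / 4 + phi / 2))) / (2 * Math.PI);
  return [x, y];
}

lngLatToWorld([0, 0]); // [0.5, 0.5], the center of the world square
```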
Precision loss
All geographic rendering frameworks need to convert the longitude/latitude coordinates of features into screen pixel coordinates. As the map zooms in, the translation component of the lng/lat-to-pixel transformation matrix keeps growing; in other words, the pixel coordinate values become larger and larger. Since a JS number is a 64-bit double-precision float while GLSL only supports 32-bit single-precision floats, precision loss is inevitable when data is passed from JS into a shader. If it is not dealt with, features visibly shake once you zoom in far enough and pan the map.
Math.fround rounds a 64-bit number to the nearest 32-bit float:
```js
Math.fround(-122.4000588); // -122.40006256103516
```
Take the coordinate [-122.4000588, 37.7900699]: rounded to 32-bit floats it becomes [-122.40006256103516, 37.790069580078125], and in the real world those two points are about 0.3325 meters apart.
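A quick sketch verifying that figure: round both components to 32 bits, then measure the ground distance between the original and rounded points with the haversine formula (a standard approximation added here for illustration, not part of the original demo):

```js
const lng = -122.4000588;
const lat = 37.7900699;

// Great-circle distance between two lng/lat points, in meters
function haversineMeters(lng1, lat1, lng2, lat2) {
  const R = 6371008.8; // mean Earth radius, meters
  const rad = Math.PI / 180;
  const dLat = (lat2 - lat1) * rad;
  const dLng = (lng2 - lng1) * rad;
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(lat1 * rad) * Math.cos(lat2 * rad) * Math.sin(dLng / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}

haversineMeters(lng, lat, Math.fround(lng), Math.fround(lat)); // ≈ 0.33 m
```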
So how do different geographic rendering frameworks deal with the heavy computation and the precision loss?
Mapbox’s approach
Mapbox uses a tile coordinate system. Geographic features are usually static, so the map can be pre-cut into tiles, each containing the actual feature data. Whenever the camera changes, only the tiles inside the viewport need to be rendered. The tile coordinate system itself is easy to understand; below is the tile coordinate system at zoom level z = 2:
All of Mapbox’s processing happens in a planar coordinate system: feature lng/lat coordinates are projected onto the plane via the Mercator projection, and on every render the offset matrix of each tile relative to the center is recomputed, transforming the data into the tile coordinate system:
```js
function pixelsToTileUnits(tile: {tileID: OverscaledTileID, tileSize: number}, pixelValue: number, z: number): number {
  return pixelValue * (EXTENT / (tile.tileSize * Math.pow(2, z - tile.tileID.overscaledZ)));
}

const translation = [
  inViewportPixelUnitsUnits ? translate[0] : pixelsToTileUnits(tile, translate[0], this.transform.zoom),
  inViewportPixelUnitsUnits ? translate[1] : pixelsToTileUnits(tile, translate[1], this.transform.zoom),
  0
];

const translatedMatrix = new Float32Array(16);
mat4.translate(translatedMatrix, matrix, translation);
```
To reduce the amount of data, Mapbox simplifies geometries and clips features against tile boundaries, then determines which tiles fall inside the current viewport, and finally renders the features tile by tile. See the Mapbox article in the references for the details of this process.
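The tile-cutting step can be reproduced with geojson-vt, the library behind the second article in the references. A sketch with illustrative option values:

```js
import geojsonvt from 'geojson-vt';

const geojsonData = {type: 'FeatureCollection', features: [/* your features */]};

const tileIndex = geojsonvt(geojsonData, {
  maxZoom: 14,   // highest zoom level to cut tiles for
  tolerance: 3,  // simplification tolerance (higher = simpler geometry)
  extent: 4096,  // coordinate range within each tile
  buffer: 64     // clipping buffer around each tile
});

// Geometry comes back already converted to relative tile coordinates [0, extent]
const tile = tileIndex.getTile(14, 2620, 6333);
```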
At the rendering stage, the coordinates passed into the shader for each tile’s features are neither lng/lat nor absolute Mercator coordinates, but coordinates relative to the current tile:
```js
// Mercator coordinates -> relative tile coordinates [0, 8192]
function transformPoint(x, y, extent, z2, tx, ty) {
  return [
    Math.round(extent * (x * z2 - tx)),
    Math.round(extent * (y * z2 - ty))
  ];
}
```
Because coordinates are relative to a tile, GLSL’s 32-bit precision is sufficient and the precision problem disappears. The trade-off is that every time the camera’s projection matrix changes, the projection matrix of each tile must be recomputed as well.
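A rough sketch of that per-tile matrix (not Mapbox’s actual implementation; gl-matrix does the math): it maps relative tile coordinates [0, EXTENT] back into world space before applying the camera matrix.

```js
import { mat4 } from 'gl-matrix';

const EXTENT = 8192;

function calculateTileMatrix(cameraMatrix, tile, worldSize) {
  const tileScale = worldSize / Math.pow(2, tile.z); // tile size in world units
  const posMatrix = mat4.create();
  mat4.translate(posMatrix, posMatrix, [tile.x * tileScale, tile.y * tileScale, 0]);
  mat4.scale(posMatrix, posMatrix, [tileScale / EXTENT, tileScale / EXTENT, 1]);
  // final matrix = camera projection * tile-local transform
  return mat4.multiply(posMatrix, cameraMatrix, posMatrix);
}
```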
At high zoom levels (commonly 18, and up to 22 for indoor maps), tiles become very small, so tile cutting takes a long time. For a better user experience, Mapbox sorts the set of tiles to be rendered by their distance from the screen center, giving priority to the center tiles:
```js
// Screen center coordinates
const centerCoord = MercatorCoordinate.fromLngLat(this.center);
const centerPoint = new Point(numTiles * centerCoord.x - 0.5, numTiles * centerCoord.y - 0.5);
// Sort the tile array by distance from the screen center
tiles.sort((a, b) => centerPoint.dist(a.canonical) - centerPoint.dist(b.canonical));
```
In summary, Mapbox performs the Mercator projection on the CPU and uses tile-relative coordinates in its shaders, avoiding GLSL precision loss, while geometry simplification, tile clipping, and similar steps greatly reduce the data volume and keep the CPU workload under control.
deck.gl’s approach
deck.gl is positioned to handle millions of frequently changing data points, and performing the Mercator projection on the CPU would severely hurt performance. Instead, deck.gl passes lng/lat coordinates straight to the GPU and projects them in the vertex shader. This inevitably loses precision when JS hands data to GLSL. The error is imperceptible while the map covers a wide area, but at high zoom levels it causes visible offsets, namely the “jitter” phenomenon, and the error grows as the zoom level increases.
deck.gl measured the Y-axis pixel error at different zoom levels and latitudes:
Splitting data into high and low parts
To solve this problem, deck.gl v3 introduced a technique that emulates 64-bit double-precision floats in GLSL: each number is split into a high part and a low part, and both parts participate in the GPU arithmetic:
```js
highPart = Math.fround(x);
lowPart = x - highPart;
```
It then emulates 64-bit floating-point arithmetic by cascading 32-bit operations. The obvious trade-off is heavy GPU cost: a 64-bit division maps to 11 32-bit operations, and a 64-bit matrix-vector multiplication (mat4 * vec4) requires 1,952 32-bit operations.
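The building block of such cascades is the classic two-sum trick. A simplified JS sketch of the idea, using Math.fround to force every intermediate to 32 bits the way a GPU would (renormalization is omitted for brevity):

```js
// Split a 64-bit double into high and low 32-bit parts
function split64(x) {
  const hi = Math.fround(x);
  const lo = Math.fround(x - hi);
  return [hi, lo];
}

// 32-bit sum plus its exact rounding error (Knuth's two-sum)
function twoSum(a, b) {
  const s = Math.fround(a + b);
  const v = Math.fround(s - a);
  const err = Math.fround(Math.fround(a - Math.fround(s - v)) + Math.fround(b - v));
  return [s, err]; // s + err holds more precision than s alone
}

// Adding two emulated 64-bit numbers: combine highs exactly, fold in the lows
function add64([aHi, aLo], [bHi, bLo]) {
  const [s, e] = twoSum(aHi, bHi);
  return [s, Math.fround(e + aLo + bLo)];
}
```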
This scheme does fix the jitter caused by precision loss, but emulating 64-bit matrix math seriously hurts shader compilation and execution performance, and also increases the bandwidth needed to transfer data from CPU to GPU. Some weaker graphics drivers are not compatible at all, and even on those that are, compilation can take several seconds and cause visible stutter.
Offset coordinate system
To keep high precision while avoiding the heavy computation, deck.gl solved the problem from v6.2 onward with an “offset coordinates” scheme that uses the screen center as a dynamic coordinate origin.
The basic idea of the offset coordinate system is that subtracting two nearby coordinates cancels their identical high-order digits, and 32 bits are plenty to store the small difference. A fixed reference point is needed to compute the differences against, and the viewport center is chosen. The subtraction itself must happen in the shader: the viewport center’s lng/lat can change every frame, and recomputing the offsets on the CPU each frame would not be sustainable performance-wise.
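A quick numeric sketch of why this works (the center value here is illustrative):

```js
const center = -122.4;    // viewport center longitude (illustrative)
const lng = -122.4000588; // a vertex longitude

// Rounding the absolute coordinate loses ~0.33 m, as computed earlier
Math.fround(lng);          // -122.40006256103516
// Rounding the offset keeps the full mantissa for the small difference,
// leaving an error on the order of 1e-12 degrees, far below a millimeter
Math.fround(lng - center); // ≈ -0.0000588
```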
The following code chooses, depending on the current coordinate system, whether to process plain lng/lat or lng/lat offsets:
```glsl
// deck.gl/shaders/project.glsl
vec4 project_position(vec4 position) {
  // Handle lng/lat offsets
  if (project_uCoordinateSystem == COORDINATE_SYSTEM_LNGLAT_AUTO_OFFSET) {
    // The offset from the viewport center keeps only the low-order part
    // of the lng/lat coordinates, so 32 bits are sufficient
    float X = position.x - project_coordinate_origin.x;
    float Y = position.y - project_coordinate_origin.y;
    return project_offset_(vec4(X, Y, position.z, position.w));
  } else {
    // Normal path (abridged): lng/lat -> world coordinates
    return vec4(
      project_mercator(position.xy) * WORLD_SCALE * u_project_scale,
      project_scale(position.z),
      position.w
    );
  }
}
```
So how does deck.gl decide which path to take? It sets a zoom-level threshold and switches between the normal and offset coordinate systems. When the zoom level exceeds 12, the viewport center’s lng/lat coordinates are computed and used as the origin:
```js
const LNGLAT_AUTO_OFFSET_ZOOM_THRESHOLD = 12;

if (coordinateZoom < LNGLAT_AUTO_OFFSET_ZOOM_THRESHOLD) {
  // Normal coordinate system (handling omitted)
} else {
  // Offset coordinate system: pass the viewport center lng/lat as the origin
  const lng = Math.fround(viewport.longitude);
  const lat = Math.fround(viewport.latitude);
  shaderCoordinateOrigin = [lng, lat];
}
```
Thus, in the vertex shader, the final pixel-space position is computed in two parts: a matrix transform of the offset in the world coordinate system, plus the projection of the viewport center:
```glsl
// Convert the offset (or plain lng/lat) into the world coordinate system
vec4 project_pos = project_position(vec4(a_pos, 0.0, 1.0));
gl_Position = u_mvp_matrix * project_pos + u_viewport_center_projection;
```
The projection of the viewport center is computed on the CPU once per render frame, while the matrix math for the offset part runs in the shader, so each frame only requires a handful of uniform updates; the cost is close to zero on both the CPU and the GPU.
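A sketch of that per-frame CPU work (the function, the uniform names, and the projectToClipSpace helper are illustrative stand-ins, not deck.gl’s exact internals):

```js
function getProjectionUniforms(viewport) {
  const origin = [Math.fround(viewport.longitude), Math.fround(viewport.latitude)];
  return {
    // dynamic coordinate origin, subtracted per vertex in the shader
    project_coordinate_origin: origin,
    // projection of the viewport center, added back onto gl_Position above
    // (projectToClipSpace stands in for the full lng/lat -> clip-space pipeline)
    u_viewport_center_projection: projectToClipSpace(viewport, origin),
    u_mvp_matrix: viewport.viewProjectionMatrix
  };
}
```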
As shown below, the new hybrid coordinate system (yellow) is as accurate as the 64-bit mode (red) despite using only 32 bits, while the original 32-bit mode (blue) wobbles at the same zoom level.
Computing the offset: Taylor expansion
One detail in this computation deserves attention: how is the offset in the Mercator (world) coordinate system estimated from the offset in lng/lat? In the offset coordinate system scenario, $(\lambda_0, \varphi_0)$ is the dynamic screen center, every other point is expressed as its difference $(\Delta\lambda, \Delta\varphi)$ from the center, and $f$ is the conversion function from lng/lat to world coordinates. A Taylor expansion uses the value of $f$ and its derivatives at $(\lambda_0, \varphi_0)$ to estimate the value of $f$ at any nearby point; the more terms of the expansion are kept, the smaller the approximation error.
Recall the Web Mercator projection formula:

$$x = \lambda, \qquad y = \ln\tan\left(\frac{\pi}{4} + \frac{\varphi}{2}\right)$$
Since the X axis is linear in longitude, a linear estimate suffices:

$$\Delta x = x'(\lambda_0)\,\Delta\lambda$$

where $x'(\lambda_0)$ is a constant (pixels per degree of longitude).
The Y axis is nonlinear, so a second-order Taylor expansion is used:

$$\Delta y \approx y'(\varphi_0)\,\Delta\varphi + \frac{1}{2}\,y''(\varphi_0)\,\Delta\varphi^2, \qquad y'(\varphi) = \sec\varphi, \quad y''(\varphi) = \sec\varphi\tan\varphi$$
These formulas map directly onto vector operations in GLSL: `u_pixels_per_degree` corresponds to $\big(x'(\lambda_0),\, y'(\varphi_0)\big)$ and `u_pixels_per_degree2` to $\frac{1}{2}\,y''(\varphi_0)$, both obtained on the CPU by differentiating the Web Mercator formula at the viewport center.
```glsl
// offset: [delta lng, delta lat]
vec4 project_offset(vec4 offset) {
  float dy = offset.y;
  dy = clamp(dy, -1., 1.);
  vec3 pixels_per_unit = u_pixels_per_degree + u_pixels_per_degree2 * dy;
  return vec4(offset.xyz * pixels_per_unit, offset.w);
}
```
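For completeness, a CPU-side sketch of how those two uniforms can be derived from the derivatives above (the function name and the worldScale parameter are assumptions; worldScale is the world width in units, i.e. tileSize * 2^zoom):

```js
function getPixelsPerDegree(latitude, worldScale) {
  const radPerDegree = Math.PI / 180;
  const phi = latitude * radPerDegree;
  const unitsPerDegreeX = worldScale / 360;                // x'(lambda), constant
  const unitsPerDegreeY = unitsPerDegreeX / Math.cos(phi); // y'(phi) = sec(phi)
  // 0.5 * y''(phi) = 0.5 * sec(phi) * tan(phi), converted to per-degree units
  const unitsPerDegreeY2 =
    (unitsPerDegreeX * radPerDegree * Math.tan(phi)) / Math.cos(phi) / 2;
  return {
    u_pixels_per_degree: [unitsPerDegreeX, unitsPerDegreeY, 0],
    u_pixels_per_degree2: [0, unitsPerDegreeY2, 0]
  };
}
```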
Conclusion
Back to the question at the start of this article: a look at the heatmap source code reveals the cause. deck.gl’s heatmap module did not convert coordinates before passing them into the shader, so their precision was lost. As the previous sections show, deck.gl has used different precision strategies across versions, so the heatmap module was likely left without test coverage during a strategy migration. I filed an issue about it, and it was fixed quickly.
With the fixed version, the problem is gone:
Demo source code
To sum up, there are several ways to solve the jitter problem when rendering geographic information in WebGL at high zoom levels:
- Tile-relative coordinates solve the precision problem effectively, but tile cutting grows slower as the zoom increases, and non-tiled data must first be converted into the corresponding tiles.
- Splitting each number into high and low parts, turning one Float64Array into two Float32Arrays, solves the precision problem at the cost of significantly higher memory overhead and GPU computation.
- An offset coordinate system: subtracting two nearby coordinates cancels the high-order digits, so 32 bits are enough to store the difference with ample precision.
Besides data jitter, WebGL precision loss also causes other symptoms such as Z-fighting and depth-buffer precision loss, which are beyond the scope of this article; plenty of material on them is available online.
References
- How (sometimes) assuming the Earth is “flat” helps speed up rendering in deck.gl
- Rendering big geodata on the fly with GeoJSON-VT
Author: ES2049 | timeless
This article may be freely reproduced; please keep the link to the original text.
You are welcome to join ES2049 Studio. Please send your resume to [email protected].