Clayton Wood

The Nuts and Bolts of Navigation Data Rendering in MAPS.ME

Hi, everybody! Navigation in the MAPS.ME app is one of the main features that we try to emphasize. You may remember that we recently told you all about navigation on foot. Well, today I would like to tell you about how we display navigation data in MAPS.ME.

When I talk about navigation data, I mean particular route lines, arrows to indicate maneuvers, as well as the user’s location on the map. This post isn’t going to get into the algorithms of building routes for OSM data or algorithms for highlighting maneuvers. What we’re talking about in this post is rendering. 

Let’s start the conversation by first determining what the requirements are for displaying the navigation data I talked about above.

  1. The thickness of the route line must vary depending on the scale of the map. Obviously, when zoomed out the line is going to look pretty thin in order to avoid covering up important landmarks. Meanwhile, it has to be thick enough to be seen. When zoomed in, the route line should ideally not be wider than the roads.

  2. The section of the route that has been already traveled should no longer be shown.

  3. Relevant maneuvers should be displayed on the route. The maneuvers we currently support are limited to turns, and they must be displayed as arrows indicating the direction of the relevant turn.

  4. The route line’s color must be arbitrary (and semi-transparent). The color is presently the same throughout the entire route.

  5. The user’s position is displayed as an arrow pointing in the direction of motion.

  6. The end of the route is indicated using a special icon.

Now let’s have a look at what we have available to display all of that. From the route building system, we get the following:

  • A polyline indicating the central axis of the route line.

  • Points on the polyline, which define the locations of the maneuvers (turns).

  • A point on the polyline, which defines the user’s location.

On top of that, we also have:

  • A color for the route line in RGBA format.

  • Texture from the map skin.

The points on the polyline mentioned above are logical: they are not actual vertices of the polyline, but are defined by their distance from the beginning of the line. For example, 0.0 denotes the first point of the polyline, and a value equal to the length of the line denotes the final point.
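As an illustration, such a logical point can be resolved to an actual map point by walking along the polyline. This is a minimal Python sketch, not MAPS.ME code; the function name and the tuple representation of points are my own:

```python
import math

def point_at_distance(polyline, d):
    """Return the point lying at distance d from the start of the polyline.

    polyline: list of (x, y) tuples; d: distance along the line.
    Values are clamped to the ends, matching the convention in the text
    (0.0 -> first point, total length -> last point).
    """
    if d <= 0.0:
        return polyline[0]
    for (x0, y0), (x1, y1) in zip(polyline, polyline[1:]):
        seg = math.hypot(x1 - x0, y1 - y0)
        if d <= seg:
            t = d / seg  # fraction of the current segment covered
            return (x0 + t * (x1 - x0), y0 + t * (y1 - y0))
        d -= seg  # consume this segment and move to the next one
    return polyline[-1]
```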

Now that we’ve gone over all of the initial conditions, let’s get down to the nuts and bolts of the rendering process.

Constructing the Geometry of the Route Line

We define the geometry of the route line as a list of triangles. Since one of our requirements calls for a variable width, and reconstructing the geometry in every single frame would be too expensive, the width of the line is declared as a uniform variable.

All of the vertices are grouped along the central axis. For each vertex we save a vector, and it is in the direction of this vector that we will shift the vertex in a vertex shader (we will call this vector the normal). As a result, we get something that looks like the figure below.

You can see in the figure that two pairs of normals share point 3. Normals of the same color are parallel to each other and point in opposite directions, so if one of them is known, the other can be obtained by negating it. In order to maintain two non-parallel normals going in different directions from a single point, the point must be duplicated, which happens naturally when constructing the polygons.

You can also see that at the point where the line segments meet, the defined polygons will overlap on one side and leave a gap on the other side. At first, it looks like the overlapping polygons don’t pose any problem. That is true as long as the route line is completely opaque. If the line is semi-transparent, then the polygons will be darker where they overlap. Consequently, we need to get rid of them. Taking care of the gap is much easier and we’ll insert a polygon to do that.

In order to eliminate overlapping polygons, we modify the normals so that the constructed polygons no longer overlap: we calculate the average of the two neighboring normals and shift both duplicated points along this new vector.
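The averaging can be sketched in Python as follows. The name `miter_normal` and the 2D-tuple representation are my own; the real code runs in the rendering pipeline, but the math is the same:

```python
import math

def miter_normal(n1, n2):
    """Given the unit normals of two adjacent segments, return the shared
    displacement vector for the duplicated junction vertex.

    The direction is the average of the two normals; the length is scaled
    by 1 / cos(half-angle) so that the shifted vertex still reaches the
    outer edge of both segments. For sharp angles the result is therefore
    no longer a unit vector, as noted in the text.
    """
    ax, ay = n1[0] + n2[0], n1[1] + n2[1]
    length = math.hypot(ax, ay)
    ux, uy = ax / length, ay / length          # averaged unit direction
    scale = 1.0 / (ux * n1[0] + uy * n1[1])    # 1 / cos(theta / 2)
    return (ux * scale, uy * scale)
```

For perpendicular segments (normals `(0, 1)` and `(1, 0)`), the result is `(1, 1)`: longer than a unit vector, which is exactly the subtlety discussed below.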

This approach introduces just one subtlety: the normal will no longer be unit length, since at acute angles the vertex may need to be displaced significantly. As was already mentioned, we eliminate the breaks in the route line with the help of polygon insertions. We have three ways of performing these insertions.

In the first case, polygon insertion will be performed by triangulating the sector of a circle. In the second case, an acute angle is formed, and in the third case, exactly one triangle is formed.

The geometry construction algorithm evaluates the size of the break, the degree to which the normals separate, and selects the appropriate polygon insertion method. Two round end caps are added at the ends of the route line. They’re created using the first polygon insertion method. The result is the unbroken geometry of the route line, which we can draw in any color we want, including semi-transparent.
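A rough sketch of how such a selection might look. The thresholds and the mapping of angles to methods here are hypothetical, chosen purely for illustration; the actual MAPS.ME heuristics may differ:

```python
import math

# Hypothetical thresholds -- not the values MAPS.ME uses.
ROUND_JOIN_ANGLE = math.radians(25)  # wide breaks: triangulated circle sector
BEVEL_JOIN_ANGLE = math.radians(5)   # tiny breaks: exactly one triangle

def pick_join(n1, n2):
    """Pick a polygon-insertion method from the angle between the unit
    normals of two adjacent segments (a proxy for the size of the break)."""
    dot = n1[0] * n2[0] + n1[1] * n2[1]
    angle = math.acos(max(-1.0, min(1.0, dot)))  # clamp against fp noise
    if angle > ROUND_JOIN_ANGLE:
        return "round"   # method 1: sector-of-a-circle triangulation
    if angle > BEVEL_JOIN_ANGLE:
        return "miter"   # method 2
    return "bevel"       # method 3: a single triangle
```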

Displaying the Traveled Section of the Route

To cut off the section of the route already traveled, the most obvious solution is to regenerate the geometry as the user progresses along the route. However, this method takes a lot of computational resources. The issue here is not the difficulty of generating the geometry but rather the cost of sending new vertex and index buffers to the GPU.

To avoid regenerating the geometry on each GPS-coordinate update, we’ve come up with the following approach: in each vertex of the generated geometry, we store a value equal to the distance from the beginning of the route line to the projection of that vertex onto the route line.
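These per-vertex values can be precomputed on the CPU in a single pass over the polyline. A minimal sketch, assuming (for illustration) that each generated vertex simply inherits the value of the polyline point it was shifted from:

```python
import math

def distances_from_start(polyline):
    """Cumulative arc length at every point of the polyline.

    Triangle vertices produced by shifting a polyline point along its
    normal project back onto that point, so they inherit its value; the
    GPU then interpolates these values across each triangle per fragment.
    """
    dists = [0.0]
    for (x0, y0), (x1, y1) in zip(polyline, polyline[1:]):
        dists.append(dists[-1] + math.hypot(x1 - x0, y1 - y0))
    return dists
```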

By interpolating these values between the vertices, we can determine, for each fragment, the distance along the route line from its beginning to the given fragment (DISTANCE_FROM_START). Then, with the help of the following simple shader pseudocode, part of the route line is cut off.

vec4 mainFragment()
{
  vec4 color = ROUTE_LINE_COLOR;
  if (DISTANCE_FROM_START < CURRENT_CLIP_DISTANCE)
    color.a = 0.0;
  return color;
}


ROUTE_LINE_COLOR indicates the color of the route line, and CURRENT_CLIP_DISTANCE is the portion of the route that has been traveled so far.
This approach has just one minor flaw: all of the inserted polygon’s vertices will project to the same point.

In the figure above, points 1, 2, and 3 belong to the 2nd type of polygon insertion and are all projected onto the central axis of the route line with the same value of 7.4.

Technically, this will form an uneven cut in the line at the border of the two segments. In reality, however, this behavior isn’t a problem for us, since in the vast majority of cases the cut (even or uneven) will be under the icon representing the user’s location.

Displaying Maneuvers

As I’ve already stated, our maneuvers are currently limited to arrows indicating turns. The main problem here is that at different scales several arrows may have to be combined into one so that they don’t become too small and difficult to discern.

It goes without saying that, once again, we would like to avoid having to regenerate geometry for each separate frame. For this purpose, we have implemented the following algorithm:

  1. The precaching stage. This stage loads mostly the CPU. The line coming from the route building system is broken up into segments. We originally used a single indivisible line, but we stumbled into a serious problem when building long routes.

     When interpolating lengths across large distances, we were limited by the precision of floating-point numbers. We use Mercator coordinates in the app, which requires a large number of decimal places. Furthermore, the precision of floating-point numbers in OpenGL ES shaders doesn’t measure up to the precision of floating-point numbers in CPU code (2^-16 in highp versus 2^-24). As a result, we lost precision during floating-point arithmetic in the shaders, which led to display artifacts. All of our attempts to achieve the necessary precision significantly complicated the fragment shader. Additionally, they required the full length of the route line to always be known, which didn’t work in this case. Thus, we made the decision to break the route line into segments whose length doesn’t exceed a certain distance in the Mercator coordinate system. The lengths are now interpolated independently for each segment, which got rid of the precision problems. In order to avoid gaps at the junctions of these segments, the locations of the cuts were selected in a special way: the line was broken at points closest to the desired locations that were, at the same time, far enough away that a maneuver arrow couldn’t reach the break point. This stage outputs a collection of vertex and index buffers with the triangular geometry of the route line.

  2. Next comes the stage of updating the geometric data. The code for this stage runs every frame, immediately before rendering. The first task here is to determine which route line segments are visible on the screen. Then the maneuver arrows are projected onto each visible segment. We do this because we draw the arrows with the same geometry we used for the route line itself, so we need to transfer the coordinates of each arrow’s start and end to the shader. In order to calculate this point set, we take the following steps:

    1. We project the arrows onto the route line segment as they are (the original point set comes from the route building system).

    2. We combine overlapping arrows into one (the arrows have certain limits on their length, both in Mercator coordinates and in pixels, so after these limits are enforced neighboring arrows may overlap).

    3. We displace the arrow’s head in such a way that it lies on a straight part of the route (this is necessary to avoid distortions on certain displayed textures).

In this stage, we determine the specific geometry to draw in the current frame, and we also calculate the point array with the coordinates of the start and end of each maneuver arrow.

  3. Finally, we have the rendering stage. Here we are, of course, loading the GPU. We use the geometry of the route line for rendering (the necessary vertex and index buffers were prepared in the previous stages). We pass the point array of the arrow borders and the texture with the image of the arrow into the fragment shader. Each element of the borders array contains the one-dimensional coordinates of the beginning and end of an arrow, where a coordinate is the distance from the beginning of the current route line segment. In the fragment shader, texture coordinates are calculated from these values so that the texture is laid onto the geometry of the route line and the arrows end up in the right places. For the arrows to be wider than the route line, a different line width is set for them (remember that we pass it as a uniform variable). Furthermore, the texture of the arrow is divided into three parts: the tail, the head, and the center. The center can stretch along the route without any distortion, while the head and the tail always maintain their proportions. The pseudocode of the fragment shader for drawing the arrows is shown below.

struct Arrow
{
  float start;
  float end;
};

Arrow arrows[MAX_ARROWS_COUNT];

sampler2D arrowTexture;

vec4 mainFragment(float distanceFromStart, float v)
{
  for (int i = 0; i < MAX_ARROWS_COUNT; i++)
  {
    if (distanceFromStart >= arrows[i].start && distanceFromStart < arrows[i].end)
    {
      float u = (distanceFromStart - arrows[i].start) / (arrows[i].end - arrows[i].start);
      return texture(arrowTexture, vec2(u, v));
    }
  }
  return vec4(0, 0, 0, 0);
}
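Step 2 of the update stage above, combining overlapping arrows into one, can be sketched as a classic interval merge. This is an illustrative Python version with an assumed representation of arrows as `(start, end)` pairs, not the actual MAPS.ME code:

```python
def merge_arrows(arrows):
    """Merge overlapping or touching (start, end) arrow ranges into one.

    arrows: list of (start, end) distances along the current route line
    segment. After minimum-length limits are enforced, neighboring arrows
    may overlap; each maximal overlapping group collapses to one range.
    """
    merged = []
    for start, end in sorted(arrows):
        if merged and start <= merged[-1][1]:
            # Overlaps (or touches) the previous range: extend it.
            merged[-1] = (merged[-1][0], max(merged[-1][1], end))
        else:
            merged.append((start, end))
    return merged
```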

Today we have examined the process of rendering navigation data in MAPS.ME. We do not by any means claim that our solutions are universal, but I nevertheless hope you’ve picked up some useful information for yourself and the projects you are working on.
