For our texturing, we had no choice but to use tri-planar texture mapping, since we generate an actual planet and the terrain can be oriented in any direction. Combine that with the fact that the terrain is diggable, and we had to make the texture adapt to any angle. Triplanar mapping was the perfect solution.
Doing normal mapping on top of triplanar mapping may seem hard at first, but it's only a little more work than triplanar texture mapping itself.
To obtain the final fragment color with triplanar mapping, you basically sample the same texture as though it were projected onto each of the three axis-aligned planes (see diagram on the right).
Once you have a sample from each of these planar projections, you combine the three samples based on the normal vector of the fragment. The normal vector essentially tells you how strongly the surface faces each plane. So if you have a mostly horizontal surface, its normal is close to vertical and you sample mostly from the horizontal (top-down) projection.
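To make that concrete, here is a minimal GLSL sketch of the color blending; the function, the diffuseMap sampler and the scale factor are illustrative placeholders, not our actual shader code:
vec3 triplanarColor(sampler2D diffuseMap, vec3 worldPos, vec3 worldNormal, float scale)
{
    // Sample the same texture projected along each of the three axis-aligned planes.
    vec3 colXY = texture2D(diffuseMap, worldPos.xy * scale).rgb; // projection along Z
    vec3 colXZ = texture2D(diffuseMap, worldPos.xz * scale).rgb; // projection along Y
    vec3 colYZ = texture2D(diffuseMap, worldPos.yz * scale).rgb; // projection along X
    // Weight each projection by how strongly the surface faces it,
    // then renormalize so the weights sum to 1.
    vec3 w = abs(worldNormal);
    w /= (w.x + w.y + w.z);
    return colXY * w.z + colXZ * w.y + colYZ * w.x;
}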
The same principle works for computing the normal from a normal map. Instead of sampling the color texture, you sample the normal map; the RGB value you get gives you the normal vector as seen in that plane's projection. You then combine these normals using the same weights you used to blend the color samples.
Basically you obtain three normal vectors, one per plane, each expressed in a coordinate system aligned with the texture as projected onto that plane.
In the picture on the right, red, green and blue are the axes of each projection of the texture, while dark purple is a sample normal vector. As you can imagine, the closer the fragment's normal is to a plane's normal, the more it samples from that plane. One difference from plain texture mapping is that when the fragment's normal is close to a plane's normal but faces the opposite direction, you have to flip that plane's normal-map result.
This is what the code for computing the normal of one texture from its three normal-map projections looks like in our terrain shader:
// Sample the normal map once per planar projection; coordXY, coordXZ and coordYZ
// are the texture coordinates on each plane, and index selects the texture in the array.
vec4 bump1 = texture2DArray(normalArray, vec3(coordXY.xy, index));
vec4 bump2 = texture2DArray(normalArray, vec3(coordXZ.xy, index));
vec4 bump3 = texture2DArray(normalArray, vec3(coordYZ.xy, index));
// Remap each sample's RGB channels onto the world-space axes of its projection.
vec3 bumpNormal1 = bump1.r * vec3(1, 0, 0) + bump1.g * vec3(0, 1, 0) + bump1.b * vec3(0, 0, 1);
vec3 bumpNormal2 = bump2.r * vec3(0, 0, 1) + bump2.g * vec3(1, 0, 0) + bump2.b * vec3(0, 1, 0);
vec3 bumpNormal3 = bump3.r * vec3(0, 1, 0) + bump3.g * vec3(0, 0, 1) + bump3.b * vec3(1, 0, 0);
// Blend the three normals with the same weights used for the color samples.
return vec3(weightXY * bumpNormal1 + weightXZ * bumpNormal2 + weightYZ * bumpNormal3);
Where weightXY, weightXZ and weightYZ are determined from the normal that has already been calculated at that fragment:
// Each weight is the fragment normal's component along that projection's axis;
// a negative component negates that projection's sampled normal (the back-facing case above).
weightXY = fNormal.z;
weightXZ = fNormal.y;
weightYZ = fNormal.x;
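For reference, a common way to build triplanar blend weights (not what the snippet above does, which uses the signed components directly) is to take the absolute value of the normal, optionally sharpen it, and renormalize it so the weights sum to one. A quick illustrative sketch:
vec3 blend = abs(fNormal);
blend = pow(blend, vec3(4.0));          // optional: tighten the transition zones between projections
blend /= (blend.x + blend.y + blend.z); // make the weights sum to 1
weightXY = blend.z;
weightXZ = blend.y;
weightYZ = blend.x;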
I realize it sounds a bit counter-intuitive that we need a normal before we can calculate the per-fragment normals, but this base normal can simply be obtained by other means, such as per-vertex normal calculations. (We obtain it from density difference calculations on the voxels.)
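If you're curious, a normal from voxel densities boils down to taking the gradient of the density field with central differences. Here is a rough sketch of that idea; the densityField sampler and the densityAt helper are hypothetical stand-ins for however you store and sample your voxels, and eps is the sampling step:
uniform sampler3D densityField;
float densityAt(vec3 p)
{
    return texture3D(densityField, p).r;
}
vec3 densityNormal(vec3 p, float eps)
{
    // Central differences give the gradient of the density field; it points toward
    // increasing density (into the solid), so negate it for an outward normal
    // (flip the sign if your density convention is reversed).
    vec3 grad = vec3(
        densityAt(p + vec3(eps, 0.0, 0.0)) - densityAt(p - vec3(eps, 0.0, 0.0)),
        densityAt(p + vec3(0.0, eps, 0.0)) - densityAt(p - vec3(0.0, eps, 0.0)),
        densityAt(p + vec3(0.0, 0.0, eps)) - densityAt(p - vec3(0.0, 0.0, eps)));
    return normalize(-grad);
}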
Finally, to get good results you need a genuinely good normal texture. We only had time to create one (neither of us is a graphics designer), so here's a video of the rock triplanar normal map, with a short day length on our planet: