Procedural planet generator
Made by Magnus Bredeli
Procedural content generation is used to automatically generate large amounts of unique and exciting game content instead of manually designing everything! It is used to generate landscapes, mountains, trees, and creatures, to mention just a few. I have always been fascinated by space and the vast variety of different planets, and after playing No Man's Sky, I was inspired to create a procedural planet generator. The aim of this blog post is to go through how to use noise algorithms and custom shaders to generate planets with different terrain types and cool-looking textures.
Table of Contents
- Introduction
- Scope
- Mesh Generation
- Sphere type
- Applying Noise
- Frequency
- Octaves
- Tweaking the parameters
- Shader
- Using color
- Triplanar mapping
- Smooth transitions
- Normal mapping
- Result
- Future development
- Moon
- Ocean depth
- GUI
- Sources
1. Introduction
1.1 Scope
Even though there are plenty of exciting features to add to this project, like a solar system, gravity, an LOD system, etc., I had to limit the scope to be able to finish in time. Therefore, the end result of this project will be a procedurally generated planet which is either an earth planet, a moon planet, or a desert planet. Every planet will have unique procedurally generated terrain for each seed as a result of applying noise to the mesh.
2. Mesh Generation
Now, where do we start when creating a planet? Before we can start adding textures and custom shaders to make the planet look realistic, we have to generate the mesh. The first part is to create the sphere, and then apply noise to create the terrain! However, first of all, we need to decide what kind of sphere will give us the desired result.

2.1 Sphere type
One may think that a sphere is a sphere and that's it. However, when generating a sphere, there are several different ways to distribute and order the vertices and create the triangles needed to generate the mesh. Therefore, when choosing which sphere type is best suited for our purpose, we have to take construction quality, memory usage, and rendering cost into consideration.

First off, we have the UV sphere, which is the standard sphere we can find in Unity. This sphere contains triangles that are bigger around the equator and smaller close to the poles. This would create an uneven distribution of detail in our planet, and it is therefore not an ideal sphere type to use.
Next, we have the normalized cube, which is originally a cube, but the vertex positions are normalized and multiplied by the sphere radius. This leads to all of the vertices having the same distance from the center, resulting in a sphere. However, the triangles closer to the center of each cube face are larger than the ones close to the edges. To fix this and achieve more uniform divisions in the sphere, we can use the spherified cube.
The spherified cube is also originally a cube, but as mentioned, it does a better job of producing uniform divisions in the sphere. This is achieved with some mathematical adjustments which I will not explain here, but if you are interested, Catlike Coding explains it very well in this article: Cube Sphere (Catlike Coding).
At last, we have the icosahedron. The advantage of this sphere type is that each triangle is the same size and each vertex is the same distance from all of its neighbors. Even though this would give us an even distribution of detail, I still chose the spherified cube. The reason is that the vertex count of the icosahedron grows in large jumps when scaling up the level of detail, whereas the resolution of the spherified cube can be increased in smaller steps without producing too many vertices. Here is the generated spherified cube mesh at different resolutions.
Resolution 10 | Resolution 50 | Resolution 100
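To make the difference between the two cube-based spheres concrete, here is a minimal sketch of both mappings. The class and method names are mine, and the spherified formula is the standard cube-to-sphere mapping derived in the Catlike Coding article; the project's own code may be organized differently.

using UnityEngine;

public static class CubeSphereMapping
{
    // Normalized cube: push each cube vertex out to the same distance from
    // the center. Simple, but triangles near the face centers end up larger
    // than the ones near the edges.
    public static Vector3 NormalizedCube(Vector3 pointOnUnitCube, float radius)
    {
        return pointOnUnitCube.normalized * radius;
    }

    // Spherified cube: the mapping from the Catlike Coding "Cube Sphere"
    // article, which distributes the vertices more evenly over the sphere.
    public static Vector3 SpherifiedCube(Vector3 p, float radius)
    {
        float x2 = p.x * p.x, y2 = p.y * p.y, z2 = p.z * p.z;
        Vector3 s;
        s.x = p.x * Mathf.Sqrt(1f - y2 / 2f - z2 / 2f + y2 * z2 / 3f);
        s.y = p.y * Mathf.Sqrt(1f - x2 / 2f - z2 / 2f + x2 * z2 / 3f);
        s.z = p.z * Mathf.Sqrt(1f - x2 / 2f - y2 / 2f + x2 * y2 / 3f);
        return s * radius;
    }
}

Both methods take a point on the unit cube; only the spherified version compensates for the cube's geometry before scaling by the radius.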
2.2 Applying noise
Now that we have generated the sphere mesh, the next step is to generate the procedural terrain. This means adjusting the height of each vertex so that together they create natural-looking terrain. A small thing to keep in mind is that since we are working with a sphere and not a flat landscape, defining the height is a little more complicated. For a flat landscape we can simply use one of the axes as the height, but for a sphere we cannot. The solution is to define the height as the distance from the sphere's center. We can then normalize each vertex and multiply it by the radius and the noise to decide how far away from the center the vertex needs to be. Here is the difference in code and results:
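Roughly, the difference looks like this (a minimal sketch; the method names are illustrative and the noise value is assumed to have been computed already for the vertex):

using UnityEngine;

public static class PlanetHeight
{
    // Flat landscape: the height can simply be written into one axis.
    public static Vector3 FlatVertex(float x, float z, float noiseValue)
    {
        return new Vector3(x, noiseValue, z);
    }

    // Sphere: the height is the distance from the planet center, so we push
    // the vertex outwards along its own direction instead of a fixed axis.
    public static Vector3 SphereVertex(Vector3 pointOnUnitCube, float radius, float noiseValue)
    {
        Vector3 pointOnUnitSphere = pointOnUnitCube.normalized;
        return pointOnUnitSphere * radius * (1f + noiseValue);
    }
}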




As you can see, the red blob in the last picture is going to be our planet, but so far it does not look anywhere near a planet with natural terrain. Let's change that, and we will start by creating the earth and desert planets using knowledge about noise, frequency, roughness, octaves, amplitude, etc.
2.2.1 Frequency
In the last picture, we used only the noise value returned from a noise function. This function takes in the vertex position and calculates a noise value in such a way that it is coherent with the noise values of the other vertex positions, which is why we are getting smooth hills and valleys. The noise function is part of the libnoise-dotnet library, which contains several different noise functions depending on what kind of noise you want: https://github.com/tbayart/libnoise-dotnet. For comparison, we can see what would happen if we used a random function instead of the noise function.

Clearly, we can see that the vertices next to each other have no coherent relationship; they are all living their own lives. Though it looks somewhat cool, this is not going to give us smooth and natural-looking terrain.
Let’s start with the frequency parameter. The frequency controls how many changes occur along a unit length. Increasing the frequency will increase the number of terrain features.
Frequency = 1 | Frequency = 2 | Frequency = 4
Increasing the frequency is simply done by multiplying the vertex position by the frequency before sending it into the noise function!
float v = noise.Evaluate(point * frequency);
As we can see, the terrain becomes very intense when increasing the frequency. To make it more natural-looking, we can add noise at different frequencies, introducing octaves.
2.2.2 Octaves
By adding noise at different frequencies we mean adding several layers of noise, where each successive layer has a smaller impact on the elevation. Here is how it is implemented in code:
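The loop looks roughly like this. This is a sketch only: noise.Evaluate is the same call used above, while the parameter names mirror the settings described below and may not match the project exactly.

// A sketch of the layered-noise evaluation; "noise" is the same noise object
// used earlier, and the parameters mirror the inspector settings in the text.
float EvaluateLayeredNoise(Vector3 point, int numberOfLayers,
                           float frequencyFactor, float persistence, float noiseStrength)
{
    float noiseValue = 0f;
    float frequency = 1f;  // frequency of the first layer
    float amplitude = 1f;  // impact of the first layer

    for (int i = 0; i < numberOfLayers; i++)
    {
        // Sample this layer/octave at its own frequency.
        float v = noise.Evaluate(point * frequency);

        // Evaluate returns a value in [-1, 1]; remap it to [0, 1] before
        // weighting it by the layer's amplitude and adding it to the total.
        noiseValue += (v + 1f) / 2f * amplitude;

        // Each successive layer: higher frequency, smaller impact.
        frequency *= frequencyFactor;  // noiseSettings.frequency, e.g. 2
        amplitude *= persistence;      // noiseSettings.persistence, e.g. 0.5
    }

    // noiseStrength controls how much the noise affects the sphere overall.
    return noiseValue * noiseStrength;
}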

First, we define the frequency and amplitude, where the amplitude is how much each layer should impact the noise value. We then have a for-loop that iterates once per octave/noise layer. In the for-loop, we first send in the vertex position and the layer's frequency to receive a noise value.
The next step is to add this layer's calculated noise value to the total noise value. However, first we need to multiply it by the amplitude, which decides how much the noise value should impact the total noise. In addition, the noise function returns a number between -1 and 1, but we want a number between 0 and 1. Therefore, before multiplying by the amplitude and adding it to the total noise value, we convert it to a number between 0 and 1 by adding 1 and dividing by 2.
Before moving on to the next iteration and calculating the next layer's noise value, we have to update the frequency and amplitude. How much they change depends on the values you set in the inspector. For example, noiseSettings.frequency = 2 and noiseSettings.persistence = 0.5 doubles the frequency and halves the impact of each successive layer.

At last, we return the total noise value multiplied by noiseSettings.noiseStrength, which controls how much the noise overall affects the vertices on the sphere. Here is the result:
2 layers | 4 layers | 6 layers
2.2.3 Tweaking the parameters
Now we are able to add details by increasing the frequency in the later layers, which do not affect the overall elevation that much! However, the mountains are still quite extreme, so let's also adjust the persistence, frequency, and strength. Persistence is what we multiply the amplitude by for each layer, in other words, how much more or less each successive layer matters.
p = 0.3, f = 2, s = 1 | p = 0.5, f = 4, s = 1 | p = 0.5, f = 2, s = 0.3
To create an ocean floor we can simply add a minimum value parameter that removes all noise features that are beneath the minimum value/height. This is done by adding a small check before we return the noise value: noiseValue = Mathf.Max(0, noiseValue - noiseSettings.minValue);
min = 0 | min = 0.9 | min = 1
By tweaking the different values like the number of layers, frequency, noise strength, amplitude, etc., we can finally generate terrain which looks a lot more natural than before! Here are two examples of what an earth planet and desert planet mesh could look like.
Earth mesh | Desert mesh
When creating the mesh for the moon, the only difference is that we also generate some moon craters. That is done by picking random vertices as the centers of the craters; then we loop through all the vertices and check whether they are within a crater's radius or not. If they are, a new height value is calculated based on the distance from the crater center. The code looks like this:
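As a rough sketch of that idea (the Crater struct and the simple parabolic bowl shape are my own simplification, not the project's exact ComputeCraterHeight code):

using UnityEngine;

public struct Crater
{
    public Vector3 center; // a randomly picked vertex on the unit sphere
    public float radius;   // crater radius, in unit-sphere units
    public float depth;    // how deep the bowl goes at the crater center
}

public static class CraterSketch
{
    // Returns a height offset for one vertex: 0 outside every crater,
    // a simple parabolic bowl inside a crater.
    public static float CraterOffset(Vector3 pointOnUnitSphere, Crater[] craters)
    {
        float offset = 0f;
        foreach (Crater crater in craters)
        {
            float distance = Vector3.Distance(pointOnUnitSphere, crater.center);
            if (distance < crater.radius)
            {
                // x goes from 0 at the crater center to 1 at the rim,
                // so (x * x - 1) goes from -1 (deepest) up to 0 (no change).
                float x = distance / crater.radius;
                offset += (x * x - 1f) * crater.depth;
            }
        }
        return offset;
    }
}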


There are still a lot more interesting parameters and noise patterns I didn't cover. Therefore, if you want to dive even deeper into the theory about noise and how all the parameters work together, Red Blob Games has great articles covering this: Making maps with noise functions and Noise Functions and Map Generation.
You may have noticed that when looking at the three meshes above, there is no clear way to tell whether the mesh is an earth, desert, or moon planet. That is where the shader comes into play!
3. Shader
A shader is a program that runs on the GPU and performs calculations that determine the color of the pixels on the screen or the position of each vertex. However, this project does not change the vertex positions in the shader; in other words, we are mostly interested in calculating the color of each pixel on the screen. In our shader we will apply textures with the help of triplanar mapping, add more details with normal mapping, and do some height calculations to determine the color of each pixel.

3.1 Using color
Before adding triplanar mapping and normal mapping, I wanted to add the functionality of calculating the vertex height. This is done by finding the distance between the vertex position and the planet center, in other words, the height. We also need the max and min vertex heights to be able to normalize the vertex height, where 0 is the bottom of the terrain and 1 is the top.

The SetVertex() method runs for every vertex. As you can see in the code, after the vertex has received its position on the planet from the noise function, we check if its height is higher than the current max or lower than the current min.
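Here is a minimal sketch of that bookkeeping, with illustrative names; the actual SetVertex() in the project takes its own parameters.

using UnityEngine;

public class VertexHeightTracker
{
    public float minHeight = float.MaxValue;
    public float maxHeight = float.MinValue;

    // Called for every vertex after the noise has pushed it to its
    // final position on the planet.
    public Vector3 SetVertex(Vector3 pointOnUnitSphere, float radius, float noiseValue)
    {
        Vector3 vertexPosition = pointOnUnitSphere * radius * (1f + noiseValue);

        // The "height" is simply the distance from the planet center.
        float height = vertexPosition.magnitude;
        if (height > maxHeight) maxHeight = height;
        if (height < minHeight) minHeight = height;

        return vertexPosition;
    }
}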

Then, in our shader, we can create a method that calculates a value between 0 and 1 which represents the vertex height in the terrain.
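That method essentially boils down to an inverse lerp between the minimum and maximum heights, something along these lines (the property names, and the assumption that the min/max values and planet center are passed in from the C# side, are mine):

// Assumed shader properties, set from C# — names are illustrative.
float _MinHeight;
float _MaxHeight;
float3 _PlanetCenter;

// Returns 0 at the lowest point of the terrain and 1 at the highest.
float GetHeightPercent(float3 worldPos)
{
    float height = distance(worldPos, _PlanetCenter);
    return saturate((height - _MinHeight) / (_MaxHeight - _MinHeight));
}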

At last, we use the calculated vertex height to determine the color of the pixel. In short, without explaining the calculations in depth, we lerp between the different colors we pick in the inspector and use the height when calculating the lerp value. Here is the result!
3.2 Triplanar mapping
Now that we are able to map the vertex height in the terrain, we can proceed to implement triplanar mapping to add textures and then use the vertex height to decide which texture to use.
Triplanar mapping is a texturing technique where the general idea is to project a texture onto the object three times, along the x-, y-, and z-axes. We then blend between these three samples based on the angle of the face, resulting in no stretched textures or hard seams. Another advantage is that we don't have to UV map our mesh!
The first step of implementing this in our shader is to find the UVs for each axis based on the world position of the fragment. The next step is then to do texture samples from our diffuse map with each of the 3 UV sets we’ve just made.

As you can see, we have only declared the yDiff, xDiff, and zDiff. This is because before doing texture samples from our diffuse map, we have to decide which diffuse map to use based on the vertex height! The code itself is too long to show here, but the pseudo-code will give you a good idea about how it works.

After we have figured out which diffuseMap the fragment is going to use, we can finally define the yDiff, xDiff and zDiff. We also have a triplanar blend sharpness parameter which decides how sharp the transition between the planar maps will be. This affects the blendWeight variable which is used to blend together all three samples based on the blend mask in the end.
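Stripped of the height-level selection, the core of the triplanar sampling looks roughly like this. It is a sketch in the spirit of the Martin Palko article mentioned below, with illustrative property names rather than the project's own.

// _DiffuseMap is the texture chosen for this fragment's height level,
// _TextureScale controls the tiling, _BlendSharpness the transition sharpness.
sampler2D _DiffuseMap;
float _TextureScale;
float _BlendSharpness;

float3 TriplanarSample(float3 worldPos, float3 worldNormal)
{
    // One UV set per axis, taken straight from the world position.
    float2 uvX = worldPos.zy * _TextureScale;
    float2 uvY = worldPos.xz * _TextureScale;
    float2 uvZ = worldPos.xy * _TextureScale;

    // Sample the diffuse map once per projection.
    float3 xDiff = tex2D(_DiffuseMap, uvX).rgb;
    float3 yDiff = tex2D(_DiffuseMap, uvY).rgb;
    float3 zDiff = tex2D(_DiffuseMap, uvZ).rgb;

    // Blend weights based on how much the surface faces each axis;
    // the sharpness exponent tightens the transition between projections.
    float3 blendWeights = pow(abs(worldNormal), _BlendSharpness);
    blendWeights /= (blendWeights.x + blendWeights.y + blendWeights.z);

    return xDiff * blendWeights.x + yDiff * blendWeights.y + zDiff * blendWeights.z;
}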

This triplanar mapping implementation is very much inspired by an article (Triplanar Mapping) written by Martin Palko, and I highly suggest reading it if you want to know more about how triplanar mapping works!
Earth | Desert
3.3 Smooth transitions
Applying textures to the planets gives us endless possibilities for styling the planet! However, the transitions between the textures when we go from one height level to another are very sharp. This is because there is no lerping in between. This can simply be fixed by adding some blend ranges around the level limits.

The code is really similar to the previous one, except we are adding an extra range for each transition between two height levels. When the fragment height is close to the limit between two height levels, we lerp between the two textures, where the lerp value depends on how far from the limit the fragment is and which side it is on.
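As a sketch of the idea with just two height levels (the level limit and blend range parameters are illustrative, and _BlendRange is assumed to be greater than zero):

float _LevelLimit;  // height (0..1) where texture A hands over to texture B
float _BlendRange;  // half-width of the transition band around the limit

float3 BlendHeightLevels(float heightPercent, float3 textureA, float3 textureB)
{
    // 0 below the blend band, 1 above it, and a smooth ramp in between.
    float t = saturate((heightPercent - (_LevelLimit - _BlendRange)) / (2.0 * _BlendRange));
    return lerp(textureA, textureB, t);
}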
No lerping | Lerping between height levels
3.4 Normal mapping
The planets are starting to take shape and look like actual planets with natural terrain. However, if we look closely, the textures look rather flat and lack detail. One way of adding more detail is to increase the number of vertices, but as you saw in the section about the mesh, we already have a lot of vertices, and adding even more is not the best solution.
A better solution is normal mapping! By using normal maps we can complement the textures by adding surface directions relative to the orientation of the texture.

A normal map has three channels: red, green, and blue. In Unity, red is the x value, green is the y value, and blue is the z value. All of these are relative to the texture UVs, and each pixel in the normal map gives us the direction of the corresponding pixel in the texture.

The code for sampling the normal map with triplanar coordinates is rather long and complicated and is taken from an article by Ben Golus. The article is about triplanar and normal mapping, and if you want to read more about it, you can do so here: Normal Mapping for a Triplanar Shader.
After adding the triplanarNormal() method to our shader, we can use this to calculate the lighting normal and add the light shading!

No normal mapping | Normal mapping
Even though you can see there is some difference between the two, the normal mapping is better observed in a close-up.
No normal map | Normal map
4. Result
After creating the sphere mesh, applying noise to generate natural-looking terrain, and creating custom shaders with triplanar mapping, smooth transitions, and normal mapping, we have finally created procedurally generated planets! I quickly added a cool skybox to put the planets in the right environment!
Earth seed 1 | Earth seed 2 | Earth seed 3
Desert seed 1 | Desert seed 2 | Desert seed 3
At last, we have the moon. Sadly, I did not manage to generate moon craters in such a way that they are clear and easy to spot. However, it is easier to observe them when the wireframe is activated.
No wireframe | Wireframe activated
5. Future development
Although I am pleased with the results, there are still several things I want to improve and features I want to add. Here is what I would focus on next if I had more time to work on the project!
5.1 Moon
Even though I managed to create a moon in the end, I am not really happy with the results. The reason is the bad quality of the moon craters and the fact that they are barely visible on the planet.
One solution would be to improve the ComputeCraterHeight.SetCraterHeight() method, which is rather incomplete. Another solution could be to create the craters entirely with dynamic normal mapping.
5.2 Ocean depth
I would love to implement ocean depth in such a way that you could see the terrain beneath the water. One way I am thinking of doing it is to generate new terrain inwards, towards the planet center, for all vertices that are beneath the water level.
5.3 GUI
Instead of only using the inspector for tweaking values and generating different planets, I would like to create a GUI to make it easier and clearer for the user.
6. Sources
1. Cajaraville, O. (2015, December 7) Four Ways to Create a Mesh for a Sphere. medium.com
https://medium.com/@oscarsc/four-ways-to-create-a-mesh-for-a-sphere-d7956b825db4
2. Flick, J. Cube Sphere. Catlike Coding.
3. Patel, A. (2020, May) Making maps with noise functions. Red Blob Games.
https://www.redblobgames.com/maps/terrain-from-noise/
4. Patel, A. (2013, August 31) Noise Functions and Map Generation. Red Blob Games.
https://www.redblobgames.com/articles/noise/introduction.html
5. Lague, S. (2018, August 13) [Unity] Procedural Planets (E03: layered noise). YouTube.
https://www.youtube.com/watch?v=uY9PAcNMu8s&list=PLFt_AvWsXl0cONs3T0By4puYy6GM22ko8&index=3
6. Lague, S. (2018, September 7) [Unity] Procedural Planets (E04: multiple noise filters). YouTube.
https://www.youtube.com/watch?v=H4g-TC__cvg&list=PLFt_AvWsXl0cONs3T0By4puYy6GM22ko8&index=4
7. Lague, S. (2020, July 11) Coding Adventure: Procedural Moons and Planets. YouTube.
https://www.youtube.com/watch?v=lctXaT9pxA0&t=1029s (moon craters)
8. Palko, M. (2014, March 20) Triplanar Mapping. Martin Palko.
9. Golus, B. (2017, September 17) Normal Mapping for a Triplanar Shader. bgolus.medium.com.
https://bgolus.medium.com/normal-mapping-for-a-triplanar-shader-10bf39dca05a