My Specialization

What I wanted to make

For my specialization I wanted to work with something I found fun and interesting to learn about. I enjoy working with UI and menus, and I also like working with math, so I thought 3D font rendering would combine both of those interests.

My process

I started by adding the MSDF atlas gen and FreeType libraries to Aurora. I added the libraries through our Premake file as projects and file includes.
Aurora is the engine I have been working in for the past year. It is a self-made engine that we have all contributed to, but the main engine programmers are Isak Morand-Holmqvist and David Nilsson (you can read more about Aurora on their websites).


Creating and saving single letters

After I had included the libraries I needed in the engine, I started by trying to render single letters from a font and save them to a PNG file, just to make sure everything worked as it should.

I started rendering the letters a, b, and c in different fonts to see if I would get different results.

This worked without any problems, and I had fun trying different fonts. I also tried saving A, B, and C together in one PNG and ended up with the image below, which was quite interesting to play with.
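As a rough illustration of this step, here is a minimal sketch of loading one glyph and saving it as a PNG, based on the example in the msdfgen README (msdf-atlas-gen bundles msdfgen). The font path and glyph are placeholders, exact function signatures can differ between msdfgen versions, and this is not Aurora's actual code.

```cpp
#include "msdfgen.h"
#include "msdfgen-ext.h"

using namespace msdfgen;

int main()
{
    FreetypeHandle* ft = initializeFreetype();
    if (!ft)
        return 1;

    if (FontHandle* font = loadFont(ft, "Roboto-Regular.ttf")) // placeholder font path
    {
        Shape shape;
        if (loadGlyph(shape, font, 'a'))
        {
            shape.normalize();
            edgeColoringSimple(shape, 3.0);   // assign edge colors (max corner angle)
            Bitmap<float, 3> msdf(32, 32);    // 32x32 output image
            // distance range, scale, translation
            generateMSDF(msdf, shape, 4.0, 1.0, Vector2(4.0, 4.0));
            savePng(msdf, "a.png");
        }
        destroyFont(font);
    }
    deinitializeFreetype(ft);
    return 0;
}
```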

Creating a Font Atlas from a font

After I got single letters saving to PNG, I started rendering the full atlas, which gave me much more trouble than saving a single letter.

When MSDF atlas gen reads a font and its glyphs, I can create a bitmap in its atlas format, which gives me the pixels, width, and height of the full atlas. The only problem was that the bitmap pixels were in RGB8 format. This was a problem because DX11 does not support a three-channel RGB8 texture format; the closest alternatives are formats like RGBA8 or packed formats such as R11G11B10. Because of this, the atlas turned out like the picture on the right.

To work around this, I convert the RGB bitmap into my own RGBA bitmap so that DX11 can create a texture from it. This resulted in a nice atlas like the one on the left. The RGBA conversion also fixed a crash: with RGB, only some fonts could be loaded and fonts like Arial crashed the engine, but after converting RGB to RGBA the crash disappeared.
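As a sketch of what that conversion can look like, assuming the atlas bitmap is tightly packed with 3 bytes per pixel (the function name and layout here are my own stand-ins, not Aurora's real code): each pixel simply gets an opaque alpha byte appended so the data matches a format DX11 accepts, such as DXGI_FORMAT_R8G8B8A8_UNORM.

```cpp
#include <cstdint>
#include <vector>

// Expand a tightly packed RGB8 bitmap to RGBA8 by appending an opaque alpha
// byte per pixel, so the data can back a DXGI_FORMAT_R8G8B8A8_UNORM texture.
std::vector<uint8_t> ConvertRGB8ToRGBA8(const uint8_t* rgb, int width, int height)
{
    const size_t pixelCount = static_cast<size_t>(width) * height;
    std::vector<uint8_t> rgba(pixelCount * 4);
    for (size_t i = 0; i < pixelCount; ++i)
    {
        rgba[i * 4 + 0] = rgb[i * 3 + 0]; // R
        rgba[i * 4 + 1] = rgb[i * 3 + 1]; // G
        rgba[i * 4 + 2] = rgb[i * 3 + 2]; // B
        rgba[i * 4 + 3] = 255;            // opaque alpha
    }
    return rgba;
}
```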

Rendering words and sentences

After I got an atlas rendering as an image in the editor, using Dear ImGui as a way to debug it, I wanted to start working on rendering actual text. Whether a single character or a whole sentence, I wanted to render something that used the glyph data I had from the font.

I ended up going down the road of rendering a full sentence, but I quickly realized it was going to be more difficult than I had hoped.

I started on the code that takes a letter or sentence from the editor and renders it. I used the glyph data from the font to calculate where the first letter should be positioned, and used the advance together with each glyph's min, max, and bearing to figure out where the next letter should go.

The advance is how far to the right of the glyph's origin the next character should start, and the bearing is how far up and to the right of the origin the glyph itself should be placed.
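A rough sketch of that layout step could look like the following, using hypothetical GlyphMetrics and Quad structures rather than Aurora's actual types: the pen position moves right by the advance after each character, and the bearing offsets each glyph's quad from the pen position.

```cpp
#include <string>
#include <unordered_map>
#include <vector>

// Hypothetical per-glyph metrics, not Aurora's actual structures.
struct GlyphMetrics
{
    float advance;            // distance to the next glyph's origin
    float bearingX, bearingY; // offset from the pen position to the glyph's top-left corner
    float width, height;      // size of the glyph quad
};

struct Quad { float minX, minY, maxX, maxY; };

// Lay out one line of text: each character produces a quad offset by its
// bearing, and the pen moves right by the advance for the next character.
std::vector<Quad> LayoutLine(const std::string& text,
                             const std::unordered_map<char, GlyphMetrics>& glyphs,
                             float scale)
{
    std::vector<Quad> quads;
    float penX = 0.0f;
    const float penY = 0.0f; // baseline
    for (char c : text)
    {
        const auto it = glyphs.find(c);
        if (it == glyphs.end())
            continue; // skip characters the font does not contain

        const GlyphMetrics& g = it->second;
        Quad q;
        q.minX = penX + g.bearingX * scale; // shift right by the bearing
        q.maxY = penY + g.bearingY * scale; // shift up by the bearing
        q.maxX = q.minX + g.width * scale;
        q.minY = q.maxY - g.height * scale;
        quads.push_back(q);

        penX += g.advance * scale;          // advance the pen to the next origin
    }
    return quads;
}
```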

Rendering textures of atlases


After I finished the code for rendering a sentence I needed to test it, and that is where I hit my biggest problem, one that changed my specialization goal and how far I got. I needed a way to actually render the text on screen. Saving the bitmap to an image was not enough for this, so I decided to create a component for text rendering.


I started by just rendering a single texture: the atlas. Since I had a PNG, I converted the atlas PNG to a DDS and set out to render it.
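As an aside, a PNG can be converted to a DDS either with an external tool or in code. Purely as an illustration (and not necessarily how it was done here), a minimal sketch using Microsoft's DirectXTex library could look like this, with placeholder paths:

```cpp
#include <Windows.h>
#include <DirectXTex.h>

// Load a PNG through WIC and save it out again as a DDS.
bool ConvertPngToDds(const wchar_t* pngPath, const wchar_t* ddsPath)
{
    // LoadFromWICFile uses WIC, which requires COM to be initialized.
    if (FAILED(CoInitializeEx(nullptr, COINIT_MULTITHREADED)))
        return false;

    DirectX::ScratchImage image;
    HRESULT hr = DirectX::LoadFromWICFile(pngPath, DirectX::WIC_FLAGS_NONE, nullptr, image);
    if (SUCCEEDED(hr))
    {
        hr = DirectX::SaveToDDSFile(image.GetImages(), image.GetImageCount(),
                                    image.GetMetadata(), DirectX::DDS_FLAGS_NONE, ddsPath);
    }

    CoUninitialize();
    return SUCCEEDED(hr);
}
```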

Above you can see a video of what I made: a component where you can throw in a font and it will render that font's atlas to the screen. I had some trouble getting the component to render. I set up the pixel shader and vertex shader I needed and created a graphics command to send the data, but I ended up with a black texture (shown in the picture above). I knew something was there and that the data existed, but the texture didn't render, and for a long time I could not figure out why. Maybe something had become null? Maybe a color I sent in was wrong? Maybe the font I loaded didn't work?

After much trial and error, where either nothing changed or the engine crashed, I realized that I had simply forgotten to send the texture's tint color to the graphics card with all the other data. The default value was 0, so the texture came out black. I figured this out by adding the ability to change the tint in the editor and noticing that it didn't change anything, which made me suspect I had forgotten to send the data; that turned out to be correct. I worked on this bug for well over a week without spotting the small mistake, the single line of code that I had missed.
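To illustrate the kind of fix involved, here is a minimal sketch of uploading a tint color to a D3D11 constant buffer before drawing. The struct, buffer slot, and function are hypothetical stand-ins for Aurora's graphics-command path, not the engine's real code.

```cpp
#include <d3d11.h>
#include <cstring>

// CPU-side mirror of the cbuffer the pixel shader reads its tint from.
struct TextTintBuffer
{
    float TintColor[4]; // RGBA; must match the cbuffer layout in HLSL
};

// Write the tint into a dynamic constant buffer and bind it for the pixel shader.
// The buffer must have been created with D3D11_USAGE_DYNAMIC and CPU write access.
void UploadTint(ID3D11DeviceContext* context, ID3D11Buffer* tintBuffer, const float tint[4])
{
    D3D11_MAPPED_SUBRESOURCE mapped = {};
    if (SUCCEEDED(context->Map(tintBuffer, 0, D3D11_MAP_WRITE_DISCARD, 0, &mapped)))
    {
        TextTintBuffer data = {};
        std::memcpy(data.TintColor, tint, sizeof(data.TintColor));
        std::memcpy(mapped.pData, &data, sizeof(data));
        context->Unmap(tintBuffer, 0);
    }
    context->PSSetConstantBuffers(0, 1, &tintBuffer); // slot 0 is an assumption
}
```

Forgetting this upload leaves the GPU-side buffer zeroed, which is exactly why the quad rendered black.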


Final work


With the time I had, this is how far I got; the video shows the result of my time and planning. I wanted to be able to render an actual sentence, but for now I have the code without having been able to test whether it actually works. As far as I know, I can load any font that uses the standard Latin alphabet, and I can render the atlas texture in 3D space in Aurora.

I didn't get the text rendering to the point I wanted; I wish I had been able to render sentences. But I have learned a lot while trying to render text, and I find text rendering interesting. I do not, however, like shaders; they are not one of my strong suits.