SSAO support

Finally, I added SSAO (Screen Space Ambient Occlusion) support after 40 h of work! It was my first time implementing it, so it took quite a while. I was very close for a long time, but some tricky parts eluded me. I think the multi-viewport setup I have made it more complicated: the matrices I used to render to a part of the viewport were incorrect, but the problem only appeared when I started using them for SSAO. Finding this was quite tricky (especially when you have to come back to it after working all day on something completely different!).

Anyway, here are the results:

If you want some explanation of what SSAO is, you will have to read further!

Time spent for this: 40 h
Time spent so far: 175 h

So the idea with SSAO is basically to simulate how much a point on a surface is hidden from indirect light (Ambient) by its surroundings. This is done by looking at a small set of positions around said point, and checking whether these positions are hidden from the current camera (viewing the image, hence Screen Space) by another nearby surface (Occlusion). While the approximation might sound confusing, it works quite well in practice. That said, we can't really afford a lot of samples per point, since the extra calculation happens for every “pixel” on screen.
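
To make this a bit more concrete, here is a rough sketch of the occlusion estimate for a single fragment, written in C++ with GLM for readability rather than as shader code. The function and parameter names (estimateOcclusion, sceneDepthAt, …) are made up for illustration; the real thing runs in a fragment shader reading from the pre-rendered framebuffers.

```cpp
#include <cmath>
#include <functional>
#include <vector>
#include <glm/glm.hpp>

// Rough sketch of the per-fragment occlusion estimate (names are made up).
// "sceneDepthAt" stands in for the depth/position-buffer lookup that the
// real fragment shader performs on the pre-rendered framebuffers.
float estimateOcclusion(const glm::vec3& fragPosView,            // fragment position in view space
                        const glm::vec3& normalView,             // surface normal in view space
                        const std::vector<glm::vec3>& samples,   // hemisphere offsets around +Z
                        const glm::mat4& projection,
                        const std::function<float(glm::vec2)>& sceneDepthAt, // view-space z at a given UV
                        float radius = 0.5f)
{
    // Build a basis so the hemisphere of samples follows the surface normal.
    glm::vec3 up        = std::abs(normalView.z) < 0.99f ? glm::vec3(0, 0, 1) : glm::vec3(1, 0, 0);
    glm::vec3 tangent   = glm::normalize(glm::cross(up, normalView));
    glm::vec3 bitangent = glm::cross(normalView, tangent);

    float occlusion = 0.0f;
    for (const glm::vec3& s : samples)
    {
        // Sample position in view space, offset around the fragment.
        glm::vec3 samplePos = fragPosView +
            (tangent * s.x + bitangent * s.y + normalView * s.z) * radius;

        // Project it to screen space to know where to look in the depth data.
        glm::vec4 clip = projection * glm::vec4(samplePos, 1.0f);
        glm::vec2 uv   = glm::vec2(clip) / clip.w * 0.5f + 0.5f;

        // The camera looks down -Z in view space, so a larger (less negative) z
        // means "closer to the camera". If the geometry already visible at that
        // pixel is closer than the sample, the sample is buried: it occludes.
        float sceneDepth = sceneDepthAt(uv);
        if (sceneDepth >= samplePos.z + 0.025f) // small bias against self-occlusion
            occlusion += 1.0f;
    }
    // Real implementations also fade out contributions from occluders that are
    // much farther away than "radius"; omitted here to keep the sketch short.
    return occlusion / static_cast<float>(samples.size());
}
```

The resulting occlusion factor is then used to dim the ambient term of the lighting calculation.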

To implement this, I did the following:

  • Wrote logic to create randomized, uniformly distributed samples in a hemisphere (see the sketch after this list)
  • Created/reused a separate shader to display particles
    • I used it to verify the SSAO samples were indeed in a sphere
  • Added an extra view-projection matrix to the camera class (also sketched after the list)
    • It combines the already existing view matrix and projection matrix (which were often used across the different shaders)
    • Having it on the camera class avoids recomputing it over and over while rendering. That was probably an early/unnecessary optimization (felt good though…)
  • Added logic to use the SSAO samples (vertex array / buffer objects)
  • Reorganized the rendering handler codebase to make it cleaner (with a clean separation between the calculations of the different shaders)
  • Added a full screen debug camera option (using game logic)
    • This made it possible, from a single button press, to go full screen while disabling the other cameras (very simple to add, thanks again to the good object design of the project)
    • It was useful because SSAO is quite subtle: it was important to have a way to zoom in on the models while refreshing the shader code
  • Got crazy frustrated trying to get SSAO working while:
    • Creating the samples
    • Doing the pre-render from camera
    • Verifying things made sense
    • Projecting samples on surfaces, according to the surface normal
    • Crying :’-(
    • Verifying the pre-generated values fetched from framebuffers were correct
    • Using the sample in indirect light calculation
    • Doubting the initial result because you see a lot of gray lines (due to the low sample count)
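
For reference, here is roughly what generating the hemisphere samples mentioned in the first bullet can look like (a sketch using GLM; the function name and the quadratic falloff are common choices, not necessarily exactly what I did):

```cpp
#include <cstddef>
#include <vector>
#include <glm/glm.hpp>
#include <glm/gtc/random.hpp> // glm::linearRand

// Generate sample offsets inside a unit hemisphere oriented along +Z.
// Each fragment later rotates them so they follow its surface normal.
std::vector<glm::vec3> makeHemisphereSamples(std::size_t count)
{
    std::vector<glm::vec3> samples;
    samples.reserve(count);
    for (std::size_t i = 0; i < count; ++i)
    {
        // Random direction in the +Z half-space (only approximately uniform).
        glm::vec3 dir = glm::normalize(glm::vec3(
            glm::linearRand(-1.0f, 1.0f),
            glm::linearRand(-1.0f, 1.0f),
            glm::linearRand( 0.0f, 1.0f)));

        // Random distance from the origin, with a quadratic falloff so that
        // samples cluster near the point and nearby occluders weigh more.
        float t     = static_cast<float>(i) / static_cast<float>(count);
        float scale = glm::mix(0.1f, 1.0f, t * t);
        samples.push_back(dir * glm::linearRand(0.0f, 1.0f) * scale);
    }
    return samples;
}
```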
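
And the cached view-projection matrix on the camera boils down to something like this (again a sketch; the class and member names are invented):

```cpp
#include <glm/glm.hpp>

// Sketch of caching the combined matrix on the camera, so shaders that
// need it don't recompute projection * view for every draw call.
class Camera
{
public:
    void setView(const glm::mat4& view)             { m_view = view; m_dirty = true; }
    void setProjection(const glm::mat4& projection) { m_projection = projection; m_dirty = true; }

    const glm::mat4& viewProjection() const
    {
        if (m_dirty)
        {
            m_viewProjection = m_projection * m_view; // projection applied after the view transform
            m_dirty = false;
        }
        return m_viewProjection;
    }

private:
    glm::mat4 m_view{1.0f};
    glm::mat4 m_projection{1.0f};
    mutable glm::mat4 m_viewProjection{1.0f};
    mutable bool m_dirty{true};
};
```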

Additionally, while working on all that, I also took time to try and fix the strange landing pad reflections you might have seen in previous posts (I have to admit, it was a nice way to take a break from the rest). It turns out it was just a poorly configured material used on very flat surfaces. I also fixed the XYZ/RGB arrow models: their normals were inverted, which made the models somehow see-through.

As a final point, in the images of this post the samples are always the same for every point/fragment viewed by the camera. This causes aliasing when using low sample counts (as you can see in the 64-sample picture), and it is the last thing I need to fix before I can consider my SSAO implementation “finished”.

