Intuition for Gamma Correct Rendering
I want to start out by apologising for the lack of pretty graphics in here, which is a little odd for a post about visual quality. The reason is simple though – I’m currently typing this on my laptop, watching the Windows 7 system recovery progress bar loop on the screen of my main PC (uh… yeah, it’s behaving a little oddly. But that’s another topic entirely).
Ok, now that you know you’re in for a lot of monotonous text, let’s get on with it!
Gamma correct rendering may sound like a simple enough concept at first, but doing it correctly can be very challenging – especially once you throw hardware variations into the mix. Possibly worse yet, it’s something you must keep in mind throughout development, and educate your teammates about. Or you can ignore the issue completely and live with the consequences – but it will come back to bite you down the road. Repeatedly.
First some definitions:
- <Gamma/sRGB/Linear> space
The mapping from raw data values to represented values for the data you’re working with (which could be bytes, floats, or RGB triples). Linear is easy, as raw and represented values match exactly. Gamma and sRGB, on the other hand, define a curve that dedicates more raw data values to the lower range of represented values than to the higher range. For the purpose of this post we’ll treat gamma and sRGB as the same thing (though they aren’t necessarily – sRGB refers to one very specific curve, while gamma can be any curve).
- To gamma a texture/value
To convert from linear space to gamma space. This will result in raw linear values (the same as their represented values) of 0.2 going to raw gamma values of ~0.5 (but still representing 0.2). The exact values depend on the curve of the gamma space that you convert to of course.
- To degamma a texture/value
The opposite of the above; to convert from gamma space to linear space by applying the inverse of the gamma curve. So, with the above example you would get back the original linear raw value of 0.2 from the gamma value ~0.5.
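To make these definitions concrete, here’s a minimal Python sketch of the exact sRGB transfer functions (the piecewise curve from the sRGB specification, with its short linear toe near black):

```python
def gamma(linear):
    """Linear -> sRGB (gamma space); input/output in [0, 1]."""
    if linear <= 0.0031308:
        return 12.92 * linear
    return 1.055 * linear ** (1.0 / 2.4) - 0.055

def degamma(srgb):
    """sRGB (gamma space) -> linear; input/output in [0, 1]."""
    if srgb <= 0.04045:
        return srgb / 12.92
    return ((srgb + 0.055) / 1.055) ** 2.4

# A raw linear value of 0.2 lands at roughly 0.48 in gamma space...
print(round(gamma(0.2), 3))           # ~0.48
# ...and converting back recovers the original represented value:
print(round(degamma(gamma(0.2)), 3))  # 0.2
```

Note that a linear 0.2 encodes to about 0.48 – that’s the “~0.5” from the definition above.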
So what does it really take to be gamma correct? There are 3 primary areas of concern: the pipeline, the shader, and the render target.
The Pipeline and Tools
- Some of the source data you’re given will be in gamma space, and other source data will be in linear space. The pipeline has to know what it has, and what it should do with it. That’s easier said than done, as it requires metadata to be present – either artist-set, or set automatically based on usage. Having artists specify how everything should be interpreted is obviously the easiest choice when you’re faced with shoehorning gamma correct behaviour into an existing pipeline without the backend architecture to support it, but it carries the consequence that user errors will be abundant.
- If you’re going to perform any processing on textures in the pipeline (resizing, mipmap generation, blending the edges of cubemap faces, etc.), the operations must be done in linear space. The gotcha is that you must work with as much precision as possible throughout to avoid quantization issues from the conversion to linear space and back again. This usually means converting to a floating point texture immediately, and only converting back as the very last step. And yes, you’ll probably have trouble achieving this with some of the external libraries you use. So go and modify them too (and diverge from what’s in SVN, making taking updates all that much harder). Fun stuff.
- The final conversion to gamma space after you’ve done your processing may also bite you due to hardware variations. If you’re lucky you’ll only be targeting platforms with proper support for sRGB – but many are not so lucky, and as such will be in for a world of fiddly pain thanks to platform-specific approximations of the curve. But at least that behaviour is documented now, which has only happened in the last couple of years. Extremely fun stuff.
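As a sketch of why filtering in the wrong space matters, here’s the classic example – averaging two texels of a gamma-space texture, assuming a simple pure 2.2 power curve for brevity (real sRGB is the piecewise curve):

```python
# Why pipeline filtering must happen in linear space.
GAMMA = 2.2

def degamma(v):  # gamma space -> linear
    return v ** GAMMA

def gamma(v):    # linear -> gamma space
    return v ** (1.0 / GAMMA)

# Average two gamma-space texels (think a 2x downsample): black and white.
a, b = 0.0, 1.0

wrong = (a + b) / 2.0                            # averaged in gamma space
right = gamma((degamma(a) + degamma(b)) / 2.0)   # convert, filter, convert back

print(wrong)            # 0.5  -> displays too dark
print(round(right, 3))  # ~0.73 -> the correct mid-grey
```

Keeping everything in floats until the final re-encode is the “convert to a floating point texture immediately” advice in practice – quantizing to bytes between these steps is exactly where banding creeps in.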
The Shader
- Everything you do in a shader should be in linear space. Simple.
- There are states you can set on the various platforms to automatically convert textures from gamma space to linear space when they’re sampled, though these states live in different places on different platforms.
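The rule can be sketched as a toy “shader” in plain Python – degamma inputs, do all the math in linear space, gamma the result on the way out. The 2.2 power curve and the `shade` function are illustrative stand-ins; on hardware with proper sRGB sampler and write states, the two conversions happen for free:

```python
def degamma(v):  # gamma space -> linear
    return v ** 2.2

def gamma(v):    # linear -> gamma space
    return v ** (1.0 / 2.2)

def shade(albedo_gamma, light_linear):
    albedo = degamma(albedo_gamma)  # sampled colour texture: gamma -> linear
    lit = albedo * light_linear     # all lighting math in linear space
    return gamma(lit)               # frame buffer stores gamma space

# Half-bright light on a mid-grey texel:
print(round(shade(0.5, 0.5), 3))  # ~0.365
```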
The Render Target
- The joys of hardware variations will strike you severely here, and throw a spanner (wrench for those of North American heritage) in the works.
- Frame buffers are usually stored in gamma space and you output linear space values from the pixel shader. Thus, blending the output of the pixel shader with the frame buffer can be done in linear space (correct) or gamma space (incorrect) depending on the platform. DX9 and the PS3 will do it incorrectly, but DX10+ and the X360 will do it correctly.
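Numerically, the difference looks like this – a sketch using a pure 2.2 curve for brevity, blending 50% white over a black frame buffer:

```python
def degamma(v):  # gamma space -> linear
    return v ** 2.2

def gamma(v):    # linear -> gamma space
    return v ** (1.0 / 2.2)

src_linear = 1.0  # shader outputs linear white
dst_gamma  = 0.0  # frame buffer holds black
alpha      = 0.5

# Correct (DX10+/X360-style): convert destination to linear, blend, convert back.
correct = gamma(alpha * src_linear + (1 - alpha) * degamma(dst_gamma))

# Incorrect (DX9/PS3-style): blend the raw gamma-space values directly.
incorrect = alpha * gamma(src_linear) + (1 - alpha) * dst_gamma

print(round(correct, 3))  # ~0.73
print(incorrect)          # 0.5 -> visibly darker than intended
```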
Here’s a spoiler: not everything should be gamma corrected at every step. But what does that mean? Why is that? Well, that’s exactly what this post is about!
The simplest rule for whether a texture should be treated as linear in the pipeline is this: if it was authored by an artist painting with values that give the look they want on screen (“I want this blue for the sky”), it should be treated as being in gamma space. Everything else should be linear. Except when it shouldn’t be. Crap.
Oh, and that means that vertex colours should also be treated as being in gamma space. Except when they shouldn’t be. Double crap.
What are these exceptions? Take a lightmap as an example; according to the rule above it should be in linear space – and it should. However, it’s not uncommon to have very dark lightmaps, and when stored in linear space this results in excessive banding in the dark areas. If the same lightmap were stored in gamma space and only converted to linear space when sampled in the shader, you would have far more precision (and thus less banding) in the darks. The trade-off is reduced precision in the light areas, but that’s generally less noticeable thanks to human vision being more sensitive to variations in darks than in lights (undoubtedly to spot creatures with big fangs lurking in the shadows). A trade-off is a trade-off though, and it’s not always what you want. Crytek, for instance, uses a metric along the lines of: if at least 15-20% of an image’s values are below 96, the image is stored in gamma space; otherwise it’s stored in linear space. But that’s based on little more than what works for them.
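You can put a rough number on that precision argument by counting how many of an 8-bit channel’s 256 steps land in the dark end of the linear range. The 0.1 cutoff here is an arbitrary choice for illustration; the decode uses the exact sRGB curve:

```python
def degamma(srgb):
    """sRGB (gamma space) -> linear, exact piecewise curve."""
    if srgb <= 0.04045:
        return srgb / 12.92
    return ((srgb + 0.055) / 1.055) ** 2.4

# How many of the 256 byte values represent linear values below 0.1?
linear_steps = sum(1 for i in range(256) if i / 255.0 < 0.1)
gamma_steps  = sum(1 for i in range(256) if degamma(i / 255.0) < 0.1)

print(linear_steps)  # 26 steps for the darkest tenth when stored linearly
print(gamma_steps)   # 90 steps when stored in gamma space
```

Roughly 3.5x the precision in the darks when stored in gamma space – which is exactly why a dark lightmap bands so badly when stored as 8-bit linear.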
There are still cases when you don’t want to do this though, such as for normal maps, as the value range has absolutely nothing to do with brightness.
What about other things? If an artist picks a tint colour for a material, it has been selected in gamma space and thus should be converted to linear space. Same goes for vertex colours that are used similarly. Fog colours too. Oh, and colours a player picks for their character’s clothes. Seeing a pattern here? There’s a lot more to being gamma correct than just textures, and it impacts a lot of people on the team who probably don’t even know what gamma correction is.
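As a sketch of what that conversion looks like in practice – the helper name and the hex-string picker format are just illustrative assumptions:

```python
def degamma(srgb):
    """sRGB (gamma space) -> linear, exact piecewise curve."""
    if srgb <= 0.04045:
        return srgb / 12.92
    return ((srgb + 0.055) / 1.055) ** 2.4

def picked_hex_to_linear(hex_rgb):
    """'#RRGGBB' from a colour picker (gamma space) -> linear float triple."""
    h = hex_rgb.lstrip('#')
    return tuple(degamma(int(h[i:i + 2], 16) / 255.0) for i in (0, 2, 4))

# A mid-grey tint from the picker is much darker once in linear space:
print(tuple(round(c, 3) for c in picked_hex_to_linear('#808080')))  # ~0.216 per channel
```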
That’s why it’s a difficult area to get right.
I’m hopeful that the next generation of consoles will address the remaining issues and allow us to be entirely gamma correct across all platforms, consistently.
Because as things stand, it’s an impossible area to get right consistently.
For more information (and pretty pictures), I highly recommend taking a look at the presentation Post Processing in the Orange Box from Alex Vlachos at Valve.