You might have heard these terms:
- 10-bit Simultaneous Display (e.g. Eizo CS2420)
- 30 Bit Display (Adobe Photoshop - Preferences - Performance - Graphics Processor Settings - Advanced Settings)
- 10 bits per channel (bpc, NVIDIA Control Panel - Change Resolution - Output colour depth)
and you probably know that colors in computer displays and programs are expressed as R(ed)G(reen)B(lue) values. Furthermore, you also might know that each of these channels is 8 bits, meaning it can take values from 0 to 255. A quick calculation of 256*256*256 ~= 16.7 million tells us the number of colors such a display can show.
This is slowly changing. Skipping a few details, very briefly: there are now displays that can show 10 bits per channel for each of R, G and B, so the total number of colors visible on the display becomes 1024*1024*1024 ~= 1 billion.
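The arithmetic above generalizes to any bit depth; a tiny sketch (the function name is mine, just for illustration):

```python
# Number of distinct colors for a given per-channel bit depth.
def color_count(bits_per_channel: int) -> int:
    levels = 2 ** bits_per_channel  # distinct values per channel (256 for 8 bits)
    return levels ** 3              # all R x G x B combinations

print(f"8 bpc:  {color_count(8):,} colors")   # 16,777,216 (~16.7 million)
print(f"10 bpc: {color_count(10):,} colors")  # 1,073,741,824 (~1 billion)
```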
This is the hardware side of the story. The software side, however, is arguably just as complicated, and is partly the reason why such displays are still not that common. On the software side:
- The graphics card, the driver and the operating system should support 10bpc (bits per channel) output.
- The application has to be written so that it sends 10bpc RGB values to the screen.
The first is pretty much solved at the moment. I am using a Quadro M620 with NVIDIA drivers on Windows 10, and it supports 10bpc. I believe Apple computers and macOS have supported 10bpc since 10.11 (El Capitan) as well.
Application support is another story. If you are a software developer, you can guess the amount of work needed to change an existing application to support this. At the time of writing this post, Photoshop supports 10bpc (called 30 Bit Display in its settings), but Lightroom, for example, does not, which is actually quite surprising to me.
I ran into a software support problem very easily on my own computer (NVIDIA Quadro M620, Windows 10, Eizo CG247X). A test image containing small variations, e.g. a gradient (stored as 16bpc RGB, of course), displays fine in Photoshop. However, if I take a screenshot (with Print Screen or the Snipping Tool, the result is the same) and then look at the screenshot, I see bands, as if it were shown on a normal (8bpc) display. So, very simply, the screenshot is captured in 8bpc.
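The banding is easy to reproduce numerically: quantizing a subtle gradient to 8 bits leaves only a handful of distinct codes, while 10 bits keeps several times more. A minimal sketch (the gradient range and helper below are my own assumptions, not from any real screenshot pipeline):

```python
# Why an 8bpc capture of a subtle gradient shows bands:
# a narrow brightness range maps to very few 8-bit codes.

def quantize(value: float, bits: int) -> int:
    """Map a 0.0-1.0 intensity to the nearest integer code at a given bit depth."""
    max_code = (1 << bits) - 1
    return round(value * max_code)

# A gradient covering only 2% of the brightness range, 1000 samples wide.
samples = [0.50 + 0.02 * i / 999 for i in range(1000)]

levels_8 = len({quantize(v, 8) for v in samples})
levels_10 = len({quantize(v, 10) for v in samples})
print(levels_8, levels_10)  # 10-bit keeps roughly 4x as many steps
```

With only a few distinct 8-bit steps across the whole gradient, each step is wide enough to be seen as a band; the extra 10-bit steps fall below the visible threshold.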
As you might guess, the HDR10 standard also uses 10bpc, and Dolby Vision goes even further with 12bpc.