Quote:
Originally Posted by luk008
By disabling AMD dithering I discovered that my TV is in fact 8-bit. With dithering enabled I can see a smooth gradient at 10-bit output. So should I still use 10 bits, or just stay with 8?
I see no point in using 10-bit output in MadVR unless your monitor/TV supports 10-bit input. It will only decrease overall image quality, because two dithering steps are then needed: one by MadVR (16 -> 10) and another by the GPU (10 -> 8).
We disabled dithering here just for test purposes. It should always be enabled when converting to a lower bit depth, otherwise you may see banding artifacts.
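To illustrate why dithering matters when reducing bit depth, here is a minimal sketch in Python/NumPy. It is not how MadVR or the GPU actually dither (they use more sophisticated methods such as error diffusion or ordered dithering); it just adds uniform random noise before rounding, which is enough to show how dithering trades banding for fine noise. The function name `reduce_bit_depth` is made up for this example.

```python
import numpy as np

def reduce_bit_depth(img16, out_bits, dither=True):
    """Quantize a 16-bit image to out_bits, optionally with random dithering.

    img16: uint16 array using the full 0..65535 range.
    Returns a uint16 array holding values in 0..(2**out_bits - 1).
    """
    levels = (1 << out_bits) - 1
    # Rescale to the target range as floating point.
    scaled = img16.astype(np.float64) / 65535.0 * levels
    if dither:
        # Add +/-0.5 LSB of uniform noise before rounding. This breaks up
        # the hard quantization steps that appear as banding in gradients.
        scaled += np.random.uniform(-0.5, 0.5, img16.shape)
    return np.clip(np.round(scaled), 0, levels).astype(np.uint16)

# A smooth 16-bit gradient quantized to 8 bits:
gradient = np.linspace(0, 65535, 4096).astype(np.uint16)
banded = reduce_bit_depth(gradient, 8, dither=False)   # hard steps -> banding
dithered = reduce_bit_depth(gradient, 8, dither=True)  # steps hidden in noise
```

Without dithering, every input value snaps to the nearest of 256 levels, producing visible steps; with dithering, neighboring pixels flicker between adjacent levels so the *average* still tracks the smooth gradient. This is also why doing the noise injection twice (16 -> 10 in MadVR, then 10 -> 8 in the GPU) adds more noise than a single 16 -> 8 step would.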