Quote:
Originally Posted by nevcairiel
Don't forget that a big value of 10-bit encoding is that your files actually get smaller, so even on 8-bit output with proper dithering there is still an advantage here (which is the main advantage all those anime people try to leverage when doing 10-bit encodes)
Understood. But I'm trying to figure out what the potential quality advantage would be of using 10-bit sources with 10-bit encoding and then 10-bit display. Getting rid of dithering on output could be a meaningful change; I can see how it could improve quality, particularly with gradients. On the other hand, dithering noise could also mask subtle quality issues in a 10-bit image.
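Just to make the trade-off concrete, here's a rough sketch (Python/NumPy, not madVR's actual algorithm, just plain random dither on a made-up 10-bit ramp) of what dithering down to 8-bit does versus simple rounding:

[CODE]
import numpy as np

# Hypothetical 10-bit luma ramp, one value per pixel across a 1920-wide gradient
rng = np.random.default_rng(0)
luma10 = np.linspace(0, 1023, 1920).astype(np.float64)

# Plain rounding to 8 bits: each 8-bit step swallows 4 ten-bit codes -> banding
luma8_rounded = np.clip(np.round(luma10 / 4.0), 0, 255).astype(np.uint8)

# Random dither: add sub-LSB noise before quantizing, so neighboring pixels
# average out to roughly the original 10-bit level instead of banding
noise = rng.uniform(-0.5, 0.5, luma10.shape)
luma8_dithered = np.clip(np.round(luma10 / 4.0 + noise), 0, 255).astype(np.uint8)

# Compare reconstruction error (back in 10-bit units): rounding leaves
# staircase steps, dithering trades them for fine noise
err_rounded = luma10 - luma8_rounded.astype(np.float64) * 4.0
err_dithered = luma10 - luma8_dithered.astype(np.float64) * 4.0
print("rounded error std:", err_rounded.std())
print("dithered error std:", err_dithered.std())
[/CODE]

The dithered version has more per-pixel noise but no visible steps on the gradient, which is basically the trade I'm asking about: native 10-bit output would need neither.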
Quote:
In any case, madshi plans to add 10-bit output support to his madVR eventually, maybe he can also make it work on Geforce GPUs then.
So, is the only way anyone knows of to actually see the full 10-bit luma range to look at the file in Premiere Pro or CS6 with a Quadro card?
For all the buzz around 10-bit, I'd assumed some people were actually looking at the files in their full glory somewhere.