13th January 2019, 12:17   #54264
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,346
Quote:
Originally Posted by madjock
It is far from wrong, it is stated everywhere by nVidia that you need a Pro card to get 10bit like a Quadro, hence why we can only select 8 or 12bit.
You are just plain wrong. If you have a monitor that only accepts 10-bit and not 12-bit, then NVIDIA will offer you 10-bit output on consumer cards. I know, because I'm looking at exactly that option right this second on just such a monitor.

Please stop spreading your half-knowledge. The only thing you need a Quadro for is 10-bit OpenGL output, a restriction that exists for some weird historical reasons.
I encourage you to read up on this topic, which has been discussed countless times in this very thread already. You don't need a Pro card for 10-bit. And what kind of logic would that even be? Why would 10-bit be limited to Pro cards while 12-bit is allowed? Clearly 12-bit is superior to 10-bit if it's fully supported, since it's even more bits.
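
To make it concrete what that OpenGL restriction is actually about, here is a rough sketch of an application asking for a 10-bit pixel format through WGL_ARB_pixel_format on Windows. The function name and structure are purely illustrative, not code from any real application, and it assumes the extension pointer has already been loaded; this request is the kind of thing the Quadro-only limitation applies to, nothing else.

Code:
// Sketch only: ask the driver for a 10-bit-per-channel OpenGL pixel format.
// Assumes wglChoosePixelFormatARB was loaded via wglGetProcAddress after
// creating a temporary context (that boilerplate is omitted here).
#include <windows.h>
#include <GL/gl.h>
#include <GL/wglext.h>

int choose10BitPixelFormat(HDC hdc, PFNWGLCHOOSEPIXELFORMATARBPROC wglChoosePixelFormatARB)
{
    const int attribs[] = {
        WGL_DRAW_TO_WINDOW_ARB, GL_TRUE,
        WGL_SUPPORT_OPENGL_ARB, GL_TRUE,
        WGL_DOUBLE_BUFFER_ARB,  GL_TRUE,
        WGL_PIXEL_TYPE_ARB,     WGL_TYPE_RGBA_ARB,
        WGL_RED_BITS_ARB,   10,   // 10 bits per colour channel
        WGL_GREEN_BITS_ARB, 10,
        WGL_BLUE_BITS_ARB,  10,
        WGL_ALPHA_BITS_ARB,  2,
        0                         // terminator
    };

    int  pixelFormat = 0;
    UINT numFormats  = 0;
    // Consumer GeForce drivers historically would not return a true 10-bit
    // format for this request, while Quadro drivers would.
    if (!wglChoosePixelFormatARB(hdc, attribs, nullptr, 1, &pixelFormat, &numFormats)
        || numFormats == 0)
        return 0;  // no matching format found
    return pixelFormat;
}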

The reason you get 12-bit on HDMI TVs is that the driver only offers the highest bit depth the display supports, which in that case is 12-bit. It has nothing to do with "Pro" at all. If you have a screen that only supports 10-bit, like some PC monitors, then the control panel will offer 10-bit output. There is absolutely no "Pro" limitation whatsoever here. 10-bit or 12-bit output is used for proper HDR even by the OS itself, as well as by games, and people don't generally game on Pro cards.
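
And for contrast, this is roughly what games or a video renderer do to get 10-bit HDR output on those same consumer cards: create an R10G10B10A2 swapchain and tag it with an HDR10 colour space. Again just a sketch under the assumption of an existing ID3D11Device and window handle; the function name is illustrative and this is not taken from any particular player or game.

Code:
// Sketch only: 10-bit HDR10 swapchain via DXGI on Windows 10, works on GeForce.
#include <d3d11.h>
#include <dxgi1_6.h>
#include <wrl/client.h>
#pragma comment(lib, "d3d11.lib")
#pragma comment(lib, "dxgi.lib")
using Microsoft::WRL::ComPtr;

HRESULT createHdr10SwapChain(ID3D11Device* device, HWND hwnd,
                             ComPtr<IDXGISwapChain3>& swapChain)
{
    // Walk from the D3D device to the DXGI factory that owns its adapter.
    ComPtr<IDXGIDevice>   dxgiDevice;
    ComPtr<IDXGIAdapter>  adapter;
    ComPtr<IDXGIFactory2> factory;
    HRESULT hr = device->QueryInterface(IID_PPV_ARGS(&dxgiDevice));
    if (SUCCEEDED(hr)) hr = dxgiDevice->GetAdapter(&adapter);
    if (SUCCEEDED(hr)) hr = adapter->GetParent(IID_PPV_ARGS(&factory));
    if (FAILED(hr)) return hr;

    DXGI_SWAP_CHAIN_DESC1 desc = {};                    // width/height 0 = use window size
    desc.Format      = DXGI_FORMAT_R10G10B10A2_UNORM;   // 10 bits per colour channel
    desc.SampleDesc  = { 1, 0 };
    desc.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    desc.BufferCount = 2;
    desc.SwapEffect  = DXGI_SWAP_EFFECT_FLIP_DISCARD;   // flip model is required for HDR

    ComPtr<IDXGISwapChain1> swapChain1;
    hr = factory->CreateSwapChainForHwnd(device, hwnd, &desc, nullptr, nullptr, &swapChain1);
    if (FAILED(hr)) return hr;
    hr = swapChain1.As(&swapChain);
    if (FAILED(hr)) return hr;

    // Tag the swapchain as HDR10: ST.2084 (PQ) transfer with BT.2020 primaries.
    return swapChain->SetColorSpace1(DXGI_COLOR_SPACE_RGB_FULL_G2084_NONE_P2020);
}

That is the path HDR output on Windows goes through, and it doesn't care whether the card calls itself GeForce or Quadro.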
__________________
LAV Filters - open source ffmpeg based media splitter and decoders

Last edited by nevcairiel; 13th January 2019 at 12:24.