13th January 2019, 11:57   #54263
madjock
Registered User
 
Join Date: May 2018
Posts: 259
Quote:
Originally Posted by nevcairiel
This is just wrong. You can get 10-bit/12-bit output on any modern GPU, depending on monitor support.

Quote:
Originally Posted by madjock
It is far from wrong. Everything I can find, NVIDIA-wise anyway, says you need a Pro card such as a Quadro to get 10-bit, which is why we can only select 8- or 12-bit. As far as I can read up on it, 10-bit is reserved for DirectX games.

Quote:
Originally Posted by nevcairiel
While this is true for now, it's also irrelevant. We use 12-bit because that's what NVIDIA gives you with many HDMI TVs. It's either 8 or 12, and if you can output 10-bit with madVR, then clearly 12 is the only mode that actually preserves those 10 bits, because 8 would reduce them.
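To put rough numbers on that "container" point (just a sketch of the idea I knocked up, the function name is mine and it is nothing like what madVR or the NVIDIA driver actually does internally):

Code:
# Rough sketch only -- not madVR's or the driver's actual code.
# Shows why a 12-bit output can carry a 10-bit signal losslessly,
# while an 8-bit output cannot.

def to_container(value_10bit: int, out_bits: int) -> int:
    """Rescale a 10-bit code value into an out_bits-wide container."""
    if out_bits >= 10:
        # 10 -> 12 bit: shift up, every distinct 10-bit level stays distinct
        return value_10bit << (out_bits - 10)
    # 10 -> 8 bit: shift down, four 10-bit levels collapse into one 8-bit level
    return value_10bit >> (10 - out_bits)

# all 1024 ten-bit levels survive in a 12-bit container...
assert len({to_container(v, 12) for v in range(1024)}) == 1024
# ...but only 256 distinct levels survive at 8-bit
assert len({to_container(v, 8) for v in range(1024)}) == 256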
The majority of people, myself included, have an 8-bit TV that uses FRC to fake 10-bit, so am I going to see an improvement going from 12 to 10 on a fake 10-bit TV anyway? Aren't we all talking about dithering in one way or another? Would we see any extra colours? Because again, all I can read is that it comes down to banding issues, so I really have no idea whether it is better to stay at 8-bit, since we could never tell anyway.
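On the dithering question, this is how I understand it (again, a toy example I made up, nothing to do with madVR's real dithering algorithm or whatever the TV's FRC does):

Code:
import random

# Toy example of why dithering matters when dropping 10-bit to 8-bit.
# Not madVR's real dithering (which is far smarter) -- just the basic
# idea that noise trades visible banding for fine grain.

def truncate(v10):
    # plain truncation: 4 neighbouring 10-bit levels become one 8-bit
    # step, so smooth gradients turn into visible bands
    return v10 >> 2

def dither(v10):
    # add sub-step random noise before rounding: individual pixels jitter
    # between two 8-bit levels, but the *average* keeps 10-bit precision
    return min(255, (v10 + random.randint(0, 3)) >> 2)

v10 = 514                      # a 10-bit level between two 8-bit steps (514/4 = 128.5)
print(truncate(v10))           # always 128 -> the extra half-step is simply lost
samples = [dither(v10) for _ in range(100000)]
print(sum(samples) / len(samples))   # ~128.5 -> a mix of 128s and 129s

If FRC is basically the same trick done over time, with the 8-bit panel flickering between two adjacent levels fast enough that the eye averages them into the in-between shade, then a "fake" 10-bit panel is itself relying on dithering, which is why I wonder whether the output bit depth makes a visible difference at all.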

Last edited by madjock; 13th January 2019 at 13:46.