13th January 2019, 18:48   #54272
madjock
Registered User
Join Date: May 2018
Posts: 259
Quote:
Originally Posted by nevcairiel View Post
You are just plain out wrong.
So you keep saying.

Here is the thing though: if you think nVidia, out of the goodness of their hearts, has given us the option of 12-bit to help us out, you are quite delusional.

The 10-bit option is for graphic designers and the like; if you choose to ignore this fact, which is documented all over the web, then good for you, mate.

Does it make any sense? No, but it is what it is. If it were that simple we would have no 8-bit at all, since surely nVidia would want what is best for us and would default to 12-bit straight away; just like they would never break the drivers every second release, 23.976 frame rates would be fluid, etc. etc., and of course the 12-bit setting would actually survive a reboot without you having to frig something.
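For what it's worth, the 10-bit that does work on consumer GeForce cards is the exclusive-fullscreen Direct3D path, which is what the links below are on about: the OpenGL 10-bit path that designers use is fenced off for Quadro. Here is a rough sketch (not anyone's actual code; the function name and the mode size are made up for illustration) of requesting a 10-bit back buffer through D3D11:

Code:
// Rough sketch only: ask DXGI for a 10-bit-per-channel back buffer
// in exclusive fullscreen. This format is available to DirectX apps
// on any GeForce; it's the 10-bit OpenGL path that is Quadro-only.
#include <d3d11.h>

HRESULT CreateTenBitSwapChain(HWND hwnd, ID3D11Device** device,
                              ID3D11DeviceContext** context,
                              IDXGISwapChain** swapChain)
{
    DXGI_SWAP_CHAIN_DESC desc = {};
    desc.BufferCount       = 2;
    desc.BufferDesc.Width  = 3840;   // example mode, assumption
    desc.BufferDesc.Height = 2160;
    // 10 bits per colour channel plus 2 alpha bits.
    desc.BufferDesc.Format = DXGI_FORMAT_R10G10B10A2_UNORM;
    desc.BufferUsage       = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    desc.OutputWindow      = hwnd;
    desc.SampleDesc.Count  = 1;
    desc.SwapEffect        = DXGI_SWAP_EFFECT_DISCARD;
    desc.Windowed          = FALSE;  // exclusive fullscreen

    return D3D11CreateDeviceAndSwapChain(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        nullptr, 0, D3D11_SDK_VERSION, &desc,
        swapChain, device, nullptr, context);
}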

https://www.pugetsystems.com/labs/ar...t-Output-1221/
https://www.reddit.com/r/nvidia/comm...abling_10_bit/
https://forums.evga.com/gtx-1080-sup...-m2510043.aspx
https://devtalk.nvidia.com/default/t.../post/5078463/
https://www.eizoglobal.com/support/c...18-nvidia-amd/

But let's leave it there.

Last edited by madjock; 13th January 2019 at 19:24.