11th April 2019, 13:36   #55679
huhn
Registered User
Join Date: Oct 2012
Posts: 7,890
@If6was9

accepting up to 12 bit input has been a default feature on nearly all TVs for over 6 years now.

in a perfect world, sending 10 bit wouldn't be worse at all even if your panel is only 6 bit. i agree that blindly using 10 bit is not best for picture quality, simply because far more TVs produce banding with 10 bit input than without it, but it still has to work.
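to illustrate the first point (my own sketch, not from the post, assuming simple round-to-nearest quantization at each stage): the worst-case error a 6 bit panel can show is actually smaller with a 10 bit input than with an 8 bit one, because the intermediate quantization step is finer.

```python
# Sketch: worst-case quantization error on a 6 bit panel (64 levels)
# when the incoming signal is 8 bit vs 10 bit. An honest pipeline can
# only gain from the higher bit depth input, never lose.

def quantize(x, levels):
    """Round x in [0, 1] to the nearest of `levels` evenly spaced codes."""
    return round(x * (levels - 1)) / (levels - 1)

def panel_error(x, input_levels, panel_levels=64):
    """Error (in panel steps) after quantizing to the input bit depth,
    then to the panel bit depth."""
    shown = quantize(quantize(x, input_levels), panel_levels)
    return abs(shown - x) * (panel_levels - 1)

samples = [i / 100000 for i in range(100001)]
worst_8  = max(panel_error(x, 256)  for x in samples)   # 8 bit input
worst_10 = max(panel_error(x, 1024) for x in samples)   # 10 bit input

print(f"worst case error, 8 bit input:  {worst_8:.3f} panel steps")
print(f"worst case error, 10 bit input: {worst_10:.3f} panel steps")
assert worst_10 <= worst_8  # 10 bit in is never worse on a 6 bit panel
```

with dithering in the chain the gap gets even smaller, which is exactly why the panel's native bit depth shouldn't matter here.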

a GPU driver that changes the colors of the image just by switching between 8 bit and 10 bit output is clearly not working properly.

how is anyone supposed to write a color critical application with that behaviour?
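to spell out why the bit depth switch gives the driver no mathematical excuse (my own example, assuming standard full-range code scaling): every 8 bit code maps to a 10 bit code and back without loss, so the displayed colors should be bit-identical either way.

```python
# Sketch: the 8 -> 10 bit conversion used for full-range video levels
# is injective, so converting up and back down recovers every code.

def to_10bit(v8):
    """Scale an 8 bit code (0-255) to the nearest 10 bit code (0-1023)."""
    return round(v8 * 1023 / 255)

def to_8bit(v10):
    """Scale a 10 bit code (0-1023) back down to 8 bit."""
    return round(v10 * 255 / 1023)

# every 8 bit value survives the round trip unchanged
assert all(to_8bit(to_10bit(v)) == v for v in range(256))
print("8 -> 10 -> 8 bit round trip is exact for all 256 codes")
```

so a visible color shift between the two output modes can only come from the driver doing extra, unwanted processing.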

intel and nvidia don't do this, so either both of them are working around a windows bug or the issue is on AMD's side.