Manni, 4th March 2019, 14:51 (#33)
How do you set 10-bit RGB output on nVidia? The only options here, at least on a 4K display, are 8-bit and 12-bit.

The display must support 12-bit natively for 10-bit content to show without banding when 12-bit output is selected.

Recent JVC projectors support 12-bit from the input all the way to the panels, but not many displays do. Most support only 8-bit or less, a few support 10-bit, and very few actually support 12-bit natively, even if they accept a 12-bit input.

So if you send 12-bit to a display that doesn't support 12-bit natively, or that accepts a 12-bit input but then dithers it down because its panels are only 10- or 8-bit, you get the minor banding you are seeing when sending 12-bit to your display.
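
To illustrate the banding mechanism, here's a quick sketch (plain NumPy, not madVR's or any display's actual processing) of what happens when a smooth 10-bit grey ramp is re-quantized to 8-bit, with and without dithering. The ramp length and bit depths are illustrative values only.

[CODE]
# Minimal sketch, assuming NumPy; the ramp length and bit depths are
# illustrative only, this is not madVR's or any display's actual processing.
import numpy as np

rng = np.random.default_rng(0)

# A smooth 10-bit grey ramp across a 3840-pixel-wide frame.
ramp_10bit = np.linspace(0, 1023, 3840)

def truncate(values, in_bits, out_bits):
    """Drop the extra precision: what a panel does with no internal dithering."""
    step = 1 << (in_bits - out_bits)
    return np.floor(values / step) * step

def dither(values, in_bits, out_bits):
    """Add +/- 0.5 LSB of noise before rounding: breaks up the quantization steps."""
    step = 1 << (in_bits - out_bits)
    noise = rng.uniform(-0.5, 0.5, values.shape)
    return np.round(values / step + noise) * step

banded = truncate(ramp_10bit, 10, 8)    # flat plateaus -> visible banding
dithered = dither(ramp_10bit, 10, 8)    # neighbouring levels mixed -> smooth

# In the middle of the ramp the truncated pixels all sit on one flat 8-bit
# step (508), while the dithered pixels jump between 508 and 512.
i = slice(1908, 1916)
print("original :", ramp_10bit[i].round(1))
print("truncated:", banded[i])
print("dithered :", dithered[i])
[/CODE]

Truncation leaves flat plateaus (the visible bands), while dithering spreads each value across the two neighbouring 8-bit levels, so the gradient is preserved on average.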

Are you 100% sure that your display supports 12-bit from the input to the panels, through the whole internal chain, the way recent JVC projectors do?

I don't see the point of the test if you're not asking people to run it on native 12-bit displays.