21st February 2019, 18:06 | #54938
tp4tissue
Registered User
Join Date: May 2013
Posts: 708
Quote:
Originally Posted by Manni
Did anyone notice that 12bits on nVidia seems to distort patterns when using MadTPG as a pattern source (in 4K UHD SDR at 23p) compared to 8bits?
Could anyone with access to calibration software/equipment confirm this?
This is most visible when tracking gamut saturation and luminance linearity, but it also impacts greyscale. White balance isn't impacted significantly.
8bits is very linear and doesn't distort.
12bits breaks linearity and distorts the native gamut significantly.
It would be great if someone could confirm.
I don't use FSE, this is in full screen windowed mode.
If you don't experience this, please specify OS, nVidia driver version, and software used to calibrate/measure. Although unlikely, it could be caused by a specific setting in madVR, but I haven't had the time to investigate this yet. I have all the performance settings unchecked.
Details of rig in my sig (except driver, I was testing 385.28 for 12bits as I can't get a custom refresh rate to work with CRU on the new JVC models).

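For reference, here's a minimal Python sketch of the kind of luminance-linearity tracking Manni mentions: compare measured greyscale luminance at each stimulus level against an ideal gamma-2.2 response and report the deviation per step. The meter readings and the 2.2 target below are placeholder assumptions, not Manni's actual data or workflow.

Code:
GAMMA = 2.2
stimulus = [i / 10 for i in range(11)]              # 0%, 10%, ... 100% grey patches
measured_cd_m2 = [0.05, 1.1, 4.6, 10.9, 20.3,       # placeholder meter readings (cd/m2)
                  33.2, 50.1, 71.4, 97.0, 127.5, 162.0]

black, white = measured_cd_m2[0], measured_cd_m2[-1]
for level, y in zip(stimulus, measured_cd_m2):
    norm = (y - black) / (white - black)            # normalised measured response
    target = level ** GAMMA                         # ideal gamma-2.2 response
    print(f"{level:4.0%}  measured {norm:6.3f}  target {target:6.3f}  "
          f"delta {(norm - target) * 100:+5.2f}% of range")

A linear (non-distorting) bit depth setting should give small deltas across the whole ramp; the distortion Manni describes would show up as the deltas growing at specific steps.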
I noticed the issue Manni describes in the form of reduced gradient smoothness.

But I don't think it's nVidia's issue; the TV probably doesn't handle 10- or 12-bit input very well, because its FRC dithering algorithm may apply some inherent sharpening or debanding that prevents a perfectly smooth gradient.
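To illustrate what I mean, here's a rough numpy sketch (a generic illustration, not the TV's actual FRC): quantise a smooth 0-1 ramp to 8/10/12 bits with and without dither noise and look at the step structure. Plain rounding leaves discrete steps (banding); adding dither before rounding trades those steps for fine noise that averages out smooth, and it's that dithered signal the display's own processing then has to reproduce without disturbing it.

Code:
import numpy as np

rng = np.random.default_rng(0)
ramp = np.linspace(0.0, 1.0, 3840)                  # one row of a full-width 4K gradient

def quantise(signal, bits, dither):
    levels = 2 ** bits - 1
    x = signal * levels
    if dither:
        x = x + rng.uniform(-0.5, 0.5, size=x.shape)  # simple random dither before rounding
    return np.clip(np.round(x), 0, levels) / levels

for bits in (8, 10, 12):
    for dither in (False, True):
        out = quantise(ramp, bits, dither)
        print(f"{bits:2d}-bit dither={str(dither):5}  "
              f"unique levels={len(np.unique(out)):5d}  "
              f"max step={np.diff(out).max():.6f}")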
__________________
Ghetto | 2500k 5Ghz