jmac698
30th July 2012, 15:02
I did a bit of research on this while writing the deepcolor wiki entry:
http://avisynth.org/mediawiki/High_b..._with_Avisynth

It seems that ffdshow, VLC, madVR and MPlayer all support sending true 10-bit data to the video card drivers*. The video card has to support a feature called LUMINANCE16. It can then pass the data over the HDMI cable, assuming the HDMI negotiation reported deepcolor support. Finally, it's up to the TV/monitor to make use of this information, and I think it would be very hard to find out how a given TV handles it: it could accept the 10-bit video but only use 8 bits, it could dither down to 8 bits, or it could (after its own LUT processing) actually output 10 bits to the panel.

I think the only way to test is to use a special test video with brightness levels of 0, 1, 2, 3 and 4, then look closely at the TV with the brightness turned up and count the grey bars: a true 10-bit path shows five distinct bars, while an 8-bit path collapses levels 0-3 into a single shade, leaving only two.
It shouldn't be hard to make such a test; all the info you need is in the wiki entry.
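
For example, here's roughly how I'd generate such a clip. This is only a quick sketch, assuming numpy and a 10-bit-capable x264/ffmpeg build are available; the resolution, frame count and file names are just placeholders I picked, not anything from the wiki entry:
[code]
# Sketch: write a raw yuv420p10le clip with five vertical bars at
# adjacent 10-bit luma codes (0..4), then encode it losslessly.
import numpy as np

WIDTH, HEIGHT, FRAMES = 1280, 720, 250      # arbitrary geometry and length
LEVELS = [0, 1, 2, 3, 4]                    # adjacent 10-bit luma codes

# Luma plane: five vertical bars, one per 10-bit code value.
y = np.zeros((HEIGHT, WIDTH), dtype="<u2")
bar_w = WIDTH // len(LEVELS)
for i, level in enumerate(LEVELS):
    y[:, i * bar_w:(i + 1) * bar_w] = level

# Neutral chroma planes (512 = grey in 10-bit).
u = np.full((HEIGHT // 2, WIDTH // 2), 512, dtype="<u2")
v = u.copy()

# Raw yuv420p10le: 16-bit little-endian samples, values 0..1023.
with open("bars10.yuv", "wb") as f:
    for _ in range(FRAMES):
        f.write(y.tobytes())
        f.write(u.tobytes())
        f.write(v.tobytes())

# Example lossless encode so the code values survive untouched:
#   ffmpeg -f rawvideo -pix_fmt yuv420p10le -s 1280x720 -r 25 -i bars10.yuv \
#          -c:v libx264 -qp 0 bars10.mkv
[/code]
Then play the resulting file through the full 10-bit chain and count the bars on the screen.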

If this someday becomes an important marketing feature, you'll see it listed in the specs; otherwise it's simply ignored. There are still uses for a 10-bit panel fed only 8-bit video, because the extra precision lets the set reach a better calibration.

There are professional monitors where this kind of feature is highlighted. If you really want to be sure, you'd have to get one of those.

10-bit LCD monitor:
http://www.luminous-landscape.com/re...es/10bit.shtml

PS:
Apparently LUMINANCE16 is deprecated in DirectX and was poorly supported by hardware. It's a texture format used during rendering; I suppose the video gets uploaded as a texture and run through a shader for output, but I don't know much about how video is rendered.

* Playback guide with LAV Filters:
http://wiki.bakabt.me/index.php/Hi10P

My research turned up conflicting information. But beware: any reference you find that is even a year old will already be out of date on this issue.
