Old 19th July 2012, 19:01   #1  |  Link
clancy688
Registered User
 
Join Date: Jul 2011
Location: Germany
Posts: 30
Is there any advantage in using a 10-bit panel when playing videos encoded in Hi10P?

Hi there~

I didn't know where to put this, I hope that this is the right location.

Anyway, I'm currently in the process of deciding whether to buy a new television or not. And if I decide to buy one, I obviously have to decide which one to get.

I already narrowed it down to two models. One is the Toshiba 46TL933G/46TL963G and the other is the Sharp 46LE730E/46LE732E.
Both sets will primarily be used with my PC, they'll be connected via HDMI and I'll use the PC for playback of everything.

But now here's the catch: I realized by chance that the Sharp TV uses a 10-bit panel, but the Toshiba apparently doesn't.
Since I mainly watch videos (anime...) encoded with the Hi10 profile, this immediately piqued my interest.
Are there any advantages to using a native 10-bit panel for playing "upscaled" (the source material obviously wasn't 10 bit) Hi10 vids? Will I get a better picture than with the Toshiba panel? Or is there no difference whatsoever as long as the source material wasn't 10 bit as well?

This 10 bit stuff is making it hard to decide which tv set to get. The Toshiba one is cheaper, I'd get free 3D glasses, it's newer and it has more gimmicks. The Sharp one has the 10 bit panel...

So... what can I expect from a 10-bit panel that I can't expect from an 8-bit one?

Thanks for all answers!


P.S.
And I'd also be interested in the general comparison of Toshiba and Sharp TVs - which ones are normally higher quality?
Old 19th July 2012, 19:35   #2  |  Link
SassBot
Guest
 
Posts: n/a
The advantage is you don't need to dither the 10-bit decoded result to 8-bit upon playback. But with a sufficiently decent dither you shouldn't notice the difference.
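Conceptually it's just something like this (a rough numpy sketch of random dithering, not madVR's actual algorithm; the function name is made up for illustration):

Code:
import numpy as np

def dither_10bit_to_8bit(y10, rng=np.random.default_rng(0)):
    # Illustrative random dither from 10-bit to 8-bit values.
    y = y10.astype(np.float64) / 4.0            # rescale 0..1023 -> 0..255
    y += rng.uniform(-0.5, 0.5, size=y.shape)   # noise breaks up the rounding steps
    return np.clip(np.round(y), 0, 255).astype(np.uint8)

# A smooth mid-range 10-bit ramp, repeated over 256 rows.
ramp = np.tile(np.arange(100, 900, dtype=np.uint16), (256, 1))

# Plain truncation is off by up to 0.75 of an 8-bit step, and the error is the
# same in every row, so it shows up as flat bands. The dithered rows average
# back to the true gradient, so the eye blends the noise away.
truncated = (ramp // 4).astype(np.uint8)
dithered  = dither_10bit_to_8bit(ramp)
print(np.abs(truncated.mean(axis=0) - ramp.mean(axis=0) / 4).max())  # ~0.75
print(np.abs(dithered.mean(axis=0)  - ramp.mean(axis=0) / 4).max())  # much smaller

The per-pixel error is about the same either way; the difference is that the dither error looks like noise instead of flat bands, which is why a decent dither is so hard to spot.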
Old 19th July 2012, 19:41   #3  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,403
Check what formats are accepted over HDMI on the TV. A 10-bit panel doesn't guarantee it will accept 10-bit formats.

Also make sure your video card can output 10-bit (30-bit) color. I did a quick search, but I am still not sure how well this is currently supported.

I don't think you would be able to tell the difference between a full 10-bit color depth and MadVR dithering down to 8-bit for display but I could be wrong.
Old 19th July 2012, 20:20   #4  |  Link
clancy688
Registered User
 
Join Date: Jul 2011
Location: Germany
Posts: 30
Quote:
Originally Posted by Asmodian View Post
Check what formats are accepted over HDMI on the TV. A 10-bit panel doesn't guarantee it will accept 10-bit formats.
Hm, apparently not. The tech specs say that "HDMI Deep Color" isn't supported. But where's the logic in putting a 10-bit panel into a TV set that can't receive 10-bit color information?

Quote:
HDMI Features : Deep Color / x.v.Color - / -
Quote:
Also make sure your video card can output 10-bit (30-bit) color. I did a quick search, but I am still not sure how well this is currently supported.
I think my card should be able to. I have an HD6870, and its tech specs say something about "Deep Color".

Quote:
HDMI® (With 3D, Deep Color and x.v.Color™)
Max resolution: 1920x1200
Quote:
I don't think you would be able to tell the difference between a full 10-bit color depth and MadVR dithering down to 8-bit for display but I could be wrong.
So if I'm using madVR, the video will automatically be dithered down to 8-bit, regardless of whether my hardware supports 10 bit or not?

Last edited by clancy688; 19th July 2012 at 20:22.
Old 19th July 2012, 21:23   #5  |  Link
sneaker_ger
Registered User
 
Join Date: Dec 2002
Posts: 5,565
Quote:
Originally Posted by clancy688 View Post
So if I'm using madVR, the video will automatically be dithered down to 8-bit, regardless of whether my hardware supports 10 bit or not?
Correct.
The big advantage of 10-bit encoding lies in the increased precision the encoder/decoder can use during its calculations anyway; the output bit depth is less important.
Old 19th July 2012, 21:30   #6  |  Link
clancy688
Registered User
 
Join Date: Jul 2011
Location: Germany
Posts: 30
Oookay... then the Sharp goes right out of the window, thank you very much.
Old 19th July 2012, 21:38   #7  |  Link
sneaker_ger
Registered User
 
Join Date: Dec 2002
Posts: 5,565
So you are ruling out the Sharp because it does not handle 10 bit input (or because madVR can't output 10 bit yet), but not the Toshiba which does not support 10 bit in the first place?
Old 19th July 2012, 21:57   #8  |  Link
clancy688
Registered User
 
Join Date: Jul 2011
Location: Germany
Posts: 30
Nope, I'm ruling out the Sharp for several reasons:

- apparently there are no advantages when playing 10-bit video
- I found no user reviews whatsoever, so I have no idea how it performs compared to the Toshiba (for which I did find several reviews)
- the Toshiba is newer, cheaper and offers more features, while the Sharp's only positive aspect is the 10-bit panel (and MAYBE overall better image quality, but as stated above, I have no comparisons or reviews to go on)

The 10-bit panel was the Sharp's main selling point; with that advantage gone, it doesn't hold up against the Toshiba.

Moreover, since madVR can't output 10 bit anyway, I don't see any reason to specifically look for TVs with 10-bit panels and exclude 8-bit-panel sets that may be superior in other respects. I was told there's little to no difference between dithered and native 10-bit output, so why should I exclude 8-bit panels in anticipation of a madVR feature that may never arrive and, even if it does, won't offer any significant quality improvement?
Old 20th July 2012, 01:08   #9  |  Link
Mug Funky
interlace this!
 
 
Join Date: Jun 2003
Location: i'm in ur transfers, addin noise
Posts: 4,555
no TV is set to unity gain straight out of the box, and even if you sit and calibrate the thing (after turning off all the useless motion flow, edge enhancement, dynamic contrast, ad infinitum, and setting it to "underscan" the picture so you get native 1:1 pixels), you'll find that there's still a certain amount of colour processing going on.

a 10 bit panel excels in this area - making the inevitable digital colour processing not make your pictures look like arse.

most of the banding you'll get on playback is coming from the TV's guts, not from any encoding you've done. 10 bit helps here immensely.

that said, short of reviews of the sharp, i'd want to burn myself a test DVD with familiar samples on it and convince the shop assistant to let me play it on both.
__________________
sucking the life out of your videos since 2004
Old 23rd July 2012, 15:31   #10  |  Link
clancy688
Registered User
 
Join Date: Jul 2011
Location: Germany
Posts: 30
Now something reeeeally interesting:

In the end, I didn't buy the Sharp. I didn't buy the Toshiba either.

What I bought was a Sony KDL-46HX755. I just set it up and connected it to my PC. And guess what the TV was telling me about my connection:

1080p 10 bit. ^^;;;

The HDMI input is Deep Color capable (obviously), but does anyone know whether there's a 10-bit panel inside? I couldn't find out... :/
Old 30th July 2012, 15:02   #11  |  Link
jmac698
Registered User
 
Join Date: Jan 2006
Posts: 1,867
I did a bit of research on this when I made the deep color wiki entry.
http://avisynth.org/mediawiki/High_b..._with_Avisynth

It seems that ffdshow, VLC, madVR and mplayer all support sending true 10-bit data to the video card drivers*. The video card has to support a feature called LUMINANCE16. The card can then pass the data through the HDMI cable, assuming the HDMI negotiation reported deep color support. Finally, it's up to the TV/monitor to make use of this information. I think it would be very hard to find out how a TV handles this. It could accept 10-bit video and then only use 8 bits, it could dither to 8 bits, or it could (after its own LUT processing) output 10 bits.

I think the only way to test is to use a special test video with brightness levels of 0,1,2,3 and 4, then look closely at the tv with the brightness turned up, to see if you can find 2 or 5 grey bars.
It shouldn't be hard to make the test, all the info you need is in the wiki entry.
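Something along these lines should work (an untested sketch, not taken from the wiki entry; it assumes numpy and an ffmpeg build with 10-bit x264 support, and the file names are just placeholders):

Code:
import numpy as np

width, height, frames = 1920, 1080, 48            # ~2 seconds at 24 fps

# Five vertical bars at 10-bit luma levels 0..4 (below nominal black, which is
# why you need the brightness turned up to see them). On a true 10-bit chain
# you should see several distinct bars; if the chain drops to 8 bits, most of
# them collapse into the same shade.
levels = np.array([0, 1, 2, 3, 4], dtype=np.uint16)
y_plane = np.tile(np.repeat(levels, width // len(levels))[:width], (height, 1))

# Neutral chroma (512 = grey) at quarter resolution for 4:2:0.
c_plane = np.full((height // 2, width // 2), 512, dtype=np.uint16)

with open("bars10.yuv", "wb") as f:               # raw yuv420p10le frames
    for _ in range(frames):
        f.write(y_plane.astype("<u2").tobytes())
        f.write(c_plane.astype("<u2").tobytes())  # U
        f.write(c_plane.astype("<u2").tobytes())  # V

# One possible way to encode it losslessly with a 10-bit build:
# ffmpeg -f rawvideo -pix_fmt yuv420p10le -s 1920x1080 -r 24 -i bars10.yuv \
#        -c:v libx264 -qp 0 bars10_test.mkv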

Someday, if this becomes an important marketing feature, you will see it listed; for now it's simply ignored. There are uses for a 10-bit panel even with 8-bit video, because it can reach a more accurate calibration.

There are professional use monitors where this kind of feature is highlighted. If you really want to be sure, you'd have to get one of them.

10 bit LCD monitor:
http://www.luminous-landscape.com/re...es/10bit.shtml

P.S.
Apparently LUMINANCE16 is deprecated in DirectX, and was poorly supported by hardware. It refers to a texture format used by shaders; I suppose video gets sent through a shader for output. I don't know much about how video is rendered.

*Playback guide with LAV Filters
http://wiki.bakabt.me/index.php/Hi10P

My sources conflict - but beware: any reference you find that is even a year old will already be out of date on this issue.

Last edited by jmac698; 30th July 2012 at 15:16.
Old 21st August 2012, 20:05   #12  |  Link
clancy688
Registered User
 
Join Date: Jul 2011
Location: Germany
Posts: 30
Quote:
Originally Posted by jmac698 View Post
I think the only way to test is to use a special test video with brightness levels of 0,1,2,3 and 4, then look closely at the tv with the brightness turned up, to see if you can find 2 or 5 grey bars.
It shouldn't be hard to make the test, all the info you need is in the wiki entry.
Thanks, but unfortunately I have no idea how to encode or create videos.

All I can tell you is that the TV is telling me "1080p24 10bit" when connected to the PC (and 1080p24 12bit when connected to my BD-Player).

But another thing I just stumbled upon today:


In my video driver's setting, there are four different settings under "pixel format":

YCbCr 4:4:4
YCbCr 4:2:2
RGB 4:4:4 Pixel Format Studio (limited RGB)
RGB 4:4:4 Pixel Format PC Standard (full RGB)

The first one was initially selected, but since every video is decoded to RGB anyway (afaik...?) I set it to RGB 4:4:4 full.
But then a lot of detail was lost in dark scenes. I tested with a real-life video where a man was sitting in a dim room - his black jacket was basically a black blot. ^^;
So I went into my TV's settings and experimented, and under "HDMI Dynamic Range" I found the culprit. It was set to "Auto", the other options being "Limited" and "Full". I changed it to "Full" and suddenly I could see all the previously hidden details on the still-black (not grey-black) jacket.

So now I'm curious - why? ^^
(And is it correct to use RGB instead of YCbCr for the TV?)
Old 21st August 2012, 22:05   #13  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,403
A simple explanation (it gets more complicated if you look at the actual YUV-to-RGB conversion equations, but this is the idea):
Full range uses 0-255 for black to white.
Limited range uses 16-235 for black to white.

Most video, like Blu-rays, is actually limited range, but since almost all computer monitors are full range, the video is expanded to full range when played on a PC. If your TV expects 16 to be full black but receives video data with black at 0, all the shadow detail between 0 and 16 gets clipped to black. The same is true of whites above 235, but that is less obvious.
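If you want the numbers, this is roughly what happens to the luma (a simplified sketch with numpy; a real renderer also handles chroma and adds dithering):

Code:
import numpy as np

def limited_to_full(y):
    # Expand limited-range (16-235) luma to full range (0-255), luma only.
    y = np.asarray(y, dtype=np.float64)
    return np.clip(np.round((y - 16.0) * 255.0 / 219.0), 0, 255).astype(np.uint8)

# Shadow detail in a limited-range source, e.g. a dark jacket:
dark_detail = np.array([17, 19, 21, 23, 25])
print(limited_to_full(dark_detail))   # [ 1  3  6  8 10] on the full-range scale

# If the TV still assumes limited range, it treats everything at or below 16
# as black, so all five of these expanded values collapse into one black blot,
# which is exactly the crushed jacket you saw before changing the TV setting.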

I like using full-range RGB 4:4:4 because that way the desktop, games, web video and Blu-rays all look good on the TV. The only issue is that you have to make sure everything is converted to full range. If you notice dull whites and grey blacks, you are probably sending 16-235 video while your TV expects 0-255.

As almost all video is stored as YCbCr, it seems like it would be best to send YCbCr to the TV, but from what I have been able to tell with my limited testing, the video is always converted to RGB and then back to YCbCr if that output is selected. I use RGB as it avoids the extra conversion.
Old 24th August 2012, 00:22   #14  |  Link
jmac698
Registered User
 
Join Date: Jan 2006
Posts: 1,867
I agree with everything the last poster said. It may be possible that some combination of player software, video card, driver, and TV is able to send true YUV data to the TV; I don't know.

I should make a deep color test clip sometime so people can test. Only a direct PC connection would work.

I kinda doubt that the TV has a true 12bit panel, even though it is accepting a 12bit connection. I'm sure there's some dither going on regardless; often panels use a type of flickering to do dither. In any case, this dithering does make some difference.

Last edited by jmac698; 24th August 2012 at 00:26.
Old 25th August 2012, 20:50   #15  |  Link
clancy688
Registered User
 
Join Date: Jul 2011
Location: Germany
Posts: 30
Thanks for the explanations. After switching to RGB, something happened which I have absolutely NO explanation for:

Like many current displays measuring 46 inches or more, my TV has the "Dirty Screen Effect". It's caused by uneven illumination of the panel and basically looks like vertical and horizontal banding. Here's a (very bad) example. My screen doesn't even remotely look like that (just two pillars in the upper middle during horizontal pans and several spots during vertical pans). You won't notice it at all with a more or less static image; you can only see it during horizontal and vertical pans over a uniform background (white fog, green grass, blue sky).
Anyway, my TV is affected as well. The effect is hardly noticeable with normal real-life footage, but it's more pronounced with soccer games and ESPECIALLY with anime. Unlike real-life footage, anime mostly doesn't have much variety in its images, and the cheaper the production, the more pans over flat, uniform areas it uses.
So when watching normal footage, I hardly saw the effect (maybe once every five minutes, for a split second). But with typical anime, I saw it basically every time there was a pan over a uniform background, which was pretty annoying with some shows. And somehow, the picture from my PC seemed to be affected more than the picture from my BD player.
But AFTER changing to RGB (from YCbCr), the problem practically disappeared. The "banding" is still there, but nowhere near as pronounced as before. Thankfully, the horizontal banding seems to be masked more than the vertical (anime mostly uses horizontal pans). The RGB option masked this negative effect very effectively, and now I'd like to know WHY. As far as I know, DSE is a panel fault, so a different video signal shouldn't change anything...?
Does anyone have any idea what magic was at work here? ^^

Last edited by clancy688; 25th August 2012 at 20:52.