Old 2nd January 2013, 21:23   #1  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,750
How are you playing back 10-bit H.264 in 10-bit?

So, we've had a lot of talk about the value of 10-bit encoding.

How about 10-bit playback? How are people displaying 10-bit output?

I've got a 10/12-bit capable display hooked up via DisplayPort to a GeForce GTX 570 in a Win7 system. That's all supposed to be able to play back a full 10-bit contrast range.

But what player will actually get all 10 precious bits to the display? What are the rest of you guys using to test this?
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book
Old 2nd January 2013, 23:20   #2  |  Link
mandarinka
Registered User
 
Join Date: Jan 2007
Posts: 729
Don't you need Quadro drivers to output 10+ bits to the monitor?
In any case, normal people obviously use the normal players, without any surprises. It just depends on whether you use LAV Video to dither down, or leave it to madVR, which is in theory a bit better since it goes straight to RGB after scaling.
Old 2nd January 2013, 23:48   #3  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,750
Quote:
Originally Posted by mandarinka View Post
Don't you need Quadro drivers to output 10+ bits to the monitor?
In any case, obviously normal people use the normal players, without any surprises. It just depends if you use LAV video to dither down, or if you leave it to MadVR which is in theory a bit better, since it goes straight to RGB after scaling.
You need Quadro to do 10-bit OpenGL, but DirectX should work fine on both Geforce and Quadro, apparently:

http://nvidia.custhelp.com/app/answe...3011/related/1

Quote:
NVIDIA Geforce graphics cards have offered 10-bit per color out to a full screen Direct X surface since the Geforce 200 series GPUs. Due to the way most applications use traditional Windows API functions to create the application UI and viewport display, this method is not used for professional applications such as Adobe Premiere Pro and Adobe Photoshop. These programs use OpenGL 10-bit per color buffers which require an NVIDIA Quadro GPU with DisplayPort connector.
So, I think I need a player with 30-bit DirectX support.

I guess I could get a Quadro board, but it'd be pretty expensive to match the power of my SLI 570's to get just this one feature...
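In the meantime, here's a quick DXGI sketch I could use as a sanity check (untested, so treat it as an assumption about how the enumeration behaves): it just asks each output how many full-screen modes it exposes in DXGI_FORMAT_R10G10B10A2_UNORM, which is the 10-bit surface format a 30-bit DirectX player would presumably render to.

Code:
// Count full-screen 10-bit (R10G10B10A2) display modes per output.
// Build (MSVC): cl /EHsc check10bit.cpp
#include <windows.h>
#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory1* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&factory)))
        return 1;

    IDXGIAdapter1* adapter = nullptr;
    for (UINT a = 0; factory->EnumAdapters1(a, &adapter) != DXGI_ERROR_NOT_FOUND; ++a) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        wprintf(L"Adapter %u: %s\n", a, desc.Description);

        IDXGIOutput* output = nullptr;
        for (UINT o = 0; adapter->EnumOutputs(o, &output) != DXGI_ERROR_NOT_FOUND; ++o) {
            UINT modes = 0;  // pass nullptr for the mode array to query only the count
            output->GetDisplayModeList(DXGI_FORMAT_R10G10B10A2_UNORM, 0, &modes, nullptr);
            wprintf(L"  Output %u: %u full-screen R10G10B10A2 modes\n", o, modes);
            output->Release();
        }
        adapter->Release();
    }
    factory->Release();
    return 0;
}

Even if modes show up there, of course, that only tells me the driver will create the surface; whether 10 bits actually reach the panel is a separate question.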
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book
Old 3rd January 2013, 02:12   #4  |  Link
mandarinka
Registered User
 
Join Date: Jan 2007
Posts: 729
MadVR works in DirectX mode. If you are right, then basically the only change needed there would be dithering to a 10-bit RGB result instead of 8-bit at the end.
Old 3rd January 2013, 08:12   #5  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,342
Don't forget that a big benefit of 10-bit encoding is that your files actually get smaller, so even on 8-bit output with proper dithering there is still an advantage (which is the main advantage all those anime people try to leverage when doing 10-bit encodes).

In any case, madshi plans to add 10-bit output support to his madVR eventually; maybe he can also make it work on GeForce GPUs then.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
Old 3rd January 2013, 19:21   #6  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,750
Quote:
Originally Posted by nevcairiel View Post
Don't forget that a big value of 10-bit encoding is that your files actually get smaller, so even on 8-bit output with proper dithering there is still a advantage here (which is the main advantage all those anime people try to leverage when doing 10-bit encodes)
Understood. But I'm trying to figure out what the potential quality advantage is of using 10-bit sources with 10-bit encoding and then a 10-bit display. Getting rid of dithering on output could be a meaningful change. I can see that it could improve quality, particularly with gradients. On the other hand, dithering noise could mask subtle quality issues in a 10-bit image.
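To make the gradient point concrete, here's a toy sketch I put together (nothing like madVR's actual dithering, which is far more sophisticated; purely illustrative): quantize a shallow 10-bit ramp to 8 bits by plain rounding, and again with half an LSB of random noise added before rounding. The rounded version comes out as a handful of long flat bands, while the dithered version is fine grain with no long runs, and its local average still tracks the 10-bit values.

Code:
// Quantize a shallow 10-bit luma ramp (codes 512..543) to 8 bits two ways and
// report the longest run of identical output pixels: long runs = visible bands.
#include <cstdio>
#include <random>
#include <vector>

static int longest_run(const std::vector<int>& v) {
    int best = 1, run = 1;
    for (size_t i = 1; i < v.size(); ++i) {
        run = (v[i] == v[i - 1]) ? run + 1 : 1;
        if (run > best) best = run;
    }
    return best;
}

int main() {
    const int width = 4096;
    std::mt19937 rng(1);
    std::uniform_real_distribution<double> noise(-0.5, 0.5);

    std::vector<int> rounded(width), dithered(width);
    for (int x = 0; x < width; ++x) {
        double code10 = 512.0 + 31.0 * x / (width - 1); // 10-bit ramp value
        double ideal8 = code10 / 4.0;                   // what 8 bits should show
        rounded[x]  = static_cast<int>(ideal8 + 0.5);              // plain rounding
        dithered[x] = static_cast<int>(ideal8 + 0.5 + noise(rng)); // dithered rounding
    }
    printf("longest flat run: rounded %d px, dithered %d px\n",
           longest_run(rounded), longest_run(dithered));
    return 0;
}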

Quote:
In any case, madshi plans to add 10-bit output support to his madVR eventually, maybe he can also make it work on Geforce GPUs then.
So, is the only way anyone knows of to actually see the full 10-bit luma range to look at the file in Premiere Pro or CS6 with a Quadro card?

For all the buzz around 10-bit, I'd assumed some people were actually looking at the files in their full glory somewhere.
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book
Old 4th January 2013, 12:05   #7  |  Link
JEEB
もこたんインしたお!
 
Join Date: Jan 2008
Location: Finland / Japan
Posts: 512
The two things you bring up are separate; that's the first thing I'd like to point out.
  • A fully 10bit YCbCr playback chain up until the RGB conversion. This is attainable right now through mplayer2/mpv with the gl3/opengl renderer, as well as madVR. Both can take 10bit output from the decoder and then convert it to RGB (possibly with dither etc.; madVR is still limited to 8bit RGB output).
  • 10bit RGB output from 10bit YCbCr. This is not really /that/ important, as 8bit BT.709 YCbCr is like 1/4 of 8bit RGB's colors to begin with, even with full range content. So if there really are more colors encoded in the 10bit encode, they should be visible just fine on a proper 8bit RGB monitor/connection as well. This is why I generally try to separate these two things and tell people that they do not need 10bit monitors to enjoy 10bit encodes. Yes, I know just comparing color counts without taking a couple of other variables into account isn't fully correct, but you get my point, methinks.

But let's put that aside. You want 10bit YCbCr converted to 10bit RGB. To my knowledge, wm4's OpenGL-based renderer in mplayer2/mpv (gl3 in mplayer2, opengl/opengl-hq in mpv) should be able to do that just fine. MPC-HC's custom EVR-based renderer can also output 10bit RGB, but it cannot take 10bit YCbCr input yet.
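If it helps, the colorimetry of that step is just a small matrix, roughly like the sketch below (limited-range 10bit BT.709 in, full-range 10bit RGB out; it ignores chroma upsampling, gamma and everything else a real renderer does, and the input triplet is made up):

Code:
// Limited-range 10-bit BT.709 Y'CbCr -> full-range 10-bit R'G'B', i.e. the core
// of the "10bit in, 10bit RGB out" step discussed above.
#include <cstdio>

struct RGB10 { int r, g, b; };

static RGB10 ycbcr10_to_rgb10(int y, int cb, int cr) {
    // Limited range: Y' spans 64..940, Cb/Cr span 64..960 (centered on 512).
    double ey  = (y  - 64)  / 876.0;
    double ecb = (cb - 512) / 896.0;
    double ecr = (cr - 512) / 896.0;

    // BT.709 inverse matrix (Kr = 0.2126, Kb = 0.0722).
    double r = ey + 1.5748 * ecr;
    double g = ey - 0.1873 * ecb - 0.4681 * ecr;
    double b = ey + 1.8556 * ecb;

    // Scale to full-range 10-bit and clip out-of-gamut values.
    auto q = [](double v) {
        int i = static_cast<int>(v * 1023.0 + 0.5);
        return i < 0 ? 0 : (i > 1023 ? 1023 : i);
    };
    return { q(r), q(g), q(b) };
}

int main() {
    RGB10 px = ycbcr10_to_rgb10(721, 400, 600);   // arbitrary sample triplet
    printf("R'=%d G'=%d B'=%d (10-bit)\n", px.r, px.g, px.b);
    return 0;
}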
__________________
[I'm human, no debug]
Old 4th January 2013, 15:32   #8  |  Link
poisondeathray
Registered User
 
Join Date: Sep 2007
Posts: 5,346
Quote:
Originally Posted by JEEB View Post
• 10bit RGB output from 10bit YCbCr. This is not really /that/ important, as 8bit BT.709 YCbCr is like 1/4 of 8bit RGB's colors to begin with even with full range content. So if there really are more colors encoded in the 10bit encode, it should be visible just fine on a proper 8bit RGB monitor/connection as well. This is why in general I try to separate these two things, and tell people that they do not need 10bit monitors to enjoy 10bit encodes of things. Yes, I know just comparing color amounts without taking into account a couple of other variables isn't fully correct, but you get my point methinks.
It's the other way around, isn't it? The sRGB cube is fully contained within Y'CbCr space, i.e. even 8bit Y'CbCr contains more potential colors than 8bit RGB. This is the reason behind "wide gamut" displays, and ITU-R BT.1361.

http://software.intel.com/sites/products/documentation/hpc/ipp/ippi/ippi_ch6/ch6_color_models.html

Gavino's pretty function
http://forum.doom9.org/showthread.php?t=154731
Old 4th January 2013, 16:02   #9  |  Link
JEEB
もこたんインしたお!
 
Join Date: Jan 2008
Location: Finland / Japan
Posts: 512
Quote:
Originally Posted by poisondeathray View Post
It's the other way around isn't it? The sRGB cube is fully contained with Y'CbCr space . ie. even 8bit Y'CbCr contains more potential colors than 8bit RGB . This is the reason behind "wide gamut" displays, and ITU Rec.1361
I did kind of misrepresent it, I guess. If you convert all of the 8bit YCbCr (full range, BT.709) values to 8bit RGB (see the source code on Chikuzen's blog), only ~1/4 of the available values end up used ((2**8)*(2**8)*(2**8) = 16,777,216 with 8bit RGB).

Of course this is far from a perfect comparison in any way (gamma stuff comes to mind first of all), and is mostly meant to show that 10bit YCbCr (esp. limited range content) can be watched on an 8bit RGB screen quite fine.
__________________
[I'm human, no debug]

Old 4th January 2013, 16:26   #10  |  Link
poisondeathray
Registered User
 
Join Date: Sep 2007
Posts: 5,346
Quote:
Originally Posted by JEEB View Post
I did kind of misrepresent it I guess. If you convert to 8bit RGB from all of the 8bit YCbCr (full range, BT.709) values (see the source code on Chikuzen's blog), you get only ~1/4 of the available values used ((2**8)*(2**8)*(2**8) = 16,777,216 with 8bit RGB).

Of course this is far from a perfect thing in any way (gamma stuff comes to mind first of all), and is mostly meant to show that 10bit YCbCr (esp. limited range content) can be watched on a 8bit RGB screen quite fine.

I think you're looking at it the wrong way.

It's the opposite. An 8bit RGB screen cannot display all 8bit Y'CbCr values. You need a 10bit RGB screen to display (almost) all 8bit Y'CbCr values (legal or otherwise).

When you convert from 8bit Y'CbCr to 8bit RGB (see Chikuzen's blog), you only use a fraction of the values - you said this yourself. Thus many values are not represented and are discarded (they are not visible). These don't have a legal "mapping" in 8bit RGB space (the 8bit Y'CbCr cube is larger than the 8bit RGB cube). All values of 8bit RGB can "map" to Y'CbCr space, but the reverse isn't true.
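You can brute-force the number too, by the way. The toy sketch below walks every full-range 8bit BT.709 Y'CbCr triplet and counts how many convert to R'G'B' inside [0,1] without clipping (ignoring gamma and rounding, as JEEB said); it should land right around the ~1/4 figure quoted above.

Code:
// Count how many full-range 8-bit BT.709 Y'CbCr triplets convert to an
// R'G'B' triplet inside [0,1], i.e. are displayable without clipping.
#include <cstdio>

int main() {
    long long inside = 0;
    const long long total = 256LL * 256 * 256;
    for (int y = 0; y < 256; ++y)
        for (int cb = 0; cb < 256; ++cb)
            for (int cr = 0; cr < 256; ++cr) {
                double ey  = y / 255.0;              // full range, no 16..235 headroom
                double ecb = (cb - 128) / 255.0;
                double ecr = (cr - 128) / 255.0;

                double r = ey + 1.5748 * ecr;        // BT.709 inverse matrix
                double g = ey - 0.1873 * ecb - 0.4681 * ecr;
                double b = ey + 1.8556 * ecb;

                if (r >= 0 && r <= 1 && g >= 0 && g <= 1 && b >= 0 && b <= 1)
                    ++inside;
            }
    printf("%lld of %lld triplets (%.1f%%) fit inside the RGB cube\n",
           inside, total, 100.0 * inside / total);
    return 0;
}

As far as I can tell that fraction is just the determinant of the RGB-to-Y'CbCr matrix (up to edge effects), which is also why it doesn't really depend on the bit depth.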
Old 4th January 2013, 17:02   #11  |  Link
JEEB
もこたんインしたお!
 
Join Date: Jan 2008
Location: Finland / Japan
Posts: 512
Quote:
Originally Posted by poisondeathray View Post
I think you're looking it the wrong way...
Not really, I just put it a different way. Yes, 8bit RGB doesn't contain everything from the 8bit YCbCr space, but IMHO neither would 10bit RGB (feel free to show me I'm incorrect), because the colorspaces are just that different.

Just trying not to hand fuel to those idiots who talk about 10bit screens being needed for "proper" 10bit H.264 playback and such :P

Anyway, yes -- different colorspaces, and YCbCr is wider (many valid YCbCr values end up being truncated when converting to RGB). More (actual) bits on the screen is better.
__________________
[I'm human, no debug]
Old 4th January 2013, 18:38   #12  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,750
Quote:
Originally Posted by JEEB View Post
Not really, just put it in a different way. Yes, 8bit RGB doesn't contain everything from the 8bit YCbCr space, but neither IMHO would 10bit RGB (feel free to show me incorrect) because the colorspaces are just that different.
Hence xvYCC (http://en.wikipedia.org/wiki/XvYCC), which expands the gamut to a superset of Rec. 709 while incorporating most of sRGB.

Quote:
Just trying to not make this be some fuel to those idiots who talk about 10bit screens being needed for "proper" 10bit H.264 playback and such :P
Well, I'd think that >8-bit sRGB would be helpful in reducing banding at least.

Quote:
Anyways, yes -- different colorspaces and YCbCr is wider (many valid YCbCr values end up being truncated when converting to RGB). More (actual) bits on the screen is better.
Well, we need to be clear about terminology here. Precision and gamut are orthogonal. You could have 8-bit xvYCC or 10-bit Rec. 709, with different strengths and weaknesses.

Anyone know of a good link to the vui.txt file referenced in this? http://forum.doom9.org/showthread.php?t=101058?
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book
Old 4th January 2013, 18:56   #13  |  Link
JEEB
もこたんインしたお!
 
Join Date: Jan 2008
Location: Finland / Japan
Posts: 512
Quote:
Originally Posted by benwaggoner View Post
Anyone know of a good link to the vui.txt file referenced in this? http://forum.doom9.org/showthread.php?t=101058?
This should work to get the current version, until VideoLAN possibly migrates to another web git interface.

And yes, naturally -- there are different gamuts as well as precision formats. And I do agree that 10bit sRGB / 8bit xvYCC screens and connections would indeed show 10bit YCbCr content better than 8bit sRGB screens (on the basis of theory rather than implementation, since I have no 10bit sRGB monitors nor calibration equipment to actually make any good observations on this matter).
__________________
[I'm human, no debug]

Tags: 10-bit, hi10p
