25th May 2012, 10:30   #1
liquidskin76
Registered User
 
Join Date: Dec 2008
Posts: 233
To 10bit or not to 10bit!

Hey all,

Just wanted to check opinions on whether 10-bit x264 is now the way to go, bearing in mind that the decoder I'll be using (ffmpeg/XBMC) now supports 10-bit.

My main goal is to reduce file size compared to 8-bit. Is there any reason why I shouldn't crack on and start using 10-bit for my encodes from now on?

Encode times are a bit longer, but I'm not too fussed about that.
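
In case it helps, this is the sort of thing I have in mind -- assuming a 10-bit build of x264 (the binary name here is just whatever your 10-bit build happens to be called; x264 upconverts an 8-bit source internally):

Code:
x264-10bit --preset slow --crf 18 -o output.264 input.avs

The resulting stream is High 10 profile, so the decoder has to support that.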

Cheers

25th May 2012, 10:52   #2
Atak_Snajpera
RipBot264 author
 
Join Date: May 2006
Location: Poland
Posts: 7,815
http://x264.nl/x264/10bit_02-ateme-w..._bandwidth.pdf

Remember that 10-bit won't play on any other device (TVs / media players / PS3 ...), and DXVA hardware decoding won't work either.
10-bit will only work on an HTPC with software decoding.
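
If you're not sure whether a file is 10-bit, the profile gives it away: 10-bit H.264 is High 10, which is exactly what DXVA and set-top decoders won't accept. For example (the file name is just an example):

Code:
ffprobe input.mkv

and look for something like "Video: h264 (High 10), yuv420p10le" in the stream info.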

25th May 2012, 22:15   #3
liquidskin76
Registered User
 
Join Date: Dec 2008
Posts: 233
Hey Atak,

Cheers for the info -- I'd forgotten about hardware decoding.

I'll stick with 8-bit!

Cheers

25th May 2012, 22:22   #4
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,770
Honestly, I think 10-bit is a lost cause for mainstream H.264 use. Higher-precision encoding will probably come with H.265, but not before.
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book

26th May 2012, 23:25   #5
Blue_MiSfit
Derek Prestegard IRL
 
Join Date: Nov 2003
Location: Los Angeles
Posts: 5,989
I completely agree. AVC-Intra / AVC-Ultra are the only places we'll see 10-bit H.264 in any sort of widespread use.
__________________
These are all my personal statements, not those of my employer :)

27th May 2012, 14:31   #6
turab
Registered User
 
Join Date: Apr 2012
Posts: 38
10-bit encoding is actually quite popular amongst anime fansubbing groups.

30th May 2012, 01:01   #7
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,770
Quote:
Originally Posted by turab View Post
10-bit encoding is actually quite popular amongst anime fansubbing groups.
Who have been a technically influential group for many years, to be sure, but are hardly mainstream. There are way more iPads in the world than there are anime fansub watchers.
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book

31st May 2012, 04:07   #8
Audionut
Registered User
 
Join Date: Nov 2003
Posts: 1,281
Some of us don't like conforming to how Apple thinks we should do things.
__________________
http://www.7-zip.org/

31st May 2012, 04:27   #9
ajp_anton
Registered User
 
Join Date: Aug 2006
Location: Stockholm/Helsinki
Posts: 805
Unfortunately, we are a small group of people not worth making 10-bit compatible hardware for. We'll just have to wait until 10-bit becomes the next marketing trick for the masses.

1st June 2012, 06:12   #10
Mug Funky
interlace this!
 
Join Date: Jun 2003
Location: i'm in ur transfers, addin noise
Posts: 4,555
For a start, you'll need mainstream display devices that can reliably show the difference.

10-bit is good as a mezzanine format, and for some odd corner cases of the encoding world. Anime is but a genre, and fansubbers aren't the authority, though they're an interesting case. They also seem to believe that anime is the hardest thing to encode well, though this hasn't been true since the introduction of H.264; banding is only an issue if you insist on scrubbing out not only the noise, but most of the detail as well.

I would LOOOVE it if I could buy a $400 TV that is accurate to 10 bits in all the channels. Until then, 10-bit distribution is just one of those OCD settings, like --me tesa: well into the realm of diminishing returns.
__________________
sucking the life out of your videos since 2004

1st June 2012, 11:41   #11
turab
Registered User
 
Join Date: Apr 2012
Posts: 38
Quote:
Originally Posted by Mug Funky View Post
Until then, 10-bit distribution is just one of those OCD settings, like --me tesa: well into the realm of diminishing returns.
You'd be surprised at the amount of extra compression you get with certain sources. With film, I've personally noticed a 5-10% reduction in file size, so it's probably not worth it there, but with anime I hear it's more -- maybe 20-30% -- which is quite significant.
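
It's easy enough to test on your own source: run the same settings through an 8-bit and a 10-bit build and compare the file sizes (the binary names here are just examples -- use whatever your builds are called). One caveat: CRF numbers aren't exactly comparable across bit depths, so treat this as a rough test:

Code:
x264-8bit  --preset slow --crf 18 -o out-8bit.264  source.avs
x264-10bit --preset slow --crf 18 -o out-10bit.264 source.avs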

1st June 2012, 17:19   #12
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,770
Quote:
Originally Posted by Mug Funky View Post
10-bit is good as a mezzanine format...
Related question: what video tools support decoding 10-bit from 10-bit H.264 bitstreams? Alas, most encoding tools don't even have 10-bit pipelines, so I fear those that can decode 10-bit H.264 might wind up truncating to 8-bit and going from there.

But yes, it would be a GREAT mezzanine alternative to the industry standards MPEG-2 (inefficient, generally less than pristine quality) and ProRes (HUGE!).
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book

1st June 2012, 17:22   #13
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,770
Also, have x264 or other H.264 encoders looked at retaining all 10 bits of a 10-bit source up to the quantization stage, even when doing an 8-bit encode? It seems that having more precise data would enable the encoder to better determine when to dither and when not to, gaining some of the value of 10-bit encoding.

Essentially, a perceptually optimized in-loop dither.

Now that the 10-bit source pipeline exists for x264...
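
To make that concrete, here's a toy sketch of the decision I mean -- purely illustrative C with a made-up block size, nothing to do with x264's actual code: build both a rounded and a dithered 8-bit candidate from the 10-bit block and keep whichever lands closer to the source (a real encoder would fold rate into the choice):

Code:
#include <stdio.h>
#include <stdint.h>

#define BLOCK 16 /* made-up block size, purely for illustration */

/* squared error of an 8-bit candidate against the 10-bit reference */
static long long ssd_vs_10bit(const uint16_t *ref10, const uint8_t *cand8, int n)
{
    long long ssd = 0;
    for (int i = 0; i < n; i++) {
        int diff = (int)ref10[i] - ((int)cand8[i] << 2); /* rescale 8-bit to 10-bit */
        ssd += (long long)diff * diff;
    }
    return ssd;
}

int main(void)
{
    uint16_t src[BLOCK];
    uint8_t rnd[BLOCK], dth[BLOCK];
    int err = 0; /* carried error for 1-D error diffusion */

    /* fake a shallow 10-bit gradient, the kind that bands when rounded */
    for (int i = 0; i < BLOCK; i++)
        src[i] = (uint16_t)(512 + i / 3);

    for (int i = 0; i < BLOCK; i++) {
        rnd[i] = (uint8_t)((src[i] + 2) >> 2); /* candidate 1: plain rounding */
        int v = src[i] + err;                  /* candidate 2: error diffusion */
        dth[i] = (uint8_t)((v + 2) >> 2);
        err = v - (dth[i] << 2);               /* carry the truncation error */
    }

    long long e_rnd = ssd_vs_10bit(src, rnd, BLOCK);
    long long e_dth = ssd_vs_10bit(src, dth, BLOCK);

    /* a real encoder would weigh rate as well; this only compares distortion */
    printf("round: %lld, dither: %lld -> keep %s\n",
           e_rnd, e_dth, e_rnd <= e_dth ? "round" : "dither");
    return 0;
}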
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book

1st June 2012, 19:12   #14
RunningSkittle
Skittle
 
Join Date: Mar 2008
Posts: 539
x264 is 16bit internally.
http://forum.doom9.org/showpost.php?...0&postcount=24

1st June 2012, 19:23   #15
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,770
Quote:
Originally Posted by RunningSkittle View Post
x264 is 16bit internally.
Cool.

But, by default, are 10-bit sources input at 10 bits into the 16-bit mode? If the input is YV12, it's still only 8-bit.
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book

1st June 2012, 19:38   #16
mandarinka
Registered User
 
Join Date: Jan 2007
Posts: 729
Quote:
Originally Posted by RunningSkittle View Post
x264 is 16bit internally.
You are misinterpreting that. What he means there is that doing the math involved requires 16-bit integers just to store *10-bit* values and perform operations on them. Similarly, at 12-bit depth, IIRC some operations would even require moving to 32-bit integers, because 16-bit ones wouldn't have enough headroom.

As you can read there, though, the values then have to be clipped to stay within the bit depth. So saying that x264 is "internally working in 16bit" is incorrect.
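
A trivial illustration of both points (not actual x264 source -- x264 has its own pixel typedef and clip helpers):

Code:
#include <stdint.h>

/* 10-bit samples don't fit in a uint8_t, so a 16-bit integer is the
   smallest practical container for them... */
typedef uint16_t pixel;

/* ...but intermediate math can overshoot the legal range, so results
   still have to be clipped back into [0, 1023].
   e.g. clip10(900 + 200) == 1023 */
static inline pixel clip10(int v)
{
    if (v < 0)    return 0;
    if (v > 1023) return 1023;
    return (pixel)v;
}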

Last edited by mandarinka; 1st June 2012 at 19:40.

1st June 2012, 20:35   #17
MasterNobody
Registered User
 
Join Date: Jul 2007
Posts: 552
x264's filter chain works at 8- or 16-bit depth depending on the input and filters. Before video is sent to libx264 (the actual encoding), it is converted to 8- or 10-bit according to the build version (the downconversion is dithered). So no, the actual dithering happens before encoding, not in-loop.
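
Roughly along these lines (an illustrative sketch of a dithered 16-bit to 8-bit downconversion, not x264's actual filter code):

Code:
#include <stdio.h>
#include <stdint.h>

/* Take 16-bit samples down to 8-bit, spreading the truncation error
   with a small repeating offset pattern instead of plain rounding,
   so shallow gradients don't collapse into bands. */
static void depth_down_16to8(uint8_t *dst, const uint16_t *src, int n)
{
    static const int offset[4] = { 32, 160, 96, 224 }; /* out of 256 */
    for (int i = 0; i < n; i++) {
        int v = ((int)src[i] + offset[i & 3]) >> 8;
        dst[i] = v > 255 ? 255 : (uint8_t)v;
    }
}

int main(void)
{
    /* a shallow 16-bit ramp: plain >>8 would map it all to 16 */
    uint16_t ramp[8] = { 4200, 4210, 4220, 4230, 4240, 4250, 4260, 4270 };
    uint8_t out[8];
    depth_down_16to8(out, ramp, 8);
    for (int i = 0; i < 8; i++)
        printf("%d ", out[i]); /* prints a mix of 16s and 17s */
    printf("\n");
    return 0;
}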

1st June 2012, 20:58   #18
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,770
So, what I'd love to see is 10-bit input into quantization, and the final dithering decision being made at that stage based on what will look and compress the best.
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book

2nd June 2012, 00:32   #19
kieranrk
Registered User
 
Join Date: Jun 2009
Location: London, United Kingdom
Posts: 707
Quote:
Originally Posted by benwaggoner View Post
So, what I'd love to see is 10-bit input into quantization, and the final dithering decision being made at that stage based on what will look and compress the best.
It was discussed briefly at the VideoLAN Developer Days last year, but that's about it.

3rd June 2012, 03:48   #20
Sapo84
Registered User
 
Join Date: May 2008
Posts: 40
Quote:
Originally Posted by Mug Funky View Post
They also seem to believe that anime is the hardest thing to encode well, though this hasn't been true since the introduction of H.264; banding is only an issue if you insist on scrubbing out not only the noise, but most of the detail as well.
An interesting point of view, given that banding is present in (many) Crunchyroll encodes, (many) transport streams, and (lots of) Blu-rays (Madoka and the -monogatari series are fine examples). Sometimes there is banding even in the master itself (Madoka, confirmed by mp3dom).

That means fansubbers have to deband even when they would prefer not to (on an already grainy BD source), and 10-bit helps a lot in preserving the gradients without going insane with debanding/addgrain.
It's not that anime is difficult to encode; the problem lies in the fact that it is often produced by incompetent people.

And that's why the good groups started using 10-bit; the vast majority of clueless rippers/encoders just followed because 10-bit was the next cool thing.
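
To put rough numbers on why 10-bit preserves gradients: a dark gradient covering a tenth of the luma range has only about 22 code values to work with in 8-bit video (16-235), so each band is clearly visible on a decent display; at 10-bit (64-940) the same ramp gets roughly four times as many steps.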