31st October 2010, 04:53   #21
Dark Shikari
x264 developer
Quote:
Originally Posted by popper
Actually, DS, perhaps you might ask Francois about the SB decoder? I think it will/might support it, as it's not a fixed-function ASIC decoding device such as NV and AMD seem to use now, and he could probably work it into the patch, if it's not already there, if this ever goes through.
I suspect the SB is more like the most recent iteration of Broadcom's VideoCore; it's not a "pure ASIC", but rather built out of programmable blocks. So, for example, for a motion search, it could load some region of pixels and perform arbitrary checks against it. But that region of pixels still likely has to be 8-bit.
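
To make "arbitrary checks against a region of pixels" concrete, here's a minimal sketch (my own illustration, not actual VideoCore or SB code) of the sum-of-absolute-differences cost a motion search computes for one candidate block, assuming plain 8-bit samples:

Code:
#include <stdint.h>
#include <stdlib.h>

/* Cost of one candidate motion vector: sum of absolute differences
 * between a 16x16 source block and the block the vector points at in
 * the reference frame. Both inputs are 8-bit samples. */
static unsigned sad_16x16(const uint8_t *src, int src_stride,
                          const uint8_t *ref, int ref_stride)
{
    unsigned sad = 0;
    for (int y = 0; y < 16; y++) {
        for (int x = 0; x < 16; x++)
            sad += abs(src[x] - ref[x]);
        src += src_stride;
        ref += ref_stride;
    }
    return sad;
}
A hardware block that can run this kind of check over a loaded pixel region is programmable in the sense above, yet still tied to the 8-bit sample layout.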

22nd June 2011, 04:30   #22
lovelove
Registered User
I come from image editing, and I was wondering: why did you go from 8 to 10 bit instead of directly to 16 bit?

As far as (still) images are concerned, a higher bit depth (16 [or 10] bit compared to 8 bit) is only needed if 8 bit would result in banding/posterization artefacts. The price, however, is a bigger file size.

So how do you end up saying "10 bit gives better quality at the same bitrate due to greater precision"? IMO, the opposite is the case: 10 bit gives marginally better quality (probably only distinguishable for gradients with banding) but needs a higher bitrate (due to greater precision). At the same bitrate, I suppose 8 bit would still have better quality.

22nd June 2011, 06:00   #23
Dark Shikari
x264 developer
Quote:
Originally Posted by lovelove
I come from image editing, and I was wondering: why did you go from 8 to 10 bit instead of directly to 16 bit?
Because each extra bit of intermediate precision cuts 50% off the amount of compression lost due to intermediate rounding, so after 2 extra bits you've eliminated 1 - (1/2)^2 = 75% of the inefficiency.
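
If you want to see the halving directly, here's a toy experiment (mine, not x264 code): round random values to n fractional bits and measure the mean error; rounding to nearest leaves about 2^-(n+2) on average, so each extra bit cuts it in half.

Code:
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

int main(void)
{
    const int trials = 1000000;
    for (int bits = 8; bits <= 12; bits++) {
        double scale = (double)(1 << bits);
        double err = 0.0;
        for (int i = 0; i < trials; i++) {
            double x = rand() / (double)RAND_MAX; /* exact value in [0,1] */
            double q = round(x * scale) / scale;  /* keep 'bits' fractional bits */
            err += fabs(x - q);
        }
        /* the mean error halves with each extra bit */
        printf("%2d bits: mean rounding error %.2e\n", bits, err / trials);
    }
    return 0;
}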

Also, 10-bit is the highest bit depth at which luma motion compensation can be implemented without unpacking to 32-bit.
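
A back-of-the-envelope check of where that limit comes from (my own sketch, using the spec's 6-tap (1, -5, 20, 20, -5, 1) luma half-pel filter): the first filter pass's intermediate sum has to stay representable in a 16-bit lane.

Code:
#include <stdio.h>

int main(void)
{
    /* H.264's 6-tap luma half-pel interpolation filter */
    const int taps[6] = {1, -5, 20, 20, -5, 1};

    for (int b = 8; b <= 12; b++) {
        int maxval = (1 << b) - 1;            /* largest b-bit sample */
        int hi = 0, lo = 0;
        for (int i = 0; i < 6; i++) {
            if (taps[i] > 0) hi += taps[i] * maxval;
            else             lo += taps[i] * maxval;
        }
        /* the sum spans [lo, hi]; with a bias it fits a 16-bit
         * lane only while hi - lo < 65536 */
        printf("%2d-bit input: range [%6d, %6d] %s in 16 bits\n",
               b, lo, hi, (hi - lo) < 65536 ? "fits" : "does NOT fit");
    }
    return 0;
}
At 10-bit input the span is 53196, the last depth under 65536; at 11-bit it's 106444, which forces unpacking to 32-bit.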

Also, the spec is capped at 14-bit.

22nd June 2011, 15:58   #24
dukey
Registered User
What's the point of 10-bit video? Most LCD displays only use 6 bits per channel and dither the result, so it seems like massive overkill to me. In DirectX or OpenGL you can render to 16- or 32-bit-per-channel buffers, but the final result is always clamped down for display.

22nd June 2011, 19:53   #25
Dark Shikari
x264 developer
Quote:
Originally Posted by dukey
What's the point of 10-bit video? Most LCD displays only use 6 bits per channel and dither the result, so it seems like massive overkill to me. In DirectX or OpenGL you can render to 16- or 32-bit-per-channel buffers, but the final result is always clamped down for display.
10-bit is the intermediate precision. The output precision isn't important.

Audio has used 32-bit intermediate precision (in the form of float) for decades even though the output is only 16-bit.
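
A toy illustration of the same point (mine, nothing codec-specific): add 0.4 of a code value a hundred times. Round to output precision at every step and the change rounds away each time; keep a finer intermediate and round once at the end, and it survives.

Code:
#include <stdio.h>
#include <math.h>

int main(void)
{
    double f = 20.0; /* high-precision intermediate */
    int    q = 20;   /* rounded to output precision at every step */

    for (int i = 0; i < 100; i++) {
        f += 0.4;
        q = (int)lround(q + 0.4); /* 20.4 rounds back to 20, every time */
    }
    /* prints: float intermediate: 60   per-step rounding: 20 */
    printf("float intermediate: %ld   per-step rounding: %d\n",
           lround(f), q);
    return 0;
}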

22nd June 2011, 20:21   #26
me7
Registered User
The final version of Libav 0.7 was released. I guess widespread support of 10-bit AVC is underway.

23rd June 2011, 11:27   #27
CruNcher
Registered User
Quote:
Originally Posted by dukey
What's the point of 10-bit video? Most LCD displays only use 6 bits per channel and dither the result, so it seems like massive overkill to me. In DirectX or OpenGL you can render to 16- or 32-bit-per-channel buffers, but the final result is always clamped down for display.
You see, the industry has something "new" to bring to consumers after the 3D hype: the Ultra Precision Display, or UPD for short.