Doom9's Forum > Video Encoding > New and alternative video codecs

Old 5th December 2019, 09:10   #2021  |  Link
Blue_MiSfit
Derek Prestegard IRL
 
Join Date: Nov 2003
Location: Los Angeles
Posts: 5,575
I didn't mean to imply that encoding 8 bit content in 10 bit AV1 would be the right thing to do.

I just meant that, given that premium content is almost always >= 10-bit 4:2:2, it makes sense to preserve 10 bit, especially when picky studios are highly critical of how subtle gradients are handled in their content.

Last edited by Blue_MiSfit; 5th December 2019 at 09:14.
Old 5th December 2019, 12:58   #2022  |  Link
soresu
Registered User
 
Join Date: May 2005
Location: Swansea, Wales, UK
Posts: 123
Quote:
Originally Posted by Blue_MiSfit View Post
I also hope that the dav1d team focuses more on 10 bit soon. I think social media / user generated content is a huge use case for AV1, and most of that content is 8 bit for now.
It's a shame Google pursued libgav1 rather than shunting engineer time to dav1d; they could move much faster on all fronts with some serious money behind them.

I get the whole "competition is good" angle, but it doesn't seem to have made much of a difference, other than prompting the dav1d team to shore up its ARM32 priorities - and even there, dav1d now seems soundly ahead on all fronts again.
Old 5th December 2019, 13:00   #2023  |  Link
soresu
Registered User
 
Join Date: May 2005
Location: Swansea, Wales, UK
Posts: 123
Quote:
Originally Posted by utack View Post
Maybe I'm beating a dead horse at this point, but dav1d having a milestone for better PPC support and no word about adding basic 10-bit support seems extremely odd.
Netflix has signalled they are only interested in 10-bit content, and YouTube has started encoding 10-bit for its new higher-resolution videos as well.
It should clearly be a priority over ARMv7 and PPC assembly, and IMHO the "make it fast" milestones are not all reached yet.
They have basic 10-bit support, I think - just very little SIMD assembly to accelerate it beyond the C code.
Old 6th December 2019, 02:03   #2024  |  Link
Blue_MiSfit
Derek Prestegard IRL
 
Join Date: Nov 2003
Location: Los Angeles
Posts: 5,575
Regarding Google working on their own AV1 decoder, I'd imagine this has to do with being the masters of their own destiny so they can use the decoder however they want on Android / anywhere else, free from any potential licensing incompatibility with dav1d.
Old 6th December 2019, 09:23   #2025  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 9,835
Quote:
Originally Posted by Blue_MiSfit View Post
Regarding Google working on their own AV1 decoder, I'd imagine this has to do with being the masters of their own destiny so they can use the decoder however they want on Android / anywhere else, free from any potential licensing incompatibility with dav1d.
dav1d has one of the most liberal licenses anywhere (2-clause BSD), and Android already uses libraries under far stricter licenses.

Never mind that Chrome already uses dav1d as well, so it's not like Google somehow doesn't know about it.

The reason is probably some BS politics with no logical backing.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
Old 7th December 2019, 02:31   #2026  |  Link
Blue_MiSfit
Derek Prestegard IRL
 
Join Date: Nov 2003
Location: Los Angeles
Posts: 5,575
Hmm. That's unfortunate; I wonder if we'll ever know the real story here.
Old 9th December 2019, 23:23   #2027  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 2,984
Quote:
Originally Posted by Blue_MiSfit View Post
I didn't mean to imply that encoding 8 bit content in 10 bit AV1 would be the right thing to do.

I just meant that, given that premium content is almost always >= 10-bit 4:2:2, it makes sense to preserve 10 bit, especially when picky studios are highly critical of how subtle gradients are handled in their content.
Premium content is NOT >= 10-bit 4:2:2: 10-bit 4:2:0 is all that anyone is distributing to consumers via streaming and disc. The masters certainly are, but it's not as if they look visually better than distribution codecs at a sufficient bitrate.

Sent from my SM-T837V using Tapatalk
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book
Old 9th December 2019, 23:25   #2028  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 2,984
Quote:
Originally Posted by huhn View Post
Isn't 10-bit HEVC still generally better than 8-bit, just with a very, very small difference?

And what about the whole story of the 8bpp and 16bpp builds of x265? Wasn't higher internal precision an important part of why 10-bit x264 was so much better than 8-bit?

Is the 8bpp build of x265 even still alive?

How does AV1 handle this? Is it even comparable?

Isn't a fully 8-bit quality pipeline absolutely impossible anyway? Just the levels/RGB conversion pretty much proves it, or even the chroma upscale.
There is a significant but shrinking number of devices, mainly handheld, that support 8-bit HEVC but not 10-bit.

More common are devices with 10-bit decoders that feed into 8-bit compositors, GPUs, or display controllers.
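On the levels-conversion point in the quote above: a tiny sketch of why a purely 8-bit pipe has to lose codes when expanding limited-range luma (16-235) to full range (0-255). The BT.601/709 limited-range levels are from the standards; the rest is illustrative only:

```python
import numpy as np

# All 220 legal limited-range luma codes (16-235), expanded to full range
# (0-255) while staying in 8 bits. The scale factor 255/219 > 1 means the
# mapping is injective but sparse: some output codes can never be produced,
# so smooth gradients pick up gaps after the conversion.
limited = np.arange(16, 236)
full = np.round((limited - 16) * 255.0 / 219.0).astype(int)
missing = sorted(set(range(256)) - set(full.tolist()))
print(len(missing))  # -> 36 output codes that no legal input can reach
```

220 inputs can only ever produce 220 of the 256 full-range codes, which is one concrete way an 8-bit pipeline introduces quantization steps that a 10-bit intermediate would absorb.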

Sent from my SM-T837V using Tapatalk
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book
Old 10th December 2019, 00:52   #2029  |  Link
foxyshadis
ангел смерти
 
Join Date: Nov 2004
Location: Lost
Posts: 9,420
Quote:
Originally Posted by benwaggoner View Post
Do we have any evidence that 8-bit sources encode better in 10-bit than 8-bit in AV1?
If nothing else, 8b RGB to 10b YUV to 8b RGB is substantially higher quality than 8b YUV to 8b RGB (especially near white and black), and 10b YUV can decode to higher-depth RGB wherever the panel supports it. It sure beats having to deband afterward, or drop in a shedload of noise (and bits) to hide banding. Those with 6b RGB panels will continue to wail, because hardware companies refuse to incorporate simple tricks like dithering on budget hardware until they're promoted to a requirement, but that can't be helped.

It's true that AV1 and HEVC fixed AVC's rather staggering difference in quality between 8- and 10-bit encoding, but there's still some utility if the toolchain wants to keep quality at the forefront.
__________________
There are four boxes to be used in defense of liberty: soap, ballot, jury, and ammo. Please use in that order.
Old 10th December 2019, 03:07   #2030  |  Link
Blue_MiSfit
Derek Prestegard IRL
 
Join Date: Nov 2003
Location: Los Angeles
Posts: 5,575
Quote:
Originally Posted by benwaggoner View Post
Premium content is NOT >=10-bit 4:2:2. 10-bit 4:2:0 is all that anyone is distributing to consumers via streaming and disc.
Of course. That's what I said (specifically referencing masters being >= 10 bit)

Last edited by Blue_MiSfit; 10th December 2019 at 03:09.
Old 10th December 2019, 20:29   #2031  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 2,984
Quote:
Originally Posted by Blue_MiSfit View Post
Of course. That's what I said (specifically referencing masters being >= 10 bit)
Ah, my apologies for the confusion, then.

Sent from my SM-T837V using Tapatalk
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book
Old 10th December 2019, 22:44   #2032  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,971
Quote:
Originally Posted by benwaggoner View Post
There is a significant but shrinking number of devices, mainly handheld, that support 8-bit HEVC but not 10-bit.

More common are devices with 10-bit decoders that feed into 8-bit compositors, GPUs, or display controllers.
I don't see anything here that diminishes the benefit of using a higher bit depth for encoding.
Quote:
Originally Posted by foxyshadis View Post
If nothing else, 8b RGB to 10b YUV to 8b RGB is substantially higher quality than 8b YUV to 8b RGB (especially near white and black), and 10b YUV can decode to higher depth RGB wherever the panel supports it. It sure beats having to deband afterward, or drop in a shedload of noise (and bits) to hide banding. Those with 6b RGB panels will continue to wail, because hardware companies refuse to incorporate simple tricks like dithering on budget hardware until they're promoted as a requirement, but that's can't be helped.
The fact that AMD supports native 6-bit output with its own dithering, which helps a lot of suboptimal screens, shows the benefit of doing this correctly.
The best panel I have ever tested in terms of banding is a 6-bit FRC panel; the processing just doesn't add banding on such a "bad" device.

And I totally agree: using more bits in encoding, so that dithering is delayed until the end device (or at least its presentation stage), is a clear benefit.

Reading about 4:2:2 masters just makes me wonder whether this is a bad habit that simply never stopped, rather than using 16-bit RGB or higher - even the aged BBB has this as a master - and what reason there could be not to do that.
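The dithering point is easy to visualize. A minimal sketch using ordered (Bayer) dithering on a grayscale ramp; the `to_6bit` helper is hypothetical, and real display paths often use temporal dithering (FRC) or error diffusion instead:

```python
import numpy as np

# Classic 4x4 Bayer matrix, scaled to per-pixel thresholds in [0, 1)
BAYER4 = np.array([[ 0,  8,  2, 10],
                   [12,  4, 14,  6],
                   [ 3, 11,  1,  9],
                   [15,  7, 13,  5]]) / 16.0

def to_6bit(img8, dither=True):
    """Reduce an 8-bit grayscale image to 6-bit code values."""
    h, w = img8.shape
    x = img8 / 255.0 * 63.0                        # rescale 0..255 -> 0..63
    if not dither:
        return np.round(x).clip(0, 63)             # plain rounding: flat bands
    t = np.tile(BAYER4, (h // 4 + 1, w // 4 + 1))[:h, :w]
    return np.floor(x + t).clip(0, 63)             # threshold varies per pixel

# A smooth horizontal ramp: rounding produces 64 visible bands, while the
# dithered version carries the in-between levels as fine spatial noise.
ramp = np.tile(np.linspace(0.0, 255.0, 1024), (16, 1))
banded = to_6bit(ramp, dither=False)
dithered = to_6bit(ramp, dither=True)
```

Averaged over small neighborhoods, the dithered output tracks the original ramp far more closely than the banded one, which is exactly why deferring the final quantization to a dithering step at the display end preserves gradients.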
Old Yesterday, 01:15   #2033  |  Link
Blue_MiSfit
Derek Prestegard IRL
 
Join Date: Nov 2003
Location: Los Angeles
Posts: 5,575
I always figured the 10-bit 4:2:2 thing most likely goes back to SDI / HD-SDI, which was traditionally 10-bit 4:2:2. Capturing the full bandwidth of that signal into a file-based format in a perceptually lossless way has long been considered "good enough", and widely used mezzanine formats like ProRes do a great job of it - especially in the old days of 50-80 Mbps 1080p MPEG-2 service masters.

Given that final distribution was always 8-bit 4:2:0 until recently, perceptually lossless 10-bit 4:2:2 was indeed always good enough.

Now that we're doing HDR, 10 bit is mandatory, and 12- or even 16-bit source material (ProRes 4444 / 4444 XQ or JPEG 2000) is common when encoding for Dolby Vision. Sure, this gets shaped into a 10-bit IPT file, but allegedly more of the input can be recovered after the Dolby magic.

Studio archival masters are generally 16-bit full-range RGB TIFFs or the OpenEXR equivalent (not sure about the specifics there), and probably DPX for older material. Maybe 16-bit lossless JPEG 2000 if the studio is IMF-native.
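The SDI arithmetic backs this up. A quick sketch of the uncompressed active-picture rate (active samples only; the actual SDI line rate also carries blanking, audio, and ancillary data, so the interface figure is higher):

```python
def active_video_rate(width, height, fps, bit_depth, chroma):
    """Uncompressed active-picture data rate in Gbit/s.

    Samples per pixel: 4:4:4 carries 3, 4:2:2 carries 2 (luma + half-rate
    Cb and Cr), 4:2:0 carries 1.5.
    """
    samples_per_pixel = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    return width * height * fps * samples_per_pixel * bit_depth / 1e9

# 1080/30 10-bit 4:2:2 active picture comes to about 1.24 Gbit/s, which
# fits comfortably inside HD-SDI's 1.485 Gbit/s once blanking is added.
print(active_video_rate(1920, 1080, 30, 10, "4:2:2"))  # -> 1.24416
```

That ~1.24 Gbit/s active payload is the signal a 10-bit 4:2:2 mezzanine like ProRes was built to capture perceptually losslessly, which fits the "good enough" history described above.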

Last edited by Blue_MiSfit; Today at 01:00.