Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.


Go Back   Doom9's Forum > Video Encoding > VP9 and AV1

Old 3rd December 2020, 12:40   #2341  |  Link
foxyshadis
ангел смерти
 
 
Join Date: Nov 2004
Location: Lost
Posts: 9,558
I've moved dav1d-specific posts to dav1d accelerated AV1 decoder, beginning from a bit over a year ago. There's still plenty of room for a general decoders comparison thread, and of course an encoders face-off thread.
Old 3rd December 2020, 23:52   #2342  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,750
Quote:
Originally Posted by soresu View Post
A lack of prevalent VP9 HW support didn't stop Google from pushing it for Youtube, and I doubt it will be any different for AV1.

Especially when there is already far better SW decoding for AV1 than VP9 had at the equivalent time after release, coupled with far more performant mobile CPU cores to decode it with. Apple aside, the best ARM core at the time was the A57 at around 2 GHz; now we have the X1 at around 2.84 GHz, which has to be at least 3x faster.
Yeah, in many ways AV1 is already more mature than any VPx implementation ever got. There are certainly a lot more encoder vendors competing on making better-looking and faster encoders, which is a huge deal.

SW DRM is simply not allowed for lots of premium content, however. AV1 is a lot more practical for user-generated and other non-commercial content than for professional licensed content.

Also, the reduced battery life of using a SW decoder matters a lot more when watching a two-hour movie than when watching short-form content.

Quote:
I imagine that Qualcomm didn't want to support VP9 either to begin with, but it eventually ended up in there as will AV1 in good time - whether that actually happens before AV2 is released is a different story.
AV2 is in pretty early stages. EVC and VVC are the next two standardized codecs that HW vendors will be deciding whether to support.

Quote:
It's also worth noting that for all Qualcomm's market dominance elsewhere, China and India's market will probably be populated with many handsets that use Mediatek SoC's that do support AV1 - and their combined populations/potential market are not something to sniff at for sure.
Yeah, another divide between "Hollywood" content that is globally licensed and more regional content where DRM rules can be much more relaxed. A lot of those markets use AOSP, not Google Android, and so might not include all the software decoders, like AV1.

As a content creator choosing one codec beyond H.264, HEVC certainly offers a much bigger audience for 2021, with the exception of Firefox and Chrome.
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book
Old 4th December 2020, 15:10   #2343  |  Link
el Filou
Registered User
 
 
Join Date: Oct 2016
Posts: 896
Quote:
Originally Posted by benwaggoner View Post
Firefox or Chrome, which artificially block the use of HEVC HW decoders available to the underlying system.
Oh wow, I had no idea about that. So is that the actual reason why UHD isn't available on Chrome & Firefox for those streaming services that offer it? I always assumed it was because of stronger DRM in Edge.
__________________
HTPC: Windows 10 22H2, MediaPortal 1, LAV Filters/ReClock/madVR. DVB-C TV, Panasonic GT60, Denon 2310, Core 2 Duo E7400 oc'd, GeForce 1050 Ti 536.40
Old 4th December 2020, 20:09   #2344  |  Link
ksec
Registered User
 
Join Date: Mar 2020
Posts: 117
Quote:
Originally Posted by soresu View Post
A lack of prevalent VP9 HW support didn't stop Google from pushing it for Youtube, and I doubt it will be any different for AV1.

It's also worth noting that for all Qualcomm's market dominance elsewhere, China and India's market will probably be populated with many handsets that use Mediatek SoC's that do support AV1 - and their combined populations/potential market are not something to sniff at for sure.
VP9 has had hardware support from nearly day 1 through many different IP vendors, alongside HEVC, as VP9 and HEVC are similar. It was relatively simple and didn't cost much extra die space when HEVC was "the" requirement (at least at the time).

All current hardware decoders, including those in laptops, have a much higher power allowance, i.e. a hardware decoder can operate in the 1+ W range without problems, compared to a mobile phone, where it is expected to operate in the few-hundred-mW range. This time around it isn't so simple, because VVC has only just been finished and on the surface doesn't seem to share that much with AV1. How this translates into hardware decoding block differences remains to be seen, especially when the power requirements are much more stringent. I previously wrote that this would change with 5nm SoCs as both transistor budget and power usage improve, but I was referring to TSMC's 5nm; the Snapdragon 888 is based on Samsung's 5nm, which has a lower transistor density, so it isn't quite there yet.

Finally, Mediatek has only one chip with an AV1 decoder, and that is their high-end flagship. 90% of Mediatek's volume is low- to mid-range SoCs, and transistor budgets are even tighter in those segments.

I just wish people were more mindful of the different interests in video codecs, from hardware to software and from users to producers.
__________________
Previously iwod

Last edited by ksec; 4th December 2020 at 20:22.
Old 5th December 2020, 01:15   #2345  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,750
Quote:
Originally Posted by el Filou View Post
Oh wow, I had no idea about that. So is that the actual reason why UHD isn't available on Chrome & Firefox for those streaming services that offer it? I always assumed it was because of stronger DRM in Edge.
Nope, HEVC decode actually did work by default in both browsers before it was explicitly blocked, which was done for political, not technical, reasons.

This partly reflects the strong focus on user-generated content at Google (YouTube) and Facebook.

Among other things, this is why there's no browser-based HDR premium content. While AV1 can technically do HDR, no one has released an encoder with mature HDR tuning. x265 needed quite a lot of feature development to get optimal HDR encoding, since PQ and BT.709 have some pretty foundational differences and different optimization requirements.

The net effect is we'll probably see premium content playback on Windows/Mac continue to shift away from browsers towards apps. The large majority of PC and Mac systems can decode 10-bit HEVC in HW.
Old 18th December 2020, 06:07   #2346  |  Link
rubait
Registered User
 
Join Date: Jul 2020
Posts: 5
Any idea if DXVA Checker supports AV1 decode yet? I tried opening some files with it, but it doesn't seem to recognize them.
Old 18th December 2020, 12:06   #2347  |  Link
hajj_3
Registered User
 
Join Date: Mar 2004
Posts: 1,120
Quote:
Originally Posted by rubait View Post
Any idea if DXVA Checker supports AV1 decode yet.
It does: https://bluesky-soft.com/en/DXVAChecker.html
Old 18th December 2020, 18:13   #2348  |  Link
rubait
Registered User
 
Join Date: Jul 2020
Posts: 5
Quote:
Originally Posted by hajj_3 View Post
I tried it, but it doesn't recognize the file: it doesn't give me the option to choose a decoder when I open it. Has anyone tried it on Tiger Lake, and does it require any special container? I have tried .MP4.
Old 24th December 2020, 11:31   #2349  |  Link
Jamaika
Registered User
 
Join Date: Jul 2015
Posts: 697
Is the loopfilter mask function {CONFIG_LPF_MASK} needed in the AV1 codec? It seems neglected and buggy.
Old 6th January 2021, 16:33   #2350  |  Link
utack
Registered User
 
Join Date: Apr 2018
Posts: 63
Quote:
Originally Posted by GTPVHD View Post
https://videocardz.com/newz/lenovo-c...-rtx-3060-12gb
https://cdn.videocardz.com/1/2020/12...50-RTX3060.png
https://psref.lenovo.com/Product/Len...?ViewSpec=true

RTX 3050 4GB GDDR6 leaked by Lenovo, the cheapest Nvidia card with AV1 fixed-function hardware decoding when released next year.
Not sure if ffmpeg is currently being inefficient, but with an 8K AV1 video mpv allocates just over 4000 MB of VRAM for me, so that would not work.
Could someone cross-check with the native Windows Video Player?
Old 6th January 2021, 22:47   #2351  |  Link
Greenhorn
Registered User
 
Join Date: Apr 2018
Posts: 61
Quote:
Originally Posted by utack View Post
Not sure if ffmpeg is currently being inefficient, but with an 8K AV1 video mpv allocates just over 4000 MB of VRAM for me, so that would not work.
Could someone cross-check with the native Windows Video Player?
I don't have mpv installed to test, but testing with an 8K60 HDR clip downloaded from YouTube shows the native media player allocating ~800 megabytes (MPC-BE with madVR allocates ~2.1 gigabytes), on a 1660 Ti with a 6 GB buffer.

8K is going to have huge memory requirements regardless of the codec, though; I'd expect it to basically scale with the number of frames prerendered by the player, with additional codec overhead being negligible. I almost wonder if the suspiciously low numbers are due to decoding being too slow to fill some internal buffer, the MS (libaom) decoder being slower than dav1d in MPC.

Last edited by Greenhorn; 7th January 2021 at 03:16.
Old 7th January 2021, 02:23   #2352  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,750
Quote:
Originally Posted by utack View Post
Not sure if ffmpeg is currently being inefficient, but with an 8K AV1 video mpv allocates just over 4000 MB of VRAM for me, so that would not work.
Could someone cross-check with the native Windows Video Player?
There are about 34M pixels in an 8K video frame. And there's no point in non-HDR 8K, and HDR will be 10-bit minimum. With 4:2:0 and 10-bit samples stored in 16-bit words, that's 3 bytes per pixel, about 100 MB per frame. Of course, there is always bit-alignment overhead, and internal high-precision frequency transforms could easily make for 48 bits/pixel. Frame-parallel decoding could involve several of those. And a few RGB decoded frames for buffering could be way bigger than that: 16 bits per channel at RGBA 4:4:4 would be 272 MB/frame.

HW decoders have an easier time of it because they don't need frame-level parallel decoding or RGB buffers, since they can write 4:2:0 straight to the GPU. But yeah, 4 GB for an 8K SW decoder seems quite plausible to me if a decent number of frames need to be buffered at different stages.
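The buffer sizes described above can be sanity-checked with a quick back-of-envelope calculation (assumptions mine: 10-bit samples stored unpacked in 16-bit words, 4:2:0 subsampling):

```python
# Rough frame-buffer sizes for 8K (7680x4320) video.
W, H = 7680, 4320
pixels = W * H                       # ~33.2 million pixels per frame

# 4:2:0 = 1 luma + 0.5 chroma samples per pixel;
# 10-bit samples are typically stored in 16-bit words (2 bytes each).
yuv420_10 = pixels * 1.5 * 2         # ~100 MB per decoded frame

# 16-bit-per-channel RGBA 4:4:4 intermediate, as mentioned above
rgba64 = pixels * 4 * 2              # ~265 MB per frame

print(f"pixels: {pixels / 1e6:.1f} M")
print(f"4:2:0 10-bit frame: {yuv420_10 / 1e6:.0f} MB")
print(f"RGBA64 frame: {rgba64 / 1e6:.0f} MB")
```

A software decoder holding a few dozen such frames across its reference, reorder, and render queues lands in the multi-gigabyte range quickly.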

None of that is specific to AV1, but AV1 is the only codec people are talking about doing 8K SW decode with. My own research hasn't found any content that actually looks better at 8K than 4K, so there's a whole lot of solution-looking-for-a-problem going on in that scenario. 8K YouTube looks better than 4K YouTube because YouTube is bit-starved at every resolution and bitrate.
Old 7th January 2021, 02:29   #2353  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,750
Come to think of it, SW AV1 decoding is actually going to have an impact on global CO2 emissions. A CPU can easily draw 20 more watts in SW decode versus HW decode. 500K simultaneous YouTube viewers watching AV1 could mean another 10 MW of power consumption and emissions compared to YouTube using HEVC. Even assuming low-emissions NG plants, that would be a few extra tonnes of CO2 per hour, around the clock.
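A quick sketch of that arithmetic (assumptions mine: 20 W of extra CPU draw per viewer and roughly 0.4 tonnes of CO2 per MWh for an efficient natural-gas plant):

```python
# Extra grid load and emissions from SW vs HW decode, back-of-envelope.
viewers = 500_000
extra_watts = 20                              # extra CPU draw per viewer
extra_mw = viewers * extra_watts / 1e6        # 10 MW of extra draw

tonnes_co2_per_mwh = 0.4                      # efficient natural gas
tonnes_per_hour = extra_mw * tonnes_co2_per_mwh

print(f"{extra_mw:.0f} MW extra, ~{tonnes_per_hour:.0f} t CO2 per hour")
```

Small per hour, but continuous: over a year that's on the order of 35,000 tonnes for this one hypothetical audience.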

Yowza.
Old 9th January 2021, 04:01   #2354  |  Link
soresu
Registered User
 
Join Date: May 2005
Location: Swansea, Wales, UK
Posts: 196
Quote:
Originally Posted by benwaggoner View Post
And there's no point in non-HDR 8K and HDR will be 10-bit minimum.
Arguably there's no point in >10bpc content.

I've seen plenty of 8bpc content without banding, so it clearly isn't inherent, and I doubt that the average human could tell the difference between 10 and 12bpc content at all.
Old 10th January 2021, 15:08   #2355  |  Link
takla
Registered User
 
Join Date: May 2018
Posts: 182
Quote:
Originally Posted by benwaggoner View Post
Come to think of it, SW AV1 decoding is actually going to have an impact on global CO2 emissions. A CPU can easily draw 20 more watts in SW decode versus HW decode. 500K simultaneous YouTube viewers watching AV1 could mean another 10 MW of power consumption and emissions compared to YouTube using HEVC. Even assuming low-emissions NG plants, that would be a few extra tonnes of CO2 per hour, around the clock.

Yowza.
Yikes. I can already see the headlines for laws being passed (in EU countries anyway)
Old 11th January 2021, 03:34   #2356  |  Link
soresu
Registered User
 
Join Date: May 2005
Location: Swansea, Wales, UK
Posts: 196
Quote:
Originally Posted by takla View Post
Yikes. I can already see the headlines for laws being passed (in EU countries anyway)
With products capable of 8K AV1 decode already on the market, by the time they passed a law there would be far more in consumers' hands, making the law redundant. I'd take it as a given that someone willing to waste money on an 8K TV would probably be willing to shell out for the latest and greatest PC and graphics card too.

That being said, Samsung's 8K TV models already have terrible power efficiency even without other issues coming into play. I'm not sure whether it's down to them having more FALD zones or just higher peak nits (or a combo of both), but the lowest efficiency rating their 4K QLED TVs get is B, whereas their 8K TVs can go as low as D (A being the best rating).

There's also the hybrid decoder recently committed for the XB1 and later consoles using DX shaders and UWP; it would be interesting to see the power consumption of the XSX doing 8K AV1 decode using that.

Last edited by soresu; 11th January 2021 at 03:38.
Old 11th January 2021, 04:53   #2357  |  Link
takla
Registered User
 
Join Date: May 2018
Posts: 182
Quote:
Originally Posted by soresu View Post
With products capable of 8K AV1 decode already on the market, by the time they passed a law there would be far more in consumer hands making the law redundant
Yeah true. I thought about it some more and came to the same conclusion.
Old 12th January 2021, 15:43   #2358  |  Link
soresu
Registered User
 
Join Date: May 2005
Location: Swansea, Wales, UK
Posts: 196
It seems that at least Samsung's mobile division is pushing AV1 support, going by their reveal at CES of the new Exynos 2100 SoC destined for the Galaxy S21.

Given that reports also put AV1 support in the 2020 QLED models, I will wait until actual hardware is in reviewers' hands before I dance for joy.
Old 13th January 2021, 00:08   #2359  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,750
Quote:
Originally Posted by soresu View Post
Arguably there's no point in >10bpc content.

I've seen plenty 8bpc content without banding so it clearly isn't inherent and I doubt that the average human could tell the difference between 10 and 12 bpc content at all.
The problem is that dithering doesn't encode very well. A smooth gradient from Y'=64 to Y'=72 across a 1920x1080 frame is going to show banding in 8-bit without really good dithering, which frequency-transform compression tends to lose.

And HDR with 8-bit is much harder. Just encoding Rec. 2100 content in 8-bit yields a horrible mess.

And it's challenging to detect full 4K detail in SDR for natural images, in many cases impossible even for expert viewers. HDR is what makes 4K generally worthwhile for natural images. Seeing the difference between carefully selected 4K and 8K HDR moving images is only possible for expert viewers with 20/10 vision, and only on a minority of "stress test" clips.
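The gradient example is easy to reproduce numerically; a sketch (assumptions mine: plain rounding versus uniform random dither):

```python
import numpy as np

# A 1920-pixel ramp from Y'=64 to Y'=72 has only 9 representable
# 8-bit code values, so straight quantization yields ~213-px-wide bands.
width = 1920
ramp = np.linspace(64.0, 72.0, width)

banded = np.round(ramp).astype(np.uint8)
levels = np.unique(banded).size          # 9 distinct bands

# Cheap random dithering: add sub-LSB noise before rounding so the
# band edges break up. Frequency-transform codecs tend to smooth this
# noise away, which is why dithered 8-bit gradients often re-band
# after encoding.
rng = np.random.default_rng(0)
dithered = np.round(ramp + rng.uniform(-0.5, 0.5, width)).astype(np.uint8)

print(levels)
```

At 10 bits the same ramp spans roughly four times as many code values, which is a big part of why the banding drops below visibility.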

Higher resolutions pay off a lot more for computer games, but that's more about the limitations of anti-aliasing technology and the much greater local contrast of synthetic graphics. Rendering games at 4K and downscaling to 1080p still looks a lot better than native 1080p gaming.
Old 13th January 2021, 00:15   #2360  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,750
Quote:
Originally Posted by soresu View Post
With products capable of 8K AV1 decode already on the market, by the time they passed a law there would be far more in consumer hands making the law redundant - I'd take it as a given someone willing to waste money buying an 8K TV probably would be willing to shell out for the latest and greatest PC and gfx card too.

That being said, the Samsung 8K TV models already have terrible power efficiency even without other issues coming in to play - I'm not sure whether it is to do with them having more FALD zones or just higher peak nits (or a combo of both) but the lowest efficiency rating their 4K QLED TVs have is B, whereas their 8K TV's can go as low as D (A being the best rating).

There's also the hybrid decoder recently committed for XB1 and later consoles using DX shaders and UWP, it would be interesting to see what the power consumption on the XSX doing 8k AV1 decode when using that.
TVs don't have the horsepower for SW decode in any case. The big power differential is with computers, which can provide lots of peak compute in exchange for much more power draw. And we're talking 2022 before even half of new PCs have AV1 HW decode, and 2025+ before the installed base could be even 50%. YouTube serving a codec that lacks a HW decoder on systems that do have HW decoders for other codecs must add up hugely. Plus the encoding power needed is also a lot higher.

The XSX decoder is probably better, but consoles are power beasts in general. Xbox and PS consoles generally draw >100 watts to just have something on the screen. Compare to things like Roku or Fire TV which draw <10 watts running full blast.

Of course, when doing streaming over 4/5G, higher bandwidths also mean more antenna power, so there's some tradeoff there somewhere.

Environmental organizations should really come out with a browser plugin that forces YouTube et al. to stream only the best codec that has a HW decoder.