Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.


Doom9's Forum > Video Encoding > New and alternative video codecs

Old 17th June 2019, 14:37   #1741  |  Link
bstrobl
Registered User
 
Join Date: Jun 2016
Posts: 55
First SoC launched by Realtek: https://www.realtek.com/en/press-roo...-cas-functions
Old 17th June 2019, 14:40   #1742  |  Link
EwoutH
Registered User
 
Join Date: Feb 2019
Location: Delft, Netherlands
Posts: 15
Press release: https://www.realtek.com/en/press-room/news-releases/item/realtek-launches-worldwide-first-4k-uhd-set-top-box-soc-rtd1311-integrating-av1-video-decoder-and-multiple-cas-functions
Old 17th June 2019, 20:07   #1743  |  Link
soresu
Registered User
 
Join Date: May 2005
Location: Swansea, Wales, UK
Posts: 86
Faster than I expected for a hardware ASIC release, and with HDMI 2.1 support no less. HDMI 2.1 is obviously overkill for 4K60p video, though, and the release doesn't say anything about support above 4K.
Old 17th June 2019, 21:04   #1744  |  Link
hajj_3
Registered User
 
Join Date: Mar 2004
Posts: 893
Nice to see, but Realtek charges a lot more for its chips than Amlogic, Allwinner, Huawei, MediaTek, and Rockchip, which is why cheap Android TV boxes don't use them.
Old 17th June 2019, 22:25   #1745  |  Link
IgorC
Registered User
 
Join Date: Apr 2004
Posts: 1,306
https://www.reddit.com/r/AV1/comment...han_x264_x265/

Everybody compares AV1 to VP9 at 8 bits for both.

What about VP9 10 bits?
It also makes sense to compare AV1 8 bits vs. VP9 10 bits.

VP9 10 bits has several advantages over AV1 8 bits at the moment:
  • Significantly faster encoding
  • Significantly faster decoding
  • Hardware support
  • Should have ~10-20% better compression than VP9 8 bits (not that far from AV1's 25-30%)

It makes sense to employ VP9 10 bits for at least 2-3 more years until a final jump to AV1. That would give AV1 additional time to develop better encoders and decoders.

Well, Google and Netflix already use VP9 10 bits for their HDR content, but SDR could benefit as well.

Plus, VP9 benefits a LOT from 10 bits, because it suffers from blocking and banding no less than H.264, as indicated here: https://sonnati.wordpress.com/2016/0...ntion-part-ii/
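The banding argument is partly plain arithmetic: 10-bit video range has roughly four times as many luma code values, so a shallow dark gradient gets about four times as many steps before it visibly bands. A back-of-the-envelope sketch (assuming standard limited-range levels; the 5% gradient span is just an illustrative figure):

```python
# Code values available for a shallow dark gradient, 8-bit vs 10-bit video range.
# Limited ("video") range: 8-bit luma spans 16-235, 10-bit luma spans 64-940.
levels_8bit = 235 - 16 + 1    # 220 usable luma codes
levels_10bit = 940 - 64 + 1   # 877 usable luma codes

# A dark gradient covering 5% of the luma range (e.g. a night-sky fade):
gradient_fraction = 0.05
steps_8bit = round(levels_8bit * gradient_fraction)    # ~11 steps -> visible bands
steps_10bit = round(levels_10bit * gradient_fraction)  # ~44 steps -> much smoother

print(steps_8bit, steps_10bit)
```

With only ~11 distinct codes across the whole gradient, each band is wide enough to see; quadrupling the step count pushes most of them below the visibility threshold even before dithering.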

Last edited by IgorC; 17th June 2019 at 22:28.
Old 19th June 2019, 09:24   #1746  |  Link
soresu
Registered User
 
Join Date: May 2005
Location: Swansea, Wales, UK
Posts: 86
Always interesting to hear about AI/ML tidbits related to video encoding. The Visionular speaker at the Big Apple Video conference (26th June) has this in her summary:

"Finally, we will introduce certain AI+codec techniques that could provide certain novel coding tools leveraging the use of deep learning for the next AOM standard, possibly AV2."
Old 19th June 2019, 18:33   #1747  |  Link
dapperdan
Registered User
 
Join Date: Aug 2009
Posts: 179
I think this paper covers one such proposal for AV2, and it has the speaker as an author:

https://link.springer.com/chapter/10...319-94361-9_18

I couldn't quickly find a public version, but the citations led me to this, which I think was another technique discussed:

https://arxiv.org/pdf/1804.09291

Last edited by dapperdan; 19th June 2019 at 18:36.
Old 20th June 2019, 08:56   #1748  |  Link
foxyshadis
ангел смерти (angel of death)
 
 
Join Date: Nov 2004
Location: Lost
Posts: 9,413
Reminds me of NNEDI, both in that there can be surprising visual gains and in that it will require enormous CPU & GPU power just to decode.

But isn't discussion of AV2 getting way off topic? We have an entire forum for discussing new and potential codecs; this thread just gets more polluted and useless every week.
__________________
There are four boxes to be used in defense of liberty: soap, ballot, jury, and ammo. Please use in that order.
Old 20th June 2019, 19:40   #1749  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 2,874
Quote:
Originally Posted by IgorC View Post
https://www.reddit.com/r/AV1/comment...han_x264_x265/

Everybody compares AV1 to VP9 at 8 bits for both.

What about VP9 10 bits?
It also makes sense to compare AV1 8 bits vs. VP9 10 bits.
Why not compare AV1 10-bit vs. VP9 10-bit?

The barriers to using 10-bit seem pretty similar in both cases: longer encoding time, a lack of 10-bit sources and processing chains (getting much better), reliance on good dithering in the display system, somewhat slower SW decode, and rarer HW decode support.

AFAIK, no one is planning any 8-bit-only AV1 decoders, so 10-bit might see default use more often with AV1 if HW decoders become dominant. SW decoders need more 10-bit optimization to make it competitive at higher resolutions.

I expect 10-bit to become generally mainstream, as it is required for HDR, and we're near or past the tipping point where the majority of new video consumption devices support at least HDR10.
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book
Old 20th June 2019, 23:43   #1750  |  Link
IgorC
Registered User
 
Join Date: Apr 2004
Posts: 1,306
Of course you can compare AV1 and VP9, both at 10 bits.

My main point was the less disruptive move from VP9 8 bits to VP9 10 bits as a short-term strategy (2-3 years).

Let's put some numbers on it. My i7 notebook uses 20-25% of the CPU during YouTube 1080p@60fps playback (VP9 8 bits). If it were VP9 10 bits, that would be 5-7% additional CPU usage. Still pretty acceptable.
AV1 8 bits, by contrast, consumes a whopping 60% at that resolution and framerate (and that with the latest version of dav1d). My notebook can still play it, but the fan noise and overall slowness are quite annoying.
Let alone AV1 10 bits. dav1d doesn't have any 10-bit code yet, and it will take some time to get fast 10-bit decoding and/or hardware acceleration. My notebook gets very hot and drops a few frames here and there, at near 100% CPU load, with AV1 10 bits at 1080p@60. Also, another notebook of mine with a Kaby Lake i7 already has VP9 8-/10-bit hardware acceleration. So why not?

VP9 8 bits suffers from strong blocking and banding in dark areas and tones, in my experience with YouTube and mobile Netflix videos, while VP9 10 bits handles them very well with a little extra CPU overhead.
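Putting those playback figures side by side makes the trade-off concrete. A quick sketch of the arithmetic (using the percentages quoted above, which are one notebook's anecdotal numbers, not a benchmark):

```python
# Rough decode-cost comparison from the playback figures quoted above (anecdata).
vp9_8bit_cpu = 0.225                 # midpoint of the observed 20-25% CPU
vp9_10bit_cpu = vp9_8bit_cpu + 0.06  # +5-7% absolute, midpoint
av1_8bit_cpu = 0.60                  # observed AV1 8-bit load with dav1d

# Relative cost of stepping up bit depth vs. switching codec entirely:
cost_10bit_step = vp9_10bit_cpu / vp9_8bit_cpu  # ~1.27x for VP9 8 -> 10 bit
cost_av1_step = av1_8bit_cpu / vp9_8bit_cpu     # ~2.67x for VP9 -> AV1 (8-bit)

print(round(cost_10bit_step, 2), round(cost_av1_step, 2))
```

On these figures the bit-depth step is roughly a tenth as expensive as the codec switch, which is the core of the "VP9 10-bit as a bridge" argument.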
Old 20th June 2019, 23:55   #1751  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 2,874
Quote:
Originally Posted by IgorC View Post
Of course you can compare AV1 and VP9, both at 10 bits.

My main point was the less disruptive move from VP9 8 bits to VP9 10 bits as a short-term strategy (2-3 years).

Let's put some numbers on it. My i7 notebook uses 20-25% of the CPU during YouTube 1080p@60fps playback (VP9 8 bits). If it were VP9 10 bits, that would be 5-7% additional CPU usage. Still pretty acceptable.
AV1 8 bits, by contrast, consumes a whopping 60% at that resolution and framerate (and that with the latest version of dav1d). My notebook can still play it, but the fan noise and overall slowness are quite annoying.
Let alone AV1 10 bits. dav1d doesn't have any 10-bit code yet, and it will take some time to get fast 10-bit decoding and/or hardware acceleration. My notebook gets very hot and drops a few frames here and there, at near 100% CPU load, with AV1 10 bits at 1080p@60. Also, another notebook of mine with a Kaby Lake i7 already has VP9 8-/10-bit hardware acceleration. So why not?

VP9 8 bits suffers from strong blocking and banding in dark areas and tones, in my experience with YouTube and mobile Netflix videos, while VP9 10 bits handles them very well with a little extra CPU overhead.
Well, the obvious NOW strategy is to keep using H.264 or HEVC, which have broad HW decoder support, and not use the CPU at all. Properly tuned, x264 is at least as good as VP9 for lots of real-world content.

I don't see any software decoder solution becoming mainstream, since we already have more than good enough HW codec options.

Fingers crossed that all AV1 HW decoders include 10-bit support. It would be great to have a codec out there where >8-bit support is guaranteed.
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book
Old 21st June 2019, 01:19   #1752  |  Link
IgorC
Registered User
 
Join Date: Apr 2004
Posts: 1,306
Quote:
Originally Posted by benwaggoner View Post
Well, the obvious NOW strategy is to keep using H.264 or HEVC
Yeah, nice try.

But fortunately Google and Netflix don't think so.
Both put a major emphasis on VP9 and AV1.

Cheers.
Old 21st June 2019, 04:46   #1753  |  Link
Quikee
Registered User
 
Join Date: Jan 2006
Posts: 41
Quote:
Originally Posted by benwaggoner View Post
Fingers crossed that all AV1 HW decoders include 10-bit support. It would be great to have a codec out there where >8-bit support is guaranteed.
There are three AV1 profiles (Main, High, and Professional), and all of them mandate 10-bit support. Given that, I think there's a good chance there will be 10-bit HW support from the beginning. Of course, HW manufacturers can still disappoint.

VP9 only has 10-bit support from profile 2 on; profiles 0 and 1 are 8-bit only, which is why 10-bit HW support is less common.
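For reference, the profile/bit-depth rules described above can be captured in a small lookup table (values taken from the published AV1 and VP9 specifications; the table itself is just an illustration, not any library's API):

```python
# Bit depths allowed by each profile, per the AV1 and VP9 bitstream specs.
PROFILE_BIT_DEPTHS = {
    ("AV1", "Main"):         {8, 10},      # 4:2:0 and monochrome
    ("AV1", "High"):         {8, 10},      # adds 4:4:4
    ("AV1", "Professional"): {8, 10, 12},  # adds 4:2:2 and 12-bit
    ("VP9", 0):              {8},          # 4:2:0 only
    ("VP9", 1):              {8},          # adds 4:2:2 / 4:4:4
    ("VP9", 2):              {10, 12},     # 4:2:0
    ("VP9", 3):              {10, 12},     # adds 4:2:2 / 4:4:4
}

def supports_10bit(codec, profile):
    """True if a decoder for this profile must accept 10-bit streams."""
    return 10 in PROFILE_BIT_DEPTHS[(codec, profile)]

# Every AV1 profile carries 10-bit; in VP9 only profiles 2 and 3 do.
print(all(supports_10bit(c, p) for (c, p) in PROFILE_BIT_DEPTHS if c == "AV1"))
```

The practical consequence is exactly the one argued above: an AV1 HW decoder that claims any profile at all has to handle 10-bit, whereas a VP9 decoder can legitimately stop at profile 0.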
Old 22nd June 2019, 07:51   #1754  |  Link
soresu
Registered User
 
Join Date: May 2005
Location: Swansea, Wales, UK
Posts: 86
Quote:
Originally Posted by benwaggoner View Post
AFAIK, no one is planning any 8-bit-only AV1 decoders, so 10-bit might see default use more often with AV1 if HW decoders become dominant. SW decoders need more 10-bit optimization to make it competitive at higher resolutions.
True enough, but the libaom decoder isn't nearly as well optimised as dav1d at present, and even the dav1d devs seem to be completely ignoring 10-bit optimisation while they concentrate on getting 8-bit working well across at least the x86 SSSE3/SSE4/AVX2 and ARM NEON SIMD targets.

I'd say the progress so far is pretty incredible for such a young codec, decoder- and encoder-wise.

From what I remember, it also took a while for the OpenHEVC decoder library to get optimised, and they weren't concentrating on mobile nearly as much as the AV1 groups seem to be, though the increased core counts and IPC of current ARM implementations may have influenced that focus to some degree.
Old 22nd June 2019, 08:22   #1755  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 9,733
Quote:
Originally Posted by soresu View Post
the dav1d devs seem to be completely ignoring 10 bit optimisation while they concentrate on getting 8 bit working well across at least x86 SSSE3/SSE4/AVX2, and ARM NEON SIMD targets.
10-bit isn't being "ignored"; 8-bit was quite simply a much higher priority, since 8-bit content is actually available to the public now that YouTube has started shipping it, while 10-bit content is not. And there are only so many hours in a day.

Work on 10-bit has started now, but due to the nature of the beast, SIMD code cannot easily be ported from 8-bit to 10-bit, so it's still a lot of work.
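The porting cost follows from lane arithmetic: 10-bit samples need 16-bit lanes, so each 128-bit register holds half as many pixels, and intermediate sums need wider types than the 8-bit code used. A small sketch of the numbers involved (illustrative only; the headroom figure assumes a hypothetical 8-tap filter whose coefficient magnitudes sum to 128):

```python
# Why 8-bit SIMD doesn't port directly to 10-bit: lane widths and headroom change.
REG_BITS = 128  # one SSE or NEON vector register

def lanes(sample_bits):
    # 8-bit samples fit 8-bit lanes; 9-16-bit samples need 16-bit lanes.
    lane_bits = 8 if sample_bits <= 8 else 16
    return REG_BITS // lane_bits

pixels_per_reg_8bit = lanes(8)    # 16 pixels per 128-bit register
pixels_per_reg_10bit = lanes(10)  # only 8 pixels per register

# Headroom: for a filter whose coefficient magnitudes sum to 128,
# 8-bit input keeps intermediates within signed 16 bits (32640 < 32767),
# but 10-bit input overflows them and forces 32-bit accumulation.
max_intermediate_8bit = (2**8 - 1) * 128    # 32640
max_intermediate_10bit = (2**10 - 1) * 128  # 130944

print(pixels_per_reg_8bit, pixels_per_reg_10bit)
```

So the 10-bit kernels are not a find-and-replace of the 8-bit ones: half the throughput per instruction, wider accumulators, and different clipping constants, which is why each function effectively has to be rewritten.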
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
Old 27th June 2019, 09:09   #1756  |  Link
dapperdan
Registered User
 
Join Date: Aug 2009
Posts: 179
Lots of interesting talks at the Big Apple Video event (that's Apple as in New York, not Mac OS X):

Probably the most interesting one for this group is the second half of Ronald Bultje's talk, which covers his Eve-AV1 encoder and some comparisons with other codecs and encoders:



But lots of other interesting stuff from other speakers too if you click through to the channel to see the full list.

I'll also mention this one from Cisco: it has a kind of boring-sounding title, but there was some interesting stuff on complexity vs. speed in AV1 after the live demos.


Old 27th June 2019, 17:03   #1757  |  Link
mandarinka
Registered User
 
 
Join Date: Jan 2007
Posts: 724
I wonder how much VMAF really says about visual quality and compression efficiency while keeping detail (as opposed to the usual problem with metrics, the "blur more for maximum PSNR/SSIM" effect), seeing how in those slides *everything* except rav1e and x264 is shown matching or outdoing x265. And then there's the usual claim that x265 = libvpx-vp9, which raises questions. I always stop wondering at that point in these presentations...

Last edited by mandarinka; 27th June 2019 at 17:12.
Old 27th June 2019, 19:12   #1758  |  Link
dapperdan
Registered User
 
Join Date: Aug 2009
Posts: 179
My theory is that the people hired for the subjective tests that underlie the objective stats, or who vote VP9 very slightly better than x265 in the MSU tests on subjectify.us, have a different notion of quality than the kind of person who is interested in codecs for their own sake.

Like, I recently read a paper where someone applied their grain synthesis approach to HEVC, and the subjective tests they ran to prove it worked showed they could get basically all of the subjective benefit just by doing the noise removal step and not bothering to add the grain back in. That is something any encoder could do, for any codec (and I'm guessing it makes up part of the secret sauce of some encoders).

But I guess someone who claimed they could get a massive increase in subjective quality via the "psy optimisation" of basically blurring the input would get some pushback in some quarters, even with subjective tests to back it up.

Link to the paper: it seems that at higher qualities the viewers saw the added grain as a defect rather than a quality improvement (though still mostly a statistical tie).

https://arxiv.org/abs/1904.11754

Last edited by dapperdan; 27th June 2019 at 19:52.
Old 28th June 2019, 16:29   #1759  |  Link
soresu
Registered User
 
Join Date: May 2005
Location: Swansea, Wales, UK
Posts: 86
Having watched some (but not all) of the BAV presentations: I know AV1 is currently not ideal for running a battery of tests at short notice, but did they really need to use such outdated versions of the competing codecs?

I'm pretty sure the x265 build was from January, and the libaom build from February, in one of them.

Maybe I'm missing something and those builds were picked for stability?
Old 28th June 2019, 22:31   #1760  |  Link
Blue_MiSfit
Derek Prestegard IRL
 
 
Join Date: Nov 2003
Location: Los Angeles
Posts: 5,505
Great talk from Ronald. If only I had the time to do an evaluation of Eve-AV1!
