Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Before you start posting please read the forum rules. By posting to this forum you agree to abide by the rules.

 

Old 25th October 2023, 07:10   #1  |  Link
rwill
Registered User
 
Join Date: Dec 2013
Posts: 343
Chrome maybe deprecates Theora

https://groups.google.com/a/chromium...m/ajHePzglAwAJ

A hard blow for Internet Human Rights: Chrome plans to deprecate support for the Theora video codec.
Old 25th October 2023, 08:02   #2  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,341
Internet Human Rights? What?

We have better patent-free and open-source video codecs. If the usage of Theora is practically zero, any remaining use can be replaced by a polyfill, and the aging decoder faces emerging security threats without rapid fixes in place... time to move on to VP9 and AV1.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
Old 25th October 2023, 08:35   #3  |  Link
birdie
Artem S. Tashkinov
 
 
Join Date: Dec 2006
Posts: 337
Quote:
Originally Posted by rwill View Post
https://groups.google.com/a/chromium...m/ajHePzglAwAJ

A hard blow for Internet Human Rights: Chrome plans to deprecate support for the Theora video codec.
A hard blow to pretty much no one, because I bet out of all videos posted online, there are like 0.000001% using this abandoned irrelevant codec.

Browsers mustn't support something just because 10 people in the whole world need it. At the same time, 99.999999% people are exposed to possible vulnerabilities hidden in this codec. It's just not worth it.

You wanna play such videos? Compile ffmpeg and have fun.
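For anyone who does go that route, a rough sketch of what it looks like in practice: a small Python helper that builds an ffmpeg command line to transcode a legacy Ogg/Theora file to VP9 + Opus in WebM. Filenames are hypothetical, and this assumes an ffmpeg build with libvpx and libopus.

```python
from typing import List

def theora_to_webm_cmd(src: str, dst: str, crf: int = 32) -> List[str]:
    """Build an ffmpeg command line that transcodes Ogg/Theora to VP9 + Opus WebM."""
    return [
        "ffmpeg", "-i", src,
        "-c:v", "libvpx-vp9", "-crf", str(crf), "-b:v", "0",  # constant-quality VP9 mode
        "-c:a", "libopus",
        dst,
    ]

# To actually run it (assumes ffmpeg is on the PATH):
# import subprocess
# subprocess.run(theora_to_webm_cmd("legacy.ogv", "modern.webm"), check=True)
```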

Let's not waste time discussing this nothingburger.
Old 25th October 2023, 11:41   #4  |  Link
Ritsuka
Registered User
 
Join Date: Mar 2007
Posts: 95
Doesn't Wikipedia already use a WebAssembly Theora decoder for unsupported browsers? Or was it only for Vorbis?
Old 25th October 2023, 16:59   #5  |  Link
birdie
Artem S. Tashkinov
 
 
Join Date: Dec 2006
Posts: 337
Quote:
Originally Posted by Ritsuka View Post
Doesn't Wikipedia already use a WebAssembly Theora decoder for unsupported browsers? Or was it only for Vorbis?
It was Vorbis and it was by Firefox, not Wikipedia, AFAIK.
Old 25th October 2023, 18:06   #6  |  Link
Ritsuka
Registered User
 
Join Date: Mar 2007
Posts: 95
It seems it has been in use on Wikipedia since 2015, and Theora is supported too: https://github.com/brion/ogv.js/
Old 27th October 2023, 01:51   #7  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,750
Quote:
Originally Posted by Ritsuka View Post
It seems it has been in use on Wikipedia since 2015, and Theora is supported too: https://github.com/brion/ogv.js/
Wow. That's the only place I've seen Theora in the wild for over a decade.

For those who weren't there for the blow-by-blow, it started in the late 90's when On2 released VP3, and tried giving decoders away for free to drive an encoder market. They released free decoders for QuickTime and other platforms. Didn't get much traction (rate control in the available encoder was horrifically bad). By the time they had a roadmap to having VP6 in Flash (I forget if VP4 or 5 were used for anything significant), they wanted to dump it. They tried to give it away for free to Terran Interactive where I worked, among other places. No takers.

So when there was a need for a royalty-free codec, they offered it up. And it worked, but it was still a pretty mediocre mid-90's CD-ROM codec. The open source version got some extensions and tuning, so it was better than stock VP3, but there was only so far it could be pushed. Theora launched with compression efficiency well behind the RealVideo and Windows Media codecs available at the time. I actually demonstrated that a well-tuned MPEG-1 could outperform it for some reason.
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book
Old 27th October 2023, 16:43   #8  |  Link
Selur
Registered User
 
 
Join Date: Oct 2001
Location: Germany
Posts: 7,259
Quote:
Wow. That's the only place I've seen Theora in the wild for over a decade.
I second that; I didn't even know that there were browsers supporting Theora.

Cu Selur
__________________
Hybrid here in the forum, homepage
Old 27th October 2023, 17:14   #9  |  Link
microchip8
ffx264/ffhevc author
 
 
Join Date: May 2007
Location: /dev/video0
Posts: 1,843
F* Theora. Move on, nothing to see!
__________________
ffx264 || ffhevc || ffxvid || microenc
Old 29th October 2023, 00:42   #10  |  Link
FranceBB
Broadcast Encoder
 
 
Join Date: Nov 2013
Location: Royal Borough of Kensington & Chelsea, UK
Posts: 2,883
RIP Theora.
You've never been good, but you paved the way for what AV1 is today and for that we will always be thankful.

Old 30th October 2023, 16:52   #11  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,750
Quote:
Originally Posted by FranceBB View Post
RIP Theora.
You've never been good, but you paved the way for what AV1 is today and for that we will always be thankful.
Figuratively in the sense of being a royalty-free video codec. Although arguably MPEG-1 and H.263 were also royalty free and available earlier. SO many proprietary codecs of the 90's were basically H.263 with some extra features.

And literally in that VP3 became Theora, and also VP4-7, and with Google's purchase of On2, VP8 and VP9. AV1's code base started with the VP10 in-progress implementation, with contributions from other stakeholders and efforts (Thor and Daala most notably).
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book
Old 6th November 2023, 14:00   #12  |  Link
kurkosdr
Registered User
 
Join Date: Aug 2009
Posts: 309
Quote:
Originally Posted by benwaggoner View Post
Figuratively in the sense of being a royalty-free video codec. Although arguably MPEG-1 and H.263 were also royalty-free and available earlier. SO many proprietary codecs of the 90's were basically H.263 with some extra features.
H.263 could not possibly be royalty-free at the time; it relies too much on H.262 aka MPEG-2 (which at the time was heavily patented). And let's not forget that MPEG-4 ASP has had a sizeable patent pool too (the last US patent expires next week), so some of those patents surely covered H.263 as well.

Proprietary codecs of the 90s operated on the principle of "keep the spec secret and hope nobody takes the time to find out we step on a ton of H.263 patents". WMV is a good example of this: the moment the spec was opened as VC-1, patent holders started pointing at parts of the spec and saying "Hey! We have a patent on that", and soon after a MPEG LA patent pool was assembled for VC-1 (which is active to this day).

Theora was the first format to withstand patent scrutiny even with an open spec. And considering the MPEG LA assertions against VP8 were settled, it's the only open-spec format that faced no patent assertions at all. I know, it's irrelevant after the VP8 settlement, but it's worth mentioning: Theora was crap because it had to avoid a patent thicket.

And let's not forget Theora and VP8 are the reason the H.264 patent holders agreed to not charge "content fees" for free web video, which made H.264 a truly universal standard.

Last edited by kurkosdr; 6th November 2023 at 14:21.
Old 29th December 2023, 04:20   #13  |  Link
monsterogv
Registered User
 
Join Date: Mar 2023
Posts: 5
zero day attack, I got it on streaming. codecs like vp9 a donation. zero day attack on mpeg streaming, how does the revised technology experience this?
Old 2nd January 2024, 03:49   #14  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,750
Quote:
Originally Posted by kurkosdr View Post
H.263 could not possibly be royalty-free at the time; it relies too much on H.262 aka MPEG-2 (which at the time was heavily patented). And let's not forget that MPEG-4 ASP has had a sizeable patent pool too (the last US patent expires next week), so some of those patents surely covered H.263 as well.
It's been a long time, but I thought there was some sort of agreement not to enforce H.263 patents, like there had been for MPEG-1.

Quote:
Proprietary codecs of the 90s operated on the principle of "keep the spec secret and hope nobody takes the time to find out we step on a ton of H.263 patents". WMV is a good example of this: the moment the spec was opened as VC-1, patent holders started pointing at parts of the spec and saying "Hey! We have a patent on that", and soon after a MPEG LA patent pool was assembled for VC-1 (which is active to this day).
IIRC, there were a bunch of VC-1 patents that Microsoft had been waiting to file until the standard was done, but that other companies had figured out from the reference code implementation (which was pretty much just the production code with all the fun optimizations), and filed first. I didn't have any real engagement with the VC-1 design or patent process; I joined Microsoft after the bitstream was frozen.

Quote:
Theora was the first format to withstand patent scrutiny even with an open spec. And considering the MPEG LA assertions against VP8 were settled, it's the only open-spec format that faced no patent assertions at all. I know, it's irrelevant after the VP8 settlement, but it's worth mentioning: Theora was crap because it had to avoid a patent thicket.
Theora was almost entirely an open-source implementation of VP3; On2 started to engineer around software patents a long time back. I remember stuff like scanning from the bottom of the frame instead of top, just trying to do things differently than the "obvious" way to get around swaths of patents that assumed coding would always be from the upper left.

Quote:
And let's not forget Theora and VP8 are the reason the H.264 patent holders agreed to not charge "content fees" for free web video, which made H.264 a truly universal standard.
I think that was more VP6, which was the "good" codec for Flash before they added H.264. On2 had a whole lot of predatory pricing around encoders, decoders, distribution; pretty much anything you could think of monetizing. A memorable object lesson in "how to kill your ecosystem with greed."

A whole lot of the subjective quality of VP6 was due to its advanced postprocessing, far beyond what any other codec did. Noise synthesis, deblocking, deringing, all of that. When that got turned off in a decoder (through configuration, or because fps dropped too much), the actual baseband video was not great. VP6 was handily outperformed by VC-1 Main Profile and H.264 Baseline.

That said, VP7 also gave us HTTP adaptive streaming! Move Networks' stuff was all based on VP7. I was involved in getting them to use VC-1, but the company sort of cratered soon after. Big fees and poor flexibility for business models tanked them. It turned out to be cheaper just to reimplement. Microsoft had done a technology cross-licensing deal with Move, hence Smooth Streaming, which begat DASH.

Sometime I'll tell the story of the fight to use fragmented MPEG-4 instead of ASF. What finally sealed the deal was my demonstrating how the lack of ASF support for B-frames would make H.264 support infeasible (H.264's max 16 b-frames is a lot of a 48 frame GOP)!

Both ASF and QuickTime were originally designed without any consideration of bidirectional prediction, which required a whole lot of rearchitecting to make work. This is one reason why VC-1 was optimized around not using more than 1 b-frame in a row.

Sorenson Video 3 in QuickTime just offset the frame indices by 2 frames so it could also use a single b-frame (so the payload of "Frame 4" was actually the second frame). So audio sync would be off by two frames. It took me ages to get the post-Terran Interactive Media Cleaner Pro team to just implement a two-frame audio shift to compensate.

B-frames in ASF did the same thing, although the audio offset was built in, so it didn't become an issue.

QuickTime was years late in supporting MPEG-1 because of the lack of B-frame support; having three in a row with Open GOP was just impossible to do within QuickTime without major refactoring. That was why MPEG-1 decode was implemented as a format playback plug-in, not as a QuickTime codec. I think that finally got fixed in QuickTime 7.

Fortunately the QuickTime-derived MPEG-4 file format got b-frames figured out from the start.
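The container headache described above comes down to decode order differing from display order once B-frames are in play. A minimal sketch of the reordering (my illustration, not any shipping demuxer): each B-frame depends on the *following* reference frame, so references must be emitted ahead of the B-frames that need them.

```python
def decode_order(display_pattern: str) -> str:
    """Reorder a display-order GOP string (e.g. 'IBBP') into decode order.
    A B-frame references the next I/P frame, so that reference must be
    decoded first; held B-frames are emitted right after it."""
    out, pending_b = [], []
    for frame in display_pattern:
        if frame == "B":
            pending_b.append(frame)   # not decodable until the next reference arrives
        else:                         # I or P: a reference frame
            out.append(frame)
            out.extend(pending_b)     # the held B-frames are now decodable
            pending_b = []
    return "".join(out) + "".join(pending_b)

print(decode_order("IBBPBBP"))  # -> IPBBPBB
```

A container that stores only one timestamp per sample has no way to express the gap between decode time and display time, which is the gap Sorenson's two-frame index offset was papering over.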
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book
Old 2nd January 2024, 13:47   #15  |  Link
FranceBB
Broadcast Encoder
 
 
Join Date: Nov 2013
Location: Royal Borough of Kensington & Chelsea, UK
Posts: 2,883
I'm always fascinated when I read about these stories.
Back when I was still playing around, Ben was living what has then become history.
It's always nice to have these things narrated by those who actually lived/were part of it.
Oh and by the way, I had no idea you were working for M$ in 2006!
I feel like this is the Doom9 video codec version of the Professor Brailsford tales on Computerphile
Old 3rd January 2024, 20:22   #16  |  Link
Blue_MiSfit
Derek Prestegard IRL
 
 
Join Date: Nov 2003
Location: Los Angeles
Posts: 5,988
As always, thanks for sharing your war stories, Ben
__________________
These are all my personal statements, not those of my employer :)
Old 9th January 2024, 20:20   #17  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,750
Quote:
Originally Posted by FranceBB View Post
I'm always fascinated when I read about these stories.
Back when I was still playing around, Ben was living what has then become history.
It's always nice to have these things narrated by those who actually lived/were part of it.
Oh and by the way, I had no idea you were working for M$ in 2006!
I feel like this is the Doom9 video codec version of the Professor Brailsford tales on Computerphile
Thanks!

My first encode was with Macromind Director Accelerator back in 1989. I was taking a grad level 3D animation class at Mass in my 2nd year at Hampshire College, and we needed a way to move our rendered frames around on 1.4 MB Mac floppy disks (dorm room access to school servers was only via 1200 baud modem). I turned out to be a terrible animator, and got a D in the class, but started designing (terrible) video codecs in my head in class after that. The teachers and some students went on to create Infini-D if anyone remembers that.

I remember the revelation that was Apple Compact Video (later renamed Cinepak). And when IMA audio came out, and we could do (mediocre) 16-bit audio in half the bitrate of 8-bit. We could do 22 kHz mono instead of 11 at 1x CD-ROM bitrates.
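The arithmetic behind that recollection checks out, assuming IMA ADPCM's ~4 bits per sample (a quick sanity check of the numbers, nothing more):

```python
def pcm_kbps(rate_hz: int, bits_per_sample: int, channels: int = 1) -> float:
    """Audio bitrate in kbit/s for a constant bits-per-sample format."""
    return rate_hz * bits_per_sample * channels / 1000.0

# 22.05 kHz mono, as in the post:
pcm_8bit  = pcm_kbps(22050, 8)  # plain 8-bit PCM: 176.4 kbit/s
ima_16bit = pcm_kbps(22050, 4)  # IMA ADPCM stores 16-bit audio at ~4 bits/sample: 88.2 kbit/s
# -> 16-bit audio (via IMA) lands at half the bitrate of 8-bit PCM
```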

CD-ROM itself became popular after I'd already been doing encoding for money (trade show and kiosk use initially). I used to buy blank CD-R discs for $18 for our $2000 1x burner.

Oddly, much of my early career happened by accident when I was trying to become a screenwriter and movie producer. I actually scripted a 2-page scene of people flirting over video compression ratios back in 1993 - what I've done since seems almost fated. I didn't finally realize that compression was my actual career until 1997.
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book
Old 10th January 2024, 12:30   #18  |  Link
Emulgator
Big Bit Savings Now !
 
 
Join Date: Feb 2007
Location: close to the wall
Posts: 1,531
1987, my penultimate year of studying for my Diplom-Ingenieur degree.

I was sitting behind my implementation of a Signal Processing PCB
(Programmable preamp, anti-aliasing filtering, Sample & Hold around an industrial-grade 15-bit ADC feeding a TMS32000 signal processor).

My tutor and his master pupil had the FFT butterfly algo ready and then somebody else (our professor) came in
and, while reconsidering the project after a recent meeting, he mentioned something along the lines of:
"...and one day we will need to perform data reduction, it is done already in the states..."

I was like: Eek, Devil's craft. Now we finally have good, clean data, and lots of them.
Well, how and why to reduce what we just gained ?
This would introduce faults wouldn't it ?

37 years and some terawatt-hours later: What an art that has become. ;-}
__________________
"To bypass shortcuts and find suffering...is called QUALity" (Die toten Augen von Friedrichshain)
"Data reduction ? Yep, Sir. We're that issue working on. Synce invntoin uf lingöage..."
Old 10th January 2024, 17:40   #19  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,750
Quote:
Originally Posted by Emulgator View Post
1987, my penultimate year of studying for my Diplom-Ingenieur degree.

I was sitting behind my implementation of a Signal Processing PCB
(Programmable preamp, anti-aliasing filtering, Sample & Hold around an industrial-grade 15-bit ADC feeding a TMS32000 signal processor).

My tutor and his master pupil had the FFT butterfly algo ready and then somebody else (our professor) came in
and, while reconsidering the project after a recent meeting, he mentioned something along the lines of:
"...and one day we will need to perform data reduction, it is done already in the states..."

I was like: Eek, Devil's craft. Now we finally have good, clean data, and lots of them.
Well, how and why to reduce what we just gained ?
This would introduce faults wouldn't it ?

37 years and some terawatt-hours later: What an art that has become. ;-}
Two years head start on me!

Yeah, I remember so many philosophical arguments about lossy versus lossless compression in the early days (and it still comes up sometimes, particularly with audio).

It generally got down to something like "do you want to spend your bits on 96x72 of perfect pixels, or 320x240 of good pixels? Downscaling is lossy too!"
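That tradeoff is easy to put numbers on. A quick sketch (the ~1.2 Mbit/s budget and 15 fps are my illustrative 1x CD-ROM era figures, not from the post):

```python
def bits_per_pixel(bitrate_bps: int, width: int, height: int, fps: float) -> float:
    """Average bits available per pixel per frame at a given video bitrate."""
    return bitrate_bps / (width * height * fps)

budget = 1_200_000  # ~1x CD-ROM era video budget, bits/s
small = bits_per_pixel(budget, 96, 72, 15)    # lavish bits per pixel, tiny frame
large = bits_per_pixel(budget, 320, 240, 15)  # ~11x fewer bits per pixel, watchable size
print(round(small, 2), round(large, 2))  # -> 11.57 1.04
```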
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book
Old 12th January 2024, 21:23   #20  |  Link
kurkosdr
Registered User
 
Join Date: Aug 2009
Posts: 309
Quote:
Originally Posted by benwaggoner View Post
Two years head start on me!

Yeah, I remember so many philosophical arguments about lossy versus lossless compression in the early days (and it still comes up sometimes, particularly with audio).

It generally got down to something like "do you want to spend your bits on 96x72 of perfect pixels, or 320x240 of good pixels? Downscaling is lossy too!"
Another big problem of lossless video compression is that it wouldn't help in the broadcast world where the bitstream has to fit in a given bits per second capacity of the multiplex, because lossless compression does not guarantee any bitrate reduction. This means you'd essentially be capped to the same resolution you'd have with uncompressed video (and lossless compression would only help viewers reduce their storage costs when recording/timeshifting). This means that even a state-of-the-art DVB-T2 multiplex could only do 352x288 or so, and that's generous (you'd have to use fragile 256QAM FEC 2/3 modulation to get ~40Mbps out of DVB-T2, and you'd have to pick an even worse FEC to accommodate uncompressed/lossless audio too).
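A back-of-envelope check of that capacity claim, assuming 8-bit 4:2:0 at 25 fps (my assumptions, not stated in the post):

```python
def raw_bitrate_mbps(width: int, height: int, fps: float,
                     bits_per_sample: int = 8, chroma_factor: float = 1.5) -> float:
    """Uncompressed bitrate of 4:2:0 video (1.5 samples per pixel on average)."""
    return width * height * chroma_factor * bits_per_sample * fps / 1e6

MUX_MBPS = 40.0  # ~best-case DVB-T2 payload, per the post
print(raw_bitrate_mbps(352, 288, 25))  # -> 30.4128, just fits in the 40 Mbit/s mux
print(raw_bitrate_mbps(720, 576, 25))  # -> 124.416, plain SD is already far over
```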

Of course, lossy compression has the problem that bitrates that are too low are used all too often. The temptation to cram in yet another channel is just too big for broadcasters. Every time I see 4 Mbps average bitrates on digitalbitrate.com for Full HD H.264 channels, it just makes me sad. I mean, some of us pay a subscription to some "public broadcaster" whether we want it or not, so at the very least they should give us artifact-free broadcast video, yes, even when there is lots of stuff going on in the video.

IMO an ideal solution would be to legislate a minimum statistical bitrate per channel: divide the mux reasonably instead of cramming in an ever-increasing number of channels. Governments legislate all kinds of specs, so why not? Unfortunately, most people don't even know that quality loss from lossy compression is a thing, let alone that it can be very significant. Most people only know resolution is a thing, and that's it.

Last edited by kurkosdr; 12th January 2024 at 21:42.