Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion. Before you start posting please read the forum rules. By posting to this forum you agree to abide by the rules.
25th October 2023, 07:10 | #1 | Link |
Registered User
Join Date: Dec 2013
Posts: 343
Chrome may deprecate Theora
https://groups.google.com/a/chromium...m/ajHePzglAwAJ
A hard blow for Internet human rights: Chrome plans to deprecate support for the Theora video codec.
25th October 2023, 08:02 | #2 | Link |
Registered Developer
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,341
Internet Human Rights? What?
We have better patent-free and open-source video codecs. Usage of Theora is practically zero, any remaining use can be replaced by a polyfill, and the codec faces emerging security threats without rapid responses in place... time to move on to VP9 and AV1.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders |
25th October 2023, 08:35 | #3 | Link | |
Artem S. Tashkinov
Join Date: Dec 2006
Posts: 337
Quote:
Browsers mustn't support something just because 10 people in the whole world need it, while 99.999999% of people are exposed to possible vulnerabilities hidden in this codec. It's just not worth it. You wanna play such videos? Compile ffmpeg and have fun. Let's not waste time discussing this nothingburger.
25th October 2023, 18:06 | #6 | Link |
Registered User
Join Date: Mar 2007
Posts: 95
|
It seems Wikipedia has been using it since 2015, and Theora is supported too: https://github.com/brion/ogv.js/
27th October 2023, 01:51 | #7 | Link | |
Moderator
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,750
|
Quote:
For those who weren't there for the blow-by-blow, it started in the late 90s when On2 released VP3 and tried giving decoders away for free to drive an encoder market. They released free decoders for QuickTime and other platforms. It didn't get much traction (rate control in the available encoder was horrifically bad).

By the time they had a roadmap to having VP6 in Flash (I forget if VP4 or 5 were used for anything significant), they wanted to dump it. They tried to give it away for free to Terran Interactive, where I worked, among other places. No takers. So when there was a need for a royalty-free codec, they offered it up. And it worked, but it was still a pretty mediocre mid-90s CD-ROM codec.

The open source version got some extensions and tuning, so it was better than stock VP3, but there was only so far it could be pushed. Theora launched with compression efficiency well behind the RealVideo and Windows Media codecs available at the time. I actually demonstrated that a well-tuned MPEG-1 encode could outperform it.
30th October 2023, 16:52 | #11 | Link | |
Moderator
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,750
|
Quote:
And that's literally the lineage: VP3 became Theora, then VP4-7, and, with Google's purchase of On2, VP8 and VP9. AV1's code base started with the in-progress VP10 implementation, with contributions from other stakeholders and efforts (most notably Thor and Daala).
6th November 2023, 14:00 | #12 | Link | |
Registered User
Join Date: Aug 2009
Posts: 309
|
Quote:
Proprietary codecs of the 90s operated on the principle of "keep the spec secret and hope nobody takes the time to find out we step on a ton of H.263 patents". WMV is a good example of this: the moment the spec was opened as VC-1, patent holders started pointing at parts of the spec and saying "Hey! We have a patent on that", and soon after an MPEG LA patent pool was assembled for VC-1 (which is active to this day).

Theora was the first format to withstand patent scrutiny even with an open spec, and since the MPEG LA assertions against VP8 ended in a settlement, it's the only format with an open spec that faced no patent assertions at all. I know that's largely academic now, but it's worth mentioning: Theora was crap because it had to avoid a patent thicket.

And let's not forget Theora and VP8 are the reason the H.264 patent holders agreed not to charge "content fees" for free web video, which made H.264 a truly universal standard.

Last edited by kurkosdr; 6th November 2023 at 14:21.
2nd January 2024, 03:49 | #14 | Link
Moderator
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,750
|
Quote:
A whole lot of the subjective quality of VP6 was due to its advanced postprocessing, far beyond what any other codec did: noise synthesis, deblocking, deringing, all of that. When that got turned off in a decoder (through configuration, or because fps dropped too much), the actual baseband video was not great. VP6 was handily outperformed by VC-1 Main Profile and H.264 Baseline.

That said, VP7 also gave us HTTP adaptive streaming! Move Networks' stuff was all based on VP7. I was involved in getting them to use VC-1, but the company sort of cratered soon after; big fees and poor flexibility for business models tanked them. It turned out to be cheaper just to reimplement. Microsoft had done a technology cross-licensing deal with Move, hence Smooth Streaming, which begat DASH.

Sometime I'll tell the story of the fight to use fragmented MPEG-4 instead of ASF. What finally sealed the deal was my demonstrating how the lack of ASF support for B-frames would make H.264 support infeasible (H.264's maximum of 16 b-frames is a lot of a 48-frame GOP)!

Both ASF and QuickTime were originally designed without any consideration of bidirectional prediction, which required a whole lot of rearchitecting to make work. This is one reason why VC-1 was optimized around not using more than 1 b-frame in a row. Sorenson Video 3 in QuickTime just offset the frame indices by 2 frames so it could also use a single b-frame (so the payload of "Frame 4" was actually the second frame). So audio sync would be off by two frames. It took me ages to get the post-Terran Interactive Media Cleaner Pro team to just implement a two-frame audio shift to compensate. B-frames in ASF did the same thing, although the audio offset was built in, so it didn't become an issue.

QuickTime was years late in supporting MPEG-1 because of the lack of B-frame support; having three in a row with Open GOP was just impossible to do within QuickTime without major refactoring. That was why MPEG-1 decode was implemented as a format playback plug-in, not as a QuickTime codec. I think that finally got fixed in QuickTime 7. Fortunately the QuickTime-derived MPEG-4 file format got b-frames figured out from the start.
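The reordering problem described above can be put in a toy sketch (frame labels and the helper below are illustrative, not the actual Sorenson or ASF data structures): with at most one B-frame between two references, decode order differs from display order because each B-frame needs its future reference decoded first.

```python
# Toy illustration of B-frame reordering with a single B-frame in a row,
# as in the VC-1 / Sorenson Video 3 era described above.
# Display order:  I0 B1 P2 B3 P4
# Decode order:   I0 P2 B1 P4 B3  (each B waits for its future reference)

def decode_order(display_frames):
    """Reorder (type, index) frames from display order into decode order,
    assuming at most one B-frame between two reference frames."""
    out = []
    pending_b = None
    for ftype, idx in display_frames:
        if ftype == "B":
            pending_b = (ftype, idx)   # hold the B until its future ref is out
        else:
            out.append((ftype, idx))   # reference frame is emitted first
            if pending_b is not None:
                out.append(pending_b)
                pending_b = None
    return out

display = [("I", 0), ("B", 1), ("P", 2), ("B", 3), ("P", 4)]
print(decode_order(display))
# A container with no reordering support can fake presentation timing by
# shifting every index or timestamp by a constant; the "offset by two
# frames" hack (and the matching audio shift) is exactly that idea.
```

This is why a constant audio shift was enough to compensate: the offset between decode and display position is fixed when runs of B-frames are capped at one.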
2nd January 2024, 13:47 | #15 | Link |
Broadcast Encoder
Join Date: Nov 2013
Location: Royal Borough of Kensington & Chelsea, UK
Posts: 2,883
|
I'm always fascinated when I read about these stories.
Back when I was still playing around, Ben was living what has since become history. It's always nice to have these things narrated by those who actually lived it / were part of it.

Oh, and by the way, I had no idea you were working for M$ in 2006! I feel like this is the Doom9 video codec version of the Professor Brailsford tales on Computerphile.
9th January 2024, 20:20 | #17 | Link | |
Moderator
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,750
|
Quote:
My first encode was with Macromind Director Accelerator back in 1989. I was taking a grad-level 3D animation class at Mass in my 2nd year at Hampshire College, and we needed a way to move our rendered frames around on 1.4 MB Mac floppy disks (dorm room access to school servers was only via 1200 baud modem). I turned out to be a terrible animator, and got a D in the class, but started designing (terrible) video codecs in my head in class after that. The teachers and some students went on to create Infini-D, if anyone remembers that.

I remember the revelation that was Apple Compact Video (later renamed Cinepak). And when IMA audio came out, we could do (mediocre) 16-bit audio in half the bitrate of 8-bit: 22 kHz mono instead of 11 at 1x CD-ROM bitrates. CD-ROM itself became popular after I'd already been doing encoding for money (trade show and kiosk use initially). I used to buy blank CD-R discs for $18 for our $2000 1x burner.

Oddly, much of my early career happened by accident when I was trying to become a screenwriter and movie producer. I actually scripted a 2-page scene of people flirting over video compression ratios back in 1993 - what I've done since seems almost fated. I didn't finally realize that compression was my actual career until 1997.
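The IMA audio arithmetic above checks out: IMA ADPCM stores each 16-bit sample as a 4-bit code, half the bits per sample of raw 8-bit PCM, so the sample rate can double at the same bitrate. A quick sanity check (mono, classic Mac sample rates):

```python
# IMA ADPCM encodes each 16-bit PCM sample as a 4-bit code, i.e. half the
# bits per sample of raw 8-bit PCM, so sample rate doubles at equal bitrate.

def pcm8_bitrate(rate_hz: int) -> int:
    return rate_hz * 8          # 8 bits per sample, mono PCM

def ima_bitrate(rate_hz: int) -> int:
    return rate_hz * 4          # 4-bit ADPCM code per sample, mono

print(pcm8_bitrate(11025), "bit/s for 11 kHz 8-bit PCM")
print(ima_bitrate(22050), "bit/s for 22 kHz IMA ADPCM")
# Both come to 88200 bit/s: the same budget carries double the sample rate.
```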
10th January 2024, 12:30 | #18 | Link |
Big Bit Savings Now !
Join Date: Feb 2007
Location: close to the wall
Posts: 1,531
|
1987, the penultimate year of studying for my Diplom-Ingenieur degree.
I was sitting behind my implementation of a signal-processing PCB (programmable preamp, anti-aliasing filtering, sample & hold around an industrial-grade 15-bit ADC feeding a TMS32000 signal processor). My tutor and his master's student had the FFT butterfly algo ready, and then somebody else (our professor) came in and, while reconsidering the project from a recent meeting's perspective, mentioned something along the lines of: "...and one day we will need to perform data reduction, it is done already in the States..."

I was like: eek, devil's craft. Now we finally have good, clean data, and lots of it. Well, how and why reduce what we just gained? Wouldn't that introduce faults?

37 years and some terawatt-hours later: what an art that has become. ;-}
__________________
"To bypass shortcuts and find suffering...is called QUALity" (Die toten Augen von Friedrichshain) "Data reduction ? Yep, Sir. We're that issue working on. Synce invntoin uf lingöage..." |
10th January 2024, 17:40 | #19 | Link | |
Moderator
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,750
|
Quote:
Yeah, I remember so many philosophical arguments about lossy versus lossless compression in the early days (and it still comes up sometimes, particularly with audio). It generally came down to something like: "do you want to spend your bits on 96x72 of perfect pixels, or 320x240 of good pixels? Downscaling is lossy too!"
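That tradeoff is easy to put in numbers. A back-of-the-envelope sketch (the 1 Mbit/s video budget and 15 fps are illustrative 1x CD-ROM era assumptions, not figures from the post):

```python
# Bit budget behind "downscaling is lossy too": at a fixed bitrate,
# fewer pixels each get more bits.

def bits_per_pixel(bitrate_bps: float, width: int, height: int, fps: float) -> float:
    return bitrate_bps / (width * height * fps)

video_bitrate = 1_000_000   # ~1 Mbit/s left for video after audio (assumed)
fps = 15                    # typical CD-ROM era frame rate (assumed)

small = bits_per_pixel(video_bitrate, 96, 72, fps)     # ~9.6 bpp
large = bits_per_pixel(video_bitrate, 320, 240, fps)   # ~0.87 bpp

print(f"96x72:   {small:.2f} bits/pixel (near-perfect pixels)")
print(f"320x240: {large:.2f} bits/pixel (good-enough pixels)")
```

Roughly 11x more bits per pixel at the small size, paid for by throwing away 11x the pixels up front.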
12th January 2024, 21:23 | #20 | Link | |
Registered User
Join Date: Aug 2009
Posts: 309
|
Quote:
Of course, lossy compression has the problem that bitrates that are too low are used far too often; the temptation to cram in yet another channel is just too big for broadcasters. Every time I see 4 Mbps average bitrates on digitalbitrate.com for FullHD H.264 channels, it just makes me sad. I mean, some of us pay a subscription to some "public broadcaster" whether we want to or not, so at the very least they should give us artifact-free broadcast video, yes, even when there is lots of stuff going on in the video.

IMO an ideal solution would be to legislate a minimum statistical bitrate per channel: divide the mux reasonably instead of cramming in an ever-increasing number of channels. Governments legislate all kinds of specs, so why not?

Unfortunately, most people don't even know that quality loss due to lossy compression is not only a thing but can be very significant too. Most people only know resolution is a thing, and that's it.

Last edited by kurkosdr; 12th January 2024 at 21:42.