Welcome to Doom9's Forum, THE place to be for everyone interested in DVD conversion.

Old 4th May 2013, 15:08   #1  |  Link
flebber
Practising Schemer
 
Join Date: Feb 2008
Location: Newcastle, Australia
Posts: 791
ORBX.js HD Codec

Saw this Brendan Eich article https://brendaneich.com/2013/05/today-i-saw-the-future/

It claims that this js library has:
Quote:
25% better compression than H.264 for competitive quality, adaptive bit-rate while streaming, integer and (soon) floating point coding, better color depth, better intra-frame coding, a more parallelizable design the list goes on.
Is this right?
Old 5th May 2013, 00:54   #2  |  Link
hajj_3
Registered User
 
Join Date: Mar 2004
Posts: 769
Nobody knows until they release the encoder for us to test and compare against x264. I highly doubt it will be 25% better than x264; possibly better than some commercial H.264 encoder, who knows. We also have no information yet on whether it violates any MPEG-LA patents. It is an interesting development, though, and could potentially be important.
Old 5th May 2013, 04:56   #3  |  Link
flebber
Practising Schemer
 
Join Date: Feb 2008
Location: Newcastle, Australia
Posts: 791
Brendan Eich doesn't put much FUD out there, though he gets things wrong like the rest of us. Obviously he is looking at this from a Mozilla and free-web perspective, wanting to make the browser the whole platform.
Old 5th May 2013, 07:29   #4  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 8,990
Personally, if someone claims not only to have developed a platform to run a video decoder inside the browser, but also a codec that beats H.264 hands down, the only way to convince me is to give me something to play with.
So, until there is independent testing of these claims, I'll consider it hype.

You know how such comparisons always use the H.264 reference encoder as the baseline, so theirs looks better?
Real-world comparisons against real-world encoders, and then we'll talk!
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
Old 5th May 2013, 15:38   #5  |  Link
iwod
Registered User
 
Join Date: Apr 2002
Posts: 672
On the issue of patents, I think (correct me if I am wrong) this JavaScript codec avoids them all by not providing binaries, much like x264 provides source code but not compiled binaries, since the US ruled some time ago that code distribution is protected as free speech.

And it would be great if that were the future, but the problem is still battery life: until there is special hardware that can run JavaScript fast enough to decode this while retaining ultra-low power consumption, it won't work out.

Note: even with asm.js running within roughly 1.2x of native-code speed, it would still be far from the power efficiency of a dedicated H.264 decoder.
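To make the asm.js reference concrete, here is a hypothetical sketch of the coding idiom (not ORBX code): the "use asm" pragma plus explicit integer coercions (`n | 0`) annotate every value as int32 so an engine can compile the function ahead of time, while the same source still runs as plain JavaScript everywhere.

```javascript
// Hypothetical asm.js-style module; illustrates the idiom, not ORBX itself.
// "use asm" plus |0 coercions mark all values as int32 so the engine can
// compile the body ahead of time; elsewhere it just runs as normal JS.
function SumModule(stdlib, foreign, heap) {
  "use asm";
  var i32 = new stdlib.Int32Array(heap);   // typed view over the shared heap
  function sum(n) {
    n = n | 0;                             // parameter type annotation: int
    var total = 0;
    var i = 0;
    for (i = 0; (i | 0) < (n | 0); i = (i + 1) | 0) {
      total = (total + (i32[(i << 2) >> 2] | 0)) | 0;  // byte-addressed load
    }
    return total | 0;                      // return type annotation: int
  }
  return { sum: sum };
}

// Plain-JS usage: fill the heap and sum the first four elements.
var heap = new ArrayBuffer(0x10000);
var view = new Int32Array(heap);
view[0] = 1; view[1] = 2; view[2] = 3; view[3] = 4;
var mod = SumModule({ Int32Array: Int32Array }, {}, heap);
console.log(mod.sum(4)); // 10
```

The point of iwod's note is that even when this style compiles to near-native machine code, it still burns general-purpose CPU cycles, which a fixed-function decode ASIC avoids entirely.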

I truly hope that someday the H.265 patent holders will collect their payments through hardware only, leaving the software side completely free.
Old 6th May 2013, 17:16   #6  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 2,401
Ars Technica had a good, reasonably skeptical article: http://arstechnica.com/information-technology/2013/05/are-video-codecs-written-in-javascript-really-the-future/

This seems of unlikely value to me. According to the article, it won't even support P or B frames on most browsers right now. And I'm not really sure that something which requires WebGL to be interesting really counts as "JavaScript."

I'd think that most devices with hardware good enough for JS+WebGL also have a hardware H.264 decoder ASIC, and probably hardware encoding as well. Unless they are doing something CRAZY, there's no way they're getting 25% more efficiency at similar wattage!

I expect that "25%" figure is probably a case where they didn't configure the H.264 encode correctly. I can't tell you at how many NAB booths I tweaked the x264 command lines to get an "apples-to-apples" comparison.
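For reference, a sane real-world baseline looks something like the following (a hypothetical example of the kind of command line involved, not the actual settings used at any booth; filenames are placeholders):

```shell
# Encode the same source tuned for the metric that will be reported, so the
# comparison is apples-to-apples rather than reference-encoder-at-defaults.
x264 --preset slow --crf 18 --tune ssim --output out_ssim.264 input.y4m
x264 --preset slow --crf 18 --tune psnr --output out_psnr.264 input.y4m
```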
__________________
Ben Waggoner
Principal Video Specialist, Amazon Instant Video

My Compression Book

Amazon Instant Video is hiring! PM me if you're interested.
Old 6th May 2013, 20:40   #7  |  Link
STaRGaZeR
4:2:0 hater
 
Join Date: Apr 2008
Posts: 1,307
Javascript? kthxbai
__________________
Specs, GTX970 - PLS 1440p@96Hz
Quote:
Originally Posted by Manao View Post
That way, you have xxxx[p|i]yyy, where xxxx is the vertical resolution, yyy is the temporal resolution, and 'i' says the image has been irremediably destroyed.
Old 8th May 2013, 23:20   #8  |  Link
foxyshadis
ангел смерти
 
foxyshadis's Avatar
 
Join Date: Nov 2004
Location: Lost
Posts: 9,175
There was a great post in Ars' comments (aside from endless OT bickering about the strengths and weaknesses of JS in general):

Quote:
Originally Posted by w0utert
I fully understand why Mozilla would want to move more and more stuff into the browser, the browser being their core product. I don't agree with most of it though. While technically impressive, I don't really see a lot of benefits to a PDF viewer or a video decoder implemented in JavaScript and running in the browser. I mean, theoretically I can think of some advantages (easy to update, write-once-run-anywhere, plugs a few possible security holes), but those are mainly advantages for developers, not users, and are IMO far outweighed by disadvantages (less efficient, introduces *other* possible security holes, don't integrate with the OS as well, cannot make use of OS/hardware specific features, need a full browser environment to run, don't translate well to hardware solutions etc).

I really think Mozilla is trying too hard and focusing on the wrong problems. Instead of trying to move _everything_ into the browser, trying to get the whole world into some form of cloud-based computing running in the browser, they should focus on the applications where cloud computing actually makes sense and those that actually benefit from write-once-run-anywhere. There are plenty. PDF viewers, image and video decoders, etc. are not things Mozilla should be worrying about; every OS has these covered, and the OS will always provide a much better experience for such tasks.
I'm not 100% sure I agree, since platform-specific hooks are a source of great pain in platform porting and maintenance, but at the same time they invariably give a better and more consistent experience to the end user. Seen in that light, the idea behind a JS decodable codec would only make sense as a fallback when all else fails, or when the platform port is immature, like the way youtube serves up FLV1 to people with ancient flash versions.
__________________
There are four boxes to be used in defense of liberty: soap, ballot, jury, and ammo. Please use in that order. ~ Ed Howdershelt
Old 9th May 2013, 10:05   #9  |  Link
dapperdan
Registered User
 
Join Date: Aug 2009
Posts: 143
Given Adobe products' long history of security issues, I can see a very valid role for PDF.js when reading random files off the internet. Mozilla have also stated that ambitious projects like that help them push the whole browser platform to the next level.

For codecs, the big thing for Mozilla is punting the patent issue, but I think he may have been quite literal in his "I have seen the future", i.e. not "we should all use this right now" but "we'll be using something like this in the future". It's easy to see that over time codecs (image/audio/video) have trended from single-purpose hardware to software. Following on from that, a continually updated software codec for use with disposable content (e.g. video chat) makes more sense in that world than something nailed down in a specification 10 years ago. (I had thought this might be where Google was aiming with the VP* codecs, given their culture of beta releases and incremental improvements via online updates.)

Doing it "in JavaScript" rather than as part of the browser pushes that vision further out again, but even before this announcement there had been a progression of work from all sorts of people to open up modern hardware to JavaScript. How many years do you have to go back to find "you'll never get Spotify/Photoshop/Unreal Engine etc. working in a browser"?

Like most crazy things it'll start in niches: this ORBX thing appears to be a family of codecs, one for screen sharing, another for 3D video games. Perhaps that specialisation gives them an edge in those domains, but I think it's definitely where things are headed generally, even if it takes a while to get there.
Old 9th May 2013, 15:41   #10  |  Link
mandarinka
Registered User
 
mandarinka's Avatar
 
Join Date: Jan 2007
Posts: 659
Except that, with the performance losses from JavaScript, your format will likely have to be compromised enough that you will wish you could use a standard codec designed 10 years ago. (Hey, H.264 is almost that old, and it rules!)

Bonus points: the standard codec will have decent encoders with good quality-per-bit results, while your JavaScript monster might not, especially if it is an in-house design with a proprietary design and development model (VP8, this JavaScript vapourware).
Old 12th August 2015, 10:47   #11  |  Link
vivan
/人 ◕ ‿‿ ◕ 人\
 
Join Date: May 2011
Location: Russia
Posts: 648
Today I remembered this vapourware. It seems they actually made their comparison public in October: http://aws.otoy.com/docs/ORBX2_Whitepaper.pdf
It says they compared against x264 --preset slow --bframes 0 --tune ssim / --tune psnr (which is quite sane).
I've been trying to reproduce it and, well, the SSIM values are obviously wildly different (--crf 18 gives SSIM-Y 0.9903910 = 20.173 dB @ 6289.24 kb/s), so either there are different ways to calculate SSIM in dB, or I don't know what. For PSNR they used "PSNR-HVS-Y", and I have no idea how to compute that.
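For what it's worth, the dB figure in that x264 log line appears to follow the usual mapping SSIM(dB) = -10 * log10(1 - SSIM); a quick check under that assumption reproduces the 20.173 dB value above:

```python
import math

def ssim_db(ssim: float) -> float:
    """Map a mean SSIM score to decibels (assumed formula, matching
    what x264's log output seems to use: -10 * log10(1 - SSIM))."""
    return -10.0 * math.log10(1.0 - ssim)

print(round(ssim_db(0.9903910), 3))  # 20.173
```

So if the whitepaper's dB numbers don't line up with x264's, they may simply be using a different mapping.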

UPD: oh wait, they do have a normal PSNR-Y graph. Real x264 results are:
43.207 dB @ 2016 kb/s
45.057 dB @ 2968 kb/s
46.303 dB @ 3990 kb/s
47.102 dB @ 5036 kb/s
49.347 dB @ 10068 kb/s
These might not be perfectly accurate (who knows how they performed the RGB->YV12 conversion), but they are much higher than the x264 numbers in the whitepaper and beat the ORBX numbers too (which could be fake as well).

Last edited by vivan; 12th August 2015 at 14:10.
Tags
codec, h264, orbx.js
