12th January 2011, 04:59 | #203 | Link | ||
Registered User
Join Date: Apr 2002
Location: Germany
Posts: 4,926
|
Quote:
Quote:
The comparison AnandTech did is flawed: he didn't actually test the quality of Nvidia's GPU encoder but that of ArcSoft's x264 GPU encoder, which is awfully buggy. ArcSoft also has another encoder that isn't based on x264 at all, though it's not yet used in any of their products (a very fast CUDA encoder, though not up to the quality of Nvidia's). ArcSoft is practically the only company that does not use Nvidia's own encoder (the nvcuvenc API) but their own x264 GPU mod inside MediaConverter 7. We also don't know yet how QuickSync was tuned in the profiles used for encoding (what quality level they request in the Media SDK API); CyberLink could have optimized MediaEspresso for speed rather than quality. There are indications (from a recent comparison) that QuickSync, as used in CyberLink's encoder, tends to destroy the grain layer more uniformly, keeping temporal grain differences low by "denoising" toward a blurrier result, compared to Nvidia's CUDA encoder.
__________________
all my compares are riddles so please try to decipher them yourselves :) It is about Time Join the Revolution NOW before it is to Late ! http://forum.doom9.org/showthread.php?t=168004 |
||
12th January 2011, 04:59 | #204 | Link |
Derek Prestegard IRL
Join Date: Nov 2003
Location: Los Angeles
Posts: 5,989
|
Parkjoy (maybe slowed down to 24p) is always a tough one.
Other than that, I don't know what you have access to. Any DVD or BluRay source is fine. Cruncher, you're unintelligible as always, which is too bad because I get the feeling you have a unique perspective in all this talk of GPU encoding... or at least have access to lots of these encoders!
__________________
These are all my personal statements, not those of my employer :) Last edited by Blue_MiSfit; 12th January 2011 at 05:04. |
12th January 2011, 05:20 | #205 | Link |
Registered User
Join Date: Dec 2008
Posts: 589
|
Deadrats... encode this at 1080p 6-8 Mbps and 720p 2.5 Mbps: http://www.google.com/search?q=SAmsu...o+Oceanic+Life It should be a 500 MB H.264 file at 40 Mbps.
Then look at the differences. But do post the encoding settings, otherwise any comparison is useless. |
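On the point about always posting encoder settings: x264 embeds its full settings string in an SEI header of the stream, so the settings can be recovered after the fact. A sketch, assuming the MediaInfo CLI is installed (the filename is a placeholder):

```shell
# Dump the x264 settings string embedded in the encode,
# so a comparison can be reproduced by others.
mediainfo --Inform="Video;%Encoded_Library_Settings%" encode.mkv
```

Hardware and GUI encoders generally don't write such a string, in which case the settings really do have to be posted by hand.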
12th January 2011, 05:56 | #206 | Link | |
Registered User
Join Date: Apr 2002
Location: Germany
Posts: 4,926
|
Support for Sandy Bridge's encoder will also be coming from a company i guess many wouldn't have suspected to see it from
Quote:
Here is a video demoing Elemental's Media SDK support: http://www.youtube.com/watch?v=BNPf-lMMYLY Unfortunately they always take the iTunes H.264 encoder as the reference point for the consumer base (and we all know how slow that is)
__________________
all my compares are riddles so please try to decipher them yourselves :) It is about Time Join the Revolution NOW before it is to Late ! http://forum.doom9.org/showthread.php?t=168004 Last edited by CruNcher; 12th January 2011 at 06:25. |
|
12th January 2011, 08:21 | #207 | Link |
Registered User
Join Date: Mar 2006
Posts: 272
|
It's not really a surprise as such, CruNcher; it's another tick in the 'PR Innovators' book.
It's not even a surprise to me that the AMD OpenCL SDK, with its so-called Open Decode library that uses the DRM/UVD, didn't get a mention either. As everyone who's tried to work with AMD knows, you can't even get them to provide the Linux Open Decode library to go with the SDK, or get simple, already-reported AMD driver bugs fixed in a timely manner. So there's no reliable HW decode across the 3 versions of the broken DRM/UVD ASIC, even for the one 3rd-party Linux dev that did sign the NDA, and he's all but given up on them now; who could blame him. AMD just don't want to play ball, never mind be in the game, it seems. Last edited by popper; 12th January 2011 at 08:45. |
12th January 2011, 14:10 | #208 | Link | ||
Registered User
Join Date: Apr 2002
Location: Germany
Posts: 4,926
|
Quote:
The overall response to opening the encoder (opening in the sense of freeing the API from NDA barriers; it seems there were internal discussions about an entirely open-source GPU encoder based on their research, and maybe that idea hasn't fully died yet, though you can't forget the ISVs doing business with their closed encoders now: such a project could hurt in some ways and help in others, depending on your customer base) was the same as back then with the decoder. Within weeks a lot of applications, even no-name ones (not always good ones, but do they really need to care about that?), adopted it, also thanks to Nvidia's excellent support and documentation, and after a short ISV-exclusive NDA period it wasn't only the big ISVs that had access. That's the power of their ecosystem (Donald and the Doom9 community helped a lot debugging VPx in the same way, which improved quality for every application and every user in the ecosystem). AMD, though, seemed to care more about exclusive stuff, keeping access down to a minimum of ISVs at first (for a very long time), and on Linux it seems even more heavily restricted.
I can mostly only speak about Nvidia, and strategy-wise they did everything right from the first day without hurting the business aspects too much (on the desktop). They utilized the power of the community as best as possible, breaking down the ISV NDA barriers really fast without hurting anyone else too much, and seeing the mostly positive rather than negative things that would arise from that "openness"; i can only congratulate the brains behind those decisions at Nvidia. In the end it made everyone happy and part of that software/hardware ecosystem (ISVs, consumers, non-ISV developers, geeks). AMD reacted a little slowly, which is absolutely sad to see given that they have the more powerful hardware at hand, and given that many developers made them aware early on, as soon as Nvidia came up with it, that such support would be good and that users demanded it. (On the other side, most of their resources at the time were going into Fusion and into integrating ATI as efficiently as possible into their future concept.) It's almost predictable that Fusion would have a much better adoption rate with such an "open" ecosystem available to mostly everyone, and we can see AMD is finally trying to speed things up again, at least on the Win32/64 side. It's sad for the Linux part, though, where Nvidia is still much stronger in overall developer support, as you explained. Btw, to every P67 board buyer: forget what the reviews are telling you, that Intel's QuickSync will never work if a discrete GPU is being used. That is not true; it is either a BIOS restriction or a software lock at the OS driver level, not a chipset hardware limitation.
Intel has been doing a lot of research on software-restricting their hardware for several years now. They started last year by making several things on their motherboards, and even the lower-priced Pentium chips, remotely unlockable via a service call (upgrade to a higher series via a call, no problem). They have done the same on the P67 series of boards; see this talk by an Intel guy: http://www.youtube.com/watch?v=gui_6sNc7Eo Intel is fully into the concept of DRMing their hardware for business purposes now, so who knows how it works on the chip side, and whether the same series is restricted by a clever combination of BIOS, key, motherboard, and operating system. It's the same way they are bringing in TPM now, to finally let Hollywood provide the full 1080p experience on demand with no delay after a cinema release, since they feel safe now with Microsoft's and Intel's TPM ecosystem (we will see how long that holds up, though). Please also keep in mind that i don't say it's possible to actually use both the discrete GPU and the integrated one at the same time without this Lucid Virtu. (Framebuffer copying is very efficiently doable under Vista/Win7, since everything is on the GPU already, though it most probably adds some ms of latency again, especially in windowed mode as shown in that video, so gamers or people thinking about multi-GPU display setups should be careful; still, it is cool, and indeed i guess a dream of everyone at the consumer level.) That obviously isn't physically possible through the hardware itself; but the fact that their software can see QuickSync invalidates the claim that the P67 turns everything off in hardware. Intel's software layer (BIOS/OS driver) is actually doing that, and Lucid got permission to disable it in order to do what you see in that video. The other logical assumption would be that the Intel guy is mixing things up (confusing H67 with P67, several times, repeatedly), and all the
reviewers like AnandTech are correct when they say the P67 doesn't support the Intel QuickSync GPU feature at all (that's the information they got from Intel, which they all believe to be true; the motto being "better not think, just believe everything my partner tells me and put it out in public"). You decide what the truth is. Now, to make it even more confusing, AnandTech published this article: http://www.anandtech.com/show/4113/l...n-sandy-bridge Quote:
__________________
all my compares are riddles so please try to decipher them yourselves :) It is about Time Join the Revolution NOW before it is to Late ! http://forum.doom9.org/showthread.php?t=168004 Last edited by CruNcher; 12th January 2011 at 17:45. |
||
13th January 2011, 00:05 | #209 | Link |
Banned
Join Date: Oct 2010
Posts: 119
|
media sdk vs cuda vs x264
alright everyone, as promised i conducted a test encode and i'm uploading the results for everyone to see. for source i went here:
http://www.demo-world.eu/trailers/hi...n-trailers.php and downloaded "The Gorgeous Ocean World" demo under the "HDclub" heading. i think this is a good test source: it's a nice 1920x1080p29.97 blu-ray source, it only uses a nominal bit rate of 8.34 mb/s, and it has lots of good motion and lots of vibrant colors, so it makes a good test for any encoder. it also uses 256 kb/s ac3 audio. for a target, i chose 1280x720p29.97 mkv with 128 kb/s audio. for bit rate i did the following: i calculated a per-pixel bit rate for the 1080p source, which works out to .134 bits/pixel, and derived a nominal bit rate for the target encode of 3.7 mb/s for the video portion. all encodes were done using cbr. i did test encodes with the default settings for x264 and cuda (remember, i don't speak japanese), and i threw in a special treat: tmpg express 5 allows you to use the media sdk encoder in software mode if you don't have a QS-enabled processor installed. i did a test encode with this as well; it was slow as hell, but it should be indicative of the quality of the encodes QS will offer. note: it appears that quick sync is little more than a hardware-accelerated version of the IPP encoder. the only commercial app that uses IPP, as far as i know, is gom encoder (in addition to its default x264 encoder), and in numerous tests i found no quality difference between the IPP encoder and x264; they pretty much traded punches, with x264 showing slightly better quality in some tests and IPP in others. the big benefit of IPP was that it ran like stink on a monkey with an intel cpu: using a quad core phenom 2, x264 was easily twice as fast as IPP; using a dual core penryn (E7400), IPP smoked x264. this really shouldn't be surprising considering IPP stands for "intel performance primitives". the source runs 4:05 long; x264 took 13:35, cuda took 8:17, and the intel media sdk software encoder brought up the rear with a super slow 44:39.
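The bits-per-pixel arithmetic above can be checked directly. A quick sketch using the figures stated in the post (1080p29.97 source at 8.34 Mb/s, 720p29.97 target):

```python
# Derive a target bitrate by holding bits-per-pixel constant,
# as described in the post above.

def bits_per_pixel(bitrate_bps, width, height, fps):
    """Bits spent per pixel per second of video."""
    return bitrate_bps / (width * height * fps)

src_bpp = bits_per_pixel(8.34e6, 1920, 1080, 29.97)
print(round(src_bpp, 3))           # ~0.134, matching the post

target_bps = src_bpp * 1280 * 720 * 29.97
print(round(target_bps / 1e6, 1))  # ~3.7 Mb/s for the 720p encode
```

Note the frame rate cancels out when source and target run at the same fps; the target bitrate is just the source bitrate scaled by the pixel-count ratio.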
you can download the finished encodes here: cuda http://www.mediafire.com/?0dy2c1n8mjy6rj1 media sdk http://www.mediafire.com/?6xp933gxj2xtt9e x264 http://www.mediafire.com/?rcc9yb3elv4ahag from a quality standpoint, i would rate the x264 encode and the cuda encode as overall equal: in certain parts x264 seemed to hold a minor edge in visual quality, in other parts the cuda encoder appeared to produce better results; overall i would call it a tie. the sdk encoder was an interesting case: the overall impression i got was that it produced slightly lower quality than either x264 or cuda, but with more vibrant colors. my guess is that if someone wanted to nitpick, one could find frames from each encode to support the claim that any of the encoders was better than the other two. the reality is that 3.7 mb/s for 720p is ridiculously low; people usually capture hdtv at 12-18 mb/s, and let's be honest with ourselves and point out that 8.3 mb/s is atrocious for 1080p. if we had the original source, straight from the camera (not a previously compressed source), we could get much higher quality encodes from all 3 encoders, even at 3.7 mb/s for 720p. quite frankly, most people do not have access to uncompressed or losslessly compressed film transfers; what most people have is blu-rays that they "backup" (i.e. steal, pirate, "fair use" copy), or hdtv captures, or footage shot on consumer grade camcorders. those people won't be transcoding down to 3.7 mb/s for 720p (unless they are complete imbeciles with a "bit rate starve" fetish); they'll be using saner bit rates. at 5 mb/s for 720p, any differences between the 3 encoders start to vanish, and at what i consider the minimum for 720p, 8 mb/s, there is no difference at all.
as a test, i tried the cuda encoder at 720p 8 mb/s, just to see what kind of slow down could be expected, and the test encode finished in an almost identical 8 and a half minutes, while x264 slowed down to 14 and a half minutes. this is definitely not a conclusive test, nor do i expect anyone to accept these results, i'm sure the criticism will fly from the x264 faithful, but it does seem like software based encoding is destined to go the way of the dodo. |
13th January 2011, 00:14 | #210 | Link |
Registered User
Join Date: Dec 2005
Posts: 1,460
|
Why would you use CBR encoding?
And why do you think 3.7 Mbps is ridiculously low for 720p? I do encodes at CRF 21 (veryslow, film, lvl 3.1, 3 b-frames) at that resolution and most of them come out below 3 Mbps. They look fine on normal viewing. Edit: By the way, you certainly didn't encode with x264 using its defaults. The defaults aren't main profile with both trellis and b-pyramid disabled. They would be slower though. Last edited by nurbs; 13th January 2011 at 00:33. |
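For reference, the settings nurbs describes correspond roughly to this x264 command line (a sketch with placeholder filenames, assuming a standalone x264 CLI build such as the ones from x264.nl):

```shell
# Constant-quality (CRF) encode: the user fixes quality,
# x264 chooses whatever bitrate the source needs.
x264 --preset veryslow --tune film --crf 21 \
     --level 3.1 --bframes 3 \
     --output encode.264 input.y4m
```

This is the usual alternative to CBR for file-based encodes: instead of forcing one bitrate onto every source, CRF lets complex sources take more bits and simple ones fewer.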
13th January 2011, 00:29 | #212 | Link |
interlace this!
Join Date: Jun 2003
Location: i'm in ur transfers, addin noise
Posts: 4,555
|
i'll chime in and ask why use bits/pixel? there is nothing that bits/pixel can meaningfully say about the actual source. and sources vary enormously.
my tests from HDCAM-SR sources (as near to uncompressed as anyone's likely to get without working straight from the original camera neg or the output of a grading workstation), just using x264 at crf 22 show a massive range of bitrates for different sources. some (usually shot on RED in daylight, or at least adequate light) features will come in at about 3.5-4mbps for a 1080p24, and some (usually 16mm or a very fast/grainy 35mm) will come in at 20mbps+ at the same res. bits/pixel will not help there. even some HDCAM shot in studio conditions will have huge bitrate requirements due to oversharpening and the everything-in-focus effect of 1/3 inch sensors. i'll also chime in and say that with these sources, casual viewing reveals little to no difference between x264 encodes at crf22 (which is on the "lower-quality" end) and sonic cinevision at blu-ray bitrates. critical viewing will reveal some grain oddities in x264, but bear in mind we're talking about a 3:1 size difference over the cinevision encodes.
__________________
sucking the life out of your videos since 2004 |
13th January 2011, 00:41 | #213 | Link | |
Banned
Join Date: Oct 2010
Posts: 119
|
Quote:
2) 3.7 mb/s, as i pointed out, works out to just .134 bits per pixel. considering that uncompressed, a pixel is an 8 bit "entity", using about 1/78th as many bits to represent the same data is stupid beyond belief. when i said i used the default settings, i should have added: the defaults tmpg express 5 chooses for the encoders, with 2 exceptions: it chooses "high" profile and 4.0 level, and i changed that to "main" profile and 4.1 level. i don't really know why; it just seemed like a good idea at the time. as soon as i get an english version of the app i will be able to play around with the settings and really see what each encoder can do. |
|
13th January 2011, 00:49 | #214 | Link | |
Banned
Join Date: Oct 2010
Posts: 119
|
Quote:
it's bad enough that you are losing about 1.1 million pixels per frame when you go from 1080p to 720p; why compound the quality loss by simultaneously reducing the per-pixel bit rate? the people that take a commercially encoded blu-ray, 1080p at 25-35 mb/s, transcode it to 720p at < 4 mb/s, and actually believe they know what they are doing or that their encodes are high quality productions, are retards of the highest caliber; they should be barred from using a computer until they squeeze their heads out of their cans. |
|
13th January 2011, 00:54 | #215 | Link | |
Registered User
Join Date: Sep 2007
Posts: 5,377
|
Quote:
You're also using a low quality source (8 Mb/s for a 1080p blu-ray? Come on, WTF?). I suspect using a higher quality source would show bigger differences, and it looks like you're deliberately using poor settings (or maybe it's just the japanese GUI)? Does anyone else see the hypocrisy here? deadrats is calling dark shikari out for not being objective, and calling software based encoding dead, despite basing his "conclusions" on zero objective evidence. Come on guys, let's see some properly done objective testing. Last edited by poisondeathray; 13th January 2011 at 00:57. |
|
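On poisondeathray's call for objective testing: the standard starting point is a metric like PSNR computed between the source and decoded-encode frames. A minimal pure-Python sketch of the formula (the two "frames" here are tiny made-up 8-bit luma sample lists, just to show the calculation; a real test would run it over every pixel of every decoded frame):

```python
import math

def psnr(reference, distorted, peak=255):
    """Peak signal-to-noise ratio between two equal-length
    sequences of 8-bit sample values, in dB."""
    mse = sum((r - d) ** 2 for r, d in zip(reference, distorted)) / len(reference)
    if mse == 0:
        return float("inf")  # identical samples: no distortion
    return 10 * math.log10(peak ** 2 / mse)

ref = [16, 32, 64, 128, 200, 235]
enc = [17, 30, 66, 125, 201, 233]
print(round(psnr(ref, enc), 2))  # low-40s dB for these small errors
```

PSNR alone won't settle a grain-retention argument (it rewards blur), so SSIM or plain screenshots are usually posted alongside it, but it at least removes the "it looked fine to me" factor.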
13th January 2011, 01:00 | #216 | Link |
Registered User
Join Date: Apr 2002
Location: Germany
Posts: 4,926
|
Ehh, i didn't mean it that way; it was sarcasm based on the conflicting stories: the Intel guy says there's a P67 chipset inside, Anandtech says an H67 chipset; both at the same time would be the Z68 chipset
__________________
all my compares are riddles so please try to decipher them yourselves :) It is about Time Join the Revolution NOW before it is to Late ! http://forum.doom9.org/showthread.php?t=168004 Last edited by CruNcher; 13th January 2011 at 01:04. |
13th January 2011, 01:19 | #217 | Link | |
interlace this!
Join Date: Jun 2003
Location: i'm in ur transfers, addin noise
Posts: 4,555
|
Quote:
as far as scaling quality per pixel dimensions, a simple linear relation may help, but will not tell the whole story (i.e., you are using fixed block sizes, which means you'll be proportionally increasing the high-frequency content as well, making compression and lossless coding of the compressed data less efficient in a nonlinear manner). my approach (which i'll admit is slightly naive, but is more robust than bit/pel) is to simply find a crf i like the look of, and encode everything at that. x264 will decide the bitrate, which scales somewhat linearly with pixel dimension, but not quite (smaller frame sizes need a fair bit more per pixel). i think that TMPGenc has a constant quality setting, though being in japanese could be an issue of course (there wasn't an english translation yet? wow). one very obvious alternative though: use the freaking vanilla x264 that everyone else uses! why complicate things with TMPGenc at this point? (a) you don't know how to use it by your own admission, and (b) nobody else has it yet, so verification is difficult. use a build from here: http://x264.nl/ also, an adult BD as a test? those are almost exclusively shot on HDV, with fast motion, and (according to various sources) they often use lenses specifically designed to soften away the ugly truth of what sex looks like close-up. with loads of blocks, a double-compressed source (starting in-camera as mpeg-2), and soft material, it's no wonder CUDA and x264 look the same.
__________________
sucking the life out of your videos since 2004 |
|
13th January 2011, 01:23 | #218 | Link | |
Registered User
Join Date: Nov 2004
Location: Poland
Posts: 2,843
|
Quote:
Are both encodes BD compliant? Andrew |
|
13th January 2011, 01:26 | #219 | Link | |
Registered User
Join Date: Nov 2004
Location: Poland
Posts: 2,843
|
Quote:
Andrew |
|
13th January 2011, 01:26 | #220 | Link | |||
Banned
Join Date: Oct 2010
Posts: 119
|
Quote:
if you have a higher quality source that can easily be shared with everyone here, then post a link. as for deliberately using poor settings, as i said i don't speak japanese, i can't read japanese, i just was able to figure out how to use the default settings because i recognized the basic layout from previous versions of the software. Quote:
if you really want to see hypocrisy, read through his rants and then read his suggested methodologies for conducting an encoder test and you will see him recommend end users do the exact same thing that he rips other testers for doing. Quote:
i uploaded the test encodes and linked to the source so that you guys can see for yourselves, how much more objective could i be? feel free to conduct your own, better, test and feel free to post the results. |
|||
Tags |
media engine, x.264 |
|
|