Doom9's Forum > Video Encoding > MPEG-4 AVC / H.264

Old 9th January 2011, 22:26   #81  |  Link
shon3i
BluRay Maniac
 
shon3i's Avatar
 
Join Date: Dec 2005
Posts: 2,419
Quote:
Originally Posted by deadrats
i realize in this forum DS is unto God, that he can do no wrong, but you guys need to look at things objectively and see if the facts support the myth.
No, we just use our eyes, that's all.

Second, you're comparing two encoders like a Ferrari and a tractor: both can reach speeds of 200 km/h, downhill...
shon3i is offline   Reply With Quote
Old 9th January 2011, 22:51   #82  |  Link
weasel_
x264
 
weasel_'s Avatar
 
Join Date: Dec 2009
Location: Serbia
Posts: 50
Quote:
Originally Posted by deadrats View Post


re #4: the proof is in the pudding, try a test encode at blu-ray bit rates using x264 and mc's cuda powered encoder and then come talk to me.
There is a test of x264 vs. some CUDA encoder, I don't remember which.
Use the search...
You will see how much better x264 is at the same bitrate...
Even half-blind people will see it.

There is no hardware encoder that is better quality-wise than x264.

Quote:
Originally Posted by deadrats View Post
i realize in this forum DS is unto God, that he can do no wrong, but you guys need to look at things objectively and see if the facts support the myth.
And what facts did you give?

At a REALISTIC bitrate? hahahaah

What is the main point of an encoder?
Keeping quality at the smallest bitrate possible...
So compare x264 to the encoders you named at a low bitrate and you will see for yourself.
At a high bitrate every encoder in the world will look good. Shon3i gave a good example with the tractor and the Ferrari.

Last edited by weasel_; 9th January 2011 at 22:56.
weasel_ is offline   Reply With Quote
Old 9th January 2011, 23:12   #83  |  Link
deadrats
Banned
 
Join Date: Oct 2010
Posts: 119
Quote:
Originally Posted by weasel_ View Post
So compare x264 to that encoders you named at low bitrate and u will see for yourself.
At big bitrate every encoder in world will be good. Shon3i gave good example for tractor and ferrari
What is this bizarre fascination people have with bitrate-starving their encodes?

A DVD-9 can hold 8.5 GB of data, Blu-ray can hold up to 50 GB, and we have hard drives in the 3-terabyte range; why would you use ridiculously low bitrates for any encode?

It reminds me of that Seinfeld episode where Kramer is test-driving the Saab: the needle hits "E" and he and the salesman decide to see how far they can go before they run out of gas and the car stalls.

All I know is that I just priced out a new SB-based build: I can get a good motherboard for $125, a 2500K for $180 and 4 GB of DDR3 for about $100. As soon as I get my tax refund I'm switching to SB, and I'll use the transcoding engine that lets me encode 1080p at 100 fps; you guys can stick with the software-based encoder and keep telling yourselves that the quality is much better and it's just as fast as Quick Sync.

When you sober up, the rest of the world will be waiting for you...
deadrats is offline   Reply With Quote
Old 9th January 2011, 23:18   #84  |  Link
weasel_
x264
 
weasel_'s Avatar
 
Join Date: Dec 2009
Location: Serbia
Posts: 50
And what is the point of an encoder?
Encoding at the same bitrate as the source? :facepalm:


Quote:
Originally Posted by deadrats View Post
i'm switching to SB and i'll use the transcoding engine that allows me to encode 1080p at 100fps
Great, have fun....

Quote:
Originally Posted by deadrats View Post
and you guys can stick with the software based encoder and keep telling yourselves that the quality is much better and it's just as fast as quick sync.

Last edited by weasel_; 9th January 2011 at 23:21.
weasel_ is offline   Reply With Quote
Old 9th January 2011, 23:21   #85  |  Link
LoRd_MuldeR
Software Developer
 
LoRd_MuldeR's Avatar
 
Join Date: Jun 2005
Location: Last House on Slunk Street
Posts: 13,275
Quote:
Originally Posted by deadrats View Post
what is this bizarre fascination that people have with bit rate starving their encodes?

a dvd 9 can hold 8.5 gigs of data, blu-ray can hold up to 50 gigs and we have hard drives in the 3 terabyte range, why would you use ridiculously low bit rates for any encode?
Using bitrates that are too high is a common mistake made in encoder comparisons. As a matter of fact, you can always increase the bitrate until the differences between the competing encoders vanish. That's because at a certain bitrate even the worst encoder in the test will look "transparent", and then it is impossible for any of the other encoders to give better quality. Of course such a test is absolutely meaningless! For this reason you must pick a bitrate where the differences between the different encoders are still clearly visible - at least if your goal is to learn something from your test. IMO the typical BluRay bitrates are too high for an encoder comparison. Remember: BluRay was designed with MPEG-2 in mind, so the available bitrates are actually higher than what a good H.264 encoder needs to retain "transparent" quality...

(Needless to say, running the test at an absurdly low bitrate and then concluding that all encoders in the test look "horrible" would be just as meaningless)
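This saturation effect can be sketched with a deliberately toy model (the quality function and efficiency numbers below are invented for illustration, not measurements of any real encoder): once every encoder in the test clears the "transparent" threshold, the comparison stops telling the encoders apart.

```python
# Toy model: quality approaches 100 asymptotically with bitrate; a more
# efficient encoder climbs faster. NOT a real rate-distortion curve.

def toy_quality(bitrate_mbit, efficiency):
    """Hypothetical quality score in [0, 100)."""
    return 100 * (1 - 2 ** (-efficiency * bitrate_mbit))

TRANSPARENT = 99.0  # score above which viewers see no difference to the source

for rate in (5, 10, 20, 40):
    good = toy_quality(rate, efficiency=1.0)   # efficient encoder
    bad = toy_quality(rate, efficiency=0.35)   # inefficient encoder
    saturated = good >= TRANSPARENT and bad >= TRANSPARENT
    print(f"{rate:2d} Mbit/s: good={good:6.2f}  bad={bad:6.2f}  "
          f"test says nothing: {saturated}")
```

In this toy run, at 20 Mbit/s and above both encoders sit over the threshold and the test can no longer distinguish them, while at 10 Mbit/s the gap is obvious; that is exactly why the test bitrate has to be chosen below the saturation point.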
__________________
Go to https://standforukraine.com/ to find legitimate Ukrainian Charities 🇺🇦✊

Last edited by LoRd_MuldeR; 9th January 2011 at 23:47.
LoRd_MuldeR is offline   Reply With Quote
Old 9th January 2011, 23:50   #86  |  Link
deadrats
Banned
 
Join Date: Oct 2010
Posts: 119
Quote:
Originally Posted by LoRd_MuldeR View Post
IMO the typical BluRay bitrates are too high for an encoder comparison. Remember: BluRay was designed with MPEG-2 in mind, so the available bitrates are actually higher than what a good H.264 encoder needs to retain "transparent" quality...
That is not true; Blu-ray was designed from the get-go with all three major compression schemes in mind, AVC, MPEG-2 and VC-1, as well as a boatload of audio compression schemes:

http://www.blu-ray.com/faq/

I do think you touched on an important thought process: it does seem that x264 proponents are of the mindset that Blu-ray bitrates are too high. They forget that uncompressed 8-bit 1080p29.97 video uses 119 megabytes per second, so even a bitrate of 30 megabits per second is on the low side. x264 users seem convinced that they can drop that all the way to 10 megabits per second (and lower) for 1080p29.97 video and somehow still maintain the same quality.
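The raw-rate arithmetic here is easy to check (a sketch assuming 8-bit 4:2:0 subsampling, i.e. 1.5 bytes per pixel, which is what consumer H.264/MPEG-2 streams actually carry; a figure near 119 MB/s corresponds roughly to 4:2:2 at 2 bytes per pixel):

```python
# Uncompressed data rate of 1080p29.97 at 8-bit 4:2:0, and the compression
# ratios implied by the bitrates discussed in this thread.

width, height = 1920, 1080
fps = 30000 / 1001                          # 29.97 fps exactly
bytes_per_frame_420 = width * height * 1.5  # luma + two quarter-res chroma planes

raw_mbytes_per_sec = bytes_per_frame_420 * fps / 1e6
print(f"raw 4:2:0: {raw_mbytes_per_sec:.1f} MB/s "
      f"= {raw_mbytes_per_sec * 8:.0f} Mbit/s")

for encoded_mbit in (10, 15, 30, 40):
    ratio = raw_mbytes_per_sec * 8 / encoded_mbit
    print(f"{encoded_mbit:2d} Mbit/s encode -> ~{ratio:.0f}:1 compression")
```

So even a "generous" 40 Mbit/s Blu-ray encode is already roughly a 19:1 reduction from the 4:2:0 raw rate; the real argument is whether something near 75:1 (10 Mbit/s) can still look transparent.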

Just because they can't see the differences using the consumer-grade 20-inch monitors coupled to their gaming graphics cards does not mean the quality is the same; it means they need to look at the math, get their cataracts operated on and lay off the Wild Turkey.

Personally I think the "bitrate starve" mentality has its origins in piracy and file sharing: people eager to conserve bandwidth try to transcode to as low a bitrate as possible so that downloads and uploads complete faster.

It's silly; if you could achieve the same quality level, these companies would never have invested millions in HD DVD and Blu-ray technology, they would have stuck with DVD-9 and extended the spec to include H.264/VC-1.
deadrats is offline   Reply With Quote
Old 9th January 2011, 23:53   #87  |  Link
mp3dom
Registered User
 
Join Date: Jul 2003
Location: Italy
Posts: 1,136
Quote:
Originally Posted by LoRd_MuldeR View Post
Using bitrates that are too high is a common mistake done in encoder comparisons. You can always increase the bitrate until the differences between the competing encoders vanish. That's because at a certain bitrate even the worst encoder in the test will look "transparent" and then it is impossible for any of the other encoders to give even better quality. Of course such test is absolutely meaningless! For this reason you must pick a bitrate where the differences between the different encoders are still clearly visible - at least if you want to learn something from your test.
You're right... it's surely useless to compare against a 1080p source in the 100 Mbps range.

Quote:
IMO the typical BluRay bitrates are too high for an encoder comparison. Remember: BluRay was designed with MPEG-2 in mind, so the available bitrates are actually higher than what a good H.264 encoder needs to retain "transparent" quality...
I disagree. Personally I think 40 Mbps is enough only most of the time. In particular, for sensitive material (gradients, fine details, subtle grain, fades, all of the above mixed with complex scenes, etc.), if you view the source and then the compressed file (even during playback) you can easily spot the differences if you have a 'trained' eye (and sometimes even an untrained eye is enough). The "pro" here is that the final customer doesn't have the master as a reference to tell whether a problem is on the master or came up after compression.
Probably in this "field" all the encoders have room for improvement (I think, and above all I hope).
Just as a note: this 40-day-old "quick" comparison was made with an average bitrate of 35 Mbps (with the max set to the full 40 Mbps). By your reasoning, this result would qualify x264 as 'bad' (which, first of all, I think is not true), but it anyway clearly demonstrates that even a 35 Mbps average can be not enough (for both encoders).

Quote:
Originally Posted by deadrats View Post
personally i think the "bit rate starve" mentality has it's origins in piracy and file sharing, people eager to conserve bandwidth so they try to transcode to as low a bit rate as possible so that downloads and uploads complete faster.
I completely agree with you, even if it's true that official BDs often have an average bitrate of 15-20 Mbps, which (to me) is again low.

Last edited by mp3dom; 10th January 2011 at 00:02.
mp3dom is offline   Reply With Quote
Old 10th January 2011, 00:17   #88  |  Link
deadrats
Banned
 
Join Date: Oct 2010
Posts: 119
Quote:
Originally Posted by weasel_ View Post
And what is point of encoder ?
Encoding at same bitrate as source ?
You are absolutely correct. I personally believe there are only a few reasons to transcode something: the source is too dimly lit, the color saturation is off, the source is noisy and you are trying to fix said video, or the aspect ratio is wrong and you can't fix it via manipulation of flags.

In these cases the output should equal the input, i.e. size and bitrate in = size and bitrate out.

In all other cases you are better off simply buying another HDD (at $90 for 1.5 TB it's quite affordable).
deadrats is offline   Reply With Quote
Old 10th January 2011, 00:20   #89  |  Link
shon3i
BluRay Maniac
 
shon3i's Avatar
 
Join Date: Dec 2005
Posts: 2,419
The whole point here is the efficiency of a given standard. I can't understand why we would have all these standards and improvements if there were no real difference. H.264 is made to be better than MPEG-2; otherwise, why not just stay lossless?

For example: one car burns 8 liters and another burns 15 liters, both driving at 160 km/h, and someone then says that is impossible, and that all cars need to burn 15 liters or more to go that fast.

Quote:
Originally Posted by deadrats
that is not true, blu-ray was designed from the get go with all three major compression schemes in mind, avc, mpeg-2 and vc-1 as well as a boat load of audio compression schemes:
What LoRd_MuldeR wants to say is that the huge bitrates were reserved in the first place for MPEG-2 encoders.

Last edited by shon3i; 10th January 2011 at 00:22.
shon3i is offline   Reply With Quote
Old 10th January 2011, 00:29   #90  |  Link
LoRd_MuldeR
Software Developer
 
LoRd_MuldeR's Avatar
 
Join Date: Jun 2005
Location: Last House on Slunk Street
Posts: 13,275
Quote:
Originally Posted by deadrats View Post
that is not true, blu-ray was designed from the get go with all three major compression schemes in mind, avc, mpeg-2 and vc-1 as well as a boat load of audio compression schemes:

http://www.blu-ray.com/faq/
I know that in the final spec three formats are allowed (H.264, VC-1 and MPEG-2). Still MPEG-2 is by far the least efficient of those three supported video formats. And in the design of BluRay disc the bitrate (space) requirement was necessarily defined by the least efficient format that was going to be supported, i.e. by MPEG-2.

Quote:
Originally Posted by deadrats View Post
i do think that you touched on an important thought process, it does seem that x264 proponents are of the mind set that blu-ray bit rates are too high, they forget that uncompressed 8bit 1080p29.97 video uses 119 mega bytes per second, so even a bit rate of 30 mega bits per second is on the low side
So what is the point? Even old MPEG-2 can give decent quality at 20-40 MBit/s for HD content.

If state-of-the-art formats/encoders, such as H.264/x264, have proven that transparent quality can be preserved at even lower bitrates, then THAT is the reference other formats/encoders have to be compared to.

(To give yet another car analogy: If you could choose between a Trabi and a Ferrari, would you pick the Trabi just because both of them are significantly faster than an oxcart? ^^)

Quote:
Originally Posted by deadrats View Post
x264 users seem convinced that they can drop that all the way to 10 mega bits per second (and lower) for 1080p29.97 video and somehow still maintain the same quality.
Being convinced of something that is a matter of fact seems logical to me

Quote:
Originally Posted by deadrats View Post
just because they can't see the differences using the consumer grade 20 inch monitors coupled to their gaming graphics cards does not mean that the quality is the same, just that they need to look at the math, get their cataracts operated on and lay off the wild turkey.

personally i think the "bit rate starve" mentality has it's origins in piracy and file sharing, people eager to conserve bandwidth so they try to transcode to as low a bit rate as possible so that downloads and uploads complete faster.

it's silly, if you could achieve the same quality level these companies would never have invested millions in hd-dvd and blu-ray technology, they would have stuck with dvd9 and extended the spec to include h264/vc-1.
It seems all your argumentation goes like this: If you have enough bitrate available that you can waste for hiding the weaknesses of a "bad" (i.e. inefficient) encoder, then using the "bad" encoder is bearable, even though with a "good" (i.e. more efficient) encoder you could have retained the same quality at a much lower bitrate or a better quality at the same bitrate.

There's nothing surprising or interesting about that...

(And indeed using DVD-9 with H.264 and a good encoder would have been perfectly sufficient for distributing HD movies, but of course the industry prefers selling new hardware for new disc formats!)

Last edited by LoRd_MuldeR; 10th January 2011 at 00:55.
LoRd_MuldeR is offline   Reply With Quote
Old 10th January 2011, 00:52   #91  |  Link
AnonCrow
Registered User
 
Join Date: Aug 2009
Location: 61.45° , 23.86°
Posts: 120
Quote:
Originally Posted by deadrats
what is this bizarre fascination that people have with bit rate starving their encodes?
Quite simply, because it can be done; or, better yet, because it shouldn't be possible. Like fitting a meaningful application in 256 bytes, or running a GUI, a web server and a graphical web browser on a C64.
In the grand scheme of things: from a company perspective, sure, it's cheaper to throw more CPU power and bytes at something than to spend a few years optimizing the code.

Quote:
if you could achieve the same quality level these companies would never have invested millions in hd-dvd and blu-ray technology, they would have stuck with dvd9 and extended the spec to include h264/vc-1
Then they wouldn't have been able to sell more hardware to ignorant consumers.

Quote:
i personally believe that there are only a few reasons to transcode something, like if the source is too dimly lit, the color saturation is off, the source is noisy and you are trying to fix said video or the aspect ratio is wrong and you can't fix it via manipulation of flags.
All of those can already be changed at runtime in any decent player. If the source is grainy and you want to encode a degrained version of it, surely the bitrate would drop a lot if you were encoding at the same quality?
What about downscaling the video for various different resolutions, or simply downscaling it from 1080i to 720p if the source really doesn't have that much optical resolution to begin with?
Also, you nicely ignored the fact that weasel was somewhat flabbergasted by your ideas. Though I'm sure you were a little too, when Windows 7 was released and you noticed that it had the same or in some parts even lower system requirements than Windows Vista - ignoring the fact that if, e.g., MS spent the better part of a decade optimizing it (or really rebuilding it from scratch), it'd run happily on a 486 w/ 32 MB RAM.
Continuing on the car MPG examples: any people/companies competing for maximum fuel efficiency must be crazy then, even if they manage an order of magnitude more efficient use of whatever fuel is used than a typical car (with a real-looking car).

Disclaimer: no, I'm not a Mac user, I'm an Amiga user

Last edited by AnonCrow; 10th January 2011 at 01:12. Reason: typos
AnonCrow is offline   Reply With Quote
Old 10th January 2011, 00:55   #92  |  Link
deadrats
Banned
 
Join Date: Oct 2010
Posts: 119
Quote:
Originally Posted by LoRd_MuldeR View Post
It seems all your argumentation goes like this: If you have enough bitrate available that you can waste for hiding the weaknesses of a "bad" (i.e. inefficient) encoder, then using the "bad" encoder is bearable, even though with a "good" (i.e. more efficient) encoder you could have retained the same quality at a much lower bitrate or a better quality at the same bitrate.
You almost got it; you just need to expand on it a bit: if you can eliminate any quality difference between encoders with enough bitrate, AND storage is not a consideration thanks to huge hard drives at very reasonable prices, AND a hardware-based encoder at 4 times the bitrate is 5-10 times as fast as the little darling of the open-source community, then why suffer with the much slower encoder?

x264 is the fastest software-based encoder when using the "ultrafast" preset, I'll grant you that. Be that as it may, using an X4 620, transcoding a 1080p Blu-ray rip to a 15 Mb/s H.264-with-AC3-audio MKV, I'm lucky if I see 15 fps, tops.

An SB CPU, using the Quick Sync engine, did a similar test on AnandTech at 100 fps, and that's without stressing any of the CPU cores.

According to that very same review, a 6-core/12-thread Core i7 980X completes the encoding pass of the x264 HD benchmark at 49 fps. Downloading the benchmark and examining the test file as well as the benchmark script shows that the source is a 720p MPEG-2 and the encode target is a 4 Mb/s 720p H.264, no audio, with the priority set to "real time".

What does one need to be smoking to conclude that using x264 to encode a 4 Mb/s 720p H.264 at 49 fps (if you're lucky), with your CPU maxed out and priority set to real time, is preferable to using the Quick Sync engine to encode a 15 Mb/s 1080p, with audio no less, at 100 fps?

Furthermore, what kind of reality distortion field is needed to believe that an x264 encode, done at 4 Mb/s and 720p, will somehow be of higher quality than an encode done at 15 Mb/s and 1080p?

Are all x264 users also Mac users?
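For scale, the fps figures being traded in this post translate into wall-clock time as follows (a back-of-the-envelope sketch; the 15 fps and 100 fps numbers are the ones claimed in this thread, not independent measurements):

```python
# Wall-clock encode time for a ~2-hour, ~24 fps movie at the encoding
# speeds claimed above.

movie_frames = 2 * 60 * 60 * 24   # 2 hours at 24 fps = 172,800 frames

for label, encode_fps in [("x264 on an X4 620 (claimed)", 15),
                          ("Quick Sync (claimed)", 100)]:
    hours = movie_frames / encode_fps / 3600
    print(f"{label}: {hours:.2f} h")
```

That is roughly 3.2 hours versus half an hour per movie; the disagreement in the thread is whether the speedup is bought with a real bitrate or quality penalty.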
deadrats is offline   Reply With Quote
Old 10th January 2011, 01:00   #93  |  Link
kolak
Registered User
 
Join Date: Nov 2004
Location: Poland
Posts: 2,869
Pointless discussion.

One group talks about encodes which are transparent to the source at a very high level, the other about something which looks good and is as small as possible (mainly for ripping and sharing over the net).

BD was created to give the best possible quality to the mass consumer (DVD started to not be good enough on massive new TVs), and its 40 Mbit is good enough to deliver it. It's not overdone, even for x264.

Saying that x264 can achieve the same at 10 Mbit as BD at 30 Mbit is not true at all.
If x264 could achieve the same transparency at 10 Mbit, then great: we could put a whole series on 1 BD. It would also mean that other encoders can be improved.

x264 can achieve much more than other encoders at 10 Mbit, but it's nowhere near the transparency achieved on BD discs.
The question is whether the average consumer needs quality as good as BD. He could probably live with much lower, but TVs are getting bigger and bigger, and heavily compressed footage will start to look soft, blurred and lacking in detail.

As deadrats said: don't turn DS into a GOD, because he would have done almost nothing without all the doom9 members; it's their big job of testing all the encodes, reporting any problems and giving suggestions. Many companies have great programmers, but it's the lack of testing time that stops them from making their products better. x264 has the biggest testing community ever, and for free!

Quality-wise x264 is as good as pro encoders for BD usage (with some stronger points and some weaker ones), but in terms of workflow/features/speed it's a lot worse, so no big studio uses it.

In terms of other usage x264 is great and has a clear quality advantage, but even so, most paid web content is not encoded with x264 but with Carbon Coder, Ateme and other hardware solutions, mainly because of the workflow.

x264 is an engine; it needs a GUI and other bits around it to be more popular.

Andrew

Last edited by kolak; 10th January 2011 at 01:28.
kolak is offline   Reply With Quote
Old 10th January 2011, 01:12   #94  |  Link
LoRd_MuldeR
Software Developer
 
LoRd_MuldeR's Avatar
 
Join Date: Jun 2005
Location: Last House on Slunk Street
Posts: 13,275
Quote:
Originally Posted by deadrats View Post
you almost got it, you just need to expand on it a bit: if you can eliminate any quality difference between encoders with enough bit rate AND storage is not a consideration thanks to huge hard drives at very reasonable prices AND a hardware based encoder at 4 times the bit rate is 5-10 times as fast as the little darling of the open source community, then why suffer using the much slower encoder?
Still the same argumentation of yours: If you don't need/want a "good" (efficient) encoder to begin with, because you are willing to waste enough bitrate so that even a "bad" (inefficient) encoder will deliver decent quality, then indeed you won't benefit from a "good" encoder and the "bad" encoder will be sufficient for your particular needs. Still this does NOT make the "bad" encoder any better or the "good" encoder any worse. It just constructs a scenario where there's nothing for the "good" encoder to gain, because nothing is expected. Most importantly, that argumentation does NOT add anything valuable to the discussion!

Quote:
Originally Posted by deadrats View Post
x264 is the fastest software based encoder when using the "ultra fast" preset, i'll grant you that. be that as it may, using a x4 620, transcoding a 1080p blu-ray rip to a 15 mb/s h264 with ac3 audio mkv i'm lucky if i see 15 fps, tops.

an SB cpu, using the quick sync engine, did a similar test on anandtech at 100 fps and that's without stressing any of the cpu cores.
With such fancy "speed comparisons" you must be extremely careful! In particular the following two things must be ensured:

(1) Both encodes must come out at the same average bitrate. If one encoder came out at a higher average bitrate than the other, it encoded at a lower compression efficiency and thus had an unfair advantage in the speed comparison.

(2) Both encodes must come out at the same visual quality. If one encoder produced a lower visual quality than the other, it encoded at a lower compression efficiency and thus had an unfair advantage in the speed comparison.

Unless these points are ensured, the FPS numbers are absolutely meaningless. And I'm very suspicious about that
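Conditions (1) and (2) can be phrased as a tiny pre-check on any speed comparison (a sketch; the field names, tolerance values and the quality metric below are invented placeholders, not part of any real benchmark tool):

```python
# Refuse to declare a speed winner unless both runs landed at (roughly)
# the same average bitrate and the same measured quality.

def fair_speed_winner(a, b, bitrate_tol=0.05, quality_db_tol=0.25):
    """a, b: dicts with 'name', 'fps', 'avg_bitrate_mbit' and 'quality_db'
    (some full-reference metric, e.g. a PSNR-like score in dB).
    Returns the faster encoder's name, or None if the comparison is unfair."""
    max_rate = max(a["avg_bitrate_mbit"], b["avg_bitrate_mbit"])
    same_rate = (abs(a["avg_bitrate_mbit"] - b["avg_bitrate_mbit"])
                 <= bitrate_tol * max_rate)
    same_quality = abs(a["quality_db"] - b["quality_db"]) <= quality_db_tol
    if not (same_rate and same_quality):
        return None   # apples vs. oranges: fps numbers prove nothing
    return max(a, b, key=lambda e: e["fps"])["name"]

hw = {"name": "hw", "fps": 100, "avg_bitrate_mbit": 15.0, "quality_db": 42.0}
sw = {"name": "x264", "fps": 15, "avg_bitrate_mbit": 10.0, "quality_db": 44.5}
print(fair_speed_winner(hw, sw))   # None: neither bitrate nor quality match
```

Only when both checks pass does a claim like "encoder A is N times faster" mean anything.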

Last edited by LoRd_MuldeR; 10th January 2011 at 01:18.
LoRd_MuldeR is offline   Reply With Quote
Old 10th January 2011, 01:14   #95  |  Link
deadrats
Banned
 
Join Date: Oct 2010
Posts: 119
Quote:
Originally Posted by AnonCrow View Post
What about downscaling the video for various different resolutions, or simply downscaling it from 1080i to 720p if the the source really doesn't have that much optical resolution to begin with ?
What?!? Seriously? If something is transferred from film at 1080i then I'm going to go out on a limb and say that it has more than sufficient "optical" resolution (as opposed to what, anal resolution? LOL); likewise, if it's a program recorded from HDTV it will still have enough of whatever you refer to as "optical" resolution.

Why would you down-rez it to 720p? I'm pretty sure I know what you're going to say; I just really want to read it for myself.

Quote:
Also, you nicely ignored the fact that weasel was somewhat flabbergasted with your ideas.
Le weasel was "flabbergasted" because he seems to have spent too much time worshiping at the x264 altar and stopped thinking for himself. Most people react with disbelief when you first pull away the veil that has been blinding their eyes.

Quote:
Though I'm sure you were a little too, when Windows 7 was released, and you noticed that it had the same or in same parts even lower system requitements than Windows Vista - ignoring the fact that if eg. MS spent the better part of a decade to optimize it (or really rebuild it from scratch), it'd run happily on a 486 /w 32 MB RAM.
1) I wasn't surprised. I expected Microsoft to use a superior malloc() library, I expected them to rework the thread scheduler, and I expected them to GPU-accelerate more of the GUI, specifically the 2D parts.

Microsoft made a number of minor mistakes with Vista, like changing the memory management model so that Vista assumes any free RAM is wasted RAM and caches into all available RAM as a consequence. Win 7 does the same thing, but the superior malloc() library is better at allocating and releasing RAM as needed.

Furthermore, Vista was the first to feature a fully GPU-accelerated GUI (for the 3D portions; XP featured partial acceleration); using Win 7 leads me to believe that M$ extended that acceleration to 2D surfaces.

As far as Win 7 running on a 486 goes: I don't care how much you optimized the code; even if all of it were done in hand-coded assembler, it still wouldn't run on a 486 CPU. The Windows API, at least the DX parts, has been SSE-optimized since DX6 (back in the Win2k days), and modern Windows OSes are all 32/64/128-bit hybrids. Maybe an embedded version of Win 7 could run on a 486, but that's it.
deadrats is offline   Reply With Quote
Old 10th January 2011, 01:14   #96  |  Link
kolak
Registered User
 
Join Date: Nov 2004
Location: Poland
Posts: 2,869
Quote:
Originally Posted by LoRd_MuldeR View Post
....
(And indeed using DVD-9 with H.264 and a good encoder would have been perfectly sufficient for distributing HD movies, but of course the industry prefers selling new hardware for new disc formats!)
I would say HD DVD with proper encoding would have been good enough, but for how long?
We already have 3D, and that would be the end of HD DVD...
or... your DVD-9 would already be too small to deliver 3D movies.


Andrew

Last edited by kolak; 10th January 2011 at 01:19.
kolak is offline   Reply With Quote
Old 10th January 2011, 01:33   #97  |  Link
shon3i
BluRay Maniac
 
shon3i's Avatar
 
Join Date: Dec 2005
Posts: 2,419
Quote:
Originally Posted by deadrats
furthermore, what kind of reality distortion field is needed to believe that an x264 encode, done at 4 mb/s and 720p will somehow be of higher quality than an encode done at 15 mb/s and 1080p?
You tell us? Just remember, higher values are not always better.

Anyway, there is encoder efficiency, which works with human perception to control how bits are used so they aren't spent on less noticeable things.

Quote:
Originally Posted by kolak
x264 can achieve much more than other encoders at 10Mbit, but it's no near close to transparency, which is achieved on BD discs.
Sorry, I disagree. Blu-rays are starting to look worse and worse, so exactly how can we say something is transparent when it looks like that...
shon3i is offline   Reply With Quote
Old 10th January 2011, 01:38   #98  |  Link
deadrats
Banned
 
Join Date: Oct 2010
Posts: 119
And just like that, software-based x264 encoding is dead:

http://tmpgenc.pegasys-inc.com/ja/do...mw5.html#trial

The English version will be available soon. In addition to licensing x264, Pegasys will feature the CUDA H.264 encoder and support for Quick Sync.

Stick a fork in x264; it's done.
deadrats is offline   Reply With Quote
Old 10th January 2011, 01:44   #99  |  Link
kolak
Registered User
 
Join Date: Nov 2004
Location: Poland
Posts: 2,869
Quote:
Originally Posted by shon3i View Post
Sorry I disagree, Blu-Rays start to be horrible and horrible, so exactly how we can say that is something transparent which look like....
Whatever it is, x264 at 10 Mbit is worse.

Something like the Island trailer at 28 Mbit was good enough.


Andrew
kolak is offline   Reply With Quote
Old 10th January 2011, 01:53   #100  |  Link
Jarod Middelman
x264.nl
 
Join Date: Oct 2010
Posts: 12
Quote:
i realize in this forum DS is unto God, that he can do no wrong, but you guys need to look at things objectively and see if the facts support the myth.
Indeed we use our eyes, but I agree with deadrats; I like blocks as well.

Jarod Middelman is offline   Reply With Quote