Old 18th November 2008, 02:48   #21  |  Link
avivahl
Registered User
 
Join Date: Dec 2007
Posts: 215
Small note... his picture came from http://en.wikipedia.org/wiki/H.264/MPEG-4_AVC#Levels
The 150 Mbit/s is in the "Max video bit rate (VCL) for High 10 Profile" column,
and the 200 Mbit/s is in the "Max video bit rate (VCL) for High 4:2:2 and High 4:4:4 Predictive Profiles" column.

I personally have no idea what they both mean.
Old 18th November 2008, 02:50   #22  |  Link
Guest
Guest
 
Join Date: Jan 2002
Posts: 21,901
Quote:
Originally Posted by Neillithan View Post
Whether or not it's a practical reality is not my concern.
Well, Neil, you're not a CEO of a hardware box manufacturer trying to make products that can be sold at a profit.

Quote:
My PC plays all videos regardless of complexities, resolutions and bitrates.
Such a machine costs hundreds to thousands of dollars. It is simply not a viable idea to think you can sell standalone devices at such a price.

Quote:
Dark Shikari and yourself would have me believe I shouldn't question the nature of things unless I have a degree in rocket science.
It's not rocket science. There's no way affordable standalones and set-top boxes could be made to support L5.1. You can ask anybody knowledgeable in the field. I work for a semiconductor maker that sells into this industry. I know what is viable from a HW perspective. You are welcome to refute this with some facts.

Quote:
Screw that.
That will do wonders for your reputation.
Old 18th November 2008, 02:52   #23  |  Link
Dark Shikari
x264 developer
 
 
Join Date: Sep 2005
Posts: 8,666
Quote:
Originally Posted by avivahl View Post
I personally have no idea what they both mean.
They're all "professional" formats not intended for ordinary hardware (or even software) players. High is what most STBs and software decoders support, for which the L4.1 max bitrate is 50 megabits.
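For reference, those per-profile columns are not independent numbers: the spec derives each one by multiplying the level's base MaxBR by a fixed per-profile factor (cpbBrVclFactor). Here is a minimal Python sketch of that arithmetic, using the values from H.264 Table A-1 and assuming the 150/200 Mbit/s figures avivahl quoted were read from the Level 4.1 row:

Code:
# Base MaxBR per level, in units of 1000 bits/s (H.264 Annex A, Table A-1)
BASE_MAX_BR = {"4.0": 20_000, "4.1": 50_000, "5.1": 240_000}

# cpbBrVclFactor: per-profile multiplier applied to the base rate
VCL_FACTOR = {
    "Baseline/Main/Extended": 1000,
    "High": 1250,
    "High 10": 3000,
    "High 4:2:2 / High 4:4:4 Predictive": 4000,
}

for profile, factor in VCL_FACTOR.items():
    print(f"L4.1 {profile}: {BASE_MAX_BR['4.1'] * factor / 1e6:g} Mbit/s")

# Output:
# L4.1 Baseline/Main/Extended: 50 Mbit/s
# L4.1 High: 62.5 Mbit/s
# L4.1 High 10: 150 Mbit/s                             <- avivahl's first column
# L4.1 High 4:2:2 / High 4:4:4 Predictive: 200 Mbit/s  <- the second column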
Old 18th November 2008, 02:55   #24  |  Link
Milvus
Registered User
 
 
Join Date: Nov 2006
Location: Paris, France
Posts: 53
Quote:
Originally Posted by avivahl View Post

I personally have no idea what they both mean.
That means it's a kind of video you will probably never need to watch on a standalone, and that Joe Six-Pack will never have in hand.

Standalones don't support L5.1 not only because it would be very hard and expensive, but also because it's useless, even if you are a 1080p maniac.
Old 18th November 2008, 03:12   #25  |  Link
Guest
Guest
 
Join Date: Jan 2002
Posts: 21,901
Some constraints of the High 10 profile also result in a significantly lower compression ratio, which can be important for fixed size media, such as DVDs, and for limited broadcast channels.

Old 18th November 2008, 04:27   #26  |  Link
DigitAl56K
Registered User
 
Join Date: Nov 2002
Location: San Diego, CA
Posts: 936
Neillithan,

Just as CoreAVC does, the DivX H.264 Decoder also decodes more than just High 4.0 content. Hardware is a different story though.

The comments Dark Shikari and neuron2 make are correct. Supporting higher levels does introduce a much greater burden on manufacturers in terms of device capabilities and cost. Widespread support of everything up to Level 5.1 is infeasible. Look back to my earlier post where I'm speaking to interoperability across many manufacturers, confidence amongst consumers and content creators, and low-cost solutions in many device categories. Think to yourself, "How do we get there from here?" You said it yourself:

Quote:
When I went looking for a media server capable of playing H.264 .mkv files, I found Popcorn Hour. When I took a closer look, I realized it only played Level 4.1 (I believe) .mkv files. That's the problem with the world of HD right now. There are so many Blu-ray players and media extenders boasting playback of H.264, but the truth is, they don't play everything.
If you're not very technically informed, devices either play H.264 or they don't. But the world of H.264 is diverse, and earlier I used some of Apple's devices as just one example of a lineup from a single manufacturer where, if it were not for Apple's somewhat closed content ecosystem, consumers might be mystified as to why files from one device don't play on another.

DivX wants to build a consistent platform. We want to bring all manner of capable devices into an ecosystem where creators and viewers don't have to target specific devices but only a known profile that we can assure will work reliably everywhere. And that's bigger than just H.264. It's the container. It's the audio format. It's the subtitles and the metadata and all other aspects of the experience.

I don't know about you, but I don't want to re-encode my media for every single device I ever buy. I don't want my friends to have to either. I know some of my friends wouldn't know where to start in fact. I don't want my PC trying to transcode all my videos on the fly for my connected device every time I watch them because a year ago I decided I could get an extra 0.5% compression by using an extra b-frame. I want my media to look fantastic, sound amazing, and play the instant I want to watch it with no fuss.

On the other hand, nobody is taking away your freedom to encode your files the way you like. If these values are less important to you, then that too is okay. But wouldn't it be nice, if you ever shared your files with others, for them to enjoy the great experience that a common platform offers?

I really have not found anything that doesn't look spectacular in 1080p using Level 4.0.
Old 18th November 2008, 08:26   #27  |  Link
Neillithan
Banned
 
Join Date: Feb 2007
Posts: 124
Quote:
P.S. I'm generally rather patient, but you managed to be annoying, rude, and insulting enough to convince me, one of the few people on this forum who has both the knowledge and cares enough to answer all these questions, to stop helping you in less than a dozen posts. Might I point out that this isn't an honor. But remember, the DS is a forgiving DS. In the end, even fools can be enlightened if they only acknowledge their foolishness and learn from their mistakes.
Seriously, you ignored all of my initial warnings and you have the nerve to say I'm annoying, rude, and insulting? I was just trying to start a discussion, not a chain of objections.

You misinterpreted my responses, and then misinterpreted them further. I called you out for being partial in your responses because you look only at the literal meaning of my words, and then you tell me I should clarify. I did clarify, and you became upset.

In the midst of my responses, my inability to use video terminology correctly apparently annoyed you enough to start truly insulting me. You called me a fool.

To top it all off, I get told to stop insulting people.

Let's compare insults, shall we?

DS: Annoying, rude, fool and foolish versus
ME: Arrogant wisdom-touting bully

Both are mean-spirited, yet I'm the one told to knock it off. I guess you wouldn't dare tell a renowned forum member to stop insulting people, so long as the little guy is the easier target.

I've visited this forum many times for discussion and help and all I get in return are threads that end like this.


Quote:
Screw that.
Quote:
That will do wonders for your reputation.
Even you fall victim to looking only at the literal meaning of words. Need I clarify that one? It was a neutral statement, and you assume it's contemptuous of this forum? Why? There's no reason to interpret it that way.

I loathe this place.

Old 18th November 2008, 10:20   #28  |  Link
Leak
ffdshow/AviSynth wrangler
 
 
Join Date: Feb 2003
Location: Austria
Posts: 2,441
Quote:
Originally Posted by neuron2 View Post
It's not rocket science. There's no way affordable standalones and set-top boxes could be made to support L5.1. You can ask anybody knowledgeable in the field. I work for a semiconductor maker that sells into this industry. I know what is viable from a HW perspective. You are welcome to refute this with some facts.
I'm going out on a limb here, but shouldn't it be possible to use one of the lowest-end current-gen ATI (or nVidia) GPUs to manufacture something like that?

Radeon HD 4350 cards (which come with ATI's UVD2 engine) start at 35 EUR around here, and that's the price for the GPU, video RAM and the board...
Old 18th November 2008, 10:57   #29  |  Link
Neillithan
Banned
 
Join Date: Feb 2007
Posts: 124
Quote:
Originally Posted by Leak View Post
I'm going out on a limb here, but shouldn't it be possible to use one of the lowest-end current-gen ATI (or nVidia) GPUs to manufacture something like that?

Radeon HD 4350 cards (which come with ATI's UVD2 engine) start at 35 EUR around here, and that's the price for the GPU, video RAM and the board...
Well, they're right in the sense that Level 5.1 has the potential to bring any current system to its knees. To prove his point, Dark Shikari posted a 10-second video at some absurd resolution like 3840x2160@50fps with a bitrate of 96 megabits per second. Apparently, L5.1 is supposed to be outrageously high quality, requiring a supercomputer from the future, but most people who use L5.1 don't encode their videos anywhere near that complexity. L5.1 should be playable, and for the videos that go beyond sane compression and HD settings, it doesn't matter.

My PC has a dual-core 3 GHz AMD CPU, 4 GB of RAM, and an 8800 GTX OC, and it has no trouble playing 1920x1080@60fps. I'd assume the H.264 playback of a PS3 or an Xbox 360 to be on par with my PC, yet they only support L4.1 videos. Why? There is no reason for the limitation.
Old 18th November 2008, 11:42   #30  |  Link
DrNein
Registered User
 
Join Date: Sep 2002
Posts: 145
What advantage would L5.1 capability offer over L4.1 for consumer videos? It seems it is not necessarily better for the purpose, and we should not conclude we are missing something just because a higher spec is available, especially if it unnecessarily raises costs.
Old 18th November 2008, 11:43   #31  |  Link
nurbs
Registered User
 
Join Date: Dec 2005
Posts: 1,460
Why do you think supporting Level 4.1 is a huge limitation? The majority of people will never get their hands on a source exceeding Level 4.1 anyway, since Blu-ray (and probably broadcast HDTV too) falls under that limit. With those sources you wouldn't re-encode with settings that require a higher level anyway, since it would only reduce quality, and then you might as well keep them untouched.

Also, just because a file is labeled Level 4.1 or Level 5.1 doesn't mean the content actually requires that level. Most of the 720p files that hit filesharing networks probably don't even exceed Level 3.1, if the encoder used a sane number of reference frames (<=5). The quality that can be gained by exceeding that limit is minimal (except maybe on cartoons).

I do agree that the decoder should not just look at the level and refuse to play a file, but that's a minor problem, since the level flag can easily be changed to represent the actual content of the file.
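That "sane number of reference frames" has a concrete basis: the number of reference frames a level permits at a given resolution is its decoded-picture-buffer budget (MaxDpbMbs) divided by the frame size in macroblocks, capped at 16. A back-of-envelope sketch using the Table A-1 values (the helper name max_ref_frames is just illustrative):

Code:
import math

# MaxDpbMbs per level (H.264 Table A-1): DPB budget in macroblocks
MAX_DPB_MBS = {"3.1": 18_000, "4.1": 32_768, "5.1": 184_320}

def max_ref_frames(width, height, level):
    """Largest number of reference frames legal at this resolution and level."""
    mbs = math.ceil(width / 16) * math.ceil(height / 16)  # frame size in macroblocks
    return min(MAX_DPB_MBS[level] // mbs, 16)             # the spec caps refs at 16

print(max_ref_frames(1280, 720, "3.1"))   # 5  -> nurbs's <=5 refs for 720p
print(max_ref_frames(1920, 1080, "4.1"))  # 4  -> why high-ref 1080p encodes get flagged L5.1
print(max_ref_frames(1920, 1080, "5.1"))  # 16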
Old 18th November 2008, 14:42   #32  |  Link
Guest
Guest
 
Join Date: Jan 2002
Posts: 21,901
Quote:
Originally Posted by Leak View Post
I'm going out on a limb here, but shouldn't it be possible to use one of the lowest-end current-gen ATI (or nVidia) GPUs to manufacture something like that?

Radeon HD 4350 cards (which come with ATI's UVD2 engine) start at 35 EUR around here, and that's the price for the GPU, video RAM and the board...
Now add the case, power supply, and other required things like tuners, smartcard interfaces, multiple video connectors, HDMI, USB, and everything else set-top boxes have nowadays. And be aware that boxes are provided free or at very low cost to consumers by services such as DirecTV. The pressure on manufacturers to keep costs low is incredible. You wouldn't believe the competition between manufacturers to sell their boxes to service providers.

As several posters have pointed out, L5.1 support is just not necessary to play the kinds of videos that these devices are called on to play. It's a wholly unjustifiable expense in a penny-pinching environment.

The bottom line is that if putting six times the memory on a box, along with the more complex chipsets required to support L5.1, were a viable competitive strategy, it would be done, tinfoil hats notwithstanding.

Old 18th November 2008, 15:14   #33  |  Link
STaRGaZeR
4:2:0 hater
 
Join Date: Apr 2008
Posts: 1,302
Quote:
Originally Posted by Neillithan View Post
To prove his point, Dark Shikari posted a 10-second video at some absurd resolution like 3840x2160@50fps with a bitrate of 96 megabits per second. Apparently, L5.1 is supposed to be outrageously high quality, requiring a supercomputer from the future, but most people who use L5.1 don't encode their videos anywhere near that complexity. L5.1 should be playable, and for the videos that go beyond sane compression and HD settings, it doesn't matter.
I think you still don't understand how this works. If a player is labeled as L4.1 compliant, it has to be able to play all compliant L4.1 videos. Therefore, if you want a player labeled as L5.1 compliant, it has to be able to play all compliant L5.1 videos. You fail to understand that the video DS posted is L5.1 compliant, so if you want validated L5.1 support in your player of choice it has to be able to play it. That is why it is not viable. The problem comes when people use an insane (and useless) number of reference frames for their encodes. Everything else is compliant with L3.x or L4.x, but because of the ref frames you'd need L5.1 to fully support it.
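On the point (nurbs's, above) that the flag can easily be changed: in a raw Annex B stream, level_idc is a single byte sitting right after profile_idc and the constraint flags in the SPS, so down-flagging a file whose content already fits a lower level is a one-byte patch. A hedged sketch for bare .h264 streams only (the function name is invented for illustration; files in MP4/MKV also carry the level in the container's avcC codec-config record, which this does not touch):

Code:
def relabel_level(data: bytearray, new_level_idc: int = 41) -> int:
    """Rewrite level_idc (10 * level, so 41 = L4.1) in every SPS NAL
    of a raw Annex B H.264 stream. Returns how many SPS were patched.
    Sketch only: does not handle the avcC record inside MP4/MKV."""
    patched, i = 0, 0
    while (i := data.find(b"\x00\x00\x01", i)) != -1 and i + 6 < len(data):
        if data[i + 3] & 0x1F == 7:    # NAL unit type 7 = SPS
            # SPS payload starts: profile_idc, constraint flags, level_idc
            data[i + 6] = new_level_idc
            patched += 1
        i += 3
    return patched

This changes only the label, not the stream itself, so it helps exactly in the case nurbs describes: content that was compliant with L3.x or L4.x all along but flagged higher.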

However, with Blu-ray being L4.1, I don't know why DivX is restricted to L4.0. Also, it would have been nice if Blu-ray had supported L4.2, allowing 1080p60 and 1080p50.
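The 1080p60 point checks out against the macroblock-rate limits (MaxMBPS): 1080p is coded as 68 macroblock rows (1088 lines), i.e. 8160 macroblocks per frame, and only L4.2 and above can push that many macroblocks 50 or 60 times a second. A quick sketch, again with Table A-1 values:

Code:
# MaxMBPS per level (H.264 Table A-1): macroblocks decoded per second
MAX_MBPS = {"4.0": 245_760, "4.1": 245_760, "4.2": 522_240, "5.1": 983_040}

mbs_per_frame = (1920 // 16) * (1088 // 16)   # 120 * 68 = 8160 (1080 rounds up to 1088)

for level, cap in MAX_MBPS.items():
    print(f"L{level}: max {cap // mbs_per_frame} fps at 1080p")

# L4.0: max 30 fps    L4.1: max 30 fps   <- Blu-ray stops here
# L4.2: max 64 fps                       <- enough for 1080p50/p60
# L5.1: max 120 fps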
Old 18th November 2008, 17:51   #34  |  Link
DigitAl56K
Registered User
 
Join Date: Nov 2002
Location: San Diego, CA
Posts: 936
Quote:
Originally Posted by Leak View Post
I'm going out on a limb here - but shouldn't it be possible to use one of the lowest end current-gen ATI (or nVidia) GPUs to manufacture something like that?
I can't think of much outside of the PS3 and Xbox 360 that might implement such a thing. HW players already use special ICs with dedicated decoders. They're going to be about as cost-efficient as you will get.
Old 18th November 2008, 20:30   #35  |  Link
Turtleggjp
Registered User
 
Join Date: Apr 2006
Posts: 225
I think the reason they didn't support L4.1 is so that their (certified) players focus on content created by DivX 7, rather than the videos that are already out there, including HD DVD and Blu-ray movies. Not only are there potential legal issues involved (no one wants to put out a hardware device capable of playing back decrypted Blu-rays, Popcorn Hour being the exception), but if they did, people would wonder why some movies (AVC) can easily be converted to play on DivX players while others (MPEG-2, VC-1) cannot. If they supported all three formats, I think that would go beyond what they intended to create, not to mention be even more expensive.

As for the L5.1 issue, there's simply no need for such support. True L5.1 video (like the sample provided by DS) is way too expensive to support today and would defeat DivX's goal of an affordable player for all. Now, what they could have done is display a warning when trying to play a file that is marked as L5.1, saying, "This file might be more than I can handle; do you want to try anyway?" That way, if it fails to play correctly, you were warned, but if it does play, then great. Granted, when you feed a hardware device more than it can handle the results may be unpredictable, so this might need to be an "expert option", used only by those who know what they might be getting into.
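That "try anyway" gate is easy to picture. A hypothetical sketch, with the function name and prompt wording invented for illustration (not any real player's API):

Code:
def try_play(file_level: float, device_level: float, expert_mode: bool) -> bool:
    """Hypothetical playback gate along the lines suggested above."""
    if file_level <= device_level:
        return True                       # within certification: just play it
    if not expert_mode:
        return False                      # hide the foot-gun from casual users
    answer = input(f"File is flagged L{file_level} but this device is only "
                   f"certified for L{device_level}. Playback may stutter or "
                   f"fail. Try anyway? [y/N] ")
    return answer.strip().lower() == "y"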
Old 18th November 2008, 22:19   #36  |  Link
DigitAl56K
Registered User
 
Join Date: Nov 2002
Location: San Diego, CA
Posts: 936
Turtleggjp,

Level 4.0 enables very high-quality 1080p video and allows much more extensive device support. Contrary to your point about Blu-ray, there is no legal issue I'm aware of around the choice of level. The maximum video bitrate for Level 4.1 is 2.5x that of 4.0.
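That 2.5x checks out against the Table A-1 base rates:

Code:
# Base MaxBR values (units of 1000 bit/s) from H.264 Table A-1
assert 50_000 / 20_000 == 2.5   # Level 4.1 allows 2.5x the bitrate of Level 4.0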
Old 18th November 2008, 22:28   #37  |  Link
Atak_Snajpera
RipBot264 author
 
 
Join Date: May 2006
Location: Poland
Posts: 7,806
For those who are obsessed with Level 4.1:
20 Mbps is more than enough for a 1080p stream, so why would you want 50 Mbps?! I'm with DigitAl56K ...

Old 18th November 2008, 22:33   #38  |  Link
Tagert
Registered User
 
 
Join Date: Sep 2006
Location: Karlskrona, Sweden
Posts: 72
I agree with the gentlemen above me.
The level support is good enough as it is.
Old 19th November 2008, 18:37   #39  |  Link
Turtleggjp
Registered User
 
Join Date: Apr 2006
Posts: 225
Quote:
Originally Posted by DigitAl56K View Post
Turtleggjp,

Level 4.0 enables very high-quality 1080p video and allows much more extensive device support. Contrary to your point about Blu-ray, there is no legal issue I'm aware of around the choice of level. The maximum video bitrate for Level 4.1 is 2.5x that of 4.0.
I understand that there is no legal issue in implementing a chip capable of decoding L4.1 video. However, considering the primary source of such video is Blu-ray discs, which we aren't supposed to have free access to, I don't think too many manufacturers are willing to produce such a device. Since you are not trying to be a Blu-ray-caliber player, I agree that L4.0 should be enough for the average user, which is whom you are targeting with your product.
Old 19th November 2008, 19:30   #40  |  Link
Guest
Guest
 
Join Date: Jan 2002
Posts: 21,901
>I don't think too many manufacturers are willing to produce such a device

4.1 support is pretty standard for dedicated AVC decoder ICs.