Old 4th January 2011, 06:08   #15621  |  Link
joe42
Registered User
 
Join Date: Sep 2010
Posts: 25
An excerpt from Anand's Sandy Bridge review,

http://www.anandtech.com/show/4083/t...-2100-tested/7

Quote:
The limitation is entirely in hardware, particularly in what’s supported by the 5-series PCH (remember that display output is routed from the processor’s GPU to the video outputs via the PCH). One side effect of trying to maintain Intel’s aggressive tick-tock release cadence is there’s a lot of design reuse. While Sandy Bridge was a significant architectural redesign, the risk was mitigated by reusing much of the 5-series PCH design. As a result, the hardware limitation that prevented a 23.976Hz refresh rate made its way into the 6-series PCH before Intel discovered the root cause.

....

Intel has committed to addressing the problem in the next major platform revision, which unfortunately seems to be Ivy Bridge in 2012. There is a short-term solution for HTPC users absolutely set on Sandy Bridge. Intel has a software workaround that enables 23.97Hz output. There’s still a frame rate mismatch at 23.97Hz, but it would be significantly reduced compared to the current 24.000Hz-only situation.

MPC-HC Compatibility Problems

Just a heads up. Media Player Classic Home Cinema doesn't currently play well with Sandy Bridge. Enabling DXVA acceleration in MPC-HC will cause stuttering and image quality issues during playback. It's an issue with MPC-HC and not properly detecting SNB as far as I know. Intel has reached out to the developer for a fix.
Hmm, so now there is a software kludge for Intel HD Graphics to output at 23.97 Hz instead of 23.976 Hz. If my calculations are correct, that changes it from a repeated frame every ~42 seconds to a skipped frame every ~167 seconds. I wonder how much better that will look.
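(Checking that arithmetic, assuming 24000/1001 fps source material; this isn't from the article, just the numbers:)

Code:
// Quick check of the repeated/skipped-frame intervals quoted above.
// Assumes 24000/1001 fps content; renderer/driver specifics don't matter here.
#include <cstdio>

int main()
{
    const double content     = 24000.0 / 1001.0; // ~23.976 fps source material
    const double refresh24   = 24.000;           // what the Intel hardware outputs today
    const double refresh2397 = 23.970;           // the proposed software workaround
    std::printf("repeated frame every %.1f s at 24.000 Hz\n", 1.0 / (refresh24 - content));
    std::printf("skipped frame every %.1f s at 23.970 Hz\n",  1.0 / (content - refresh2397));
    return 0;
}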

That part about Intel reaching out to "the developer" of MPC-HC is odd. Who did they talk to? Is Intel contributing code for MPC-HC?

Last edited by joe42; 4th January 2011 at 06:29.
joe42 is offline  
Old 4th January 2011, 10:28   #15622  |  Link
bobdynlan
Beyond the Corn Border
 
bobdynlan's Avatar
 
Join Date: Jul 2009
Location: 4th Roman Empire
Posts: 93
joe42, there is no spoon. In other words, I'll bet D3DFS fooled you into thinking you've got smooth 10-bit output, but what actually happened is that D3DFS switched 16-235 on. And there is also the display itself, which makes on-the-fly adjustments even if you set it to specific values. Did you notice a slight drop in brightness? Because that's what happens when 16-235 goes on. And please note that windowed tests are irrelevant; this whole 10-bit thing only works in full-screen mode. After ~2h of staring at a solar eclipse, my conclusion might be wrong.
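For reference, "16-235 on" amounts to this mapping on an 8-bit full-range value; a minimal sketch (not the renderer's actual code) of why white drops from 255 to 235 and black rises from 0 to 16, which reads as that slight brightness/contrast change:

Code:
// Minimal sketch: compress a full-range (0-255) 8-bit value into video range (16-235).
// Peak white becomes 235 and black becomes 16, hence the perceived level shift.
#include <cstdint>

uint8_t CompressToVideoRange(uint8_t full)
{
    return static_cast<uint8_t>(16.0 + full * 219.0 / 255.0 + 0.5); // round to nearest
}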

Anyway, this whole FP thing does little for 8-bit displays at the moment, and that is where the resources should be spent, not on the 0.05% of users who have 10-bit capable (though not advertised as such) hardware. Maybe it's time for a dither implementation, now that direct processing sort of works, JanWillem32?
bobdynlan is offline  
Old 4th January 2011, 11:06   #15623  |  Link
joe42
Registered User
 
Join Date: Sep 2010
Posts: 25
Quote:
Originally Posted by bobdynlan View Post
joe42, there is no spoon.
There is no joe42 in the rest of that conversation, either.
joe42 is offline  
Old 4th January 2011, 11:47   #15624  |  Link
janos666
Registered User
 
Join Date: Jan 2010
Posts: 479
Quote:
Originally Posted by bobdynlan View Post
joe42, there is no spoon.
I think I will answer these on behalf of joe42.

Quote:
Originally Posted by bobdynlan View Post
In other words, I'll bet D3DFS fooled you into thinking you've got 10bit smooth output, but instead what happened was that D3DFS switched 16-235 on. And there is the display itself that does on-the-fly adjustments even if you set it at certain figures. Did you notice a slight drop in brightness level? Because that's what happens when 16-235 goes on.
I don't think so.
Almost everything can happen when I connect my display to my Radeon HD5850 with an HDMI cable: PC-TV level mismatches (ignored pixel format settings and chaotic PC->TV conversions at HDTV resolutions), double-corrected (desaturated) colors in xvYCC mode, and other random things (like slight flickering every ~20 minutes).
But it's absolutely stable and reasonable over a DisplayPort connection. The driver doesn't think it has to be a smartass and compensate for a broken player/HDTV combo, and the display doesn't think it's fighting a broken media player that in turn thinks it's dealing with a broken HDTV; or God knows why the hell they do these chaotic things.

On the other hand... the display works with a 12-bit controller, so I guess it would do a nice job with a TV-PC level conversion.
But the possibility that EVR could tell the display to do this expansion and that it really did it... almost zero... (Maybe with an HDMI connection and the help of this solar eclipse, but only once in a lifetime...)

It's an LCD with a ~950:1 contrast ratio. It's far from perfect, merely "acceptable" for me. So it's easy to tell whether the 0-16 values are being clipped: dark tones would turn into fairly bright greys.

Quote:
Originally Posted by bobdynlan View Post
And please note that windowed tests are irrelevant, this whole 10-bit thing works only on full-screen mode. After ~2h of staring at a solar eclipse, my conclusion might be wrong

Anyway, this whole FP thing does little thing for 8-bit displays at the moment, and that is where the resources should be spent, not for the 0.05% 10-bit capable (not advertised) hardware, users have. Maybe it's time for a dither implementation, now that direct processing sort of works, JanWillem32?
If you had read everything I wrote in that post, you would know that...
The "10-bit input" without the "10-bit output" used 10-bit RGB and dithered it back down to the 8-bit display output (older builds, not these test versions), which produced a smooth gradient.

And it also worked in windowed mode! You need D3DFS for a real 10-bit display mode, but it isn't necessary for 10-bit processing and dithering as long as the display mode is the usual 8-bit. (At least as far as I can remember. I usually watch movies with madVR; I'm just here to do some experiments with 10-bit output and XYZ-LUT based CMS.)

Why do you think that it wasn't 10-bit?
The hardware is capable, the software is theoretically capable, and it looks like it works. Why can't you believe that it worked? Didn't it work for you with a TN display, or an expensive but unfortunately incompatible HDTV, or what?

Last edited by janos666; 4th January 2011 at 11:52.
janos666 is offline  
Old 4th January 2011, 12:09   #15625  |  Link
bobdynlan
Beyond the Corn Border
 
bobdynlan's Avatar
 
Join Date: Jul 2009
Location: 4th Roman Empire
Posts: 93
Sorry joe42, it's easy to get disoriented with all these clone-type names. janos666, you are right. With all those panel lotteries going on, I did not expect any change in the business model to favour the customer. So it's an 8-bit panel with dithering advertised as 10-bit, but it still works better in Photoshop than in MPC-HC?! It's hard to find a professional display around here; I will search some more.
bobdynlan is offline  
Old 4th January 2011, 13:26   #15626  |  Link
JanWillem32
Registered User
 
JanWillem32's Avatar
 
Join Date: Oct 2010
Location: The Netherlands
Posts: 1,083
Higher-than-8-bit processing formats are quite useful when you have to: extend Y'CbCr values to full range, up-scale chroma resolutions, convert Y'CbCr formats to RGB formats, convert input gamma functions, scale video resolutions, apply display output gamma functions (+ color corrections), and do any other kind of filtering besides that. Each of those steps can change the color values of the original pixel enormously. Using a limited 8-bit processing format truncates the output of each step to 8 bits, causing rounding errors that accumulate.
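A tiny numeric illustration of that truncation problem, using just two of the steps listed above (limited-to-full range expansion, then a 2.2 gamma conversion); the exact pipeline differs, the compounding rounding error is the point:

Code:
// Push one limited-range luma sample through two processing steps, once with a
// full-precision intermediate and once truncated to 8 bits in between.
#include <cmath>
#include <cstdio>

int main()
{
    const double y = 37.0;                               // 8-bit limited-range luma sample
    const double expanded = (y - 16.0) * 255.0 / 219.0;  // ~24.45 after range expansion

    const double keepPrecision = std::pow(expanded / 255.0, 2.2) * 255.0;             // gamma step on the exact value
    const double truncated8bit = std::pow(std::floor(expanded) / 255.0, 2.2) * 255.0; // gamma step after 8-bit truncation

    std::printf("full precision: %.3f   8-bit intermediate: %.3f\n", keepPrecision, truncated8bit);
    // The difference is small for one step, but it compounds with every additional
    // 8-bit truncation in the chain (chroma scaling, CMS, output gamma, ...).
    return 0;
}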
Another thing is dithering. If the ditherer receives an input format with the same accuracy as the output format, it shouldn't activate at all. Dithering should be done by selectively rounding in-between values up or down from the input format to the output format. If the processing format doesn't deliver those extra bits, there's nothing left to round up or down.
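That "selectively round up or down" idea, reduced to a single channel; a minimal sketch of the arithmetic only (a real renderer would do this per pixel in a shader, with the threshold coming from an ordered matrix or a noise texture):

Code:
// Quantize a high-precision value (0.0-1.0) to 8 bits, rounding up for a fraction
// of the pixels proportional to the leftover bits so the local average stays correct.
#include <algorithm>
#include <cmath>
#include <cstdint>

uint8_t DitherTo8Bit(float value, float threshold /* 0.0-1.0, varies per pixel */)
{
    const float scaled = value * 255.0f;
    const float base   = std::floor(scaled);        // round-down candidate
    const float frac   = scaled - base;             // the part 8-bit output can't carry
    const float result = (frac > threshold) ? base + 1.0f : base;
    return static_cast<uint8_t>(std::clamp(result, 0.0f, 255.0f));
}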
I also don't like the current type of dithering. The color resolutions it can use are okay, even for 10-bit output, but the method suits still image dithering the most. I tried to write a ditherer that uses a different method in a pixel shader, but it's very hard to get it just right, also considering the temporal dithering issues. I agree that the usage and type of dithering should be user-selectable (maybe even with the GPU/CPU processing costs indicated in the menu).
I worked quite a bit on making it possible to render with only 4×fp32 surfaces (the maximum allowed surface format in DirectX 9). Later on, it will be important to choose what formats are sane for 10-bit output, 8-bit output and 8-bit output on slower machines.
I'm already satisfied with the current handling of the 10-bit backbuffer and display formats. The only big problem is that exiting video from D3DFS with 10-bit output can cause black or darkened screens, but I think that can be solved in a while. (Build 1824 had the same problem.)
JanWillem32 is offline  
Old 4th January 2011, 14:43   #15627  |  Link
Ingram
Registered User
 
Join Date: Apr 2004
Posts: 56
MPC-HC has started doing some weird stuff when going into fullscreen mode on my Windows 7 HTPC with v1.4.2499. I never saw it before I switched to W7. Basically I go into fullscreen mode, but the left and bottom edges still display the Windows desktop in the background. If I exit and re-enter fullscreen 2-3 times it eventually fixes itself.

Any solutions?

Edit: It seems it doesn't do it when you go from a non-maximised window to fullscreen. But if the window is maximised and you enter fullscreen mode, the problem is there.

This is occurring while watching a 720p MKV with DXVA. Not sure if it does it with AVI or non-DXVA content yet.

Last edited by Ingram; 4th January 2011 at 14:48.
Ingram is offline  
Old 4th January 2011, 15:02   #15628  |  Link
namaiki
Registered User
 
Join Date: Sep 2009
Location: Sydney, Australia
Posts: 1,073
What is your system font size set to? Also, are you using the 32-bit or 64-bit build of MPC-HC? How many monitors are in use? Which video renderer is being used?
namaiki is offline  
Old 4th January 2011, 15:08   #15629  |  Link
JanWillem32
Registered User
 
JanWillem32's Avatar
 
Join Date: Oct 2010
Location: The Netherlands
Posts: 1,083
Ingram, what display driver are you using? It sounds like the backbuffer isn't being cleared automatically. Try the VMR-9 (renderless), EVR-CP and EVR Sync renderers; I'm fairly sure those clean up the buffers properly.
JanWillem32 is offline  
Old 4th January 2011, 15:11   #15630  |  Link
mark0077
Registered User
 
Join Date: Apr 2008
Posts: 1,106
Quote:
Originally Posted by Ingram View Post
MPC-HC has started doing some weird stuff when going into fullscreen mode on my Windows 7 HTPC with v1.4.2499. I never saw it before I switched to W7. Basically I go into fullscreen mode, but the left and bottom edges still display the Windows desktop in the background. If I exit and re-enter fullscreen 2-3 times it eventually fixes itself.

Any solutions?

Edit: It seems it doesn't do it when you go from a non-maximised window to fullscreen. But if the window is maximised and you enter fullscreen mode, the problem is there.

This is occurring while watching a 720p MKV with DXVA. Not sure if it does it with AVI or non-DXVA content yet.
Yes, I reported this ages ago; it only happens with the EVR renderers. It doesn't occur with madVR, and yes, it doesn't happen from a non-maximized window. I have one monitor, Windows 7 64-bit and an NVIDIA GTX 295 in my case.
mark0077 is offline  
Old 4th January 2011, 15:14   #15631  |  Link
Mercury_22
Registered User
 
Join Date: Dec 2007
Posts: 1,138
Quote:
Revision 2809 - Directory Listing
Modified Tue Jan 4 12:41:20 2011 UTC (75 minutes, 48 seconds ago) by aleksoid

Add : MPC Audio Renderer - select audio device;
@aleksoid, can you please also correct the default "Speaker configuration for 8 input channels" in the custom matrix?

Because when "Enable custom channel mapping" is enabled, the default mapping for 8 channels is wrong: channel 7 is mapped to "Front Left of Center" instead of "Back Left", and channel 8 is mapped to "Front Right of Center" instead of "Back Right".

When "Enable custom channel mapping" is not enabled, the mapping is correct!
__________________
Intel UHD Graphics 750; Win 10 22H2

Last edited by Mercury_22; 4th January 2011 at 15:24.
Mercury_22 is offline  
Old 4th January 2011, 15:34   #15632  |  Link
Ingram
Registered User
 
Join Date: Apr 2004
Posts: 56
Quote:
Originally Posted by mark0077 View Post
Yes I reported this ages ago, only happens with the evr renderers. Doesn't occur with madVR, and yes it doesn't happen from a non maximized window. I have 1 monitor, windows 7 64bit, nvidia gtx295 in my case.
I'm using EVR Sync with ReClock controlling VSync, Win 7 64-bit and ATI 10.12 drivers. Also just the one monitor.

As for system font, default? I've not touched it.
Ingram is offline  
Old 4th January 2011, 15:41   #15633  |  Link
janos666
Registered User
 
Join Date: Jan 2010
Posts: 479
Quote:
Originally Posted by bobdynlan View Post
So it's an 8-bit panel with dithering advertised as 10-bit, but it still works better in Photoshop than in MPC-HC?!
Yes, the LCD panel receives 10-bit values and visualizes them via A-FRC (a kind of temporal dithering). But it's very hard to notice from a usual viewing distance; it's not as noisy as madVR, and I think it's enough for me (according to my experiments with Photoshop CS5 and HQ XYZ-LUT profiles).

But take a look at some 6+2 bit TN panels. They are significantly better than plain 6-bit TN panels. When you do 8+2 bit, the side effects are much weaker.
And then there is the internal processing. It's better to feed the display with 10-bit before it sends the signal through the 12-bit internal LUT and corrects the white point according to the R, G, B gains.

And maybe I will get a real 10-bit panel in the "not too distant future". The current dithering is enough to test the possibilities (on the software side) and judge whether it's worth the money (i.e. whether the difference is visible).
janos666 is offline  
Old 5th January 2011, 11:13   #15634  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,347
Quote:
Originally Posted by janos666 View Post
10-bit input = ??? - Somebody said it requests 10-bit from the decoder. Why...? The source is 8-bit and I don't want any image manipulation on the decoder side. I want to output 10-bit after the image manipulations on the renderer side (like level conversion, CMS, etc.)
The EVR consists of two components, the EVR Mixer and the EVR Presenter. The Mixer talks to the decoder, and can mix multiple video streams onto each other, before passing them on to the Presenter. The Mixer is also responsible for color space conversion. The Presenter is then responsible for showing the samples at the correct time.

MPC-HC uses the default Microsoft EVR Mixer.

The MPC-HC EVR Presenter only accepts RGB input, so it lets the Mixer do the YUV->RGB conversion. By forcing 10-bit input, it tries to get the Mixer to output 10-bit data from this conversion, and hopefully also to process it internally at that precision (it might do float processing internally anyway, or use the output format; I don't think MS ever exposed those details). It does not affect the output from the decoder.
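To make the Mixer/Presenter split a bit more concrete: the presenter negotiates the mixer's output type through IMFTransform, roughly like the sketch below. This is not MPC-HC's actual code, and the D3DFMT_A2R10G10B10 check leans on the convention that uncompressed RGB subtype GUIDs carry the D3DFORMAT value in Data1, so treat it as an illustration only.

Code:
// Sketch: enumerate the EVR mixer's available output types and prefer a
// 10-bit RGB one when "force 10-bit input" is requested. Error handling trimmed.
#include <mfapi.h>
#include <mfidl.h>
#include <mftransform.h>
#include <d3d9types.h>

HRESULT SelectMixerOutputType(IMFTransform* pMixer, bool prefer10Bit)
{
    IMFMediaType* pBest = nullptr;
    for (DWORD i = 0; ; ++i) {
        IMFMediaType* pType = nullptr;
        if (FAILED(pMixer->GetOutputAvailableType(0, i, &pType)))
            break;                                   // no more candidates
        GUID subtype = {};
        pType->GetGUID(MF_MT_SUBTYPE, &subtype);
        const bool is10BitRGB = (subtype.Data1 == D3DFMT_A2R10G10B10);
        const bool is8BitRGB  = (subtype.Data1 == D3DFMT_X8R8G8B8);
        if ((prefer10Bit && is10BitRGB) || (!prefer10Bit && is8BitRGB)) {
            if (pBest) pBest->Release();
            pBest = pType;                           // keep the preferred candidate
        } else {
            pType->Release();
        }
    }
    if (!pBest)
        return E_FAIL;                               // let the caller fall back
    const HRESULT hr = pMixer->SetOutputType(0, pBest, 0);
    pBest->Release();
    return hr;
}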
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
nevcairiel is online now  
Old 5th January 2011, 14:41   #15635  |  Link
Fadeout
Registered User
 
Join Date: Dec 2009
Posts: 150
Is there a list of ATI cards and which H.264 decoding capabilities are supported on each?

For example, I know that my 4850 can deal even with 1080p at L5.1, and I want to know which budget models on sale now support all kinds of formats.
Fadeout is offline  
Old 5th January 2011, 14:53   #15636  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,347
http://en.wikipedia.org/wiki/Unified_Video_Decoder

All versions of UVD should support H.264 decoding. You'll get the best results with the UVD 2.2 or UVD 3 models, of course.
Also don't forget that you may need GPU performance for rescaling and deinterlacing; however, the 5450 (the lowest card in the 5xxx series) already appears to handle that just fine.
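If you want to see what your own card/driver actually exposes rather than relying on a table, you can also just ask DXVA2 directly. A rough sketch (standard d3d9/dxva2 calls, error handling and interface cleanup trimmed):

Code:
// Enumerate the DXVA2 decoder profiles the installed driver exposes and look
// for the H.264 VLD profile. Needs a throwaway D3D9 device to query against.
#include <initguid.h>
#include <d3d9.h>
#include <dxva2api.h>
#include <cstdio>

#pragma comment(lib, "d3d9.lib")
#pragma comment(lib, "dxva2.lib")

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed = TRUE;
    pp.SwapEffect = D3DSWAPEFFECT_DISCARD;
    pp.hDeviceWindow = GetDesktopWindow();

    IDirect3DDevice9* dev = nullptr;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, GetDesktopWindow(),
                      D3DCREATE_SOFTWARE_VERTEXPROCESSING, &pp, &dev);

    IDirectXVideoDecoderService* svc = nullptr;
    DXVA2CreateVideoService(dev, __uuidof(IDirectXVideoDecoderService),
                            reinterpret_cast<void**>(&svc));

    UINT count = 0;
    GUID* guids = nullptr;
    svc->GetDecoderDeviceGuids(&count, &guids);
    for (UINT i = 0; i < count; ++i) {
        if (guids[i] == DXVA2_ModeH264_VLD_NoFGT)
            std::printf("driver exposes the H.264 VLD (no film grain) profile\n");
    }
    CoTaskMemFree(guids);   // interface releases omitted for brevity
    return 0;
}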
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
nevcairiel is online now  
Old 5th January 2011, 19:41   #15637  |  Link
Fadeout
Registered User
 
Join Date: Dec 2009
Posts: 150
Yes, I was more worried about the specifics. For example, it seems some cards can't handle 1080p at L5.1, or can't handle more than 8 ref frames.

It seemed that it didn't depend entirely on the chip alone. Mine has just UVD 2 and can play everything (besides LQ, which is a driver thing). Some other cards seem to work fine only up to L4.1.

So could a 5450 do 1080p with L5.1 and 16 ref frames like mine?
Fadeout is offline  
Old 5th January 2011, 19:46   #15638  |  Link
hoborg
Registered User
 
Join Date: Nov 2008
Posts: 454
Quote:
Originally Posted by Fadeout View Post
Yes, I was more worried about the specifics. For example, it seems some cards can't handle 1080p at L5.1, or can't handle more than 8 ref frames.

It seemed that it didn't depend entirely on the chip alone. Mine has just UVD 2 and can play everything (besides LQ, which is a driver thing). Some other cards seem to work fine only up to L4.1.

So could a 5450 do 1080p with L5.1 and 16 ref frames like mine?
Yes. Just remember you will need a DXVA 2.0 capable OS (Vista/Win7).
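The 16-ref-frame question comes down to the H.264 level table: MaxDpbMbs divided by the frame size in macroblocks, capped at 16. A quick check with the standard table values:

Code:
// Max reference frames at 1920x1088 for a given H.264 level:
// MaxDpbMbs / (width_in_MBs * height_in_MBs), capped at 16.
#include <algorithm>
#include <cstdio>

int MaxRefFrames(int maxDpbMbs, int widthMbs, int heightMbs)
{
    return std::min(16, maxDpbMbs / (widthMbs * heightMbs));
}

int main()
{
    const int w = 1920 / 16, h = 1088 / 16;                            // 120 x 68 macroblocks
    std::printf("L4.1: %d ref frames\n", MaxRefFrames(32768, w, h));   // -> 4
    std::printf("L5.1: %d ref frames\n", MaxRefFrames(184320, w, h));  // -> 16
    return 0;
}

So a decoder limited to L4.1 streams tops out at 4 refs for 1080p, while an L5.1-capable one covers the full 16, which lines up with cards that only handle streams up to L4.1.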
__________________
Working machine: Win10x64 + Intel Skull Canyon
My HTPC.

How to start with Bitcoin
hoborg is offline  
Old 5th January 2011, 21:59   #15639  |  Link
G_M_C
Registered User
 
Join Date: Feb 2006
Posts: 1,076
Just saw this (SDK documentation for AMD/Ati Open Video Decode API) and wondered if this is interesting to you guys.

JanWillem: I'm following your progress with great interest. Curious to see how it will all turn out in MPC-HC!
G_M_C is offline  
Old 6th January 2011, 06:36   #15640  |  Link
mariner
Registered User
 
Join Date: Nov 2005
Posts: 583
Quote:
Originally Posted by Fadeout View Post
Yes, I was more worried about the specifics. For example, it seems some cards can't handle 1080p at L5.1, or can't handle more than 8 ref frames.

It seemed that it didn't depend entirely on the chip alone. Mine has just UVD 2 and can play everything (besides LQ, which is a driver thing). Some other cards seem to work fine only up to L4.1.

So could a 5450 do 1080p with L5.1 and 16 ref frames like mine?
Greetings Fadeout.

micksh is doing some UVD3 testing which may be of interest to you.

http://www.avsforum.com/avs-vb/showp...&postcount=120
mariner is offline  