Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Old 23rd November 2012, 04:39   #15681  |  Link
crotecun
Registered User
 
Join Date: Oct 2012
Posts: 27
Quote:
Originally Posted by rahzel View Post
Ya, everything in the video quality settings is disabled, too. And again, EVR CP + DXVA doesn't change the colors. As shown in my screenshots a page back, my HD4000 seems to change the gamma a bit, too.

The good news is that it does indeed work. CPU usage dropped from ~30% to ~8% on my somewhat dated HTPC (AMD Athlon II 250 + Radeon 5570).
Is that so... well then, try the rollback method for your card! It's what I did just now, since I couldn't quite tell whether disabling everything for video color and quality in AMD's Catalyst Control Center really does disable everything.

To be sure that it doesn't keep any Catalyst control center settings, you have to uninstall the video card driver too.



That wipes the registry entries for your video card under HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Video. When the operating system automatically reinstalls the video drivers after you restart your PC, you get a clean slate without any Catalyst settings.

Before, with Catalyst control panel installed - left side is the unaltered video, right side is with all the Catalyst video color and quality settings on:



After, with Catalyst control panel uninstalled, video card drivers uninstalled from device manager and registry settings wiped:

crotecun is offline   Reply With Quote
Old 23rd November 2012, 04:49   #15682  |  Link
mindbomb
Registered User
 
Join Date: Aug 2010
Posts: 576
With AMD drivers, even with the color settings set to use the player's preference, the advanced video color options still default to having a bunch of crap enabled.
Also with AMD, I get crashes on seeking when using LAV Splitter and DXVA2 with bilinear chroma in fullscreen mode.
mindbomb is offline   Reply With Quote
Old 23rd November 2012, 04:52   #15683  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
Quote:
Originally Posted by ajp_anton View Post
Green line at the top of the screen when using DXVA scaling and any kind of decoding, including madVR's own.

http://ajpanton.se/sample.mkv

Win7 x64, MPC-HC 6240, Intel HD3000.
dxva2 scaling works fine with hd 4000
tested with quicksync, dxva2 native, dxva2 cp and lav software decoder
huhn is offline   Reply With Quote
Old 23rd November 2012, 05:05   #15684  |  Link
Gary.M
Registered User
 
Join Date: Sep 2011
Posts: 14
Quote:
Originally Posted by madshi View Post
But let's get back to the roots: Do we really want to replicate what old CRTs did? Does it not make more sense to design the controls today in such a way that they serve their purpose better?
No we don't, but the situation is the same... we need to adjust the range somehow without distorting the greyscale. Maybe look at it, based on my graph, like this: brightness adjusts the slope of the line with the current white point fixed as the pivot; contrast adjusts the slope of the line with the current black point fixed as the pivot. That way we get a full range of adjustment, independence between the black and white points and controls, and the greyscale isn't mucked up.

Sorry I'm an analogue guy.
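Gary.M's two pivoted controls can be sketched numerically (a hedged illustration of the proposal above, not madVR's actual code; values are normalized to 0.0-1.0):

```python
def apply_brightness(v, slope, white=1.0):
    # Brightness per the proposal: change the slope of the transfer line
    # while the current white point stays fixed as the pivot.
    return white - (white - v) * slope

def apply_contrast(v, slope, black=0.0):
    # Contrast: change the slope while the current black point stays
    # fixed as the pivot.
    return black + (v - black) * slope
```

With both controls the pivot endpoint is unaffected, so adjusting one never moves the other's anchor, which is the independence Gary.M is after.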
Gary.M is offline   Reply With Quote
Old 23rd November 2012, 05:11   #15685  |  Link
cyberbeing
Broadband Junkie
 
Join Date: Oct 2005
Posts: 1,859
Quote:
Originally Posted by 6233638 View Post
So I'm not sure what conclusion to draw from all this. If your problem is specifically with 60fps content, the first thing I would try is switching to software decoding on the CPU to see if that helps (assuming your CPU is up to the task) but it does seem most likely to be a GPU load issue, unfortunately.
I never have used GPU decoding and always have my GPU overclocked in a full-power state.

The only conclusion is that the GT 440 DDR5 (16-19ms rendering times) and lesser GPUs, even in a full-power state, don't have enough muscle for 1920x1080p60 Catmull-Rom linear light + AR downscaling. As for 1920x1080i30->60p downscaling, I'd assume it would need at least a GT 650 or GT 650 Ti in a full-power state, considering rendering times were 21-31ms + 3ms on the GT 440. It got to the point where I'd rather have reliable playback on all types of content than keep switching settings around all the time, only to be greeted by unexpected dropped frames.

As madshi stated a few times in the past, downscaling is more GPU intensive than the same algorithm used for upscaling.

Last edited by cyberbeing; 23rd November 2012 at 05:21.
cyberbeing is offline   Reply With Quote
Old 23rd November 2012, 07:44   #15686  |  Link
ryrynz
Registered User
 
ryrynz's Avatar
 
Join Date: Mar 2009
Posts: 3,646
I noticed enabling DXVA upscaling on my Intel HD4000 changes the color output: some colors are darkened, some are lightened. Everything under the media tab is set to application settings or is unticked. There's a slight pixel shift as well *shrug*

DXVA upscaling vs Jinc 3 AR.

http://screenshotcomparison.com/comparison/159646
ryrynz is offline   Reply With Quote
Old 23rd November 2012, 09:26   #15687  |  Link
TheLion
Registered User
 
Join Date: Dec 2010
Posts: 62
Quote:
Originally Posted by ryrynz View Post
I noticed enabling DXVA upscaling on my Intel HD4000 changes the color output: some colors are darkened, some are lightened. Everything under the media tab is set to application settings or is unticked. There's a slight pixel shift as well *shrug*

DXVA upscaling vs Jinc 3 AR.

http://screenshotcomparison.com/comparison/159646
In this comparison the DXVA looks A LOT better than madVR Jinc3 to my eyes. I'm especially talking about the color/contrast differences. Is the same color matrix used?
TheLion is offline   Reply With Quote
Old 23rd November 2012, 10:20   #15688  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
Quote:
Originally Posted by ajp2k11 View Post
The windowed backbuffer setting is already at its default 3? I switched to fullscreen because I wanted that logged too, but that didn't work either. It seems some files are harder to get to play than others; the one I'm testing with now seems impossible... others work after a while if I leave it alone or switch between windowed and fullscreen a couple of times, at least I think so...

EDIT: Some files play ok but some files just refuse to play...? They play fine using EVR/CP + LAV...
Ah sorry, I misread the log file. Just checked it again. The problem is not actually the number of backbuffers. The real problem is that the GPU driver is flat out refusing to provide madVR with proper VSync scanline information. I'm not sure why. Does your laptop have switchable Intel + AMD graphics? If so, does the problem go away if you turn the AMD GPU off? Or can you alternatively turn the Intel GPU off? I'm not sure what to suggest. madVR requires access to proper VSync scanline information in order to be able to draw the video frames at the right time. I would guess there's a problem with your drivers somehow...
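For illustration, the timing a renderer can derive from the scanline information could look roughly like this (a simplified sketch, not madVR's actual code):

```python
def ms_until_vblank(current_scanline, total_scanlines, refresh_hz):
    # Time for the raster to sweep one full frame, in milliseconds.
    frame_ms = 1000.0 / refresh_hz
    # Fraction of the frame the raster still has to scan before vblank.
    remaining = (total_scanlines - current_scanline) / total_scanlines
    return remaining * frame_ms
```

If the driver reports bogus scanline values (as it apparently does here), this estimate becomes useless and frames can't be presented at the right time.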

Quote:
Originally Posted by leeperry View Post
The floating point stuff works for me in EVR, but it's visually identical to the regular mode.
Oh, interesting. That somehow differs from how I understood Jan-Willem.

Quote:
Originally Posted by leeperry View Post
So how come the PS gamut mapping script measures perfectly if everything is crushed like hell? This is confusing =/
I never said anything was crushed like hell. I just said that with standard settings, using custom shaders with EVR cuts BTB and WTW. That's not a *big* problem, it's probably not visible at all on a properly calibrated display, but I still consider it wrong. Anyway, please just have a bit of patience. As I hinted several times already, I'm still in discussion with Jan-Willem about this.

Quote:
Originally Posted by TheShadowRunner View Post
Arg, but hmm this doesn't help?
DXVA_1.01 API
Or those 2 topics: 1/2 ?
All of that talks about how to write a DXVA1 decoder, not a renderer. Writing a DXVA1 decoder is not the problem. You just connect to VMR and provide all the bitstream to VMR. But if a DXVA1 decoder connects to madVR and delivers the DXVA1 bitstream packages to madVR, I don't know what to do with it. It's not documented anywhere how VMR passes the DXVA1 bitstream packets to the GPU for decoding, so I don't know how to do that.

In any case, DXVA decoding on XP is problematic. It's always been very unstable (blue screens etc) on my XP PC. And there's a fundamental problem when resizing the window, too. Resizing the window results in madVR trying to reset the D3D9 device, but in XP that is only possible if all GPU resources are destroyed first. That makes things very complicated when using external DXVA decoders, because they hold some GPU resources as well. In Vista (and Win7 and Win8) resetting the D3D9 device is possible without having to destroy all GPU resources first, so that's a much friendlier environment for DXVA decoding. As a result I have no intention of supporting DXVA1/2 decoding on XP at all. Even if documentation for writing a DXVA1 renderer suddenly showed up, I'd probably not support it for XP because of the D3D9 device reset problem. FWIW, I fully support DXVA2 deinterlacing and scaling on XP and they work very stably for me. But right now the chances of madVR users getting DXVA1/2 decoding support on XP are extremely low.

Quote:
Originally Posted by secvensor View Post
When switching between algorithms, Jinc becomes unavailable in v0.85.1
I've tried, but I can't seem to reproduce this. Can you please provide me with a step-by-step instruction to reproduce this problem? Thanks!

Quote:
Originally Posted by 6233638 View Post
I've actually just got in and had a chance to actually measure what your control is doing on my displays, and it is exactly what you have said. I misunderstood what it was doing, and it works exactly as intended - it reduces saturation while keeping luminance constant - a true saturation control.
Cool, it's good to be on the same page again, and we can put this to rest now.
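For reference, a luminance-preserving saturation control like the one 6233638 measured can be sketched as follows (the Rec.709 luma coefficients are an assumption for the example; madVR's internals may differ):

```python
# Rec.709 luma coefficients (assumed for this sketch)
KR, KG, KB = 0.2126, 0.7152, 0.0722

def adjust_saturation(r, g, b, sat):
    # Scale each channel's distance from the pixel's luminance.
    # sat = 0.0 yields grayscale, 1.0 leaves the pixel untouched;
    # the weighted sum (luminance) stays constant either way.
    y = KR * r + KG * g + KB * b
    return (y + sat * (r - y), y + sat * (g - y), y + sat * (b - y))
```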

Quote:
Originally Posted by 6233638 View Post
I think this would be an excellent solution. It may cause some confusion for people used to traditional "brightness" and "contrast" controls, but as long as you have some kind of black & white level controls inside madVR as well, I think it will work well.
Great! Seems we've found a solution now which should satisfy most people.

Quote:
Originally Posted by 6233638 View Post
Again, I'll trust your judgement on this. My area of expertise is generally in actually calibrating and evaluating displays, and less-so in terms of image processing.

I'm still not fond of "image enhancement" processing, rather than trying to achieve the most accurate calibration, but it doesn't sound like a bad control to have for people that want it.
I'm not really a fan of that, either, but I've been told by several users that their displays lack important controls and so sometimes it can be useful to have such controls in madVR instead. Of course the recommended setting for all these options will be "neutral".

Quote:
Originally Posted by cyberbeing View Post
This isn't exactly a safe default from a performance perspective on weaker GPUs with 60fps content or when deinterlacing. On my GT440 using either AR or Linear for Image downscaling on such content is a no-go, which is why I switched to using normal Spline36 Image downscaling awhile back.
Quote:
Originally Posted by 6233638 View Post
Unfortunately it seems that when downsampling, the biggest performance hit is linear light scaling. When my card was set to the lowest power state, I found that I was able to use Lanczos 3 for downscaling, but couldn't use Catmull-Rom linear, even without the anti-ringing filter. So perhaps an option with linear light scaling enabled doesn't make for the best default, even though Catmull-Rom is typically one of the less demanding scaling algorithms.

The problem is that when downsampling images, ideally you should be using linear light scaling if at all possible.
Even though Catmull-Rom is actually one of the least demanding scaling algorithms in madVR right now, it happens to be the best looking when downsampling in linear light, as long as you have the anti-ringing filter enabled. If your GPU can't handle that combination though, I wouldn't recommend using linear light scaling at all.
Originally I was planning to use DXVA2 scaling as the new default option, because that should (hopefully) run smoothly even on rather slow GPUs. But with the original v0.85.0, DXVA2 scaling wasn't working well at all, so I decided to use different defaults. Now with v0.85.1, DXVA2 scaling seems to work fairly well, and it seems that the scaling algorithms offered especially by Intel, but maybe also by ATI and NVidia, might be acceptable as a default option for budget GPUs. What do you think? I would really like the first impression of new users to be positive. It can't be positive if the first impression is that madVR produces a slide-show. So maybe using DXVA2 scaling as the default option might be a good idea?
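The linear-light downscaling cost discussed above comes from the extra gamma decode/encode wrapped around the filter. A minimal sketch (a box filter stands in for Catmull-Rom to keep it short, and the pure power-law gamma of 2.2 is an approximation of the real transfer curve):

```python
GAMMA = 2.2  # approximation; real pipelines use the proper transfer curve

def to_linear(v):
    return v ** GAMMA

def to_gamma(v):
    return v ** (1.0 / GAMMA)

def downscale_2to1_linear_light(row):
    # Average neighboring pixel pairs in linear light rather than in
    # gamma-encoded space, which preserves perceived brightness better.
    out = []
    for a, b in zip(row[::2], row[1::2]):
        out.append(to_gamma((to_linear(a) + to_linear(b)) / 2.0))
    return out
```

Averaging in gamma space would give 0.5 for a black/white pixel pair; in linear light the result is noticeably brighter, which is why the conversion matters when downsampling and why skipping it changes the look.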

Quote:
Originally Posted by Aleksoid1978 View Post
Hi madshi.
Question: how can I find out which mode madVR is currently using (DXVA or not)? For its internal DXVA-compatible renderer, MPC-BE uses a hook to detect DXVA mode. Maybe you could add an API call for this?

If not, don't worry; we've also made it work using a hook.
No problem, I can add an interface for that.

Quote:
Originally Posted by ajp_anton View Post
Green line at the top of the screen when using DXVA scaling and any kind of decoding, including madVR's own.

http://ajpanton.se/sample.mkv

Win7 x64, MPC-HC 6240, Intel HD3000.
Which "target rectangle" does the madVR debug OSD (Ctrl+J) show when the green line shows up? Does the green line appear no matter which zoom factor you're using? Does it occur with every video, or just with some? Tnx.

Quote:
Originally Posted by rahzel View Post
Using madVR DXVA2 on my Radeon 5570 also shows different colors compared to software decoding, DXVA2-CB and using EVR CP DXVA. Only madVR DXVA2 native makes colors look different (judging by the screenshots taken by MPC HC anyway). All of my 'enhancements' in the drivers are either disabled or set to application preference. Using LAV 0.53.2 with the default LAV video decoder settings.
Quote:
Originally Posted by DragonQ View Post
It's set to nVidia Settings, everything in the Colour and Gamma tabs is default but Dynamic Contrast Enhancement and Colour Enhancement are off, with Dynamic Range set to 0-255.
Quote:
Originally Posted by ryrynz View Post
I noticed enabling DXVA upscaling on my Intel HD4000 changes the color output: some colors are darkened, some are lightened. Everything under the media tab is set to application settings or is unticked. There's a slight pixel shift as well *shrug*

DXVA upscaling vs Jinc 3 AR.

http://screenshotcomparison.com/comparison/159646
Can you guys please test the following:

(1) Use software decoding and e.g. Bilinear scaling.
(2) Use native DXVA2 decoding and e.g. Bilinear scaling.
(3) Use software decoding and DXVA2 scaling.
(4) Use native DXVA2 decoding and DXVA2 scaling.

The reference is (1) for colors, brightness, contrast and gamma. This is how the image must look. Please check which of (2), (3) and (4) are different from the reference and which are identical. Also please check if maybe (4) is even more different than (2) and (3) are.

(I don't need screenshots.)

Please also list your GPU, your drivers, and your OS.

Thank you!!!
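If you want to be certain a case really is identical to the reference rather than eyeballing it, the comparison can also be done programmatically; a trivial sketch (frames as lists of RGB tuples, however you capture them):

```python
def max_channel_diff(frame_a, frame_b):
    # Largest per-channel difference between two frames;
    # 0 means the two renders are pixel-identical.
    return max(abs(ca - cb)
               for pa, pb in zip(frame_a, frame_b)
               for ca, cb in zip(pa, pb))
```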

Quote:
Originally Posted by TheLion View Post
In this comparison the DXVA looks A LOT better than madVR Jinc3 to my eyes. I'm especially talking about the color/contrast differences. Is the same color matrix used?
The scaling algorithm itself (e.g. Jinc vs. Bilinear) does not have any effect on color/contrast. I'm just saying that to clarify that the DXVA2 scaling algorithm itself isn't better than Jinc. It just happens that the colors look different when using DXVA2 for some reason. This could be a bug in madVR, or a misbehaviour of the GPU drivers. In any case, the DXVA2 colors are very very likely to be incorrect. It's just a happy accident that they might look pleasing in this specific case.
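For what it's worth, a mismatched YCbCr-to-RGB matrix is one classic way colors end up shifted: the same YCbCr value decodes to different RGB under BT.601 vs BT.709 coefficients. The sketch below is purely illustrative (full-range values in 0.0-1.0, Cb/Cr centered at 0), not a claim about what the driver is actually doing here:

```python
def ycbcr_to_rgb(y, cb, cr, kr, kb):
    # Generic YCbCr -> RGB derived from Y = kr*R + kg*G + kb*B.
    kg = 1.0 - kr - kb
    r = y + 2.0 * (1.0 - kr) * cr
    b = y + 2.0 * (1.0 - kb) * cb
    g = (y - kr * r - kb * b) / kg
    return (r, g, b)

# BT.601: kr=0.299, kb=0.114    BT.709: kr=0.2126, kb=0.0722
```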
madshi is offline   Reply With Quote
Old 23rd November 2012, 10:50   #15689  |  Link
cyberbeing
Broadband Junkie
 
Join Date: Oct 2005
Posts: 1,859
On my GT 440, DXVA2 scaling seems to be Bilinear, yet with blurry, misaligned chroma (centered position instead of left position?), so I don't really think that's a great default either.
cyberbeing is offline   Reply With Quote
Old 23rd November 2012, 10:55   #15690  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
Even with v0.85.1, it's Bilinear for you? That's disappointing...
madshi is offline   Reply With Quote
Old 23rd November 2012, 11:14   #15691  |  Link
romulous
Registered User
 
Join Date: Oct 2012
Posts: 179
Quote:
Originally Posted by madshi View Post
I'll put this on my list of things to look at. For now I'm more concerned about fixing all the new bugs with DXVA2 decoding and scaling first.
Thanks madshi, much appreciated!
romulous is offline   Reply With Quote
Old 23rd November 2012, 11:26   #15692  |  Link
leeperry
Kid for Today
 
Join Date: Aug 2004
Posts: 3,477
Quote:
Originally Posted by madshi View Post
Oh, interesting. That somehow differs from how I understood Jan-Willem.

I never said anything was crushed like hell. I just said that with standard settings, using custom shaders with EVR cuts BTB and WTW. That's not a *big* problem, it's probably not visible at all on a properly calibrated display, but I still consider it wrong.
I just compared:
-EVR CP in PotP
-VMR9 in an old build from Casimir666 released in 2007
-EVR CP in the recent 6240 MPC build, whether in regular or float mode

= they all give identical colors with the nightvision script on my rec709 video test pattern!

so if BTB/WTW are trimmed, it all remains a big mystery to me as to why the AVS PS script outputs the exact same colors as ddcc():

untouched:

AVS PS script: ddcc:

there are some slight R/G/B differences in the <16 and >235 regions, but these might be due to different gamma curves or slightly different coeffs... nothing appears to be "cut"

Last edited by leeperry; 23rd November 2012 at 11:29.
leeperry is offline   Reply With Quote
Old 23rd November 2012, 11:35   #15693  |  Link
cyberbeing
Broadband Junkie
 
Join Date: Oct 2005
Posts: 1,859
Quote:
Originally Posted by madshi View Post
Even with v0.85.1, it's Bilinear for you? That's disappointing...
It definitely is Bilinear, but slightly worse quality than the madVR implementation. The only improvement I see in v0.85.1 is that DXVA2 scaling no longer crops off a portion of the frame.
cyberbeing is offline   Reply With Quote
Old 23rd November 2012, 11:47   #15694  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
Quote:
Originally Posted by leeperry View Post
I just compared:
-EVR CP in PotP
-VMR9 in an old build from Casimir666 released in 2007
-EVR CP in the recent 6240 MPC build, whether in regular or float mode

= they all give identical colors with the nightvision script on my rec709 video test pattern!

so if BTB/WTW are trimmed, it all remains a big mystery to me as to why the AVS PS script outputs the exact same colors as ddcc()
Maybe you don't understand what BTB/WTW are or what purpose they have? If BTB/WTW are cut, that has no effect whatsoever on the "valid"/visible colors.

Anyway, could you please give this finally a rest? I've told you many times, this topic is still under discussion between the devs. I will not do anything about this until the discussion has finished. Posting any more about this really doesn't make any sense right now.

Quote:
Originally Posted by cyberbeing View Post
It definitely is Bilinear, but slightly worse quality than the madVR implementation.
Too bad.

So, suggestions for madVR default scaling settings, anyone? I'd like to aim low. Maybe I should be brutal and set everything to bilinear to achieve smooth results on any GPU for first time users? But that would be too extreme, I guess...
madshi is offline   Reply With Quote
Old 23rd November 2012, 11:52   #15695  |  Link
kasper93
MPC-HC Developer
 
Join Date: May 2010
Location: Poland
Posts: 586
@crotecun:
Using the old Microsoft drivers is just wrong :< And you don't need to uninstall CCC to disable the video enhancement features. Just disable everything and turn off demo mode, and that's all.

Last edited by kasper93; 23rd November 2012 at 11:54.
kasper93 is offline   Reply With Quote
Old 23rd November 2012, 11:56   #15696  |  Link
leeperry
Kid for Today
 
Join Date: Aug 2004
Posts: 3,477
Quote:
Originally Posted by madshi View Post
If BTB/WTW are cut, that has no effect whatsoever on the "valid"/visible colors.
Well, I was feeding full-range 0-255 RGB32 to this PS script, so if <16 and >235 were harmed it would show... and it does not; that's my point. It's virtually identical to the 0-255 RGB32 test pattern that was fed to ddcc().

Last edited by leeperry; 23rd November 2012 at 12:56.
leeperry is offline   Reply With Quote
Old 23rd November 2012, 12:06   #15697  |  Link
Prinz
Registered User
 
Join Date: Jul 2011
Posts: 83
Quote:
Originally Posted by madshi View Post
So, suggestions for madVR default scaling settings, anyone? I'd like to aim low. Maybe I should be brutal and set everything to bilinear to achieve smooth results on any GPU for first time users? But that would be too extreme, I guess...
I don't know what a good default setting would be. I can only say I have to set everything to bilinear or my GPU (ATI 2600 XT) will drop frames with some video files.
Prinz is offline   Reply With Quote
Old 23rd November 2012, 12:09   #15698  |  Link
CiNcH
Registered User
 
CiNcH's Avatar
 
Join Date: Jan 2004
Posts: 567
Quote:
* added support for media player color controls (IVMRMixerControl9)
* added support for "IQualProp" interface for media player statistics display
Is there a complete list of MS interfaces that you support somewhere?
__________________
Bye
CiNcH is offline   Reply With Quote
Old 23rd November 2012, 12:22   #15699  |  Link
ryrynz
Registered User
 
ryrynz's Avatar
 
Join Date: Mar 2009
Posts: 3,646
Quote:
Originally Posted by TheLion View Post
In this comparison the DVXA looks ALOT better than madVR Jinc3 to my eyes. I am especially talking about the color/contrast differences. Is the same color matrix used?
Yup, I'll agree some areas look more saturated and "better", but overall I don't like it, mostly because of how overly sharp it is. I'll continue to use Lanczos 3 AR over DXVA.

Quote:
Originally Posted by Prinz View Post
I don't know what a good default setting would be. I can only say I have to set everything to bilinear or my GPU (ATI 2600 XT) will drop frames with some video files.
Once again this falls into final-stage fine-tuning, but having madVR perform a benchmark and auto-select a preset would be quite cool.
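That benchmark-and-auto-select idea could work along these lines (a sketch with made-up preset names and an arbitrary 20% headroom threshold; nothing like this exists in madVR today):

```python
def pick_preset(frame_interval_ms, presets):
    # presets: (name, measured_render_ms) pairs, best-looking first.
    # Pick the best-looking preset whose measured render time still
    # leaves some headroom within one frame interval.
    for name, cost in presets:
        if cost < 0.8 * frame_interval_ms:  # 20% headroom (arbitrary)
            return name
    return presets[-1][0]  # fall back to the cheapest preset
```

For example, at 60fps (about 16.7ms per frame) a preset measured at 12ms would be chosen over one measured at 20ms.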

Last edited by ryrynz; 23rd November 2012 at 12:26.
ryrynz is offline   Reply With Quote
Old 23rd November 2012, 12:34   #15700  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
Quote:
Originally Posted by leeperry View Post
Well, I was feeding full range 0-255 RGB32 to this PS script so if <16 and >235 were harmed, this would show...and it does not, that's my point.
You fed full-range 0-255 RGB32 to the *renderer*, not to the PS script. IIRC, MPC-HC's EVR and VMR treat all RGB input as full range by default, so your test didn't really say anything about BTB/WTW at all. It would have been a useful test only if the <16 and >235 areas of your test pattern were treated as BTB/WTW by the renderer, but I believe they were not. Try feeding 16-235 RGB to MPC-HC's EVR/VMR and you'll see that black turns gray, which shows that with RGB input, MPC-HC's EVR/VMR renderers expect BTB and WTW to lie below 0 and above 255; so basically with RGB input there never is any BTB/WTW.

If you want to really test whether BTB and WTW are cut with EVR with default settings, just apply the following 2 scripts after each other, with no other scripts: (1) "16-235 -> 0-255", (2) "0-255 -> 16-235". Then feed RGB with a gray scale. If BTB and WTW are cut, the darkest gray you should get is 16,16,16. If BTB and WTW are not cut, you should get 0,0,0. And this is exactly what happens with EVR on my PC. EVR produces 16,16,16. madVR maintains the full range down to 0,0,0.
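The two-script test above can be worked through numerically. The conversions below are the standard limited-to-full range formulas, and the clamp models a renderer that truncates to 0-255 between the two shader passes:

```python
def expand(v):
    # 16-235 -> 0-255
    return (v - 16.0) * 255.0 / 219.0

def compress(v):
    # 0-255 -> 16-235
    return v * 219.0 / 255.0 + 16.0

def clamp(v):
    # what a pipeline that truncates between shader passes does
    return min(255.0, max(0.0, v))

# EVR-like result per the test above: black (0) expands to about -18.6,
# gets clamped to 0, then compresses to 16 -> BTB is cut.
clamped = compress(clamp(expand(0.0)))
# madVR-like result: full precision between passes round-trips to 0.
kept = compress(expand(0.0))
```

The same happens at the top end: 255 expands past 255, a clamping pipeline caps it, and the round trip lands at 235 instead of 255, cutting WTW.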

Anyway, as I said about a dozen times now, and as I have already said right in the very v0.85.0 release notes, the levels custom shaders run in are still under discussion. So it really doesn't make much sense to discuss it here in the forum now.

Quote:
Originally Posted by Prinz View Post
I don't know what a good default setting would be. I can only say I have to set everything to bilinear or my GPU (ATI 2600 XT) will drop frames with some video files.
So you get drops with bilinear but not with more demanding scalers? That's weird...

Quote:
Originally Posted by CiNcH View Post
Is there a complete list of MS interfaces that you support somewhere?
Hmmmm... Good question. Probably if you search through the changelog. But let me check the source code. Here you go:

- ISpecifyPropertyPages
- IVideoWindow
- IBasicVideo
- IBasicVideo2
- IKsPropertySet
- IVMRMixerControl9
- IQualProp
- IMFGetService -> IDirect3DDeviceManager9, IDirectXVideoMemoryConfiguration

There might be some more supported by the Microsoft base classes madVR is building on, but I don't think so. If media player devs need more MS interfaces than those listed above, I'm willing to put that on my to do list.
madshi is offline   Reply With Quote
Powered by vBulletin® Version 3.8.11
Copyright ©2000 - 2024, vBulletin Solutions Inc.