Old 17th July 2015, 18:57   #31881  |  Link
Ver Greeneyes
Registered User
 
Join Date: May 2012
Posts: 447
Quote:
Originally Posted by Akeno
Previous versions of madVR had a slight increase in render times with d3d11 enabled but the current version actually lowered render times by a few milliseconds.
Not really sure what d3d9 and d3d11 are but I also wouldn't expect performance to change when updating versions.
A difference in rendering time between d3d9 and d3d11 is somewhat surprising, but the different path may be putting your GPU in a different power state, or it might be an artifact of how rendering times are measured. Unless you can confirm that d3d11 allows you to use settings where you'd be dropping frames with d3d9, I wouldn't pay too much attention to rendering times.
Old 17th July 2015, 19:00   #31882  |  Link
Akeno
Registered User
 
Join Date: Jul 2015
Location: Seattle, Washington
Posts: 53
Quote:
Originally Posted by baii
1. chroma upscaling is known to have little effect on real life material.
2. edit: nvm, thought you said SVP, that won't make a difference.
3. people are working on it
4. does not matter ~~

most of the stuff goes back to "judge with your own eye".
1. I've tried it out with CG and anime material as well but even then I can't see a difference.
2. So I can keep reclock set to original speed?
4. Considering the render times are noticeably different, I would think that it does matter. My question was more about why they differ slightly, though, rather than whether I should set d3d11 on or off.

Judging things for myself is good advice but I always go mad whenever I see people write about how they can notice huge differences in various options while I can't see any. Gives off the feeling that my eyes are broken.

Quote:
Originally Posted by Ver Greeneyes
A difference in rendering time between d3d9 and d3d11 is somewhat surprising, but the different path may be putting your GPU in a different power state, or it might be an artifact of how rendering times are measured. Unless you can confirm that d3d11 allows you to use settings where you'd be dropping frames with d3d9, I wouldn't pay too much attention to rendering times.
It does make a difference: with 30fps material using NNEDI3 at 16 neurons I was dropping frames every few seconds, and d3d11 gives me a few milliseconds of buffer room to prevent that.
On the note of GPU paths: this might have to do with an Optimus issue, but plugging my laptop into my TV results in extremely high render times unless I restart my system. Despite the render times being higher than the frame interval, though, there are no visible dropped frames, nor does madVR report dropped frames or presentation glitches.

Last edited by Akeno; 17th July 2015 at 19:06.
Old 17th July 2015, 19:05   #31883  |  Link
har3inger
Registered User
 
Join Date: Feb 2014
Posts: 139
These are mostly my opinions, but I think the gist is mirrored by many others in this thread:

1. A good number of IPS displays are not true 8 bit and use static dither built into the display's logic to fake 8 bit output from a 6 bit panel. Sometimes it'll look better if you set the display bitdepth for dithering to the true depth of your monitor (6 bit, if applicable), sometimes it looks better at 8, and this depends entirely on your own configuration; you may need to test it empirically on your end. Usually you want to feed a monitor 8 bit info (even if it's a 6 bit panel) because the dither levels decided by madVR may differ from your monitor's, which can cause way more noise than necessary. Unless you know for sure your monitor supports 10 bit color or higher (almost nothing does in a laptop), don't go higher than 8 bit. (There's a toy sketch of why dithering matters at the end of this post.)

Chroma quality differences in live-action film are going to be very hard to distinguish on any monitor unless you are trained well in what to look for. The best way to visualize differences is to try to find pure colors (red is best) on pure black background. Credit rolls and opening logo scenes are best for this. Otherwise, you can set chroma to nearest neighbor and look for the heavily aliased areas to identify places in the image where a chroma upscaler makes a lot of difference. As a sort of related tidbit, since it's hard to tell the difference between chroma upscalers, many people like to get some performance back by setting the upscaling to bicubic 75 AR.

2. Smooth motion should work just fine with 23.976->60. At least, it does for me. However, if you're already able to test with ReClock on and off, it won't hurt to try it and see what you like better. IIRC, ReClock costs a noticeable chunk of computation time, so it may be worth skipping it in favor of more obviously visible madVR features if you're bottlenecked by your laptop hardware.

3. I have no idea how the math of the current SuperRes algorithm works, and it's still very much in testing. IIRC madshi said that he'll be replacing those numerical entries with presets at some point in the future.

4. D3D9 and D3D11 are the same as the DX9 and DX11 that you see for games. They can enable new features, but by themselves shouldn't cause any differences in an image result. D3D11 needs a newer GPU (which you probably have if you have an IPS screen laptop) and enables 10 bit output. Otherwise, it really doesn't matter if you use D3D9 or D3D11. Go with whichever one runs faster unless you need 10 bit output for a 10bit or higher screen.

Edit: RE: your TV plugging issue: Probably some sort of OSD bug, where madvr gets confused about the stats when dealing with switchable graphics. If you don't see frame stuttering or weird jumping, it really shouldn't matter what the stats say. TBH, madvr support for laptops and switchable graphics isn't quite perfect (intel + AMD in laptops has been reported to not work with openCL or nnedi3) so just report whatever bugs you see and move on.
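To illustrate the bit depth point in 1., here's a toy sketch of why dithering hides banding when quantizing 8 bit down to 6 bit. This is not madVR's actual dither algorithm (which is far more sophisticated), just the basic idea:

Code:
import numpy as np

rng = np.random.default_rng(0)

# A smooth 8-bit gradient, one row of pixels.
src = np.linspace(0, 255, 1024)

# Naive truncation to 6 bit: only 64 distinct levels, so visible banding.
banded = (src.astype(np.uint8) >> 2) << 2

# Add random noise of about one quantization step before truncating:
# the local average still tracks the source, so the hard bands turn
# into fine noise instead.
noisy = np.clip(src + rng.uniform(-2, 2, src.shape), 0, 255)
dithered = (noisy.astype(np.uint8) >> 2) << 2

print("distinct levels, banded:  ", len(np.unique(banded)))
print("distinct levels, dithered:", len(np.unique(dithered)))

The point is that both outputs use the same 64 levels; the dithered one just flips neighboring pixels between adjacent levels so the bands disappear.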

Last edited by har3inger; 17th July 2015 at 19:12.
Old 17th July 2015, 19:22   #31884  |  Link
Akeno
Registered User
 
Join Date: Jul 2015
Location: Seattle, Washington
Posts: 53
Thanks for the information, har3inger. I didn't know about the display bitdepth. Is it true even though the iGPU settings state that the monitor is 32bit?

For chroma upscaling, is this true even for CG or anime material? I personally can't see any difference even with extremely colorful and simple anime material unless I push it up to something like 4x zoom.

Regarding smooth motion: a personal preference, but I prefer to use smooth motion with a 60hz display rather than switching the refresh rate to 24hz. The judder is just too obvious at 24hz and smooth motion alleviates it slightly. I suppose an alternative would be to use SVP but I don't know how resource intensive it would be.
Old 17th July 2015, 19:25   #31885  |  Link
James Freeman
Registered User
 
Join Date: Sep 2013
Posts: 919
D3D9: [screenshot of madVR rendering stats]

D3D11: [screenshot of madVR rendering stats]

*Sorry for the huge pics. Shouldn't they shrink like on some other forums?

As you can see, rendering is about 30% faster with d3d11 if d3d9 is the baseline, or about 44% slower with d3d9 if d3d11 is the baseline.
This is in P8 state (GTX660, 324MHz both gpu and memory clocks).
D3D11 definitely got faster in the last few releases.
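Those two percentages describe the same gap measured against different baselines. A quick sketch with made-up round numbers (the real times are in the screenshots above):

Code:
# Hypothetical render times; the real values are in the screenshots above.
d3d9_ms, d3d11_ms = 10.0, 7.0

print(f"d3d11 vs a d3d9 baseline: {(d3d9_ms - d3d11_ms) / d3d9_ms:.0%} faster")
print(f"d3d9 vs a d3d11 baseline: {(d3d9_ms - d3d11_ms) / d3d11_ms:.0%} slower")
# prints 30% faster / 43% slower: the same difference, two baselines.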

When the GPU is in P2 state or full clock speeds, the rendering times are less than 4ms on both.
d3d9= 3.7ms
d3d11= 3.4ms
__________________
System: i7 3770K, GTX660, Win7 64bit, Panasonic ST60, Dell U2410.

Last edited by James Freeman; 17th July 2015 at 19:33.
Old 17th July 2015, 20:13   #31886  |  Link
Ver Greeneyes
Registered User
 
Join Date: May 2012
Posts: 447
Interesting! Well, I'm certainly not going to complain.
Old 17th July 2015, 22:44   #31887  |  Link
Ceremony
Registered User
 
Join Date: Jan 2014
Posts: 93
Again, can someone help me with my AMD APU issue: the GPU clock does not increase while running madVR, so I cannot use higher-quality scalers such as Jinc or resolution doubling: [screenshot]
Old 17th July 2015, 22:58   #31888  |  Link
har3inger
Registered User
 
Join Date: Feb 2014
Posts: 139
Quote:
Originally Posted by Akeno
Thanks for the information, har3inger. I didn't know about the display bitdepth. Is it true even though the iGPU settings state that the monitor is 32bit?

For chroma upscaling, is this true even for CG or anime material? I personally can't see any difference even with extremely colorful and simple anime material unless I push it up to something like 4x zoom.

Regarding smooth motion: a personal preference, but I prefer to use smooth motion with a 60hz display rather than switching the refresh rate to 24hz. The judder is just too obvious at 24hz and smooth motion alleviates it slightly. I suppose an alternative would be to use SVP but I don't know how resource intensive it would be.
8 bit display == 32 bit in Windows display. The 8 bit means 8 bits per color channel, for RGB, totaling 24. There's also an alpha (transparency) channel that also has 8 bits, adding up to 32 bit color. Yeah, it's a weird naming convention.
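The arithmetic, as a quick sketch with made-up pixel values: 3 color channels x 8 bits + 8 alpha bits = 32. Packing one RGBA pixel into a 32-bit word shows where all the bits go:

Code:
r, g, b, a = 200, 120, 40, 255                 # four 8-bit channels
pixel = (a << 24) | (r << 16) | (g << 8) | b   # one 32-bit ARGB word
print(hex(pixel))   # 0xffc87828: 32 bits total, but only 24 carry color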

The chroma differences are more obvious with crisp edges in anime or CG works, but again, hard to discern unless you know exactly where in an image to be looking. A lot of chroma information is well obscured by the luma channel, as it's supposed to be. Better scaling generally shouldn't give you more vivid colors, but more accurate edges between colors and more accurate saturation in thin or small bits of color. Go with a recommended algorithm like jinc3, bicubic75AR or super-xbr 100 if you're in doubt.
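If you want to see the chroma-edge effect in isolation, here's a rough experiment. These are not madVR's scalers; scipy's nearest-neighbor and cubic-spline zoom just stand in for a "bad" and a "good" upscaler:

Code:
import numpy as np
from scipy.ndimage import zoom

# Quarter-resolution chroma plane with a hard color edge (4:2:0 style).
chroma = np.zeros((32, 32))
chroma[:, 16:] = 1.0

nearest = zoom(chroma, 2, order=0)  # blocky, aliased edge
cubic = zoom(chroma, 2, order=3)    # cubic spline: smoother transition

# The two results differ only near the edge; flat areas are identical,
# which is why chroma upscalers are so hard to tell apart on real content.
diff = np.abs(nearest - cubic)
print("max difference:", diff.max())
print("columns that differ:", np.unique(np.nonzero(diff)[1]))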

24 Hz material (or 23.976, the difference should be unnoticeable) on a proper 24 Hz display should be perfect and theoretically better than 60 Hz with smooth motion. If you're seeing judder, something in your setup is not working the way you expect. You may as well just stick with smooth motion, though, because it's so amazing. The only artifact smooth motion introduces is very slightly increased ghosting during in-scene motion and camera pans.

You'll find a lot of the answers to questions here basically come down to "I dunno, play around with the settings and pick something you like best. After all, you're the only one you're deciding settings for" .

Quote:
Originally Posted by Ceremony
Again, can someone help me with my AMD APU issue: the GPU clock does not increase while running madVR, so I cannot use higher-quality scalers such as Jinc or resolution doubling
Force GPU clocks to max for mpchc through CCC if you can. You might need to rename the mpchc executable to something else for this to work (I had to; ATI is stupid and locks certain applications to lower power states or the iGPU and takes away user control). You can also try turning off hardware decoding (DXVA) in LAV Filters; that tends to lock AMD GPUs to lower clocks as well. In general, GPU power states are entirely controlled by your OS and GPU driver settings and how they handle your video player (NOT madVR, which runs inside the video player). There's likely nothing madVR can do to force certain GPU power states, so unfortunately, don't expect support here if troubleshooting doesn't work.

Last edited by har3inger; 17th July 2015 at 23:04.
Old 17th July 2015, 23:07   #31889  |  Link
MS-DOS
Registered User
 
Join Date: Sep 2012
Posts: 77
Quote:
Originally Posted by Ceremony
The GPU clock does not increase while running madVR,
It has nothing to do with MadVR. Disable DXVA in your video decoder and try again.
Old 17th July 2015, 23:25   #31890  |  Link
Akeno
Registered User
 
Join Date: Jul 2015
Location: Seattle, Washington
Posts: 53
Quote:
Originally Posted by har3inger
8 bit display == 32 bit in Windows display. The 8 bit means 8 bits per color channel, for RGB, totaling 24. There's also an alpha (transparency) channel that also has 8 bits, adding up to 32 bit color. Yeah, it's a weird naming convention.

24 Hz material (or 23.976, the difference should be unnoticeable) on a proper 24 Hz display should be perfect and theoretically better than 60 Hz with smooth motion. If you're seeing judder, something in your setup is not working the way you expect. You may as well just stick with smooth motion, though, because it's so amazing. The only artifact smooth motion introduces is very slightly increased ghosting during in-scene motion and camera pans.

You'll find a lot of the answers to questions here basically come down to "I dunno, play around with the settings and pick something you like best. After all, you're the only one you're deciding settings for" .
Just so I'm clear, I have correctly assumed that my monitor is, in fact, an 8bit display based on the 32bit setting. If the settings said 24bit, the monitor would be a 6bit display, and likewise 40bit would equate to a 10bit display, correct?

Unfortunately, my TV doesn't support any interpolation settings when connected to a computer. I'd like to try out the cineflow options on it like another user posted, but oh well.

On a side note: what's the bilateral option? I remember madshi saying it works well on some content but horribly on others.
Old 17th July 2015, 23:57   #31891  |  Link
Ceremony
Registered User
 
Join Date: Jan 2014
Posts: 93
Quote:
Originally Posted by MS-DOS
It has nothing to do with MadVR. Disable DXVA in your video decoder and try again.
Using CPU decoding did nothing to counter the issue, improve performance, increase the clock speed, or anything else along those lines. The issue persists.

That said, it still only manifests with madVR. I highly doubt madVR itself is at fault here; rather, there is clearly a compatibility issue with my setup: madVR desperately needs performance but fails to get it...

Bottom line: how do I fix this?
Old 18th July 2015, 00:18   #31892  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
Reinstall the driver and remove everything that can change or control your GPU's clock; there are so many possibilities.

After that, try renaming your mpc-hc exe.

BTW, your screenshot shows a huge issue with the buggy desktop composition, most likely Windows 7's.
Old 18th July 2015, 00:41   #31893  |  Link
ryrynz
Registered User
 
ryrynz's Avatar
 
Join Date: Mar 2009
Posts: 3,646
Quote:
Originally Posted by huhn
BTW, your screenshot shows a huge issue with the buggy desktop composition, most likely Windows 7's.
With Windows 10 basically weeks away, just prepare for the upgrade and do a fresh install.
Old 18th July 2015, 00:42   #31894  |  Link
e-t172
Registered User
 
Join Date: Jan 2008
Posts: 589
Quote:
Originally Posted by Akeno
Just so I'm clear, I have correctly assumed that my monitor is, in fact, an 8bit display based on the 32bit setting. If the settings said 24bit, the monitor would be a 6bit display, and likewise 40bit would equate to a 10bit display, correct?
No. It's impossible to tell if a display is truly 6-bit or 8-bit just by looking at control panel options (and in fact there is no such thing as "24-bit" in these options). In both cases 8-bit data is sent over the wire, and the GPU has no idea whether the display will show it directly as 8-bit or apply some FRC/dithering first to feed a 6-bit panel. The only way to tell is to actually look at the specifications of the display (and pray they are correct).

Same thing for 10-bit: just because you're sending 10-bit data to a display does not mean the panel is truly 10-bit. In fact, I'm not even sure such panels exist; I've only heard of 8-bit panels where the 10-bit signal goes through some FRC first.
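For reference, FRC (frame rate control) is temporal dithering: the panel alternates between the two nearest levels it can physically show, so your eye averages them into the in-between shade. A toy sketch of the idea (illustrative only; real FRC patterns are spatially and temporally more elaborate):

Code:
# Approximate one 10-bit level on a panel that can only show 8-bit
# levels by alternating across refreshes (temporal dithering / FRC).
target_10bit = 613            # wanted level, 0..1023
base = target_10bit >> 2      # nearest lower 8-bit level: 153
frac = target_10bit & 0b11    # show base+1 in frac of every 4 frames

frames = [base + (1 if i < frac else 0) for i in range(4)]
print(frames)           # [154, 153, 153, 153]
print(sum(frames) / 4)  # 153.25, i.e. 613 / 4: the 10-bit level on average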
Old 18th July 2015, 02:21   #31895  |  Link
dansrfe
Registered User
 
Join Date: Jan 2009
Posts: 1,210
Quote:
Originally Posted by e-t172
No. It's impossible to tell if a display is truly 6-bit or 8-bit just by looking at control panel options (and in fact there is no such thing as "24-bit" in these options). In both cases 8-bit data is sent over the wire, and the GPU has no idea whether the display will show it directly as 8-bit or apply some FRC/dithering first to feed a 6-bit panel. The only way to tell is to actually look at the specifications of the display (and pray they are correct).

Same thing for 10-bit: just because you're sending 10-bit data to a display does not mean the panel is truly 10-bit. In fact, I'm not even sure such panels exist; I've only heard of 8-bit panels where the 10-bit signal goes through some FRC first.
But you can sort of eyeball gradient patterns on images in exclusive mode to check, right? Or is that unreliable too?

For the record, I did try the image test as the other thread on the topic suggested and although it suggests the screen is 10-bit with a 10-bit signal, I find it hard to believe since 10-bit screens are supposed to be crazy expensive and mine is not.
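If anyone wants to repeat that test, a ramp image is easy to generate. Here is a sketch assuming numpy and Pillow are available, and assuming your viewer passes 16-bit grayscale PNG through without its own dithering (the filename is made up):

Code:
import numpy as np
from PIL import Image

# A horizontal ramp of all 1024 10-bit gray levels, shifted into a
# 16-bit container so PNG can carry it (10-bit value << 6).
ramp10 = np.tile(np.arange(1024, dtype=np.uint16), (256, 1))
Image.fromarray(ramp10 << 6, mode="I;16").save("ramp10.png")
# On a true 10-bit chain every column differs; on an 8-bit chain,
# groups of 4 adjacent columns collapse into visible bands.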
Old 18th July 2015, 04:40   #31896  |  Link
Ver Greeneyes
Registered User
 
Join Date: May 2012
Posts: 447
@madshi: I've noticed something interesting about D3D9 windowed mode: it seems to deal with repeated frames much worse than the new D3D11 path. I thought this was an artifact of livestreamer, but whenever a stream drops frames, the D3D9 path seems to show a much older frame (the oldest in the queue?), which looks very glitchy. The D3D11 path, on the other hand, just stutters a little, which is how I would expect it to look.

This behavior might only occur when the source is dropping frames, i.e. not when madVR's queues run empty because the system can't keep up. If the queues are empty it presumably just repeats the last frame it had. Is there an error in the logic here? Is it pulling up the wrong frame to repeat when the queues are filled up?

Edit: I think this behavior only happens when "present several frames in advance" is checked in the windowed mode settings. It seems to just stutter with that unchecked, like with the D3D11 path.
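To make the suspicion concrete, here's a tiny sketch of how repeating from the wrong end of a full present queue would look. This is pure speculation about madVR's internals, not its actual code, and the frame names are made up:

Code:
from collections import deque

# Frames queued for presentation, oldest first.
queue = deque(f"frame {i}" for i in range(101, 109))  # 8 frames buffered

# The source drops a frame, so something has to be repeated:
oldest = queue[0]   # repeating this jumps visibly backwards (glitchy)
newest = queue[-1]  # repeating this just looks like a small stutter

print("glitchy:", oldest, "| mild stutter:", newest)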

Last edited by Ver Greeneyes; 18th July 2015 at 06:35.
Old 18th July 2015, 06:27   #31897  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
After viewing some more content with image sharpening enabled, I have some more observations:
  • Image sharpening is very subjective compared to image upscaling. A subtle change in settings can go from being too sharp to just sharp enough. Others may disagree.
  • As a result of this subjectivity, I would suggest adding as many presets as possible. A Low, Med, High seems too restrictive. 1-10 would be much better, or, instead, a Beginner and Expert setting. It would be very challenging to get the settings just right for every display.
  • SuperRes is the most natural sharpener. It would be nice if this shader could be added to Image Enhancements somehow for consistency's sake.
  • If not, FineSharp, what I would consider the best sharpener in Image Enhancements, is in dire need of an anti-ringing filter.
  • This is my primary criticism of image sharpening; it seems almost useless in preserving the illusion of a natural image without some type of anti-ringing.
Old 18th July 2015, 06:52   #31898  |  Link
James Freeman
Registered User
 
Join Date: Sep 2013
Posts: 919
How do you think 50fps film on a 120Hz monitor with SmoothMotion will perform/look?
The ratio for 24fps on a 60Hz monitor is 2.5, so for 50fps on 120Hz it will be 2.4.
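For what it's worth, smooth motion blends the two source frames that straddle each refresh, weighted by where the refresh falls between them, so any fractional ratio works; 2.4 just repeats its blend pattern every 12 refreshes instead of every 5. A sketch of the general frame-blending idea (my reading of the technique, not madVR's exact math):

Code:
def blend_schedule(fps, hz, n_refreshes=12):
    """For each refresh, print which source frames are blended and how."""
    for r in range(n_refreshes):
        t = r * fps / hz  # position in source-frame time
        a = int(t)        # earlier source frame
        w = t - a         # weight of the later frame
        if w == 0:
            print(f"refresh {r:2}: frame {a} shown unblended")
        else:
            print(f"refresh {r:2}: {1 - w:.2f} * frame {a} + {w:.2f} * frame {a + 1}")

blend_schedule(24, 60)   # ratio 2.5: pattern repeats every 5 refreshes
blend_schedule(50, 120)  # ratio 2.4: pattern repeats every 12 refreshes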
__________________
System: i7 3770K, GTX660, Win7 64bit, Panasonic ST60, Dell U2410.
Old 18th July 2015, 07:08   #31899  |  Link
Schwartz
Registered User
 
Join Date: Dec 2012
Posts: 67
That's the reason I haven't touched sharpening either. I'd rather take a softer picture without extra artifacts. Remember ffdshow's xsharpen? Now that would be a filter I'd like to see in madVR some time. It was fairly lightweight and it even reduced artifacts.
Old 18th July 2015, 08:08   #31900  |  Link
Dogway
Registered User
 
Join Date: Nov 2009
Posts: 2,352
To answer Schwartz above (and despite troll "Ver Greeneyes" calling me a troll, which goes to show who lacks language comprehension), I want to dig further into the sharpening method I discussed in my (apparently content-void) post.

If we are able to do smooth motion in real time, we are only one step away from doing motion-guided sharpening, or in other words motion-compensated sharpening, which would enhance sharpness where it is needed (static high frequencies) and thus avoid sharpening grain, artifacts, etc. As it is now we only have spatial sharpeners, save for SuperRes, which we have no way of knowing what it really does.