Old 6th June 2014, 00:56   #26581  |  Link
TheDarkTemplar
Registered User
 
Join Date: Dec 2011
Posts: 23
Quote:
Originally Posted by Soukyuu View Post
I think you might have to backtrack to the madvr version which doesn't cause those artifacts for you, so that madshi might narrow down the code change which is causing it.
I tried progressively older versions of madVR and the blur was present in all of them, so it wasn't an issue with updating madVR. However, there was no blur when using any other renderer, so I figured it must be an issue in an update to either LAV or MPC-HC that only interacts badly with madVR.

So, I tried an older version of MPC-HC with the latest madVR and the blur disappeared. I then figured I would try to find the cutoff of which version the issue starts on. Once I found it, I noticed the first one that started having the problem was also the first one that was retaining my settings of the latest version of MPC-HC.

I then reset MPC-HC back to default settings and it eliminated the problem completely, using the latest versions of MPC-HC, madVR, and LAV. The madVR and LAV settings didn't change and are still at what I had them at, so I must have had some really strange setting set in MPC-HC that was reacting badly with madVR. I went through all the MPC-HC settings though and I can't imagine what it could have possibly been. I don't know what setting was even different now that I've set everything to what I want again and nothing seems like it would cause issues of that sort. I guess I will never know...

Anyway, I am just glad this extreme annoyance is gone and I can finally watch videos clearly again. Thanks for all the help!

Last edited by TheDarkTemplar; 6th June 2014 at 01:14.
TheDarkTemplar is offline   Reply With Quote
Old 6th June 2014, 02:27   #26582  |  Link
G_M_C
Registered User
 
Join Date: Feb 2006
Posts: 1,076
Quote:
Originally Posted by huhn View Post
How should that help madVR? In the end it is rendered in 8 bit.
That's not the point: how's the NNEDI3 performance with madVR under 14.6? If it's bad, I won't upgrade and will wait until the next version. If NNEDI3 performance is OK, I'll upgrade and get the color-depth controls I hope to use. So the question is related to madVR in the sense of NNEDI3 performance ...

Plus it's a sign that madshi could start working on 10-bit output, because it looks like high-bitdepth controls and usability are finally coming to consumer-level video cards.

Last edited by G_M_C; 6th June 2014 at 02:31.
G_M_C is offline   Reply With Quote
Old 6th June 2014, 06:49   #26583  |  Link
QBhd
QB the Slayer
 
QBhd's Avatar
 
Join Date: Feb 2011
Location: Toronto
Posts: 697
Quote:
Originally Posted by G_M_C View Post
That's not the point: how's the NNEDI3 performance with madVR under 14.6? If it's bad, I won't upgrade and will wait until the next version. If NNEDI3 performance is OK, I'll upgrade and get the color-depth controls I hope to use. So the question is related to madVR in the sense of NNEDI3 performance ...

Plus it's a sign that madshi could start working on 10-bit output, because it looks like high-bitdepth controls and usability are finally coming to consumer-level video cards.
The only way to find out if performance takes a hit on your system is to try it... it takes very little effort.

Some people have claimed no hit, others have claimed there is a hit. I myself reverted back to 13.12

QB
__________________
QBhd is offline   Reply With Quote
Old 6th June 2014, 13:48   #26584  |  Link
Procrastinating
Registered User
 
Procrastinating's Avatar
 
Join Date: Aug 2013
Posts: 71
I noticed a slight performance dip from the previous AMD drivers, but no change with these drivers. It wasn't enough that I bothered to revert, however.
Procrastinating is offline   Reply With Quote
Old 6th June 2014, 18:17   #26585  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
Quote:
Originally Posted by Anime Viewer View Post
What were the problems experienced by nVidia Optimus systems? Given I've got an Optimus system, I can see if it's still an issue, so Optimus users will know if it's best to continue to leave them unchecked.
As far as I remember, using the "use separate device for presentation" option didn't work properly. I think it produced a black screen instead of video output, or something like that.

Quote:
Originally Posted by Nico8583 View Post
I would like to play my 1080p MKV with MPC-HC or Potplayer (I haven't really tested them, but that's another subject) without any filter (resize & co).
I have a Gigabyte Brix with a 2955U CPU and Intel HD Graphics, and I would like to know if there is a difference between madVR and others like LAV Filters and FFDShow, if I don't use any filter?
There are still differences in the following areas:

(1) Chroma upsampling (always needed!).
(2) Color conversion (always needed).
(3) Dithering (always needed).
(4) Smooth Motion FRC (you may or may not need this).
(5) Auto refresh rate switching (you may or may not need this).

Some of these differences can be minor, some can be major. It's possible you won't see/notice any difference. Or it could be a dramatic difference. So just compare yourself.
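To give a rough idea of what steps (1)-(3) involve, here is a simplified Python/NumPy sketch (nearest-neighbour chroma upsampling, BT.709 limited-range conversion, random dithering). This is purely illustrative and nothing like madVR's actual, much more sophisticated algorithms:

Code:
import numpy as np

def render_frame(y, cb, cr, out_bits=8, rng=None):
    """Toy pipeline: chroma upsampling -> YCbCr-to-RGB -> dithered quantization.
    y is an HxW 8-bit limited-range luma plane, cb/cr are (H/2)x(W/2) chroma planes."""
    rng = rng or np.random.default_rng(0)

    # (1) Chroma upsampling: nearest-neighbour repeat (madVR offers far better filters)
    cb_up = np.repeat(np.repeat(cb, 2, axis=0), 2, axis=1).astype(np.float64)
    cr_up = np.repeat(np.repeat(cr, 2, axis=0), 2, axis=1).astype(np.float64)

    # (2) Color conversion: BT.709 limited-range YCbCr -> full-range RGB, in float
    yf = (y.astype(np.float64) - 16.0) / 219.0
    pb = (cb_up - 128.0) / 224.0
    pr = (cr_up - 128.0) / 224.0
    r = yf + 1.5748 * pr
    g = yf - 0.1873 * pb - 0.4681 * pr
    b = yf + 1.8556 * pb
    rgb = np.clip(np.stack([r, g, b], axis=-1), 0.0, 1.0)

    # (3) Dithering: 1 LSB of random noise before rounding to the output bitdepth
    levels = (1 << out_bits) - 1
    noise = rng.random(rgb.shape) - 0.5
    return np.clip(np.round(rgb * levels + noise), 0, levels).astype(np.uint8)

Every renderer has to do some version of those three steps; the quality differences come from how cleverly each one is done.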

Quote:
Originally Posted by livache View Post
* "Pause" OSD message no longer blocked

could this be made optional, please?
It displays the paused OSD at the end of playback in mpc-hc instead of just black fullscreen.
That can cause burn-in on a plasma if left unsupervised for many hours (like falling asleep and forgetting to set the timer). MPC-HC doesn't show that paused OSD with any other renderer, and probably for good reason.
This sounds like a bug to me. At the end of playback the message should automatically be removed. Can you please add a bug entry to the madVR bug tracker?

Quote:
Originally Posted by turbojet View Post
There isn't, I brought it up here because of a good chance that it might be something only I'm experiencing, like so many things in the past. I'll add it to the bugtracker if others experience the same thing. Do they?

I'm also experiencing random slow motion when ivtcing 59 to 23. No dropped frames or cadence breaks and the only way to get it back to normal speed is to seek or pause/play. If I rewatch where slow motion first started it plays fine. Anyone else experiencing this?
Seems nobody else has reacted, so it might make sense to add this to the bug tracker with detailed information and maybe a sample file. You can add a comment that you're not sure if it's only a bug that occurs on your PC. That makes it easier for me to simply close the bug report if I can't reproduce the problem myself.

Quote:
Originally Posted by StinDaWg View Post
+1 on adding a "force film" type option for progressive content like you have for deinterlacing.
Noted.

Quote:
Originally Posted by seiyafan View Post
I cannot tell any difference between nearest neighbor and NNEDI3 for chroma upscaling, but the difference between them for luma doubling is noticeable.
Depends on the scene, though. E.g. try those scenes in Battlestar Galactica in the Cylon motherships where the whole image is all red and black. In scenes like that different chroma upsampling algorithms can make a noticeable difference...

Quote:
Originally Posted by SecurityBunny View Post
I believe I have found an issue with the new windowed mode.

Randomly, halfway through watching a video, the render queue drops to half of the present queue. Playback starts fine, all queues fill up, no dropped frames, rendering is ~27ms for 23.976 playback. At random, usually after some minutes, the video playback glitches. If you are using ReClock, audio starts to pop constantly and frames drop until you pause the video. The render queue was playing at 19-20/20, but drops to 7-8/20. I have tried raising and lowering all queue sizes, but it seems to always drop to half of what 'frames in advance' is set to. Pausing and starting video playback again refills the queues to the full values, but it drops again after some time. Occurs with and without ReClock. (Currently not using ReClock.)

I should also note that the rendering ms stats begin to slowly climb upward when the render queue is bugged at 7-8/20. For me, it began at 27ms, but climbed upwards to 40ms before I stopped and restarted the video.

Switching to fullscreen exclusive mode, this bug does not occur and the render queue remains full all the time.

MPC-HC 1.7.4.15
Nvidia Driver 337.61
MadVR 0.87.10
Display: 2560x1440
Windows 8.1

MadVR Settings.
No 'trade quality for performance' options.
No 'general settings' options.
CPU queue 24, GPU queue 20.
Windowed mode, 16 frames in advance.
Smooth motion 'only if...'
ED2 dithering with both options checked.
Debanding medium/high.
NNEDI3 32 / NNEDI3 32 luma double / Jinc3 AR / Catmull-Rom AR LL

I have also noticed that if the CPU and GPU queue are the same size, the 'render queue' will be one number off.

Example with CPU queue 20, GPU queue 20, frames in advance 16.

decoder queue: 19-20/20
upload queue 19-20/20
render queue 18-19/20
present queue 15-16/16

Example with CPU queue 24, GPU queue 20, frames in advance 16.

decoder queue: 23-24/24
upload queue 19-20/20
render queue 19-20/20
present queue 15-16/16
Thanks for the detailed report. Which GPU is that?

Does anybody else have this problem?

Quote:
Originally Posted by Knight77 View Post
Guys today which do you prefer to run MPHC+madVR and XBMC: Windows 7 or 8?
Personally, I prefer Windows 8 because desktop composition works *much* better. Some users still have problems with refresh rate switching in Windows 8, though (e.g. getting 23.976Hz instead of 24.000Hz). And you need to add a start menu replacement. E.g. try Classic Shell.

Quote:
Originally Posted by Stereodude View Post
Shouldn't deintFps be 24FPS for 720p60 material that is detected as film mode 6:4?
I don't remember: Which value does deintFps have if you activate forced film mode for 60i telecined movie content? Is it 24fps? I guess it would make sense and should then also apply to decimated 60p content.

Quote:
Originally Posted by VincAlastor View Post
I'm really impressed with the smooth motion algorithm! For me it's pure magic! Thank you!
I've searched for an AviSynth filter like this for a long time, but nothing was as smooth as this algorithm in madVR.
Is there a way to use it in AviSynth/VapourSynth?
The madVR algorithm adjusts itself to the GPU VSync position all the time. You can't do that with AviSynth etc. It should be easy enough to convert 24fps to 60fps with something similar to smooth motion frc in AviSynth, by using a fixed blend pattern. But I don't think it makes a lot of sense to do this for encoding. Realtime playback is the ideal moment to do that, IMHO.
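For the curious, a fixed blend pattern for an exact 24->60 conversion could be worked out along these lines. This is my own reading of frame-blending FRC, weighting each source frame by how much of the output refresh interval it covers - an illustrative sketch, not madVR's actual code:

Code:
from fractions import Fraction

def overlap_blend_pattern(src_fps=24, dst_fps=60, n_out=5):
    """For each output frame, the weight of every source frame that overlaps it,
    based on how much of the output frame's display interval each source covers."""
    pattern = []
    for n in range(n_out):
        t0, t1 = Fraction(n, dst_fps), Fraction(n + 1, dst_fps)
        weights = {}
        m = int(t0 * src_fps)                      # first source frame overlapping t0
        while Fraction(m, src_fps) < t1:
            s0, s1 = Fraction(m, src_fps), Fraction(m + 1, src_fps)
            overlap = min(t1, s1) - max(t0, s0)
            if overlap > 0:
                weights[m] = float(overlap / (t1 - t0))
            m += 1
        pattern.append(weights)
    return pattern

# 24 -> 60 gives a repeating five-frame pattern:
#   {0: 1.0}, {0: 1.0}, {0: 0.5, 1: 0.5}, {1: 1.0}, {1: 1.0}
for w in overlap_blend_pattern():
    print(w)

With 23.976fps content on a real display the ratio is never exactly 2.5, which is why a fixed table like this only approximates what a vsync-tracking renderer does at playback time.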

Quote:
Originally Posted by chros View Post
@Madshi: wooow! Thanks for the profile support!!!


Quote:
Originally Posted by Meulen92 View Post
- What would be preferred next to 256 neuron Luma doubling: 64 neuron Chroma doubling or 64 neuron Luma quadrupling?
I think luma quadrupling would be more beneficial than chroma doubling. But it depends on the scaling factor. With a 2.1x scaling factor, luma quadrupling has barely any benefit. With a 4.0x scaling factor, luma quadrupling should be moderately useful. Chroma doubling will always have only a very minor effect, but this very minor benefit is already there with a 2.0x scaling factor.
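As a rough back-of-the-envelope illustration of the scaling-factor argument (my own arithmetic with hypothetical resolutions):

Code:
def upscale_plan(src_w, src_h, factor):
    """Target size versus what NNEDI3 doubling (2x) or quadrupling (4x) delivers,
    i.e. how much work is left for the conventional scaler afterwards."""
    target = (round(src_w * factor), round(src_h * factor))
    after_double = (src_w * 2, src_h * 2)
    after_quadruple = (src_w * 4, src_h * 4)
    return target, after_double, after_quadruple

for factor in (2.1, 4.0):
    target, d2, d4 = upscale_plan(720, 480, factor)
    print(f"{factor}x -> target {target}: doubled {d2}, quadrupled {d4}")

# At 2.1x the doubled image is already almost target-sized, so quadrupling only to
# scale back *down* again buys next to nothing. At 4.0x quadrupling replaces an
# entire extra 2x of conventional upscaling, so it can help noticeably more.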

Quote:
Originally Posted by Meulen92 View Post
- At what number of neurons will NNEDI generally beat Jinc3AR in chroma upscaling?
Might already be with 16 neurons, but give it a try yourself and trust your eyes. You'll probably not see much of a difference.

Quote:
Originally Posted by iSunrise View Post
VESA finally approved adaptive-sync for the masses!

While this wouldn't make ReClock completely obsolete, this is a major step we have all been waiting for, not just for use with games, but also for madVR and judder-free video playback without having to worry (hopefully) about custom refresh rate tweaking.
I'm not sure if this will be useful for madVR. If Direct3D will allow madVR to define/fine tune the refresh rate, then it would be very useful. But I somehow doubt that's how it's going to be. I rather think games will present fully rendered frames as quickly as possible and the GPU driver will turn that into "adaptive-sync" somehow. A solution like that would probably not be useful for madVR.

Quote:
Originally Posted by cyberbeing View Post
Use of NNEDI3 Luma Doubling & Chroma Upscaling together can make a rather significant difference on SD content and below. Chroma Doubling is a much tougher sell considering its high cost. The majority of the time, you are better off finding your highest usable Luma Doubling setting, and only then considering use of NNEDI3 chroma if it can fit easily within the remaining GPU headroom.
Agreed. Luma Doubling first, everything else after that.

Quote:
Originally Posted by Devrim View Post
Anyone else experiencing frame drops reported by mpc-hc when turning on smooth motion in fullscreen?

The video looks smooth to me but mpc-hc reports about 140 frames dropped on 2000 frames drawn.
Don't see that on my PC.

Quote:
Originally Posted by cyberbeing View Post
Not particularly surprising, since 'new path' Windowed is the slowest display mode in madVR, while 'old path' Windowed is the second fastest behind Overlay. Windows 8.1 handles the 'new path' Windowed mode significantly better, but I still found it slower than 'new path' FSE, which seems to handle presenting in advance much better, especially at higher refresh rates and GPU loads.
Quote:
Originally Posted by cyberbeing View Post
YMMV, then I guess. Windowed 'new path' is definitely the slowest mode on my GTX770. At very high GPU loads I experience massive presentation glitches, which do not occur with FSE.
Which refresh rate are we talking about? Are you still using ultra-high refresh rates? That's a very important factor in judging which mode is the "fastest". The old windowed mode and Overlay mode do the least amount of work for very high refresh rates. The new windowed/FSE modes have to do extra work to handle very high refresh rates. But for "normal" refresh rates, e.g. 23.976Hz for Blu-Ray playback, I don't see why the old mode should be "faster" than the new mode.
madshi is offline   Reply With Quote
Old 6th June 2014, 18:22   #26586  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
Quote:
Originally Posted by StinDaWg View Post
Should I be using IVTC on hard telecined 1080i29 videos? It works fine with soft telecined, but with hard I get occasional stuttering, a few frames here and there per episode. I suppose this is madVR algorithm not handling it correctly. The weird thing is it does not happen every time on the same frames. If I see a small stutter I can quickly rewind it and it might not stutter again at the same spot. It's not dropping frames though according to the OSD. If I play it back with regular deinterlacing the stuttering goes away.
Is this with 23.976Hz? And you really don't see any indication in the OSD about the small stutter? Are you using FSE mode or windowed mode? Does the cadence detection show a cadence break at that moment? If not, I'd rather say that the problem probably is not related to IVTC at all, but it's probably something else...

And yes, you should use IVTC for both soft- and hard-telecined content. And for content which switches between soft- and hard-telecine all the time.

Quote:
Originally Posted by Conquerist View Post
I have two feature requests.

Feature request #1: Keyboard shortcut for turning subtitles on and off when using XySubfilter, similar to "On/Off Subtitle" in MPC-HC.

Feature request #2: When switching subtitle tracks, show the track's title in the OSD.
IMHO these are feature requests that should be implemented in the media player or in the subtitle renderer, but not in madVR.

Quote:
Originally Posted by Niyawa View Post
I found what was making my madVR go haywire. It seems that... the option in "devices" > your monitor > called "native display bit depth" was the culprit. Since I use a TN monitor I'd always assumed 6-bit was best for it, but the moment I put it back to 8-bit everything returned to how it was. I can't even see the increase that cyber told me I'd get for using the new path.

It's funny though, that option didn't make any difference in the old path, but the new one made it useless for me. I wonder why that is...
Newer versions do some extra work if you choose a bitdepth lower than 8bit. Basically the dithering gets gamma corrected. That does cost a bit of extra performance.

Quote:
Originally Posted by Ceremony View Post
question: why is there no 64bit version of madvr?
Because it would cost quite a bit of development effort with no dramatic improvements. Ok, so CPU usage might be slightly lower in some situations, but that's not reason enough for me to look at 64bit support right now. I have a lot of things on my to do list, and I try to do the most important (or the things that I'm personally interested in) first. 64bit support is likely to come at some point, but not soon, because there are still many things I consider more important at this point.

Quote:
Originally Posted by kolak View Post
Oops - I thought it could do 10bit. Shame, but it looks like Iris won't do it anyway.
So it means there is no way to pass 10bit video over DisplayPort or HDMI through some player at all?
Try the following experiment:

(1) In the madVR device settings, set your display to a native bitdepth of 4bit.
(2) Play a video. You'll notice the dithering, but the video is still mostly watchable, isn't it?
(3) Increase the bitdepth to 5bit. Play the video again. It's a nice improvement over 4bit, isn't it?
(4) Increase the bitdepth to 6bit. Play the video again. Do you still see a dramatic improvement? Yes, it's better than 5bit, but the difference is much smaller compared to 4bit->5bit, right?
(5) Increase the bitdepth to 7bit. Play the video again. Do you see any difference to 6bit at all?
(6) Increase the bitdepth to 8bit. Play the video again. Do you see any difference to 7bit at all?

After doing those tests you should see that the higher the bitdepth, the lower the quality gain is. Going from 8bit to 10bit will probably be a smaller improvement than going from 7bit to 8bit.

Personally, I think the quality improvement from 8bit to 10bit will be rather small. I'll still add 10bit support at some point, but I don't consider it terribly important at the moment, because madVR's dithering quality is very very high. Higher than any other software or hardware device in the movie playback industry, as far as I know...
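The diminishing returns are easy to see numerically. A quick Python sketch assuming a plain uniform quantizer (just the arithmetic, nothing to do with madVR's dithering itself):

Code:
import numpy as np

ramp = np.linspace(0.0, 1.0, 100_000)          # a perfectly smooth 0..1 gradient

for bits in range(4, 11):
    levels = (1 << bits) - 1
    step = 1.0 / levels                        # quantization step size
    quantized = np.round(ramp * levels) / levels
    max_err = np.abs(quantized - ramp).max()   # roughly step / 2
    print(f"{bits:2d} bit: step = {step:.6f}, max error = {max_err:.6f}")

# Each extra bit halves the step size, so the absolute gain shrinks every time:
# the 7bit -> 8bit improvement is already tiny, and 8bit -> 10bit shaves off even
# less - which is why well-dithered 8bit output already looks so close to ideal.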

Quote:
Originally Posted by leeperry View Post
Oh, kinda late to the party but quad NNEDI is double NNEDI basically? On 576x320p25, is that wiser for luma to do 256x double only, 256x double + 32x quad or 128 double + 64x quad? I guess I'll have to put my eagle eye glasses on and run comparisons with fresh eyes

PS: Humm, I think 256+32 looks sharper, yay! Wish I could do 256+64
IMHO, put your resources into luma doubling (maybe a little into chroma upscaling). When you have something left, you can try luma quadrupling.

Quote:
Originally Posted by Danne S View Post
Is there any reason to use chroma double on SD content? Or is it just waste of resources?
From what I've seen it's mostly a waste of resources. But maybe in some specific situation it could be helpful, I don't know. I'd rather up your luma doubling and chroma upscaling settings instead.

Quote:
Originally Posted by Danne S View Post
On luma double, is there any significant improvement when going from 128 to 256 on SD content?
Depends on the content. I've noticed that higher neuron counts can help getting edges with "difficult" angles upscaled better (with less aliasing). The softer your source, though, the less difference you will see. Even with sharp sources there might be next to zero difference, if the edges in the image all have easy to handle angles.

Quote:
Originally Posted by MysteryX View Post
How does madVR work with Windows' "Change the size of all items" feature, such as when display is 120% bigger on 1080p TV?
I've no idea, never tested that. Might also depend on the OS.

Quote:
Originally Posted by Anime Viewer View Post
Setting a high refresh rate (60 or 59) with Smooth Motion enabled still produces noticeable blur, as shown in: http://i62.tinypic.com/73e23b.jpg
Showing a screenshot doesn't really mean anything. Smooth motion frc is based on frame blending, so even if you have a refresh rate of 1000Hz, there will still be some frames that look blurred like your screenshot. However, the higher the refresh rate is, the shorter those blurred frames will be visible on screen. The human eye isn't terribly fast with motion. If you activate smooth motion frc with 30Hz, such a blurred frame will be shown for 33.3ms. That's a long time and our eyes may notice that blur. But with 1000Hz, that blurred frame will only be on screen for 1ms, while most other frames shown will be totally blur-free. So the higher the refresh rate, the less blur you should see.
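A small sketch of that arithmetic, under the simplifying assumption that a refresh only looks blurred when its display interval straddles a source-frame boundary (plain frame blending, my own simplification):

Code:
from fractions import Fraction

def blur_stats(refresh_hz, src_fps=Fraction(24000, 1001)):
    """On-screen time of one refresh, and how many refreshes per second straddle
    a source-frame boundary and therefore show a blended (blurred) image."""
    refresh_ms = 1000.0 / refresh_hz
    blended = 0
    for n in range(refresh_hz):
        t0, t1 = Fraction(n, refresh_hz), Fraction(n + 1, refresh_hz)
        next_boundary = Fraction(int(t0 * src_fps) + 1) / src_fps
        if next_boundary < t1:                 # a source frame changes mid-refresh
            blended += 1
    return refresh_ms, blended

for hz in (30, 60, 120, 1000):
    ms, blended = blur_stats(hz)
    print(f"{hz:4d} Hz: each refresh lasts {ms:6.2f} ms, ~{blended}/{hz} refreshes blended")

# Higher refresh rates don't reduce the number of blended frames per second, but
# each one stays on screen for a much shorter time and unblended refreshes dominate,
# so the blur becomes much harder to see.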

Quote:
Originally Posted by Stereodude View Post
Does anyone else experience that madVR will get all discombobulated and start stuttering or playing at a reduced framerate sometimes when playing back 720p60 content with a 6:4 film IVTC? It happens with ATSC broadcast HDTV mkv files at the point where commercials are removed. Cycling out of Film mode and back in with CTRL+Shift+Alt+T fixes the issue. The files play back fine (smoothly) through the commercial cuts without the IVTC. The OSD doesn't seem to indicate any issues when it starts stuttering.
Quote:
Originally Posted by turbojet View Post
I do, Stereodude; it's not always at ad breaks for me, it's often in the middle of a scene. It's usually at least 15 minutes into the video and it looks like slow motion. Seeking fixes it and going back over the scene works fine. I'll add it to the bugtracker.
Quote:
Originally Posted by StinDaWg View Post
I have this issue and it usually happens when the OSD switches to "unknown cadence" and doesn't stop stuttering until it locks onto 6:4 again (usually seconds or sometimes minutes later). It's not during commercials either but rather anytime randomly in the middle of a scene like turbojet.
Samples, please. Lots of samples, please. And please cut them generously, so that there's enough smooth playback before the problem starts, and enough playback time with the problem visible. Thanks!

Quote:
Originally Posted by leeperry View Post
FWIW I've mentioned the glitches I had with NNEDI on the 270 and the increased lag with 14.4 to a local hardware review website that's in direct contact with AMD, and the owner told me that he would demand answers. They stayed on top of things regarding the 7870 black screen drama, so there might be hope that somebody at AMD will stop this copyback nonsense.
Yes, please.

Quote:
Originally Posted by pie1394 View Post
Dithering is not used since it could affect the FALD TV set's deep black performance a little bit.
When did you notice this? Dithering has changed a lot in recent builds. The new default (ordered dithering) should avoid problems with deep black performance. Maybe you should try again?

Quote:
Originally Posted by mysterix View Post
When should I use the "use OpenCL to process DXVA NV12 surfaces (Intel, AMD)" option? Which case is less expensive for the video card? Does OpenCL use both CPU and GPU in this case?
It's probably not a good option to use for AMD users with current drivers. For Intel users it might be beneficial. But I don't really know for sure. If you're an Intel user, just give this option a try and report back which works better for you.

Quote:
Originally Posted by XMonarchY View Post
I feel like madVR is now more or less complete. I just don't see how anything else could considerably improve video rendering quality...
Oh, I still have some ideas...

Quote:
Originally Posted by XMonarchY View Post
Is there an overall roadmap for madVR development?
Not a public one, no.

Quote:
Originally Posted by ikakun View Post
It can help reduce banding in some monitors/TVs.

In my case, using drivers prior to 14.6, my TV has awful banding when I set it to 8-bit (or higher) in madVR's device properties settings. Even though my TV's spec says it's 8-bit, I still needed to set it to 6-bit there and let Error Diffusion option 2 do the magic. That way, my TV can do smooth color transitions with much less banding, but with a slight increase in noise.

With the introduction of the new feature in the Catalyst 14.6 beta driver, I can get much less banding by setting 8bpc in the new TV panel setting in Catalyst and leaving the madVR device property at 8-bit. The end result is much less banding, without noticeable added noise that isn't in the source video.
Let me guess: You have your GPU set to output 16-235? My best guess is that AMD might now also apply dithering, based on this new bitdepth control in the GPU control panel. Just a guess, though...

Quote:
Originally Posted by TheDarkTemplar View Post
I tried progressively older versions of madVR and the blur was present in all of them, so it wasn't an issue with updating madVR. However, there was no blur when using any other renderer, so I figured it must be an issue in an update to either LAV or MPC-HC that only interacts badly with madVR.

So, I tried an older version of MPC-HC with the latest madVR and the blur disappeared. I then figured I would try to find the cutoff of which version the issue starts on. Once I found it, I noticed the first one that started having the problem was also the first one that was retaining my settings of the latest version of MPC-HC.

I then reset MPC-HC back to default settings and it eliminated the problem completely, using the latest versions of MPC-HC, madVR, and LAV. The madVR and LAV settings didn't change and are still at what I had them at, so I must have had some really strange setting set in MPC-HC that was reacting badly with madVR. I went through all the MPC-HC settings though and I can't imagine what it could have possibly been. I don't know what setting was even different now that I've set everything to what I want again and nothing seems like it would cause issues of that sort. I guess I will never know...

Anyway, I am just glad this extreme annoyance is gone and I can finally watch videos clearly again. Thanks for all the help!
Weird. Would really have liked to know which setting caused the problem.
madshi is offline   Reply With Quote
Old 6th June 2014, 19:03   #26587  |  Link
e-t172
Registered User
 
Join Date: Jan 2008
Posts: 589
Quote:
Originally Posted by madshi View Post
I'm not sure if this will be useful for madVR. If Direct3D will allow madVR to define/fine tune the refresh rate, then it would be very useful. But I somehow doubt that's how it's going to be. I rather think games will present fully rendered frames as quickly as possible and the GPU driver will turn that into "adaptive-sync" somehow. A solution like that would probably not be useful for madVR.
Why? If it indeed works the way you think, couldn't you just ignore VSync information entirely and simply present frames at the native video frame rate, and then the GPU driver would do the right thing and sync the refresh rate to the effective frame rate? (though I guess that means you cannot use a present queue anymore with that solution...)
e-t172 is offline   Reply With Quote
Old 6th June 2014, 19:12   #26588  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
Quote:
Originally Posted by e-t172 View Post
Why? If it indeed works the way you think, couldn't you just ignore VSync information entirely and simply present frames at the native video frame rate, and then the GPU driver would do the right thing and sync the refresh rate to the effective frame rate? (though I guess that means you cannot use a present queue anymore with that solution...)
Yes, no present queue. And that means: if the OS decides that madVR doesn't need the CPU for a few milliseconds, the whole timing could get screwed up. For users with 120Hz displays, madVR would have to get CPU time very precisely every 8.3ms. And not just once in an 8.3ms window, but very reliably exactly at the time when the frame needs to be presented. For games it's not such a big problem if the presentation times are not very regular; they will simply calculate the 3D positions of the objects accordingly. But video frames are fixed and require an equal distance between each presented frame. And Windows is not a real-time OS, meaning I can't really rely on getting CPU time at exactly the needed time for every video frame in a full movie. The current approach with a presentation queue and a fixed refresh rate is much more reliable and much better suited for video playback.
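To put rough numbers on that (a trivial sketch, assuming a pre-rendered present queue simply buys the renderer that many frame intervals of scheduling slack):

Code:
def scheduling_slack(refresh_hz, queue_depth):
    """Frame interval at a given refresh rate, and roughly how long the render
    thread could be starved of CPU before a filled present queue runs dry."""
    frame_interval_ms = 1000.0 / refresh_hz
    return frame_interval_ms, queue_depth * frame_interval_ms

for hz in (24, 60, 120):
    interval, slack = scheduling_slack(hz, queue_depth=8)
    print(f"{hz:3d} Hz: a frame is due every {interval:5.2f} ms; "
          f"an 8-frame present queue tolerates ~{slack:5.1f} ms of scheduling hiccups")

# Without a queue, every single deadline has to be hit exactly -
# at 120Hz that's one precisely timed wakeup every 8.3 ms for the whole movie.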
madshi is offline   Reply With Quote
Old 6th June 2014, 20:33   #26589  |  Link
Meulen92
Registered User
 
Join Date: May 2014
Posts: 9
Thank you for your response madshi!
Meulen92 is offline   Reply With Quote
Old 6th June 2014, 21:20   #26590  |  Link
Anime Viewer
Troubleshooter
 
Anime Viewer's Avatar
 
Join Date: Feb 2014
Posts: 339
Quote:
Originally Posted by madshi View Post
As far as I remember, using the "use separate device for presentation" option didn't work properly. I think it produced a black screen instead of video output, or something like that.
That doesn't seem to be the case any longer. My Optimus system appears to play fine with "use a separate device for presentation" checked (as well as "use a separate device for DXVA processing", should I choose to check that as well).

I did some searching in the thread too, and turned up this post that andybkma had made:
http://forum.doom9.org/showthread.php?p=1591750#post1591750
I tried what he/she said was a problem (going from windowed mode to fullscreen to windowed mode causing the video to pause), and that doesn't occur for me. With the new windowed mode I don't use FSE any longer, but I tried it with both FSE and the new windowed mode and neither had any issue. Reading old posts it sounds like having "use a separate device for presentation" checked stopped presentation glitches for many people, and it used to be checked by default. If that's the case it looks like it may be a good thing to have on by default again, unless it causes other issues I haven't read about.

Quote:
Originally Posted by madshi View Post
Personally, I prefer Windows 8 because desktop composition works *much* better. Some users still have problems with refresh rate switching in Windows 8, though (e.g. getting 23.976Hz instead of 24.000Hz). And you need to add a start menu replacement. E.g. try Classic Shell.
Thanks for mentioning Classic Shell. I've been using Windows 8.1 (and before that 8, since it first came out), and missed the start menu. I'd read about other paid and (possibly) malicious start menu add-ins out there, but Classic Shell seems like a nice one that is free for both personal and commercial use.

Quote:
Originally Posted by madshi View Post

Showing a screenshot doesn't really mean anything. Smooth motion frc is based on frame blending, so even if you have a refresh rate of 1000Hz, there will still be some frames that look blurred like your screenshot. However, the higher the refresh rate is, the shorter those blurred frames will be visible on screen. The human eye isn't terribly fast with motion. If you activate smooth motion frc with 30Hz, such a blurred frame will be shown for 33.3ms. That's a long time and our eyes may notice that blur. But with 1000Hz, that blurred frame will only be on screen for 1ms, while most other frames shown will be totally blur-free. So the higher the refresh rate, the less blur you should see.
Given that, is there any reason you'd recommend keeping settings like 1080p23, 1080p24, 1080p29, and/or 1080p30 in the "list all display modes madVR may switch to" area of display modes, as opposed to deleting them and leaving only 1080p60 and/or 1080p59 in there by themselves, like some other people on the thread have recommended?
__________________
System specs: Sager NP9150 SE with i7-3630QM 2.40GHz, 16 GB RAM, 64-bit Windows 10 Pro, NVidia GTX 680M/Intel 4000 HD optimus dual GPU system. Video viewed on LG notebook screen and LG 3D passive TV.

Last edited by Anime Viewer; 6th June 2014 at 21:27. Reason: added madshi's quote from later post with reply
Anime Viewer is offline   Reply With Quote
Old 6th June 2014, 21:55   #26591  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
Quote:
Originally Posted by Anime Viewer View Post
Given that, is there any reason you'd recommend keeping settings like 1080p23, 1080p24, 1080p29, and/or 1080p30 in the "list all display modes madVR may switch to" area of display modes, as opposed to deleting them and leaving only 1080p60 and/or 1080p59 in there by themselves, like some other people on the thread have recommended?
Native refresh rate (with smooth motion frc turned off) is still the best and recommended solution, if your display can do that. If your display can't do native refresh rate, turn smooth motion frc on and use the highest refresh rate your display can handle, to keep the blurring to a minimum.
madshi is offline   Reply With Quote
Old 6th June 2014, 22:40   #26592  |  Link
xyndv
Registered User
 
Join Date: Jan 2013
Posts: 9
Quote:
Originally Posted by madshi View Post
Thanks for the detailed report. Which GPU is that?

Does anybody else have this problem?
I have a similar problem, but oddly, only if Crossfire is enabled in Catalyst - playback seems to start fine, but after some time, or if I toggle in and out of fullscreen, the render and present queues only fill to about 3-5 instead of 16, and the rendering time goes up. (I've also sent a report to AMD, but they probably ignored it.) This happens both with the new windowed path and the new path in exclusive mode, but not with the old path, so that's what I'm using now. And as I mentioned, disabling Crossfire fixes everything. (HD7950)

"use a separate device for presentation" has some strange effect on this bug; if it's enabled, the problem triggers faster, while with it disabled it takes longer and happens more rarely, but it always starts eventually.
xyndv is offline   Reply With Quote
Old 7th June 2014, 00:03   #26593  |  Link
GCRaistlin
Registered User
 
GCRaistlin's Avatar
 
Join Date: Jun 2006
Posts: 350
Quote:
Originally Posted by huhn View Post
Open the "1 - Black Clipping" file and report back!

Test it in both VMR and madVR. After this, post your madVR settings from devices -> "monitor name" -> properties.
  • 1-Black Clipping
    If madVR's "devices | <my_monitor> | properties | the display expects the following RGB output levels:" is set to "PC levels (0-255)" (default setting), the reference black bar isn't flashed while Brightness is set to 46/100. To be precise the picture doesn't change much even if I raise Brightness to max.
    If this setting is set to "TV levels (16-235)" I cannot make bars 2-16 blend together even if I set Brightness to minimum. The same result for VMR9.
  • 3-White Clipping
    "PC levels (0-255)": bar 234 is the highest flashing bar regardless of Contrast setting.
    "TV levels (16-235)": bar 244 is the highest flashing bar while Contrast is set to 91/100, all bars are flashing while Contrast is set to 86/100.

Questions:
  1. Should I set madVR to use "PC levels" or "TV levels"? "TV levels" will give me more details, but I'm confused by this "reference black". The guide says that I should definitely not want to see any bars below reference black, but why? Picture details will be lost this way.
  2. If I use "TV levels", what Brightness value should I use? 46/100 makes no sense in this case, does it?
  3. If I use "TV levels", what Contrast value should I use? I'm thinking of 86/100 since it should give me more details than 91/100, right?

Thanks in advance for the help. The AVS HD guide seems to be too complicated for me. For example, I cannot understand how to use the 2-APL Clipping test correctly (what to look at in this test).
__________________
Windows 8.1 x64

Magically yours
Raistlin
GCRaistlin is offline   Reply With Quote
Old 7th June 2014, 00:29   #26594  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
This doesn't look like a problem with madVR. I had some free time and created a new thread for this, and will answer all the questions you have there as well as I can:
http://forum.doom9.org/showthread.ph...81#post1683181

This may take a bit longer.

Last edited by Guest; 7th June 2014 at 00:35. Reason: make link clickable
huhn is offline   Reply With Quote
Old 7th June 2014, 00:44   #26595  |  Link
ikakun
Registered User
 
Join Date: Apr 2014
Posts: 19
Quote:
Originally Posted by madshi View Post
Let me guess: You have your GPU set to output 16-235?
Yes. To be specific, the Catalyst Control Center is set to limited RGB and the madVR output levels are set to PC levels (0-255). That way I experience no clipping, meaning reference black 16 is always black and everything above it is visible on my TV.
__________________
Windows 10 64-bit
PowerColor HD R9 270
ikakun is offline   Reply With Quote
Old 7th June 2014, 01:32   #26596  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
Quote:
Originally Posted by ikakun View Post
Yes. To be specific, the Catalyst Control Center is set to limited RGB and the madVR output levels are set to PC levels (0-255). That way I experience no clipping, meaning reference black 16 is always black and everything above it is visible on my TV.
You can try outputting full-range RGB and setting madVR to TV levels (16-235). That way the GPU shouldn't touch the image and no banding should be added.
Try it with 8 bit.

Of course, everything else (like games and the browser) won't look right anymore.

But maybe that can be fixed with an ICM profile? If an ICM profile with limited RGB is added, shouldn't that fix a ton of problems?
huhn is offline   Reply With Quote
Old 7th June 2014, 01:46   #26597  |  Link
turbojet
Registered User
 
Join Date: May 2008
Posts: 1,840
madshi: The problem with the '59p ivtc' issue is that it happens randomly. It's been a while since I've used it, but if you watch something for at least 30 minutes there's a good chance the problem will arise. If you don't have such a source I can provide one; PM me if you need it.
__________________
PC: FX-8320 GTS250 HTPC: G1610 GTX650
PotPlayer/MPC-BE LAVFilters MadVR-Bicubic75AR/Lanczos4AR/Lanczos4AR LumaSharpen -Strength0.9-Pattern3-Clamp0.1-OffsetBias2.0
turbojet is offline   Reply With Quote
Old 7th June 2014, 07:50   #26598  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
Quote:
Originally Posted by xyndv View Post
I have a similar problem, but oddly, only if Crossfire is enabled in Catalyst - playback seems to start fine, but after some time, or if I toggle in and out of fullscreen, the render and present queues only fill to about 3-5 instead of 16, and the rendering time goes up. (I've also sent a report to AMD, but they probably ignored it.) This happens both with the new windowed path and the new path in exclusive mode, but not with the old path, so that's what I'm using now. And as I mentioned, disabling Crossfire fixes everything. (HD7950)

"use a separate device for presentation" has some strange effect on this bug; if it's enabled, the problem triggers faster, while with it disabled it takes longer and happens more rarely, but it always starts eventually.
Crossfire never worked really well with madVR, I don't know exactly why. In any case, can't you tell the AMD control panel to automatically disable Crossfire for madVR use, and to enable it for games?

Quote:
Originally Posted by ikakun View Post
Yes. To be specific, the Catalyst Control Center is set to limited RGB and the madVR output levels are set to PC levels (0-255). That way I experience no clipping, meaning reference black 16 is always black and everything above it is visible on my TV.
Ok. With this setup, older AMD driver versions simply stretched the Windows & madVR output behind the back of Windows & madVR from 0-255 to 16-235, and AMD did that without using dithering, which introduced all those nasty banding artifacts you were seeing. By setting your display to 6bit in the madVR device setup you somewhat hid the problem, but it was still there. This is why I usually don't recommend setting your GPU to 16-235 output.

Now the latest AMD version has seemingly added dithering capability - which is a very nice improvement!! My interpretation of the different bitdepth settings is as follows:

12bit: AMD stretches from 0-255 to 16-235 and dithers to 12bit.
10bit: AMD stretches from 0-255 to 16-235 and dithers to 10bit.
8bit: AMD stretches from 0-255 to 16-235 and dithers to 8bit.
6bit: AMD stretches from 0-255 to 16-235 and dithers to 6bit.

Practically this means that the 10bit and 12bit settings may be beneficial (lower noise level) *if* your display supports 10bit/12bit correctly. For most users probably 8bit is the appropriate setting in the AMD control panel.

I still recommend setting the GPU to 0-255 output, though, because madVR has to dither, too. And if both madVR and the GPU driver dither, the noise level will increase. And madVR's dithering quality is probably higher than AMD's. But anyway, as mentioned above, it's a nice improvement on AMD's side...
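To make the banding mechanism concrete, here is a tiny NumPy sketch of the 0-255 -> 16-235 squeeze with and without dithering (a toy model of my understanding, not AMD's actual driver code):

Code:
import numpy as np

rng = np.random.default_rng(0)

# One flat full-range gray level, e.g. code 98, squeezed to limited range
level = 98
target = 16.0 + level * (235.0 - 16.0) / 255.0            # = 100.17..., not an integer
patch = np.full(100_000, target)

plain = np.round(patch)                                    # old drivers: no dithering
dithered = np.round(patch + rng.random(patch.shape) - 0.5) # with 1 LSB of random dither

print(f"target level after squeeze: {target:.3f}")
print(f"no dither : only code {np.unique(plain)}, mean = {plain.mean():.3f}")
print(f"dithered  : mix of codes {np.unique(dithered)}, mean = {dithered.mean():.3f}")

# Without dithering, whole ranges of source codes collapse onto the same output
# code (and others get skipped), so smooth gradients turn into visible bands.
# With dithering the average still hits the fractional target, at the cost of a
# little noise - which is the trade-off ikakun is describing.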
madshi is offline   Reply With Quote
Old 7th June 2014, 13:36   #26599  |  Link
kostik
Registered User
 
Join Date: Jul 2007
Posts: 161
I remember that some years ago, even with AMD's HD 2000 or HD 5000 series, my Pioneer plasma was switching its mode to 30bit and 36bit (when 4:2:2 was selected the plasma would change to 36bit).
Now I am using an Nvidia Quadro and it can't do that. Strange - apparently AMD was already doing this dithering a long time ago, just without an option to change the bitdepth manually :\
kostik is offline   Reply With Quote
Old 7th June 2014, 14:05   #26600  |  Link
xyndv
Registered User
 
Join Date: Jan 2013
Posts: 9
Quote:
Originally Posted by madshi View Post
Crossfire never worked really well with madVR, I don't know exactly why. In any case, can't you tell the AMD control panel to automatically disable Crossfire for madVR use, and to enable it for games?
It's actually just globally enabled so that it works in games at all, with the second GPU put to sleep otherwise, and there's a checkbox next to it to "enable Crossfire for applications with no associated profile" which is disabled of course, so the second GPU is even off while madVR is running.
xyndv is offline   Reply With Quote