Old 8th July 2014, 10:27   #26821  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
Quote:
Originally Posted by pie1394 View Post
Instead I should say I don't want the dithering function at the input signal side to affect the HX920's super-resolution engine's processing efficiency.
But are you sure that it does? I don't really know what the super-res engine does inside. Depending on the algorithm it could work better with dithering off, or with dithering on, or it could make no difference. There's no way to know except trying which looks better.

Quote:
Originally Posted by Stereodude View Post
Quote:
Originally Posted by madshi
I don't remember: Which value does deintFps have if you activate forced film mode for 60i telecined movie content? Is it 24fps? I guess it would make sense and should then also apply to decimated 60p content.
24fps is what I've seen in my testing if I activate forced film mode for 60i telecined movie content.
Ok, in that case it should also be 24fps for decimated 720p60 content. Is there already an entry in the bug tracker for this? If not, could you please make one? Thx.

Quote:
Originally Posted by Plutotype View Post
I have a weird problem on my Win7 x64 HTPC. I installed x86 MPC-HC and madVR (install.bat), but although I see madVR in the list of available renderers (output section) and can select it, the renderer shown during playback is MPC-HC's default "video renderer". madVR won't load. I've tried reinstalling everything, but I'm stuck. What could be the culprit, please?
Hard to say. Does madTPG show an image and react to the OSD buttons etc? If so, madVR seems to generally work and the media player or your selected DirectShow splitter/decoder could be causing the problem. If not, your PC doesn't seem to like madVR for some reason. In the latter case try temporarily disabling your anti-virus/firewall just to check whether that's causing the problems or not (of course to be safe feel free to first check madVR via www.virustotal.com).

Quote:
Originally Posted by Dodgexander View Post
  1. If my receiver is not passing 4:4:4 from my Radeon 5450 to my display, then I gather setting 4:4:4 in the AMD CCC is useless?
  2. I learnt from reading on here that madVR converts Y'CbCr to RGB very well, and that RGB should therefore always be selected in the video card driver. But if my display can't accept 4:4:4 because of my receiver, is setting the CCC to Y'CbCr 4:2:2 a better option?
  3. On my Samsung TV, how can I make sure it is not expecting to receive a 4:4:4 chroma signal? There is an option for Colour Space (AUTO/WIDE). Does this have anything to do with chroma, or is it only related to limited/full range?
  4. HDMI Black Level on my TV is only available to change when I output Y'CbCr 4:2:2, not Y'CbCr/RGB 4:4:4. Why is this? If I understand correctly it is this setting that assumes you will be receiving blacks darker than 16, but does it also dictate receiving whites greater than 235?
  5. How does all of this relate to bit depth? When I have Y'CbCr 4:2:2 or RGB 4:4:4 Limited in CCC, my receiver shows it is receiving a 30-bit signal and outputting 24-bit to my display. With 4:4:4 it shows receiving and outputting 30-bit.
  6. When I connect my laptop or my PVR, the receiver instead shows a 24-bit input. Why do my laptop/PVR show this but not my PC with the Radeon?
Most of these questions are very specific to your hardware, so it's hard for any of us to give an exact answer. madVR always renders to RGB. If you set the GPU to YCbCr output (either 4:4:4 or 4:2:2), the GPU converts madVR's output behind madVR's back. Otherwise your receiver does that conversion. Who does it better? I don't know. I can only advise trying different settings with test patterns and real movies and trusting your eyes as to which setting combination produces the best overall result.
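For readers unsure what that YCbCr-to-RGB step actually involves, here is a minimal sketch of a BT.709 limited-range 8-bit conversion for a single pixel. It is only an illustration of the math, not madVR's (or any GPU's) actual implementation.

Code:
# Minimal sketch: BT.709 limited-range 8-bit Y'CbCr -> full-range R'G'B'.
# Illustration only -- not madVR's actual code path.

def ycbcr709_to_rgb(y, cb, cr):
    # Normalize: Y' spans 16..235, Cb/Cr span 16..240 and are centered on 128.
    ey  = (y  -  16) / 219.0
    ecb = (cb - 128) / 224.0
    ecr = (cr - 128) / 224.0

    # BT.709 coefficients (Kr = 0.2126, Kb = 0.0722).
    r = ey + 1.5748 * ecr
    g = ey - 0.1873 * ecb - 0.4681 * ecr
    b = ey + 1.8556 * ecb

    to8 = lambda v: max(0, min(255, round(v * 255)))  # scale and clip to 8 bit
    return to8(r), to8(g), to8(b)

print(ycbcr709_to_rgb(235, 128, 128))  # reference white -> (255, 255, 255)
print(ycbcr709_to_rgb(16, 128, 128))   # reference black -> (0, 0, 0)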

Quote:
Originally Posted by ikakun View Post
I already tried PC Standard (full RGB) on the GPU output & 16-235 on madVR output, and yeah, it doesn't look nice in games and other applications. I also notice that the black levels & grayscale ramp are a bit off from the way they should look.

On the other hand, going Studio (limited RGB) on the GPU output & 0-255 on madVR output makes other applications look proper. Black levels are also spot on (16 is black and 17 to 235 is visible) & the grayscale ramp looks nicer, with only a few improper gray steps.
Yes, if you want other applications to have proper black and white levels, there are only 2 options for you:

(1) Set the GPU to Studio=limited RGB=16-235. This will produce correct black and white levels, but it might introduce banding artifacts.

(2) Set the GPU to PC Standard=full RGB=0-255. And at the same time set your display to 0-255, too! This will produce correct black and white levels and it will avoid banding artifacts. However, not all displays/TVs support this.
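To make the two level ranges concrete, here is a small sketch of the 0-255 <-> 16-235 mapping for 8-bit RGB; the squeeze in option 1 is what can introduce banding, since 256 input values get mapped onto only 220 output codes. Real GPU drivers may round, dither or clip slightly differently.

Code:
# Sketch of the 8-bit RGB level conversions discussed above.
# Real GPU drivers/displays may round, dither or clip slightly differently.

def full_to_limited(v):
    # 0-255 "PC levels" -> 16-235 "TV levels" (the squeeze from option 1)
    return round(16 + v * 219 / 255)

def limited_to_full(v):
    # 16-235 -> 0-255, clipping anything below black / above white
    return max(0, min(255, round((v - 16) * 255 / 219)))

print(full_to_limited(0), full_to_limited(255))   # 16 235
print(limited_to_full(16), limited_to_full(235))  # 0 255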

Quote:
Originally Posted by StinDaWg View Post
I've mentioned this before, but now have a sample to back it up. Using the anti-ringing filter on any of the downscaling algorithms after NNEDI3 causes jagged/aliased text. With the AR filter off, text looks much, much better. This happens on any video I play, and can most easily be seen at the beginning of TV shows where the actors' names are shown.

Take a look at the ticker at the bottom. Any letter with a diagonal, like W, N, X, M, etc., looks terrible with AR on.
http://www44.zippyshare.com/v/62692648/file.html
Thanks. Could you please create a bug tracker entry for this (if you haven't already)?

Quote:
Originally Posted by Nui View Post
Small idea.
It could be helpful for evaluation of LUTs if one could set 2 LUTs and a shortcut for switching between them for A/B comparison.
Maybe even use both LUTs on one half of the image each.
Yes, features for A/B comparison would be nice. But really, they have low priority compared to some other things. So this won't come soon.

Quote:
Originally Posted by Nui View Post
As a side note, I find it incredible how much information can be conveyed with 1-bit dithering (or 8 colors).
Agreed. Especially the gamma corrected dithering helps a lot at such low bitdepths (thanks Shiandow).
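To illustrate the idea behind gamma-aware dithering at such low bit depths: instead of choosing between the two nearest quantization levels so that the average matches in gamma-encoded space, you choose the probabilities so that the average matches in linear light. The sketch below shows only that basic concept; it is not Shiandow's or madVR's actual algorithm.

Code:
import random

# Rough sketch of gamma-aware dithering to a very low bit depth.
# Not madVR's actual algorithm -- just the core idea: pick between the two
# nearest quantization levels with a probability that preserves the average
# in linear light rather than in gamma-encoded space.

GAMMA = 2.2                        # simple power-law gamma, for illustration

def to_linear(v):                  # gamma-encoded [0,1] -> linear light
    return v ** GAMMA

def dither_gamma_aware(v, bits):
    levels = (1 << bits) - 1
    lo = int(v * levels) / levels          # nearest level below v
    hi = min(1.0, lo + 1.0 / levels)       # nearest level above v
    if hi == lo:
        return lo
    # probability of picking 'hi' so the *linear* average equals to_linear(v)
    p = (to_linear(v) - to_linear(lo)) / (to_linear(hi) - to_linear(lo))
    return hi if random.random() < p else lo

# Dither a mid gray to 1 bit many times: the mean in linear light stays close
# to the linear value of the source.
samples = [dither_gamma_aware(0.5, 1) for _ in range(100000)]
print(sum(to_linear(s) for s in samples) / len(samples), to_linear(0.5))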

Quote:
Originally Posted by G_M_C View Post
A few weeks ago AMD released their new 14.6 beta driver. This driver adds a setting for display color depth on displays that support it. See the screenshot below.

I've installed the driver and can set my TV to 10 bpc. When I then set MPC to D3D output, 10-bit out, etc., I actually get 10-bit output (A2R10G10B10 output, Ctrl+J stats screen; I had to use EVR Custom to confirm). So I can confirm this setting seems to work.
Oh, that's pretty good news - thanks!

Quote:
Originally Posted by leeperry View Post
So, as promised, they asked AMD about the copyback nonsense. AMD replied that their engineering team was aware of the problem but didn't have the time to look at it.
Hmmm... They didn't have time to look at it *yet*? Or they generally don't have time to look at it?

Quote:
Originally Posted by -Hitman- View Post
I've just started getting into madVR, but I have a minor GFX clipping problem that has plagued me for years, and I need help resolving it, especially now that I'm using a quality renderer. I'm at a brick wall before I can carry on.

I have always used AMD cards, and using a common PLUGE pattern I have always had to raise brightness +2 and lower contrast -2 in the CCC settings to get all bars to show correctly. The consensus seems to be that the GFX driver settings for color should be left untouched, but this gives me a 19-233 video output. My current new card is an AMD R9 790, and the result is exactly the same!

I have tried my onboard Haswell 4600 with the same result, as if something is clipping the video slightly.

With madVR integrated into the chain, my colorspace settings are: AMD CCC dynamic range 0-255 (video HDMI output 0-255), madVR 16-235, display set to 16-235 (Pioneer KRP-600M).

This gives the same 19-233 output. Any change to the above gives either all bars 0+ and a washed-out image, or no bars at all on the black clipping pattern, with heavy clipping.

I tried just adjusting madVR brightness/contrast, which required brightness +10 and contrast -40 to get the 17/18 and 234/235+ bars to show correctly, but the video image was a mess and overblown!

This is really frustrating me and I need help.

EDIT: My display has been calibrated using a Blu-ray player and CalMAN 5, so it's not the display clipping.
This sounds quite weird. First of all, you can define a custom output levels configuration in madVR. Try 13-237, that might make things work for you.

However, something is going wrong there. My first guess would be that maybe you have some sort of 1dlut loaded through Windows calibration or something? In the madVR device setup under "calibration" make sure you have "disable GPU gamma ramps" checked. Does that help? If not, my next guess would be that either your receiver or your display is messing things up. It's possible that your Blu-Ray player output wrong levels. Or it's possible that your display behaves differently when receiving RGB input. Your Blu-Ray player probably output YCbCr.

From the feedback other users are giving me, a properly configured HTPC seems to output correct levels using madVR. So if you get the very same 19-233 with two different GPUs, then the issue is very likely to either be a Windows configuration problem (the "disable GPU gamma ramps" option should work around the most likely such issue), or a problem outside of the HTPC.

Quote:
Originally Posted by Anime Viewer View Post
Contrary to what some people have told you Image Doubling can still be active when playing a 1080p video on a 1080p monitor [...]
No, to the best of my knowledge that's not true.

Quote:
Originally Posted by tp4tissue View Post
If I use the Windows Classic theme without desktop composition on Win7, there's tearing in windowed playback mode.

is there a fix for this?
You're probably using an AMD GPU? They're known to have this problem. At least NVidia users don't report it as often as AMD users do. Unfortunately I don't have a fix for you, other than either enabling desktop composition or using FSE mode. Or upgrade to Windows 8, which has desktop composition forced on - but on the positive side, Windows 8 has a much better desktop composition implementation than Windows 7. The implementation in Windows 8 is good enough that I've never felt the need to disable it.

Quote:
Originally Posted by Selur View Post
I was wondering if there is still hope for a 64-bit madVR version in the foreseeable future.
madVR x64 will probably come, but probably not very soon.

Quote:
Originally Posted by Plutotype View Post
Ok, I have lowered the number of "in advance" frames in FSE mode to 4. A picture appeared, but madVR crashes after a couple of seconds and creates a crash log. The same behaviour occurs with 4, 3, 2 and 1.

6+ frames blacks out the FSE mode.
Interesting. Try lowering the GPU queue size. You seem to be running out of GPU RAM, unfortunately...

Quote:
Originally Posted by huhn View Post
It blends frames based on the vsync: if two frames have to be shown in the same vsync, it blends them based on the percentage of time each of them occupies in that vsync.

Nearly every TV out there does the same.
AFAIK, TVs with proper multi-refresh-rate abilities simply change the driving method to achieve different refresh rates natively. Which is a much better solution (if done properly) compared to madVR's smooth motion blending.
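For what it's worth, here is a tiny sketch of the blending idea huhn describes above: when a source frame change falls inside a vsync interval, the two frames are weighted by the fraction of that interval each one covers. This only illustrates the concept; it is not madVR's actual smooth motion code.

Code:
# Sketch of the blending idea described above: if a vsync interval is shared
# by two source frames, weight them by the fraction of the interval each covers.
# Illustration only, not madVR's actual smooth motion implementation.

def blend_weights(vsync_start, vsync_end, frame_switch_time):
    """Return (weight_of_old_frame, weight_of_new_frame) for one vsync."""
    interval = vsync_end - vsync_start
    if frame_switch_time <= vsync_start:
        return 0.0, 1.0           # the new frame covers the whole vsync
    if frame_switch_time >= vsync_end:
        return 1.0, 0.0           # the old frame covers the whole vsync
    w_old = (frame_switch_time - vsync_start) / interval
    return w_old, 1.0 - w_old

# 23.976 fps content on a 60 Hz display: the 2nd source frame starts ~41.7 ms
# in, while the 3rd vsync spans 33.3..50.0 ms -> roughly a 50/50 blend.
print(blend_weights(2 / 60, 3 / 60, 1 / 23.976))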
madshi is offline   Reply With Quote
Old 8th July 2014, 10:30   #26822  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
Quote:
Originally Posted by Shiandow View Post
If you toggle NNEDI3 on or off you should see the image move slightly; if you don't, then NNEDI3 is not working. If you do, then it's probably working.
FWIW, the latest madVR build avoids the NNEDI3 0.5 pixel shift - except if you do *exactly* 2.0000x scaling.

Quote:
Originally Posted by Shiandow View Post
While experimenting with shaders I seem to have discovered a bug. madVR crashes when the total number of shaders becomes "too large"; the exact number seems to depend a bit on the shaders and the settings, but is usually somewhere between 16 and 28. Anyway, I've added a bug report; hopefully you'll be able to fix it sometime.
Yes, I have a fixed array size for the rendering steps, and every custom shader consumes such a rendering step. I will need to dynamically allocate the array. Shouldn't be hard to do. I didn't anticipate that anybody would use so many custom shaders (or passes). My fault...

Quote:
Originally Posted by anthropolyte View Post
I'm finding that when the PC is being used on the primary monitor (connected via DVI), and video is being played back on the (secondary) HDTV, I get presentation glitches, audio delays, choppy playback and all sorts of other things going on. This doesn't seem to affect any applications being used on the primary monitor.

The PC is used for a variety of things, including Photoshop, normal web browsing, document editing, etc. Nothing specific seems to set off the problem; just moving the mouse on the other screen can cause the issues to occur if video is being played back on the HDTV.

Specs of the PC are:
AMD FX8350 (8-core, 4.0GHz)
8GB Corsair Vengeance 1600MHz RAM
Sapphire Radeon HD7870 GHz Edition
Western Digital Black 2TB storage drive

I've tried playing with the various settings in MadVR in an attempt to resolve the problems (disabling image doubling, choosing different scaling algorithms etc), but no joy so far.

This seems to happen with all media types as well, including avi, mkv and mpg.

Any help would be greatly appreciated!

*edit*
I probably should have mentioned that when the PC is not being used for anything else, playback is flawless.
Are you using FSE mode or windowed mode? Generally windowed mode is more susceptible if the CPU/GPU is busy doing other stuff. Although the new windowed mode in the latest madVR build should be somewhat more reliable. Are you using the latest madVR build? And do you have the "present several frames in advance" option turned on?

Also check the madVR OSD (Ctrl+J) to see if any of the queues (which?) are getting empty when the glitches/issues occur.

Which OS are you using?

Quote:
Originally Posted by Osjur View Post
I don't know if this is a bug, but if I untick "don't rerender frames when fade in/out is detected" I get about 7-10 dropped frames each time there is a black fade going on. It makes some files unwatchable when there's a lot of fast movement and black scene changes.

This happens in both windowed and FSE mode. Max rendering times are around 7ms, so that shouldn't be the culprit.

PS: My GPU is an R9 290X, so there should be plenty of power to run madVR.
Try increasing the CPU and GPU queue sizes. Personally, I'm currently using 24 frames for the CPU queue and 16 frames for the GPU queue. This should help avoid those dropped frames when a fade occurs. Of course you'll need enough GPU RAM for this to work properly.

Quote:
Originally Posted by Shiandow View Post
That's more or less expected behaviour. As far as I understand, it takes a few frames before madVR realises that there is a fade in/out, so what "rerender frames when fade in/out is detected" does is go back a few frames whenever madVR detects a fade in/out and rerender those frames to prevent banding on the first few frames of the fade. If this is causing issues it's probably best to leave it off.
Well, it's working fine on my PC without any frame drops or other issues. I think the queues just need to be set a bit larger.

Quote:
Originally Posted by panetesan2k6 View Post
madVR works natively in full RGB, but when you select "PC levels" output it cuts BTB and WTW intentionally (as madshi explains here), so the best/preferable scenario is to use a display that can handle full RGB. This won't give us BTB and WTW, but that's ok. If the display can't handle full RGB, the signal must be "squeezed" to 16-235, either by madVR (setting "TV levels" in madVR and leaving desktop whites and blacks clipped) or by the GPU (setting "RGB Limited" in the driver and leaving madVR on "PC levels"), which gives correct values for both desktop and video. I hope I got it right finally.
Well, your description is technically not fully correct, but it doesn't matter much. You got the general idea correctly.

Quote:
Originally Posted by xabregas View Post
Hi, I just bought an NVIDIA GT 750 Ti along with a 1080p smart TV, and I'm getting several presentation glitches when I play a 1080p movie at 1080p resolution. While I don't see any glitches at all, madVR's Ctrl+J OSD shows many...

The problem doesn't seem to happen while playing 720p movies upscaled to 1080p with Jinc 3 AR on chroma and luma. Only when I play 1080p movies (which don't need any upscaling or downscaling) do the glitches appear.

I use full screen exclusive mode...
Sounds quite strange. Do you have smooth motion FRC frame blending active? Does the problem go away if you disable FRC? What happens if you play a 1080p60 file?

Quote:
Originally Posted by Anime Viewer View Post
Even when it's not supposed to do upscaling (1080 -> 1080) it still tries to, and uses up resources.
Nope, that's not true.

Quote:
Originally Posted by innocenat View Post
Not sure if this has been reported: madVR doesn't work for me (it only renders the first frame and then dies, without crashing) if I set it to use the discrete GPU instead of the integrated GPU in an Optimus setup.

- madVR 0.82.10
- Intel i7-4710HQ (mobile processor)
- nVidia GTX850M

It works fine if I set it to use the integrated graphics only, but that would be a waste.

I am on the latest NVIDIA driver (337.88). I used to use Optimus on an i7-2630QM + GT 550M and it worked fine running on the discrete GPU with the same driver/madVR version.
Hmmmm... Try disabling the options "use a separate device for presentation" and "use a separate device for DXVA processing". Does that help? If not, try reducing the size of the GPU queue and the number of pre-presented frames.

Quote:
Originally Posted by generalmx View Post
Scaling Algorithms: <=360p
-- image upscaling: Jinc, 3 taps w/ Anti-Ringing Filter
-- chroma upscaling: NNEDI3, 256 neurons
-- image doubling: Always Double Luma, 64 neurons + Always Double Chroma, 16 neurons (*)
FWIW, chroma doubling usually does very little to improve the subjective image quality.

Quote:
Originally Posted by generalmx View Post
General Settings
- Use OpenCL to process DXVA NV12 surfaces.
With the current state of AMD's OpenCL drivers I'd recommend turning this option off.

Quote:
Originally Posted by generalmx View Post
* My settings don't seem to like ReClock set to 24 FPS (for ~24 FPS sources); it causes more (to significantly more) dropped frames. I have two monitors connected as an "Extended" desktop: a 24" 1200p S-PVA workstation monitor and a crappy 32" 1080p TN HDTV. Both can support 24Hz natively, especially the 1080p one, though I must use CRU to add 24Hz for the 1200p. However, even the 1080p set to 24Hz causes significantly more dropped frames (same with 48Hz). Just to check whether it was a case of using windowed mode with two monitors running different resolutions, I set both monitors to 1080p@24 and the same problem occurred. But I'm guessing the reason it's totally unplayable when Monitor #2 (1080p, secondary) is set to 1080p@24 while Monitor #1 (1200p, focus) is still set to 1200p@60 is how Windows handles refresh rate differences like that.
Does this problem go away if you disable NNEDI3 or if you play a native 1080p24 file which doesn't need scaling? Just as a test to see if this problem is related to high GPU consumption? Have you tried clearing/resetting the Reclock timing database? What happens if you disable Reclock completely?

Quote:
Originally Posted by generalmx View Post
* Full-screen exclusive mode gives me more (to significantly more) dropped frames, and can introduce artifact errors. Again, note that I can't get the EDID for Monitor #2 (1080p), which is using some generic PnP driver, and Monitor #1 is a 1200p workstation monitor that doesn't support 24Hz (or 48Hz) without hackery.
Strange. What kind of artifacts do you get?

Quote:
Originally Posted by generalmx View Post
Hmm, interesting. But I do think I've found a significant source of dropped frames that I think is just related to madVR: blending of high-contrast/overexposed and normal-contrast scenes. With my settings, every now and then there will be a big string of dropped frames from rapid transitions between the different contrast levels, which I'm guessing is a side effect of manipulating luma and chroma. And while one could see this in live action, you're most likely to see it in more light-hearted anime, where even your typical shounen/action anime uses more basic colouring and lighting.
Quote:
Originally Posted by michkrol View Post
Try those scenes with debanding disabled. Then try with both "strength" and "fade in/out" set to the same level.

With "don't rerender frames ..." trade quality for performance option disabled, when madvr detects a fade in/out it drops frames and rerenders them using other debanding setting. If this works as intended you don't even notice it, but if you have maxed out settings and one fade in/out after another in the video you rerender frames almost constantly with different settings. That's why some quality trade-offs are enabled by default
^

michkrol's guess would also have been my best guess: that the frame drops are caused by the re-rendering of frames when a fade is detected. If the tests suggested by michkrol confirm this, you could also check whether increasing the size of the CPU and GPU queues fixes the problem.

===============

I've recently moved to a new home which meant no time for madVR development. I'm now slowly getting back to development. But my commercial projects need some attention first. So don't expect a major new madVR version with new killer features soon.
madshi is offline   Reply With Quote
Old 8th July 2014, 13:26   #26823  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,923
Quote:
Originally Posted by madshi View Post
I've recently moved to a new home which meant no time for madVR development.
I hope it comes with a proper HT room.
huhn is offline   Reply With Quote
Old 8th July 2014, 13:33   #26824  |  Link
Anime Viewer
Troubleshooter
 
Anime Viewer's Avatar
 
Join Date: Feb 2014
Posts: 339
Quote:
Originally Posted by madshi View Post
Nope, that's not true.
Then is there another explanation for why render times would be higher when a 1080p video is playing full screen on a 1080p monitor with settings like Jinc or NNEDI3, compared to when the same video is playing on the same monitor with something like DXVA2 or bilinear set as the image upscaling algorithm?
__________________
System specs: Sager NP9150 SE with i7-3630QM 2.40GHz, 16 GB RAM, 64-bit Windows 10 Pro, NVidia GTX 680M/Intel 4000 HD optimus dual GPU system. Video viewed on LG notebook screen and LG 3D passive TV.
Anime Viewer is offline   Reply With Quote
Old 8th July 2014, 13:58   #26825  |  Link
xabregas
Registered User
 
Join Date: Jun 2011
Posts: 121
Quote:
Originally Posted by Asmodian View Post
What? This isn't true. DXVA2 will be fast and OK quality, but there isn't any "it still tries to, and uses up resources" going on. Chroma upscaling, conversion to RGB, debanding (if enabled), smooth motion (if enabled), and dithering still need to be done for 1080p displayed at 1920x1080.

@xabregas
The 750 Ti should be fine for 1080p at 1920x1080 with Jinc3 AR for chroma scaling. Have you changed the dithering options or smooth motion? How about using Bicubic 75 + AR for chroma scaling?

Also try with Full screen exclusive disabled and Windowed Overlay enabled.

Which motherboard do you have? madVR likes bandwidth, so if you are on something very old there might be an issue there.
So I did what you said and disabled FSE mode, but I didn't activate windowed overlay, as I had no tearing in full screen with FSE mode deactivated. So I suppose I don't need windowed overlay, right?

Ok, so I tried several 1080p videos, VC-1 and AVC with high bitrates, and got no presentation glitches. Of course I also set Bicubic 75 AR for chroma...

Also, I turned smooth motion on for some TV shows I have, even knowing I have 1080p 24Hz; for TV shows 60Hz is better. But with movies at 23.976, if I enable 24Hz should I use smooth motion? Do I need VideoClock or ReClock?

TIA
xabregas is offline   Reply With Quote
Old 8th July 2014, 14:13   #26826  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,923
Quote:
Originally Posted by Anime Viewer View Post
Then is there another explanation for why render times would be higher when a 1080p video is playing full screen on a 1080p monitor with settings like Jinc or NNEDI3, compared to when the same video is playing on the same monitor with something like DXVA2 or bilinear set as the image upscaling algorithm?
Do you change chroma too? If yes, there's your answer: all DVDs/BDs always need chroma upscaling.

Power states also make render times unreliable. That shouldn't apply to your case, but I normally get higher render times with 1080p than with 720p on a 1080p screen, because with 1080p the power state is lowest, so the render times look higher.

And are you sure your source is true 1080p and not cropped by 2-6 pixels?
huhn is offline   Reply With Quote
Old 8th July 2014, 17:26   #26827  |  Link
kopija
Registered User
 
Join Date: May 2012
Posts: 49
Quote:
Originally Posted by huhn View Post
@kopija:

NNEDI3, error diffusion, and, under general settings, "use OpenCL to process DXVA NV12 surfaces".

The newer versions got some tweaks for Optimus systems; it could be a problem with that.

But it works totally fine without F@H (whatever that is), right?
Thanks, will try it out.
And all of you guys with monster cards should read this:
http://en.wikipedia.org/wiki/Folding@home#Biomedical_research

Another question:
The LAV Video decoder performs random dithering and madVR performs ordered dithering by default. So is there some kind of conflict? I cannot disable dithering in LAV Video, so should dithering be set to "None" in madVR?
kopija is offline   Reply With Quote
Old 8th July 2014, 17:38   #26828  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,923
Quote:
Originally Posted by kopija View Post
Another question:
The LAV Video decoder performs random dithering and madVR performs ordered dithering by default. So is there some kind of conflict? I cannot disable dithering in LAV Video, so should dithering be set to "None" in madVR?
There should be a sticky about this.

Leave dithering in madVR active. That option in LAV Filters is not used here, so ignore it; it only applies when YCbCr is converted to RGB, which doesn't happen with madVR. And even when it is used, madVR would still dither at the end.
huhn is offline   Reply With Quote
Old 8th July 2014, 17:49   #26829  |  Link
James Freeman
Registered User
 
Join Date: Sep 2013
Posts: 919
Quote:
Originally Posted by madshi
I've recently moved to a new home which meant no time for madVR development. I'm now slowly getting back to development. But my commercial projects need some attention first.
So don't expect a major new madVR version with new killer features soon.
Big congratulations on your new home!

Killer the features will be nevertheless.
__________________
System: i7 3770K, GTX660, Win7 64bit, Panasonic ST60, Dell U2410.
James Freeman is offline   Reply With Quote
Old 8th July 2014, 17:49   #26830  |  Link
Stan
Registered User
 
Join Date: Jun 2014
Posts: 8
Quote:
Originally Posted by madshi View Post
madVR x64 will probably come, but probably not very soon.
Sadface
Stan is offline   Reply With Quote
Old 8th July 2014, 19:20   #26831  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
Quote:
Originally Posted by xabregas View Post
So I did what you said and disabled FSE mode, but I didn't activate windowed overlay, as I had no tearing in full screen with FSE mode deactivated. So I suppose I don't need windowed overlay, right?
On NVIDIA, windowed overlay is the lowest-effort mode, but it's a very small difference and there are benefits to the non-overlay modes.

Quote:
Originally Posted by xabregas View Post
Also, I turned smooth motion on for some TV shows I have, even knowing I have 1080p 24Hz; for TV shows 60Hz is better. But with movies at 23.976, if I enable 24Hz should I use smooth motion? Do I need VideoClock or ReClock?
If you leave smooth motion on "only if there would be judder" it will automatically turn off when watching 23.976fps @ 24Hz. Reclock will still help if you want to sync timings exactly but I stopped using it with madVR and I don't notice or measure any dropped or repeated frames (on a 60 or 72 Hz display). This might depend on your exact video clock(s) though.
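As a rough sense of how much the clocks matter without ReClock or similar: if the display's effective refresh rate (refresh divided by the number of vsyncs per source frame) differs from the content frame rate, you get roughly one dropped or repeated frame per 1/|difference| seconds. A back-of-the-envelope sketch of that simplified model (it ignores audio resampling and real clock drift):

Code:
# Back-of-the-envelope sketch: with no clock synchronization (no ReClock /
# VideoClock), one frame gets repeated or dropped roughly every
# 1 / |refresh/n - fps| seconds, where n is the number of vsyncs shown per
# source frame.  Simplified model -- real video and audio clocks also drift.

def glitch_interval_seconds(fps, refresh):
    n = round(refresh / fps)            # vsyncs shown per source frame
    effective = refresh / n             # frame rate the display "wants"
    drift = abs(effective - fps)
    return float('inf') if drift == 0 else 1.0 / drift

print(glitch_interval_seconds(23.976, 24.000))  # ~41.7 s between glitches
print(glitch_interval_seconds(23.976, 72.000))  # same ~41.7 s (3 vsyncs/frame)
print(glitch_interval_seconds(25.000, 50.000))  # perfect match -> inf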
Asmodian is offline   Reply With Quote
Old 8th July 2014, 20:11   #26832  |  Link
xabregas
Registered User
 
Join Date: Jun 2011
Posts: 121
Quote:
Originally Posted by Asmodian View Post


If you leave smooth motion on "only if there would be judder" it will automatically turn off when watching 23.976fps @ 24Hz. Reclock will still help if you want to sync timings exactly but I stopped using it with madVR and I don't notice or measure any dropped or repeated frames (on a 60 or 72 Hz display). This might depend on your exact video clock(s) though.
I notice that at 24Hz smooth motion is off, but I notice some out-of-sync issues, though not by much. Should I use VideoClock or ReClock?

The out-of-sync issues get worse when I watch 25fps TV shows at 50Hz; smooth motion is off but the audio goes out of sync.
xabregas is offline   Reply With Quote
Old 8th July 2014, 21:45   #26833  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
Odd, your video clocks must be fairly off. Yes, ReClock would probably help.
Asmodian is offline   Reply With Quote
Old 9th July 2014, 03:49   #26834  |  Link
pie1394
Registered User
 
Join Date: May 2009
Posts: 212
Quote:
Originally Posted by madshi View Post
But are you sure that it does? I don't really know what the super-res engine does inside. Depending on the algorithm it could work better with dithering off, or with dithering on, or it could make no difference. There's no way to know except trying which looks better.
According to a comment from the chief engineer of Sony's DRC team, a material library of thousands of video image patterns is built into the XCA7 engine. I don't know the details of how the process is done, but I guess it could be a pattern-matching process, similar to image recognition, that identifies the image content's characteristics. If I'm not wrong, the traditional approaches only rely on temporal multi-image information to restore the lost detail?

Although this TV set has 960Hz backlight-strobe scanning + a 240Hz native panel, I sometimes still find that this engine's Clear+ motion interpolation handling is not perfect. For a very few 24 fps moving scenes, back-and-forth flicker or ghost images can still suddenly be observed.
pie1394 is offline   Reply With Quote
Old 9th July 2014, 06:54   #26835  |  Link
dansrfe
Registered User
 
Join Date: Jan 2009
Posts: 1,210
Is it possible for madVR to exclusively utilize the discrete GPU instead of the integrated Intel GPU?
dansrfe is offline   Reply With Quote
Old 9th July 2014, 11:50   #26836  |  Link
subz3ro
Registered User
 
Join Date: Jul 2014
Posts: 7
Is it only me, or does anybody else experience random crashes with madVR and smooth motion enabled when seeking through media in Media Player Classic? This is not happening with the 0.86.x version, only with 0.87.x.
subz3ro is offline   Reply With Quote
Old 9th July 2014, 13:13   #26837  |  Link
ryrynz
Registered User
 
ryrynz's Avatar
 
Join Date: Mar 2009
Posts: 3,650
Just you.. 0.87.x is rock solid.
ryrynz is offline   Reply With Quote
Old 9th July 2014, 13:34   #26838  |  Link
Anime Viewer
Troubleshooter
 
Anime Viewer's Avatar
 
Join Date: Feb 2014
Posts: 339
Quote:
Originally Posted by dansrfe View Post
Is it possible for madVR to exclusively utilize the discrete GPU instead of the integrated Intel GPU?
There are a few things you need to do, and that you can check, to make sure madVR is using your dGPU.

If you're using Media Player Classic Home Cinema, make sure to rename its executable from mpc-hc.exe to something else, like mpc-hc2.exe, mpc-hchd.exe or mpc-hcnv.exe (pretty much anything aside from the original name will work).

Go into the NVIDIA Control Panel (I'm guessing you have an NVIDIA/Intel system; if that's not the case you'll need to go into the driver settings for whatever GPU you have). Under Manage 3D Settings, switch to the Program Settings tab. Select Add and navigate to your renamed mpc-hc file. Change the "preferred graphics processor for this program" setting to the high-performance NVIDIA processor.

Open a video. You should notice an improvement in your render times. Exit the video and open your registry. Navigate to HKEY_CURRENT_USER\Software\madshi\madVR\OpenCL; if madVR has correctly detected your dGPU, it should be listed as a key inside that key, showing it's being used for OpenCL processing. If you don't see your GPU listed there, post again and let us know, and we can give you instructions for creating a key that will force it to be added.
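If you prefer to check that from a script instead of regedit, here is a quick sketch that lists whatever madVR has stored under the registry path mentioned above (Windows only; the exact subkeys and values kept there may differ between madVR versions):

Code:
# Quick sketch: list what is stored under madVR's OpenCL registry key.
# Windows only; uses the path from the post above. The exact subkeys/values
# kept there may vary between madVR versions.
import winreg

PATH = r"Software\madshi\madVR\OpenCL"

try:
    with winreg.OpenKey(winreg.HKEY_CURRENT_USER, PATH) as key:
        num_subkeys, num_values, _ = winreg.QueryInfoKey(key)
        for i in range(num_subkeys):
            print("subkey:", winreg.EnumKey(key, i))
        for i in range(num_values):
            name, data, _ = winreg.EnumValue(key, i)
            print("value:", name, "=", data)
except FileNotFoundError:
    print("Key not found - madVR has not written any OpenCL info yet?")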
__________________
System specs: Sager NP9150 SE with i7-3630QM 2.40GHz, 16 GB RAM, 64-bit Windows 10 Pro, NVidia GTX 680M/Intel 4000 HD optimus dual GPU system. Video viewed on LG notebook screen and LG 3D passive TV.
Anime Viewer is offline   Reply With Quote
Old 9th July 2014, 13:50   #26839  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
Quote:
Originally Posted by huhn View Post
I hope it comes with a proper HT room.
Not yet, but there's going to be a dedicated HT front projection room.

Quote:
Originally Posted by Anime Viewer View Post
Then is there another explanation for why render times would be higher when a 1080p video is playing full screen on a 1080p monitor with settings like Jinc or NNEDI3, compared to when the same video is playing on the same monitor with something like DXVA2 or bilinear set as the image upscaling algorithm?
Quote:
Originally Posted by huhn View Post
Do you change chroma too? If yes, there's your answer: all DVDs/BDs always need chroma upscaling.

Power states also make render times unreliable. That shouldn't apply to your case, but I normally get higher render times with 1080p than with 720p on a 1080p screen, because with 1080p the power state is lowest, so the render times look higher.

And are you sure your source is true 1080p and not cropped by 2-6 pixels?
^

madVR carefully checks the source and target resolutions and only does what is necessary. If no upscaling/doubling is needed, it is not performed at all. If you have higher GPU consumption with Jinc compared to Bilinear with a video that you think needs no scaling, there must be something wrong somewhere. Of course it could be a bug in madVR, but I rather think it's likely something else. Check the OSD to make sure source and target width/height *really* match perfectly.
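One detail worth keeping in mind here (and the reason huhn asks about chroma): in 4:2:0 sources the chroma planes are only a quarter of the luma resolution, so chroma upscaling is performed even for a 1080p video on a 1080p screen. A tiny sketch of the plane sizes involved:

Code:
# Tiny sketch: plane sizes for a 4:2:0 source. Even when luma needs no
# scaling (1080p video on a 1080p screen), the chroma planes still have to
# be upscaled to the full output resolution.

def planes_420(width, height):
    luma = (width, height)
    chroma = (width // 2, height // 2)  # 4:2:0: chroma halved in both axes
    return luma, chroma

luma, chroma = planes_420(1920, 1080)
print("luma plane:  ", luma)    # (1920, 1080) -> no image upscaling needed
print("chroma plane:", chroma)  # (960, 540)   -> always upscaled to 1920x1080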

Quote:
Originally Posted by xabregas View Post
I notice that at 24Hz smooth motion is off, but I notice some out-of-sync issues, though not by much. Should I use VideoClock or ReClock?

The out-of-sync issues get worse when I watch 25fps TV shows at 50Hz; smooth motion is off but the audio goes out of sync.
The only way you should get out-of-sync problems with madVR is if you're constantly dropping lots of frames. If that's not the case and you *still* get sync problems, then it must be something really strange, like your GPU driver or audio renderer messing things up, or something like that. Not sure what to recommend here. Maybe double check with other renderers just to make sure the problem is isolated to madVR. Try other audio renderers. Try disabling the "present several frames in advance" feature.

Quote:
Originally Posted by pie1394 View Post
According to a comment from the chief engineer of Sony's DRC team, a material library of thousands of video image patterns is built into the XCA7 engine. I don't know the details of how the process is done, but I guess it could be a pattern-matching process, similar to image recognition, that identifies the image content's characteristics. If I'm not wrong, the traditional approaches only rely on temporal multi-image information to restore the lost detail?
I don't really know. But I rather think dithering should not harm the Sony DRC algorithm more than the quantization noise you'd get from *not* dithering.
madshi is offline   Reply With Quote
Old 9th July 2014, 17:59   #26840  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,346
One thing that was never all that clear to me: if I use NNEDI to scale above the target size (luma only), does it also always double chroma (with the image scaling option), or does it scale chroma to the output target directly?
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
nevcairiel is offline   Reply With Quote