Old 4th July 2014, 17:39   #26801  |  Link
Shiandow
Registered User
 
Join Date: Dec 2013
Posts: 752
Quote:
Originally Posted by Soukyuu View Post
Is there any way to back up madvr settings from the config tool, or do I have to backup the registry keys?
You can also back up the settings.bin file in the madVR folder.
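If you want to script it, a rough backup sketch (the folder paths and the registry key below are assumptions -- adjust them to your own install):

Code:
# Rough sketch: back up madVR settings. Paths and registry key are assumptions.
import shutil
import subprocess
from pathlib import Path

MADVR_DIR = Path(r"C:\madVR")            # wherever madVR is installed
BACKUP_DIR = Path(r"C:\backup\madVR")
BACKUP_DIR.mkdir(parents=True, exist_ok=True)

# 1) settings.bin (used when no registry settings are present)
settings = MADVR_DIR / "settings.bin"
if settings.exists():
    shutil.copy2(settings, BACKUP_DIR / "settings.bin")

# 2) export the registry branch ("reg export" ships with Windows; key name is an assumption)
subprocess.run(["reg", "export", r"HKCU\Software\madshi\madVR",
                str(BACKUP_DIR / "madvr-settings.reg"), "/y"], check=False)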
Shiandow is offline   Reply With Quote
Old 4th July 2014, 18:04   #26802  |  Link
Soukyuu
Registered User
 
Soukyuu's Avatar
 
Join Date: Apr 2012
Posts: 169
Thanks, Shiandow. I didn't consider it at first because I thought madVR preferred the registry entries over settings.bin, but since it deletes them when resetting, it works fine.
It doesn't seem to be a settings issue. I have also tried disabling my second monitor (the only thing that has changed hardware-wise), with no change. I guess I will have to test older mpc-hc/lav versions.

edit: well, I'm lost. I even tried older madVR versions, and nothing helps. Any suggestions on where to look for a solution? I tried playing around with queue lengths; increasing them doesn't help. The stats do show that the upload queues are stuck at 1-2/x. Could a Windows update have caused this? Did anyone else notice GPU performance degradation? Maybe mine is just old enough that the degradation is this obvious?

edit2: never mind, I found out why. From Wikipedia's Hyper-V article:
Quote:
On CPUs without Second Level Address Translation, installation of most WDDM accelerated graphics drivers on the primary OS will cause a dramatic drop in graphic performance. This occurs because the graphics drivers access memory in a pattern that causes the Translation lookaside buffer to be flushed frequently.
The interesting thing is that according to system info, my CPU does support SLAT! Still, disabling Hyper-V solves the issue. Thanks, Microsoft!
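If anyone wants to double-check their own machine, a quick sketch that pulls the SLAT line out of systeminfo (Windows-only; the exact wording varies by Windows version, and the Hyper-V requirement lines disappear once the Hyper-V role is installed):

Code:
# Sketch: look for the SLAT line in systeminfo's "Hyper-V Requirements" block.
# Note: systeminfo only prints these lines when the Hyper-V role is NOT installed.
import subprocess

out = subprocess.run(["systeminfo"], capture_output=True, text=True).stdout
for line in out.splitlines():
    if "Second Level Address Translation" in line:
        print(line.strip())   # e.g. "Second Level Address Translation: Yes"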
__________________
AMD Phenom II X4 970BE | 12GB DDR3 | nVidia 260GTX | Arch Linux / Windows 10 x64 Pro (w/ calling home shut up)

Last edited by Soukyuu; 4th July 2014 at 18:45.
Soukyuu is offline   Reply With Quote
Old 5th July 2014, 00:13   #26803  |  Link
Osjur
Registered User
 
Osjur's Avatar
 
Join Date: Oct 2010
Posts: 5
I don't know if this is a bug, but if I untick "don't rerender frames when fade in/out is detected" I get about 7-10 dropped frames each time there is a black fade. It makes some files unwatchable when there's a lot of fast movement and black scene changes.

This happens in both windowed and FSE mode. Max rendering times are around 7 ms, so that shouldn't be the culprit.
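For reference, the arithmetic behind that claim -- a quick sketch (refresh rates are just examples):

Code:
# Quick sketch: per-refresh render budget vs. ~7 ms max rendering times.
for hz in (23.976, 24, 50, 59.94, 60):
    print(f"{hz:>6} Hz refresh -> {1000 / hz:5.1f} ms budget per frame")
# Even at 60 Hz that's ~16.7 ms per refresh, so 7 ms max render times leave plenty of headroom.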

PS: My GPU is an R9 290X, so there should be plenty of power to run madVR.

EDIT: Found out that if I untick artifact removal, I don't get dropped frames anymore. Now the question is why those frame drops don't show up as high frame times in the max stats (5s) window?

Last edited by Osjur; 5th July 2014 at 00:21.
Osjur is offline   Reply With Quote
Old 5th July 2014, 00:25   #26804  |  Link
Shiandow
Registered User
 
Join Date: Dec 2013
Posts: 752
That's more or less expected behaviour. As far as I understand it, it takes a few frames before madVR realises that a fade in/out is happening, so what "rerender frames when fade in/out is detected" does is go back a few frames whenever a fade is detected and rerender them, to prevent banding on the first few frames of the fade. If this is causing issues, it's probably best to leave it off.
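Purely as an illustration of that idea (this is not madVR's actual code, just a toy sketch of "buffer the last few frames and render them again once a fade is recognised"):

Code:
# Toy sketch only: keep the last few presented frames in a small buffer; once a fade is
# recognised, render those buffered frames again with the stronger "fade in/out" debanding
# strength. The extra render passes on already-processed frames are the likely source of drops.
from collections import deque

LOOKBACK = 4   # hypothetical number of frames to go back over

def render(frame, fade=False):
    return f"rendered {frame} with {'fade' if fade else 'default'} debanding"

def play(frames, is_fade_frame):
    recent = deque(maxlen=LOOKBACK)
    for f in frames:
        if is_fade_frame(f) and recent:
            for old in recent:              # re-render frames that were already processed
                render(old, fade=True)
        recent.append(f)
        yield render(f, fade=is_fade_frame(f))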
Shiandow is offline   Reply With Quote
Old 5th July 2014, 00:27   #26805  |  Link
mindbomb
Registered User
 
Join Date: Aug 2010
Posts: 578
Try disabling AMD Radeon's PowerPlay with MSI Afterburner. It's in Settings > General > AMD compatibility options > unofficial overclocking mode > without PowerPlay. You also have to have an overclock or underclock active, so you can just increase or decrease the core frequency by 1 MHz. This has solved a lot of my problems in general.
mindbomb is offline   Reply With Quote
Old 5th July 2014, 07:38   #26806  |  Link
panetesan2k6
Registered User
 
Join Date: Jan 2014
Location: Latveria
Posts: 29
Quote:
Originally Posted by Asmodian View Post
Think of the limited settings as "Compress Full Range to Limited" and you always start with full range RGB (madVR's native format if you will).

If you want to use limited you only want to use it in one place. madVR's conversion might be better than the GPU's, but then only the video is correct while everything else (desktop, etc.) is clipped.

If you can run full range you don't have to worry and can simply set everything to full range.
Thanks for your answer. I guess it sums up to this:

madVR works natively in full-range RGB, but when you select "PC levels" output it intentionally cuts BTB and WTW (as madshi explains here), so the best/preferable scenario is to use a display that can handle full-range RGB. This won't give us BTB and WTW, but that's OK. With a display that can't handle full-range RGB, the signal must be "squeezed" to 16-235, either by madVR (set "TV levels" in madVR and accept that desktop whites and blacks are clipped) or by the GPU (set "RGB Limited" in the driver and leave madVR at "PC levels", which keeps both desktop and video at correct values). I hope I finally got it right.
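For what it's worth, the "squeeze" itself is just a linear remap of the 8-bit code values, whoever performs it -- a minimal sketch, illustrative only:

Code:
# Minimal sketch of the full (0-255) -> limited (16-235) remap for an 8-bit RGB value.
def full_to_limited(v: int) -> int:
    return round(16 + v * 219 / 255)

print(full_to_limited(0), full_to_limited(255))  # -> 16 235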


Now I have a question regarding luma, chroma and the RGB signal: my TV shows the 0-255 luma values fed to it, calling this "Full RGB", but it still subsamples chroma to 4:2:2 unless I label the HDMI3 input as "PC"; then it keeps chroma at 4:4:4. So I started wondering: can we say an RGB signal is really "full" if chroma is subsampled to 4:2:2? Is it possible that what my TV calls "full RGB" is actually a way of squeezing the 0-255 luma fed to it to 16-235 while leaving chroma out of the process? In that case I think I should feed my TV a 16-235 signal to avoid that last step of picture processing... What do you guys think?
panetesan2k6 is offline   Reply With Quote
Old 5th July 2014, 09:22   #26807  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 6,063
Quote:
Originally Posted by panetesan2k6 View Post
Now I have a question regarding luma, chroma and the RGB signal: my TV shows the 0-255 luma values fed to it, calling this "Full RGB", but it still subsamples chroma to 4:2:2 unless I label the HDMI3 input as "PC"; then it keeps chroma at 4:4:4. So I started wondering: can we say an RGB signal is really "full" if chroma is subsampled to 4:2:2? Is it possible that what my TV calls "full RGB" is actually a way of squeezing the 0-255 luma fed to it to 16-235 while leaving chroma out of the process? In that case I think I should feed my TV a 16-235 signal to avoid that last step of picture processing... What do you guys think?
Just use PC mode, or test it with a black clipping clip.

if your TV is doing this:

Quote:
Is it possible that what my TV calls "full RGB" is actually a way of squeezing the 0-255 luma fed to it to 16-235 while leaving chroma out of the process?
it should be thrown into the sun right now.

It does an RGB -> YCbCr 4:2:2 conversion, but that involves both chroma and luma; it's not leaving chroma out of the process.
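If you don't have a black clipping clip handy, here's a rough way to generate a near-black test pattern as a plain PGM image (no extra libraries; the exact levels are arbitrary):

Code:
# Sketch: write a grayscale black-clipping pattern as a plain PGM (vertical bars at
# 8-bit levels 0, 4, 8, ... 32). On a true full-range ("PC") input every bar above 0
# should be faintly distinguishable; if the set treats the input as limited (16-235),
# the bars at 16 and below merge into one black block.
levels = list(range(0, 33, 4))
bar_w, h = 120, 240
w = len(levels) * bar_w

with open("black_clipping.pgm", "wb") as f:
    f.write(f"P5 {w} {h} 255\n".encode("ascii"))
    row = bytes(levels[x // bar_w] for x in range(w))
    f.write(row * h)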
huhn is offline   Reply With Quote
Old 5th July 2014, 09:29   #26808  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,773
Limited and full are independent of chroma subsampling. However, you cannot have chroma subsampling and be RGB, so your TV must be converting to YCbCr. Whether it converts to limited or full YCbCr will be harder to check.
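For what it's worth, limited vs. full YCbCr only differs in how the values are quantised after the matrix (BT.709 here) -- a sketch, illustrative only:

Code:
# Sketch: BT.709 R'G'B' in 0..1 -> 8-bit Y'CbCr, quantised either to limited range
# (Y 16-235, Cb/Cr 16-240) or to full range (0-255).
def _q(x):
    return max(0, min(255, round(x)))    # round and clamp to 8 bit

def rgb_to_ycbcr(r, g, b, limited=True):
    y  = 0.2126 * r + 0.7152 * g + 0.0722 * b     # BT.709 luma coefficients
    cb = (b - y) / 1.8556
    cr = (r - y) / 1.5748
    if limited:
        return _q(16 + 219 * y), _q(128 + 224 * cb), _q(128 + 224 * cr)
    return _q(255 * y), _q(128 + 255 * cb), _q(128 + 255 * cr)

print(rgb_to_ycbcr(1.0, 1.0, 1.0, limited=True))   # (235, 128, 128)
print(rgb_to_ycbcr(1.0, 1.0, 1.0, limited=False))  # (255, 128, 128)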
Asmodian is offline   Reply With Quote
Old 5th July 2014, 14:22   #26809  |  Link
mysterix
Registered User
 
Join Date: May 2014
Location: Ukraine
Posts: 25
Does any utility exist that can show the percentage of video memory bandwidth usage on AMD cards (particularly on my HD5450)? I know such a utility definitely exists for Nvidia cards, but what about AMD? GPU-Z does not show that parameter for AMD cards...

Last edited by mysterix; 5th July 2014 at 14:27.
mysterix is offline   Reply With Quote
Old 6th July 2014, 01:43   #26810  |  Link
panetesan2k6
Registered User
 
Join Date: Jan 2014
Location: Latveria
Posts: 29
Quote:
Originally Posted by Asmodian View Post
Limited and full are independent of chroma subsampling. However, you cannot have chroma subsampling and be RGB, so your TV must be converting to YCbCr.
Thanks for that, it's the conclusion I was looking for.

Quote:
Whether it converts to limited or full YCbCr will be harder to check.
I'll try to check this and will report back.
panetesan2k6 is offline   Reply With Quote
Old 6th July 2014, 09:21   #26811  |  Link
Qaq
AV heretic
 
Join Date: Nov 2009
Posts: 422
Quote:
Originally Posted by mysterix View Post
Does any utility exist that can show the percentage of video memory bandwidth usage on AMD cards (particularly on my HD5450)? I know such a utility definitely exists for Nvidia cards, but what about AMD? GPU-Z does not show that parameter for AMD cards...
The 5450 is just very limited in this regard, and too weak for madVR in general. If you prefer fanless AMD cards you may want to upgrade to a fanless 7750 (like I did) to get more power for madVR and more sensors in GPU-Z.
Qaq is offline   Reply With Quote
Old 7th July 2014, 00:59   #26812  |  Link
xabregas
Registered User
 
Join Date: Jun 2011
Posts: 119
Hi, I just bought an Nvidia 750 Ti along with a 1080p smart TV, and I'm having several presentation glitches when I play a 1080p movie at 1080p resolution. While I don't see any glitches at all, madVR's Ctrl+J stats show many...

The problem doesn't seem to happen while playing 720p movies upscaled to 1080p with Jinc 3 AR on chroma and luma. The glitches only appear when I play 1080p movies (which don't need any upscaling or downscaling).

I use Full screen exclusive mode...

My guess is that if the GPU doesn't have to upscale or downscale, it won't be used, and fullscreen exclusive mode needs the GPU to be doing something?

TIA
xabregas is offline   Reply With Quote
Old 7th July 2014, 02:24   #26813  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 6,063
You can disable FSE mode.

Even with a 1080p source on a 1080p screen there is a lot to do:
chroma upscaling, dithering.
huhn is offline   Reply With Quote
Old 7th July 2014, 05:26   #26814  |  Link
Anime Viewer
Troubleshooter
 
Anime Viewer's Avatar
 
Join Date: Feb 2014
Posts: 333
Quote:
Originally Posted by xabregas View Post
Hi, I just bought an Nvidia 750 Ti along with a 1080p smart TV, and I'm having several presentation glitches when I play a 1080p movie at 1080p resolution. While I don't see any glitches at all, madVR's Ctrl+J stats show many...

The problem doesn't seem to happen while playing 720p movies upscaled to 1080p with Jinc 3 AR on chroma and luma. The glitches only appear when I play 1080p movies (which don't need any upscaling or downscaling).

I use Full screen exclusive mode...

My guess is that if the GPU doesn't have to upscale or downscale, it won't be used, and fullscreen exclusive mode needs the GPU to be doing something?

TIA
Even when it's not supposed to do upscaling (1080 -> 1080) it still tries to, and uses up resources. I recommend you create profile rules. For the 1080p rule choose DXVA2, which for all intents and purposes will disable all the other upscaling options and give you the highest probability of glitch-free playback at that resolution.
__________________
System specs: Sager NP9150 SE with i7-3630QM 2.40GHz, 16 GB RAM, 64-bit Windows 10 Pro, NVidia GTX 680M/Intel 4000 HD optimus dual GPU system. Video viewed on LG notebook screen and LG 3D passive TV.
Anime Viewer is offline   Reply With Quote
Old 7th July 2014, 06:12   #26815  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,773
Quote:
Originally Posted by Anime Viewer View Post
Even when it's not supposed to do upscaling (1080 -> 1080) it still tries to, and uses up resources. I recommend you create profile rules. For the 1080p rule choose DXVA2, which for all intents and purposes will disable all the other upscaling options and give you the highest probability of glitch-free playback at that resolution.
What? This isn't true. DXVA2 will be fast and OK quality, but there isn't any "it still tries to, and uses up resources" going on. Chroma upscaling, conversion to RGB, debanding (if enabled), smooth motion (if enabled), and dithering still need to be done for 1080p displayed at 1920x1080.

@xabregas
The 750 Ti should be fine for 1080p at 1920x1080 with Jinc3 AR for chroma scaling. Have you changed the dithering options or smooth motion? How about using Bicubic 75 + AR for chroma scaling?

Also try with Full screen exclusive disabled and Windowed Overlay enabled.

Which motherboard do you have? madVR likes bandwidth, so if you are on something very old there might be an issue there.
Asmodian is offline   Reply With Quote
Old 7th July 2014, 06:35   #26816  |  Link
vivan
/人 ◕ ‿‿ ◕ 人\
 
Join Date: May 2011
Location: Russia
Posts: 649
Quote:
Originally Posted by Anime Viewer View Post
Even when it's not supposed to do upscaling (1080 -> 1080) it still tries to, and uses up resources.
What? No. You can easily prove this by choosing a heavy upscaling algorithm and comparing the rendering times at 100% zoom and at a higher zoom (like View -> Video Frame -> Zoom 1).

Quote:
Originally Posted by Anime Viewer View Post
For the 1080p rule choose DXVA2, which for all intents and purposes will disable all the other upscaling options and give you the highest probability of glitch-free playback at that resolution.
It shouldn't change anything. However, even if it did, it would only harm quality, since using the DXVA options drops precision to 8 bit, which leads to banding. I would recommend never using them unless you're on an Intel GPU and very limited on resources.
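A quick illustration of the 8-bit banding point (just a toy calculation, not madVR internals):

Code:
# Toy calculation: a smooth near-black ramp that only spans a couple of 8-bit code values
# collapses into a handful of flat steps when precision is dropped to 8 bit early.
ramp = [0.06 + i / 999 * 0.02 for i in range(1000)]        # gentle gradient, 1000 samples
steps = sorted(set(round(v * 255) for v in ramp))
print(len(steps), "distinct 8-bit values:", steps)          # ~6 values -> visible bands
# madVR normally keeps higher precision through the chain and only dithers down at the end.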

Last edited by vivan; 7th July 2014 at 06:38.
vivan is offline   Reply With Quote
Old 7th July 2014, 10:55   #26817  |  Link
innocenat
Registered User
 
innocenat's Avatar
 
Join Date: Dec 2011
Posts: 77
Hello,

Not sure if this has been reported: madVR doesn't work for me (it only renders the first frame and then dies [not crashing]) if I set it to use the dedicated graphics instead of the integrated graphics in an Optimus setup.

- madVR 0.82.10
- Intel i7-4710HQ (mobile processor)
- nVidia GTX850M

It works fine if I set it to use the integrated graphics only, but that would be a waste.

I am on the latest nVidia driver (337.88). I used Optimus before with an i7-2630QM + GT 550M and it worked fine running on the dedicated GPU with the same driver/madVR version.
__________________
AviSynth+
innocenat is offline   Reply With Quote
Old 7th July 2014, 14:56   #26818  |  Link
kopija
Registered User
 
Join Date: May 2012
Posts: 34
I have just upgraded from 0.8.6.11 to the newest version and am getting GPU driver crashes.
I use madVR primarily because I can watch movies and fold at the same time, thanks to the FSE mode.
Even though F@H consumes 95-98% of my 7870, I could use Jinc3 AR and SM on 0.8.6.11 while folding simultaneously.
Not so with the newest version: even lowly Bicubic causes driver restarts. I do not use image doubling or error diffusion.
Since F@H uses OpenCL, I suspect it could be fighting with madVR for OpenCL resources somehow.

So, basically, my question is: which madVR functions use OpenCL? Apart from NNEDI, of course.

Thanks for your answers.
kopija is offline   Reply With Quote
Old 7th July 2014, 15:15   #26819  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 6,063
@kopija:

NNEDI3, error diffusion, and (under "general settings") "use OpenCL to process DXVA NV12 surfaces".

The newer versions got some tweaks for Optimus systems; it could be a problem with that.

But it works totally fine without F@H (whatever that is), right?

Otherwise: http://bugs.madshi.net/my_view_page.php - feel free to add an entry.
madshi is currently working on other things, so it's better to add it there to make sure he will look at it later on.


edit: @innocenat

Please try a newer version of madVR (0.82.10 is ancient) and report back.

Last edited by huhn; 7th July 2014 at 15:17.
huhn is offline   Reply With Quote
Old 8th July 2014, 05:39   #26820  |  Link
generalmx
Registered User
 
Join Date: Aug 2008
Posts: 11
My results with R9 290X and various observations

First I'd like to thank madshi and the community for such an excellent product.

I have a watercooled R9 290X that's pretty consistent with its clocks (no overclock yet, stable at 1000MHz) and even this beast of a card can't handle everything without the occasional frame drop.

Specs
Windows 8.1 (Update 1) 64-bit (Pro w/ WMC)
Haswell Xeon 3.5GHz (w/ HT) 16GB DDR3 RAM
AMD Radeon R9 290X 4GB GDDR5
AMD Driver version: 14.16 (no difference with 14.14)

madVR version: 0.87.10
Software Used: MPC-HC "Lite" (*), MPC-BE (*), LAV Filters 0.62.0, ReClock 1.8.8.4
* - All internal filters disabled.

madVR & ReClock Configuration
Code:
ReClock
- WASAPI Exclusive
- Assume source is 30 FPS or Refresh Rate / 2
- Pre-Buffer: 100ms
- Quality: Best Sinc Interpolation
- Added MPC-BE to ReClock's "load always" since it's not added by default.
(all other defaults)

madVR 0.87.10 Configuration
(If not mentioned, option is disabled.)

Deinterlacing: 
- Automatically activate deinterlacing when needed
-- if in doubt, deactivate deinterlacing
- Only look at pixels in the frame center.

artifact removal:
- reduce banding artifacts
-- default debanding strength: low
-- strength during fade in/out: medium

Scaling Algorithms (All)
-- image downscaling: Catmull-Rom w/ Anti-Ringing Filter + Scale in Linear Light

Scaling Algorithms: <=360p
-- image upscaling: Jinc, 3 taps w/ Anti-Ringing Filter
-- chroma upscaling: NNEDI3, 256 neurons
-- image doubling: Always Double Luma, 64 neurons + Always Double Chroma, 16 neurons (*)

Scaling Algorithms: 360p<=480p
-- image upscaling: Jinc, 3 taps w/ Anti-Ringing Filter
-- chroma upscaling: NNEDI3, 128 neurons
-- image doubling: Always Double Luma, 32 neurons + Always Double Chroma, 16 neurons (*)

Scaling Algorithms: 480p<=720p
-- image upscaling: Jinc, 3 taps w/ Anti-Ringing Filter
-- chroma upscaling: NNEDI3, 32 neurons
-- image doubling: Always Double Luma, 16 neurons + Always Double Chroma, 16 neurons (*)

Scaling Algorithms: 720p<=1080p
-- image upscaling: DXVA2
-- chroma upscaling: NNEDI3, 16 neurons
-- image doubling: (None)

* Note: AMD interop hack doesn't seem to make a difference for my tests.

General Settings
- Delay playback start until render queue is full + delay playback start after seeking, too.
- Use OpenCL to process DXVA NV12 surfaces.
- Use separate device for DXVA processing.
- CPU/GPU queue size: 24

windowed mode settings:
- 16 frames in advance
- (rest are set to defaults)

trade quality for performance: ALL disabled

Smooth Motion
- Enable smooth motion frame rate conversion 
-- ...or if the display refresh rate is an exact multiple of the movie frame rate.

Dithering (< 720p): Ordered Dithering
Dithering (>= 720p): Error Diffusion #2

Monitor
HP LP2465 (1920x1200 S-PVA panel)
- Properties: PC Levels (0-255), 8-bit (or higher)
- Calibration: Disabled + Disable GPU gamma ramps.
- Display Modes: (None)
- Color & Gamma: 0/disabled
AMD Catalyst Configuration
- Enable AMD Video Quality features in selected video player applications: UNCHECKED (disabled)
- Enforce Smooth Video Playback: UNCHECKED (disabled)

720p Hi10P Average Stats:
- Rendering (madVR): 28ms
- Interop (madVR): 10ms
- Present (madVR): 0.10ms
- GPU Utilization (HWiNFO): 58% (*)
- D3D Usage (HWiNFO): 23% (*)
- Time Spent with very high (>90%) GPU Usage: 28% (*)
Note: No dropped frames over this report.

(*) I have a super-fancy spreadsheet I made to calculate all of this.
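(Nothing fancy is actually required -- roughly this, assuming a HWiNFO CSV log; the column name below is made up, so match it to your own log header:)

Code:
# Sketch: average GPU load and "time above 90%" from a HWiNFO CSV log.
import csv

COLUMN = "GPU Core Load [%]"   # made-up column name; use whatever your log calls it
with open("hwinfo_log.csv", newline="", encoding="utf-8", errors="replace") as f:
    samples = [float(row[COLUMN]) for row in csv.DictReader(f) if row.get(COLUMN, "").strip()]

if samples:
    print(f"average GPU load: {sum(samples) / len(samples):.0f}%")
    print(f"time above 90%:   {100 * sum(s > 90 for s in samples) / len(samples):.0f}% of samples")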

I have tried a large number of different settings from suggestions online and in this thread, and so far these have proven the best -- however, there is plenty of source material (I'm mostly testing on anime) that trips them up. Here's some especially weird stuff:

* My settings don't seem to like ReClock set to 24 FPS (for ~24 FPS sources); it causes moderately to significantly more dropped frames. I have two monitors connected as an "Extended" desktop: a 24" 1200p S-PVA workstation monitor and a crappy 32" 1080p TN HDTV. Both can support 24Hz natively, especially the 1080p one, though I must use CRU to add 24Hz for the 1200p one. However, even the 1080p set to 24Hz causes significantly more dropped frames (same with 48Hz). Just to check whether the problem was using windowed mode with two monitors running at different resolutions, I set both monitors to 1080p@24 and the same problem occurred. But I'm guessing the reason it's totally unplayable when Monitor #2 (1080p, secondary) is set to 1080p@24 while Monitor #1 (1200p, focus) is still at 1200p@60 is how Windows handles refresh rate differences like that.

* Full-Screen Exclusive mode gives moderately to significantly more dropped frames for me, and can introduce artifacts. Again, note that I can't get the EDID for Monitor #2 (1080p), which is using a generic PnP driver, and Monitor #1 is a 1200p workstation monitor that doesn't support 24Hz (or 48Hz) without hackery.

Hence I've settled on ReClock with 30 FPS or Refresh Rate / 2, which requires Smooth Motion enabled or else I get significant judder on anime pans.
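For anyone curious, the "exact multiple" condition behind that is easy to sanity-check yourself; a small sketch (the tolerance is my own arbitrary choice):

Code:
# Sketch: is the display refresh rate an exact multiple of the movie frame rate?
# If not (e.g. 23.976 fps on a 60 Hz display), frames get repeated unevenly -> judder,
# which is what smooth motion's frame blending hides.
def is_exact_multiple(refresh_hz, fps, tol=0.001):
    ratio = refresh_hz / fps
    return abs(ratio - round(ratio)) < tol

print(is_exact_multiple(60, 23.976))   # False -> 3:2-style cadence, judder on pans
print(is_exact_multiple(60, 29.97))    # False (60 / 29.97 = 2.002), close but not exact
print(is_exact_multiple(59.94, 29.97)) # True  -> each frame shown exactly twice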

Note on Pan & Scan in Anime: I've found certain sources -- say, HorribleSubs -- often have low bitrate sources that will have problems on pan&scan no matter what you do; only outright frame-doubling really helps here (SVP, CUVID, etc.).

On Windows Power Profile: I'd like to stress that you should be using "High Performance" if you're having any sort of trouble, NOT just changing settings under "Balanced". Process Lasso is an example of a program that will set the power profile for you based on what programs are running.

Last edited by generalmx; 8th July 2014 at 05:40. Reason: Formatting.
generalmx is offline   Reply With Quote