Old 4th July 2014, 01:38   #26781  |  Link
Shiandow
Registered User
 
Join Date: Dec 2013
Posts: 753
Dithering is the only part of the process that has to run on every output frame. Usually dithering is so fast that enabling smooth motion doesn't cause a significant increase in rendering times. However, the error diffusion algorithms are slower than the other dithering options, so if you use error diffusion and smooth motion at the same time you might see a noticeable increase in rendering times.

You should be able to play back videos even if the rendering time is higher than the v-sync interval. The dithering algorithms are simply the only part whose rendering times increase when smooth motion is enabled, which just means you should measure the rendering times while smooth motion is active. It doesn't mean you need to keep the total rendering time below the v-sync interval; you should be fine as long as it stays below the total time per source frame (30-40 ms).
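To put numbers on that, here is a hypothetical little Python sketch (the frame rate, refresh rate and rendering time are just example values):

Code:
# Back-of-the-envelope check: is a measured rendering time fast enough?
# All numbers below are made-up examples.

video_fps  = 23.976   # source frame rate
refresh_hz = 60.0     # display refresh rate
render_ms  = 30.0     # measured rendering time with smooth motion enabled

vsync_interval_ms = 1000.0 / refresh_hz   # ~16.7 ms between v-syncs
frame_budget_ms   = 1000.0 / video_fps    # ~41.7 ms per source frame

print("v-sync interval: %.1f ms" % vsync_interval_ms)
print("frame budget:    %.1f ms" % frame_budget_ms)

# With smooth motion, the rendering time only has to stay below the
# per-frame budget, not below the much shorter v-sync interval.
print("OK" if render_ms < frame_budget_ms else "too slow, expect dropped frames")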
Old 4th July 2014, 02:47   #26782  |  Link
Osjur
Registered User
 
 
Join Date: Oct 2010
Posts: 5
Thanks for the clarification... so I can use a lot more neurons in my profiles.

Is there actually even a point in using error diffusion dithering if you have a true 10-bit display and an AMD card with 10 bpc color depth enabled?

I just can't see any difference when comparing ordered dithering vs. error diffusion 1.
Old 4th July 2014, 03:57   #26783  |  Link
XMonarchY
Guest
 
Posts: n/a
I just finished playing Assassin's Creed 4: Black Flag using 4x TXAA with LumaSharpen, which ended up being a perfect combo! Nothing gets rid of aliasing, especially the temporal aliasing and shimmering that occur in motion, better than TXAA. Even with 4x SSAA (without SMAA), leaves, grass, and similar foliage still show aliasing that is very visible as you move. Adding SMAA to 4x SSAA makes it much better, but 4x SSAA + SMAA has a significant performance impact. 4x TXAA is not as much of a resource hog as 4x SSAA + SMAA, but it's blurry as hell! This is where LumaSharpen comes in. With the right settings it gets rid of all the blur, and unlike some people report, it doesn't offset or negate the actual anti-aliasing effect, or at least not most of it; it merely sharpens those few surfaces that had some visible aliasing even without LumaSharpen when 4x TXAA was enabled. I can post some comparison screenshots if you don't believe me! 4x TXAA + LumaSharpen is by far the best combination out there for games that produce a ton of temporal aliasing with FXAA, SMAA, MSAA, CSAA, and sometimes even SSAA!

How is this relevant to madVR? LumaSharpen is used not only for games but also for video rendering, although it is not an actual madVR feature. Another example: GeDoSaTo, a tool made for games like Dark Souls 2 with the intent of adding graphics features such as DoF, SSAO, and SMAA/FXAA, can be set to use Lanczos to downsample from a resolution like 4K to a lower one like 1080p. Lanczos scaling is also present in madVR!

So... I was wondering whether the techniques madVR uses to improve video image quality could be applied to other applications, like video games? And what about vice versa: could video game / video card rendering features like TXAA be applied to madVR to further improve video image quality? NNEDI3 + video games would be quite something, but I doubt it's feasible.

Last edited by XMonarchY; 4th July 2014 at 04:21.
Old 4th July 2014, 04:04   #26784  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
Quote:
Originally Posted by Osjur View Post
Is there actually even a point in using error diffusion dithering if you have a true 10-bit display and an AMD card with 10 bpc color depth enabled?

I just can't see any difference when comparing ordered dithering vs. error diffusion 1.
madVR doesn't do 10-bit at all; it will only output 8-bit RGB. That said, ordered dithering is very good. I do think error diffusion is better, but the difference is very minor.

It also depends on your display; if you don't notice the difference, use ordered dithering.

A fun test is to set the display bitdepth to 2-bit in madVR and switch between the dithering options; make sure to turn off the trade-quality-for-performance option "don't use linear light for dithering".
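If you want to see roughly what that test exaggerates, here is a hypothetical NumPy sketch; these are the textbook ordered (Bayer) and Floyd-Steinberg error diffusion dithers, not madVR's actual algorithms, but the difference in banding at 2-bit is the same idea:

Code:
import numpy as np

levels = 3  # 2-bit output: 4 levels (0..3)
gradient = np.tile(np.linspace(0.0, 1.0, 256), (64, 1))  # smooth 64x256 ramp

# 1) plain quantization to 2-bit: heavy banding
banded = np.round(gradient * levels) / levels

# 2) ordered dithering with a 4x4 Bayer threshold matrix
bayer = np.array([[ 0,  8,  2, 10],
                  [12,  4, 14,  6],
                  [ 3, 11,  1,  9],
                  [15,  7, 13,  5]]) / 16.0
thresholds = np.tile(bayer, (16, 64))
ordered = np.floor(gradient * levels + thresholds) / levels

# 3) Floyd-Steinberg error diffusion
work = gradient.copy()
diffused = np.zeros_like(work)
h, w = work.shape
for y in range(h):
    for x in range(w):
        q = np.clip(np.round(work[y, x] * levels), 0, levels) / levels
        err = work[y, x] - q
        diffused[y, x] = q
        if x + 1 < w:
            work[y, x + 1] += err * 7 / 16
        if y + 1 < h:
            if x > 0:
                work[y + 1, x - 1] += err * 3 / 16
            work[y + 1, x] += err * 5 / 16
            if x + 1 < w:
                work[y + 1, x + 1] += err * 1 / 16

# Compare how well the column averages preserve the original ramp.
for name, img in (("no dither", banded), ("ordered", ordered), ("error diffusion", diffused)):
    mean_err = np.mean(np.abs(img.mean(axis=0) - gradient[0]))
    print("%-16s mean error of column average: %.4f" % (name, mean_err))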
Old 4th July 2014, 06:09   #26785  |  Link
panetesan2k6
Registered User
 
Join Date: Jan 2014
Location: Latveria
Posts: 29
Quote:
Originally Posted by DarkSpace View Post
This looks wrong. Assuming that your TV can't handle full-range RGB, you should set it to either
ATI CCC: pixel format: RGB 4:4:4 (Limited RGB)
madVR: RGB output levels: PC levels (0-255)
(which is the general-purpose solution), or to
ATI CCC: pixel format: RGB 4:4:4 (Full RGB)
madVR: RGB output levels: TV levels (16-235)
(which is probably the preferred solution if you care only about your videos displaying correctly - be warned: anything other than madVR will display incorrectly, with clipped colors).

In your current setup, madVR squeezes the 0-255 RGB data into the 16-235 range, and later, your video driver squeezes that 16-235 data, which it thinks is 0-255 data, into 16-235 again!
Hi.

I don't understand why there never seems (to me, at least) to be a logical correlation between the settings. For example, if a TV can't handle full range, why set madVR's output to 0-255? As you describe it, the madVR output setting would make sense if you couldn't set the output range in the graphics card driver, but with both options available they overlap each other.

My TV, for example, can handle full RGB. So my ATI CCC is set to 4:4:4 Full RGB and madVR is set to 0-255. Is that wrong?
Old 4th July 2014, 07:52   #26786  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
Think of the limited-range settings as "compress full range to limited"; you always start with full-range RGB (madVR's native format, if you will).

If you want to use limited range, you only want to apply the compression in one place. madVR's conversion might be better than the GPU's, but then only madVR's output is correct, while everything else (desktop, etc.) is clipped.

If you can run full range you don't have to worry; simply set everything to full range.
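The arithmetic behind "only squeeze once", as a hypothetical sketch (8-bit values, rounding details glossed over):

Code:
def full_to_limited(v):
    """Compress a full-range value (0-255) into limited range (16-235)."""
    return round(16 + v * (235 - 16) / 255)

# Squeezed once (by madVR *or* the GPU driver): correct limited-range signal.
print(full_to_limited(0), full_to_limited(255))        # 16 235

# Squeezed twice (madVR set to TV levels *and* the driver set to limited RGB):
# blacks end up around 30 and whites around 218.
print(full_to_limited(full_to_limited(0)),
      full_to_limited(full_to_limited(255)))           # 30 218

This is the same double-squeeze DarkSpace described a few posts up: blacks land around 30 and whites around 218, so everything looks washed out.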
Old 4th July 2014, 09:25   #26787  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,925
Quote:
Originally Posted by XMonarchY View Post
So... I was wondering whether the techniques madVR uses to improve video image quality could be applied to other applications, like video games? And what about vice versa: could video game / video card rendering features like TXAA be applied to madVR to further improve video image quality? NNEDI3 + video games would be quite something, but I doubt it's feasible.
You can't compare video games with videos, and AA is not really useful for video. And using NNEDI3 for games is just dumb; most of the time it's better to render the game directly at double the resolution.
Old 4th July 2014, 11:41   #26788  |  Link
anthropolyte
Registered User
 
Join Date: Jun 2014
Posts: 2
Weird playback issues...

Hi all, I'm after some advice!

I have MPC-HC installed, which I use with madVR to play back video on my HDTV, connected to my GPU via HDMI.

I'm finding that when the PC is being used on the primary monitor (connected via DVI), and video is being played back on the (secondary) HDTV, I get presentation glitches, audio delays, choppy playback and all sorts of other things going on. This doesn't seem to affect any applications being used on the primary monitor.

The PC is used for a variety of things, including Photoshop, normal web browsing, document editing, etc. Nothing specific seems to set off the problem; just moving the mouse on the other screen can cause the issues to occur if video is being played back on the HDTV.

Specs of the PC are:
AMD FX8350 (8-core, 4.0GHz)
8GB Corsair Vengeance 1600MHz RAM
Sapphire Radeon HD7870 GHz Edition
Western Digital Black 2TB storage drive

I've tried playing with the various settings in MadVR in an attempt to resolve the problems (disabling image doubling, choosing different scaling algorithms etc), but no joy so far.

This seems to happen with all media types as well, including avi, mkv and mpg.

Any help would be greatly appreciated!

*edit*
I probably should have mentioned that when the PC is not being used for anything else, playback is flawless.

Last edited by anthropolyte; 4th July 2014 at 11:54.
Old 4th July 2014, 13:56   #26789  |  Link
Anime Viewer
Troubleshooter
 
 
Join Date: Feb 2014
Posts: 339
Quote:
Originally Posted by anthropolyte View Post
Hi all, I'm after some advice!

I have MPC-HC installed, which I use with madVR to play back video on my HDTV, connected to my GPU via HDMI.

I'm finding that when the PC is being used on the primary monitor (connected via DVI), and video is being played back on the (secondary) HDTV, I get presentation glitches, audio delays, choppy playback and all sorts of other things going on. This doesn't seem to affect any applications being used on the primary monitor.
Is your computer set up to duplicate the primary screen to the secondary screen? Running in Duplicate or Clone mode can lead to some of the problems you described. If you're using Windows 7 or 8, right-click on a blank part of your desktop and select Screen Resolution, then under the Multiple Displays drop-down change from "Duplicate these displays" (or Clone) to "Extend these displays".
Then, when a video launches on your primary monitor, pause it, drag it to the secondary monitor, and play it. Does it still have the same issues?

The other thing that may be hurting you is smooth motion, which can be a drain on resources and shoot up your render times. Do you have smooth motion active in your settings? I have two screens too (a notebook screen and a TV); the notebook always runs at 60 Hz with smooth motion off, but when playing the same videos on the TV smooth motion turns on. If you have smooth motion on, try turning it off and see if the same issues occur. If they don't, it doesn't mean you have to disable smooth motion, but you may have to switch to less taxing settings in other areas (like ordered dithering instead of error diffusion, or dropping the number of neurons if you're using NNEDI3).

One more thing you can experiment with is changing the refresh rate of the TV. Chances are it is running at 60 Hz by default (if you haven't changed it). Try changing it to 30 Hz (in Windows, if it shows your TV supports it) and see if that makes a difference. 60 Hz puts twice as much work on the GPU as 30 Hz, so the lower the refresh rate your TV is set to, the higher the settings you can use without stressing the GPU as much. If this makes a difference, you should add your TV's supported modes in madVR's display settings where it says "list all display modes madVR may switch to"; that lets the screen switch to the same (or a close) refresh rate for each video you play and gives you smoother playback.
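A rough sketch of the refresh-rate matching idea (hypothetical Python; the mode list is only an example of the kind of modes you might enter, not madVR's exact syntax):

Code:
# Pick the listed display mode whose refresh rate is closest to a whole
# multiple of the video frame rate, and show the per-refresh GPU budget.
video_fps = 23.976
listed_modes = {"1080p24": 23.976, "1080p30": 29.970,
                "1080p50": 50.000, "1080p60": 59.940}

def judder(fps, hz):
    """Distance of hz from the nearest whole multiple of fps (0 = perfect)."""
    ratio = hz / fps
    return abs(ratio - round(ratio))

for name, hz in listed_modes.items():
    print("%s: %6.3f Hz, %5.1f ms per refresh, judder score %.3f"
          % (name, hz, 1000.0 / hz, judder(video_fps, hz)))

best = min(listed_modes, key=lambda name: judder(video_fps, listed_modes[name]))
print("best match for %.3f fps content: %s" % (video_fps, best))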
__________________
System specs: Sager NP9150 SE with i7-3630QM 2.40GHz, 16 GB RAM, 64-bit Windows 10 Pro, NVidia GTX 680M/Intel 4000 HD optimus dual GPU system. Video viewed on LG notebook screen and LG 3D passive TV.

Last edited by Anime Viewer; 4th July 2014 at 14:06.
Old 4th July 2014, 14:13   #26790  |  Link
anthropolyte
Registered User
 
Join Date: Jun 2014
Posts: 2
Quote:
Originally Posted by Anime Viewer View Post
Is your computer setup to duplicate the primary screen to the secondary screen?
No - the desktop is already set to Extended, with the HDTV displaying the desktop extended to the right of the primary monitor.

Quote:
Originally Posted by Anime Viewer View Post
The other thing that may be occurring is the use of Smooth Motion which can be a drain on resources and shoot up your render times. Do you have Smooth Motion active in your settings?
I do use Smooth Motion, so I'll give that a try.

Quote:
Originally Posted by Anime Viewer View Post
One more thing you can experiment with is changing the refresh rate of the tv.
I'll give that a go too - although the feed via HDMI goes through an AV receiver, so I'm not sure whether I can set that to 30Hz!

Thanks for your help
Old 4th July 2014, 15:50   #26791  |  Link
Soukyuu
Registered User
 
 
Join Date: Apr 2012
Posts: 169
Something is very wrong with my madVR setup... I am suddenly unable to play any 1080p H.264 videos (23.97 fps) without dropping frames. I haven't updated or downgraded any drivers since I last watched a 1080p video (testing the new dithering modes), and there has been no new stable build of madVR since then. The weird thing is, it's definitely madVR causing it (switching to EVR works fine), but no matter how low I set the settings (no dithering, nearest neighbour upscaling, everything else off), the GPU/CPU load stays the same and the playback stutters.

Is there any way to back up the madVR settings from the config tool, or do I have to back up the registry keys?

Edit: the OSD shows that the backbuffer queue is not filling in windowed mode, and the presentation queue is not filling in fullscreen mode.
__________________
AMD Phenom II X4 970BE | 12GB DDR3 | nVidia 260GTX | Arch Linux / Windows 10 x64 Pro (w/ calling home shut up)

Last edited by Soukyuu; 4th July 2014 at 15:54.
Old 4th July 2014, 16:03   #26792  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,925
Have you restarted the PC?
What does the composition rate say?
Old 4th July 2014, 16:37   #26793  |  Link
Soukyuu
Registered User
 
 
Join Date: Apr 2012
Posts: 169
Composition rate?
Here is a capture of the OSD: click
Edit: that's with smooth motion on, but the values don't change even if I turn it off, at least not significantly.
__________________
AMD Phenom II X4 970BE | 12GB DDR3 | nVidia 260GTX | Arch Linux / Windows 10 x64 Pro (w/ calling home shut up)

Last edited by Soukyuu; 4th July 2014 at 16:39.
Old 4th July 2014, 16:39   #26794  |  Link
Shiandow
Registered User
 
Join Date: Dec 2013
Posts: 753
Quote:
Originally Posted by Soukyuu View Post
Is there any way to back up the madVR settings from the config tool, or do I have to back up the registry keys?
You can also back up the settings.bin file in the madVR folder.
Old 4th July 2014, 17:04   #26795  |  Link
Soukyuu
Registered User
 
 
Join Date: Apr 2012
Posts: 169
Thanks, Shiandow. I didn't consider it at first because I thought madVR prefers the registry entries over settings.bin, but since it deletes them when resetting, it works fine.
It seems it's not a settings issue. I have also tried disabling my second monitor (the only thing that has changed hardware-wise) and there was no change. I guess I will have to test older MPC-HC/LAV versions.

Edit: well, I'm lost. I even tried older madVR versions and nothing helps. Any suggestions where to look for a solution? I tried playing around with the queue lengths, and increasing them doesn't help. The OSD does show that the upload queues are stuck at 1-2/x. Could a Windows update have caused this? Has anyone else noticed GPU performance degradation? Maybe mine is just so old that the degradation is obvious?

Edit 2: never mind, I found out why. From Wikipedia's article on Hyper-V:
Quote:
On CPUs without Second Level Address Translation, installation of most WDDM accelerated graphics drivers on the primary OS will cause a dramatic drop in graphic performance. This occurs because the graphics drivers access memory in a pattern that causes the Translation lookaside buffer to be flushed frequently.
The interesting thing is that according to system info, my CPU does support SLAT! Still, disabling Hyper-V solves the issue. Thanks, Microsoft!
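For anyone who wants to double-check the same thing, here is a small Windows-only Python sketch (it just filters the output of the stock systeminfo command, which lists the Hyper-V requirements, including SLAT, as long as no hypervisor is already running):

Code:
import subprocess

# Windows only: print the Hyper-V requirement lines reported by systeminfo.
# If Hyper-V is already enabled, systeminfo reports that a hypervisor has
# been detected instead of listing the individual requirements.
text = subprocess.run(["systeminfo"], capture_output=True, text=True).stdout
for line in text.splitlines():
    if ("Hyper-V" in line or "Second Level Address Translation" in line
            or "hypervisor has been detected" in line):
        print(line.strip())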
__________________
AMD Phenom II X4 970BE | 12GB DDR3 | nVidia 260GTX | Arch Linux / Windows 10 x64 Pro (w/ calling home shut up)

Last edited by Soukyuu; 4th July 2014 at 17:45.
Old 4th July 2014, 23:13   #26796  |  Link
Osjur
Registered User
 
 
Join Date: Oct 2010
Posts: 5
I don't know if this is a bug, but if I untick "don't rerender frames when fade in/out is detected" I get about 7-10 dropped frames each time there is a fade to black. That makes some files unwatchable when there is a lot of fast movement and black scene changes.

This happens in both windowed and FSE mode. Max rendering times are around 7 ms, so that shouldn't be the culprit.

PS: my GPU is an R9 290X, so there should be plenty of power to run madVR.

EDIT: I found out that if I untick artifact removal, I don't get dropped frames anymore. Now the question is why those frame drops don't show up as high frame times in the max stats (5s) window.

Last edited by Osjur; 4th July 2014 at 23:21.
Old 4th July 2014, 23:25   #26797  |  Link
Shiandow
Registered User
 
Join Date: Dec 2013
Posts: 753
That's more or less expected behaviour. As far as I understand it, it takes a few frames before madVR realises that a fade in/out is happening, so what "rerender frames when fade in/out is detected" does is go back a few frames whenever madVR detects a fade and rerender them, to prevent banding in the first few frames of the fade. If this is causing issues, it's probably best to leave the rerendering off.
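To make that concrete, here is a toy sketch of the mechanism as I understand it (purely hypothetical, nothing to do with madVR's real code): the renderer keeps a few recent frames around, and when a fade is detected it goes back and renders them again, which is the extra burst of work that can show up as dropped frames.

Code:
from collections import deque

# Toy model of the "rerender on fade detection" idea described above.
# Nothing here corresponds to madVR's real code; it only illustrates why
# the feature costs an extra burst of work when a fade is detected.

HISTORY = 4                      # how many recent frames are kept around
history = deque(maxlen=HISTORY)  # (frame index, rendered output) pairs

def render(index, fade_aware):
    # stand-in for the real rendering work
    return "frame %d (%s)" % (index, "fade-aware" if fade_aware else "normal")

def looks_like_fade(index):
    # stand-in for fade detection; pretend a fade starts at frame 10
    return index >= 10

for i in range(14):
    output = render(i, fade_aware=False)
    history.append((i, output))
    if looks_like_fade(i) and not looks_like_fade(i - 1):
        # Fade just detected: go back and rerender the buffered frames.
        # This is the extra work that can show up as dropped frames.
        for j, _ in history:
            print("rerendered", render(j, fade_aware=True))
    print("presented  " + output)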
Old 4th July 2014, 23:27   #26798  |  Link
mindbomb
Registered User
 
Join Date: Aug 2010
Posts: 576
Try disabling AMD Radeon's PowerPlay with MSI Afterburner. It's in Settings > General > AMD compatibility options > unofficial overclocking mode > without PowerPlay. You also have to have an overclock or underclock active, so just increase or decrease the core frequency by 1 MHz. This has solved a lot of my problems in general.
Old 5th July 2014, 06:38   #26799  |  Link
panetesan2k6
Registered User
 
Join Date: Jan 2014
Location: Latveria
Posts: 29
Quote:
Originally Posted by Asmodian View Post
Think of the limited-range settings as "compress full range to limited"; you always start with full-range RGB (madVR's native format, if you will).

If you want to use limited range, you only want to apply the compression in one place. madVR's conversion might be better than the GPU's, but then only madVR's output is correct, while everything else (desktop, etc.) is clipped.

If you can run full range you don't have to worry; simply set everything to full range.
Thanks for your answer. I guess it sums up to this:

madVR works natively in full-range RGB, but when you select "PC levels" output it intentionally clips BTB and WTW (as madshi explains here), so the best scenario is a display that can handle full-range RGB. That won't give us BTB and WTW, but that's OK. If the display can't handle full range, the signal must be "squeezed" to 16-235 either by madVR (set "TV levels" in madVR, which leaves desktop whites and blacks clipped) or by the GPU (set "RGB Limited" in the driver and leave madVR at "PC levels"), which keeps both desktop and video at correct values. I hope I finally got it right.


Now I have a question regarding luma, chroma and the RGB signal: my TV shows the 0-255 luma values fed to it, calling this "Full RGB", but subsamples chroma to 4:2:2 anyway, unless I label the HDMI3 input as "PC"; then it keeps chroma at 4:4:4. So I started wondering: can we say an RGB signal is really "full" if chroma is subsampled to 4:2:2? Is it possible that what my TV calls "full RGB" is actually a way of squeezing the 0-255 luma fed to it into 16-235 while leaving chroma out of the process? In that case I think I should feed my TV a 16-235 signal to avoid that last step of picture processing... What do you guys think?
Old 5th July 2014, 08:22   #26800  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,925
Quote:
Originally Posted by panetesan2k6 View Post
Now I have a question regarding luma, chroma and the RGB signal: my TV shows the 0-255 luma values fed to it, calling this "Full RGB", but subsamples chroma to 4:2:2 anyway, unless I label the HDMI3 input as "PC"; then it keeps chroma at 4:4:4. So I started wondering: can we say an RGB signal is really "full" if chroma is subsampled to 4:2:2? Is it possible that what my TV calls "full RGB" is actually a way of squeezing the 0-255 luma fed to it into 16-235 while leaving chroma out of the process? In that case I think I should feed my TV a 16-235 signal to avoid that last step of picture processing... What do you guys think?
Just use PC mode, or test it with a black clipping clip.

If your TV is doing this:

Quote:
Is it possible that what my TV calls "full RGB" is actually a way of squeezing the 0-255 luma fed to it into 16-235 while leaving chroma out of the process?
then it should be thrown into the sun right now.

It's doing an RGB -> YCbCr 4:2:2 conversion, which involves both chroma and luma; it isn't leaving either of them out of the process.
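A rough NumPy sketch of what that conversion does (a BT.709-style full-range matrix, purely illustrative; the TV's exact internal processing is unknown): luma keeps its full resolution and 0-255 range, and only the two chroma planes lose horizontal resolution in 4:2:2.

Code:
import numpy as np

# Illustrative full-range BT.709 RGB -> YCbCr, then 4:2:2 chroma subsampling.
rgb = np.random.randint(0, 256, size=(4, 8, 3)).astype(np.float64)
r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]

y  = 0.2126 * r + 0.7152 * g + 0.0722 * b   # luma plane, still 0-255
cb = (b - y) / 1.8556 + 128                 # blue-difference chroma
cr = (r - y) / 1.5748 + 128                 # red-difference chroma

# 4:2:2 keeps luma untouched and averages each horizontal pair of chroma samples.
cb422 = cb.reshape(cb.shape[0], -1, 2).mean(axis=2)
cr422 = cr.reshape(cr.shape[0], -1, 2).mean(axis=2)

print("luma:  ", y.shape, "range %.0f-%.0f" % (y.min(), y.max()))
print("chroma:", cb422.shape, "(half the horizontal samples)")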