Old 24th January 2019, 13:38   #54421  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,344
A 1080 and a 2070 are quite comparable in performance, that wouldn't be a huge "upgrade" to speak of. If you can get a cheap used 1080Ti, that might be a better idea if you really need more performance.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
Old 24th January 2019, 14:20   #54422  |  Link
tp4tissue
Registered User
 
tp4tissue's Avatar
 
Join Date: May 2013
Posts: 706
Quote:
Originally Posted by chros View Post
This is not how GPU hardware acceleration works: it's a completely separate pipeline. Just use GPU hardware acceleration all the time if you can.

I'm not sure how the pipeline works, I'll take your word for it.

But in my own testing, if I use DXVA or D3D11, the render time is always slightly higher than with CPU decode.

This was especially true on my iGPU and slower graphics cards.

On my HD3000, for example, it will stall or drop frames if I use DXVA decode, whereas if I let the CPU decode and the GPU render, it plays smoothly.
__________________
Ghetto | 2500k 5Ghz
Old 24th January 2019, 15:06   #54423  |  Link
Axelpowa
Registered User
 
Join Date: Jan 2018
Posts: 16
Quote:
Originally Posted by nevcairiel View Post
A 1080 and a 2070 are quite comparable in performance, that wouldn't be a huge "upgrade" to speak of. If you can get a cheap used 1080Ti, that might be a better idea if you really need more performance.
Thx!!!
Old 24th January 2019, 15:33   #54424  |  Link
chros
Registered User
 
chros's Avatar
 
Join Date: Mar 2002
Posts: 2,323
Quote:
Originally Posted by tp4tissue View Post
This was especially true on my iGPU ...
I'm not sure about iGPUs (I've never used them for madVR; even in an Optimus system I used the discrete GPU): whether they have a small amount of dedicated memory or use system-wide RAM.
__________________
Ryzen 5 2600,Asus Prime b450-Plus,16GB,MSI GTX 1060 Gaming X 6GB(v398.18),Win10 LTSC 1809,MPC-BEx64+LAV+MadVR,Yamaha RX-A870,LG OLED77G2(2160p@23/24/25/29/30/50/59/60Hz) | madvr config
Old 24th January 2019, 16:31   #54425  |  Link
Alexkral
Registered User
 
Join Date: Oct 2018
Posts: 319
Quote:
Originally Posted by Asmodian View Post
I would also recommend not using CUDA acceleration anymore, DXVA2 copy-back or D3D11 copy-back is just as good or better and as universal solutions they are getting more development.
I keep seeing this, and while I'm sure it's true for most cases, I don't think it is for some. At least on my system (Windows 7, GTX 1080) I get frame drops with DXVA and D3D11 when playing 4K and up @ 60 fps, while CUVID runs smooth. This means that, at least for this content, CUVID is faster (it was even faster with old drivers, but those produced some ugly artifacts with HDR). I don't know if this could be related to some particular rendering or scaling setting, but I don't think so, because it's the same with other renderers and players. Maybe for VR in Windows 10 this is solved with that ATW thing.
Old 24th January 2019, 16:36   #54426  |  Link
chros
Registered User
 
chros's Avatar
 
Join Date: Mar 2002
Posts: 2,323
Quote:
Originally Posted by Axelpowa View Post
I'm thinking about replacing my GTX 1080 with an RTX 2070, which could be quite affordable money-wise.
According to the specs, a 1080 can be faster than a 2070
https://youtu.be/CT2o_FpNM4g?t=916
__________________
Ryzen 5 2600,Asus Prime b450-Plus,16GB,MSI GTX 1060 Gaming X 6GB(v398.18),Win10 LTSC 1809,MPC-BEx64+LAV+MadVR,Yamaha RX-A870,LG OLED77G2(2160p@23/24/25/29/30/50/59/60Hz) | madvr config
Old 24th January 2019, 16:39   #54427  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,344
Quote:
Originally Posted by Alexkral View Post
This means that, at least for this content, CUVID is faster (it was even faster with old drivers, but those produced some ugly artifacts with HDR). I don't know if this could be related to some particular rendering or scaling setting, but I don't think so, because it's the same with other renderers and players. Maybe for VR in Windows 10 this is solved with that ATW thing.
Since you mention HDR, you should know that HDR playback with CUVID will be much worse (in quality) than with any other mode, since CUVID does not actually export all the metadata necessary for full HDR reproduction.
D3D11 in Native mode should be the best you can do for performance (i.e. without selecting a graphics device in LAV Video).
__________________
LAV Filters - open source ffmpeg based media splitter and decoders

Last edited by nevcairiel; 24th January 2019 at 16:51.
Old 24th January 2019, 16:45   #54428  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
CUVID forces your GPU into high-performance mode, while DXVA and D3D11 use the NVIDIA Control Panel (NCP) power setting, which defaults to Optimal Power, a mode well known to be problematic with madVR.

No matter which hardware decoder you select on NVIDIA, the same decoder hardware is used in the end; the only thing that changes is how the decoded data is transferred.

So go to the NCP, change the power setting to Adaptive or Maximum Performance, and try again.
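A quick way to check whether this is what's happening is to poll the performance state while a video plays. A minimal sketch, assuming `nvidia-smi` is installed with the driver; the helper names here are hypothetical, not part of any madVR or LAV tooling:

```python
import subprocess

def read_pstate() -> str:
    """Query the current GPU performance state via nvidia-smi, e.g. 'P0'."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=pstate", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    return out.stdout.strip()

def is_throttled(pstate: str) -> bool:
    """P0/P1 are the full-clock states; higher P-numbers mean reduced clocks."""
    return int(pstate.lstrip("P")) > 1
```

During madVR playback you'd want to see P0/P1; repeatedly dropping to P8 with the Optimal Power setting is the symptom described above.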
Old 24th January 2019, 17:40   #54429  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Quote:
Originally Posted by griffind View Post
I need help evolving from 'poking at settings' to 'knows what poking at settings does'

Been using MadVR for a while now for HDR>SDR tone mapping on my Projector and I'm wondering if I am missing out on picture quality because I don't understand the settings properly.

- How do I know if I am crushing blacks or if my highlights are right
Try using these HDR10 black clipping patterns to test black clipping (white clipping is not overly important):

https://drive.google.com/file/d/1PYY...8JqyuGH7-/view

The 8-bit/10-bit combined black clipping video should be paused when making adjustments because it is short. The 8-bit values are listed along the bottom of the pattern as thick bars. The faint, grey gradient should extend as close as possible to the edge of Bar 16 from right-to-left.

The target nits setting is a tradeoff between brightness and contrast. Increasing the target nits gives you more highlight detail, but the entire image becomes progressively darker to create the necessary contrast.
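For context on what those pattern values mean: HDR10 code values map to absolute luminance through the SMPTE ST 2084 (PQ) EOTF. A minimal sketch with the constants from ST 2084 (the function name is illustrative):

```python
# SMPTE ST 2084 (PQ) EOTF constants
m1 = 2610 / 16384        # 0.1593017578125
m2 = 2523 / 4096 * 128   # 78.84375
c1 = 3424 / 4096         # 0.8359375
c2 = 2413 / 4096 * 32    # 18.8515625
c3 = 2392 / 4096 * 32    # 18.6875

def pq_to_nits(e: float) -> float:
    """Map a PQ-encoded signal e in [0, 1] to luminance in cd/m^2 (nits)."""
    if e <= 0.0:
        return 0.0
    p = e ** (1 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)
```

A full-scale signal decodes to the 10,000-nit PQ peak, while codes near black differ by tiny fractions of a nit, which is why the near-black bars in such patterns are so hard to see on a misadjusted display.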

The other options have been simplified in the latest test build. Hopefully, it comes out soon because I find it is hard to get a bad picture with the latest build.

Last edited by Warner306; 24th January 2019 at 17:50.
Old 24th January 2019, 18:05   #54430  |  Link
chros
Registered User
 
chros's Avatar
 
Join Date: Mar 2002
Posts: 2,323
Quote:
Originally Posted by huhn View Post
So go to the NCP, change the power setting to Adaptive or Maximum Performance, and try again.
That's what I thought as well, until 2 weeks ago, when I had to set Optimal back in NCP because I had massive drops with Adaptive. Now it seems to work fine. (Setup is in my signature.)
__________________
Ryzen 5 2600,Asus Prime b450-Plus,16GB,MSI GTX 1060 Gaming X 6GB(v398.18),Win10 LTSC 1809,MPC-BEx64+LAV+MadVR,Yamaha RX-A870,LG OLED77G2(2160p@23/24/25/29/30/50/59/60Hz) | madvr config
Old 24th January 2019, 18:28   #54431  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
So your card gets slower with a more aggressive performance setting?
Old 24th January 2019, 18:45   #54432  |  Link
chros
Registered User
 
chros's Avatar
 
Join Date: Mar 2002
Posts: 2,323
Quote:
Originally Posted by huhn View Post
So your card gets slower with a more aggressive performance setting?
For whatever reason, with the Adaptive setting it often dropped the performance state and quickly raised it back (monitored with NVIDIA Inspector), hence it produced a bunch of dropped frames. Why? That's a good question, because it worked fine before with the same setting.
__________________
Ryzen 5 2600,Asus Prime b450-Plus,16GB,MSI GTX 1060 Gaming X 6GB(v398.18),Win10 LTSC 1809,MPC-BEx64+LAV+MadVR,Yamaha RX-A870,LG OLED77G2(2160p@23/24/25/29/30/50/59/60Hz) | madvr config
Old 24th January 2019, 18:46   #54433  |  Link
madjock
Registered User
 
Join Date: May 2018
Posts: 259
Quote:
Originally Posted by Warner306 View Post
Try using these HDR10 black clipping patterns to test black clipping.
These are not HDR? I think they are just 8/10-bit panel checkers?

Last edited by madjock; 24th January 2019 at 18:56.
Old 24th January 2019, 18:57   #54434  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
Quote:
Originally Posted by Alexkral View Post
At least on my system (Windows 7, GTX 1080) I get frame drops with DXVA and D3D11 when playing 4K and up @ 60 fps, while CUVID runs smooth.
I had totally forgotten to mention setting Nvidia's power management mode to adaptive (in 3D settings). Optimal power often causes issues.

Quote:
Originally Posted by chros View Post
That's what I thought as well, until 2 weeks ago, when I had to set Optimal back in NCP because I had massive drops with Adaptive. Now it seems to work fine. (Setup is in my signature.)
I would redo that test... because setting optimal power definitely causes the GPU to clock lower more often. Did you have temperature issues or something?
__________________
madVR options explained
Old 24th January 2019, 19:33   #54435  |  Link
Soxbrother
Registered User
 
Join Date: Jun 2017
Posts: 9
Quote:
Originally Posted by tp4tissue View Post
HD 4400 benches 567 on passmark.

You will have to turn off many of madVR's features in order to scale 1920x1080 to 3840x2160.

madVR will still give you more color accuracy, but for higher-quality filters like Lanczos there's simply not enough processing power.

If you scale 1080p to 1080p, then that is chroma scaling only, 1:1 luma. The 4400 will do this perfectly fine. You can enable Jinc or Lanczos for chroma.
What about image doubling, would that require less juice than upscaling?

Quote:
Originally Posted by tp4tissue View Post
I believe you also need a DisplayPort to HDMI 2.0 adapter, because the 4400's native HDMI is probably not HDMI 2.0, which is required for 3840x2160 @ 60 Hz.
So you can convert DisplayPort 1.2 to HDMI 2.0 or 2.0a?

If this works, what about my Dolby Atmos audio?
Would my Dolby Atmos audio work through such an adapter?

Quote:
Originally Posted by tp4tissue View Post
Also, run CPU decode in LAV, do not use DXVA, because you want to save as much GPU time as possible for madVR.

Letting the CPU decode the first part is fine.
Quote:
Originally Posted by chros View Post
This is not how GPU hardware acceleration works: it's a completely separate pipeline. Just use GPU hardware acceleration all the time if you can.
Thanks guys.

At the moment I believe the screen is set to 24 Hz.
I can go up to 30 Hz.
The higher the better?
Should I set it to 30 Hz?

Would 60 Hz with the adapter give me better quality and smoother playback?

Thanks in advance.
Old 24th January 2019, 20:53   #54436  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
Quote:
Originally Posted by Soxbrother View Post
What about image doubling, would that require less juice than upscaling?
No, it needs more.

Quote:
Originally Posted by Soxbrother View Post
So you can convert DisplayPort 1.2 to HDMI 2.0 or 2.0a?
This would need to be an active adapter because DP and HDMI don't use the same signaling. They are expensive and I expect that most (all?) don't support Atmos.

Quote:
Originally Posted by Soxbrother View Post
At the moment I believe the screen is set to 24 Hz.
I can go up to 30 Hz.
The higher the better?
Should I set it to 30 Hz?

Would 60 Hz with the adapter give me better quality and smoother playback?
This is somewhat complicated and depends on the source frame rate. If you are using smooth motion, the higher the better (I would say 60 Hz is the minimum for decent smooth motion), but even better than smooth motion is a refresh rate that matches the source frame rate. This means you need to tune your refresh rate so you never get dropped or repeated frames. This can be tricky, but madVR has a custom resolution tool that helps tune the refresh rate.
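To see why the tuning matters, the interval between dropped or repeated frames is just the reciprocal of the rate mismatch. A small illustrative sketch (function name is mine, not a madVR API):

```python
def frame_slip_interval(source_fps: float, refresh_hz: float) -> float:
    """Seconds between dropped/repeated frames; infinite if the rates match."""
    drift = abs(refresh_hz - source_fps)
    return float("inf") if drift == 0 else 1.0 / drift

# 23.976 fps (24000/1001) content on a plain 24.000 Hz mode slips a frame
# roughly every 42 seconds, which is why madVR's custom resolution tool
# tries to pull the refresh rate as close to the source rate as possible.
```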
__________________
madVR options explained
Old 24th January 2019, 23:28   #54437  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Quote:
Originally Posted by madjock View Post
These are not HDR? I think they are just 8/10-bit panel checkers?
They are from a free HDR10 test pattern set; the OSD should confirm it. Black clipping is not really HDR, because the peak of the pattern mentioned is less than 1 nit, but that is intended.
Old 25th January 2019, 01:02   #54438  |  Link
Alexkral
Registered User
 
Join Date: Oct 2018
Posts: 319
Quote:
Originally Posted by huhn View Post
So go to the NCP, change the power setting to Adaptive or Maximum Performance, and try again.
Thanks huhn, I think this helped a bit but didn't fix it. Anyway, I think I found the problem. I noticed that this only happened with SVP and very high bitrate videos (and with VR), so I downloaded a high-bitrate 4K 60 fps video and got frame drops with both CUVID and DXVA2. Then I blocked ffdshow raw and this fixed it for both modes. So the problem is having ffdshow raw in the chain, but obviously it's needed for SVP. Also, I noticed that you can select D3D11 but it doesn't work on Windows 7; I'll have to check what I'm missing.

@nevcairiel

Yes, I discovered this recently, but anyway I'm currently not using madVR's HDR to SDR conversion because SVP drops the needed metadata. I managed to find some alternatives, however, still using both SVP and madVR, but without dynamic tone mapping (though I have some ideas for this too). I wouldn't say that static tone mapping is much worse than dynamic; not so long ago it was the only thing available and everybody was more than happy with it. Anyway, I'm not very sure how madVR is doing dynamic tone mapping with BT.2390, because only changing the mastering display white level gives me different results, and I don't see other configurable parameters for this.
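For reference, the BT.2390 EETF that this tone mapping is based on passes the signal through below a knee and rolls off highlights with a Hermite spline in the normalized PQ domain. A simplified sketch assuming the BT.2390-3 formulation with black level 0 (the function name and arguments are illustrative, not madVR's actual internals):

```python
def bt2390_eetf(e: float, max_lum: float) -> float:
    """Simplified BT.2390 EETF.

    e is the PQ-encoded input in [0, 1]; max_lum is the display peak,
    also PQ-encoded. Below the knee the signal is unchanged; above it,
    a Hermite spline rolls off smoothly to max_lum.
    """
    ks = 1.5 * max_lum - 0.5  # knee start (valid for max_lum > 1/3)
    if e < ks:
        return e
    t = (e - ks) / (1.0 - ks)
    return ((2 * t**3 - 3 * t**2 + 1) * ks
            + (t**3 - 2 * t**2 + t) * (1.0 - ks)
            + (-2 * t**3 + 3 * t**2) * max_lum)
```

This would explain why the mastering display white level is the main knob: it sets where the knee and the roll-off target land.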
Old 25th January 2019, 02:13   #54439  |  Link
Dogway
Registered User
 
Join Date: Nov 2009
Posts: 2,352
Just realised that NGU causes quite a lot of coil whine, even when using it only for chroma. I bought a 1070 on Black Friday and didn't notice until I enabled NGU for my monitor the other day. What puzzled me is that I do GPU-intensive work and never had coil whine until this. I did a comparison render on the GPU with Redshift, and the GPU-Z report is the same as with NGU. With further testing I found that the trigger is Memory Controller Load: at 25% it makes the buzzing noise, below or above it doesn't. The problem is that NGU very high (above 25%) heats my card like a toaster (I run it in a mATX case).
Old 25th January 2019, 03:30   #54440  |  Link
tp4tissue
Registered User
 
tp4tissue's Avatar
 
Join Date: May 2013
Posts: 706
Quote:
Originally Posted by Dogway View Post
Just realised that NGU causes quite a lot of coil whine, even when using it only for chroma. I bought a 1070 on Black Friday and didn't notice until I enabled NGU for my monitor the other day. What puzzled me is that I do GPU-intensive work and never had coil whine until this. I did a comparison render on the GPU with Redshift, and the GPU-Z report is the same as with NGU. With further testing I found that the trigger is Memory Controller Load: at 25% it makes the buzzing noise, below or above it doesn't. The problem is that NGU very high (above 25%) heats my card like a toaster (I run it in a mATX case).
That's how you know it's Lookin' Guud
__________________
Ghetto | 2500k 5Ghz
Tags
direct compute, dithering, error diffusion, madvr, ngu, nnedi3, quality, renderer, scaling, uhd upscaling, upsampling

Thread Tools Search this Thread
Search this Thread:

Advanced Search
Display Modes

Posting Rules
You may not post new threads
You may not post replies
You may not post attachments
You may not edit your posts

BB code is On
Smilies are On
[IMG] code is On
HTML code is Off

Forum Jump


All times are GMT +1. The time now is 22:24.


Powered by vBulletin® Version 3.8.11
Copyright ©2000 - 2024, vBulletin Solutions Inc.