Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion. Before you start posting please read the forum rules. By posting to this forum you agree to abide by the rules. |
24th January 2019, 13:38 | #54421 | Link |
Registered Developer
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,346
A 1080 and a 2070 are quite comparable in performance, so that wouldn't be a huge "upgrade" to speak of. If you can get a cheap used 1080 Ti, that might be a better idea if you really need more performance.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders |
24th January 2019, 14:20 | #54422 | Link | |
Registered User
Join Date: May 2013
Posts: 712
Quote:
I'm not sure how the pipeline works, so I'll take your word for it. But in my own testing, if I use DXVA or D3D11, the render time is always slightly higher than with CPU decode. This was especially true on my iGPU and slower graphics cards. On my HD3000, for example, it will stall out / drop frames if I use DXVA decode, whereas if I let it CPU decode and GPU render, it plays smoothly.
__________________
Ghetto | 2500k 5Ghz |
24th January 2019, 15:33 | #54424 | Link |
Registered User
Join Date: Mar 2002
Posts: 2,323
I'm not sure about iGPUs (I've never used them for madVR; even on an Optimus system I used the discrete GPU): whether they have a small amount of dedicated memory or use system-wide RAM.
__________________
Ryzen 5 2600,Asus Prime b450-Plus,16GB,MSI GTX 1060 Gaming X 6GB(v398.18),Win10 LTSC 1809,MPC-BEx64+LAV+MadVR,Yamaha RX-A870,LG OLED77G2(2160p@23/24/25/29/30/50/59/60Hz) | madvr config |
24th January 2019, 16:31 | #54425 | Link |
Registered User
Join Date: Oct 2018
Posts: 324
I keep seeing this, and while I'm sure it's true in most cases, I don't think it is in some. At least on my system (Windows 7, GTX 1080), I get frame drops with DXVA and D3D11 when playing 4K-and-up content at 60 fps, while CUVID runs smoothly. This means that, at least for this content, CUVID is faster (it was even faster with old drivers, but they produced some ugly artifacts with HDR). I don't know if this could be related to some particular rendering or scaling setting, but I don't think so, because it's the same with other renderers and players. Maybe for VR in Windows 10 this is solved with that ATW thing.
24th January 2019, 16:36 | #54426 | Link | |
Registered User
Join Date: Mar 2002
Posts: 2,323
Quote:
https://youtu.be/CT2o_FpNM4g?t=916
__________________
Ryzen 5 2600,Asus Prime b450-Plus,16GB,MSI GTX 1060 Gaming X 6GB(v398.18),Win10 LTSC 1809,MPC-BEx64+LAV+MadVR,Yamaha RX-A870,LG OLED77G2(2160p@23/24/25/29/30/50/59/60Hz) | madvr config |
24th January 2019, 16:39 | #54427 | Link | |
Registered Developer
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,346
Quote:
D3D11 in Native mode should be the best you can do for performance (i.e. without selecting a graphics device in LAV Video).
__________________
LAV Filters - open source ffmpeg based media splitter and decoders Last edited by nevcairiel; 24th January 2019 at 16:51. |
24th January 2019, 16:45 | #54428 | Link |
Registered User
Join Date: Oct 2012
Posts: 7,925
CUVID forces your GPU into high-performance mode, while DXVA and D3D11 use the NCP power setting, which defaults to Optimal and is well known to be problematic with madVR.
No matter which hardware decoder you select on NVIDIA, you use the same decoder in the end; the only thing that changes is how the data is transferred. So go to the NCP, change the power setting to Adaptive or Maximum Performance, and try again.
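The power-state behavior described above can be checked while a video plays. A minimal sketch of how one might flag performance-state drops by parsing `nvidia-smi` CSV output; the sample log below is made up for illustration, and the "worst acceptable P-state" threshold is an assumption, not a madVR requirement:

```python
# Sketch: detect GPU performance-state drops during playback by parsing
# nvidia-smi CSV output. The sample log below is illustrative, not real.
# Live polling would run something like:
#   nvidia-smi --query-gpu=pstate,clocks.gr --format=csv,noheader -l 1

def low_power_samples(csv_lines, worst_ok="P2"):
    """Return samples whose P-state number is higher (i.e. slower) than worst_ok."""
    limit = int(worst_ok.lstrip("P"))
    flagged = []
    for line in csv_lines:
        pstate, clock = (field.strip() for field in line.split(","))
        if int(pstate.lstrip("P")) > limit:
            flagged.append((pstate, clock))
    return flagged

sample = [
    "P0, 1885 MHz",
    "P2, 1620 MHz",
    "P8, 607 MHz",   # clock collapse -> likely dropped frames in madVR
    "P0, 1885 MHz",
]
print(low_power_samples(sample))  # -> [('P8', '607 MHz')]
```

If flagged samples show up while frames are dropping, the power setting (rather than decoder choice) is the likely culprit.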
24th January 2019, 17:40 | #54429 | Link | |
Registered User
Join Date: Dec 2014
Posts: 1,127
Quote:
https://drive.google.com/file/d/1PYY...8JqyuGH7-/view

The combined 8-bit/10-bit black clipping video should be paused when making adjustments because it is short. The 8-bit values are listed along the bottom of the pattern as thick bars. The faint grey gradient should extend as close as possible to the edge of Bar 16 from right to left. The target nits is a tradeoff between brightness and contrast: increasing it will give you more highlight detail, but the entire image becomes progressively darker to create the necessary contrast. The other options have been simplified in the latest test build. Hopefully it comes out soon, because I find it hard to get a bad picture with the latest build.
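For reference, the 8-bit code values in such an HDR10 pattern map to absolute luminance through the SMPTE ST 2084 (PQ) EOTF, which is why the near-black bars around Bar 16 are so faint. A sketch of that conversion, assuming limited-range coding (16 = black, 235 = peak); the exact bar contents of the linked pattern are an assumption:

```python
# Sketch: convert 8-bit limited-range code values from an HDR10 test pattern
# to absolute luminance via the SMPTE ST 2084 (PQ) EOTF.

M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf(signal):
    """PQ signal in [0, 1] -> luminance in cd/m^2 (nits)."""
    p = signal ** (1 / M2)
    return 10000 * (max(p - C1, 0) / (C2 - C3 * p)) ** (1 / M1)

def code_to_nits(code):
    """8-bit limited-range code value -> nits (16 maps to 0, 235 to 10000)."""
    return pq_eotf(max(code - 16, 0) / 219)

for code in (16, 17, 20, 26):
    print(f"code {code}: {code_to_nits(code):.4f} nits")
```

Codes just above 16 land well below 1 nit, so the gradient check really is probing the display's near-black handling.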
__________________
HOW TO - Set up madVR for Kodi DSPlayer & External Media Players Last edited by Warner306; 24th January 2019 at 17:50. |
24th January 2019, 18:05 | #54430 | Link |
Registered User
Join Date: Mar 2002
Posts: 2,323
That's what I thought as well, until two weeks ago, when I had to set Optimal back in the NCP because I had massive frame drops with Adaptive. Now it seems to work fine. (Setup is in my signature.)
__________________
Ryzen 5 2600,Asus Prime b450-Plus,16GB,MSI GTX 1060 Gaming X 6GB(v398.18),Win10 LTSC 1809,MPC-BEx64+LAV+MadVR,Yamaha RX-A870,LG OLED77G2(2160p@23/24/25/29/30/50/59/60Hz) | madvr config |
24th January 2019, 18:45 | #54432 | Link |
Registered User
Join Date: Mar 2002
Posts: 2,323
For whatever reason, with the Adaptive setting the card often dropped its performance state and quickly raised it back (monitored with Nvidia Inspector), hence it produced a bunch of dropped frames. Why? That's a good question, because it worked fine before with the same setting.
__________________
Ryzen 5 2600,Asus Prime b450-Plus,16GB,MSI GTX 1060 Gaming X 6GB(v398.18),Win10 LTSC 1809,MPC-BEx64+LAV+MadVR,Yamaha RX-A870,LG OLED77G2(2160p@23/24/25/29/30/50/59/60Hz) | madvr config |
24th January 2019, 18:57 | #54434 | Link | |
Registered User
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
Quote:
I would redo that test... because setting optimal power definitely causes the GPU to clock lower more often. Did you have temperature issues or something?
__________________
madVR options explained |
24th January 2019, 19:33 | #54435 | Link | ||||
Registered User
Join Date: Jun 2017
Posts: 9
Quote:
Quote:
If this works, what about my Dolby Atmos audio? Would it work through such an adapter? Quote:
Quote:
At the moment I believe the screen is set to 24 Hz. I can go up to 30 Hz. Is higher better? Should I set it to 30 Hz? Would 60 Hz with the adapter give me better quality and smoother playback? Thanks in advance.
24th January 2019, 20:53 | #54436 | Link | ||
Registered User
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
Quote:
Quote:
This is somewhat complicated and depends on the source frame rate. If you are using smooth motion, the higher the better (I would say 60 Hz is the minimum for decent smooth motion), but even better than smooth motion is using a refresh rate that matches the source frame rate. This means you need to tune your refresh rate so you never get dropped or repeated frames. This can be tricky, but madVR has a custom resolution tool that helps tune the refresh rate.
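The drop/repeat cadence from a refresh-rate mismatch is easy to estimate. A simplified arithmetic sketch (one glitch per accumulated frame of drift; fractional ratios like 23.976p on 60 Hz follow a pulldown cadence this model doesn't capture):

```python
# Sketch: estimate how often the renderer must drop or repeat a frame when
# the display refresh rate is not an exact multiple of the source frame rate.
# Simplified model: one glitch per accumulated frame of drift.

def seconds_per_glitch(source_fps, display_hz):
    multiple = round(display_hz / source_fps)        # e.g. 24p on 48 Hz -> 2
    drift_hz = abs(display_hz - multiple * source_fps)
    return float("inf") if drift_hz == 0 else 1 / drift_hz

film = 24000 / 1001                                  # 23.976... fps
print(f"23.976p on 24.000 Hz: one glitch every {seconds_per_glitch(film, 24.0):.1f} s")
print(f"23.976p on 23.976 Hz: one glitch every {seconds_per_glitch(film, film):.1f} s")
```

This is why even a tiny tweak with the custom resolution tool (pushing the glitch interval from seconds to hours) matters so much.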
__________________
madVR options explained |
24th January 2019, 23:28 | #54437 | Link |
Registered User
Join Date: Dec 2014
Posts: 1,127
They are from a free HDR10 test pattern set; the OSD should confirm it. Black clipping is not really HDR, because the peak of the pattern mentioned is less than 1 nit, but that is intended.
__________________
HOW TO - Set up madVR for Kodi DSPlayer & External Media Players |
25th January 2019, 01:02 | #54438 | Link | |
Registered User
Join Date: Oct 2018
Posts: 324
Quote:
@nevcairiel Yes, I discovered this recently, but anyway I'm currently not using madVR's HDR-to-SDR conversion because SVP drops the needed metadata. I managed to find some alternatives, however, still using both SVP and madVR but without dynamic tone mapping (though I have some ideas for this too). I wouldn't say that static tone mapping is much worse than dynamic; not so long ago it was the only thing available, and everybody was more than happy with it. Anyway, I'm not very sure how madVR does dynamic tone mapping with BT.2390, because only changing the mastering display white level gives me different results, and I don't see other configurable parameters for this.
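madVR's dynamic tone mapping is not public, but the static BT.2390 curve it is based on is well defined: a hermite-spline roll-off applied in the PQ domain, controlled by exactly the mastering white and target nits values. A sketch under the simplifying assumption that source black is 0 nits (madVR's actual implementation may differ):

```python
# Sketch of the BT.2390 EETF highlight roll-off (hermite spline in the PQ
# domain). This only illustrates the static curve the standard defines;
# madVR's dynamic tone mapping is not public. Source black assumed 0 nits.

M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_inv_eotf(nits):
    """Luminance in nits -> PQ signal in [0, 1]."""
    y = (nits / 10000) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

def pq_eotf(signal):
    """PQ signal in [0, 1] -> luminance in nits."""
    p = signal ** (1 / M2)
    return 10000 * (max(p - C1, 0) / (C2 - C3 * p)) ** (1 / M1)

def bt2390_tone_map(nits, mastering_white=1000.0, target_white=500.0):
    src_max = pq_inv_eotf(mastering_white)
    max_lum = pq_inv_eotf(target_white) / src_max    # normalized target peak
    ks = 1.5 * max_lum - 0.5                         # knee start
    e1 = pq_inv_eotf(nits) / src_max
    if e1 < ks:                                      # below the knee: unchanged
        e2 = e1
    else:                                            # hermite spline roll-off
        t = (e1 - ks) / (1 - ks)
        e2 = ((2*t**3 - 3*t**2 + 1) * ks
              + (t**3 - 2*t**2 + t) * (1 - ks)
              + (-2*t**3 + 3*t**2) * max_lum)
    return pq_eotf(e2 * src_max)

for nits in (100, 400, 800, 1000):
    print(f"{nits:5d} nits -> {bt2390_tone_map(nits):7.1f} nits")
```

Note how everything below the knee passes through untouched and only highlights get compressed, which is consistent with the mastering white level being the dominant knob.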
25th January 2019, 02:13 | #54439 | Link |
Registered User
Join Date: Nov 2009
Posts: 2,361
Just realised that NGU causes quite a lot of coil whine, even when using it only for chroma. I bought a 1070 on Black Friday and didn't notice until I enabled NGU for my monitor the other day. What puzzled me is that I do GPU-intensive work and never had coil whine until this. I did a comparison render on the GPU with Redshift, and the GPU-Z report is the same as with NGU. With further testing I found that the trigger was Memory Controller Load: at 25% it makes the buzzing noise; below or above it doesn't. The problem is that NGU Very High (above 25%) heats my card like a toaster (running it in a mATX case).
25th January 2019, 03:30 | #54440 | Link | |
Registered User
Join Date: May 2013
Posts: 712
Quote:
__________________
Ghetto | 2500k 5Ghz |