Old 27th February 2019, 22:25   #55001  |  Link
KoKlusz
Registered User
 
Join Date: Jul 2017
Posts: 27
Quote:
Originally Posted by tp4tissue View Post
WHY did he recommend RTX? That's important. What did he say in the other thread?
IIRC he didn't want to go into details, only said that it's always better to go with the newer architecture. But I imagine that hardware acceleration for AI upscaling might also be a reason.

Last edited by KoKlusz; 27th February 2019 at 22:29.
KoKlusz is offline   Reply With Quote
Old 27th February 2019, 22:29   #55002  |  Link
tp4tissue
Registered User
 
 
Join Date: May 2013
Posts: 712
Quote:
Originally Posted by ryrynz View Post
Dunno what anyone else is using but I use Beyond Compare https://www.scootersoftware.com/download.php

I believe madshi's recommendation was mostly down to future-proofing. If there is going to be any tensor core code, it's probably a year or two away, if not more...
__________________
Ghetto | 2500k 5Ghz
tp4tissue is offline   Reply With Quote
Old 27th February 2019, 22:30   #55003  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,925
Quote:
Originally Posted by tp4tissue View Post
WHY did he recommend RTX? That's important. What did he say in the other thread?
Tensor cores?
Or, in other words, the new features these cards bring which are currently not used by madVR.

And that's why game comparisons to the 1080 are flawed: newer games already use some of the newer features.

The newer the game, the better the 2060 looks. I wonder what that has to do with a new architecture that can do a lot of new, unusual stuff, and I'm not even talking about tensor cores here.
https://www.gamersnexus.net/guides/3...019-benchmarks
huhn is offline   Reply With Quote
Old 27th February 2019, 22:48   #55004  |  Link
griffind
Registered User
 
Join Date: Oct 2017
Posts: 6
Did this month's Windows 10 update (Feb 12th, KB4487044, OS Build 17763.316) break refresh rates for anyone else?

After the 'forced' update (FU Microsoft) some system instability started creeping in, and then I started seeing crazy motion blur issues. On inspecting the madVR info I saw my display reporting at 62.25xxxxxHz and 24.354xxxxHz with a clock deviation of approx 4%. Up until this point my display always registered at 60Hz/24Hz. The motion blur is so bad that I've had to disable motion smoothing along with refresh rate matching.
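
(For anyone wondering what a clock deviation like that means in practice, here is a rough back-of-the-envelope sketch in Python. It is purely illustrative, my own arithmetic rather than anything from madVR's internals, and the third number pair is a hypothetical near-perfect match added only for comparison. The point: a ~4% mismatch forces a frame to be dropped or repeated several times per second, which is exactly the kind of judder/blur described above.)

[CODE]
# Rough illustration only (my own sketch, not madVR code): how often a frame
# must be dropped or repeated for a given display clock deviation.

def frame_correction_interval(content_fps: float, display_hz: float):
    """Return (relative deviation, seconds between dropped/repeated frames)."""
    deviation = abs(display_hz - content_fps) / content_fps
    if deviation == 0:
        return 0.0, float("inf")
    # One full frame of drift accumulates every 1/deviation content frames.
    return deviation, (1.0 / deviation) / content_fps

# Last pair: hypothetical near-match, for comparison with the mismatched modes.
for content, display in [(60.0, 62.25), (23.976, 24.354), (60.0, 60.0004)]:
    dev, secs = frame_correction_interval(content, display)
    print(f"{content} fps on a {display} Hz clock: "
          f"{dev:.3%} deviation, one frame drop/repeat every {secs:.1f} s")
[/CODE]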

Last edited by griffind; 27th February 2019 at 23:01.
griffind is offline   Reply With Quote
Old 27th February 2019, 23:28   #55005  |  Link
KoKlusz
Registered User
 
Join Date: Jul 2017
Posts: 27
Quote:
Originally Posted by huhn View Post
Tensor cores?
Or, in other words, the new features these cards bring which are currently not used by madVR.

And that's why game comparisons to the 1080 are flawed: newer games already use some of the newer features.

The newer the game, the better the 2060 looks. I wonder what that has to do with a new architecture that can do a lot of new, unusual stuff, and I'm not even talking about tensor cores here.
https://www.gamersnexus.net/guides/3...019-benchmarks
The 1080 will pull ahead especially in legacy titles that are memory and bandwidth hungry, but in others (like The Witcher 3) it runs almost the same. All in all, the difference is small enough that I don't see a reason to get a 1080 over a 2060, unless you can find a used one at a similar price; but then again the 2060 might end up being better in the long run if madshi can tap into the tensor cores' potential.
KoKlusz is offline   Reply With Quote
Old 27th February 2019, 23:55   #55006  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,925
Quote:
Originally Posted by griffind View Post
Did this month's Windows 10 update (Feb 12th, KB4487044, OS Build 17763.316) break refresh rates for anyone else?

After the 'forced' update (FU Microsoft) some system instability started creeping in, and then I started seeing crazy motion blur issues. On inspecting the madVR info I saw my display reporting at 62.25xxxxxHz and 24.354xxxxHz with a clock deviation of approx 4%. Up until this point my display always registered at 60Hz/24Hz. The motion blur is so bad that I've had to disable motion smoothing along with refresh rate matching.
That's a refresh rate mismatch and a wrong detection in one, a Win 10 classic.
Reinstall the driver and try overlay rendering.

@KoKlusz
The 2060 has higher bandwidth compared to the normal 1080, and you should know he has a 1080, so see it in that context.
The Witcher is CPU-heavy, at least compared to other games, so it's not the greatest GPU test, and there are tests where the 1080 moves quite a bit ahead. I'm not even aware of a single title where the 2060 is clearly ahead.
huhn is offline   Reply With Quote
Old 28th February 2019, 00:29   #55007  |  Link
iSeries
Registered User
 
Join Date: Jan 2009
Posts: 625
Quote:
Originally Posted by griffind View Post
Did this month's Windows 10 update (Feb 12th, KB4487044, OS Build 17763.316) break refresh rates for anyone else?

After the 'forced' update (FU Microsoft) some system instability started creeping in, and then I started seeing crazy motion blur issues. On inspecting the madVR info I saw my display reporting at 62.25xxxxxHz and 24.354xxxxHz with a clock deviation of approx 4%. Up until this point my display always registered at 60Hz/24Hz. The motion blur is so bad that I've had to disable motion smoothing along with refresh rate matching.
Broke things for me: clock jumping around, presentation glitches, audio dropouts. So I did a clean install, and there were no problems. All good again.
iSeries is offline   Reply With Quote
Old 28th February 2019, 00:31   #55008  |  Link
griffind
Registered User
 
Join Date: Oct 2017
Posts: 6
Quote:
Originally Posted by huhn View Post
That's a refresh rate mismatch and a wrong detection in one, a Win 10 classic.
Reinstall the driver and try overlay rendering.
Thanks for this tip, I'm glad I'm not the only one who has come across this issue. I tried uninstalling and reinstalling a slightly older driver release (416.94), but I'm still seeing the incorrect detection of the refresh rate, although the mismatch is indeed way down now to -0.0006%, so a huge improvement. It's still a bit concerning to see my display reporting as 62.95254Hz though.

Are there known reliable Nvidia driver releases that I should be looking at? I'm running a 1070 Ti on Win 10 x64, if that matters.

Thank you for your time and help.

Last edited by griffind; 28th February 2019 at 00:37.
griffind is offline   Reply With Quote
Old 28th February 2019, 01:02   #55009  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
I do not have any issues with refresh rate detection (4K, HDMI), and I do have KB4487044 installed (build shows 17763.316 too). I also recently did a clean install of 419.17 (2080 Ti).
__________________
madVR options explained
Asmodian is offline   Reply With Quote
Old 28th February 2019, 01:59   #55010  |  Link
mkohman
Registered User
 
Join Date: Jun 2018
Posts: 51
Guys, I apologise if this has already been covered several times, but I have just confused the hell out of myself with DXVA2 native and D3D11 native. Basically, with DXVA2 native I am able to get anti-alias high on my 4K movie titles, while with D3D11 I can only get medium; if I select high, the rendering times increase and it stutters.

With DXVA2 native, anti-alias high render times are 35ms, and with D3D11 native anti-alias medium it's the same, but when I select high it is unwatchable. Also, with DXVA2 the CPU usage is very low, but with D3D11 the CPU percentage increases.

I want the best possible video quality I can get with my 1080p-to-4K upscales and my 4K HDR movies on my projector. I am using madVR tone mapping, so what's the best option to select, please: D3D11 native, DXVA2 native, or even copy-back on either?

I have a Sapphire Nitro+ RX 580 4GB. I would appreciate your help and advice. Thank you guys.
mkohman is offline   Reply With Quote
Old 28th February 2019, 02:36   #55011  |  Link
KoKlusz
Registered User
 
Join Date: Jul 2017
Posts: 27
Quote:
Originally Posted by huhn View Post
the 2060 has higher bandwidth compare to the normal 1080
The 1080 has only marginally lower bandwidth (320GB/s vs 336GB/s), but on the other hand it has 2GB more VRAM, which can be useful even at 1080p.
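
(Those figures are just memory bus width times per-pin data rate. A quick sketch of the arithmetic in Python, using the public reference specs for both cards, so treat the numbers as approximate rather than measured:)

[CODE]
# Memory bandwidth in GB/s = (bus width in bits / 8) * per-pin data rate in Gbps.
# Figures are the public reference specs; approximate, not measured.

def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

cards = {
    "GTX 1080 (GDDR5X, 256-bit @ 10 Gbps)": (256, 10.0),
    "RTX 2060 (GDDR6,  192-bit @ 14 Gbps)": (192, 14.0),
}
for name, (bus, rate) in cards.items():
    print(f"{name}: {bandwidth_gbs(bus, rate):.0f} GB/s")
# -> 320 GB/s vs 336 GB/s, i.e. about a 5% difference.
[/CODE]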
Quote:
Originally Posted by huhn View Post
the witcher is CPU heavy atleast compare to other games so not the greatest GPU test and there are test where the 1080 move quite ahead i'm not even aware of a single title where the 2060 is clearly ahead.
The 2060 at stock clocks beats the 1080 in Wolfenstein 2, but that's more of an exception.
Quote:
Originally Posted by huhn View Post
you should know he has a 1080 so you see it in context.
Yeah, if you already have a 1080 (or any other 10xx card for that matter), then there's no point bothering with RTX, at least until the 7nm cards come out.

Overall, that's just, like, my opinion, maan, so I'm done going off topic for now.

Last edited by KoKlusz; 28th February 2019 at 02:39.
KoKlusz is offline   Reply With Quote
Old 28th February 2019, 03:55   #55012  |  Link
tp4tissue
Registered User
 
 
Join Date: May 2013
Posts: 712
Quote:
Originally Posted by mkohman View Post
Guys, I apologise if this has already been covered several times, but I have just confused the hell out of myself with DXVA2 native and D3D11 native. Basically, with DXVA2 native I am able to get anti-alias high on my 4K movie titles, while with D3D11 I can only get medium; if I select high, the rendering times increase and it stutters.

With DXVA2 native, anti-alias high render times are 35ms, and with D3D11 native anti-alias medium it's the same, but when I select high it is unwatchable. Also, with DXVA2 the CPU usage is very low, but with D3D11 the CPU percentage increases.

I want the best possible video quality I can get with my 1080p-to-4K upscales and my 4K HDR movies on my projector. I am using madVR tone mapping, so what's the best option to select, please: D3D11 native, DXVA2 native, or even copy-back on either?

I have a Sapphire Nitro+ RX 580 4GB. I would appreciate your help and advice. Thank you guys.
Don't use DXVA2 native.

Use either DXVA2 copy-back or D3D11 copy-back.
__________________
Ghetto | 2500k 5Ghz
tp4tissue is offline   Reply With Quote
Old 28th February 2019, 04:05   #55013  |  Link
mkohman
Registered User
 
Join Date: Jun 2018
Posts: 51
Quote:
Originally Posted by tp4tissue View Post
Don't use DXVA2 native.

Use either DXVA2 copy-back or D3D11 copy-back.
So you mean I should use DXVA2 copy-back or D3D11 copy-back? Why is this? Why not native? Also, what's better, DXVA2 or D3D11? Thanks.
mkohman is offline   Reply With Quote
Old 28th February 2019, 05:03   #55014  |  Link
ryrynz
Registered User
 
 
Join Date: Mar 2009
Posts: 3,650
Use search.
ryrynz is offline   Reply With Quote
Old 28th February 2019, 09:20   #55015  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 942
Just to say that I've reported another bug to JVC, as some of you are asking me via PMs about using 12bits with the JVCs.

When in 12bits, with any driver version, if you set the nVidia CP to RGB Full the JVC will report RGB 12bits input in its info screen, but internally it switches to YCC422, which is of course very bad for chroma as it happens entirely behind madVR's back.
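
(To illustrate why a silent fall-back to 4:2:2 matters: it keeps full luma resolution but halves the horizontal chroma resolution, so fine colour detail gets averaged away. Below is a toy numpy sketch of my own, nothing JVC- or madVR-specific; real hardware uses proper chroma filtering on limited-range data rather than this naive pair averaging, but the principle is the same.)

[CODE]
import numpy as np

# Toy example: a row of alternating pure red / pure blue pixels.
# Luma survives 4:2:2 subsampling, but neighbouring chroma samples get averaged.
rgb = np.zeros((8, 3))
rgb[0::2] = [1.0, 0.0, 0.0]   # red
rgb[1::2] = [0.0, 0.0, 1.0]   # blue

def to_ycbcr(rgb):            # BT.709, full range, float
    y  = 0.2126 * rgb[:, 0] + 0.7152 * rgb[:, 1] + 0.0722 * rgb[:, 2]
    cb = (rgb[:, 2] - y) / 1.8556
    cr = (rgb[:, 0] - y) / 1.5748
    return y, cb, cr

def to_rgb(y, cb, cr):
    r = y + 1.5748 * cr
    b = y + 1.8556 * cb
    g = (y - 0.2126 * r - 0.0722 * b) / 0.7152
    return np.stack([r, g, b], axis=1)

y, cb, cr = to_ycbcr(rgb)
# 4:2:2: average each horizontal pair of chroma samples, then repeat them.
cb422 = np.repeat((cb[0::2] + cb[1::2]) / 2, 2)
cr422 = np.repeat((cr[0::2] + cr[1::2]) / 2, 2)
out = to_rgb(y, cb422, cr422)

print("max per-channel error after 4:2:2 round trip:", np.abs(out - rgb).max())
# The alternating saturated colours come back heavily desaturated, whereas a
# 4:4:4 round trip (no subsampling) would reproduce them exactly.
[/CODE]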

You can see this if you set the levels to standard in the JVCs, as you have to set them to 0-255 in madVR given that there are no levels in YCC, which is an incorrect setting. However, if you set both levels and colorspace to auto in the JVC and have madVR set to 0-255 anyway, then you won't see any problem, as everything happens behind your back (and madVR's back). madVR sends RGB 0-255 12bits, the GPU is set to 0-255 12bits, the JVC displays an RGB 12bits input, but in fact the GPU is forced to send YCC 422 by the JVC.

You can see this if you set the colorspace to RGB (what it should be according to the GPU settings): you get the wrong colors. The only way to get correct colors in 12bits is to set the JVC to 422 or auto. This has been reported to JVC, so hopefully it will be fixed by firmware in the new models.

To avoid this happening behind madVR's back, you either have to select YCC422 in the nVidia CP (not recommended) or select RGB 8bits. Given the issues in 12bits with calibration and refresh rates, I still recommend selecting 8bits both in the nVidia CP and in madVR properties. There is no banding even with 10bit content with this setting, as madVR's dithering is excellent (ordered or above). There might be a very slightly higher noise level near black, but that's invisible from a sitting position (even at 1 screen width). Even 7bits does very well. You have to go down to 6bits to start seeing more noise, and even then banding is minimal. That's testing with pixel shader tone mapping. I haven't tested with HDR passthrough.
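
(On the "no banding at 8bits" point, here is a tiny self-contained Python illustration of the principle. It has nothing to do with madVR's actual dithering algorithm, which is far more sophisticated than the plain random dither used here: truncating a 10bit ramp to 8bits produces visible steps, while adding a little noise before rounding lets the locally averaged output track the original ramp smoothly.)

[CODE]
import numpy as np

rng = np.random.default_rng(0)

# A shallow near-black 10-bit ramp, the kind of gradient that shows banding.
src10  = np.linspace(64, 80, 4096)     # 10-bit code values
target = src10 / 4.0                   # ideal (fractional) 8-bit values

truncated = np.floor(target)                              # plain truncation
dithered  = np.floor(target + rng.random(target.shape))   # add ~1 LSB noise first

def local_avg(x, w=64):
    """Roughly what the eye does: average over a small neighbourhood."""
    return np.convolve(x, np.ones(w) / w, mode="valid")

err_trunc = np.abs(local_avg(truncated) - local_avg(target)).max()
err_dith  = np.abs(local_avg(dithered)  - local_avg(target)).max()
print(f"perceived ramp error, truncated: ~{err_trunc:.2f} LSB (visible banding)")
print(f"perceived ramp error, dithered : ~{err_dith:.2f} LSB (smooth, slight noise)")
[/CODE]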

Unfortunately, selecting 8bits isn't an option with the older models (at least it was the case for my RS500) due to the magenta bug, which occurs at all refresh rates in 4K with recent drivers, and at 4K60 in 385.28. The only way to get rid of the magenta is to not send HDR metadata, both in SDR and HDR, and even that doesn't work for everyone.
__________________
Win11 Pro x64 b23H2
Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33
madVR/LAV/jRiver/MyMovies/CMC
Denon X8500HA>HD Fury VRRoom>TCL 55C805K

Last edited by Manni; 30th April 2020 at 12:45.
Manni is offline   Reply With Quote
Old 28th February 2019, 11:02   #55016  |  Link
sneaker_ger
Registered User
 
Join Date: Dec 2002
Posts: 5,565
Quote:
Originally Posted by mkohman View Post
So you mean I should use DXVA2 copyback or D3D11 Copyback? Why is this? why not Native? Also whats better DXVA 2 or D3D11? Thanks
DXVA2 native can reduce picture quality on some GPU/driver combinations (mainly Nvidia, IIRC).

DXVA2 copy-back, D3D11 copy-back, D3D11 native and software decoding all have perfect picture quality. But D3D11 native supports neither deinterlacing nor black bar detection. If you don't need those, you can use D3D11 native.

https://forum.doom9.org/showpost.php...ostcount=54943
sneaker_ger is offline   Reply With Quote
Old 28th February 2019, 11:14   #55017  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 942
Quote:
Originally Posted by sneaker_ger View Post
DXVA2 native can reduce picture quality on some GPU/Driver (mainly Nvidia, IIRC).

DXVA2 copy-back, D3D11 copy-back, D3D11 native and software decoding have perfect picture quality. But D3D11 native doesn't support deinterlacing nor black bar detection. If you don't need those you can use D3D11 native.

https://forum.doom9.org/showpost.php...ostcount=54943
UHD BD menus in jRiver also need copy-back (they don't work with native).

So it's not just for black bar detection that we need copy-back.

Also, D3D11 native is the fastest mode for measuring files with madMeasureHDR for many people.

So there might be some frequent switching involved between native and copy-back, at least if you use measurement files...
__________________
Win11 Pro x64 b23H2
Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33
madVR/LAV/jRiver/MyMovies/CMC
Denon X8500HA>HD Fury VRRoom>TCL 55C805K
Manni is offline   Reply With Quote
Old 28th February 2019, 11:27   #55018  |  Link
mkohman
Registered User
 
Join Date: Jun 2018
Posts: 51
Quote:
Originally Posted by sneaker_ger View Post
DXVA2 native can reduce picture quality on some GPU/Driver (mainly Nvidia, IIRC).

DXVA2 copy-back, D3D11 copy-back, D3D11 native and software decoding have perfect picture quality. But D3D11 native doesn't support deinterlacing nor black bar detection. If you don't need those you can use D3D11 native.

https://forum.doom9.org/showpost.php...ostcount=54943
Quote:
Originally Posted by Manni View Post
UHD BD Menus in jRiver also need copyback (they don't work with native).

So it's not just for black bar detection that we need copyback.

Also D3D11 native is the fastest mode to measure files with madMeasureHDR for many.

So there might be some frequent switching involved between native and copyback, at least if you use measurements files...
Thank you so much guys. So what I take from this is to use D3D11 copy-back, as I use masking within my JVC on my scope screen. What I was wondering is whether there is any difference in quality between D3D11 native and copy-back, or are they the same in terms of quality? Thank you.

By the way, I have an RX 580 and currently use NGU Anti-Alias (medium) on my 4K titles. I was looking to change to the Vega 64 card, but madshi recommended the RTX 2060 or 2070; any thoughts on this, as I thought the Vega was a stronger card? Or shall I just stick with my RX 580 for now and see what the future holds? TBH I am not sure about Nvidia drivers and all the bugs they've had. AMD cards have always worked a treat for me.

Last edited by mkohman; 28th February 2019 at 11:29.
mkohman is offline   Reply With Quote
Old 28th February 2019, 14:23   #55019  |  Link
sneaker_ger
Registered User
 
Join Date: Dec 2002
Posts: 5,565
Quote:
Originally Posted by mkohman View Post
What I was wondering is whether there is any difference in quality between D3D11 native and copy-back, or are they the same in terms of quality?
Same quality. All decoders except sometimes DXVA2 native have the same quality.
sneaker_ger is offline   Reply With Quote
Old 28th February 2019, 14:27   #55020  |  Link
grendelrt
Registered User
 
Join Date: Sep 2017
Posts: 15
Quote:
Originally Posted by Manni View Post
Just to say that I've reported another bug to JVC, as some of you are asking me via PMs about using 12bits with the JVCs.

When in 12bits, with any driver version, if you set the nVidia CP to RGB Full the JVC will report RGB 12bits input in its info screen, but internally it switches to YCC422, which is of course very bad for chroma as it happens entirely behind madVR's back.

You can see this if you set the levels to standard in the JVCs, as you have to set them to 0-255 in MadVR given that there are no levels in YCC, which is an incorrect setting. However, if you set both levels and colorspace to auto in the JVC and have madVR set to 0-255 anyway, then you won't see any problem as everything will happen behind your back (and madVR's back). madVR sends RGB 0-255 12bits, the GPU is set to 0-255 12bits, the JVC displays an RGB 12bits input, but in fact the GPU is forced to send YCC 422 by the JVC.

You can see this if you set colorspace to RGB (what it should be according to the GPU settings): you get the wrong colors. The only way to get correct colors in 12bits is to set the JVC to 422 or auto. This has been reported to JVC so hopefully will be fixed by f/w in the new models.

To avoid this happening behind madVR's back, you either have to select YCC444 in the nVidia CP (not recommended) or select RGB 8bits. Given the issues in 12bits with calibration and refresh rates, I still recommend to select 8bits both in the nVidia CP and in madVR properties. There is no banding even with 10bits content with this setting as madVR's dithering is excellent (ordered or above). There might be a very slightly higher noise level near black, but that's invisible from a sitting position (even at 1 screen width). Even 7bits does very well. You have to go down to 6bits to start seeing more noise, and even then banding is minimal. That's testing with pixel shader tonemapping. I haven't tested with HDR passthrough.

Unfortunately selecting 8bits isn't an option with the older models (at least it was the case for my rs500) due to the magenta bug, in all refresh rates in 4K with recent drivers, and at 4K60 in 385.28. The only way to get rid of the magenta is to not send HDR metadata, both in SDR and HDR, and even that doesn't work for everyone.
Thanks Manni for the post. I will move to RGB 8bit in both the Nvidia settings and the native display bitdepth setting for the time being. For your last sentence though, why would you be sending HDR metadata? (I am assuming you are using tone mapping in madVR and sending the mapped data to the JVC, so you would not need it?)

Last edited by grendelrt; 28th February 2019 at 14:32.
grendelrt is offline   Reply With Quote