Welcome to Doom9's Forum, THE place to be for everyone interested in DVD conversion.

Old 8th January 2019, 09:55   #54161  |  Link
glc650
Registered User
 
Join Date: May 2003
Posts: 44
video card upgrade = dozens of presentation glitches every second

Hi,

I just upgraded from a Gigabyte Radeon HD 7870 to a XFX Radeon RX 580 GTS and now I get dozens of presentation glitches every second when playing any video if the HDMI Scaling slider is set above 0% in the AMD Radeon Settings GUI.

I tried updating the Radeon drivers to 18.12.2 (I was on 17.12.1 with the old card), but I still get the issue. I've also tried restoring madVR's default settings, the same madVR settings I used with the previous card, and the previous version of madVR.

Never had this issue with the 7870 and I had the HDMI scaling slider set to 3% (which is where I also need it set for the new card).

Thanks,

>g.
Old 8th January 2019, 12:03   #54162  |  Link
actarusfleed
Registered User
 
Join Date: Jun 2009
Posts: 71
Has anyone ever compared a GTX 1070 Ti vs an RTX 2070 (with madVR only)?

Some YouTube videos cover this, but only from the gaming side...

I'm only interested in evaluating this from the madVR point of view...
Old 8th January 2019, 15:46   #54163  |  Link
tp4tissue
Registered User
 
tp4tissue's Avatar
 
Join Date: May 2013
Posts: 387
Quote:
Originally Posted by actarusfleed View Post
Has anyone ever compared a GTX 1070 Ti vs an RTX 2070 (with madVR only)?

Some YouTube videos cover this, but only from the gaming side...

I'm only interested in evaluating this from the madVR point of view...

Peeps are just cranking the upscaling to arbitrarily high settings to saturate their cards.

I'm not sure we can call that performance..

At 2 meters from my TV, I can't even see the difference between bilinear chroma and NNEDI3..

I guess you could make a case for scaling 4K up to 8K TVs..

I have an old 7970, and it can do 4K Jinc chroma plus shader conversion to SDR in 15 ms..
__________________
Ghetto | 2500k 5Ghz

Last edited by tp4tissue; 8th January 2019 at 15:50.
Old 8th January 2019, 16:39   #54164  |  Link
tp4tissue
Registered User
 
tp4tissue's Avatar
 
Join Date: May 2013
Posts: 387
Quote:
Originally Posted by huhn View Post
Do you have a screen that can do more than DCI-P3? Well, no, so you don't have to care; the 3D LUT will do the gamut mapping anyway. Just use the default setting.


Clearly not 10000, so feel free to use the measured peak brightness.
I personally don't think this is a useful measurement anyway; on an X900F the peak brightness for a 10% window is far higher than for a 2% window.
You misunderstood my question, huhn..

My question is:

How does changing the sent metadata from Rec. 2020 to DCI-P3 change the image, if at all?

How does changing the peak-nits part of the metadata affect the image?

If there is a change at all, what does the TV DO with that metadata? Does it pop into different modes?
__________________
Ghetto | 2500k 5Ghz

Last edited by tp4tissue; 8th January 2019 at 16:41.
Old 8th January 2019, 17:01   #54165  |  Link
SirSwede
Registered User
 
Join Date: Nov 2017
Posts: 67
Quote:
Originally Posted by tp4tissue View Post
Peeps are just cranking the upscaling to arbitrarily high settings to saturate their cards.

I'm not sure we can call that performance..

At 2 meters from my TV, I can't even see the difference between bilinear chroma and NNEDI3..

I guess you could make a case for scaling 4K up to 8K TVs..

I have an old 7970, and it can do 4K Jinc chroma plus shader conversion to SDR in 15 ms..
Still on Bicubic 75 for chroma here; never changed it. Looks splendid on my TV. Two metres here as well.

But then again, it is madVR, after all.
Old 8th January 2019, 18:47   #54166  |  Link
border.community
Registered User
 
Join Date: Dec 2018
Posts: 10
I am mostly playing UHD MKV rips and was wondering which madVR settings actually affect the picture, since I am neither upscaling nor downscaling.

It seems like only the settings under Devices, Processing and Rendering would make a difference, and Scaling Algorithms would not, but if I were to change NGU Anti-Alias to "high quality" it would obviously affect performance, which raises the question: what are the scaling algorithms doing to files that do not need to be scaled?
Old 8th January 2019, 18:52   #54167  |  Link
SamuriHL
Registered User
 
SamuriHL's Avatar
 
Join Date: May 2004
Posts: 4,203
Chroma still needs to be scaled. It's not about resolution. Chroma data is stored in 4:2:0 and needs to be scaled to 4:4:4 or 4:2:2 depending on what you're sending to your display. However, whether you can tell the difference between NGU High AA or bicubic75 or not is an individual thing and only you can decide with your eyes if you can see a difference.
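As a rough sketch of what that chroma step means: 4:2:0 stores the two chroma planes at half the luma resolution in both dimensions, so the renderer has to bring them back to full size before converting to RGB. Nearest-neighbour upsampling is used here purely for illustration; madVR's actual chroma scalers (Bicubic, Jinc, NGU, etc.) are far more sophisticated.

```python
import numpy as np

def upsample_chroma_420_to_444(y, cb, cr):
    """Upsample 4:2:0 chroma planes to full (4:4:4) resolution.

    Nearest-neighbour (pixel duplication) is only for illustration;
    real renderers use proper interpolation kernels for this step.
    """
    cb_full = np.repeat(np.repeat(cb, 2, axis=0), 2, axis=1)
    cr_full = np.repeat(np.repeat(cr, 2, axis=0), 2, axis=1)
    return y, cb_full, cr_full

# In 4:2:0, chroma is stored at half resolution in both dimensions.
y = np.zeros((4, 4), dtype=np.uint8)
cb = np.zeros((2, 2), dtype=np.uint8)
cr = np.zeros((2, 2), dtype=np.uint8)

y2, cb2, cr2 = upsample_chroma_420_to_444(y, cb, cr)
print(cb2.shape)  # chroma now matches luma resolution
```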
__________________
HTPC: Windows 10, I9 9900k, RTX 2070 Founder's Edition, Pioneer Elite VSX-LX303, LG C8 65" OLED
Old 8th January 2019, 19:15   #54168  |  Link
border.community
Registered User
 
Join Date: Dec 2018
Posts: 10
Quote:
Originally Posted by SamuriHL View Post
Chroma still needs to be scaled.
Ah, thanks. I have a 1050ti and a C8 as well. Chroma upscaling is really the only thing I have turned on. Do you have anything else turned on, such as any of the Processing options?
Old 8th January 2019, 19:58   #54169  |  Link
SamuriHL
Registered User
 
SamuriHL's Avatar
 
Join Date: May 2004
Posts: 4,203
I've got a 1060 6 GB and I've got quite a few things enabled, yes. I've had to change some of my settings recently because of the new HDR tone mapping options that will be in the upcoming madVR release. It's a little more punishing than the current release build. It's always a trade-off of performance vs quality, which is why you see so many people shelling out cash for 1080s and 2080s to make sure they have some breathing room. Everyone's configuration in madVR is going to be different based on what content they're viewing and what they're looking for in terms of quality.
__________________
HTPC: Windows 10, I9 9900k, RTX 2070 Founder's Edition, Pioneer Elite VSX-LX303, LG C8 65" OLED
Old 8th January 2019, 20:51   #54170  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,734
Quote:
Originally Posted by tp4tissue View Post
How does changing the sent metadata from Rec. 2020 to DCI-P3 change the image, if at all?
If your screen can do BT.2020 natively, it would do something; if it is close to DCI-P3, it should do nothing at all.
Quote:
How does changing the peak-nits part of the metadata affect the image?
Quote:
If there is a change at all, what does the TV DO with that metadata? Does it pop into different modes?
Who knows; literally whatever they want. HDR tone mapping doesn't have a spec where one way is correct.
The difference should be in the image dynamics until it hits its max brightness; after that it does whatever it wants.

As I said before, there is no clear answer.
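To illustrate the point that behaviour above the display's peak is unspecified, here are two toy curves: one display might hard-clip highlights while another rolls them off gradually. Neither curve is what any real TV (or madVR) actually uses, and the knee position and 700-nit peak are made-up values for the example.

```python
# Two toy tone-mapping behaviours. Below the display's peak, a 1:1
# mapping is possible; above it, the spec doesn't dictate behaviour,
# so different displays legitimately diverge.

def hard_clip(nits, peak):
    """Pass everything through unchanged, clip at the display peak."""
    return min(nits, peak)

def simple_rolloff(nits, peak):
    """Compress everything above a knee so highlights keep some detail."""
    knee = peak * 0.75
    if nits <= knee:
        return nits
    excess = nits - knee
    headroom = peak - knee
    # Reinhard-style compression: asymptotically approaches the peak.
    return knee + headroom * excess / (excess + headroom)

peak = 700.0  # assumed display peak in nits
for source in (100.0, 650.0, 1000.0, 4000.0):
    print(source, hard_clip(source, peak), round(simple_rolloff(source, peak), 1))
```

Both curves agree for dark content; they only disagree in the highlights, which matches what you see when two HDR TVs render the same bright scene differently.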
Quote:
Originally Posted by SamuriHL View Post
Chroma still needs to be scaled. It's not about resolution. Chroma data is stored in 4:2:0 and needs to be scaled to 4:4:4 or 4:2:2 depending on what you're sending to your display.
And in madVR's case it will always be scaled to 4:4:4, and only RGB is sent to the GPU driver. That's why sending 4:2:2 or 4:2:0 to the display is bad for the applied dithering and chroma quality: the driver has to do an RGB-to-YCbCr conversion, which alone creates floating-point numbers, then downscale the chroma, which again results in floating-point numbers. Chroma subsampling is just storing chroma at a lower resolution than luma, so it is pretty much about resolution.

And as a fun fact, 8-bit RGB has more bits per pixel than 10-bit YCbCr 4:2:2.

For most scenes chroma is not important, to the point that bilinear is enough, but as soon as an image gets chroma-heavy (red text on a black background, for example) it shows significant differences, especially on BDs, where you are now literally using a good scaler on SD-resolution chroma.
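The fun fact is easy to verify with a little arithmetic: in 4:2:2 each pair of chroma samples is shared between two pixels, so 10-bit YCbCr 4:2:2 averages 20 bits per pixel versus 24 for 8-bit RGB. A quick sketch:

```python
# Average bits per pixel for a few pixel formats.
def bpp(bits, y_samples, cb_samples, cr_samples, pixels):
    """Bits per pixel given sample counts per group of `pixels` pixels."""
    return bits * (y_samples + cb_samples + cr_samples) / pixels

rgb_8bit = 8 * 3                       # three full-resolution planes
ycbcr_422_10bit = bpp(10, 2, 1, 1, 2)  # chroma pair shared by 2 pixels
ycbcr_420_8bit = bpp(8, 4, 1, 1, 4)    # chroma pair shared by 4 pixels

print(rgb_8bit, ycbcr_422_10bit, ycbcr_420_8bit)  # 24 20.0 12.0
```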

Last edited by huhn; 8th January 2019 at 20:55.
Old 8th January 2019, 21:00   #54171  |  Link
NM20
Registered User
 
Join Date: Jan 2019
Posts: 11
O.k. I have read a fair bit and you guys (and gals) seem to know what you are talking about.

I have a 1080 Ti and am sending the image to a JVC projector that is 4K capable.

My questions are:

I am using 'tone map HDR using pixel shaders'. Would ticking 'measure each frame's peak luminance' be the correct thing to do?

Also, where do I check the DXVA scaling/decoding? Is it in LAV filters?

Thanks
Old 8th January 2019, 21:04   #54172  |  Link
SamuriHL
Registered User
 
SamuriHL's Avatar
 
Join Date: May 2004
Posts: 4,203
Quite true. That's where the compromises come into play if you're on a performance-limited card, however. Do you lean more toward the HDR processing (I am right now) or toward chroma upscaling? It really depends on what you've got under the hood and the content you're trying to watch. And that's also where profiles can come into play. Something I've still not delved into myself but really should soon. Because you can crank up chroma upscaling if you're e.g. scaling a Blu-ray to UHD resolution, and sacrifice chroma upscaling to save performance for HDR tone mapping when watching UHD content. That's just one example, but you get the idea.

At the moment I'm fighting with my 1060, AVR, and TV to get my picture to not be cut off. I tried activating scaling in the NVIDIA control panel, but it kills HDR as soon as I enable it. So I'm stuck with a native res that's now, for whatever reason, getting cut off on the TV. I really can't win. LOL
__________________
HTPC: Windows 10, I9 9900k, RTX 2070 Founder's Edition, Pioneer Elite VSX-LX303, LG C8 65" OLED
Old 8th January 2019, 21:26   #54173  |  Link
ryrynz
Registered User
 
ryrynz's Avatar
 
Join Date: Mar 2009
Posts: 3,172
Quote:
Originally Posted by SamuriHL View Post
However, whether you can tell the difference between NGU High AA or bicubic75 or not is an individual thing and only you can decide with your eyes if you can see a difference.
Yeah, there will be scenes where I think I would notice a difference, but it's not a question of that; it's that I know from my own investigations that NGU AA is superior to Bicubic 75, and I have the performance to use it. The card is there to be used, so why not eke everything I think is worth it out of it? Nothing wrong with Bicubic 75 though; it's one of the better lower-end choices, and nobody is going to knock you for using that.

Quote:
Originally Posted by SamuriHL View Post
It really depends on what you've got under the hood and the content you're trying to watch. And that's also where profiles can come into play.
Very much this; people need to be using profiles, they're so easy to set up.
For full HD content on my 1080 I do a 2x supersample with NGU Sharp and downscale with SSIM 2D. It's demanding as hell, but because I have no other sharpening on my set or in madVR, the resulting picture is very nice.. the edges are sharp, the picture does not look sharpened at all, and the whole thing has a very high-res sort of pop to it.
For hand-drawn animated content I don't do this because the detail isn't there (most anime is actually captured at 720p, so much of it is upscaled anyway), and I've tested what I gain from supersampling it: next to nothing.. You've got to know where to invest those resources. From the 750 Ti -> 960 -> 1060 6GB -> 1080 it's been a balancing act I think I've managed particularly well; not a lot has changed when it comes to non-HDR content upscaling/processing.

Quote:
Originally Posted by NM20 View Post
O.k. I have read a fair bit and you guys (and gals) seem to know what you are talking about.

I am using 'tone map HDR using pixel shaders'. Would ticking 'measure each frame's peak luminance' be the correct thing to do?
It's something you can do for better highlights; there's nothing right or wrong about any of the options. If you can enable it without frame drops, then do it.
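Conceptually, "measure each frame's peak luminance" means scanning the frame for its brightest pixel and tone mapping against that, rather than against the static metadata for the whole stream. madVR's real analysis is more elaborate (histograms, scene detection); this sketch, using the BT.2020 luma weights and an assumed normalisation where 1.0 maps to the 10000-nit PQ peak, only shows the idea:

```python
import numpy as np

def frame_peak_nits(rgb_linear, pq_peak=10000.0):
    """Peak luminance of a linear-light RGB frame, in nits.

    Uses the BT.2020 luma weights; rgb_linear is assumed normalised
    to [0, 1], where 1.0 corresponds to `pq_peak` nits.
    """
    luminance = (0.2627 * rgb_linear[..., 0]
                 + 0.6780 * rgb_linear[..., 1]
                 + 0.0593 * rgb_linear[..., 2])
    return float(luminance.max() * pq_peak)

frame = np.zeros((4, 4, 3))
frame[0, 0] = (0.1, 0.1, 0.1)  # one bright-ish grey pixel
print(frame_peak_nits(frame))  # ~1000 nits for this synthetic frame
```

A dark frame measured this way gets a much lower peak than MaxCLL would suggest, which is exactly why per-frame measurement can preserve more highlight detail.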

Quote:
Originally Posted by NM20 View Post
Also, where do I check the DXVA scaling/decoding? Is it in LAV filters?
Yes, those options are under 'Hardware decoder to use'.
You may wish to select the chroma DXVA options in 'Trade quality for performance' if you want the best performance from your card at the expense of image quality, evaluate for yourself.

Last edited by ryrynz; 8th January 2019 at 21:43.
Old 8th January 2019, 22:24   #54174  |  Link
NM20
Registered User
 
Join Date: Jan 2019
Posts: 11
Quote:
Originally Posted by ryrynz View Post
Yes, those options are under 'Hardware decoder to use'.
You may wish to select the chroma DXVA options in 'Trade quality for performance' if you want the best performance from your card at the expense of image quality, evaluate for yourself.
What option should I be picking in LAV with a 1080ti to get the best performance and the HDR tone mapping? CUVID? Native? Copy back?

Thanks for the help so far.
Old 8th January 2019, 23:14   #54175  |  Link
SamuriHL
Registered User
 
SamuriHL's Avatar
 
Join Date: May 2004
Posts: 4,203
Quote:
Originally Posted by ryrynz View Post
Yeah, there will be scenes where I think I would notice a difference, but it's not a question of that; it's that I know from my own investigations that NGU AA is superior to Bicubic 75, and I have the performance to use it. The card is there to be used, so why not eke everything I think is worth it out of it? Nothing wrong with Bicubic 75 though; it's one of the better lower-end choices, and nobody is going to knock you for using that.
I only changed to Bicubic 75 in the latest test build because I'm preferring the tone mapping options right now, as they are quite impressive. Eventually I'll tweak everything and settle on a happy medium somewhere. I agree wholeheartedly that NGU AA absolutely rocks, and I can definitely see a difference depending on content. I'll get back to it. Then again, I'm looking at rebuilding some machines this year at some point, so I'll have the horsepower to crank everything back up. The 1060 was a compromise when I bought it. I really wanted a 1070, but the stupid crypto nonsense was going on, I couldn't find one locally, and online prices were ridiculous. Everything is a compromise.

Quote:
Originally Posted by ryrynz View Post
Very much this; people need to be using profiles, they're so easy to set up.
For full HD content on my 1080 I do a 2x supersample with NGU Sharp and downscale with SSIM 2D. It's demanding as hell, but because I have no other sharpening on my set or in madVR, the resulting picture is very nice.. the edges are sharp, the picture does not look sharpened at all, and the whole thing has a very high-res sort of pop to it.
For hand-drawn animated content I don't do this because the detail isn't there (most anime is actually captured at 720p, so much of it is upscaled anyway), and I've tested what I gain from supersampling it: next to nothing.. You've got to know where to invest those resources. From the 750 Ti -> 960 -> 1060 6GB -> 1080 it's been a balancing act I think I've managed particularly well; not a lot has changed when it comes to non-HDR content upscaling/processing.
Profiles are absolutely my next investment of time. I just haven't gone there yet, but it's definitely needed now. When I get back from vacation, that's top of the list.
__________________
HTPC: Windows 10, I9 9900k, RTX 2070 Founder's Edition, Pioneer Elite VSX-LX303, LG C8 65" OLED
Old 8th January 2019, 23:16   #54176  |  Link
SamuriHL
Registered User
 
SamuriHL's Avatar
 
Join Date: May 2004
Posts: 4,203
Quote:
Originally Posted by NM20 View Post
What option should I be picking in LAV with a 1080ti to get the best performance and the HDR tone mapping? CUVID? Native? Copy back?

Thanks for the help so far.
Unless you NEED copy back for whatever reason (black bar detection, deinterlacing, etc), I would personally recommend D3D11. It's what I've been using for a long time now and it works very well.
__________________
HTPC: Windows 10, I9 9900k, RTX 2070 Founder's Edition, Pioneer Elite VSX-LX303, LG C8 65" OLED
Old 8th January 2019, 23:17   #54177  |  Link
ryrynz
Registered User
 
ryrynz's Avatar
 
Join Date: Mar 2009
Posts: 3,172
This should really be asked in the LAV forum, but preferably use D3D11, as SamuriHL said.
Old 8th January 2019, 23:17   #54178  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,105
Definitely D3D11 Native with pixel shader tone mapping.
Old 9th January 2019, 00:16   #54179  |  Link
tp4tissue
Registered User
 
tp4tissue's Avatar
 
Join Date: May 2013
Posts: 387
Quote:
Originally Posted by huhn View Post
If your screen can do BT.2020 natively, it would do something; if it is close to DCI-P3, it should do nothing at all.




Who knows; literally whatever they want. HDR tone mapping doesn't have a spec where one way is correct.
The difference should be in the image dynamics until it hits its max brightness; after that it does whatever it wants.

As I said before, there is no clear answer.

So, everything pretty much works as expected now on my 1050 Ti and 1060. Thanks a lot, huhn.

However, my basement ATI 7870 XT isn't showing the measured screen nits, and the highlight restoration seems to have no effect.. is it not hooking into the old GPU's DirectCompute?
__________________
Ghetto | 2500k 5Ghz
Old 9th January 2019, 01:15   #54180  |  Link
el Filou
Registered User
 
el Filou's Avatar
 
Join Date: Oct 2016
Posts: 494
Quote:
Originally Posted by glc650 View Post
I just upgraded from a Gigabyte Radeon HD 7870 to a XFX Radeon RX 580 GTS and now I get dozens of presentation glitches every second when playing any video if the HDMI Scaling slider is set above 0% in the AMD Radeon Settings GUI.
Sounds like an AMD driver bug then. As scaling for TV overscan compensation is one of the very last driver steps before sending out the picture, it may explain the presentation glitches.
Have you tried exclusive mode? Does your TV really not have any mode without overscan, even a hidden one in a service menu or something?
__________________
HTPC: Windows 10 1809, MediaPortal 1, LAV Filters, ReClock, madVR. DVB-C TV, Panasonic GT60, 6.0 speakers Denon 2310, Core 2 Duo E7400, GeForce 1050 Ti