Old 18th May 2024, 23:40   #64801  |  Link
Klaus1189
Registered User
 
Join Date: Feb 2015
Location: Bavaria
Posts: 1,684
EVR and also EVR CP, if that wasn't clear. madVR is not affected. Try MPC Video Renderer with default settings, which means D3D11 is disabled: the issue appears. Enable D3D11 and the issue is gone, so that is only a workaround.

Which issue is it on Nvidia cards? I didn't understand it.
Old 19th May 2024, 04:35   #64802  |  Link
whitestar999
Registered User
 
Join Date: Sep 2011
Posts: 54
Quote:
Originally Posted by Klaus1189 View Post
EVR and also EVR CP, if that wasn't clear. madVR is not affected. Try MPC Video Renderer with default settings, which means D3D11 is disabled: the issue appears. Enable D3D11 and the issue is gone, so that is only a workaround.

Which issue is it on Nvidia cards? I didn't understand it.
About this:

Quote:
But my AMD 7900 XTX card pops in and out of HDR instantly, no blank screen. My older 1060 has a delay and goes to a blank screen when NV HDR triggers, and I get the loading/handshake circles on the TV; sometimes it crashes the MPC-HC application.

If you're just watching movies, consider going AMD.

Also, if you use a gamma table, AMD is bulletproof; Nvidia drops its dithering and it's a pain to keep it working consistently.
Old 19th May 2024, 05:10   #64803  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 8,001
Quote:
Originally Posted by tp4tissue View Post
That only proves my point, huhn: why do I need or want to fiddle with this yet-another-loader when AMD's thingie just works?

It's such a core thing to have working right. Nvidia has had this problem for 10 years and they haven't fixed it.
But Nvidia is bit perfect, while AMD's dithering is a much harder pain to get rid of. I would love to have the same tool for AMD so I don't have to use the registry to fix simple stuff like the dithering.
Old 19th May 2024, 06:22   #64804  |  Link
whitestar999
Registered User
 
Join Date: Sep 2011
Posts: 54
Quote:
Originally Posted by huhn View Post
But Nvidia is bit perfect, while AMD's dithering is a much harder pain to get rid of. I would love to have the same tool for AMD so I don't have to use the registry to fix simple stuff like the dithering.
Considering this, would you suggest a 3050 over a similarly priced RX 6600, or a 3060 (non-Ti) over a similarly priced 7600, where the 3060/7600 are around 30% costlier than the 3050/6600?

Last edited by whitestar999; 19th May 2024 at 06:23. Reason: typo
Old 19th May 2024, 11:34   #64805  |  Link
tp4tissue
Registered User
 
Join Date: May 2013
Posts: 753
Quote:
Originally Posted by huhn View Post
But Nvidia is bit perfect, while AMD's dithering is a much harder pain to get rid of. I would love to have the same tool for AMD so I don't have to use the registry to fix simple stuff like the dithering.
I can understand the utility of bit-perfect output for measurement tools, but for media consumption why would you turn off dithering? The only time I've turned off madVR's internal dithering is to test the dithering on the TV itself. I don't see why I would turn off the GPU's dithering under normal use.

Nvidia's dithering is sporadic. You don't know when or if it'll bug out; it just does it randomly.
__________________
Ghetto | 2500k 5Ghz

Last edited by tp4tissue; 19th May 2024 at 11:40.
Old 19th May 2024, 11:38   #64806  |  Link
tp4tissue
Registered User
 
Join Date: May 2013
Posts: 753
Quote:
Originally Posted by whitestar999 View Post
Considering this, would you suggest a 3050 over a similarly priced RX 6600, or a 3060 (non-Ti) over a similarly priced 7600, where the 3060/7600 are around 30% costlier than the 3050/6600?
I can only confirm that the 7xxx series works fine with a gamma table and does not have lifted blacks in madVR/YouTube/general DXVA.

I had a 6950 XT (since sold) for a while, but I only tested madVR on that: no lifted blacks. YouTube also worked fine.

If you are buying Nvidia, the 4090 is the only card worth buying; everything below it is ridiculously overpriced for that ray-tracing halo feature, which is poorly implemented in almost all games because developers spend minimal time on it.

Rumor has it that AMD's RDNA 4 will have double the RT performance and target a $500 price point, which would make the new cards very competitive, but these are just rumors.
__________________
Ghetto | 2500k 5Ghz

Last edited by tp4tissue; 19th May 2024 at 11:44.
Old 19th May 2024, 14:03   #64807  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 8,001
Quote:
Originally Posted by whitestar999 View Post
Considering this, would you suggest a 3050 over a similarly priced RX 6600, or a 3060 (non-Ti) over a similarly priced 7600, where the 3060/7600 are around 30% costlier than the 3050/6600?
It's only one part of the equation; there is more to it.
In general the 3050 is a terrible card all-around, for example.
Quote:
Originally Posted by tp4tissue View Post
I can understand the utility of bit-perfect output for measurement tools, but for media consumption why would you turn off dithering? The only time I've turned off madVR's internal dithering is to test the dithering on the TV itself. I don't see why I would turn off the GPU's dithering under normal use.

Nvidia's dithering is sporadic. You don't know when or if it'll bug out; it just does it randomly.
I can't remember saying to disable madVR's dithering.
Old 19th May 2024, 15:48   #64808  |  Link
whitestar999
Registered User
 
Join Date: Sep 2011
Posts: 54
Quote:
Originally Posted by tp4tissue View Post
I can only confirm that the 7xxx series works fine with a gamma table and does not have lifted blacks in madVR/YouTube/general DXVA.

I had a 6950 XT (since sold) for a while, but I only tested madVR on that: no lifted blacks. YouTube also worked fine.

If you are buying Nvidia, the 4090 is the only card worth buying; everything below it is ridiculously overpriced for that ray-tracing halo feature, which is poorly implemented in almost all games because developers spend minimal time on it.

Rumor has it that AMD's RDNA 4 will have double the RT performance and target a $500 price point, which would make the new cards very competitive, but these are just rumors.
The 4090 is out of my budget; in fact, my budget can at most accommodate a 3060 Ti, and even that only after a bit of stretching. I am not a PC gamer, so I can't justify spending that much on a graphics card.

Quote:
Originally Posted by huhn View Post
It's only one part of the equation; there is more to it.
In general the 3050 is a terrible card all-around, for example.
I am not a PC gamer, so anything related to gaming performance is irrelevant to me. The main use of the graphics card in my system will be madVR (or a similar renderer) and hardware-assisted video decoding.
Old 19th May 2024, 15:57   #64809  |  Link
Sunspark
Registered User
 
Join Date: Nov 2015
Posts: 495
Internal dithering is needed because, even if a display's panel is, e.g., 10-bit and thus capable of displaying everything natively, 4:2:0 and 4:2:2 inputs still have to be interpolated/dithered to create the missing pieces and bring the signal up to 4:4:4. The panel won't do that; it just displays what it receives, and since Windows is natively 4:4:4, it has to be done before the signal reaches the panel.
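
Here is a minimal numpy sketch of that upsampling step (the sample values are made up, and plain linear interpolation stands in for whatever filter madVR, the GPU or the panel actually uses): reconstructing full-resolution chroma from a half-resolution plane immediately produces in-between values that no longer sit on whole code values, which is why something has to round, and ideally dither, before an integer signal is sent on.

Code:
import numpy as np

# Hypothetical half-resolution chroma samples from a 4:2:0/4:2:2 source (8-bit codes).
chroma_half = np.array([60, 61, 63, 66, 70], dtype=np.float64)

# Naive linear interpolation back to full horizontal resolution ("4:4:4").
x_half = np.arange(chroma_half.size)                        # positions of existing samples
x_full = np.linspace(0, chroma_half.size - 1, 2 * chroma_half.size - 1)
chroma_full = np.interp(x_full, x_half, chroma_half)

print(chroma_full)                    # [60.  60.5 61.  62.  63.  64.5 66.  68.  70. ]
# The reconstructed samples land between 8-bit code values, so an integer output has
# to round them; done without dithering, that rounding error shows up as banding.
print(np.round(chroma_full).astype(np.uint8))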

By the way, about the talk a little earlier on motion interpolation: I decided to run the Blur Busters UFO test to see which presets "overdrive" was actually working on on my monitor. The presets on my monitor are a mix of 4:4:4 and 4:2:2 as per the purple test image. Unfortunately, only two presets have working overdrive to reduce LCD blur, and those two are both 4:2:2. I wonder what actually sends native 4:2:2... a game console, but only for videos?
Old 19th May 2024, 18:03   #64810  |  Link
tp4tissue
Registered User
 
Join Date: May 2013
Posts: 753
Quote:
Originally Posted by whitestar999 View Post
I am not a PC gamer, so anything related to gaming performance is irrelevant to me. The main use of the graphics card in my system will be madVR (or a similar renderer) and hardware-assisted video decoding.
The cheapest AMD 7-series in your market will do everything in madVR. The 7700 XT has hit $350-380 in the USA on sale. I can't think of any advantage for Nvidia in madVR, and HDR switching is also not as seamless as on the AMD 7-series.
__________________
Ghetto | 2500k 5Ghz
Old 20th May 2024, 14:21   #64811  |  Link
tp4tissue
Registered User
 
Join Date: May 2013
Posts: 753
Quote:
Originally Posted by Sunspark View Post
Internal dithering is needed because, even if a display's panel is, e.g., 10-bit and thus capable of displaying everything natively, 4:2:0 and 4:2:2 inputs still have to be interpolated/dithered to create the missing pieces and bring the signal up to 4:4:4. The panel won't do that; it just displays what it receives, and since Windows is natively 4:4:4, it has to be done before the signal reaches the panel.
It's a dithering battle. The display "should" have proper dithering, but that's up in the air, which is why it's nice that madVR lets you turn its internal dithering on and off to double-check the display's dithering performance.

If the display has a good chip for 10-bit, there should be very little visible difference on gradients with madVR's dithering off.
__________________
Ghetto | 2500k 5Ghz
Old 20th May 2024, 16:54   #64812  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 8,001
That's not how that works at all.
Old 20th May 2024, 22:35   #64813  |  Link
tp4tissue
Registered User
 
Join Date: May 2013
Posts: 753
How does it work then, huhn? I thought turning off madVR's dithering was the best way to check whether the TV is dithering properly, or at all. Some TVs just give you an 8-bit gradient even when fed 10-bit.
__________________
Ghetto | 2500k 5Ghz
Old 20th May 2024, 23:50   #64814  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 8,001
If you disable dithering, then the signal has banding; it's now part of the image and no longer a processing error, and not showing it would be wrong.

If a TV is 8-bit internally and bands, it doesn't matter whether a 10-bit signal is dithered or not: the TV creates the banding by turning 10 bit into 8. If that conversion isn't dithered, you will get banding; whether the signal was dithered before that doesn't matter here.

If you want to know how good a device is, you should not disable dithering. You disable dithering to show that the device isn't debanding and can properly show banding when the signal has it.

The dithering of madVR, the GPU driver, and the end device have nothing to do with each other: if they do lossy math / floating-point math, they each need dithering to hide the error they created.
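
A quick numpy sketch of that last point (uniform 1-LSB noise stands in for a real dither pattern here; it is not madVR's error diffusion, and the ramp values are made up): a smooth gradient that spans only a few output code values either collapses into wide flat bands when it is rounded without dithering, or keeps its shape on average when roughly one code value of noise is added before rounding.

Code:
import numpy as np

rng = np.random.default_rng(0)

# A smooth near-black ramp that spans only a few 8-bit code values.
ramp = np.linspace(16.0, 20.0, 1024)

# 1) Round without dithering: the error is correlated with position, so the
#    ramp collapses into a handful of wide, flat bands.
rounded = np.round(ramp).astype(np.uint8)

# 2) Add about one code value of random noise before rounding (simple RPDF
#    dither): the same code values appear, but mixed so that the local average
#    still follows the ramp instead of stepping.
dithered = np.round(ramp + rng.uniform(-0.5, 0.5, ramp.size)).astype(np.uint8)

def longest_flat_run(x):
    """Length of the longest stretch of identical consecutive samples."""
    best = run = 1
    for a, b in zip(x[:-1], x[1:]):
        run = run + 1 if a == b else 1
        best = max(best, run)
    return best

print(longest_flat_run(rounded))    # runs a couple of hundred samples wide: visible bands
print(longest_flat_run(dithered))   # far shorter runs: the band edges dissolve into noise

Neither version carries more than 8 bits; the dithered one just hides the error it was forced to create, which is the point about every lossy stage needing its own dithering.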
Old 21st May 2024, 02:40   #64815  |  Link
tp4tissue
Registered User
 
Join Date: May 2013
Posts: 753
Hang on, huhn, I've tested this a bunch. With a proper 10-bit Spears gradient test file, if you output 10-bit with madVR's dithering off and the TV is doing bad 8-bit internally, the 10-bit side will be blotchy. If you turn madVR's dithering on, at 8-bit or 10-bit, madVR will smooth it over significantly, even if the TV is doing a bad job.
__________________
Ghetto | 2500k 5Ghz
Old 21st May 2024, 03:14   #64816  |  Link
Sunspark
Registered User
 
Join Date: Nov 2015
Posts: 495
My perspective is that with 4:2:0 and 4:2:2 content, if your panel is set to 4:4:4 and Windows is natively 4:4:4, then if you don't dither before the signal gets to the panel it's going to be missing pieces of the chroma.

I did read that the Nvidia drivers can output 4:2:2 and 4:2:0 in the settings, but as I don't have such a card I have never tested that functionality, and I haven't seen much discussion of it, so I have no idea whether it works well. I suppose if you set the card that way and also switched the panel to a 4:2:2 mode (which my monitor can do), then maybe you wouldn't need madVR's dithering.

But my iGPU is Intel; it's only ever going to output 4:4:4, so I need to make sure the panel is also set to 4:4:4, and I also need to have madVR's dithering on as a result.
Old 21st May 2024, 03:59   #64817  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 8,001
Quote:
Originally Posted by tp4tissue View Post
Hang on, huhn, I've tested this a bunch. With a proper 10-bit Spears gradient test file, if you output 10-bit with madVR's dithering off and the TV is doing bad 8-bit internally, the 10-bit side will be blotchy. If you turn madVR's dithering on, at 8-bit or 10-bit, madVR will smooth it over significantly, even if the TV is doing a bad job.
It doesn't smooth over anything; that's the job of debanding.
If the noise level is high enough it may hide it a bit, but it can not actually be smooth if you disable dithering.

The Spears & Munsil tests are, as usual, trash. The 8-bit part of these test files is rounded; it's a complete waste of time, and there is no way it will ever be smooth without debanding. Use something proper: https://www.bealecorner.org/red/test...ient-16bit.png
I mean, they are video files; they are YCbCr... and lossy on top of that.
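
If anyone would rather generate their own than download that, here is a small sketch (assuming numpy plus a reasonably recent Pillow that accepts uint16 arrays; the resolution and the single grey ramp are arbitrary choices, not a copy of the linked image) that writes a 16-bit grayscale ramp as a PNG. With 65536 levels across the width, any banding you then see on screen was added somewhere downstream, not baked into the file.

Code:
import numpy as np
from PIL import Image   # assumes a Pillow build that maps uint16 arrays to 16-bit grayscale

W, H = 3840, 400
ramp = np.linspace(0, 65535, W).round().astype(np.uint16)   # smooth left-to-right ramp
img = np.tile(ramp, (H, 1))                                 # repeat it on every row

Image.fromarray(img).save("gradient-16bit.png")             # saved as a 16-bit grayscale PNG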

If you disable dithering in madVR the result will be bands everywhere; that's not the TV's fault, and the effect is worse on native 10-bit content than on 8-bit content carried in a 10-bit signal.

@Sunspark
No, none of that.

Windows is not 4:4:4, it is RGB, and madVR can only output RGB.
YCbCr to RGB always needs dithering, so you always need dithering. The end.
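
A tiny sketch of why that conversion alone already forces dithering (standard BT.709 limited-range math in numpy, with one made-up pixel; renderers do this per pixel and usually at higher precision): even when the YCbCr input sits exactly on 8-bit code values, the RGB result almost never does, so it has to be rounded, and rounding without dithering is where the banding comes from.

Code:
import numpy as np

# One 8-bit limited-range BT.709 YCbCr pixel (values picked arbitrarily).
Y, Cb, Cr = 126.0, 116.0, 140.0

# Normalise to the nominal 0..1 and -0.5..0.5 ranges.
y  = (Y  -  16.0) / 219.0
pb = (Cb - 128.0) / 224.0
pr = (Cr - 128.0) / 224.0

# Standard BT.709 YCbCr -> RGB coefficients.
r = y + 1.5748 * pr
g = y - 0.1873 * pb - 0.4681 * pr
b = y + 1.8556 * pb

rgb = np.array([r, g, b]) * 255.0
print(rgb)              # roughly [149.6 124.2 102.7] -- none of them whole code values
print(np.round(rgb))    # what an undithered 8-bit RGB output would actually carry
# The fractions thrown away here are the "error created by lossy math"; dithering
# spreads that error as fine noise instead of letting it line up into bands.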
Old 21st May 2024, 20:48   #64818  |  Link
Amuat
Registered User
 
Join Date: Jun 2023
Posts: 59
I guess the recommended setting in the Nvidia Control Panel is RGB Full, then? What about 10-bit color: might it cause problems with 8-bit content, as convenient as it is to never have to change it?
Old 21st May 2024, 22:23   #64819  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 8,001
8-bit content also "needs" far more than 8 bits to be shown accurately.

Edit: whether your device can do 10 bit accurately is a different story.
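
A minimal numpy illustration of what "needs far more than 8 bits" means in practice (an arbitrary gamma re-trim stands in for whatever processing or calibration the chain applies; this is not a claim about any particular renderer): the moment an 8-bit source value goes through any processing, the mathematically correct result lands between 8-bit codes, and only a wider output, or dithering, can carry it faithfully.

Code:
import numpy as np

src8 = np.arange(16, 24, dtype=np.float64)          # a few 8-bit source code values

# Any processing step moves the values off the 8-bit grid; an arbitrary gamma
# adjustment is used here purely as an example.
processed = 255.0 * (src8 / 255.0) ** (2.2 / 2.4)

print(processed)                      # the correct results are fractional
print(np.round(processed))            # an undithered 8-bit output discards the fractions
print(np.round(processed * 4) / 4)    # approximating a 10-bit output (4 steps per 8-bit
                                      # step) keeps most of that precision; at 8 bits the
                                      # only way to preserve it is statistically, by dithering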
Old 22nd May 2024, 13:25   #64820  |  Link
Amuat
Registered User
 
Join Date: Jun 2023
Posts: 59
Quote:
Originally Posted by huhn View Post
8-bit content also "needs" far more than 8 bits to be shown accurately.
Would you mind elaborating on that?

As should be evident from my even considering not leaving 10-bit on all the time, I decided it's probably best to go back to SDR mode for all but HDR content. I did a bit more reading on the subject, and although my monitor seems to be one of the exceptions that can actually handle SDR content in HDR mode quite well, judging by the descriptions of how bad it usually looks, I'm sufficiently convinced now that it's still not the best way to go. I even dropped the wide-color-gamut SDR option that I previously used, as on closer inspection it makes skin tones slightly reddish instead of the slightly yellowish that looks more natural. Are there any good comprehensive guides on all this stuff that I could read, instead of learning things sporadically from here and there?

Tags
direct compute, dithering, error diffusion, madvr, ngu, nnedi3, quality, renderer, scaling, uhd upscaling, upsampling
