Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Old 24th January 2020, 01:00   #58421  |  Link
tp4tissue
Registered User
 
tp4tissue's Avatar
 
Join Date: May 2013
Posts: 489
Quote:
Originally Posted by Siso View Post
In short, I'll stay to 8 bit.
Since you own a colorimeter, you can test both and see whether accuracy, contrast, or gradient smoothness improves or degrades.
__________________
Ghetto | 2500k 5Ghz
Old 24th January 2020, 01:37   #58422  |  Link
nsnhd
Registered User
 
Join Date: Jul 2016
Posts: 109
Quote:
Originally Posted by tp4tissue View Post
You need to test it with 10bit/8bit gradient test files. See which one is smoother.

But since 444 60hz only works in 8bit, and autoswitch 23/24hz is hit or miss depending on driver, I've stuck with 8bit.

I've not seen 10bit input produce noticeably smoother gradients either, because madVR's dithering is great.
With madVR's dithering switched off, I can see that 10bit gradient files look smoother through the 10bit path. I hope some browsers can take advantage of it when playing videos, but I'm not sure.
You have to connect via DisplayPort to get 444/60Hz 10bit, and some newer driver versions don't like it.
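For anyone who wants to try this, gradient test patterns are easy to generate. A minimal numpy sketch (the file names and the narrow 0.20-0.30 ramp range are my own choices, not any standard test file) that writes matching 8bit and 10bit grayscale ramps as raw frames:

```python
import numpy as np

W, H = 3840, 2160

# Horizontal luma ramp over a narrow range, where banding is easiest to spot.
ramp = np.linspace(0.20, 0.30, W)

# 8 bit grayscale, one byte per pixel.
g8 = np.tile(np.round(ramp * 255).astype(np.uint8), (H, 1))
g8.tofile("ramp_8bit_gray.raw")

# 10 bit values stored in a 16 bit container (native byte order).
g10 = np.tile(np.round(ramp * 1023).astype(np.uint16), (H, 1))
g10.tofile("ramp_10bit_gray16le.raw")
```

A raw frame like this can then be fed through the 8bit and 10bit paths; ffplay, for instance, can read raw grayscale with `-f rawvideo -pixel_format gray -video_size 3840x2160` (the 10bit variant needs a player that accepts a 16 bit container).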
Old 24th January 2020, 02:05   #58423  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 6,225
yeah and what else should happen if you disable dithering?
Old 24th January 2020, 03:19   #58424  |  Link
nsnhd
Registered User
 
Join Date: Jul 2016
Posts: 109
I haven't noticed anything else, except that the 10bit path plays gradient files smoother than the 8bit path. I only did that for testing, and I always enable dithering in madVR. I still have to figure out how to test videos, such as football matches, in browsers to see whether the 10bit path has any effect.
Old 24th January 2020, 03:25   #58425  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 6,225
that's not a "test", how could there be another result?

you have to test whether 8 bit dithering or 10 bit dithering is smoother.
your test is comparable to taking a shower and checking whether you get wet.
Old 24th January 2020, 05:21   #58426  |  Link
nsnhd
Registered User
 
Join Date: Jul 2016
Posts: 109
Yeah, it's not a test, just what I can see. I don't know how to test browsers such as Edge or Chrome, which don't expose anything about bit depth or dithering options/settings.
Old 24th January 2020, 06:05   #58427  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 6,225
if the web browser doesn't dither (they don't), 10 bit rendering would always be better, but it's not that easy to render 10 bit, so they generally don't do it.

10 bit rendering with dithering on a screen that doesn't support it will always win against 8 bit without dithering. so what is the point of ever looking at 8 bit non-dithered? why should 10 bit non-dithered not always be better? i mean the screen doesn't even matter in this case.
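huhn's point can be shown numerically. A rough sketch, using simple TPDF dither and a narrow 0.2-0.3 ramp as illustrative stand-ins (this is not madVR's actual error diffusion): without dithering, the band count is fixed purely by the bit depth, so 10 bit trivially "wins" on any screen.

```python
import numpy as np

rng = np.random.default_rng(0)

# A smooth ramp over a narrow range, like a gradient test pattern,
# kept as float before the final bit-depth conversion.
ramp = np.linspace(0.2, 0.3, 4096)

def quantize(x, bits, dither):
    """Reduce to `bits` of precision, optionally with TPDF dither."""
    levels = 2**bits - 1
    noise = (rng.random(x.shape) - rng.random(x.shape)) if dither else 0.0
    return np.round(x * levels + noise) / levels

# Undithered: the number of distinct output levels depends only on bit
# depth, so 10 bit always shows ~4x more (smaller) steps than 8 bit:
print(len(np.unique(quantize(ramp, 8, False))))    # 26 bands
print(len(np.unique(quantize(ramp, 10, False))))   # 103 bands

# Dithered: the local average tracks the true ramp at either depth, the
# banding disappears, and only now does the comparison say anything about
# the actual viewing configuration:
err8 = np.abs(quantize(ramp, 8, True) - ramp).mean()
err10 = np.abs(quantize(ramp, 10, True) - ramp).mean()
```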
Old 24th January 2020, 06:31   #58428  |  Link
IngramAU
Registered User
 
Join Date: Jan 2018
Posts: 2
Managed to get madVR to accept a 24Hz custom mode by going for 23.999xxx; hopefully after a few runs I can get it closer to 24. Not sure why I can't start off from my EDID, it just won't accept it, with an unknown error.
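For context on why the rate lands at 23.999xxx rather than exactly 24: the refresh rate of a custom mode is just the pixel clock divided by the total timing, and drivers can only realise the clock in discrete steps. A back-of-the-envelope sketch (the 4K@24 timing numbers are the standard CTA-861 ones; the 10 kHz clock granularity is a made-up illustration):

```python
# 3840x2160 @ 24 Hz CTA-861 timing: 5500 x 2250 total, 297 MHz pixel clock.
htotal, vtotal = 5500, 2250
nominal_pclk = 297_000_000

print(nominal_pclk / (htotal * vtotal))   # exactly 24.0

# If the driver lands one clock step low, the mode comes out just under
# 24 Hz, which madVR then nudges over successive measurement runs:
pclk = nominal_pclk - 10_000
print(pclk / (htotal * vtotal))           # ~23.99919
```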
Old 24th January 2020, 14:27   #58429  |  Link
mclingo
Registered User
 
Join Date: Aug 2016
Posts: 1,107
Quote:
Originally Posted by tp4tissue View Post
Use Madvr's tonemapping, The Tv's tonemapping is prolly poop anyway.


There is a well-documented bug with AMD's HDR API which causes this; it's nothing to do with tone mapping. AMD's private API isn't sending BT.2020, so we have to use the Windows HDR API for now.

I'm already using madVR's tone mapping; however, I have to turn on Windows HDR first, which isn't ideal, but there is no fix in sight for this bug yet.
__________________
OLED 4k HDR EF950-YAM RX-V685-RYZEN 3600 - 16GBRAM - WIN10 1909 444 RGB -MD RX 5700 8GB 20.1.3 KODI DS - MAD/LAV 92.17+ 113 beta - 0.74.1 - 3D MVC / FSE:off / MADVR 10bit
Old 24th January 2020, 15:45   #58430  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,799
Quote:
Originally Posted by huhn View Post
10 bit rendering with dithering on a screen that doesn't support it will always win against 8 bit without dithering. so what is the point of ever looking at 8 bit non-dithered? why should 10 bit non-dithered not always be better? i mean the screen doesn't even matter in this case.
I wanted to expand on this because it is an interesting and more subtle point than I thought.

With dithering disabled you are not reality-testing your 8 bit vs. 10 bit paths: 10 bit will always look smoother, even if, with dithering, 8 bit would look smoother.
__________________
madVR options explained
Old 24th January 2020, 17:55   #58431  |  Link
VBB
Registered User
 
VBB's Avatar
 
Join Date: May 2016
Location: Long Beach, CA
Posts: 142
If you're unsure whether your display is 8-bit or 10-bit, there really isn't a test to determine that 100% anyway. But if you know the bit depth, you can certainly test for banding in different desktop and TV modes. And for that, I would recommend disabling dithering in madVR. You still can't be sure that the video card itself doesn't do any dithering, but it's a start. Just make sure to turn it back on when you're done testing.
__________________
Henry

LG OLED65C7P | Denon AVR-X3500H | ELAC Uni-Fi x7 | ELAC Debut 2.0 SUB3030 x2
NVIDIA GeForce GTX 960 | LAV Filters | madVR | MPC-HC | Plex | X-Rite i1Display Pro | DisplayCAL | HCFR
Old 24th January 2020, 18:15   #58432  |  Link
mrmojo666
Registered User
 
Join Date: Jan 2017
Posts: 98
@mclingo, I'm very curious to follow this BT.2020 bug; could you please link something regarding the documentation you are referring to?
__________________
AMD Ry 1500x - 8GB - RX460 4GB
TV Philips 55pus6501+ Marantz 1608 avr
WIN10(1903) 4K/444RGB
Mediaportal - Mpc-hc
MADVR-D3D11/10bit
Old 24th January 2020, 18:29   #58433  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,799
Quote:
Originally Posted by VBB View Post
But if you know the bit depth, you can certainly test for banding in different desktop and TV modes. And for that, I would recommend disabling dithering in madVR.
But if you disable dithering what do the results of the test tell you? 10 bit will always be better than 8 bit with dithering disabled, even if 8 bit would be better with it enabled. You have to do your testing with the same settings you actually use when watching, with dithering enabled.
__________________
madVR options explained
Old 24th January 2020, 19:31   #58434  |  Link
mclingo
Registered User
 
Join Date: Aug 2016
Posts: 1,107
Quote:
Originally Posted by mrmojo666 View Post
@mclingo, I'm very curious to follow this BT.2020 bug; could you please link something regarding the documentation you are referring to?
When I say documented, I mean it in the weaker sense, i.e. written down; there are no official documents detailing this. However, there is a bug logged against madVR here:

http://bugs.madshi.net/view.php?id=630

We moved the discussion about this to the driver thread as we had a complaint about spamming this thread:

https://forum.doom9.org/showthread.php?t=176013&page=50

This issue has dominated the last few pages of this thread; it's now pretty clear the API is causing it, and we're just waiting for AMD to fix it. Here is DMU's explanation of the issue:

You have entered support for HDR mode. The agsSetDisplayMode() function used to set a specific display in HDR mode, does its job perfectly: it sends metadata to the display device, which is defined in section 6.9 «Dynamic Range and Mastering InfoFrame» according to Table 5 of the CTA-861 standard. But in the same Table 5 there is also «Auxiliary Video Information (AVI)» defined in section 6.4. And all display devices are required to use the color space (colorimetry) from this data section (AVI InfoFrame) for the current video signal.
Suppose we are in SDR mode with the standard sRGB color space. And we want to switch to the HDR mode with the BT.2020 color space, which is the main one for this mode. By calling the agsSetDisplayMode() function, we put the display device in HDR mode. And we see distorted or unsaturated colors. This is because the display device did not receive the corresponding flag from the GPU in the AVI InfoFrame and is trying to display our BT.2020 color space in its sRGB.
Please tell me, do you think that such HDR support in AGS_SDK is sufficient? If yes, then advise what else needs to be done so that the display device switches to the correct color space when activating HDR mode using AGS.

The outcome: madVR is not sending BT.2020 to your display, so your colours are very unsaturated. The best workaround is to turn on Windows HDR before starting your movie player.


*ONLY AFFECTS NAVI BASED CARDS
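To illustrate the mechanism DMU describes: in CTA-861, the colorimetry a sink should apply is carried in two small bit fields of the AVI InfoFrame. A sketch of just those bits (field positions and values are my reading of the CTA-861 tables; illustrative only, not driver code):

```python
# C1:C0 = 11b in AVI data byte 2 -> "extended colorimetry info valid".
C_EXTENDED = 0x3
# EC2:EC0 = 110b in AVI data byte 3 -> ITU-R BT.2020 (YCbCr / RGB).
EC_BT2020 = 0x6

def avi_colorimetry_bytes(extended_colorimetry):
    # C lives in data byte 2, bits 7:6; EC in data byte 3, bits 6:4.
    data_byte_2 = (C_EXTENDED & 0x3) << 6
    data_byte_3 = (extended_colorimetry & 0x7) << 4
    return data_byte_2, data_byte_3

print([hex(b) for b in avi_colorimetry_bytes(EC_BT2020)])
```

A driver that sends the Dynamic Range and Mastering InfoFrame but never sets the EC field to BT.2020 leaves the display decoding the signal as sRGB/BT.709, which matches the washed-out colours reported here.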
__________________
OLED 4k HDR EF950-YAM RX-V685-RYZEN 3600 - 16GBRAM - WIN10 1909 444 RGB -MD RX 5700 8GB 20.1.3 KODI DS - MAD/LAV 92.17+ 113 beta - 0.74.1 - 3D MVC / FSE:off / MADVR 10bit

Last edited by mclingo; 24th January 2020 at 19:34.
Old 24th January 2020, 19:41   #58435  |  Link
VBB
Registered User
 
VBB's Avatar
 
Join Date: May 2016
Location: Long Beach, CA
Posts: 142
Quote:
Originally Posted by Asmodian View Post
But if you disable dithering what do the results of the test tell you? 10 bit will always be better than 8 bit with dithering disabled, even if 8 bit would be better with it enabled. You have to do your testing with the same settings you actually use when watching, with dithering enabled.
You're not testing for bit depth, though. You're trying to find the best combo with regard to banding, and from what I've seen, the best combo stays the best with dithering on and off. I guess what I'm trying to say is that I find 4:2:2 10-bit without dithering smoother even compared to 4:4:4 or RGB with dithering on. Hope that makes sense.

...and not to confuse anyone here: the above is meant for LG OLEDs only.
__________________
Henry

LG OLED65C7P | Denon AVR-X3500H | ELAC Uni-Fi x7 | ELAC Debut 2.0 SUB3030 x2
NVIDIA GeForce GTX 960 | LAV Filters | madVR | MPC-HC | Plex | X-Rite i1Display Pro | DisplayCAL | HCFR

Last edited by VBB; 24th January 2020 at 20:05.
Old 24th January 2020, 20:48   #58436  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,799
But you would come to exactly the same conclusion if you left dithering enabled for all your testing. I assume 10 bit 4:2:2 with dithering is still smoother than 4:4:4 or RGB with dithering, if it is smoother without dithering.

Aren't we testing bit depth? 8 vs. 10?

I am arguing against testing without dithering, it never tells you anything useful about your video path.

Edit: In my testing, on a C9 in PC mode, 10 bit without dithering is better than 8 bit without dithering (for banding) but 8 bit with dithering is better than 10 bit with dithering.
__________________
madVR options explained

Last edited by Asmodian; 24th January 2020 at 21:16.
Old 24th January 2020, 21:33   #58437  |  Link
DMU
Registered User
 
Join Date: Dec 2018
Posts: 144
Quote:
Originally Posted by Asmodian View Post
You do get different chroma... whether or not it is worse is more debatable. In testing it does seem better to handle chroma the same as luma, but it is one of the smallest quality losses among the "trade quality for performance" options.
I was also very interested in this question, so I did some tests.
Pic 1 - "scale chroma separately, if it saves performance" - OFF
Pic 2 - "scale chroma separately, if it saves performance" - ON
As you can see, double conversion of the chroma noticeably degrades image quality. Please correct me if I did something wrong.
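The effect DMU measured is easy to reproduce in miniature: resampling the chroma twice instead of once adds a second round of interpolation error. A 1-D sketch with plain linear interpolation (madVR's actual scalers are far better, so treat the numbers as directional only; the 1280-sample intermediate size is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)

# A 1-D stand-in for detailed chroma: full resolution, then subsampled
# 2x as in 4:2:0 content.
full = rng.random(1920)
half = full[::2]
x_half = np.arange(0, 1920, 2)
x_full = np.arange(1920)

# One-step path: chroma scaled straight to the output resolution.
one_step = np.interp(x_full, x_half, half)

# Two-step path: chroma to an intermediate size first, then to the
# output, which is roughly what a separate chroma pass can amount to.
x_mid = np.linspace(0, 1919, 1280)
mid = np.interp(x_mid, x_half, half)
two_step = np.interp(x_full, x_mid, mid)

err_one = np.abs(one_step - full).mean()
err_two = np.abs(two_step - full).mean()
# err_two > err_one: each extra resampling pass adds interpolation error.
```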
__________________
R3 2200G / Vega8 / Samsung UE40NU7100
Win10Pro 1909 / 4K RGB 59Hz / AMD 20.1.3
MPC-HC 1.9.1 / madVR 0.92.17 / FSW / 10bit@59Hz
Old 24th January 2020, 21:45   #58438  |  Link
VBB
Registered User
 
VBB's Avatar
 
Join Date: May 2016
Location: Long Beach, CA
Posts: 142
Quote:
Originally Posted by Asmodian View Post
But you would come to exactly the same conclusion if you left dithering enabled for all your testing. I assume 10 bit 4:2:2 with dithering is still smoother than 4:4:4 or RGB with dithering, if it is smoother without dithering.

Aren't we testing bit depth? 8 vs. 10?

I am arguing against testing without dithering, it never tells you anything useful about your video path.

Edit: In my testing, on a C9 in PC mode, 10 bit without dithering is better than 8 bit without dithering (for banding) but 8 bit with dithering is better than 10 bit with dithering.
For me it's more about seeing the display's raw performance and what works best with its built-in processing. My final testing is always with dithering on. PC mode is special.
__________________
Henry

LG OLED65C7P | Denon AVR-X3500H | ELAC Uni-Fi x7 | ELAC Debut 2.0 SUB3030 x2
NVIDIA GeForce GTX 960 | LAV Filters | madVR | MPC-HC | Plex | X-Rite i1Display Pro | DisplayCAL | HCFR
Old 24th January 2020, 21:56   #58439  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,799
That is the thing, though: this myth about "seeing the display's raw performance". 10 bit will always be better. You are not testing your display's raw performance, you are noticing that 10 bit has more steps than 8 bit.
__________________
madVR options explained
Old 24th January 2020, 22:05   #58440  |  Link
VBB
Registered User
 
VBB's Avatar
 
Join Date: May 2016
Location: Long Beach, CA
Posts: 142
But I'm not even bringing up 10-bit. You did. I wholeheartedly agree that madVR's 8-bit dithering is indistinguishable from 10-bit. It was merely a coincidence that the best mode for me happens to be 10-bit. I would have happily picked 8-bit if it had been better in this case.
__________________
Henry

LG OLED65C7P | Denon AVR-X3500H | ELAC Uni-Fi x7 | ELAC Debut 2.0 SUB3030 x2
NVIDIA GeForce GTX 960 | LAV Filters | madVR | MPC-HC | Plex | X-Rite i1Display Pro | DisplayCAL | HCFR
Tags
direct compute, dithering, error diffusion, madvr, ngu, nnedi3, quality, renderer, scaling, uhd upscaling, upsampling
