Old 11th May 2015, 19:42   #1  |  Link
James Freeman
Registered User
 
Join Date: Sep 2013
Posts: 919
madVR 10-bit Display Support Test

10-bit Display Support Test

This test will make it easier to see if your display actually carries a 10bit input all the way through to the panel.
We already know that D3D11 + FSE in madVR will output a 10bit signal on Windows 7 and up.
Now we just need a super smooth black-to-white test pattern so the difference is clearly visible by eye.

First download the 16bit PNG test pattern from here: http://www.bealecorner.org/red/test-...ient-16bit.png
I also attached this PNG (zipped) at the bottom in case the URL disappears.
Open a new MPC-HC (or other player) window and drag the test pattern onto it.
Make sure RGB48 is enabled in LAV Video Decoder.
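If the URL ever dies, a similar ramp can be generated locally. This is a minimal sketch of my own (not the attached file), assuming numpy and pypng are installed:

Code:
import numpy as np
import png  # pypng

# Wide enough that even the 16bit steps end up well under a pixel wide.
WIDTH, HEIGHT = 3840, 240
ramp = np.linspace(0, 65535, WIDTH).round().astype(np.uint16)
rows = np.tile(ramp, (HEIGHT, 1))  # repeat the one-pixel-high ramp into a visible band
png.from_array(rows, mode='L;16').save('Gradient-16bit.png')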

Adjust madVR settings as follows:
display -> properties -> 10bit
display -> calibration -> disable & disable GPU gamma ramp
rendering -> general -> Direct3D 11 -> on
rendering -> dithering -> none
rendering -> general -> automatic exclusive fullscreen mode (FSE) -> off (for now)
Aero in Windows should be On.

When dithering is off, the resulting output image will be a gradient at the bit depth you set under properties.
You can switch between bit depths and toggle dithering on/off to see what I mean, then go back to 10bit with dithering off to continue the test.

Now go fullscreen without FSE (windowed fullscreen) to see the 8bit banding, which should be visible because there are only 256 steps from black to white.
This is because in this mode the GPU is still sending the image to the display in 8bit, even though we selected 10bit in madVR.
You may have to move closer to the display to actually see the steps.

True 10bit output from your GPU requires FSE mode.
Now switch FSE on and go fullscreen again; the GPU now actually sends a 10bit image to the display, and if your display supports 10bit input you should see 1024 steps from black to white.
These steps are 4 times narrower than in 8bit and are practically indiscernible (to me).
In other words, you should NOT see any banding.
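A quick back-of-the-envelope check of why the 10bit steps become practically invisible (assuming a 1920-pixel-wide fullscreen ramp; the width is just a number I picked for illustration):

Code:
width_px = 1920                   # assumed horizontal resolution of the fullscreen ramp
for bits in (8, 10):
    steps = 2 ** bits             # 256 grey levels at 8bit, 1024 at 10bit
    print(f"{bits}bit: {steps} steps, each ~{width_px / steps:.1f} px wide")
# 8bit:  256 steps, each ~7.5 px wide  -> visible bands
# 10bit: 1024 steps, each ~1.9 px wide -> practically indiscernible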

If you still see the 8bit banding even though the GPU sends 10bit to your display in FSE+D3D11 mode (check with Ctrl+J), your display does not support 10bit.

Important note for AMD users:
Quote:
Originally Posted by MS-DOS
AMD cards dither the output by default, but you can disable it:

Quote:
Check the Catalyst Control Center under "Information" -> "Software" to get the 2D driver path. Then run regedit.exe and open that path in the left pane.
You should see values in the right pane like "AdapterDesc" and a bunch of settings.
Add a DWORD value named TMDS_DisableDither (for DVI), DP_DisableDither (for DisplayPort), or HDMI_DisableDither (for HDMI) with a value of 1. Reboot after.
Only that can explain why I see a difference between 8 and 10 bit output on my 6-bit DELL U2212HM connected with DVI.
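The same registry tweak can also be scripted. A rough sketch using Python's winreg module, run as administrator; the driver key below is only a placeholder for the display-adapter class path (the trailing 0000 subkey varies per system), so use the 2D driver path that CCC actually shows:

Code:
import winreg

# Placeholder path: substitute the "2D driver file path" shown in Catalyst Control Center
# under Information -> Software; the 000x subkey differs from system to system.
DRIVER_KEY = r"SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, DRIVER_KEY, 0, winreg.KEY_SET_VALUE) as key:
    for name in ("TMDS_DisableDither", "DP_DisableDither", "HDMI_DisableDither"):
        winreg.SetValueEx(key, name, 0, winreg.REG_DWORD, 1)  # 1 = disable driver dithering
# Reboot afterwards, as the quote says.
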
Note for NVIDIA users:
Quote:
Since driver version 353.06 users can select bit depth in the Nvidia Control Panel on all systems.
If you can't see the 10bit option in the CP, Nvidia will dither down to whatever lower bit depth is selected.

In other words, if you can't choose 10bit in the Nvidia CP and it stays on 8bit, then when you run madVR in 10bit FSE the Nvidia driver WILL dither and you'll see a smooth gradient, not true 10bit.
If you can select 10bit in the Nvidia CP, the driver will NOT dither and will send a true 10bit signal to your display.
My Dell U2410 flickers and switches to 10bit mode and the image is smooth (first time seeing 10bit mode after owning this display for 5 years).
Attached Files
File Type: zip Gradient-16bit.zip (1.5 KB, 5240 views)
__________________
System: i7 3770K, GTX660, Win7 64bit, Panasonic ST60, Dell U2410.

Last edited by James Freeman; 21st June 2015 at 06:32.
Old 11th May 2015, 20:47   #2  |  Link
luk008
Registered User
 
Join Date: Aug 2011
Posts: 38
Disabling the GPU gamma ramp and disabling dithering are only needed to run the test, right?

The requirements for 10 bit output are D3D11 + FSE + a compatible GPU + a compatible display?
Old 11th May 2015, 21:32   #3  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,343
Unfortunately, the test isn't necessarily as reliable as one would hope. At least it may not tell you if your display can actually show 10-bit.

Here is what happens for me: I tried this on both a screen known to have 10-bit support (also a U2410) and a screen I know does not have 10-bit support - both connected over DP.
The result was that both screens showed a smooth gradient. Now I can only assume that the U2410 actually managed to show 10-bit, but what the other screen did, I'm not sure. Did it employ dithering? Does my GPU know that it doesn't take 10-bit, and does it dither? I don't know, and I don't know how to find out.

Obviously if you do see banding after the test, you do know that you should definitely stay away from 10-bit.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders

Last edited by nevcairiel; 11th May 2015 at 21:36.
Old 11th May 2015, 21:44   #4  |  Link
tickled_pink
Registered User
 
Join Date: Dec 2011
Posts: 12
The test seems to have worked for me. I noticed my TV flickered when it switched to 10-bit mode, similar to how it flickers when switching between 50Hz and 60Hz. Could this be a telltale sign that the TV in fact supports 10-bit?
Old 11th May 2015, 21:53   #5  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
No, it might flicker simply due to the GPU changing modes.
__________________
madVR options explained
Old 11th May 2015, 22:03   #6  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
And don't forget that nearly all TVs support 10 or 12 bit input; that doesn't tell us anything about the panel itself.
And using FSE with 10 bit doesn't mean 10 bit is actually being output at all.
Old 11th May 2015, 22:14   #7  |  Link
MS-DOS
Registered User
 
Join Date: Sep 2012
Posts: 77
To everyone:

AMD cards dither the output by default, but you can disable it:

http://www.monitortests.com/forum/Th...d=3314#pid3314

Quote:
Check the Catalyst Control Center under "Information" -> "Software" to get the 2D driver path. Then run regedit.exe and open that path in the left pane. You should see values in the right pane like "AdapterDesc" and a bunch of settings. Add a DWORD value named TMDS_DisableDither (for DVI), DP_DisableDither (for DisplayPort), or HDMI_DisableDither (for HDMI) with a value of 1. Reboot after.

Only that can explain why I see a difference between 8 and 10 bit output on my 6-bit DELL U2212HM connected with DVI.

Last edited by MS-DOS; 11th May 2015 at 22:27.
Old 11th May 2015, 22:50   #8  |  Link
luk008
Registered User
 
Join Date: Aug 2011
Posts: 38
Quote:
Originally Posted by MS-DOS View Post
To everyone:

AMD cards dither the output by default, but you can disable it:

http://www.monitortests.com/forum/Th...d=3314#pid3314


HDMI_DisableDither for HDMI. Reboot after.

Only that can explain why I see a difference between 8 and 10 bit output on my 6-bit DELL U2212HM connected with DVI
After disabling AMD dithering I discovered that my TV is in fact 8 bit. I can see a smooth gradient with 10 bits when dithering is enabled. So should I still use 10 bits or just stay with 8?
Old 12th May 2015, 00:10   #9  |  Link
Zachs
Suptitle, MediaPlayer.NET
 
Join Date: Nov 2001
Posts: 1,721
From memory, when I tested this on over 10 monitors/TVs more than half a year ago with MPDN, the link you use to your display makes a difference too (i.e. HDMI vs DVI vs analog). Some will even display a corrupted image. Some work better when DX10 is used instead of DX11.
Old 12th May 2015, 00:13   #10  |  Link
MS-DOS
Registered User
 
Join Date: Sep 2012
Posts: 77
Quote:
Originally Posted by luk008 View Post
After disabling AMD dithering I discovered that my TV is in fact 8 bit. I can see a smooth gradient with 10 bits when dithering is enabled. So should I still use 10 bits or just stay with 8?
I see no point in using 10 bit output in madVR unless your monitor/TV supports 10 bit input. It will only decrease the overall image quality because of the need for two dithering steps, one by madVR (16 -> 10) and another by the GPU (10 -> 8).

We disable dithering here just for test purposes. It should always be enabled when converting to a lower bit depth, otherwise you may see banding artifacts.
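To illustrate why, here is a toy sketch (plain uniform noise, not how madVR or the driver actually dithers): straight quantization of a smooth ramp to 8 bit produces wide flat bands, while adding a little sub-LSB noise before rounding breaks them up:

Code:
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 3840)                                 # smooth ramp, one sample per pixel
plain    = np.round(x * 255)                                    # straight 8bit quantization
dithered = np.round(x * 255 + rng.uniform(-0.5, 0.5, x.size))   # noise added before rounding

def widest_band(v):
    # length of the longest run of identical output values = widest visible band
    edges = np.concatenate(([0], np.flatnonzero(np.diff(v)) + 1, [v.size]))
    return int(np.diff(edges).max())

print(widest_band(plain), widest_band(dithered))                # roughly 16 px vs just a few px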
Old 12th May 2015, 00:20   #11  |  Link
baii
Registered User
 
Join Date: Dec 2011
Posts: 180
I use a 16bit greyscale ramp PNG (converted from 10 bit test ramp.psd, which should be the same as the one by AMD) that was made to test OpenGL 30bit support in Photoshop.

It makes it much easier to spot whether the panel does 10 bit.

https://mega.co.nz/#!OYoQ2IID!0CzRLl...R70VyqpKROztK4

PS: enable LAV filter RGB48.
Old 12th May 2015, 02:26   #12  |  Link
tobindac
Registered User
 
Join Date: May 2013
Posts: 115
This gives me dithering at 8bit for some reason. The other video you had showed 8bit as more choppy. Unless the other one had a bug and didn't really show 8bit but something lower.
Old 12th May 2015, 02:32   #13  |  Link
tobindac
Registered User
 
Join Date: May 2013
Posts: 115
Ah never mind, your idea in the madVR thread about the sharpening filter was actually good. On this file it makes it much easier to see the lines ('artifact removal').

This monitor is hilarious. I can see no artifacts in 10bit exclusive.

I guess the assumption is that something dithers it, but is that for sure?
Old 12th May 2015, 04:07   #14  |  Link
bcec
Registered User
 
Join Date: Nov 2014
Posts: 81
Quote:
Originally Posted by baii View Post
I use a 16bit greyscale ramp PNG (converted from 10 bit test ramp.psd, which should be the same as the one by AMD) that was made to test OpenGL 30bit support in Photoshop.

It makes it much easier to spot whether the panel does 10 bit.

https://mega.co.nz/#!OYoQ2IID!0CzRLl...R70VyqpKROztK4

PS: enable LAV filter RGB48.
Thanks for the file. What is the proper way to use it? This is what I did (please tell me if I did something wrong):
- Loaded the file up in mpc-hc x64
- Enabled Dx11 and FSE in madvr.
- Switched madvr to 10-bit
- Disabled refresh rate switcher.
- Switched mpc-hc to full screen.
- Disabled dithering.

I still see banding. The display itself should be 10-bit (or 8-bit+frc) afaik, so I was expecting not to see any banding.
Old 12th May 2015, 04:16   #15  |  Link
baii
Registered User
 
Join Date: Dec 2011
Posts: 180
Quote:
Originally Posted by bcec View Post
Thanks for the file. What is the proper way to use it? This is what I did (please tell me if I did something wrong):
- Loaded the file up in mpc-hc x64
- Enabled Dx11 and FSE in madvr.
- Switched madvr to 10-bit
- Disabled refresh rate switcher.
- Switched mpc-hc to full screen.
- Disabled dithering.

I still see banding. The display itself should be 10-bit (or 8-bit+frc) afaik, so I was expecting not to see any banding.
Check the Ctrl+J OSD and make sure it says RGB48LE, 16bit. If it doesn't, you need to enable RGB48 in LAV Video Decoder.
Old 12th May 2015, 04:22   #16  |  Link
bcec
Registered User
 
Join Date: Nov 2014
Posts: 81
Quote:
Originally Posted by baii View Post
Check the Ctrl+J OSD and make sure it says RGB48LE, 16bit. If it doesn't, you need to enable RGB48 in LAV Video Decoder.
Thanks, that seems to be correct, it says RGB48LE, 16bit.

I noticed though that madVR says "full screen exclusive mode, new path" instead of "full screen exclusive mode, 10bit". Any ideas why?
Old 12th May 2015, 04:44   #17  |  Link
James Freeman
Registered User
 
Join Date: Sep 2013
Posts: 919
All folks with AMD cards should redo the test with the small tweak noted in the first post.

MS-DOS pointed out that the AMD driver dithers the output when it outputs more than 8bit, so you may actually be getting dithered 8bit, which looks like 10bit on ANY panel.
nevcairiel pointed out that he gets a smooth ramp even with a panel known not to be 10bit, and MS-DOS pointed out that his 6-bit DELL U2212HM is also smooth...
So all you AMD folks need to do a little registry work to test actual 10bit output and not 8bit dithered by the AMD driver.
__________________
System: i7 3770K, GTX660, Win7 64bit, Panasonic ST60, Dell U2410.

Last edited by James Freeman; 12th May 2015 at 04:46.
Old 12th May 2015, 04:49   #18  |  Link
bcec
Registered User
 
Join Date: Nov 2014
Posts: 81
Quote:
Originally Posted by bcec View Post
Thanks, that seems to be correct, it says RGB48LE, 16bit.

I noticed though that madVR says "full screen exclusive mode, new path" instead of "full screen exclusive mode, 10bit". Any ideas why?
Aero was off. With it on, it worked, thanks!
Old 12th May 2015, 05:13   #19  |  Link
James Freeman
Registered User
 
Join Date: Sep 2013
Posts: 919
Quote:
Originally Posted by baii View Post
I use a 16bit greyscale ramp PNG (converted from 10 bit test ramp.psd, which should be the same as the one by AMD) that was made to test OpenGL 30bit support in Photoshop.

It makes it much easier to spot whether the panel does 10 bit.

https://mega.co.nz/#!OYoQ2IID!0CzRLl...R70VyqpKROztK4

PS: enable LAV filter RGB48.
Great, thanks.
Much easier to see, and it works with x64 indeed.

I also found a 16bit PNG that is much smaller, so I should edit my first post to use it.
__________________
System: i7 3770K, GTX660, Win7 64bit, Panasonic ST60, Dell U2410.

Last edited by James Freeman; 12th May 2015 at 05:25.
Old 12th May 2015, 06:54   #20  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
Just to add my experiences.

I use a cheap Philips TV; it supports 12 bit input and does no chroma subsampling at all. The panel is clearly an 8 bit IPS panel.

Test picture set to video frame double size to see more banding.

FSE D3D11 10 bit:
AMD set to 8 bit output, no dither = clear heavy banding
AMD set to 10 bit output, no dither = little banding
AMD set to 12 bit output, no dither = little banding

8 bit windowed mode:
no dither = clear heavy banding
madVR dither = little banding

These dither "hacks" are set in the registry. Without these hacks, AMD set to 8 bit was still pretty good, so the GPU can dither on its own with at least OK quality. But I will double check that later.

Looks like my screen can use proper dithering.