Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion. Before you start posting please read the forum rules. By posting to this forum you agree to abide by the rules. |
11th May 2015, 19:42 | #1 | Link
Registered User
Join Date: Sep 2013
Posts: 919
|
madVR 10-bit Display Support Test
This test makes it easier to see whether your display actually supports 10-bit input end to end. We already know that D3D11 + FSE in madVR will output a 10-bit signal in Windows 7 and up; now we just need a super smooth black-to-white test pattern so we can clearly see the result with our own eyes.

First, download the 16-bit PNG test pattern from here: http://www.bealecorner.org/red/test-...ient-16bit.png (I also attached this PNG, zipped, at the bottom in case the URL disappears). Open a new MPC-HC (or other player) window and drag the test pattern onto the MPC-HC window. Make sure RGB48 is enabled in LAV Video Decoder. Adjust madVR settings as follows:

display -> properties -> 10bit
display -> calibration -> disable & disable GPU gamma ramp
rendering -> general -> Direct3D 11: ON
rendering -> dithering: none
rendering -> general -> automatic exclusive fullscreen mode (FSE): off (for now)

Aero in Windows should be on. With dithering off, the resulting image is a gradient at exactly the bit depth you set under properties. You can switch between bit depths and dithering on/off to see what I mean, then go back to 10bit with dithering off to continue the test.

Now go fullscreen without FSE (windowed fullscreen) and look for the 8-bit banding, which should be visible because there are only 256 steps from black to white: in this mode the GPU still sends the image to the display in 8-bit, even though we selected 10bit in madVR. You may have to move closer to the display to actually see the bands. True 10-bit output from your GPU needs FSE mode to work.

Now switch FSE on and go fullscreen again. The GPU now actually sends a 10-bit image to the display, and if your display supports 10-bit input you should see 1024 steps from black to white. Each step is 4 times narrower than in 8-bit (on a 1920-pixel-wide gradient, 1.875 px per step instead of 7.5 px) and is practically indiscernible, at least to me. In other words, you should NOT see any banding.
If you still see the 8-bit banding even though the GPU sends 10-bit to your display in FSE + D3D11 mode (check with Ctrl+J), your display does not support 10-bit. Important note for AMD users: AMD cards dither the output by default, which can mask the result; see MS-DOS's post below on disabling it in the registry before testing.
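If the linked pattern ever disappears, a ramp like it can be generated with nothing but the Python standard library. Below is a sketch of a minimal 16-bit greyscale PNG writer; the file name and dimensions are just examples, not anything madVR requires. It writes one horizontal black-to-white ramp at full 16-bit precision, which the pipeline then quantizes to 10-bit or 8-bit as described above.

```python
import struct
import zlib

def chunk(tag, data):
    # length + tag + data + CRC over tag+data, as the PNG spec requires
    return (struct.pack(">I", len(data)) + tag + data
            + struct.pack(">I", zlib.crc32(tag + data) & 0xFFFFFFFF))

def gradient_png(width=1024, height=256):
    # IHDR: 16 bits per sample, colour type 0 (greyscale), no interlace
    ihdr = struct.pack(">IIBBBBB", width, height, 16, 0, 0, 0, 0)
    # one horizontal black-to-white ramp, big-endian 16-bit samples
    row = b"".join(struct.pack(">H", x * 65535 // (width - 1))
                   for x in range(width))
    raw = b"".join(b"\x00" + row for _ in range(height))  # filter type 0 per scanline
    return (b"\x89PNG\r\n\x1a\n" + chunk(b"IHDR", ihdr)
            + chunk(b"IDAT", zlib.compress(raw)) + chunk(b"IEND", b""))

with open("gradient-16bit.png", "wb") as f:
    f.write(gradient_png())
```

Stretch the image to fill the screen in the player so each step is wide enough to judge by eye.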
__________________
System: i7 3770K, GTX660, Win7 64bit, Panasonic ST60, Dell U2410. Last edited by James Freeman; 21st June 2015 at 06:32. |
11th May 2015, 21:32 | #3 | Link |
Registered Developer
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,344
|
Unfortunately, the test isn't necessarily as reliable as one would hope. At least it may not tell you if your display can actually show 10-bit.
I tried this on both a screen known to have 10-bit support (also a U2410) and a screen I know does not have 10-bit support, both connected over DP. The result was that both screens showed a smooth gradient. I can only assume that the U2410 actually managed to show 10-bit, but what the other screen did, I'm not sure. Did it employ dithering? Does my GPU know that the screen doesn't take 10-bit, and does it dither? I don't know, and I don't know how to find out. Obviously, if you do see banding after the test, you know that you should definitely stay away from 10-bit.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders Last edited by nevcairiel; 11th May 2015 at 21:36. |
11th May 2015, 21:44 | #4 | Link |
Registered User
Join Date: Dec 2011
Posts: 12
|
The test seems to have worked for me. I noticed my TV flickered when it switched to 10-bit mode, similar to how it flickers when switching between 50Hz and 60Hz. Could this be a telltale sign that the TV does in fact support 10-bit?
|
11th May 2015, 21:53 | #5 | Link |
Registered User
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
|
No, it might flicker simply due to the GPU changing modes.
__________________
madVR options explained |
11th May 2015, 22:14 | #7 | Link | |
Registered User
Join Date: Sep 2012
Posts: 77
|
To everyone:
AMD cards dither the output by default, but you can disable it: http://www.monitortests.com/forum/Th...d=3314#pid3314
Only that can explain why I see a difference between 8- and 10-bit output on my 6-bit DELL U2212HM connected over DVI. Last edited by MS-DOS; 11th May 2015 at 22:27. |
|
11th May 2015, 22:50 | #8 | Link | |
Registered User
Join Date: Aug 2011
Posts: 38
|
|
|
12th May 2015, 00:10 | #9 | Link |
Suptitle, MediaPlayer.NET
Join Date: Nov 2001
Posts: 1,721
|
From memory, when I tested this on over 10 monitors/TVs about half a year ago with MPDN, the link you use to your display makes a difference too (i.e. HDMI vs DVI vs analog). Some will even display a corrupted image. Some work better when DX10 is used instead of DX11.
|
12th May 2015, 00:13 | #10 | Link | |
Registered User
Join Date: Sep 2012
Posts: 77
|
We disable dithering here just for test purposes. It should always be enabled when converting to a lower bit depth, otherwise you may see banding artifacts. |
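A rough sketch of why a dithering driver can hide banding: truncating a smooth 16-bit ramp to 8-bit produces long flat runs (visible bands), while adding about half a step of random noise before truncation scatters the transitions so the average brightness still changes smoothly. This is simple random dithering, not whatever the AMD driver actually uses.

```python
import random

random.seed(0)  # deterministic for the demo

def quantize(v16, bits, dither=False):
    # Truncate a 16-bit code value to `bits` bits, optionally adding
    # roughly half an output step of random noise first (crude dither).
    shift = 16 - bits
    if dither:
        half = 1 << (shift - 1)
        v16 = min(65535, max(0, v16 + random.randint(-half, half)))
    return v16 >> shift

ramp = [x * 64 for x in range(1024)]                 # smooth 16-bit ramp
plain = [quantize(v, 8) for v in ramp]
dith = [quantize(v, 8, dither=True) for v in ramp]

# count neighbour-to-neighbour transitions: few, regular ones = bands
bands = sum(a != b for a, b in zip(plain, plain[1:]))
noisy = sum(a != b for a, b in zip(dith, dith[1:]))
print(bands, noisy)  # 255 clean steps vs. many scattered transitions
```

The same idea is why the test in the first post insists on dithering "none" in madVR: any dithering stage between the ramp and the panel makes an 8-bit path look smooth.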
|
12th May 2015, 00:20 | #11 | Link |
Registered User
Join Date: Dec 2011
Posts: 180
|
I use a 16-bit greyscale ramp PNG (converted from 10 bit test ramp.psd, which should be the same as the one by AMD) that was made to test OpenGL 30-bit support in Photoshop.
It makes it much easier to spot whether the panel does 10-bit. https://mega.co.nz/#!OYoQ2IID!0CzRLl...R70VyqpKROztK4 PS: enable RGB48 in LAV Video Decoder. |
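Before trusting a downloaded ramp, it is worth confirming the file really is 16-bit: a PNG stores its bit depth and colour type in the IHDR chunk right after the signature. A small stdlib-only checker (the file name in the comment is just an example):

```python
import struct

def png_info(path):
    # Read the PNG signature plus the IHDR chunk header and payload start.
    with open(path, "rb") as f:
        head = f.read(26)
    if head[:8] != b"\x89PNG\r\n\x1a\n" or head[12:16] != b"IHDR":
        raise ValueError("not a PNG file")
    width, height = struct.unpack(">II", head[16:24])
    bit_depth, colour_type = head[24], head[25]
    return width, height, bit_depth, colour_type

# e.g. png_info("ramp.png") should report bit depth 16
# (colour type 0 = greyscale, 2 = RGB)
```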
12th May 2015, 02:32 | #13 | Link |
Registered User
Join Date: May 2013
Posts: 115
|
Ah, never mind, your idea in the madVR thread about the sharpening filter was actually good. On this file it makes it much easier to see the lines ('artifact removal').
This monitor is hilarious. I can see no artifacts in 10-bit exclusive. I guess the assumption is that something dithers it, but is that known for sure? |
12th May 2015, 04:07 | #14 | Link | |
Registered User
Join Date: Nov 2014
Posts: 81
|
- Loaded the file up in MPC-HC x64.
- Enabled D3D11 and FSE in madVR.
- Switched madVR to 10-bit.
- Disabled the refresh rate switcher.
- Switched MPC-HC to full screen.
- Disabled dithering.

I still see banding. The display itself should be 10-bit (or 8-bit + FRC) afaik, so I was expecting not to see any banding. |
|
12th May 2015, 04:16 | #15 | Link | |
Registered User
Join Date: Dec 2011
Posts: 180
|
|
|
12th May 2015, 04:22 | #16 | Link | |
Registered User
Join Date: Nov 2014
Posts: 81
|
I noticed though that madVR says "full screen exclusive mode, new path" instead of "full screen exclusive mode, 10bit". Any ideas why? |
|
12th May 2015, 04:44 | #17 | Link |
Registered User
Join Date: Sep 2013
Posts: 919
|
All folks with AMD cards should redo the test with the small tweak noted in the first post.
MS-DOS pointed out that the AMD driver dithers the output when it outputs more than 8-bit, so you may actually be getting dithered 8-bit, which looks like 10-bit on ANY panel. nevcairiel noted that he gets a smooth ramp even on a panel known not to be 10-bit, and MS-DOS reports that his 6-bit DELL U2212HM is also smooth... So all you AMD folks need to do a little registry work to test actual 10-bit output rather than 8-bit dithered by the AMD driver.
__________________
System: i7 3770K, GTX660, Win7 64bit, Panasonic ST60, Dell U2410. Last edited by James Freeman; 12th May 2015 at 04:46. |
12th May 2015, 05:13 | #19 | Link | |
Registered User
Join Date: Sep 2013
Posts: 919
|
Much easier to see, and it works with x64 indeed. I also found a 16-bit PNG that is much smaller, so I will edit my first post with it.
__________________
System: i7 3770K, GTX660, Win7 64bit, Panasonic ST60, Dell U2410. Last edited by James Freeman; 12th May 2015 at 05:25. |
|
12th May 2015, 06:54 | #20 | Link |
Registered User
Join Date: Oct 2012
Posts: 7,903
|
Just to add my experiences.
I use a cheap Philips TV; it supports 12-bit input and does not chroma subsample at all, but the panel is clearly an 8-bit IPS panel. Test picture set to video frame, double size, to see more banding.

FSE D3D11 10-bit:
AMD set to 8-bit output, no dither = clear, strong banding
AMD set to 10-bit output, no dither = little banding
AMD set to 12-bit output, no dither = little banding
8-bit windowed mode, no dither = clear, strong banding
madVR dither = little banding

These dither "hacks" are set in the registry. Without these hacks, AMD set to 8-bit was still pretty good, so the GPU can dither on its own with at least OK quality, but I will double-check that later. Looks like my screen can use proper dithering. |