Welcome to Doom9's Forum, THE place to be for everyone interested in DVD conversion.

Before you start posting please read the forum rules. By posting to this forum you agree to abide by the rules.

 

Old 21st March 2018, 19:31   #49681  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Quote:
Originally Posted by brazen1 View Post
Thanks Warner for the further details...
I'm sorry guys. I just can't get my head wrapped around all of this. Here's what I'm struggling to understand:

Installed the new driver. RGB 4:4:4, set to my native 2160p 8bit 60Hz. Then I switched to 2160p 12bit 23Hz and 24Hz, then set back to 2160p 8bit 60Hz. Next I played a 2160p 23Hz HDR 10bit title, no FSE. Looked in NCP during playback and it is showing 8bit at 23Hz, as if it ignored my previous command to play 23Hz at 12bit. My display does not show detailed info, so I checked the info from my Denon AVR. It shows RGB 4:4:4 8bit. To me, this doesn't look correct, which is why I ask you guys. So, during playback I select 12bit in the NCP. I go back to the info from the AVR and it shows RGB 4:4:4 12bit now. I know the title is 10bit, so the AVR info means nothing, I guess? True? Neither does the bit depth setting in NCP? True? And madVR does not report anything beyond what the GPU is sending it? True? So how do I know if my display is outputting 8bit or taking advantage of the higher 10bit depth of an HDR title? Sorry I am so naïve!

To make understanding more difficult, after reboot that 12bit setting no longer appears in NCP or my AVR even though I manually changed during playback before I rebooted. It's back to 8bit as if I never set it.
I'm not technical enough to answer all of your questions, but I can start. The first scenario where your AVR is reporting 8-bit sounds like a driver error if you selected 12-bit in the NCP. This would be confirmed by the fact you were able to correct this during playback by changing the bit depth in the NCP. Did this change stick?

Second, you are not taking advantage of the 10-bits of the source. It could be output at 8-bits with dithering without most users noticing much of a difference. The color space is not clipped; bit depth is all about smoothing gradients, and high-quality dithering makes various bit depths look smooth. But, of course, you want 10-bit output if your display can support it. Just remember, madVR is processing everything at very high bit depths (16-bits), higher than the highest output bit depth (10-bits). Errors will not occur when dithering down to any bit depth below madVR's processing depth.

As far as the GPU output is concerned, I don't know what Nvidia sends to the display. I thought it passed through 10-bit, but it might actually be upconverted to 12-bits. That is beyond my technical acumen.
Warner306 is offline   Reply With Quote
Old 21st March 2018, 19:36   #49682  |  Link
NoTechi
Registered User
 
Join Date: Mar 2018
Location: Germany
Posts: 53
Quote:
Originally Posted by Warner306 View Post
I think passthrough is the higher-quality method. Your display knows itself best, so it should be calibrated to maximize HDR content.
My projector is calibrated and HDR looks great ... but you know there is always room for improvement, and playing with new techi gadgets is fun as well
There are some discussions going on at the moment on projector boards about which method is best for the HDR effect, and using madVR is one of them.

NoTechi
NoTechi is offline   Reply With Quote
Old 21st March 2018, 19:37   #49683  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
And the madVR OSD shows what it receives from the source and what conversions are done by madVR. It is placed between the source and the GPU, so it has no idea what the GPU is doing to the image after it has handed it off.
Warner306 is offline   Reply With Quote
Old 21st March 2018, 19:38   #49684  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Quote:
Originally Posted by NoTechi View Post
My projector is calibrated and HDR looks great ... but you know there is always room for improvement, and playing with new techi gadgets is fun as well
There are some discussions going on at the moment on projector boards about which method is best for the HDR effect, and using madVR is one of them.

NoTechi
As I said in my edit to the first post, every display is designed to map using its own methods taking into account the limitations of its output. It is not a universal algorithm.
Warner306 is offline   Reply With Quote
Old 21st March 2018, 19:49   #49685  |  Link
brazen1
Registered User
 
Join Date: Oct 2017
Posts: 228
Yes, the change stuck until I rebooted. Yes, my display supports 10bit. Using my old driver, these settings would apply when I opened a title the first time, every time, and they would stick after a reboot. As I understand it now, none of this sticking matters? Sending 8bit from the GPU, evidently so that madVR does the dithering instead of the GPU, is correct if I understood the replies correctly. What I don't understand is how madVR can dither the 8bit it is receiving up to 10bit when playing 2160p 23Hz HDR 10bit? My AVR can't either, nor can NCP? What am I not understanding here?
__________________
HOW TO-Kodi 2D-3D-UHD (4k) HDR Guide Internal & External Players
W10 1909 9604GB RGB 4:4:4 @Matched Refresh Rates 8,10,12bit
KODI MPC-HC/BE PDVD19 DVDFab 3 & 5 PotPlayer
65JS8500 UHD HDR 3D

Last edited by brazen1; 21st March 2018 at 20:16.
brazen1 is offline   Reply With Quote
Old 21st March 2018, 20:04   #49686  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,984
https://forum.doom9.org/showthread.p...18#post1271418
huhn is offline   Reply With Quote
Old 21st March 2018, 20:09   #49687  |  Link
brazen1
Registered User
 
Join Date: Oct 2017
Posts: 228
Are we in agreement that newer drivers do not retain a 12bit setting after a reboot and revert to 8bit? If not, has anyone established why it sticks for some and not for others? If so, are you all playing 10bit sources at 8bit even though your hardware is all 10bit compatible?

huhn, is there something specific I should concentrate on there? If you're simply pointing me to page one, well.......
__________________
HOW TO-Kodi 2D-3D-UHD (4k) HDR Guide Internal & External Players
W10 1909 9604GB RGB 4:4:4 @Matched Refresh Rates 8,10,12bit
KODI MPC-HC/BE PDVD19 DVDFab 3 & 5 PotPlayer
65JS8500 UHD HDR 3D

Last edited by brazen1; 21st March 2018 at 20:15.
brazen1 is offline   Reply With Quote
Old 21st March 2018, 20:13   #49688  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Quote:
Originally Posted by brazen1 View Post
Yes, the change stuck until I rebooted. Yes, my display supports 10bit. Using my old driver, these settings would apply when I opened a title the first time, every time, and they would stick after a reboot. As I understand it now, none of this sticking matters? Sending 8bit from the GPU, evidently so that madVR does the dithering instead of the GPU, is correct if I understood the replies correctly. What I don't understand is how madVR can dither the 8bit it is receiving up to 10bit when playing 2160p HDR 10bit? My AVR can't either, nor can NCP? What am I not understanding here?
I don't know if this is exact or not but...

The source starts as 10-bit. This is great because, whether or not the studio used dithering, the higher bit depth helps ensure there is no banding in the SOURCE.

madVR takes this information and blows it up to 16-bits. This is all math designed to avoid rounding errors and other mistakes that can lead to inaccurate color values. Then, the result is dithered down at the highest possible quality. So the end result is a 10-bit source upconverted for processing and then downconverted for display.

madVR is designed, in almost every way, to avoid inaccurate color conversions, no matter what it is doing, so it should never introduce banding at an output bit depth of 8-bits or higher. What you see still depends on the quality of the source and whether it had banding to begin with.

Like I said a couple of times now, the color space has fixed top and bottom values. You can manipulate the bit depth all you want without screwing up the colors you started with. You just get more shades of each color when the bit depth is increased; everything in between becomes smoother, not more colorful. This is mitigated in madVR by the use of dithering.

Check out these two images, which show the impact of dithering with a bit depth as low as 2-bits.

Dithering to 2-bits:
2 bit Ordered Dithering
2 bit No Dithering

Pretty impressive?
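For anyone who wants to reproduce what those two images show, here is a rough numpy sketch of ordered (Bayer) dithering down to 2 bits. This is purely my own toy illustration, not madVR's actual algorithm (madVR uses much higher-quality dithering than a plain Bayer matrix):

```python
import numpy as np

# 4x4 Bayer matrix, normalized to thresholds in [0, 1)
BAYER4 = np.array([[ 0,  8,  2, 10],
                   [12,  4, 14,  6],
                   [ 3, 11,  1,  9],
                   [15,  7, 13,  5]]) / 16.0

def quantize(img, bits, dither=False):
    """Quantize a float image in [0, 1] to 2**bits levels per channel."""
    levels = (1 << bits) - 1                # 3 steps -> 4 levels at 2 bits
    x = img * levels
    if dither:
        h, w = img.shape
        t = np.tile(BAYER4, (h // 4 + 1, w // 4 + 1))[:h, :w]
        return np.floor(x + t) / levels     # threshold against the matrix
    return np.round(x) / levels             # plain rounding -> banding

# a smooth horizontal gradient, like a slow pan across a sky
grad = np.tile(np.linspace(0.0, 1.0, 256), (16, 1))

banded = quantize(grad, 2)
dithered = quantize(grad, 2, dither=True)

print(len(np.unique(banded)))    # only 4 distinct output levels
print(len(np.unique(dithered)))  # still no more than 4 levels...
# ...but the local average of the dithered image tracks the
# original gradient much more closely than the banded one:
print(np.abs(banded.mean(axis=0) - grad[0]).max())
print(np.abs(dithered.mean(axis=0) - grad[0]).max())
```

Even though both outputs use only four levels per channel, the dithered one locally averages back to the original gradient, which is why the visible banding disappears.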

Last edited by Warner306; 21st March 2018 at 23:39.
Warner306 is offline   Reply With Quote
Old 21st March 2018, 20:16   #49689  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
And, 10-bit RGB > 8-bit RGB > 10-bit YCbCr 4:2:2 > 10-bit YCbCr 4:2:0.

You only want to send 8-bit RGB when HDMI bandwidth is a problem (at 60 Hz), or when the display does not support 10-bits or has trouble displaying 10-bits without banding.
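The 60 Hz caveat comes from HDMI 2.0's bandwidth ceiling. A back-of-envelope sketch (the 594/297 MHz pixel clocks are the standard CTA-861 4K timings with blanking included; treat the rest as an approximation, not a statement from the spec):

```python
# HDMI 2.0 moves 18 Gbit/s over three TMDS lanes, but 8b/10b line
# coding leaves only 80% of that for actual pixel data.
DATA_RATE = 18e9 * 8 / 10             # ~14.4 Gbit/s usable

# CTA-861 4K timings include blanking, giving these pixel clocks:
PIXEL_CLOCK = {60: 594e6, 24: 297e6}  # Hz

def fits(fps, bits_per_channel, channels=3):
    """Does 4K RGB at this refresh rate and bit depth fit in HDMI 2.0?"""
    required = PIXEL_CLOCK[fps] * bits_per_channel * channels
    return required <= DATA_RATE

for fps in (60, 24):
    for bits in (8, 10, 12):
        verdict = "fits" if fits(fps, bits) else "exceeds HDMI 2.0"
        print(f"4K{fps} RGB {bits}-bit: {verdict}")
```

This is why 4K60 RGB is stuck at 8-bit on HDMI 2.0, while 23/24 Hz titles have headroom for 10 or 12 bits.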

Last edited by Warner306; 21st March 2018 at 20:18.
Warner306 is offline   Reply With Quote
Old 21st March 2018, 20:18   #49690  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,984
Quote:
Actually YCbCr -> RGB conversion gives us floating point data! And not even HDMI 1.4 can transport that. So we have to convert the data down to some integer bitdepth, e.g. 16bit or 10bit or 8bit.
Is this part not clear enough?

And literally nearly every TV supports 12 bit input; that doesn't mean the panel is even 8 bit.

It's very simple: can you easily see a difference between 8 bit and 10 bit madVR output?

Yes? Bother with it. No? Don't bother with it.
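The madshi quote above can be made concrete. Here is a rough sketch of a limited-range BT.709 YCbCr-to-RGB conversion: the matrix math lands on floating-point values, which then have to be quantized to some integer bit depth. The coefficients are the standard BT.709 ones; the optional random dither here is only illustrative, not what madVR actually uses:

```python
import numpy as np

def ycbcr_to_rgb_bt709(y, cb, cr):
    """Limited-range 8-bit BT.709 YCbCr -> RGB, in floating point."""
    y_ = (y - 16) / 219.0                # luma to [0, 1]
    pb = (cb - 128) / 224.0              # chroma to [-0.5, 0.5]
    pr = (cr - 128) / 224.0
    r = y_ + 1.5748 * pr                 # standard BT.709 coefficients
    g = y_ - 0.1873 * pb - 0.4681 * pr
    b = y_ + 1.8556 * pb
    return np.clip([r, g, b], 0.0, 1.0)

rgb = ycbcr_to_rgb_bt709(100, 90, 150)
print(rgb)                               # floating-point values

def to_int(rgb, bits, rng=None):
    """Quantize float RGB to integers, optionally with random dither."""
    levels = (1 << bits) - 1
    x = np.asarray(rgb) * levels
    if rng is not None:
        x = x + rng.uniform(-0.5, 0.5, x.shape)
    return np.clip(np.round(x), 0, levels).astype(int)

print(to_int(rgb, 8))                    # quantized to 8-bit codes
print(to_int(rgb, 10))                   # quantized to 10-bit codes
```

The point of the quote: the float result in between can't be transported over HDMI, so something has to pick an integer bit depth and (ideally) dither the rounding error away.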
huhn is offline   Reply With Quote
Old 21st March 2018, 20:26   #49691  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Quote:
Originally Posted by brazen1 View Post
Are we in agreement that newer drivers do not retain a 12bit setting after a reboot and revert to 8bit? If not, has anyone established why it sticks for some and not for others? If so, are you all playing 10bit sources at 8bit even though your hardware is all 10bit compatible?

huhn, is there something specific I should concentrate on there? If you're simply pointing me to page one, well.......
The driver appears to work for some people and not for others. I don't know how to get in touch with the people in the know to fix it.
Warner306 is offline   Reply With Quote
Old 21st March 2018, 20:30   #49692  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Quote:
Originally Posted by huhn View Post
Is this part not clear enough?

And literally nearly every TV supports 12 bit input; that doesn't mean the panel is even 8 bit.

It's very simple: can you easily see a difference between 8 bit and 10 bit madVR output?

Yes? Bother with it. No? Don't bother with it.
Send 10-bits unless you know your display can't support this, or if you notice banding is introduced by the display (or GPU), or if you want to simplify your set-up until HDMI 2.1 increases the available bandwidth. You won't cripple yourself by going to 8-bits, but this is not the highest possible quality.

My fingers are tired of typing about this topic, so I hope we're clear?

Someone else can confirm what bit depth the GPU is sending to the display; I don't know for sure.
Warner306 is offline   Reply With Quote
Old 21st March 2018, 20:40   #49693  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,984
So why should I bother with sending 10 bit if I know for sure my TV is 8 bit FRC (like nearly every TV out there; except what LG claims for its OLEDs, and there is no screen with more banding problems...)? Why should I send 10 bit if I know for sure it doesn't matter for image quality? So I send 10 bit because the number is higher? Is 10 bit even better if it gets dithered again?

Seriously, why can't people simply use their eyes to judge it?

If you want to know what a GPU sends, it's easy with an AMD card: it will always send what you select, and by default that is 10 bit if the device supports it. With Nvidia it's not that easy. In the past the bit depth option was ignored (generally not a totally bad idea if you ask me) and the output was based on what was used for presentation.
huhn is offline   Reply With Quote
Old 21st March 2018, 20:45   #49694  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Quote:
Originally Posted by huhn View Post
So why should I bother with sending 10 bit if I know for sure my TV is 8 bit FRC (like nearly every TV out there; except what LG claims for its OLEDs, and there is no screen with more banding problems...)? Why should I send 10 bit if I know for sure it doesn't matter for image quality? So I send 10 bit because the number is higher? Is 10 bit even better if it gets dithered again?

Seriously, why can't people simply use their eyes to judge it?

If you want to know what a GPU sends, it's easy with an AMD card: it will always send what you select, and by default that is 10 bit if the device supports it. With Nvidia it's not that easy. In the past the bit depth option was ignored (generally not a totally bad idea if you ask me) and the output was based on what was used for presentation.
I outlined when you should not use 10-bits. It covers all of that.

And, 10-bit RGB > 8-bit RGB > 10-bit YCbCr 4:2:2 > 10-bit YCbCr 4:2:0. That is a direct quote from madshi. He didn't clarify the quality difference between each setting.

Last edited by Warner306; 21st March 2018 at 20:47.
Warner306 is offline   Reply With Quote
Old 21st March 2018, 20:45   #49695  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 735
Quote:
Originally Posted by Warner306 View Post
Send 10-bits unless you know your display can't support this, or if you notice banding is introduced by the display (or GPU), or if you want to simplify your set-up until HDMI 2.1 increases the available bandwidth. You won't cripple yourself by going to 8-bits, but this is not the highest possible quality.

My fingers are tired of typing about this topic, so I hope we're clear?

Someone else can confirm what bit depth the GPU is sending to the display; I don't know for sure.
I have already confirmed that. When set to 12bits, the GPU sends 12bits, at least here. Whether these 12bits are simply padded with trailing zeros from the 10-bit dithered output of madVR (most likely) or are “true” 12bits, I have no idea.
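For what it's worth, here is what the padding Manni describes would look like numerically. This is only an illustration of the arithmetic, not a claim about what the Nvidia driver actually does:

```python
def pad_zeros(v10):
    """Naive padding: append two zero bits (1023 -> 4092, slightly dark)."""
    return v10 << 2

def bit_replicate(v10):
    """Common depth expansion: repeat the top bits into the new LSBs,
    mapping 0 -> 0 and 1023 -> 4095 exactly."""
    return (v10 << 2) | (v10 >> 8)

for v in (0, 512, 1023):
    print(v, "->", pad_zeros(v), "or", bit_replicate(v))
```

Either way the 12-bit signal carries no more real information than the 10-bit codes it was built from, which is why the AVR reading "12bit" says nothing about the source.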
__________________
Win10 Pro x64 b1903 MCE
i7 3770K@4.0Ghz 16Gb@2.18Ghz EVGA GTX 1080 Ti SC2 11Gb@2GHz 436.48 RGB Full 8bits
MPC-BE/LAV/MadVR/jRiver/MyMovies V5.25
Denon X8500H>HD Fury Maestro>JVC RS2000
Manni is offline   Reply With Quote
Old 21st March 2018, 20:47   #49696  |  Link
brazen1
Registered User
 
Join Date: Oct 2017
Posts: 228
Sorry for bugging you guys. I'm just going to use the new driver and let it do its thing at 8bit. Fwiw, my 'go to' movie for checking this discussion is Allied 2016. From 2:15 through 3:00 there is a slow pan across a desert with a cloudy sky. That sky shows banding using the 12bit settings in NCP. Using the 8bit settings (which it's going to revert to after a reboot anyway), there is no banding. I don't know what higher quality I will miss by using 8bit, but I won't miss that banding. Thanks for all the input. If I make further progress somewhere down the line, I'll share.
__________________
HOW TO-Kodi 2D-3D-UHD (4k) HDR Guide Internal & External Players
W10 1909 9604GB RGB 4:4:4 @Matched Refresh Rates 8,10,12bit
KODI MPC-HC/BE PDVD19 DVDFab 3 & 5 PotPlayer
65JS8500 UHD HDR 3D
brazen1 is offline   Reply With Quote
Old 21st March 2018, 20:50   #49697  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Quote:
Originally Posted by brazen1 View Post
Sorry for bugging you guys. I'm just going to use the new driver and let it do its thing at 8bit. Fwiw, my 'go to' movie for checking this discussion is Allied 2016. From 2:15 through 3:00 there is a slow pan across a desert with a cloudy sky. That sky shows banding using the 12bit settings in NCP. Using the 8bit settings (which it's going to revert to after a reboot anyway), there is no banding. I don't know what higher quality I will miss by using 8bit, but I won't miss that banding. Thanks for all the input. If I make further progress somewhere down the line, I'll share.
One other thing...I read you were having trouble with MPC-BE in the Kodi forums. This only occurred when launching the player from Kodi.

Have you tried programming Stop and Exit to the same key on your remote instead of using the automatically close after playback setting? That is what I used to do and it never failed with either player.

I try and stay out of your set up guide. And I don't know anything about ISO's or BDMV's or batch files. Sorry to the other users, but Brazen posts a lot of set up information for new users to madVR that want to use Kodi.

Edit: Probably shouldn't have posted that here, but I did.

Last edited by Warner306; 21st March 2018 at 20:58.
Warner306 is offline   Reply With Quote
Old 21st March 2018, 20:59   #49698  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,984
Bit depth is about banding and noise, and that's it.
The higher the bit depth, the lower the noise floor.
Even 6 bit can create a banding-free image, but most people will see the noise added to hide the banding.

And that's why blindly using 10 bit is not a good idea; there is a reason it is not the default. And no, sending 8 bit with 10-bit madVR is not that bad... though it's clearly not optimal, to say it friendly.
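huhn's point about the noise floor can be put in numbers. A quick sketch (plain rounding with a random dither, purely illustrative): each extra bit halves the quantization step, so the residual dither noise drops by roughly half per bit:

```python
import numpy as np

rng = np.random.default_rng(0)
signal = rng.uniform(0.0, 1.0, 100_000)   # random scene levels in [0, 1]

rms = {}
for bits in (6, 8, 10):
    step = 1.0 / ((1 << bits) - 1)        # one quantization step
    # rounding with a +-0.5 LSB random dither: the error is decorrelated
    # from the signal (so no banding), but it behaves like noise whose
    # amplitude is set by the step size
    noisy = np.round(signal / step + rng.uniform(-0.5, 0.5, signal.size)) * step
    rms[bits] = float(np.sqrt(np.mean((noisy - signal) ** 2)))
    print(f"{bits} bit: step {step:.5f}, residual noise RMS {rms[bits]:.5f}")
```

At 6 bits the image can still be banding-free, but the noise floor is about sixteen times higher than at 10 bits, which is exactly the noise most people notice.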
huhn is offline   Reply With Quote
Old 21st March 2018, 21:02   #49699  |  Link
brazen1
Registered User
 
Join Date: Oct 2017
Posts: 228
And I shouldn't answer you, and this will be the end of it. Yes, my remote is programmed exactly like that, for MPC-HC too, yet each behaves differently. To get technical beyond my understanding, it depends on whether stereoscopic is engaged or not. Don't ask me why, but that is exactly what it boils down to. I also map Alt+F4 to force them to close if the auto function doesn't. I prefer the auto close because the less interaction the better. You are welcome in my guide anytime. I appreciate you, you know that. Half of what I know is because of your guides and because of the diverse crowd here. Good folks, all of you. Much nicer guide than mine, for sure
__________________
HOW TO-Kodi 2D-3D-UHD (4k) HDR Guide Internal & External Players
W10 1909 9604GB RGB 4:4:4 @Matched Refresh Rates 8,10,12bit
KODI MPC-HC/BE PDVD19 DVDFab 3 & 5 PotPlayer
65JS8500 UHD HDR 3D

Last edited by brazen1; 21st March 2018 at 21:16.
brazen1 is offline   Reply With Quote
Old 21st March 2018, 21:09   #49700  |  Link
brazen1
Registered User
 
Join Date: Oct 2017
Posts: 228
Yes huhn. I've finally grasped what you've been trying to convey all this time. 8 bit vs 10 bit is just a couple of numbers; because one is higher does not mean it is better, especially when you consider the hardware being used. In the end, what our eyes see should be our final deciding factor. You've been correct all along. Thank you for finally beating it into my head.
__________________
HOW TO-Kodi 2D-3D-UHD (4k) HDR Guide Internal & External Players
W10 1909 9604GB RGB 4:4:4 @Matched Refresh Rates 8,10,12bit
KODI MPC-HC/BE PDVD19 DVDFab 3 & 5 PotPlayer
65JS8500 UHD HDR 3D
brazen1 is offline   Reply With Quote