Old 22nd May 2024, 14:34   #64821  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 8,008
YCbCr 62.56 102.34 240 is red (RGB 255 0 0). Notice something? You can't put that in 8 bit; the exact values fall between the integer codes.

Here is the other way around:
YCbCr 63 102 240 is RGB 255.51 0.58 -0.2, which is out of range on two channels.

I know of no guide for this.
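
For reference, a minimal sketch of the math behind these numbers, assuming BT.709 limited-range coefficients (an assumption, though it does reproduce the values above):

Code:
# Sketch: BT.709 limited-range YCbCr -> RGB. The matrix is an assumption
# that matches the numbers above; other matrices/ranges give other values.
def ycbcr_to_rgb_bt709_limited(y, cb, cr):
    # Limited range: Y in [16, 235], Cb/Cr in [16, 240], centered on 128.
    r = 1.164384 * (y - 16) + 1.792741 * (cr - 128)
    g = 1.164384 * (y - 16) - 0.213249 * (cb - 128) - 0.532909 * (cr - 128)
    b = 1.164384 * (y - 16) + 2.112402 * (cb - 128)
    return r, g, b

print(ycbcr_to_rgb_bt709_limited(63, 102, 240))
# -> roughly (255.51, 0.58, -0.20): legal 8 bit YCbCr, out-of-range RGB.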
Old 22nd May 2024, 15:41   #64822  |  Link
Amuat
Registered User
 
Join Date: Jun 2023
Posts: 59
By "all this stuff" I meant more generally how to get the most out of your monitor, hardware and software. A practical guide to set up everything correctly, basically.
Old 22nd May 2024, 16:21   #64823  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 8,008
Get a colorimeter and learn calibration.
That's how you learn to set up a device correctly; the rest is learning by doing.
Old 27th May 2024, 15:25   #64824  |  Link
tp4tissue
Registered User
 
Join Date: May 2013
Posts: 753
Quote:
Originally Posted by Amuat View Post
I guess the recommended setting in NVidia Control Panel is RGB full then? What about 10 bit color: might it cause problems with 8 bit content, convenient as it is to never have to change it?
10bit is just 8 bit dithered. Even if you had a true 10 bit panel, you'd still need dithering near black because of over/undershooting.

The data and processing are digital, but the device itself has unavoidable analog elements, which behave wildly with temperature swings. Professional gear has temperature control and secret sauce in its pipeline in the form of compensation algorithms.
__________________
Ghetto | 2500k 5Ghz
Old 28th May 2024, 14:45   #64825  |  Link
Amuat
Registered User
 
Join Date: Jun 2023
Posts: 59
I guess 8 bit is the safest bet for 8 bit content then? BTW, what does "disable GPU gamma ramps" do exactly, and in what cases could it be useful?
Old 28th May 2024, 17:10   #64826  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 8,008
No, it's not.
Disabling the gamma ramp means unloading ICC/ICM profiles; it resets the GPU's per-channel lookup table to identity.
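
As an illustration, a minimal sketch of what that reset amounts to on Windows, assuming the classic GDI gamma-ramp API: the ramp is a 256-entry table of 16-bit values per channel, and the identity ramp means no correction.

Code:
import ctypes

gdi32 = ctypes.windll.gdi32
user32 = ctypes.windll.user32

hdc = user32.GetDC(None)  # device context for the primary screen

# The GPU gamma ramp is a 3 x 256 table of 16-bit values (R, G, B).
Ramp = ctypes.c_ushort * 256 * 3
ramp = Ramp()
for i in range(256):
    identity = i * 257  # spread 0..255 linearly over 0..65535
    ramp[0][i] = ramp[1][i] = ramp[2][i] = identity

# Loading the identity ramp discards whatever curve an ICC/ICM
# profile (or anything else) had loaded into the video card.
gdi32.SetDeviceGammaRamp(hdc, ctypes.byref(ramp))
user32.ReleaseDC(None, hdc)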
Old 28th May 2024, 17:36   #64827  |  Link
Amuat
Registered User
 
Join Date: Jun 2023
Posts: 59
Quote:
Originally Posted by tp4tissue View Post
The data and processing are digital, but the device itself has unavoidable analog elements, which behave wildly with temperature swings.
Oh, I understood that as a result of the dithering ("10bit is just 8 bit dithered"). But is it better to have 10 bit always on after all, then?
Old 28th May 2024, 17:52   #64828  |  Link
Yz345a
Registered User
 
Join Date: May 2024
Posts: 3
It depends on your use case and the quality of your display. For general use, keeping 10 bit always on might not provide a noticeable benefit and could potentially introduce issues if your content is primarily 8 bit. However, if you're working with high dynamic range (HDR) content or doing professional color grading, having 10 bit enabled could be beneficial. It's always good to test and see if it makes a difference for your specific setup.
Old 28th May 2024, 18:22   #64829  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 8,008
Yes, your display matters.
Whether the source is 8 or 10 bit does not matter; it's neither one after the RGB conversion.
Mathematically, 10 bit output is always better.

If your display is broken, it doesn't matter whether the source is 8 or 10 bit.
Old 29th May 2024, 13:25   #64830  |  Link
tp4tissue
Registered User
 
Join Date: May 2013
Posts: 753
Quote:
Originally Posted by Amuat View Post
Oh, I understood that as a result of the dithering ("10bit is just 8 bit dithered"). But is it better to have 10 bit always on after all, then?
You have to test the display. If it has very good 8 bit / 10 bit processing, then feel free to leave it in whichever mode.

madVR applies extra internal dithering when it detects an 8 bit display. This dithering makes gradients a lot smoother than native 8 bit would normally be.

So madVR has the ability to make a really shitty 8 bit display look as good as a 10 bit display.

But if you enable 10 bit on that shitty display and tell madVR the display is 10bit+, madVR will use a different 10 bit dithering, which is not nearly as good at hiding the banding problems of a bad display.

So you have to test it. You want to enable the dithering on/off toggle hotkey to figure out the native performance of the display. You want to see how it does in 8 bit and 10 bit WITHOUT madVR's help.

If it does poorly in 10 bit, then you want to go back to 8.
If it handles both 8 bit and 10 bit smoothly without madVR's dithering enabled, then you can choose any combination of options and it will be fine.
__________________
Ghetto | 2500k 5Ghz

Last edited by tp4tissue; 29th May 2024 at 13:33.
Old 29th May 2024, 15:02   #64831  |  Link
JNW
Registered User
 
Join Date: Sep 2017
Posts: 52
To convolute matters more, displays will do their own dithering. Most commercially advertised 10 bit displays are actually 8 bit + FRC.
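
For intuition, a minimal sketch of what 8 bit + FRC means. This is a hypothetical illustration (real panels use fixed spatial/temporal patterns rather than random flicker): the panel alternates between the two nearest 8 bit levels so that the temporal average lands on the 10 bit target.

Code:
import numpy as np

target_10bit = 513                    # some 10 bit code value (0..1023)
target = target_10bit / 1023 * 255    # expressed on the 8 bit scale

lo = int(np.floor(target))            # nearest 8 bit level below
hi = min(lo + 1, 255)                 # nearest 8 bit level above
frac = target - lo                    # fraction of frames showing 'hi'

# Show 'hi' on roughly frac of 240 frames, 'lo' on the rest.
frames = np.where(np.random.rand(240) < frac, hi, lo)
print(f"panel levels {lo}/{hi}, temporal average {frames.mean():.3f}, "
      f"target {target:.3f}")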
Old 29th May 2024, 16:39   #64832  |  Link
Amuat
Registered User
 
Join Date: Jun 2023
Posts: 59
Alright, thank you all for the info! This is on a completely different topic, but I was wondering whether using Direct3D 12 instead of 11 for decoding would have any benefits. Maybe more efficient use of resources, if not visual improvements? If there are benefits, are there video processors already available that make use of it?
Old 29th May 2024, 16:58   #64833  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 8,008
Quote:
Originally Posted by tp4tissue View Post
You have to test the display. If it has very good 8 bit / 10 bit processing, then feel free to leave it in whichever mode.

madVR applies extra internal dithering when it detects an 8 bit display. This dithering makes gradients a lot smoother than native 8 bit would normally be.

So madVR has the ability to make a really shitty 8 bit display look as good as a 10 bit display.

But if you enable 10 bit on that shitty display and tell madVR the display is 10bit+, madVR will use a different 10 bit dithering, which is not nearly as good at hiding the banding problems of a bad display.

So you have to test it. You want to enable the dithering on/off toggle hotkey to figure out the native performance of the display. You want to see how it does in 8 bit and 10 bit WITHOUT madVR's help.

If it does poorly in 10 bit, then you want to go back to 8.
If it handles both 8 bit and 10 bit smoothly without madVR's dithering enabled, then you can choose any combination of options and it will be fine.
For the last time: if you disable dithering, the result cannot be smooth.

madVR also has no special 8 bit dithering; it's the same algorithm at every bit depth.

Quote:
Originally Posted by JNW View Post
To convolute matters more displays will do there own dithering. Most commercial advertised 10bit displays are actually 8bit + FRC.
Yes, but that's also not the issue.
Displays don't have an exact gamma response; the response is "something". Doing this at a higher bit depth is technically always better, even on FRC displays.

Technically better and actually better are sadly not the same thing.

Quote:
Originally Posted by Amuat View Post
Alright, thank you all for the info! This is on a completely different topic, but I was wondering whether using Direct3D 12 instead of 11 for decoding would have any benefits. Maybe more efficient use of resources, if not visual improvements? If there are benefits, are there video processors already available that make use of it?
Nothing; the main benefit is for DX12 renderers, which then don't need an interop step for native decoding.
Old 29th May 2024, 19:39   #64834  |  Link
tp4tissue
Registered User
 
Join Date: May 2013
Posts: 753
Quote:
Originally Posted by huhn View Post
For the last time: if you disable dithering, the result cannot be smooth.

madVR also has no special 8 bit dithering; it's the same algorithm at every bit depth.

Huhn, that can't be true.

On my S90C, with a test gradient: 10 bit input file, 8 bit display output, madVR told 8 bit, madVR dithering enabled, and the gradient looks smooth.

- Disable dithering on that 8/8 setup and the gradient looks chunky.

10 bit display output, madVR told 10 bit, madVR dithering disabled: the gradient looks smooth. Dithering enabled: the gradient looks smooth. (Both look smooth.)

I've tested this on both Nvidia and AMD cards.

On my old TCL: 10 bit input file, 8 bit display output, madVR told 8 bit, dithering enabled, and the gradient looks smooth.

- Disable dithering on 8/8 and the gradient looks chunky.

10 bit display output, madVR told 10 bit, dithering on or off: the gradient looks chunky.

So some of the work/behavior must come down to the TV/monitor where gradients are concerned, regardless of madVR's dithering setting.

I don't know that it's a different algorithm, but the interaction is very different depending on the display.
__________________
Ghetto | 2500k 5Ghz

Last edited by tp4tissue; 29th May 2024 at 21:59.
Old 29th May 2024, 19:55   #64835  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,414
Quote:
Originally Posted by tp4tissue View Post
I don't know that it's a different algorithm, but the interaction is very different depending on the display.
These test results do not refute what huhn wrote.

They only tell you that your old TCL display handles 10-bit input badly (it likely truncates or rounds to 8-bit without dithering) and that you don't notice the 'chunks' in an undithered 10-bit gradient on your S90C. The steps in a 10-bit gradient are much smaller (1023 levels instead of 255, so roughly a quarter of the size), so undithered 10-bit gradients do look smoother than undithered 8-bit gradients.

Obviously, a display can mess up a good signal.
__________________
madVR options explained
Old 29th May 2024, 21:57   #64836  |  Link
tp4tissue
Registered User
 
Join Date: May 2013
Posts: 753
Quote:
Originally Posted by Asmodian View Post
Obviously, a display can mess up a good signal.
That's exactly my point. You wouldn't typically disable dithering on purpose. But it's prudent to test how the display behaves with native 8 bit and 10 bit input, WITHOUT madVR's dithering help.

The second test case proves that, under SPECIFIC conditions, madVR will produce smoothness even if the display itself cannot.

If the display does not have good native 10 bit gradient performance, it's better to fall back to 8 bit and let madVR's dithering help it look smooth.

You have to TEST to know whether this is happening. You can't assume that 10 bit output will be better; a lot of mid to high priced TVs still had poor 10 bit output through 2021.
__________________
Ghetto | 2500k 5Ghz

Last edited by tp4tissue; 29th May 2024 at 22:09.
Old 30th May 2024, 00:59   #64837  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,414
Quote:
Originally Posted by tp4tissue View Post
But it's prudent to test how the display behaves with native 8 bit and 10 bit input, WITHOUT madVR's dithering help.
But in that case, as your testing proves, madVR's dithering does not help anyway, so there is no need to disable it to check whether the display can handle 10-bit input properly.
__________________
madVR options explained
Old 30th May 2024, 03:48   #64838  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 8,008
Quote:
Originally Posted by tp4tissue View Post
Huhn, that can't be true.

On my S90C, with a test gradient: 10 bit input file, 8 bit display output, madVR told 8 bit, madVR dithering enabled, and the gradient looks smooth.

- Disable dithering on that 8/8 setup and the gradient looks chunky.

10 bit display output, madVR told 10 bit, madVR dithering disabled: the gradient looks smooth. Dithering enabled: the gradient looks smooth. (Both look smooth.)

I've tested this on both Nvidia and AMD cards.

On my old TCL: 10 bit input file, 8 bit display output, madVR told 8 bit, dithering enabled, and the gradient looks smooth.

- Disable dithering on 8/8 and the gradient looks chunky.

10 bit display output, madVR told 10 bit, dithering on or off: the gradient looks chunky.

So some of the work/behavior must come down to the TV/monitor where gradients are concerned, regardless of madVR's dithering setting.

I don't know that it's a different algorithm, but the interaction is very different depending on the display.
Undithered 10 bit is chunkier than dithered 8 bit.
You are testing something completely irrelevant, because it has already lost before the test is done.

As soon as you dither, the difference is down at the noise level; even 6 bit dithered is totally smooth and wins hands down against undithered 10 bit.

If you can't tell the difference between dithered 8 bit and undithered 10 bit, I'm very sorry to hear that; just use 8.
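
To make that concrete, a minimal sketch; it uses a crude triangular-noise dither of my own rather than madVR's actual error-diffusion algorithm, and measures how wide the flat bands in a quantized gradient get:

Code:
import numpy as np

ramp = np.linspace(0.0, 1.0, 4096)  # an ideal smooth gradient, 4096 px wide

def quantize(x, bits, dither=False):
    levels = (1 << bits) - 1
    if dither:
        # Crude +/-1 LSB triangular-noise dither before rounding;
        # madVR's real error-diffusion dithering is considerably smarter.
        x = x + (np.random.rand(x.size) - np.random.rand(x.size)) / levels
    return np.round(np.clip(x, 0.0, 1.0) * levels) / levels

def widest_band(out):
    # Width of the longest run of identical consecutive pixels.
    change = np.flatnonzero(np.diff(out) != 0)
    edges = np.concatenate(([-1], change, [out.size - 1]))
    return np.diff(edges).max()

for bits, dither in ((8, False), (10, False), (6, True), (8, True)):
    out = quantize(ramp, bits, dither)
    label = "dithered  " if dither else "undithered"
    print(f"{bits:2d} bit {label}: widest flat band = {widest_band(out)} px")

# Undithered output collapses into flat bands (about 16 px wide at 8 bit,
# 4 px at 10 bit here) that the eye reads as banding; dithered output has
# no wide bands at any depth, and the eye averages its fine noise into a
# smooth gradient.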
Old 30th May 2024, 20:39   #64839  |  Link
tp4tissue
Registered User
 
Join Date: May 2013
Posts: 753
Quote:
Originally Posted by huhn View Post
Undithered 10 bit is chunkier than dithered 8 bit.
You are testing something completely irrelevant, because it has already lost before the test is done.

As soon as you dither, the difference is down at the noise level; even 6 bit dithered is totally smooth and wins hands down against undithered 10 bit.

If you can't tell the difference between dithered 8 bit and undithered 10 bit, I'm very sorry to hear that; just use 8.
It depends on the TV.

Properly done undithered 10 bit vs dithered 8 bit is a very subtle difference. If the panel shows a large difference, well, that only means the panel is kind of meh.

The test is not for madVR, it's for the TV.

One could mistakenly assume that because the panel looks good in 8 bit, it should look good in 10 bit. That isn't true, because madVR is so good at hiding bad gradient handling in 8 bit.

So if you don't test for this condition, you wouldn't know that the set may end up performing WORSE in 10 bit, dithered or not.

Like I said, that's been the case for MANY monitors and TVs I've owned: 10 bit looks worse, even with madVR's dithering enabled.
__________________
Ghetto | 2500k 5Ghz
Old 30th May 2024, 21:41   #64840  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 8,008
Just test it with dithering; it's really not that complicated. And no, undithered 10 bit has no chance against dithering.