Old 17th December 2017, 16:47   #47781  |  Link
mclingo
Registered User
 
Join Date: Aug 2016
Posts: 998
Quote:
Originally Posted by Clammerz View Post
This is not true. While a 10bit file is generally good depending on the encoder, 10bit output to the display is a crapshoot. MadVR outputting 8bit with high quality dithering can outperform a TV's processing engine (or graphics card drivers which may downsample behind the scenes) which may display banding with 10bit output.

You then go on to say that personally for yourself you don't see a difference and that is fine. But you shouldn't say "should be better" without qualifying so as not to confuse users which may not know the details behind the scenes.

Regarding your 3D issue, perhaps you should post about it on the bugtracker for a detailed record of the problem.
What you are essentially saying there is that people who think they are getting 10 bit might not be, due to some background processing they might not be aware of, and if that is the case then 8 bit + dithering will be better. Well, that's obvious, isn't it?

Those with a true 10-bit panel and the ability to send 10 bit to it will have the better picture though, won't they, since dithering is only approximating data that's already there in a 10-bit signal.


My 3D is working, I just have the well-documented FSE (fullscreen exclusive) problems, so I want to stop using it. I can't get 3D MVC to work without FSE, and I just want to know if I'm wasting my time trying, as it simply doesn't work without FSE.
mclingo is offline   Reply With Quote
Old 17th December 2017, 17:25   #47782  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 6,062
What's the difference between 10 bit and 8 bit? The noise level.
Dithering affects the lowest bits, the bits which are already all over the place thanks to encoding, and which are altered again by chroma scaling and the YCbCr -> RGB conversion.
So now you take that RGB, dither it to 10 bit, which makes the lowest bits less accurate, and hand it to the GPU, which converts it to YCbCr, downscales it to half the resolution for 4:2:0 (yes, it has half the resolution of RGB) and hopefully dithers it yet again, which again adds noise to the lowest bits.

But now the real fun: guess what a display has to do to display an image. Well, it needs RGB, so guess what it does yet again...

RGB 8 bit shines most with TVs that properly support 4:4:4, and on these TVs the chroma scaler has a bigger impact on the image.

RGB 8 bit can give you a more refined image, while a higher bit depth can give you less dither noise with madVR, and less banding with renderers that don't dither properly.
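
To illustrate that trade-off, here is a minimal Python sketch (not part of the original post; the helper names quantize and max_run are made up): plain rounding of a smooth ramp leaves long flat steps (banding), while dithering breaks them up at the cost of a little noise in the lowest bit.
Code:
import numpy as np

rng = np.random.default_rng(0)
ramp = np.linspace(0.0, 1.0, 4096)   # stands in for the renderer's high-precision result

def quantize(signal, bits, dither=False):
    """Quantize a 0..1 float signal to `bits` bits, optionally adding random dither first."""
    levels = 2 ** bits - 1
    noise = rng.uniform(-0.5, 0.5, signal.shape) if dither else 0.0
    return np.clip(np.round(signal * levels + noise), 0, levels) / levels

def max_run(x):
    """Longest run of identical consecutive values -- a long run is a visible band."""
    edges = np.concatenate(([-1], np.flatnonzero(np.diff(x) != 0), [x.size - 1]))
    return int(np.max(np.diff(edges)))

for bits in (8, 10):
    plain = quantize(ramp, bits)
    dithered = quantize(ramp, bits, dither=True)
    rms = np.sqrt(np.mean((dithered - ramp) ** 2))
    print(f"{bits} bit: longest flat band {max_run(plain)} px undithered, "
          f"{max_run(dithered)} px dithered (RMS dither noise {rms:.1e})")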
huhn is offline   Reply With Quote
Old 17th December 2017, 19:14   #47783  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 738
Quote:
Originally Posted by huhn View Post
What's the difference between 10 bit and 8 bit? The noise level.
Dithering affects the lowest bits, the bits which are already all over the place thanks to encoding, and which are altered again by chroma scaling and the YCbCr -> RGB conversion.
So now you take that RGB, dither it to 10 bit, which makes the lowest bits less accurate, and hand it to the GPU, which converts it to YCbCr, downscales it to half the resolution for 4:2:0 (yes, it has half the resolution of RGB) and hopefully dithers it yet again, which again adds noise to the lowest bits.

But now the real fun: guess what a display has to do to display an image. Well, it needs RGB, so guess what it does yet again...

RGB 8 bit shines most with TVs that properly support 4:4:4, and on these TVs the chroma scaler has a bigger impact on the image.

RGB 8 bit can give you a more refined image, while a higher bit depth can give you less dither noise with madVR, and less banding with renderers that don't dither properly.
This might be true with SDR, but not with HDR. ST2084 needs at least 10bits to avoid producing banding, especially in dark/bright areas, although it matters more in the content than for rendering.

My display is 12bits from end to end (input to panels) and 10bits output produces a better result than 8bits, especially in HDR. This is with my nVidia set to 4K23 RGB Full 12bits 4:4:4, as per my sig below.
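
As a rough illustration of the ST2084 point (not part of the original post, and assuming full-range quantization for simplicity), this Python sketch computes how big the luminance jump between adjacent code values is in the shadows at 8 bit versus 10 bit:
Code:
import numpy as np

# SMPTE ST 2084 EOTF constants
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf(code, bits):
    """Luminance in cd/m^2 for an integer PQ code value at the given bit depth (full range)."""
    ep = (code / (2 ** bits - 1)) ** (1 / m2)
    return 10000.0 * (np.maximum(ep - c1, 0.0) / (c2 - c3 * ep)) ** (1 / m1)

for bits in (8, 10):
    lum = pq_eotf(np.arange(2 ** bits, dtype=float), bits)
    i = int(np.argmin(np.abs(lum - 0.1)))   # the code closest to a dark ~0.1 cd/m^2 shadow
    step = 100.0 * (lum[i + 1] - lum[i]) / lum[i]
    print(f"{bits} bit: around {lum[i]:.3f} cd/m^2 the next code is {step:.1f}% brighter")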
__________________
Win10 Pro x64 b1909 MCE
i7 3770K@4.0Ghz 16Gb@2.18Ghz EVGA GTX 1080 Ti SC2 11Gb@2GHz 441.66 RGB Full 8bits
MPC-BE/LAV/MadVR/jRiver/MyMovies V5.25/CMC V3.1
Denon X8500H>HD Fury Maestro>JVC RS2000
Manni is offline   Reply With Quote
Old 17th December 2017, 19:24   #47784  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 6,062
No, you don't need 10 bit for HDR, no question at all: nearly all TVs out there have an 8-bit panel and no banding problems.
ST2084 was created to produce less banding in the signal, but pixels don't work like this. Pixels are driven pretty much the same with SDR or HDR, because a pixel changes its brightness with the number of transistors active for each subpixel.

The problem with the real bit depth of panels is so big that it is close to impossible to test a panel's bit depth from the signal, because 8 bit + FRC looks so close to a real 10-bit panel that neither the human eye nor a measurement device can easily tell the difference. Tests are so bad that even a VG248QE, clearly a 6-bit panel, gets rated as 8 bit at rtings even though the specs say it is 6 bit... (and it's obviously visible to me with test patterns, though not with real content).

And AFAIK there is no 12-bit panel on the market at all (professional screens are 10 bit too), and only a very limited number of 10-bit panels.

If that's the result you get, feel free to stick with it.
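
As a crude illustration of the FRC point (not part of the original post; real FRC uses fixed patterns rather than the random flipping used here, and the helper name frc_8bit is made up), this Python sketch shows how temporal dithering on an 8-bit panel averages out to almost any 10-bit level:
Code:
import numpy as np

rng = np.random.default_rng(0)

def frc_8bit(code_10bit, frames):
    """Approximate 10-bit target levels on an 8-bit panel by temporal dithering over `frames`."""
    target = code_10bit / 1023.0                    # desired level, 0..1
    lo = np.floor(target * 255) / 255               # nearest 8-bit level below
    hi = np.minimum(lo + 1 / 255, 1.0)              # nearest 8-bit level above
    p_hi = (target - lo) * 255                      # fraction of frames the brighter level is shown
    shown = np.where(rng.random((frames,) + target.shape) < p_hi, hi, lo)
    return shown.mean(axis=0)                       # what the eye or a slow meter integrates

codes = np.arange(1024, dtype=float)                # every 10-bit code value
avg = frc_8bit(codes, frames=240)                   # roughly 4 seconds of frames at 60 Hz
worst = np.max(np.abs(avg - codes / 1023.0)) * 1023.0
print(f"worst-case error after averaging: {worst:.2f} of one 10-bit step")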
huhn is offline   Reply With Quote
Old 17th December 2017, 22:49   #47785  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 738
Quote:
Originally Posted by huhn View Post
No, you don't need 10 bit for HDR, no question at all: nearly all TVs out there have an 8-bit panel and no banding problems.
ST2084 was created to produce less banding in the signal, but pixels don't work like this. Pixels are driven pretty much the same with SDR or HDR, because a pixel changes its brightness with the number of transistors active for each subpixel.

The problem with the real bit depth of panels is so big that it is close to impossible to test a panel's bit depth from the signal, because 8 bit + FRC looks so close to a real 10-bit panel that neither the human eye nor a measurement device can easily tell the difference. Tests are so bad that even a VG248QE, clearly a 6-bit panel, gets rated as 8 bit at rtings even though the specs say it is 6 bit... (and it's obviously visible to me with test patterns, though not with real content).

And AFAIK there is no 12-bit panel on the market at all (professional screens are 10 bit too), and only a very limited number of 10-bit panels.

If that's the result you get, feel free to stick with it.
The 2015+ JVC projectors have been confirmed to handle 12bits from the input to the panels. I don't care about flat panels so no idea if it exists for TVs or not.
__________________
Win10 Pro x64 b1909 MCE
i7 3770K@4.0Ghz 16Gb@2.18Ghz EVGA GTX 1080 Ti SC2 11Gb@2GHz 441.66 RGB Full 8bits
MPC-BE/LAV/MadVR/jRiver/MyMovies V5.25/CMC V3.1
Denon X8500H>HD Fury Maestro>JVC RS2000
Manni is offline   Reply With Quote
Old 17th December 2017, 23:41   #47786  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,766
Quote:
Originally Posted by Manni View Post
This might be true with SDR, but not with HDR. ST2084 needs at least 10bits to avoid producing banding, especially in dark/bright areas, although it matters more in the content than for rendering.
Quote:
Originally Posted by Manni View Post
The 2015+ JVC projectors have been confirmed to handle 12bits from the input to the panels. I don't care about flat panels so no idea if it exists for TVs or not.
You still do not need a >8-bit output for displaying HDR content without banding if you have high-quality dithering. 8-bit output is a little noisier, but that noise is very subtle. Keep 10-bit output from madVR and 12-bit from the GPU, but don't expect it to make a significant difference.

Can you tell the difference between 8-bit and 10-bit from madVR with ordered dithering enabled (with the GPU and projector settings the same)?
__________________
madVR options explained
Asmodian is offline   Reply With Quote
Old 18th December 2017, 00:01   #47787  |  Link
mclingo
Registered User
 
Join Date: Aug 2016
Posts: 998
OK, so if there is no point in using 10 bit, do people recommend we all drop back to 8 bit and use 4:4:4 full RGB on our TVs? How will this affect the playback of 10-bit HDR material?
mclingo is offline   Reply With Quote
Old 18th December 2017, 00:35   #47788  |  Link
ryrynz
Registered User
 
ryrynz's Avatar
 
Join Date: Mar 2009
Posts: 3,249
Manni, calm down on the excessive quoting. If you're replying directly after a post there's no need to quote.
ryrynz is offline   Reply With Quote
Old 18th December 2017, 00:35   #47789  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,766
It gets a lot more complicated when you bring GPU drivers and TVs into it. Some need 10-bit input to trigger their HDR modes; in that case you need the GPU to output 10+ bits.

The point is that being unable to get madVR to output 10 bit is fine for image quality, even if you are watching 10-bit HDR, and 8-bit RGB or YCbCr 4:4:4 is better than 10-bit YCbCr 4:2:2 or 4:2:0, unless your TV isn't good with RGB or 4:4:4 input (e.g. it does its internal processing in 4:2:2).
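
A back-of-the-envelope sketch (not part of the original post) of what each signal format actually carries per pixel; the chroma resolution given up by 4:2:2/4:2:0 is the part that dithering cannot give back:
Code:
formats = [
    # (label,               bits, samples per pixel, share of chroma samples kept)
    ("RGB 8 bit (4:4:4)",      8, 3.0, 1.00),
    ("YCbCr 10 bit 4:4:4",    10, 3.0, 1.00),
    ("YCbCr 10 bit 4:2:2",    10, 2.0, 0.50),   # chroma halved horizontally
    ("YCbCr 10 bit 4:2:0",    10, 1.5, 0.25),   # chroma halved in both directions
]
for label, bits, samples, chroma in formats:
    print(f"{label:22s} {bits * samples:5.1f} bits/pixel, {chroma:.0%} of the chroma samples")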
__________________
madVR options explained
Asmodian is offline   Reply With Quote
Old 18th December 2017, 00:44   #47790  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 6,062
Quote:
Originally Posted by Manni View Post
The 2015+ JVC projectors have been confirmed to handle 12bits from the input to the panels. I don't care about flat panels so no idea if it exists for TVs or not.
Quote:
Originally Posted by mclingo View Post
OK, so if there is no point in using 10 bit, do people recommend we all drop back to 8 bit and use 4:4:4 full RGB on our TVs? How will this affect the playback of 10-bit HDR material?
I can't remember exactly, but if I'm not mistaken AMD needs 10 bit for whatever reason or it will not send HDR.

So it's your choice whether you go 4:2:2 10 bit, or RGB 8 bit for 60 Hz, which will be double dithered (I tested the AMD dithering to 6 bit and it's pretty much random dithering, not bad but not great either; 10 -> 8 bit should be the same, but that's hard to test because it looks very similar).

Windows 10 HDR should work at 8 bit even with AMD cards.
Quote:
Originally Posted by Manni View Post
The 2015+ JVC projectors have been confirmed to handle 12bits from the input to the panels. I don't care about flat panels so no idea if it exists for TVs or not.
It's hard to find a screen that doesn't support 12-bit input; even TVs from 2012 and older can do this easily.
And I wonder how people confirm stuff like this...
huhn is offline   Reply With Quote
Old 18th December 2017, 00:46   #47791  |  Link
mitchmalibu
Registered User
 
Join Date: Mar 2009
Posts: 37
It's frankly not that difficult to test for yourself... I watched a few HDR movies in 4:4:4 12 bit and in RGB 8 bit, and without using comparison shots I wouldn't be able to see any difference on a 2016 LG 4K OLED panel. 8-bit RGB with a custom resolution to avoid frame drops seems like the better alternative from what I personally tried.
What madshi and others said makes sense: better to reduce the number of conversions and drop the bit depth than to push for maximum bit depth and trigger who knows how many post-processing steps over which you have no control.
Now, if only Nvidia gave us the option to create proper custom resolutions at something other than 8-bit RGB...
__________________
OS: Win10 1703
GPU: GTX 1070 (latest stable drivers)
Monitor: LG OLED55B6V TV / Yamaha RX-A860 AVR
Media setup: MPC-BE x64, madvr, lav filters (nightly)
mitchmalibu is offline   Reply With Quote
Old 18th December 2017, 02:09   #47792  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 738
Quote:
Originally Posted by Asmodian View Post
You still do not need a >8-bit output for displaying HDR content without banding if you have high-quality dithering. 8-bit output is a little noisier, but that noise is very subtle. Keep 10-bit output from madVR and 12-bit from the GPU, but don't expect it to make a significant difference.

Can you tell the difference between 8-bit and 10-bit from madVR with ordered dithering enabled (with the GPU and projector settings the same)?
I don't want to dither if I don't have to. So yes, I prefer to keep 10bits output over 12bits GPU for 10bits HDR playback (except for 60p, where the driver drops automatically to 8bits). I don't see why I should add noise when I don't have to.

I remember testing a while ago and the difference was subtle, but it was there for 10bits HDR content. When I tried to see a difference with Blu-ray content (8bits SDR), I couldn't see any difference between 8bits and 10bits. I don't have the time to do more pixel peeping just because some prefer to watch 10bits HDR content dithered to 8bits.

I want to get as close to a "pure direct" equivalent as I can. I do agree though that it's not the end of the world if you drop to 8bits, but I want the same PQ with madVR as my standalone player (or better), not worse.
__________________
Win10 Pro x64 b1909 MCE
i7 3770K@4.0Ghz 16Gb@2.18Ghz EVGA GTX 1080 Ti SC2 11Gb@2GHz 441.66 RGB Full 8bits
MPC-BE/LAV/MadVR/jRiver/MyMovies V5.25/CMC V3.1
Denon X8500H>HD Fury Maestro>JVC RS2000

Last edited by Manni; 18th December 2017 at 02:13.
Manni is offline   Reply With Quote
Old 18th December 2017, 02:16   #47793  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,766
We are not saying we prefer dithered 8 bit to 10 bit, only that no one can tell the difference, all else being equal.

The point is only that you should not sacrifice anything for 10-bit output, not that 8-bit output is better.

Edit: And you always have to dither; you simply dither to 10 bit or 12 bit instead of dithering to 8 bit.
__________________
madVR options explained
Asmodian is offline   Reply With Quote
Old 18th December 2017, 02:50   #47794  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 738
Quote:
Originally Posted by huhn View Post
I can't remember exactly, but if I'm not mistaken AMD needs 10 bit for whatever reason or it will not send HDR.

So it's your choice whether you go 4:2:2 10 bit, or RGB 8 bit for 60 Hz, which will be double dithered (I tested the AMD dithering to 6 bit and it's pretty much random dithering, not bad but not great either; 10 -> 8 bit should be the same, but that's hard to test because it looks very similar).

Windows 10 HDR should work at 8 bit even with AMD cards.


it's hard to find a screen that doesn't support 12 bit input even TV from 2012 and older can do this easily.
and i wonder how people confirm stuff like this...
I don't know about AMD; I have an nVidia 1080 Ti (see my sig). It's perfectly possible to send 8bits or 12bits to get HDR with nVidia. It is not possible to use 4:2:2 10bits with nVidia. It's 8bits or 12bits.

Therefore there is no need to use 4:2:2 with nVidia. I use 12bits 4:4:4 for 99.99% of films, and 8bits 4:4:4 for the odd 60p film such as Billy Lynn.

I did say that it was not only the input but the whole chain up to the panels that is 12bits. I don't have the reference here, but it has been confirmed. It wasn't the case for the pre-2015 JVCs, which had 12bits inputs but 10bits panels.

Quote:
Originally Posted by Asmodian View Post
We are not saying we prefer dithered 8 bit to 10 bit, only that no one can tell the difference, all else being equal.

The point is only that you should not sacrifice anything for 10-bit output, not that 8-bit output is better.

Edit: And you always have to dither; you simply dither to 10 bit or 12 bit instead of dithering to 8 bit.
I am not sacrificing anything, so I don't see why I should use 8bits. And yes, I meant that I don't want to dither to 8bits if I don't have to; I thought the context would make that clear, given that we have already discussed many times that madVR's output is 16bits before dithering.

I suggest we discuss this again when you have been able to compare 8bits vs 10bits output on native 10/12bits panels. Until then, I'm out.
__________________
Win10 Pro x64 b1909 MCE
i7 3770K@4.0Ghz 16Gb@2.18Ghz EVGA GTX 1080 Ti SC2 11Gb@2GHz 441.66 RGB Full 8bits
MPC-BE/LAV/MadVR/jRiver/MyMovies V5.25/CMC V3.1
Denon X8500H>HD Fury Maestro>JVC RS2000
Manni is offline   Reply With Quote
Old 18th December 2017, 05:44   #47795  |  Link
Oguignant
Registered User
 
Oguignant's Avatar
 
Join Date: Nov 2016
Posts: 181
Hi guys, a question... Is there any way that when madVR automatically changes to 23 Hz it also changes the bit depth to 12 bits? With the old Nvidia drivers I could do it. Is there any other software that I can use on Windows 10 and combine with madVR, or run independently of madVR?

Thanks
__________________
"To infinity, and beyond!"
Oguignant is offline   Reply With Quote
Old 18th December 2017, 07:28   #47796  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,766
Quote:
Originally Posted by Manni View Post
I am not sacrificing anything, so I don't see why I should use 8bits.
You should not. If you don't have to sacrifice anything use 10 bit, of course.

Quote:
Originally Posted by Manni View Post
I suggest we discuss this again when you have been able to compare 8bits vs 10bits output on native 10/12bits panels. Until then, I'm out.
I am using a 2017 LG OLED, which is native 10 bit.
__________________
madVR options explained
Asmodian is offline   Reply With Quote
Old 18th December 2017, 07:33   #47797  |  Link
Oguignant
Registered User
 
Oguignant's Avatar
 
Join Date: Nov 2016
Posts: 181
Quote:
Originally Posted by Asmodian View Post

I am using a 2017 LG OLED, which is native 10 bit.
I have a 2017 LG OLED too. Are these panels 10 or 12 bits?
__________________
"To infinity, and beyond!"
Oguignant is offline   Reply With Quote
Old 18th December 2017, 07:35   #47798  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,766
10 bit
__________________
madVR options explained
Asmodian is offline   Reply With Quote
Old 18th December 2017, 07:40   #47799  |  Link
Oguignant
Registered User
 
Oguignant's Avatar
 
Join Date: Nov 2016
Posts: 181
Quote:
Originally Posted by Asmodian View Post
10 bit
I thought it was 12 bits because it supports Dolby Vision.
__________________
"To infinity, and beyond!"
Oguignant is offline   Reply With Quote
Old 18th December 2017, 08:08   #47800  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 6,062
There are 8-bit panels that can do Dolby Vision.

LG has never confirmed more than 10-bit internal processing; they even featured this as a new "thing".

WRGB OLED can't be properly compared anyway; it needs heavy processing to even get an image on the screen, and if the subpixels really were driven at 10 bpc that would mean this screen uses 40 bpp.
huhn is offline   Reply With Quote