Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion. Before you start posting please read the forum rules. By posting to this forum you agree to abide by the rules. |
12th January 2019, 19:16 | #54242 | Link | |
Registered User
Join Date: May 2018
Posts: 259
|
Quote:
Do you have any sources in 12 bit? I have read on this forum and many others, while chasing a supposedly better dream by increasing bits, that in reality it does nothing and indeed can cause issues. Search "8 bit vs 10 bit" on Google and the majority say it's not worth it: you cannot get 10 bit as NVIDIA has disabled it unless it's for games, you will never see more colours, and madVR will never output more than 10 bit anyway. So no, not snarky, I just thought there was something I was missing, and wondered why people chased the extra hassle of changing settings all the time. |
|
12th January 2019, 20:15 | #54243 | Link |
Registered User
Join Date: Oct 2017
Posts: 331
|
I don't think there is even a 12-bit panel made for consumers, at least nothing mainstream.
__________________
HOW TO-Kodi 2D-3D-UHD (4k) HDR Guide Internal & External Players W11 Pro 24H2 GTX960-4GB RGB 4:4:4 @Matched Refresh Rates 8,10,12bit KODI 22 MPC-HC/BE 82" Q90R Denon S720W |
12th January 2019, 20:55 | #54244 | Link |
Registered User
Join Date: Mar 2009
Posts: 3,650
|
I switched to 12 bit. After turning off dithering for testing and comparing, I could see better gradation in the greyscale ramp. It's minor, but it's a free improvement so I'll take it; just test for yourself how your screen handles it.
|
12th January 2019, 21:01 | #54245 | Link |
Registered User
Join Date: Oct 2012
Posts: 7,925
|
Excuse me? If you disabled dithering and only saw a minor difference, then you should maybe look for a better test pattern.
I even have trouble imagining a processing error in a TV that would be worse than sending 8 bit without dithering. |
12th January 2019, 21:42 | #54246 | Link | |
Registered Developer
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,346
|
Quote:
While this is true for now, it's also irrelevant. We use 12-bit because that's what NVIDIA gives you with many HDMI TVs. It's either 8 or 12, and if madVR can output 10-bit, then clearly 12 is the only mode that actually preserves those 10 bits, because 8 would reduce them.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders |
|
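The arithmetic behind this is easy to check. Here's a quick NumPy sketch (illustrative only, not madVR's actual code) showing that a 10-bit ramp carried in a 12-bit container survives a round trip, while an 8-bit container (without dithering) collapses levels:

```python
import numpy as np

# Every possible 10-bit level (a full greyscale ramp).
v10 = np.arange(1024, dtype=np.uint16)

# Carry it over a 12-bit link: pad with two zero bits. Lossless.
link12 = v10 << 2
assert np.array_equal(link12 >> 2, v10)   # all 1024 levels survive

# Carry it over an 8-bit link (no dithering): drop two bits. Lossy.
link8 = (v10 >> 2).astype(np.uint8)

print(np.unique(link12).size)  # 1024 distinct levels preserved
print(np.unique(link8).size)   # 256 -- three of every four levels merged
```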
12th January 2019, 21:57 | #54247 | Link | |
Registered User
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
|
Quote:
Dithering is often misunderstood: it is not hiding errors, it is how to do the math correctly when viewing/listening to digitally sampled sources. You can use test patterns with dithering off when trying to understand what your display's internal processing is doing, but do not make extrapolations from those tests like you did here. You assumed there is a "free improvement" from a test that was not representative of the image you would actually see when using 8 or 12 bit. How does that tell you anything? I get why we all do these overly simple tests, but we need to be very careful when interpreting, or disseminating, the results. There can be a lot of misinformation spread based on inappropriate testing methodology.
__________________
madVR options explained |
|
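The "doing the math correctly" point can be demonstrated in a few lines. This is an illustrative NumPy sketch using simple TPDF dither (madVR actually uses more sophisticated error diffusion): a flat 10-bit grey level that falls between two 8-bit codes is reproduced exactly on average once dither is added before rounding, while plain rounding carries a fixed error that never averages out:

```python
import numpy as np

rng = np.random.default_rng(0)

# A flat 10-bit grey level that sits between two 8-bit codes.
level = 513 / 1023             # ~0.5015, i.e. 127.875 in 8-bit units
src = np.full(100_000, level)

# Quantize to 8 bit without dithering: every pixel snaps to code 128.
plain = np.round(src * 255)

# Add +-1 LSB triangular (TPDF) dither before rounding.
tpdf = rng.uniform(-0.5, 0.5, src.size) + rng.uniform(-0.5, 0.5, src.size)
dithered = np.clip(np.round(src * 255 + tpdf), 0, 255)

print(abs(plain.mean() / 255 - level))     # fixed error, never averages out
print(abs(dithered.mean() / 255 - level))  # ~0: the level survives on average
```

On a real image the eye does the averaging, which is why a dithered 8-bit signal can look essentially transparent while an undithered one bands.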
13th January 2019, 04:26 | #54250 | Link |
Registered User
Join Date: Mar 2009
Posts: 3,650
|
Of course I did this first; however, I couldn't determine a clear result that way. Testing with dithering off helped me determine whether it made any difference at all, and because it did, I can only conclude that if it's not harming quality then it's at least the same or better. That's good enough for me.
|
13th January 2019, 05:42 | #54251 | Link |
Guest
Posts: n/a
|
Why is it that none of these refresh rate tools and utilities have color depth entries?
There is also the good old Windows 10 versioning issue. In the latest 1809 (build 17763) there is no option for me to select 12 bit in any refresh rate mode other than 60 Hz and 59 Hz, but in 1607 (build 14393) there definitely was 12 bit available for the 23-24 Hz range. The madVR mode tool sets 1809 to 24 Hz 8 bit and 1607 to 24 Hz BLANK bit (presumably BLANK bit = 12 bit). |
13th January 2019, 10:19 | #54252 | Link | |
Registered User
Join Date: May 2018
Posts: 259
|
Quote:
I am using 416.34. |
|
13th January 2019, 11:57 | #54253 | Link | |
Registered User
Join Date: May 2018
Posts: 259
|
Quote:
Last edited by madjock; 13th January 2019 at 13:46. |
|
13th January 2019, 12:17 | #54254 | Link | |
Registered Developer
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,346
|
Quote:
Please stop spreading your half-knowledge. The only thing you need a Quadro for is 10-bit OpenGL output, which has some weird historical reasons. I encourage you to read up on this topic, which has been discussed countless times in this very thread already. You don't need a Pro card for 10-bit. And what logic would that even be: why would 10-bit be limited to Pro cards, but 12-bit be allowed? Clearly 12 is superior to 10 if it's fully supported, since it's even more bits. The reason you get 12-bit on HDMI TVs is that the drivers only offer the highest bit depth the display supports, which in that case is 12. It has nothing to do with "Pro" at all. If you have a screen which only supports 10-bit, like some PC monitors, then it'll offer 10-bit output in the control panel. There is absolutely no "Pro" limitation whatsoever here. 10-bit or 12-bit are used for proper HDR output even by the OS itself, as well as games, and one doesn't game on Pro cards much.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders Last edited by nevcairiel; 13th January 2019 at 12:24. |
|
13th January 2019, 12:58 | #54255 | Link |
Registered User
Join Date: Oct 2016
Posts: 896
|
Do you mean Windows is able to 'force' 10-bit output for native OS HDR even if NVIDIA doesn't make the option available in its control panel? Or is the OS HDR mode ultimately converted to the output setting that is selected in NVIDIA's control panel?
__________________
HTPC: Windows 10 22H2, MediaPortal 1, LAV Filters/ReClock/madVR. DVB-C TV, Panasonic GT60, Denon 2310, Core 2 Duo E7400 oc'd, GeForce 1050 Ti 536.40 |
13th January 2019, 13:08 | #54256 | Link |
Registered User
Join Date: Oct 2012
Posts: 7,925
|
No, it is just sending an image to the GPU driver, and the GPU does whatever it wants with it; the driver has to make the image fit the signal needed for the end device.
But you can always, even on an 8-bit screen, throw 10 or even 16 bit at the driver. 10-bit output is pretty important for games: game developers never learned or cared about dithering and high-bit-depth buffering, so banding in skies is just the norm in games that render in 10 bit. And to simply put the nail in the coffin: AMD can send 10 bit, and AMD can even do 6. |
13th January 2019, 13:40 | #54257 | Link |
Registered User
Join Date: Apr 2017
Posts: 366
|
Re NVIDIA: there is a way to get 12 bit out and survive a reboot with later drivers.
Use CRU to create the custom res, then go into the NCP and select "use default colour settings". However, you're not done yet, as this will change the output to Limited range. So next, download this tool to force Full RGB for everything (link: http://blog.metaclassofnil.com/wp-co...angeToggle.zip), extract it, run it, select force Full range, and reboot. All good to go now: a custom res for 23.976 at full range with 12 bit. Job done.
__________________
LG OLED55BX6LB, Zidoo Z1000 Pro, Yamaha RX-A3060, Polk Signature Fronts & Centre, Wharfedale D300 Atmos surrounds, Polk Signature HTS 10 Sub, DSPeaker Antimode 8033 Cinema Last edited by oldpainlesskodi; 13th January 2019 at 13:49. |
13th January 2019, 13:45 | #54259 | Link |
Registered User
Join Date: Apr 2017
Posts: 366
|
Yeah, but I wasn't sure if it merely enabled Full range or forced Full range for everything, so good to know.
__________________
LG OLED55BX6LB, Zidoo Z1000 Pro, Yamaha RX-A3060, Polk Signature Fronts & Centre, Wharfedale D300 Atmos surrounds, Polk Signature HTS 10 Sub, DSPeaker Antimode 8033 Cinema |
13th January 2019, 14:29 | #54260 | Link |
Registered User
Join Date: Dec 2014
Posts: 1,127
|
12-bit YCbCr 4:2:2 is all that is available to Blu-ray players at 60 Hz as part of the HDMI 2.0 spec; there is no 10-bit YCbCr 4:2:2. So the UHD format was designed for 12-bit output. I doubt 10 bits would make much of a difference unless the GPU is doing a poor conversion. Current UHD displays are supposed to be designed for 12-bit YCbCr 4:2:2.
__________________
HOW TO - Set up madVR for Kodi DSPlayer & External Media Players |
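The bandwidth arithmetic behind this is worth spelling out. A back-of-the-envelope sketch (assumptions: the standard CTA-861 4K60 total timing of 4400x2250, the HDMI 2.0 limit of a 600 MHz TMDS character clock, and the fact that HDMI packs 4:2:2 at any depth up to 12 bit into the same 24 bits per pixel as 8-bit RGB):

```python
# HDMI 2.0 carries 24 bits per TMDS clock; deeper RGB/4:4:4 formats
# multiply the clock, while 4:2:2 always fits the 24-bit slot
# (Y plus one alternating chroma sample), even at 12 bit.
MAX_CLOCK_MHZ = 600.0                    # HDMI 2.0 TMDS limit
base_clock = 4400 * 2250 * 60 / 1e6      # 4K60 total timing -> 594.0 MHz

def fits(bits_per_pixel):
    """True if the format's effective TMDS clock stays within HDMI 2.0."""
    return base_clock * bits_per_pixel / 24 <= MAX_CLOCK_MHZ

print(fits(24))  # 8-bit RGB 4:4:4              -> True  (594 MHz)
print(fits(30))  # 10-bit RGB 4:4:4             -> False (742.5 MHz)
print(fits(36))  # 12-bit RGB 4:4:4             -> False (891 MHz)
print(fits(24))  # 12-bit YCbCr 4:2:2 (packed)  -> True  (594 MHz)
```

This is why 4K60 HDR over HDMI 2.0 ends up as 12-bit 4:2:2 rather than 10-bit or 12-bit RGB.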
Tags |
direct compute, dithering, error diffusion, madvr, ngu, nnedi3, quality, renderer, scaling, uhd upscaling, upsampling |