#54261 | Link
Registered User
Join Date: Jan 2014
Posts: 489
Why is it that none of these refresh-rate tools and utilities have color-depth entries?
There is also the good old Windows 10 versioning issue. In the latest 1809 (build 17763) there is no option for me to select 12-bit in any refresh-rate mode other than 60 Hz and 59 Hz, but in 1607 (build 14393) there definitely was 12-bit available for the 23-24 Hz range. The madVR mode tool sets 1809 to 24 Hz 8-bit and 1607 to 24 Hz BLANK bit (presumably BLANK = 12-bit).
__________________
8700K @ 5GHz | ASUS Z370 Hero X | Corsair 16GB @ 3200MHz | RTX 2080 Ti @ 2100MHz | Samsung 970 NVMe 250GB | WD Black 2TB | Corsair AX 850W | LG 32GK850G-B @ 165Hz | Xonar DGX | Windows 10 LTSC 1809
#54262 | Link
Registered User
Join Date: May 2018
Posts: 224
Quote:
I am using 416.34.
#54263 | Link
Registered User
Join Date: May 2018
Posts: 224
Quote:
Last edited by madjock; 13th January 2019 at 13:46.
#54264 | Link
Registered Developer
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 9,840
Quote:
Please stop spreading your half-knowledge. The only thing you need a Quadro for is 10-bit OpenGL output, which has some weird historical reasons. I encourage you to read up on this topic, which has been discussed countless times in this very thread already. You don't need a Pro card for 10-bit. And what logic would that even be? Why would 10-bit be limited to Pro cards, but 12-bit allowed? Clearly 12 is superior to 10 if it's fully supported, since it's even more bits. The reason you get 12-bit on HDMI TVs is that the drivers only offer the highest bit depth the display supports, which in that case is 12. It has nothing to do with "Pro" at all. If you have a screen which only supports 10-bit, like some PC monitors, then it'll offer 10-bit output in the control panel. There is absolutely no "Pro" limitation whatsoever here. 10-bit or 12-bit is used for proper HDR output even by the OS itself, as well as by games, and one doesn't game on Pro cards much.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
Last edited by nevcairiel; 13th January 2019 at 12:24.
#54265 | Link
Registered User
Join Date: Oct 2016
Posts: 549
Do you mean Windows is able to 'force' 10-bit output for native OS HDR even if NVIDIA doesn't make the option available in its control panel? Or is the OS HDR mode ultimately converted to the output setting that is selected in NVIDIA's control panel?
__________________
HTPC: Windows 10 1809, MediaPortal 1, LAV Filters, ReClock, madVR. DVB-C TV, Panasonic GT60, 6.0 speakers Denon 2310, Core 2 Duo E7400, GeForce 1050 Ti
#54266 | Link
Registered User
Join Date: Oct 2012
Posts: 5,985
No, it just sends an image to the GPU driver, and the GPU does whatever it wants with it. The driver has to make the image fit the signal needed for the end device.
But you can always throw 10 or even 16 bit at the driver, even on an 8-bit screen. 10-bit output is pretty important for games; game developers never learned, or cared, about dithering and high-bit-depth buffering, so banding in skies is just the norm even in games that render in 10 bit. And to put the final nail in the coffin: AMD can send 10-bit. AMD can even do 6.
#54267 | Link
Registered User
Join Date: Apr 2017
Posts: 196
Re NVIDIA: there is a way to get 12-bit out and survive a reboot with later drivers.
Use CRU to create the custom resolution, then go into the NCP and select "use default colour settings". You're not done yet, however, as this will change the output to Limited range. So next, download this tool to force Full RGB for everything (link: http://blog.metaclassofnil.com/wp-co...angeToggle.zip), extract it, run it, select force Full range, and reboot. All good to go now: a custom resolution for 23.976 at full range with 12-bit. Job done.
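For anyone unsure what Limited vs. Full means here: Limited squeezes RGB into the 16-235 studio range, Full uses the whole 0-255 scale. A minimal sketch of the standard expansion (my own illustration of the level maths, not code from the toggle tool):

```cpp
#include <algorithm>
#include <cstdint>
#include <cstdio>

// Expand a limited-range (studio swing, 16-235) 8-bit value to full
// range (0-255). Drivers and displays apply the equivalent internally;
// if the two ends disagree about range, you get crushed or washed-out
// blacks and whites.
uint8_t limited_to_full(uint8_t v) {
    int full = (int(v) - 16) * 255 / (235 - 16);  // 219 input steps -> 255 output steps
    return static_cast<uint8_t>(std::clamp(full, 0, 255));
}

int main() {
    std::printf("limited  16 -> full %3d\n", limited_to_full(16));   // black: 0
    std::printf("limited 128 -> full %3d\n", limited_to_full(128));  // mid grey: 130
    std::printf("limited 235 -> full %3d\n", limited_to_full(235));  // white: 255
}
```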
__________________
Sapphire RX 5700 XT (19.12.2) Ryzen 7 3700x, PRIME X570-Pro, Win 10 x64 (1904.1), Silverstone LC13B-E, Onkyo TX-NR686, Samsung UE55KS8000, Mission M35i speakers, Kodi Dsplayer 17.6 X64
Last edited by oldpainlesskodi; 13th January 2019 at 13:49.
#54269 | Link
Registered User
Join Date: Apr 2017
Posts: 196
Yeah, but I wasn't sure if it merely enabled Full range or forced Full range for everything. Good to know.
__________________
Sapphire RX 5700 XT (19.12.2) Ryzen 7 3700x, PRIME X570-Pro, Win 10 x64 (1904.1), Silverstone LC13B-E, Onkyo TX-NR686, Samsung UE55KS8000, Mission M35i speakers, Kodi Dsplayer 17.6 X64
#54270 | Link
Registered User
Join Date: Dec 2014
Posts: 1,127
12-bit YCbCr 4:2:2 is all that is available to Blu-ray players at 60 Hz as part of the HDMI 2.0 spec; there is no 10-bit YCbCr 4:2:2. So the UHD format was designed around 12-bit output. I doubt 10 bits would make much of a difference unless the GPU is doing a poor conversion. Current UHD displays are supposed to be designed for 12-bit YCbCr 4:2:2.
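A back-of-the-envelope check of why that is (my own numbers, not from the post: HDMI 2.0's 600 MHz TMDS clock leaves a budget of roughly 24 bits per pixel at 4K60):

```cpp
#include <cstdio>

int main() {
    // HDMI 2.0 tops out at a 600 MHz TMDS clock, which at 3840x2160@60
    // leaves a budget of about 24 bits per pixel - the same as RGB 8-bit.
    const double budget_bpp = 24.0;

    // Components per pixel: RGB/4:4:4 carries 3, 4:2:2 carries 2,
    // 4:2:0 carries 1.5 (chroma shared across four pixels).
    struct Mode { const char* name; double components; int depth; };
    const Mode modes[] = {
        {"RGB 8-bit",    3.0,  8},  // 24 bpp: fits, the 4K60 RGB limit
        {"RGB 10-bit",   3.0, 10},  // 30 bpp: too much at 60 Hz
        {"RGB 12-bit",   3.0, 12},  // 36 bpp: too much
        {"4:2:2 12-bit", 2.0, 12},  // 24 bpp: fits, hence the BD-player choice
        {"4:2:0 10-bit", 1.5, 10},  // 15 bpp: fits easily
        {"4:2:0 12-bit", 1.5, 12},  // 18 bpp: fits
    };
    for (const Mode& m : modes) {
        double bpp = m.components * m.depth;
        std::printf("%-13s %5.1f bpp  %s\n", m.name, bpp,
                    bpp <= budget_bpp ? "fits 4K60" : "exceeds HDMI 2.0 at 4K60");
    }
    return 0;
}
```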
__________________
HOW TO - Set up madVR for Kodi DSPlayer & External Media Players
#54271 | Link
Registered User
Join Date: Oct 2012
Posts: 5,985
HDMI 2.0 can do 10-bit 4:2:2 YCbCr, and TVs support it; that's one of the rare cases where NVIDIA does 10-bit.
A BD player should use 4:2:0 10-bit for 60 Hz HDR; that's lossless. I'm not sure where this 12-bit comes from, because PCs don't really support it. You could do it with 16 bit and let the driver do the rest, but still.
#54272 | Link
Registered User
Join Date: May 2018
Posts: 224
So you keep saying.
Here is the thing, though: you think NVIDIA, out of the goodness of their hearts, have let us have the option of 12-bit to help us out? You are quite delusional. The 10-bit is for graphic designers and the like; if you choose to ignore this fact, which is all over the web, then good for you, mate. Does it make any sense? No, but it is what it is. If it were that simple we would have no 8-bit, as surely NVIDIA would want what is best for us, and it would default to 12-bit straight away, with no breaking drivers every second release, fluid 23.976 frame rates, etc., and of course 12-bit staying there after a reboot unless you frig something.
https://www.pugetsystems.com/labs/ar...t-Output-1221/
https://www.reddit.com/r/nvidia/comm...abling_10_bit/
https://forums.evga.com/gtx-1080-sup...-m2510043.aspx
https://devtalk.nvidia.com/default/t.../post/5078463/
https://www.eizoglobal.com/support/c...18-nvidia-amd/
But let's leave that there.
Last edited by madjock; 13th January 2019 at 19:24.
#54273 | Link
Registered User
Join Date: Oct 2012
Posts: 5,985
The professional 10-bit is OpenGL using an 8-bit surface and hacking the 2 missing bits into the alpha channel. For programs that use that, you need a professional card to get 10-bit from the GPU driver. The GPU output has nothing to do with this!
Since Windows 7 we have had 10-bit output to the GPU using DirectX 11 on any consumer card; you just need a fullscreen exclusive D3D11 surface. That is obviously not usable for Photoshop, but it works easily for a game or a video renderer. High-bit-depth output is a well-known feature that madVR has used for years.
Do you even read your links? https://nvidia.custhelp.com/app/answ...11/kw/10%20bit
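Roughly what that looks like in code; a minimal sketch using standard D3D11/DXGI calls (error handling omitted, `hwnd` assumed to be an existing window, resolution hard-coded for illustration):

```cpp
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

// Create a 10-bit-per-channel swap chain on any consumer GPU.
// DXGI_FORMAT_R10G10B10A2_UNORM gives 10 bits per colour component;
// in fullscreen exclusive mode the driver can scan it out directly.
bool create_10bit_swapchain(HWND hwnd, IDXGISwapChain** swapchain,
                            ID3D11Device** device, ID3D11DeviceContext** context) {
    DXGI_SWAP_CHAIN_DESC desc = {};
    desc.BufferDesc.Width = 3840;
    desc.BufferDesc.Height = 2160;
    desc.BufferDesc.RefreshRate = { 60, 1 };
    desc.BufferDesc.Format = DXGI_FORMAT_R10G10B10A2_UNORM;  // 10-bit backbuffer
    desc.SampleDesc = { 1, 0 };
    desc.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    desc.BufferCount = 2;
    desc.OutputWindow = hwnd;
    desc.Windowed = FALSE;  // fullscreen exclusive, as described above
    desc.SwapEffect = DXGI_SWAP_EFFECT_DISCARD;

    HRESULT hr = D3D11CreateDeviceAndSwapChain(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        nullptr, 0, D3D11_SDK_VERSION,
        &desc, swapchain, device, nullptr, context);
    return SUCCEEDED(hr);
}
```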
#54275 | Link
Registered User
Join Date: Dec 2002
Posts: 5,494
madVR dithers to the bit depth you set for your display device in the madVR settings. What your GPU driver does with that is a different question. Usually madVR should dither to 10-bit, because it does its processing (like scaling) in higher precision regardless of the bit depth of the source.
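To illustrate the principle (a toy sketch with plain random dither; madVR's actual algorithms, like error diffusion, are more sophisticated): quantising a high-precision value with noise preserves the average level that hard rounding would destroy.

```cpp
#include <cstdint>
#include <cstdio>
#include <random>

// Quantise a high-precision [0,1] pixel value to `bits` bits, adding
// +/- 0.5 LSB of random noise first so quantisation error becomes
// uncorrelated noise instead of visible banding.
uint16_t dither_to_bits(float value, int bits, std::mt19937& rng) {
    const float max_code = float((1u << bits) - 1);
    std::uniform_real_distribution<float> noise(-0.5f, 0.5f);
    float code = value * max_code + noise(rng);
    if (code < 0.0f) code = 0.0f;
    if (code > max_code) code = max_code;
    return static_cast<uint16_t>(code + 0.5f);  // round to nearest code
}

int main() {
    std::mt19937 rng(42);
    // A value between two 8-bit codes: without dithering it would always
    // round to the same code, and a gradient would show a band there.
    float v = 100.4f / 255.0f;
    int counts[2] = {0, 0};
    for (int i = 0; i < 10000; ++i)
        ++counts[dither_to_bits(v, 8, rng) - 100];
    std::printf("code 100: %d  code 101: %d\n", counts[0], counts[1]);
    // Roughly a 60/40 split: on average the output reproduces 100.4 exactly.
}
```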
Last edited by sneaker_ger; 13th January 2019 at 20:14.
#54277 | Link
Registered User
Join Date: Dec 2014
Posts: 1,127
Quote:
This is an image from the official HDMI 2.0 spec page: https://i.postimg.cc/WpfKqJRZ/HDMI-2-0-Specs.png
The majority of UHD streaming boxes and Blu-ray players output 12-bit YCbCr 4:2:2 at all refresh rates. The PC shouldn't be at a distinct disadvantage because it has no option for 10-bit RGB.
__________________
HOW TO - Set up madVR for Kodi DSPlayer & External Media Players
#54278 | Link
Registered User
Join Date: Oct 2012
Posts: 5,985
Source bit depth is irrelevant, because the source is YCbCr and converting it to RGB creates floating-point numbers (no bit depth is big enough to store them exactly).
So the madVR output bit depth mostly changes the noise level in the image madVR creates. If dithering were disabled, the difference between 8 and 10 bit would be a significant difference in banding, even with an 8-bit source.
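To make that concrete, here is a sketch of the conversion for one pixel, assuming 8-bit limited-range BT.709 (madVR handles other matrices and ranges as well). The resulting RGB values fall between integer codes, which is exactly what dithering then has to deal with:

```cpp
#include <cstdio>

// Convert one 8-bit limited-range BT.709 YCbCr pixel to normalised RGB.
// The result is fractional: it sits *between* integer RGB codes, which is
// why the output bit depth only controls how finely you quantise it.
void ycbcr709_to_rgb(int y8, int cb8, int cr8, double rgb[3]) {
    double y  = (y8  -  16) / 219.0;   // luma:   16-235 -> 0..1
    double cb = (cb8 - 128) / 224.0;   // chroma: 16-240 -> -0.5..0.5
    double cr = (cr8 - 128) / 224.0;
    rgb[0] = y + 1.5748 * cr;                  // R
    rgb[1] = y - 0.1873 * cb - 0.4681 * cr;    // G
    rgb[2] = y + 1.8556 * cb;                  // B
}

int main() {
    double rgb[3];
    ycbcr709_to_rgb(120, 110, 130, rgb);  // an ordinary 8-bit video pixel
    for (double c : rgb)
        std::printf("%.6f -> 8-bit code %.2f\n", c, c * 255.0);
    // None of the three channels lands on an integer code exactly.
}
```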
#54279 | Link
Registered User
Join Date: May 2018
Posts: 224
Quote:
If someone can show me how to get a 10-bit resolution in Windows 10 with a normal NVIDIA card, I am all ears. I am also pretty sure, across many threads here, that 8-bit RGB is usually what gets recommended. But it is a slow madVR day nowadays.
#54280 | Link
Registered User
Join Date: Oct 2012
Posts: 5,985
Quote:
Good job, HDMI organisation.
And I'm sending 10-bit just for the fun of it, with an NVIDIA card.