Old 13th January 2019, 05:42   #54261  |  Link
XMonarchY
Registered User
 
Join Date: Jan 2014
Posts: 489
Why is it that none of these refresh rate tools and utilities have color depth entries?

There is also the good old Windows 10 versioning issue. In the latest 1809 (build 17763) there is no option for me to select 12-bit in any refresh rate mode other than 60Hz and 59Hz, but in 1607 (build 14393) there definitely was 12-bit available for the 23-24Hz range.

The madVR mode tool sets 1809 to 24Hz 8-bit and 1607 to 24Hz BLANK bit (presumably BLANK bit = 12-bit).
__________________
8700K @ 5Ghz | ASUS Z370 Hero X | Corsair 16GB @ 3200Mhz | RTX 2080 Ti @ 2100Mhz | Samsung 970 NVMe 250GB | WD Black 2TB | Corsair AX 850W | LG 32GK850G-B @ 165Hz | Xonar DGX | Windows 10 LTSC 1809
XMonarchY is offline   Reply With Quote
Old 13th January 2019, 10:19   #54262  |  Link
madjock
Registered User
 
Join Date: May 2018
Posts: 223
Quote:
Originally Posted by XMonarchY View Post
Why is it that none of these refresh rate tools and utilities have color depth entries?

There is also the good old Windows 10 versioning issue. In the latest 1809 (build 17763) there is no option for me to select 12-bit in any refresh rate mode other than 60Hz and 59Hz, but in 1607 (build 14393) there definitely was 12-bit available for the 23-24Hz range.

The madVR mode tool sets 1809 to 24Hz 8-bit and 1607 to 24Hz BLANK bit (presumably BLANK bit = 12-bit).
I can select 12-bit in the NVIDIA Control Panel in 1809, but obviously can't get it to stick in a custom resolution in the NVCP.

I am using 416.34.
madjock is offline   Reply With Quote
Old 13th January 2019, 11:57   #54263  |  Link
madjock
Registered User
 
Join Date: May 2018
Posts: 223
Quote:
Originally Posted by nevcairiel View Post
This is just wrong. You can get 10-bit/12-bit output on any modern GPU, depending on monitor support.

It is far from wrong. It is stated everywhere, on the NVIDIA side anyway, that you need a Pro card like a Quadro to get 10-bit, hence why we can only select 8 or 12-bit. 10-bit they have reserved for DirectX games, as far as I can read up on it.


Quote:
While this is true for now, it's also irrelevant. We use 12-bit because that's what NVIDIA gives you with many HDMI TVs. It's either 8 or 12, and if you can output 10-bit with madVR, then clearly 12 is the only mode that actually preserves those 10 bits, because 8 would reduce it.
The majority of people, myself included, have an 8-bit TV with FRC to make it 10-bit, so am I going to see an improvement going from 12-bit to 10-bit on a fake 10-bit TV anyway? Are we not all talking about dithering in one way or another? Would we see any extra colours? Again, all I can read is that it comes down to banding issues, so I really have no idea if it is better to stay at 8-bit, as we could never tell anyway.

Last edited by madjock; 13th January 2019 at 13:46.
madjock is offline   Reply With Quote
Old 13th January 2019, 12:17   #54264  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 9,817
Quote:
Originally Posted by madjock View Post
It is far from wrong; it is stated everywhere by NVIDIA that you need a Pro card like a Quadro to get 10-bit, hence why we can only select 8 or 12-bit.
You are just plain out wrong. If you have a monitor which only accepts 10-bit, and not 12-bit, then NVIDIA will offer you 10-bit output on consumer cards. I know, because I'm looking at it right this second with just such a monitor.

Please stop spreading your half-knowledge. The only thing you need a Quadro for is 10-bit OpenGL output, which exists for some weird historical reasons.
I encourage you to read up on this topic, which has been discussed countless times in this very thread already. You don't need a Pro card for 10-bit. And what logic would that even be? Why would 10-bit be limited to Pro cards, but 12-bit be allowed? Clearly 12 is superior to 10 if it's fully supported, since it's even more bits.

The reason you get 12-bit on HDMI TVs is that the driver only offers the highest bit depth the display supports, which in that case is 12. It has nothing to do with "Pro" at all. If you have a screen which only supports 10-bit, like some PC monitors, then it'll offer 10-bit output in the control panel. There is absolutely no "Pro" limitation whatsoever here. 10-bit or 12-bit are used for proper HDR output even by the OS itself, as well as by games, and people don't generally game on Pro cards.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders

Last edited by nevcairiel; 13th January 2019 at 12:24.
nevcairiel is offline   Reply With Quote
Old 13th January 2019, 12:58   #54265  |  Link
el Filou
Registered User
 
el Filou's Avatar
 
Join Date: Oct 2016
Posts: 536
Quote:
Originally Posted by nevcairiel View Post
10-bit or 12-bit are used for proper HDR output even by the OS itself, as well as games.
Do you mean Windows is able to 'force' 10-bit output for native OS HDR even if NVIDIA doesn't make the option available in its control panel? Or is the OS HDR mode ultimately converted to the output setting that is selected in NVIDIA's control panel?
__________________
HTPC: Windows 10 1809, MediaPortal 1, LAV Filters, ReClock, madVR. DVB-C TV, Panasonic GT60, 6.0 speakers Denon 2310, Core 2 Duo E7400, GeForce 1050 Ti
el Filou is offline   Reply With Quote
Old 13th January 2019, 13:08   #54266  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,882
No, it just sends an image to the GPU driver, and the GPU does whatever it wants with it. The driver has to make the image fit the signal needed for the end device.

But you can always throw 10-bit or even 16-bit at the driver, even on an 8-bit screen.

10-bit output is pretty important for games; game developers never learned or cared about dithering and high bit depth buffering, so banding in skies is just the norm, even in games that render in 10-bit.

And to simply put the nail in the coffin: AMD can send 10-bit, and AMD can even do 6-bit.
huhn is offline   Reply With Quote
Old 13th January 2019, 13:40   #54267  |  Link
oldpainlesskodi
Registered User
 
Join Date: Apr 2017
Posts: 178
Re NVIDIA: there is a way to get 12-bit out and have it survive a reboot with later drivers.

Use CRU to create the custom resolution, then go into the NVCP and select "use default colour settings". However, you're not done yet, as this will change the output to Limited range. So next, download this tool to force Full RGB range for everything (link: http://blog.metaclassofnil.com/wp-co...angeToggle.zip), extract it, run it, select force Full range, and reboot. All good to go now for a custom resolution at 23.976 with full range and 12-bit.

Job done.
__________________
Sapphire RX 5700 XT (19.7.5) Ryzen 7 3700x, PRIME X570-Pro, Win 10 x64 (1903), Silverstone LC13B-E, Pioneer SC-LX501 Elite D3, Samsung UE55KS8000, Mission M33i speakers, Kodi Dsplayer 17.6 X64

Last edited by oldpainlesskodi; 13th January 2019 at 13:49.
oldpainlesskodi is offline   Reply With Quote
Old 13th January 2019, 13:44   #54268  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,882
You may want to have a look at madLevelsTweaker.exe.
huhn is offline   Reply With Quote
Old 13th January 2019, 13:45   #54269  |  Link
oldpainlesskodi
Registered User
 
Join Date: Apr 2017
Posts: 178
Yeah, but I wasn't sure if it merely enabled Full range or forced Full range for everything. Good to know, though.
__________________
Sapphire RX 5700 XT (19.7.5) Ryzen 7 3700x, PRIME X570-Pro, Win 10 x64 (1903), Silverstone LC13B-E, Pioneer SC-LX501 Elite D3, Samsung UE55KS8000, Mission M33i speakers, Kodi Dsplayer 17.6 X64
oldpainlesskodi is offline   Reply With Quote
Old 13th January 2019, 14:29   #54270  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,122
12-bit YCbCr 4:2:2 is all that is available to Blu-ray players at 60 Hz as part of the HDMI 2.0 spec; there is no 10-bit YCbCr 4:2:2. So the UHD format was designed around 12-bit output. I doubt 10 bits would make much of a difference unless the GPU is doing a poor conversion. Current UHD displays are supposed to be designed for 12-bit YCbCr 4:2:2.
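As a rough back-of-the-envelope illustration of the bandwidth side of this (assuming the standard CTA-861 4K60 timing of 4400x2250 total, HDMI 2.0's 600 MHz TMDS clock ceiling, and that HDMI carries YCbCr 4:2:2 in a fixed-size container so its clock does not rise with bit depth; a sketch, not spec text):

Code:
// Why 12-bit 4:2:2 fits at 4K60 on HDMI 2.0 while 10/12-bit RGB does not.
#include <cstdio>

int main()
{
    const double pixel_clock_mhz  = 4400.0 * 2250.0 * 60.0 / 1e6; // 594 MHz
    const double hdmi20_limit_mhz = 600.0;

    printf("HDMI 2.0 limit     : %.1f MHz\n", hdmi20_limit_mhz);
    // RGB / 4:4:4 "deep colour" scales the TMDS clock with bit depth.
    printf("RGB 8-bit          : %.1f MHz (fits)\n",     pixel_clock_mhz);
    printf("RGB 10-bit         : %.1f MHz (too high)\n", pixel_clock_mhz * 10 / 8);
    printf("RGB 12-bit         : %.1f MHz (too high)\n", pixel_clock_mhz * 12 / 8);
    // YCbCr 4:2:2 (up to 12-bit) stays at the pixel clock.
    printf("YCbCr 4:2:2 12-bit : %.1f MHz (fits)\n",     pixel_clock_mhz);
    return 0;
}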
Warner306 is offline   Reply With Quote
Old 13th January 2019, 14:43   #54271  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,882
HDMI 2.0 can do 10-bit 4:2:2 YCbCr and TVs support it; that's one of the rare cases where NVIDIA does output 10-bit.
A BD player should use 4:2:0 10-bit for 60 Hz HDR; that's lossless.

I'm not sure where this 12-bit comes from, because PCs don't really support it. You could do it with 16-bit and let the driver do the rest, but still.
huhn is offline   Reply With Quote
Old 13th January 2019, 18:48   #54272  |  Link
madjock
Registered User
 
Join Date: May 2018
Posts: 223
Quote:
Originally Posted by nevcairiel View Post
You are just plain out wrong.
So you keep saying.

Here is the thing though: if you think NVIDIA, out of the goodness of their hearts, have given us the option of 12-bit just to help us out, you are quite delusional.

The 10-bit is for graphic designers and the like; if you choose to ignore this fact, which is all over the web, then good for you mate.

Does it make any sense? No, but it is what it is. If it was that simple we would have no 8-bit, as surely NVIDIA would want what is best for us and it would default to 12-bit straight away; just like they never break drivers every second release, we get fluid 23.976 frame rates, etc., and of course 12-bit stays there after a reboot unless you frig something.

https://www.pugetsystems.com/labs/ar...t-Output-1221/
https://www.reddit.com/r/nvidia/comm...abling_10_bit/
https://forums.evga.com/gtx-1080-sup...-m2510043.aspx
https://devtalk.nvidia.com/default/t.../post/5078463/
https://www.eizoglobal.com/support/c...18-nvidia-amd/



But let's leave it there.

Last edited by madjock; 13th January 2019 at 19:24.
madjock is offline   Reply With Quote
Old 13th January 2019, 20:00   #54273  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,882
The professional 10-bit uses OpenGL on an 8-bit surface and hacks the 2 missing bits into the alpha channel, and for programs that use that, you need a professional card to get 10-bit from the GPU driver. The GPU output has nothing to do with this!

Since Windows 7 we have had 10-bit output to the GPU using DirectX 11 on any consumer card. You just need a fullscreen exclusive D3D11 surface, which is obviously not usable for Photoshop, but is easily done by a game or a video renderer.
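For illustration, a minimal hypothetical sketch of that D3D11 path: requesting an exclusive-fullscreen swap chain with a 10-bit (R10G10B10A2) back buffer on a consumer card. The resolution, refresh rate and function name are placeholders, and real code would also need a render loop and error handling. This is not madVR's actual code.

Code:
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

HRESULT Create10BitFullscreenSwapChain(HWND hwnd,
                                       ID3D11Device** device,
                                       ID3D11DeviceContext** context,
                                       IDXGISwapChain** swapchain)
{
    DXGI_SWAP_CHAIN_DESC desc = {};
    desc.BufferCount                        = 2;
    desc.BufferDesc.Width                   = 3840;
    desc.BufferDesc.Height                  = 2160;
    desc.BufferDesc.Format                  = DXGI_FORMAT_R10G10B10A2_UNORM; // 10 bits per channel
    desc.BufferDesc.RefreshRate.Numerator   = 24000;                          // 23.976 Hz
    desc.BufferDesc.RefreshRate.Denominator = 1001;
    desc.SampleDesc.Count                   = 1;
    desc.BufferUsage                        = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    desc.OutputWindow                       = hwnd;
    desc.Windowed                           = FALSE;                          // exclusive fullscreen
    desc.SwapEffect                         = DXGI_SWAP_EFFECT_DISCARD;

    D3D_FEATURE_LEVEL fl;
    return D3D11CreateDeviceAndSwapChain(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        nullptr, 0, D3D11_SDK_VERSION,
        &desc, swapchain, device, &fl, context);
}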

High bit depth output is a well-known feature of madVR that has been in use for years.

Did you even read your own links? https://nvidia.custhelp.com/app/answ...11/kw/10%20bit
huhn is offline   Reply With Quote
Old 13th January 2019, 20:10   #54274  |  Link
iSeries
Registered User
 
Join Date: Jan 2009
Posts: 603
Question about madVR 10-bit output: if I have set 12-bit in the GPU driver, and the source is 8-bit, does madVR dither to 8-bit or 10-bit?
iSeries is offline   Reply With Quote
Old 13th January 2019, 20:12   #54275  |  Link
sneaker_ger
Registered User
 
Join Date: Dec 2002
Posts: 5,494
madVR dithers to the bit depth you set for your display device in the madVR settings. What your GPU driver does with that is a different question. Usually madVR should dither to 10-bit, because it does its processing (like scaling) in higher precision regardless of the bit depth of the source.

Last edited by sneaker_ger; 13th January 2019 at 20:14.
sneaker_ger is offline   Reply With Quote
Old 13th January 2019, 20:16   #54276  |  Link
iSeries
Registered User
 
Join Date: Jan 2009
Posts: 603
So it's best to set up profiles for 8-bit and 10-bit content then, I guess (I am assuming it should be dithered to at least the source bit depth).
iSeries is offline   Reply With Quote
Old 13th January 2019, 20:16   #54277  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,122
Quote:
Originally Posted by huhn View Post
HDMI 2.0 can do 10-bit 4:2:2 YCbCr and TVs support it; that's one of the rare cases where NVIDIA does output 10-bit.
A BD player should use 4:2:0 10-bit for 60 Hz HDR; that's lossless.

I'm not sure where this 12-bit comes from, because PCs don't really support it. You could do it with 16-bit and let the driver do the rest, but still.
There is no option for 10-bit 4:2:2 above 30 Hz.

This is an image from the official HDMI 2.0 spec page: https://i.postimg.cc/WpfKqJRZ/HDMI-2-0-Specs.png

The majority of UHD streaming boxes and Blu-ray players output at 12-bit YCbCr 4:2:2 at all refresh rates. The PC shouldn't be at a distinct disadvantage because it has no option for 10-bit RGB.
Warner306 is offline   Reply With Quote
Old 13th January 2019, 20:18   #54278  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,882
The source bit depth is irrelevant, because it is YCbCr and converting it to RGB creates floating-point numbers (no bit depth is big enough to store the exact values).

So the madVR output bit depth mostly changes the noise level in the image madVR creates. If dithering were disabled, the difference between 8-bit and 10-bit would be a significant difference in banding, even with an 8-bit source.
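As a rough illustration of both points (standard BT.709 limited-range coefficients, and a plain random dither for the example rather than whatever madVR actually uses):

Code:
#include <algorithm>
#include <cstdint>
#include <cstdio>
#include <random>

struct RGB { double r, g, b; };

// Limited-range 8-bit BT.709 YCbCr to normalised RGB in [0,1].
// The result is fractional, so the 8-bit source depth does not survive the matrix.
RGB ycbcr709_to_rgb(uint8_t y, uint8_t cb, uint8_t cr)
{
    const double Y = (y  -  16) / 219.0;
    const double B = (cb - 128) / 224.0;
    const double R = (cr - 128) / 224.0;
    return { Y + 1.5748 * R,
             Y - 0.1873 * B - 0.4681 * R,
             Y + 1.8556 * B };
}

// Quantise a [0,1] value to `bits` of depth, optionally adding random dither
// before rounding so the error becomes noise instead of banding.
int quantise(double v, int bits, bool dither, std::mt19937& rng)
{
    const double max_code = double((1u << bits) - 1);
    std::uniform_real_distribution<double> noise(-0.5, 0.5);
    double code = v * max_code + (dither ? noise(rng) : 0.0);
    return int(std::clamp(code + 0.5, 0.0, max_code));
}

int main()
{
    std::mt19937 rng(1);
    RGB px = ycbcr709_to_rgb(100, 120, 150);  // arbitrary in-range sample
    printf("R=%.6f G=%.6f B=%.6f\n", px.r, px.g, px.b);          // fractional values
    printf("8-bit, no dither : %d\n", quantise(px.r, 8,  false, rng));
    printf("8-bit, dithered  : %d\n", quantise(px.r, 8,  true,  rng));
    printf("10-bit, dithered : %d\n", quantise(px.r, 10, true,  rng));
    return 0;
}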
huhn is offline   Reply With Quote
Old 13th January 2019, 20:19   #54279  |  Link
madjock
Registered User
 
Join Date: May 2018
Posts: 223
Quote:
Originally Posted by huhn View Post
The professional 10-bit uses OpenGL on an 8-bit surface and hacks the 2 missing bits into the alpha channel, and for programs that use that, you need a professional card to get 10-bit from the GPU driver. The GPU output has nothing to do with this!

Since Windows 7 we have had 10-bit output to the GPU using DirectX 11 on any consumer card. You just need a fullscreen exclusive D3D11 surface, which is obviously not usable for Photoshop, but is easily done by a game or a video renderer.

High bit depth output is a well-known feature of madVR that has been in use for years.

Did you even read your own links? https://nvidia.custhelp.com/app/answ...11/kw/10%20bit
I don't get why I am getting jumped on when it is quite obvious we can't select 10-bit. Whatever can or cannot be done, it is obviously easier to get personal.

If someone can show me how to get a 10-bit resolution in Windows 10 with a normal NVIDIA card, I am all ears. I am also pretty sure that across many threads here, 8-bit RGB is usually what gets recommended. But it is a slow madVR day nowadays.
madjock is offline   Reply With Quote
Old 13th January 2019, 20:22   #54280  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,882
Quote:
Originally Posted by Warner306 View Post
There is no option for 10-bit 4:2:2 above 30 Hz.

This is an image from the official HDMI 2.0 spec page: https://i.postimg.cc/WpfKqJRZ/HDMI-2-0-Specs.png

The majority of UHD streaming boxes and Blu-ray players output at 12-bit YCbCr 4:2:2 at all refresh rates. The PC shouldn't be at a distinct disadvantage because it has no option for 10-bit RGB.
Well, no: https://abload.de/img/wellno2ikpj.png
Good job, HDMI organisation.

And I'm sending 10-bit just for the fun of it with an NVIDIA card.
huhn is offline   Reply With Quote