13th January 2019, 14:43   #54261
huhn
HDMI 2.0 can do 10-bit 4:2:2 YCbCr, and TVs support it; that's one of the rare cases where Nvidia does output 10 bit.
A BD player should use 4:2:0 10-bit for 60 Hz HDR; that's lossless.

I'm not sure where this 12-bit idea comes from, because PCs don't really support it. You could do it with 16 bit and let the driver do the rest, but still.
13th January 2019, 18:48   #54262
madjock
Quote:
Originally Posted by nevcairiel
You are just plain out wrong.
So you keep saying.

Here is the thing, though: you seem to think Nvidia, out of the goodness of their hearts, gave us the option of 12 bit to help us out. You are quite delusional.

The 10-bit support is for graphic designers and the like; if you choose to ignore this fact, which is all over the web, then good for you, mate.

Does it make any sense? No, but it is what it is. If it were that simple we would have no 8 bit, since surely Nvidia would want what is best for us and would default to 12 bit straight away, with no breaking drivers every second release, fluid 23.976 frame rates, etc. And of course 12 bit stays set after a reboot unless you frig something.

https://www.pugetsystems.com/labs/ar...t-Output-1221/
https://www.reddit.com/r/nvidia/comm...abling_10_bit/
https://forums.evga.com/gtx-1080-sup...-m2510043.aspx
https://devtalk.nvidia.com/default/t.../post/5078463/
https://www.eizoglobal.com/support/c...18-nvidia-amd/



But let's leave it there.

13th January 2019, 20:00   #54263
huhn
The professional 10-bit path uses OpenGL on an 8-bit surface and hacks the two missing bits into the alpha channel, and for programs that use it you need a professional card to get 10 bit from the GPU driver. The GPU output has nothing to do with this!

Since Windows 7 we have had 10-bit output to the GPU using DirectX 11 on any consumer card. You just need a fullscreen-exclusive D3D11 surface; that is obviously not usable for Photoshop, but it works easily for a game or a video renderer.
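For illustration, a minimal sketch of that kind of surface (my own example, not madVR's code; it assumes an existing window handle hwnd, and error handling is trimmed):

Code:
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

// Create a D3D11 device plus a fullscreen-exclusive swap chain with a
// 10-bit back buffer (DXGI_FORMAT_R10G10B10A2_UNORM).
bool Create10BitFullscreenSwapChain(HWND hwnd,
                                    ID3D11Device** device,
                                    ID3D11DeviceContext** context,
                                    IDXGISwapChain** swapChain)
{
    DXGI_SWAP_CHAIN_DESC desc = {};
    desc.BufferDesc.Width  = 3840;
    desc.BufferDesc.Height = 2160;
    desc.BufferDesc.Format = DXGI_FORMAT_R10G10B10A2_UNORM; // 10 bits per colour channel
    desc.BufferDesc.RefreshRate.Numerator   = 60;
    desc.BufferDesc.RefreshRate.Denominator = 1;
    desc.SampleDesc.Count = 1;
    desc.BufferUsage  = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    desc.BufferCount  = 2;
    desc.OutputWindow = hwnd;
    desc.Windowed     = FALSE; // fullscreen exclusive
    desc.SwapEffect   = DXGI_SWAP_EFFECT_DISCARD;

    return SUCCEEDED(D3D11CreateDeviceAndSwapChain(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        nullptr, 0, D3D11_SDK_VERSION,
        &desc, swapChain, device, nullptr, context));
}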

High-bit-depth output is a well-known madVR feature that has been in use for years.

Do you even read your links? https://nvidia.custhelp.com/app/answ...11/kw/10%20bit
13th January 2019, 20:10   #54264
iSeries
Question about madVR 10-bit output: if I have set 12 bit in the GPU driver and the source is 8 bit, does madVR dither to 8 bit or 10 bit?
13th January 2019, 20:12   #54265
sneaker_ger
madVR dithers to the bit depth you set for your display device in the madVR settings. What your GPU driver does with that is a different question. Usually madVR should dither to 10 bit, because it does processing (like scaling) at higher precision regardless of the bit depth of the source.

13th January 2019, 20:16   #54266
iSeries
So best to set up profiles for 8-bit and 10-bit content then, I guess (I am assuming it should be dithered to at least the source bit depth).
13th January 2019, 20:16   #54267
Warner306
Quote:
Originally Posted by huhn
HDMI 2.0 can do 10-bit 4:2:2 YCbCr, and TVs support it; that's one of the rare cases where Nvidia does output 10 bit.
A BD player should use 4:2:0 10-bit for 60 Hz HDR; that's lossless.

I'm not sure where this 12-bit idea comes from, because PCs don't really support it. You could do it with 16 bit and let the driver do the rest, but still.
There is no option for 10-bit 4:2:2 above 30 Hz.

This is an image from the official HDMI 2.0 spec page: https://i.postimg.cc/WpfKqJRZ/HDMI-2-0-Specs.png

The majority of UHD streaming boxes and Blu-ray players output 12-bit YCbCr 4:2:2 at all refresh rates. The PC shouldn't be at a distinct disadvantage because it has no option for 10-bit RGB.
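As a rough sanity check on the bandwidth side (my own arithmetic, assuming the usual 18 Gbit/s TMDS figure, 8b/10b encoding, and the standard 4400x2250 total timing for 3840x2160):

Code:
#include <cstdio>

// Back-of-the-envelope HDMI 2.0 bandwidth check for 4K60.
// Assumed figures: 18 Gbit/s raw TMDS, 8b/10b encoding,
// CEA-861 total timing 4400x2250 (594 MHz pixel clock).
int main()
{
    const double usable_bps = 18.0e9 * 8.0 / 10.0;     // ~14.4 Gbit/s after 8b/10b
    const double pixel_rate = 4400.0 * 2250.0 * 60.0;  // 594e6 pixels/s

    // Average bits per pixel: 4:2:0 10-bit = 10 * 1.5; 4:2:2 12-bit = 12 * 2.
    printf("4:2:0 10-bit @ 4K60: %5.2f Gbit/s\n", pixel_rate * 15.0 / 1e9); // ~8.91
    printf("4:2:2 12-bit @ 4K60: %5.2f Gbit/s\n", pixel_rate * 24.0 / 1e9); // ~14.26
    printf("usable bandwidth:    %5.2f Gbit/s\n", usable_bps / 1e9);        // 14.40
    return 0;
}

Both formats fit at 60 Hz, which suggests the missing 10-bit 4:2:2 entry is a spec-table and driver-option limitation rather than a bandwidth one.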
13th January 2019, 20:18   #54268
huhn
Source bit depth is irrelevant, because the source is YCbCr, and converting it to RGB creates floating-point numbers (no bit depth is big enough to store them exactly).

So the madVR output bit depth mostly changes the noise level in the image madVR creates. If dithering were disabled, the difference between 8 and 10 bit would be a significant difference in banding, even with an 8-bit source.
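To make that concrete, here is a toy sketch (not madVR's actual algorithm; madVR uses far higher-quality dithering) of quantizing a floating-point value to N bits, with optional dither that trades banding for fine noise:

Code:
#include <cstdint>
#include <cmath>
#include <random>

// Quantize v (in [0, 1]) to an unsigned code with 'bits' bits.
// Without dither, rounding produces visible banding on smooth gradients;
// with dither, the same error shows up as fine noise instead.
uint16_t quantize(float v, int bits, bool dither, std::mt19937& rng)
{
    const float maxCode = float((1 << bits) - 1);
    float scaled = v * maxCode;
    if (dither) {
        std::uniform_real_distribution<float> u(-0.5f, 0.5f);
        scaled += u(rng) + u(rng); // triangular PDF spanning +/- 1 code step
    }
    float q = std::round(scaled);
    if (q < 0.0f) q = 0.0f;
    if (q > maxCode) q = maxCode;
    return uint16_t(q);
}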
13th January 2019, 20:19   #54269
madjock
Quote:
Originally Posted by huhn
The professional 10-bit path uses OpenGL on an 8-bit surface and hacks the two missing bits into the alpha channel, and for programs that use it you need a professional card to get 10 bit from the GPU driver. The GPU output has nothing to do with this!

Since Windows 7 we have had 10-bit output to the GPU using DirectX 11 on any consumer card. You just need a fullscreen-exclusive D3D11 surface; that is obviously not usable for Photoshop, but it works easily for a game or a video renderer.

High-bit-depth output is a well-known madVR feature that has been in use for years.

Do you even read your links? https://nvidia.custhelp.com/app/answ...11/kw/10%20bit
I don't get why I'm being jumped on when it is quite obvious we can't select 10 bit. Whatever can or cannot be done, it is obviously easier to get personal.

If someone can show me how to get a 10-bit resolution in Windows 10 with a normal Nvidia card, I am all ears. I am also pretty sure that across many threads here 8-bit RGB is almost always what gets recommended, but it is a slow madVR day nowadays.
13th January 2019, 20:22   #54270
huhn
Quote:
Originally Posted by Warner306
There is no option for 10-bit 4:2:2 above 30 Hz.

This is an image from the official HDMI 2.0 spec page: https://i.postimg.cc/WpfKqJRZ/HDMI-2-0-Specs.png

The majority of UHD streaming boxes and Blu-ray players output 12-bit YCbCr 4:2:2 at all refresh rates. The PC shouldn't be at a distinct disadvantage because it has no option for 10-bit RGB.
Well, no: https://abload.de/img/wellno2ikpj.png
Good job, HDMI organisation.

And I'm sending 10 bit just for the fun of it, with an Nvidia card.
13th January 2019, 20:28   #54271
madjock
Quote:
Originally Posted by huhn
Well, no: https://abload.de/img/wellno2ikpj.png
Good job, HDMI organisation.

And I'm sending 10 bit just for the fun of it, with an Nvidia card.

That's not a normal NVCP... show us RGB Full, then win a prize.

13th January 2019, 20:37   #54272
huhn
Do yourself and this thread a favour and move on from this topic.

nev has said everything that needs to be said about this.
Sending 10-bit RGB is a non-issue with AMD cards; the Nvidia controls are just not as flexible as AMD's in this regard.
Sending 12 bit instead of 10 would always be fine if the processing in TVs were solid.
13th January 2019, 20:47   #54273
Warner306
Quote:
Originally Posted by huhn
Well, no: https://abload.de/img/wellno2ikpj.png
Good job, HDMI organisation.

And I'm sending 10 bit just for the fun of it, with an Nvidia card.
That doesn't make much sense. It should be an option, but most UHD players output 12-bit YCbCr 4:2:2.
13th January 2019, 21:05   #54274
madjock
Quote:
Originally Posted by huhn
Do yourself and this thread a favour and move on from this topic.

nev has said everything that needs to be said about this.
Sending 10-bit RGB is a non-issue with AMD cards; the Nvidia controls are just not as flexible as AMD's in this regard.
Sending 12 bit instead of 10 would always be fine if the processing in TVs were solid.
14th January 2019, 02:01   #54275
ryokoseigo
Could anyone recommend the settings that best lower noise?

I'm wondering if high sharpness is causing more grain in the videos I'm watching. I'm playing high-bitrate 4K HDR videos through MPC-BE and madVR, but some just don't look very good; for example, The Prestige looks like a grainy mess, and Infinity War isn't great either.

Are there settings I should use that are simply better than the other options? I have the LAV hardware decoder set to D3D11.

I'm using a GTX 1080 Ti and an 8700K, so power shouldn't be an issue.
14th January 2019, 06:37   #54276
ryrynz
This sounds like film grain; it's meant to be there. Reduce all your sharpening (TV/madVR) and observe. You have high-bitrate 4K sources, so there is little reason to add even moderate sharpening, which may exacerbate noise.
Not sure if you looked, but madVR does have an option called 'reduce random noise' under processing -> artifact removal. Not sure how you missed that. Your TV itself will also have some artifact/noise-removal options.
14th January 2019, 12:34   #54277
svengun
Quote:
Originally Posted by huhn
You may want to have a look at madleveltweaker.exe
Thanks! Very handy.
14th January 2019, 15:36   #54278
tp4tissue
Quote:
Originally Posted by huhn
Do yourself and this thread a favour and move on from this topic.

nev has said everything that needs to be said about this.
Sending 10-bit RGB is a non-issue with AMD cards; the Nvidia controls are just not as flexible as AMD's in this regard.
Sending 12 bit instead of 10 would always be fine if the processing in TVs were solid.
Huhn, I am confused now.

If I have Nvidia and I want to output HDR to the TV, should my setting on the NVCP output page be 4:2:2 10-bit / 12-bit?

Or does it not matter, and madVR auto-pops the screen into NVHDR, which does whatever it does, "which is??"

I also have an ATI card incoming; what should my output be set to for ATI?

14th January 2019, 15:47   #54279
huhn
For best results you should always send RGB if possible.

madVR doesn't care about the output mode of the GPU; it just sends images to the driver, and those are always RGB.

Quote:
I also have an ATI card incoming; what should my output be set to for ATI?
Sorry to hear that. AMD needs 10-bit output from madVR or the AMD HDR API will not work.
The rest doesn't change. You should not go over 10 bit on AMD; use the chance of 10-bit RGB on them for an (at least in theory) untouched output.
14th January 2019, 21:36   #54280
glc650
I'm thinking about sending my Radeon 580 back. How does a GTX 1060 6GB compare to a Radeon 580 8GB? With the 580 I'm able to run NGU Anti-Alias low for chroma upscaling and Jinc with sigmoidal light & anti-ringing for image upscaling. I also have most of the image enhancement and upscaling refinement options enabled (with default values).
