Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion. Before you start posting please read the forum rules. By posting to this forum you agree to abide by the rules. |
13th January 2019, 14:43 | #54261 | Link |
Registered User
Join Date: Oct 2012
Posts: 7,926
|
HDMI 2.0 can do 10-bit 4:2:2 YCbCr and TVs support it; that's one of the rare cases where Nvidia does output 10 bit.
a BD player should use 4:2:0 10-bit for 60 Hz HDR; that's lossless. I'm not sure where this 12 bit comes from, because PCs don't really support it. You could do it with 16 bit and let the driver do the rest, but still. |
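A quick back-of-envelope check (a sketch assuming HDMI 2.0's full 4400x2250 4K60 timing and its ~14.4 Gbit/s effective payload after 8b/10b coding; the helper name is made up) shows why 4:2:0 10-bit and 4:2:2 12-bit fit at 60 Hz while 10-bit RGB does not:

```python
# Rough HDMI 2.0 bandwidth check for 4K60 (back-of-envelope only).
EFFECTIVE_GBPS = 18.0 * 8 / 10      # 18 Gbit/s TMDS minus 8b/10b coding overhead
PIXELS_PER_SEC = 4400 * 2250 * 60   # full 4K60 timing, not just active video

def fits(bits_per_pixel):
    """True if the pixel format fits within HDMI 2.0 at 4K60."""
    return PIXELS_PER_SEC * bits_per_pixel / 1e9 <= EFFECTIVE_GBPS

formats = {
    "4:2:0 10-bit": 10 * 1.5,    # chroma at quarter resolution
    "4:2:2 12-bit": 12 * 2,      # chroma at half horizontal resolution
    "RGB/4:4:4 10-bit": 10 * 3,  # full-resolution chroma
}
for name, bpp in formats.items():
    print(name, "fits" if fits(bpp) else "does NOT fit")
```

With these numbers, 4:2:2 12-bit lands at ~14.26 Gbit/s, just inside the limit, which matches why players pick it for 60 Hz output.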
13th January 2019, 18:48 | #54262 | Link |
Registered User
Join Date: May 2018
Posts: 259
|
So you keep saying.
Here is the thing though: you think nVidia, out of the goodness of their hearts, have let us have the option of 12-bit to help us out? You are quite delusional. The 10-bit is for graphic designers and the like; if you choose to ignore this fact, which is all over the web, then good for you mate. Does it make any sense? No, but it is what it is. If it was that simple we would have no 8-bit, as surely nVidia would want what is best for us and would default to 12-bit straight away, with no breaking drivers every 2nd release, fluid 23.976 frame rates etc etc, and of course 12-bit staying there after a reboot unless you frig something. https://www.pugetsystems.com/labs/ar...t-Output-1221/ https://www.reddit.com/r/nvidia/comm...abling_10_bit/ https://forums.evga.com/gtx-1080-sup...-m2510043.aspx https://devtalk.nvidia.com/default/t.../post/5078463/ https://www.eizoglobal.com/support/c...18-nvidia-amd/ But let's leave that there. Last edited by madjock; 13th January 2019 at 19:24. |
13th January 2019, 20:00 | #54263 | Link |
Registered User
Join Date: Oct 2012
Posts: 7,926
|
the professional 10 bit uses OpenGL on an 8-bit surface, hacking the 2 missing bits into the alpha channel. Programs that use that need a professional card before the GPU driver enables 10 bit. The GPU output has nothing to do with this!
since Windows 7 we have had 10-bit output to the GPU using DirectX 11 on any consumer card. You just need a fullscreen exclusive D3D11 surface; that's obviously not usable for Photoshop, but it works easily with a game or a video renderer. High bit depth output is a well-known feature madVR has used for years. do you even read your links? https://nvidia.custhelp.com/app/answ...11/kw/10%20bit |
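The alpha-channel trick mentioned above can be sketched like this (a toy illustration of the packing idea only, with hypothetical helper names; not the actual OpenGL/driver mechanism): three 10-bit channels are split into an 8-bit RGB part plus their six leftover low bits stored in the alpha byte.

```python
def pack_10bit_rgb(r, g, b):
    """Pack three 10-bit values into an RGBA8 pixel: high 8 bits go in the
    colour channels, the three 2-bit remainders are combined in alpha."""
    assert all(0 <= v < 1024 for v in (r, g, b))
    alpha = ((r & 3) << 4) | ((g & 3) << 2) | (b & 3)
    return (r >> 2, g >> 2, b >> 2, alpha)

def unpack_10bit_rgb(r8, g8, b8, a8):
    """Recombine the 8-bit channels with the low bits stashed in alpha."""
    return ((r8 << 2) | ((a8 >> 4) & 3),
            (g8 << 2) | ((a8 >> 2) & 3),
            (b8 << 2) | (a8 & 3))
```

The point is only that an 8-bit RGBA surface has enough spare bits to smuggle 10-bit colour through, provided something on the other end (here: a pro driver) knows to reassemble it.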
13th January 2019, 20:12 | #54265 | Link |
Registered User
Join Date: Dec 2002
Posts: 5,565
|
madVR dithers to the bit depth you set for your display device in the madVR settings. What your GPU driver does with that is a different question. Usually madVR should dither to 10 bit because it does processing (like scaling) in higher precision, regardless of the bit depth of the source.
Last edited by sneaker_ger; 13th January 2019 at 20:14. |
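The dithering step described above can be sketched roughly like this (a toy stand-in with made-up helper names, not madVR's actual algorithm): higher-precision intermediate values are quantized to the target bit depth with a random offset, so the average level survives even though each individual pixel snaps to a code value.

```python
import random

def dither_to_bits(values, bits, seed=0):
    """Quantize float pixel values in [0, 1] to the target bit depth using
    simple random dithering (a crude stand-in for a renderer's dither)."""
    rng = random.Random(seed)
    levels = (1 << bits) - 1
    return [min(levels, int(v * levels + rng.random())) for v in values]

# A mid-grey that falls between two 8-bit code values:
pix = [0.5002] * 10000
q = dither_to_bits(pix, 8)
avg = sum(q) / len(q) / 255
# every output sample is 127 or 128, yet avg stays close to 0.5002
```

Without the random offset every pixel would snap to the same level, which is exactly how banding forms in smooth gradients.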
13th January 2019, 20:16 | #54267 | Link | |
Registered User
Join Date: Dec 2014
Posts: 1,127
|
Quote:
This is an image from the official HDMI 2.0 spec page: https://i.postimg.cc/WpfKqJRZ/HDMI-2-0-Specs.png The majority of UHD streaming boxes and Blu-ray players output 12-bit YCbCr 4:2:2 at all refresh rates. The PC shouldn't be at a distinct disadvantage just because it has no option for 10-bit RGB.
__________________
HOW TO - Set up madVR for Kodi DSPlayer & External Media Players |
|
13th January 2019, 20:18 | #54268 | Link |
Registered User
Join Date: Oct 2012
Posts: 7,926
|
source bit depth is irrelevant because it is YCbCr, and converting it to RGB creates floating-point numbers (no integer bit depth is big enough to store them exactly).
so the madVR output bit depth mostly changes the noise level in the image madVR creates. If dithering were disabled, the difference between 8 and 10 bit would show up as significantly more banding, even with an 8-bit source. |
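As a toy illustration of the point above (a sketch using the BT.709 limited-range matrix; not madVR's actual code): converting 8-bit YCbCr to RGB lands between integer code values, so no integer bit depth stores the result exactly and it has to be dithered back down to whatever depth is sent out.

```python
def ycbcr_to_rgb(y, cb, cr):
    """BT.709 limited-range 8-bit YCbCr -> RGB as floats in roughly [0, 1]."""
    yf = (y - 16) / 219.0          # expand limited-range luma
    cbf = (cb - 128) / 224.0       # centre and scale chroma
    crf = (cr - 128) / 224.0
    r = yf + 1.5748 * crf
    g = yf - 0.1873 * cbf - 0.4681 * crf
    b = yf + 1.8556 * cbf
    return r, g, b

r, g, b = ycbcr_to_rgb(100, 120, 140)
# r * 255 is not an integer: the converted value falls between 8-bit
# code values, so quantization (ideally with dithering) is unavoidable
```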
13th January 2019, 20:19 | #54269 | Link | |
Registered User
Join Date: May 2018
Posts: 259
|
Quote:
If someone can show me how to get a 10-bit resolution in Windows 10 with a normal nVidia card, I am all ears. I am also pretty sure, across many threads here, that 8-bit RGB is usually what gets recommended, but it is a slow madVR day nowadays. |
|
13th January 2019, 20:22 | #54270 | Link | |
Registered User
Join Date: Oct 2012
Posts: 7,926
|
Quote:
good job, HDMI organisation. And I'm sending 10 bit just for the fun of it with an Nvidia card. |
|
13th January 2019, 20:28 | #54271 | Link | |
Registered User
Join Date: May 2018
Posts: 259
|
Quote:
That's not a normal NVCP... show us RGB - Full, then win a prize. Last edited by madjock; 13th January 2019 at 20:34. |
|
13th January 2019, 20:37 | #54272 | Link |
Registered User
Join Date: Oct 2012
Posts: 7,926
|
do yourself and this thread a favour and move on from this topic.
nev has said everything that needs to be said about this. Sending 10-bit RGB is a non-issue with AMD cards; the Nvidia controls just aren't as flexible as AMD's in this regard. Sending 12 instead of 10 would always be fine if the processing in TVs were solid. |
13th January 2019, 20:47 | #54273 | Link | |
Registered User
Join Date: Dec 2014
Posts: 1,127
|
Quote:
__________________
HOW TO - Set up madVR for Kodi DSPlayer & External Media Players |
|
13th January 2019, 21:05 | #54274 | Link | |
Registered User
Join Date: May 2018
Posts: 259
|
Quote:
|
|
14th January 2019, 02:01 | #54275 | Link |
Registered User
Join Date: Jul 2015
Posts: 6
|
Could anyone recommend the settings that best reduce noise?
I'm wondering if high sharpness is causing more grain in the videos I'm watching. I'm using high-bitrate 4K HDR videos through MPC-BE and madVR, but some just don't look very good; for example, The Prestige looks like a grainy mess, and Infinity War isn't great either. Are there settings I should use that are simply better than the other options? I have the LAV hardware decoder set to D3D11. I'm using a GTX 1080 Ti and an 8700K, so power shouldn't be an issue. |
14th January 2019, 06:37 | #54276 | Link |
Registered User
Join Date: Mar 2009
Posts: 3,650
|
This sounds like film grain; it's meant to be there. Reduce all your sharpening (TV/madVR) and observe. You have 4K high-bitrate sources, so there is little reason to add even moderate sharpening, which may exacerbate noise.
Not sure if you looked, but madVR does have an option called 'reduce random noise' under processing -> artifact removal. Your TV itself will also have some artifact/noise removal options. |
14th January 2019, 15:36 | #54278 | Link | |
Registered User
Join Date: May 2013
Posts: 712
|
Quote:
If I have Nvidia and I want to output HDR to the TV, should the setting on the NVCP output page be 4:2:2 10-bit / 12-bit? Or does it not matter, and madVR auto-switches the screen to NVHDR, which does whatever it does, "which is??" I also have an ATI card incoming; what should my output be set to for ATI?
__________________
Ghetto | 2500k 5Ghz |
|
14th January 2019, 15:47 | #54279 | Link | |
Registered User
Join Date: Oct 2012
Posts: 7,926
|
for best results you should always send RGB if possible.
madVR doesn't care about the output side of the GPU; it just sends images to the driver, and those are always RGB. Quote:
the rest doesn't change. You should not go over 10 bit on AMD; use the chance of 10-bit RGB there for an output that is, at least in theory, untouched. |
|
14th January 2019, 21:36 | #54280 | Link |
Registered User
Join Date: May 2003
Posts: 77
|
I'm thinking about sending my Radeon 580 back. How does a GTX 1060 6GB compare to a Radeon 580 8GB? With the 580 I'm able to run NGU Anti-Alias low for chroma upscaling and Jinc with sigmoidal light & anti-ringing for image upscaling. I also have most of the image enhancement and upscaling refinement options enabled (with default values).
Last edited by glc650; 14th January 2019 at 21:52. |
Tags |
direct compute, dithering, error diffusion, madvr, ngu, nnedi3, quality, renderer, scaling, uhd upscaling, upsampling |
|
|