Old 25th January 2019, 13:37   #54461  |  Link
tp4tissue
Registered User
 
 
Join Date: May 2013
Posts: 408
Quote:
Originally Posted by Asmodian View Post
There are three places where conversions can happen before the display and they do not interact. The display can be doing conversions internally too, but we cannot control these.

1) LAV video can do a conversion if the render does not support the source format. When using madVR this should never be needed and it is not ideal.
2) madVR will output RGB at whatever bitdepth is set.
3) The GPU driver will do a conversion to whatever you set in its control panel.

madVR will never change what it outputs based on what the GPU is set to and the GPU will never change its output based on what madVR sends it. All these steps simply convert any input to their configured output format without a lot of smarts involved. For example the GPU is happy to convert madVR's 8 bit output to 10 bit. You know for sure what the GPU is outputting based on what it is set to in its drivers.

Also, I can send my TV 8 bit RGB HDR and it will say HDR10, that is just a brand label for the metadata standard, it works with 8 bit RGB too.
So if the Nvidia CP is set to 8-bit RGB, the TV is popped into NVHDR, and madVR is set to 10 bit.

What am I seeing, still 8-bit HDR?
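(A toy sketch of exactly that setup - the stage behaviour is as Asmodian describes above, but the code itself is illustrative Python, not real madVR or driver internals:)

Code:
# Each stage converts whatever it receives to its own configured output
# format, knowing nothing about the other stages.
def stage(label: str, in_bits: int, out_bits: int) -> int:
    print(f"{label}: {in_bits} bit in -> {out_bits} bit out")
    return out_bits

bits = 10                                        # 10-bit HDR source
bits = stage("madVR (set to 10 bit)", bits, 10)  # madVR dithers RGB to 10 bit
bits = stage("GPU driver (NV CP: 8 bit RGB)", bits, 8)
print(f"TV receives {bits} bit")                 # -> 8, whatever madVR sent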
__________________
Ghetto | 2500k 5Ghz
Old 25th January 2019, 13:40   #54462  |  Link
iSeries
Registered User
 
Join Date: Jan 2009
Posts: 612
Quote:
Originally Posted by tp4tissue View Post
So if the Nvidia CP is set to 8-bit RGB, the TV is popped into NVHDR, and madVR is set to 10 bit.

What am I seeing, still 8-bit HDR?
8 bit. Whatever the GPU output format is set to is what your TV is receiving (the first line in the madVR stats will tell you).
Old 25th January 2019, 13:41   #54463  |  Link
madjock
Registered User
 
Join Date: May 2018
Posts: 224
Quote:
Originally Posted by tp4tissue View Post
So if the Nvidia CP is set to 8-bit RGB, the TV is popped into NVHDR, and madVR is set to 10 bit.

What am I seeing, still 8-bit HDR?
Yes. You can get Nvidia to 12 bit with more or less difficulty depending on which drivers you use; whether that does anything you can see is another matter.
Old 25th January 2019, 13:52   #54464  |  Link
iSeries
Registered User
 
Join Date: Jan 2009
Posts: 612
Quote:
Originally Posted by madjock View Post
whether that does anything you can see is another matter.
Indeed, and it may actually do more harm than good anyway - on my TV (LG C8) there is definitely more banding at 12 bit than at 8 bit.
Old 25th January 2019, 15:54   #54465  |  Link
tp4tissue
Registered User
 
 
Join Date: May 2013
Posts: 408
Quote:
Originally Posted by iSeries View Post
8 bit. Whatever the GPU output format is set to is what your TV is receiving (the first line in the madVR stats will tell you).
Quote:
Originally Posted by madjock View Post
Yes. You can get Nvidia to 12 bit with more or less difficulty depending on which drivers you use; whether that does anything you can see is another matter.
Quote:
Originally Posted by iSeries View Post
Indeed, and it may actually do more harm than good anyway - on my TV (LG C8) there is definitely more banding at 12 bit than at 8 bit.


OK guys, thanks, I understand now. My TV has been lying to me, that cheatin' whore.


Which driver are y'all using for 12 bit? (Nvidia)


Does 10 bit work on all drivers? (Nvidia)
__________________
Ghetto | 2500k 5Ghz

Last edited by tp4tissue; 25th January 2019 at 15:57.
Old 25th January 2019, 16:57   #54466  |  Link
iSeries
Registered User
 
Join Date: Jan 2009
Posts: 612
Quote:
Originally Posted by tp4tissue View Post
OK guys, thanks, I understand now. My TV has been lying to me, that cheatin' whore.


Which driver are y'all using for 12 bit? (Nvidia)


Does 10 bit work on all drivers? (Nvidia)
385.28 probably works easiest if you are using a custom resolution. With later drivers (not sure when it started), madVR could only create 8-bit custom resolutions, which means you need CRU to create it. I found 416.35 to work great if not using a custom resolution (or if creating an 8-bit custom res with madVR, or a 12-bit custom res with CRU). I believe the latest driver will only kick into HDR if madVR sends 10 bit, regardless of whether the GPU driver is set to 12 bit or 8 bit.

You should look carefully and compare 12 bit vs 8 bit. Many TVs will show colour banding with 12-bit input that isn't there when given 8 bit.

Last edited by iSeries; 25th January 2019 at 17:40.
Old 25th January 2019, 17:24   #54467  |  Link
Alexkral
Registered User
 
Join Date: Oct 2018
Posts: 139
Quote:
Originally Posted by HillieSan View Post
Stop using SVP and set the right refresh rate on your display and your graphics card. This is my experience and it works well.
The fact is that I really like the enhanced temporal resolution. It feels a bit weird at first, but once you get used to it you really appreciate the improvement (though I admit that for some content it will always look too weird). I have even modified the base AviSynth script to get rid of most artifacts.
For HDR, the problem is that it's hard to know where to start. Obviously it's not very reasonable to ask madshi to support tone mapping for input that doesn't identify itself as HDR, and ffdshow is no longer being developed, so there's no easy solution. I'm now testing DmitriRender and it seems better than SVP in both performance and visual quality. Unfortunately it doesn't support HDR metadata passthrough either, but at least the developer says he's working on it.
Old 25th January 2019, 17:41   #54468  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,985
Just frame match and let the TV do it.

Doing 5:1 instead of 5:2 interpolation is massively better anyway, even though I would use neither.
Old 25th January 2019, 17:48   #54469  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Quote:
Originally Posted by Dogway View Post
Just realised that NGU causes quite a lot of coil whine, even when using it only for chroma. I bought a 1070 on Black Friday and didn't notice until I enabled NGU for my monitor the other day. What puzzled me is that I do GPU-intensive work and never had coil whine until this. I did a comparison render on the GPU with Redshift, and the GPU-Z report is the same as with NGU. With further testing I found that the trigger was Memory Controller Load: at 25% it makes the buzzing noise, below or above it doesn't. The problem is that NGU very high (above 25%) heats my card like a toaster (I'm running it in a mATX case).
Coil whine is a defect of the fan, so NGU can exacerbate it, but not cause it. Using NGU very high can push some GPUs into overdrive.
Old 25th January 2019, 17:51   #54470  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Quote:
Originally Posted by madjock View Post
Yes, but there are no numbers on this?

I did try the HDR set you mention, but I could not get the clipping patterns to work - or rather, I could not get my TV to adjust once in HDR mode. I'm unsure if this was me, or whether adjusting for black crush is broken on my TV in HDR mode, as I've read is common.
The numbers are at the top and bottom of the pattern. That pattern is really only for those with 8-bit displays. It is not a great pattern, but it is the only 8-bit HDR10 black clipping pattern I could find. I was able to calibrate black clipping by using it.

It is mostly useful to select the output SDR gamma curve for HDR -> SDR. With most HDR displays, it is not advisable to change the brightness or contrast controls because you can offset the display's tone mapping, which is based on the default settings for both.
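For a sense of the scale near black, here is a worked example using the SMPTE ST 2084 (PQ) EOTF. The constants are from the spec; the limited-range mapping (16-235 at 8 bit) is my assumption about how such a pattern would be encoded:

Code:
# SMPTE ST 2084 (PQ) EOTF, constants from the spec.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(e: float) -> float:
    """Normalized PQ signal (0..1) -> luminance in cd/m2 (nits)."""
    p = e ** (1 / m2)
    return 10_000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

# First steps above video black in an 8-bit limited-range pattern:
for code in range(16, 21):
    print(f"code {code} -> {pq_to_nits((code - 16) / 219):.5f} nits")

A 10-bit pattern has four code values between each of those steps, which is part of why near-black clipping is so hard to judge with only 8-bit patterns.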
Old 25th January 2019, 18:19   #54471  |  Link
madjock
Registered User
 
Join Date: May 2018
Posts: 224
Quote:
Originally Posted by Warner306 View Post
With most HDR displays, it is not advisable to change the brightness or contrast controls because you can offset the display's tone mapping, which is based on the default settings for both.
That's the part I struggle with. I have lots of presets and Movie is usually a favorite, but I'm unsure what is right or wrong. I turn off all processing and motion options, but when I tried any HDR calibration items, especially black and white, I could not achieve anything near the finesse of adjusting for clipping on an SDR display.
Old 25th January 2019, 19:53   #54472  |  Link
hannes69
Registered User
 
Join Date: Nov 2012
Posts: 71
Quote:
Originally Posted by Warner306 View Post
Coil whine is a defect of the fan, so NGU can exacerbate it, but not cause it.
No.
My passive GPU has coil whine as well, so it has nothing to do with a fan.
Additionally, the coil whine seems to be triggered by the kind of electrical load NGU causes; it doesn't depend on the strength of the load but on the 'sort' of load.
There's nothing to be done about it - apart from some nasty (and possibly ineffective) decoupling tricks with glue, plastic spray, etc.
Coil whine is part of NGU (on the hardware side).
Madshi should implement a totally new scaler that is better in all regards than all other recent scalers - and, of course, coil-whine-proof.
Old 25th January 2019, 20:28   #54473  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,717
It isn't the fan; it is the chokes, or coils, in the power delivery circuits - hence "coil whine".

The power circuits are constantly turning off and on because they use PWM to control power output. Any time this on/off cycle matches a resonant frequency or harmonic of the coils, they can vibrate audibly. Almost any load can cause coil whine if you have coils that can vibrate audibly. The GPU core and memory also use different power circuits, so it isn't surprising that memory load (power draw) causes whine if one of those coils is the one resonating.

Coil whine is not "part of NGU". NGU is simply a somewhat heavy, but not 100%, load on the card. I do not have any coil whine with NGU (or anything else) because the power circuitry on my card, or my previous one, does not resonate at any audible frequency under any load I have run so far. Don't blame NGU; blame whoever made your GPU for selecting chokes that can whine.
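A rough sketch of that resonance argument, with invented numbers (the frequencies here are assumptions for illustration, not measurements): the VRM's own switching frequency is typically ultrasonic, so what you actually hear is usually a lower-frequency load pattern, or one of its harmonics, landing in the audible band.

Code:
# Toy check of which harmonics of a periodic electrical load fall in the
# audible band. All frequencies are invented for illustration.
AUDIBLE_LOW_HZ, AUDIBLE_HIGH_HZ = 20, 20_000

def audible_harmonics(base_hz: float, n: int = 5) -> list:
    """First n harmonics of a periodic load that land in the audible band."""
    return [base_hz * k for k in range(1, n + 1)
            if AUDIBLE_LOW_HZ <= base_hz * k <= AUDIBLE_HIGH_HZ]

print(audible_harmonics(300_000))  # VRM switching itself: [] (ultrasonic)
print(audible_harmonics(4_000))    # a 4 kHz load ripple: all 5 in-band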
__________________
madVR options explained
Old 25th January 2019, 20:41   #54474  |  Link
tp4tissue
Registered User
 
 
Join Date: May 2013
Posts: 408
Quote:
Originally Posted by Asmodian View Post
It isn't the fan; it is the chokes, or coils, in the power delivery circuits - hence "coil whine".

The power circuits are constantly turning off and on because they use PWM to control power output. Any time this on/off cycle matches a resonant frequency or harmonic of the coils, they can vibrate audibly. Almost any load can cause coil whine if you have coils that can vibrate audibly. The GPU core and memory also use different power circuits, so it isn't surprising that memory load (power draw) causes whine if one of those coils is the one resonating.

Coil whine is not "part of NGU". NGU is simply a somewhat heavy, but not 100%, load on the card. I do not have any coil whine with NGU (or anything else) because the power circuitry on my card, or my previous one, does not resonate at any audible frequency under any load I have run so far. Don't blame NGU; blame whoever made your GPU for selecting chokes that can whine.

It isn't always the chokes; the core itself is susceptible to piezoelectric effects given the right frequency of operation.

Capacitors too, but usually only the big ones.
__________________
Ghetto | 2500k 5Ghz
Old 25th January 2019, 21:02   #54475  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,717
Woah! I have never heard of that - the core resonating at audible frequencies strongly enough to make noise!? It seems like that would be really bad. Do GPUs have capacitors big enough to make noise?

It wouldn't be coil whine either; the coils in that description are the chokes.

Anyway, my point still stands: this is not NGU's fault, it is simply a common defect in the hardware that can be triggered by NGU or any other GPU load.
__________________
madVR options explained
Old 25th January 2019, 21:33   #54476  |  Link
tp4tissue
Registered User
 
 
Join Date: May 2013
Posts: 408
Quote:
Originally Posted by iSeries View Post
385.28 probably works easiest if you are using a custom resolution. With later drivers (not sure when it started), madVR could only create 8-bit custom resolutions, which means you need CRU to create it. I found 416.35 to work great if not using a custom resolution (or if creating an 8-bit custom res with madVR, or a 12-bit custom res with CRU). I believe the latest driver will only kick into HDR if madVR sends 10 bit, regardless of whether the GPU driver is set to 12 bit or 8 bit.

You should look carefully and compare 12 bit vs 8 bit. Many TVs will show colour banding with 12-bit input that isn't there when given 8 bit.

Is there a 10-bit vs. 8-bit gradient test pattern confirmed to show a noticeable difference in madVR?
__________________
Ghetto | 2500k 5Ghz

Last edited by tp4tissue; 25th January 2019 at 21:36.
Old 25th January 2019, 21:38   #54477  |  Link
hannes69
Registered User
 
Join Date: Nov 2012
Posts: 71
Quote:
Originally Posted by Asmodian View Post
Anyway, my point still stands: this is not NGU's fault, it is simply a common defect in the hardware that can be triggered by NGU or any other GPU load.
Yes, you can read here and there on the net about coil whine in PSUs and GPU boards, sometimes more often with certain manufacturers and models - a kind of hardware defect, so to speak.
But many people in this thread have independently reported coil whine in combination with NGU, and many confirmed that the whine can only be triggered by NGU. So you get coil whine when using NGU at e.g. 85% GPU load, and no coil whine when using any other scaler at 85% GPU load.
So NGU seems to trigger a kind of 'load distribution' or whatever you want to call it that causes coil whine.
A perfect coil should not produce any noise, of course. But we don't live in a perfect world.
My GPU NEVER makes any noise (independent of load) with the exception of NGU (-> coil whine).
An interesting physical phenomenon.
Of course I don't blame NGU or madshi. The coil should be mute in all possible situations, but NGU seems to be a strong trigger for unknown reasons.

Last edited by hannes69; 25th January 2019 at 21:40.
Old 25th January 2019, 22:23   #54478  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,717
madVR is a very spiky load. After the buffers are full it renders one frame as fast as possible, then idles until the next frame needs to be rendered. NGU simply needs more work done by the GPU, so these load peaks last longer. I think this is why NGU seems to trigger coil whine in a way that many GPU-bound applications do not.
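To put rough numbers on that duty-cycle argument (the render times below are invented for illustration, not measurements):

Code:
# madVR-style load: a burst of rendering once per frame, idle in between.
# A heavier scaler stretches the burst; the frame interval stays fixed.
def burst_profile(render_ms: float, fps: float = 23.976):
    frame_interval_ms = 1000.0 / fps
    duty_cycle = render_ms / frame_interval_ms  # fraction of time under load
    return frame_interval_ms, duty_cycle

for scaler, render_ms in [("light scaler", 4.0), ("NGU very high", 28.0)]:
    interval, duty = burst_profile(render_ms)
    print(f"{scaler}: {render_ms} ms burst every {interval:.1f} ms "
          f"({duty:.0%} duty cycle)")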
__________________
madVR options explained
Old 26th January 2019, 00:14   #54479  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Quote:
Originally Posted by Asmodian View Post
It isn't the fan; it is the chokes, or coils, in the power delivery circuits - hence "coil whine".

The power circuits are constantly turning off and on because they use PWM to control power output. Any time this on/off cycle matches a resonant frequency or harmonic of the coils, they can vibrate audibly. Almost any load can cause coil whine if you have coils that can vibrate audibly. The GPU core and memory also use different power circuits, so it isn't surprising that memory load (power draw) causes whine if one of those coils is the one resonating.

Coil whine is not "part of NGU". NGU is simply a somewhat heavy, but not 100%, load on the card. I do not have any coil whine with NGU (or anything else) because the power circuitry on my card, or my previous one, does not resonate at any audible frequency under any load I have run so far. Don't blame NGU; blame whoever made your GPU for selecting chokes that can whine.
That is informative. I've never experienced coil whine either. I did have an idea of what it sounds like, but my GPUs have always managed to arrive in working order. I would have hoped this would be less common in new GPUs by this point.
Old 26th January 2019, 00:18   #54480  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Quote:
Originally Posted by tp4tissue View Post
Is there a 10-bit vs. 8-bit gradient test pattern confirmed to show a noticeable difference in madVR?
Come on, man - you are bordering on spamming this thread with random stuff.

There is one gradient test in the madVR guide in my signature that can be of some help, but you might have to find some blue skies in UHD Blu-rays to prove or disprove banding.
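Not the pattern from the guide, but for anyone who wants a quick do-it-yourself ramp, a sketch like this (assuming numpy and Pillow are installed; the swept range is an arbitrary choice) renders a shallow 16-bit grayscale gradient. At 8 bits its bands are roughly 30 px wide and obvious; true 10-bit output, or good dithering, makes them near-invisible.

Code:
# Generate a shallow 16-bit grayscale ramp for eyeballing banding.
import numpy as np
from PIL import Image

W, H = 1920, 1080
ramp = np.linspace(0.40, 0.65, W)  # ~64 8-bit steps across 1920 px
img16 = (np.tile(ramp, (H, 1)) * 65535).round().astype(np.uint16)
Image.fromarray(img16).save("gradient_16bit.png")  # 16-bit grayscale PNG

Whether a given playback chain actually feeds a 16-bit PNG through madVR at full depth is another question; a short 10-bit video encode of the same ramp is the more realistic test.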