Welcome to Doom9's Forum, THE place to be for everyone interested in DVD conversion.

Before you start posting please read the forum rules. By posting to this forum you agree to abide by the rules.

 

Old 3rd March 2019, 22:45   #55141  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 942
Quote:
Originally Posted by huhn View Post
this wouldn't be the first time everyone with an nvidia card got banding with WFS 10 bit. the banding when sending 10 bit is hard to see, so i don't expect many other users to see it. the only reason i even think it has anything to do with the GPU/WDM is that it's gone in FSE. i still have a problem understanding how a GPU could produce it with 10 bit input and 12 bit output.

BTW, there is no need to act elitist here; no one forces you to answer anything, and no one can read your mind.
not using FSE is not a small, unimportant thing given nvidia's history. knowing all 4 states (GPU driver, madVR OSD, HDfury and end device) of the signal helps a lot. yes, you know this stuff because you are sitting next to it; i don't.

i found a potential fix for your banding issue. if you don't like it or can't use it, too bad, but this should help other users at least.

even the 8 bit banding is pretty hard to see on the 6 bit TN panel i'm testing right now.
so if someone wants to test: http://www.bealecorner.org/red/test-...ient-16bit.png
Mate, you need to read more carefully:

I HAVE NO BANDING ISSUE IN 8BITS OR IN 12BITS!

How else do you want me to spell it out?

I'm not being elitist; I'm fed up with having to answer all your questions.

I have no idea why you think I have any banding issue, or why you keep wanting to give me solutions for a problem I don't have.

I spend enough time at the moment trying to resolve problems I do have.

As I said, provided I set the bit depth correctly in madVR, there is no banding, in 8bits or in 12bits.

I do spend a lot of time looking for it, and I see it even when it's minimal. I have clips from Allied and The Revenant that I use to look for banding.

Now please, help me like you did with the colorspace issue (that was useful and appreciated), but get off my back if you only read half of what I write and force me to repeat myself, rephrase, and lose patience.

Sorry, I'm not in a good mood, but the last thing I need at the moment is to waste time answering you about problems I don't have instead of getting help with the ones I do.
__________________
Win11 Pro x64 b23H2
Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33
madVR/LAV/jRiver/MyMovies/CMC
Denon X8500HA>HD Fury VRRoom>TCL 55C805K

Last edited by Manni; 4th March 2019 at 09:08.
Manni is offline   Reply With Quote
Old 4th March 2019, 03:30   #55142  |  Link
Jinkai
Registered User
 
Join Date: Feb 2019
Posts: 2
Hi, I tried going over this thread but I can't seem to find the answer on setting different gamma presets using a hotkey. My desired power gamma curve value is 2.35 (color & gamma), and my monitor is calibrated to a power curve value of 2.2 (calibration) using BT.709.

How do I make a shortcut key that toggles between the power gamma curve value of 2.35 for night time and the power curve value of 2.2 for daytime, if that's possible? Please advise, thank you.
Jinkai is offline   Reply With Quote
Old 4th March 2019, 04:07   #55143  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
devices -> right click on "your device name" -> create profile group -> check color & gamma -> right click profile group -> add profile.

add a keyboard shortcut and a gamma setting to each profile and you're done.
huhn is offline   Reply With Quote
Old 4th March 2019, 04:53   #55144  |  Link
maxkill
Registered User
 
Join Date: Jul 2012
Posts: 53
Attachment 16759

Attachment 16760

Attachment 16761
Any setting to remove that slight banding effect?

I can only use 8bpc output color depth unfortunately (but I'm still very happy with the results).

Last edited by manono; 1st May 2019 at 09:15.
maxkill is offline   Reply With Quote
Old 4th March 2019, 05:11   #55145  |  Link
VBB
Registered User
 
VBB's Avatar
 
Join Date: May 2016
Location: Long Beach, CA, USA
Posts: 620
Quote:
Originally Posted by Warner306 View Post
Try "hdr" or "bitDepth."
Warner, as always, spot on! "bitDepth" actually works
__________________
Henry | LG OLED65C7P | Denon AVR-X3500H | ELAC Uni-Fi x7 | ELAC Debut 2.0 SUB3030 x2 | NVIDIA SHIELD TV Pro 2019 | Plex
VBB is offline   Reply With Quote
Old 4th March 2019, 07:47   #55146  |  Link
chros
Registered User
 
chros's Avatar
 
Join Date: Mar 2002
Posts: 2,323
Quote:
Originally Posted by VBB View Post
Warner, as always, spot on! "bitDepth" actually works
I use this to change 3dlut with the TV device, note the lowercase names:
Code:
if (HDR) "hdr"
else "sdr"
__________________
Ryzen 5 2600,Asus Prime b450-Plus,16GB,MSI GTX 1060 Gaming X 6GB(v398.18),Win10 LTSC 1809,MPC-BEx64+LAV+MadVR,Yamaha RX-A870,LG OLED77G2(2160p@23/24/25/29/30/50/59/60Hz) | madvr config
chros is offline   Reply With Quote
Old 4th March 2019, 10:09   #55147  |  Link
wushantao
Registered User
 
Join Date: Oct 2011
Posts: 142
madVR crashes when saving thumbnails, but not every time, and I don't know why...

mpc be 4450 + lav 0.73.1-28 + madvr 0.92.17

and "save image" works fine even 10 times in a row,

and without madVR everything is good

settings.bin(same file,5 links)
http://www.solidfiles.com/d/pWa58q7DBwXDZ
https://1fichier.com/?hngi8xezjmvgke18iq68
https://bayfiles.com/G7O7L3v4b8/settings_bin
https://anonfile.com/HaO0Lavfbe/settings_bin
https://ddl.to/wlbbnvntkboh
wushantao is offline   Reply With Quote
Old 4th March 2019, 13:11   #55148  |  Link
mkohman
Registered User
 
Join Date: Jun 2018
Posts: 51
Guys, quick question: I currently have a Sapphire Nitro+ RX 580 4GB GPU, but I have an opportunity to trade this card plus £50 for a Sapphire Nitro+ Vega 56 8GB card. Would this be a good move and a real improvement for madVR HDR tone mapping, or is there no point in changing my current card? Or would I be better off paying more and buying an RTX 2060? I only watch 4K HDR and the occasional 1080p upscaled with madVR to 2160p. I'd appreciate your advice. Thank you.
mkohman is offline   Reply With Quote
Old 4th March 2019, 15:48   #55149  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
The Vega 56 is much better than the 580, but a 2060 is better still; it is up to you how far up the price/performance curve you want to go. A 2060 is probably better for the future of madVR, but even that is hard to be sure about. Is £50 a lot of money for you? If not, I would do that upgrade and see if it does everything you need. Going above NGU High probably isn't worth a lot of money.

If it is, and madshi comes out with something new that wants a 2060 or better, the Vega 56 could disappoint. It is hard to give advice.

Get the best you can comfortably afford.
__________________
madVR options explained
Asmodian is offline   Reply With Quote
Old 4th March 2019, 16:57   #55150  |  Link
mkohman
Registered User
 
Join Date: Jun 2018
Posts: 51
Quote:
Originally Posted by Asmodian View Post
The Vega 56 is much better than the 580, but a 2060 is better still; it is up to you how far up the price/performance curve you want to go. A 2060 is probably better for the future of madVR, but even that is hard to be sure about. Is £50 a lot of money for you? If not, I would do that upgrade and see if it does everything you need. Going above NGU High probably isn't worth a lot of money.

If it is, and madshi comes out with something new that wants a 2060 or better, the Vega 56 could disappoint. It is hard to give advice.

Get the best you can comfortably afford.
Thank you so much, I really appreciate your help. Currently with my 580 I can do NGU AA Medium for 4K and High for 1080p; with the Vega 56 I could probably do NGU AA High for 4K and Very High for 1080p. My rendering time is around 30ms for 4K and 29ms for 1080p upscaled to 4K.

Why would the RTX 2060 be better, just out of curiosity? The Vega 56 has more stream cores, or is that not useful for madVR? Thank you.
mkohman is offline   Reply With Quote
Old 4th March 2019, 17:10   #55151  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
the number of "cores" is only comparable within the same architecture at the same clock. nvidia turing and pascal cards easily reach 1800 MHz out of the box; a vega 56 or 64 can't do that.

the new turing cards can do a lot of "math" types a lot faster; we're talking about stuff like 128x faster than older generation GPUs here.
if stuff like this gets utilized, the newer turing cards will run exclusive stuff only they can do and/or do it faster.

so there is a simple solution to this problem: if you don't need a new card, don't get one, and wait until this "stuff" gets used.
huhn is offline   Reply With Quote
Old 4th March 2019, 18:05   #55152  |  Link
VBB
Registered User
 
VBB's Avatar
 
Join Date: May 2016
Location: Long Beach, CA, USA
Posts: 620
Quote:
Originally Posted by chros View Post
I use this to change 3dlut with the TV device, note the lowercase names:
Code:
if (HDR) "hdr"
else "sdr"
I could not get madVR to detect content properly based on (HDR) or (hdr). With (bitDepth), it will differentiate between 8-bit and 10-bit content, which in turn will switch the TV into the proper HDR mode when playing HDR content. Another workaround.

I'm wondering why you use (HDR) to swap LUTs, though. You have a non-HDR display, so wouldn't having a separate LUT for each color space do the trick? I use DisplayCAL to create LUTs for each SDR slot in madVR, and madVR picks the appropriate one depending on the source.
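In case it helps anyone reading along, the (bitDepth)-based rule described above would presumably look something like the snippet below, mirroring the style of the (HDR) rule quoted earlier. This is an untested sketch; "hdr" and "sdr" are placeholders for whatever you named your own profiles:

```
if (bitDepth > 8) "hdr"
else "sdr"
```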
__________________
Henry | LG OLED65C7P | Denon AVR-X3500H | ELAC Uni-Fi x7 | ELAC Debut 2.0 SUB3030 x2 | NVIDIA SHIELD TV Pro 2019 | Plex
VBB is offline   Reply With Quote
Old 4th March 2019, 18:16   #55153  |  Link
chros
Registered User
 
chros's Avatar
 
Join Date: Mar 2002
Posts: 2,323
Quote:
Originally Posted by VBB View Post
I could not get madVR to detect content properly based on (HDR) or (hdr).
The variable is capitalized: HDR. It works for me (the OSD shows the filename of the 3dlut in use). Which madVR version do you use?

Quote:
Originally Posted by VBB View Post
I'm wondering why you use (HDR) to swap LUTs, though. You have a non-HDR display, so wouldn't having a separate LUT for each color space do the trick?
Not because of that, but because the HDR pixel shader needs a gamma 2.2 3dlut all the time (with the current madVR test builds), as I was told; otherwise I use gamma 2.4 during the evening with SDR content.
__________________
Ryzen 5 2600,Asus Prime b450-Plus,16GB,MSI GTX 1060 Gaming X 6GB(v398.18),Win10 LTSC 1809,MPC-BEx64+LAV+MadVR,Yamaha RX-A870,LG OLED77G2(2160p@23/24/25/29/30/50/59/60Hz) | madvr config
chros is offline   Reply With Quote
Old 4th March 2019, 18:54   #55154  |  Link
VBB
Registered User
 
VBB's Avatar
 
Join Date: May 2016
Location: Long Beach, CA, USA
Posts: 620
Quote:
Originally Posted by chros View Post
The variable is capital HDR. It works for me (OSD reads the filename of the used 3dlut). Which madVR version do you use?


Not because of that but because HDR pixelshader needs gamma 2.2 3dlut all the time (with current madvr test builds), as I was told , otherwise I use gamma 2.4 during the evening with sdr content.
I thought that's why you did it that way. You really need to upgrade your old LG. I had the 60" 550 from 2011 - 2013 *Just checked the old AVS thread, and I bought it back in 2010 LOL

I use the latest available test version of madVR. Same for LAV, MPC-HC, etc.
__________________
Henry | LG OLED65C7P | Denon AVR-X3500H | ELAC Uni-Fi x7 | ELAC Debut 2.0 SUB3030 x2 | NVIDIA SHIELD TV Pro 2019 | Plex

Last edited by VBB; 4th March 2019 at 18:59.
VBB is offline   Reply With Quote
Old 4th March 2019, 22:24   #55155  |  Link
mkohman
Registered User
 
Join Date: Jun 2018
Posts: 51
Quote:
Originally Posted by huhn View Post
the number of "cores" is only comparable within the same architecture at the same clock. nvidia turing and pascal cards easily reach 1800 MHz out of the box; a vega 56 or 64 can't do that.

the new turing cards can do a lot of "math" types a lot faster; we're talking about stuff like 128x faster than older generation GPUs here.
if stuff like this gets utilized, the newer turing cards will run exclusive stuff only they can do and/or do it faster.

so there is a simple solution to this problem: if you don't need a new card, don't get one, and wait until this "stuff" gets used.
Thank you so much, this makes complete sense. Maybe it's time I gave nvidia another chance.

Currently I don't need a new GPU; however, the reason I am making enquiries is that I can return my RX 580 to Amazon until the 17th of March for a full refund. After that I'm stuck with it until I really want to upgrade.

I'm happy with it, but I was wondering whether it would be wise to get the refund and replace the card with an RTX 2060. And if so, is there a specific 2060 model that is recommended? The Asus Strix comes to mind, but I may be wrong. Thanks
mkohman is offline   Reply With Quote
Old 4th March 2019, 22:33   #55156  |  Link
blackjack12
Registered User
 
blackjack12's Avatar
 
Join Date: Aug 2012
Location: Silicon Valley
Posts: 46
D3D11 - madVR support for interlaced media ... any idea when it might be available

In my experience, the D3D11 hardware decoder is much more efficient when used correctly, especially with integrated Intel GPUs (like the 7 series and up). It allows 4K HDR material to be played with no issues where any other setting does not.

madVR has produced the best video and color quality across the board when used with MPC-HC and MPC-BE as tested ...

madVR currently does not support interlaced formats with the D3D11 decoder ... deinterlacing with madVR and DXVA2 works very well.

Apologies if this has been addressed earlier, but I have not had time to search the entire madVR site.

Any idea when/if interlaced formats will be supported by madVR with the D3D11 decoder?
__________________
MPC-HC and MPC-BE (latest), MadVR 0.92.17, LAV 0.73.1
Intel NUC w_650 internal, Roku Ultra, Nvidia Shield, Apple TV 4K
PLEX Server with QUADRO 2000
Windows 10 Pro (all latest updates)

Last edited by blackjack12; 4th March 2019 at 22:44.
blackjack12 is offline   Reply With Quote
Old 4th March 2019, 23:53   #55157  |  Link
vanden
Registered User
 
Join Date: Sep 2007
Posts: 104
Hello,

Is it possible to use DXVA resizing for chroma when LAV is in software mode?

Let me explain: in madVR (scaling algo / chroma upscaling) I cannot select DXVA. It works if LAV is in DXVA / DXVA Copy-Back mode, but not in software mode.
The problem is that in H265 my card cannot decode anything above 1920x1080 (and not even all of that), so 1440p or 2160p is not possible in hardware... no problem, the CPU manages... but then no chroma algo (in scaling algo / chroma upscaling) works, so I cannot watch at native resolution!

For example, a 1440p x265 10bit HDR film viewed on my screen (at 2560x1440), even with bilinear chroma resizing, drops frames (MPC-HC x64):

https://i.goopics.net/djLWO.jpg

On the other hand, the same 1440p film works perfectly on my screen (at 2944x1656), but with DXVA resizing for both chroma and image (MPC-HC x64):

https://i.goopics.net/wEv2y.jpg

When the image (luma) is upscaled or downscaled, DXVA is also used for chroma (LAV in DXVA or software mode). On the other hand, when the image (luma) is not resized and LAV is in software mode, the chroma does not use DXVA...

Even a 3840x1606 x265 10bit HDR movie viewed on my screen (at 3104x1756) with DXVA resizing (chroma and image) works well (MPC-HC x86 + ReClock):

https://i.goopics.net/eP7WP.jpg

But it's impossible to use MPC's Video Frame / Normal Size ... chroma upscaling then falls back to texture unit or pixel shader ...


Is there a solution?

Last edited by vanden; 10th March 2019 at 12:47.
vanden is offline   Reply With Quote
Old 5th March 2019, 00:02   #55158  |  Link
ryrynz
Registered User
 
ryrynz's Avatar
 
Join Date: Mar 2009
Posts: 3,646
Quote:
Originally Posted by mkohman View Post

I'm happy with it but I was wondering if its would be wise to get a refund and replace with a rtx 2060? That's what I was wondering.. And if so.. Is there a specific 2060 model that is recommended? Asus Strix comes to mind but I may be wrong.. Thanks
I would. I'd probably pick the first card from ASUS, MSI or Gigabyte that a user says has little to zero coil whine.

Last edited by ryrynz; 5th March 2019 at 03:37.
ryrynz is offline   Reply With Quote
Old 5th March 2019, 02:18   #55159  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
@vanden
if you set upscale and downscale to DXVA and chroma to bilinear you should get DXVA for everything.

@blackjack12
known limitation, but no known ETA at all.
huhn is offline   Reply With Quote
Old 5th March 2019, 04:09   #55160  |  Link
70MM
X Cinema Projectionist NZ
 
Join Date: Feb 2006
Location: Auckland NZ
Posts: 310
Can someone please tell me if I calculated the nits for my screen correctly, for use as the target in madVR dynamic tone mapping?
My screen is a Studiotek curved microperf, 145" diagonal, with a gain of 1.2.

I took a reading at the screen using 100% white, with the meter facing the JVC NX9, which was on half lamp.
The reading was in foot-candles (fc): 17.5 average over the scope screen, with the Isco IIIL lens in place.
I then converted fc to fL by multiplying by the screen gain: 17.5 fc x 1.2 gain = 21 fL.

Then converted fL to nits: 1 fL = 3.4263 nits,
so 21 fL = 71.95 nits.

Have I done this correctly? So I'd use, say, 71 nits as my target in madVR tone mapping?

I don't need to use the size of the screen in the calculations?
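For what it's worth, the arithmetic above can be sanity-checked directly. A minimal sketch, assuming the usual approximation that screen luminance in fL is the fc reading times screen gain, and the 1 fL = 3.4263 nits factor stated in the post:

```python
# Sanity check of the fc -> fL -> nits conversion from the post.
fc_reading = 17.5    # foot-candles, 100% white, meter facing the lens
screen_gain = 1.2    # gain from the post

# Approximation: screen luminance (fL) = illuminance reading (fc) * gain
foot_lamberts = fc_reading * screen_gain

# 1 foot-lambert = 3.4263 nits (cd/m^2)
nits = foot_lamberts * 3.4263

print(f"{foot_lamberts:.1f} fL -> {nits:.2f} nits")  # 21.0 fL -> 71.95 nits
```

That matches the 71.95 nits figure, so a ~71 nit target looks consistent. Screen size shouldn't be needed here, since the fc reading at the screen is already a per-area measurement and nits (cd/m²) are likewise per unit area.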
70MM is offline   Reply With Quote