Welcome to Doom9's Forum, THE place to be for everyone interested in DVD conversion.

Old 13th August 2017, 07:22   #321  |  Link
MariaX9
Registered User
 
Join Date: May 2016
Posts: 27
A question, because I do not understand it: I have the reduce banding artifacts setting at low/medium; should I tick "don't render frames when fade in/out is detected"?
Old 13th August 2017, 15:24   #322  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 6,650
Well, the LG OLED screens are better known for not dithering at all than for dithering heavily.

http://i.rtings.com/images/reviews/c...ding-large.jpg
http://i.rtings.com/images/reviews/a...ding-large.jpg
Old 13th August 2017, 19:38   #323  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,998
Quote:
Originally Posted by MariaX9 View Post
A question, because I do not understand it: I have the reduce banding artifacts setting at low/medium; should I tick "don't render frames when fade in/out is detected"?
I do tick "don't render frames when fade in/out is detected". If you are using easy settings for your GPU, you do get 5 more medium-debanded frames during the start of a fade with it not ticked, but your GPU then has to render 5 extra frames every time it detects a fade. This spike in load can cause a dropped frame or a presentation glitch.

Quote:
Originally Posted by huhn View Post
Well, the LG OLED screens are better known for not dithering at all than for dithering heavily.

http://i.rtings.com/images/reviews/c...ding-large.jpg
http://i.rtings.com/images/reviews/a...ding-large.jpg
It definitely dithers to display shadows, like a plasma; you can see it quite easily.

Banding was a huge problem when setting it up, but the banding does not mean it is not dithering, only that it is bad at it. When I set madVR to 6 bit, the display's banding goes away pretty much entirely, even with default settings, and when set to 10 bit the banding is more common (I can see more of the steps). Banding test patterns all look very different at 6, 8, or 10 bit, so I am sure the display is receiving different bit depths, but it seems to have no idea how to display them without banding. Rounding instead of dithering looks much worse all the time, with each step visible and sharper, so it is not simply rounding anywhere. How do you get visible banding with a 16-bit gradient dithered to 10 bit on a 0-95 cd/m² display?!
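To illustrate why a properly dithered gradient should not band: a toy NumPy sketch with plain uniform-noise dithering (madVR itself uses more sophisticated ordered dithering and error diffusion; this is only the principle). Rounding a smooth dark ramp to 8 bit collapses it into a handful of flat steps, while dithering before rounding preserves the ramp in the local average, which is what the eye integrates:

```python
import numpy as np

rng = np.random.default_rng(0)

def quantize(signal, bits, dither=False):
    """Quantize a [0, 1] float signal to 2**bits levels.
    With dither=True, uniform noise of +/- half an LSB is added
    before rounding, trading visible steps for fine noise."""
    levels = 2 ** bits - 1
    x = signal * levels
    if dither:
        x = x + rng.uniform(-0.5, 0.5, size=x.shape)
    return np.round(x) / levels

# A smooth dark ramp, the kind of content that bands easily.
grad = np.linspace(0.0, 0.05, 4096)

rounded = quantize(grad, 8)
dithered = quantize(grad, 8, dither=True)

# Rounding collapses the ramp into about a dozen flat steps
# (visible banding).
print(len(np.unique(rounded)))

# Dithering recovers the ramp in the local average: the smoothed
# error is a small fraction of one 8-bit step.
err = np.convolve(dithered - grad, np.ones(64) / 64, mode="valid")
print(np.abs(err).max())
```

The same logic explains the observation above: a display fed dithered 10-bit data has no excuse for showing banding, since even the quantization steps it cannot resolve are already encoded as noise it only needs to pass through.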

I had to tweak contrast and brightness a lot before finding settings that minimize the banding, which helped but still didn't eliminate it completely. I also have to be careful about blaming banding on the display when watching real content: some Blu-rays I thought were fine turned out to have a lot of banding, while others do not. If high debanding in madVR fixes it, I blame the source; if it doesn't change anything noticeably, I blame the display. For my final calibration the contrast ratio was reported as 9528804659:1 by CalMAN, which does expose banding in the source.

Altogether this display does look very good, and the black level and color saturation are both amazing, but banding is its major weakness.
__________________
madVR options explained

Last edited by Asmodian; 13th August 2017 at 19:55.
Old 13th August 2017, 23:08   #324  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 814
Just a quick update re 3D, nVidia and madVR.

The key to getting this working seems to be to enable stereoscopic 3D in the nVidia drivers while in 1080p, run the nVidia 3D Setup, and then set the desktop to whichever resolution/frame rate you actually use. My default is 4K@23p. MPC-BE/LAV/madVR switch to 1080p 3D frame-packed to play the film automatically. They don't go back to 4K23 automatically; I have to leave MyMovies for that to happen. Sometimes they don't go back to 2D either, but in that case I only have to start playing a 2D movie.

The experimental custom resolution tool madshi made available recently has done wonders for my 2D Blu-ray playback: I went from one frame repeat every 3-5 minutes to one frame drop every 2 hours or more, so I'm very happy with that.

I still have to play a bit with the settings to find what works best with my new 1080 Ti; I really like the Blu-ray to 4K upscaling using NGU Anti-Alias.

Overall super happy with the upgrade to the 1080 Ti from my HD7870.

Thanks again for all the help.
__________________
Win10 Pro x64 b2004
i7 3770K@4.0Ghz 16Gb@2.18Ghz EVGA GTX 1080 Ti SC2 11Gb@2GHz RGB Full 8bits 442.74
madVR/jRiver/MyMovies/CMC
Denon X8500H>madVR Envy Extreme>HD Fury Maestro>JVC RS2000
Old 14th August 2017, 17:39   #325  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,998
Thanks for the update on 3D!

I too have been very happy with NGU Anti-Alias.
__________________
madVR options explained
Old 23rd August 2017, 07:52   #326  |  Link
cyber201
Registered User
 
Join Date: Feb 2017
Posts: 32
Hi Asmodian, I'm waiting for your settings for the new version of madVR...
Yeahhhh

Thanks
Bye
Old 27th August 2017, 08:11   #327  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,998
Finally updated for 92.2. A lot of fixes along with the amazing new custom resolution tool in display modes.

Also DX11 native DXVA2 decoding: no quality loss, but still no IVTC or black bar detection.

I also rebuilt my settings. They are still the same settings, but I reset to defaults with the new version and reconfigured from scratch. There were some people reporting crashes with old settings files (and even those look to have been fixed in 92.2), and while I didn't have any issues, I figured a fresh start was the safest option.
__________________
madVR options explained
Old 27th August 2017, 15:22   #328  |  Link
aufkrawall
Registered User
 
Join Date: Dec 2011
Posts: 1,746
Native DXVA2 decoding doesn't show the chroma blurring when DXVA2 chroma scaling is used at the same time.
Since DXVA2 scaling seems to equal some kind of bicubic on Nvidia, it's actually not too bad after all, and far better than bilinear.
There don't seem to be any other quality drawbacks with the recent madVR version on Windows 10 Creators Update, and no banding problems.

By the way, the new DX11 decoding API isn't called DXVA anymore, afaik, but D3D11VA.
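The bicubic-vs-bilinear distinction debated here has a characteristic 1D signature that is easy to sketch. The actual DXVA2 resampler is undocumented, so the Catmull-Rom kernel below is only a stand-in for "some kind of bicubic": at a hard edge, bilinear stays strictly inside the input range (soft but safe), while a cubic kernel overshoots slightly, which reads as sharpness:

```python
import numpy as np

def upsample_linear(p, factor):
    # In 1D, bilinear filtering reduces to plain linear interpolation.
    x = np.arange(len(p))
    xi = np.linspace(0, len(p) - 1, len(p) * factor)
    return np.interp(xi, x, p)

def upsample_catmull_rom(p, factor):
    """1D Catmull-Rom upsampling (one member of the bicubic family)."""
    p = np.asarray(p, float)
    out = []
    for i in range(len(p) - 1):
        p0 = p[max(i - 1, 0)]
        p1, p2 = p[i], p[i + 1]
        p3 = p[min(i + 2, len(p) - 1)]
        for t in np.linspace(0, 1, factor, endpoint=False):
            out.append(0.5 * ((2 * p1) +
                              (-p0 + p2) * t +
                              (2 * p0 - 5 * p1 + 4 * p2 - p3) * t ** 2 +
                              (-p0 + 3 * p1 - 3 * p2 + p3) * t ** 3))
    out.append(p[-1])
    return np.array(out)

edge = np.array([0., 0., 0., 1., 1., 1.])  # a hard luma edge
lin = upsample_linear(edge, 8)
cub = upsample_catmull_rom(edge, 8)

print(lin.min(), lin.max())  # stays within [0, 1]
print(cub.min(), cub.max())  # dips below 0 and overshoots above 1
```

The overshoot is what makes a cubic scaler look "sharper than bilinear" on edges, and also what huhn's screenshots of ringing artifacts would pick up.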
Old 27th August 2017, 17:58   #329  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,137
FWIW, I'm not sure if the custom modes are really limited to 8bit. This will depend on the driver. So it might be totally different with Nvidia, AMD and Intel.
Old 27th August 2017, 18:01   #330  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 814
Quote:
Originally Posted by madshi View Post
FWIW, I'm not sure if the custom modes are really limited to 8bit. This will depend on the driver. So it might be totally different with Nvidia, AMD and Intel.
Definitely not limited to 8 bits with nVidia. I have a custom mode running at 4K23p @ 12 bits 4:4:4.
__________________
Win10 Pro x64 b2004
i7 3770K@4.0Ghz 16Gb@2.18Ghz EVGA GTX 1080 Ti SC2 11Gb@2GHz RGB Full 8bits 442.74
madVR/jRiver/MyMovies/CMC
Denon X8500H>madVR Envy Extreme>HD Fury Maestro>JVC RS2000
Old 27th August 2017, 18:52   #331  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,998
Can you set the custom mode to be 12-bit in the Nvidia control panel? I can also get 12-bit RGB to work but I have to be in a 12-bit mode already before switching to it. I do not have the option to set the custom mode to 12-bit in the control panel.

Quote:
Originally Posted by aufkrawall View Post
Native DXVA2 decoding doesn't show the chroma blurring when DXVA2 chroma scaling is used at the same time.
Since DXVA2 scaling seems to equal some kind of bicubic on Nvidia, it's actually not too bad after all, and far better than bilinear.
There don't seem to be any other quality drawbacks with the recent madVR version on Windows 10 Creators Update, and no banding problems.

By the way, the new DX11 decoding API isn't called DXVA anymore, afaik, but D3D11VA.
I tried to capture this information; is there anywhere you think I need to add something? Thanks.

Quote:
Originally Posted by madshi View Post
FWIW, I'm not sure if the custom modes are really limited to 8bit. This will depend on the driver. So it might be totally different with Nvidia, AMD and Intel.
Good point, I will edit my entry to reflect my limited knowledge of the situation.
__________________
madVR options explained

Last edited by Asmodian; 27th August 2017 at 19:49.
Old 27th August 2017, 20:48   #332  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 6,650
Quote:
Originally Posted by aufkrawall View Post
Native DXVA2 decoding doesn't show the chroma blurring when DXVA2 chroma scaling is used at the same time.
Since DXVA2 scaling seems to equal some kind of bicubic on Nvidia, it's actually not too bad after all, and far better than bilinear.
There don't seem to be any other quality drawbacks with the recent madVR version on Windows 10 Creators Update, and no banding problems.

By the way, the new DX11 decoding API isn't called DXVA anymore, afaik, but D3D11VA.
Doing a short test:

dxva: https://abload.de/img/dxvaoeob0.png
bilinear: https://abload.de/img/bilineary8of4.png
bicubic: https://abload.de/img/bicubicz8qci.png
They are all the same frame; the shift comes from the scaler.
It is as bad as ever, and bilinear is doing better.
Old 27th August 2017, 21:11   #333  |  Link
aufkrawall
Registered User
 
Join Date: Dec 2011
Posts: 1,746
Wth is this example even supposed to portray? It's far removed from any reality.
It's complete nonsense to think chroma would be blurrier than with bilinear.

software decoding + bilinear:


DXVA2 native decoding + bilinear:


DXVA2 native decoding + DXVA2 scaling:


Oh, and bilinear downscaling also gives you an aliased mess for every target that is not exactly 1/4 or 1/16 etc. of the source resolution, btw...
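That aliasing claim is easy to reproduce in 1D with a NumPy sketch (plain `np.interp` standing in for a GPU's 2-tap bilinear minification, an illustration rather than an exact model of any driver). Each bilinear output pixel blends only the two nearest source pixels, so at a non-integer scale factor the skipped detail aliases into a coarse beat pattern, while a simple box prefilter averages it away:

```python
import numpy as np

# Full-resolution detail: a 0/1 pixel-alternating pattern (Nyquist).
src = (np.arange(1000) % 2).astype(float)
factor = 2.5                               # not 1/4, 1/16, ... of the source
xi = np.arange(0, len(src) - 1, factor)    # output sample positions

# Bilinear minification: each output pixel blends only the two
# nearest source pixels; everything in between is simply skipped.
bilinear = np.interp(xi, np.arange(len(src)), src)

# Area-style downscale: prefilter with a small box so every source
# pixel contributes, then sample at the same positions.
area = np.convolve(src, np.ones(4) / 4, mode="same")[xi.astype(int)]

# The alternating detail should average to a flat 0.5 at this size.
print(bilinear.std())  # large: a 0 / 0.5 / 1 beat pattern (aliasing)
print(area.std())      # near zero: the detail is averaged away
```

This is also why bilinear looks fine at exactly 1/4 or 1/16: at those ratios the two taps happen to cover the source evenly, so nothing falls between them.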
Old 28th August 2017, 01:35   #334  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 6,650
So this BD is not a real source? Yes, chroma doesn't matter much for this image, but that doesn't change what it is doing to the chroma channel.

What is this supposed to show?
Obviously that there is something terribly wrong with DXVA chroma scaling.
Just to make that clear: I was only using DXVA chroma scaling, nothing else, in that image; there is a reason I added the OSD.

And it is nonsense to think it is blurrier than bilinear? It clearly is in this case...

And about banding: DXVA is still using a low bit depth and is adding banding; you can check this with BW.avi. Whether this will be noticeable is a different story, but it is still terrible at it.
Old 28th August 2017, 02:22   #335  |  Link
aufkrawall
Registered User
 
Join Date: Dec 2011
Posts: 1,746
Quote:
Originally Posted by huhn View Post
So this BD is not a real source?
Yes, chroma doesn't matter much for this image, but that doesn't change what it is doing to the chroma channel.

Obviously that there is something terribly wrong with DXVA chroma scaling.

And it is nonsense to think it is blurrier than bilinear? It clearly is in this case...
The image is grayscale only (at least to the eye), but actually even between bicubic and bilinear chroma scaling there is a huge difference in sharpness.
So what does it prove? Only that a soft chroma scaler softens the image here. There is already a lot of aliasing in the source, and of course a soft scaler has a strong influence on that.

If there were a real problem with chroma blur and DXVA scaling, my cartoon example would look like blurry garbage. But actually, both chroma and luma are way sharper than with bilinear (it's 640x368 -> 2532x1440).

I suspect that DXVA2 scaling uses different sharpness levels for different source resolutions. I tried a 720p non-cartoon sample a few days back, and there DXVA2 scaling behaved mostly like madVR's softcubic60.
That is not bad quality; it may just not be to your liking.

Quote:
Originally Posted by huhn View Post
And about banding: DXVA is still using a low bit depth and is adding banding; you can check this with BW.avi. Whether this will be noticeable is a different story, but it is still terrible at it.
I don't see it:
Old 28th August 2017, 05:19   #336  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 6,650
My screenshots are Cr rendered as Y; it's a madVR feature to look at the chroma channels directly.
So if I want to know whether DXVA got fixed, I just use that.


But the real issue is that DXVA is known to ruin the chroma channel on Nvidia cards, and in my test nothing has changed, so it is still bad in my book.

And now about your "10" bit image: first of all, it is pretty much a solid color, so a chroma scaler adds little if any banding here. Second, the source has banding; you can see it in the 10-bit parts, all 1024 steps, and it gets removed by madVR's debanding, making it a really bad test pattern for banding. The fact that the 8-bit version is rounded is the nail in the coffin: not a fair comparison at all.
Old 28th August 2017, 10:36   #337  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 814
Quote:
Originally Posted by Asmodian View Post
Can you set the custom mode to be 12-bit in the Nvidia control panel? I can also get 12-bit RGB to work but I have to be in a 12-bit mode already before switching to it. I do not have the option to set the custom mode to 12-bit in the control panel.
I can absolutely select 12 bits in the nVidia control panel when my 2160p23 custom mode is selected. It's the one made from the EDID values, as none of the others would work for me here.

Of course I have to select "use nVidia colour settings", not "use default color settings", for this to work, but that's the same with non-custom modes, so you should already have this selected.
__________________
Win10 Pro x64 b2004
i7 3770K@4.0Ghz 16Gb@2.18Ghz EVGA GTX 1080 Ti SC2 11Gb@2GHz RGB Full 8bits 442.74
madVR/jRiver/MyMovies/CMC
Denon X8500H>madVR Envy Extreme>HD Fury Maestro>JVC RS2000
Old 28th August 2017, 11:43   #338  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 814
Correction: I was still on 385.28. I updated to 385.41 and now have the same behaviour (can't select 12 bits from the custom mode).
Reverting to 385.28.
__________________
Win10 Pro x64 b2004
i7 3770K@4.0Ghz 16Gb@2.18Ghz EVGA GTX 1080 Ti SC2 11Gb@2GHz RGB Full 8bits 442.74
madVR/jRiver/MyMovies/CMC
Denon X8500H>madVR Envy Extreme>HD Fury Maestro>JVC RS2000
Old 28th August 2017, 15:03   #339  |  Link
aufkrawall
Registered User
 
Join Date: Dec 2011
Posts: 1,746
Quote:
Originally Posted by huhn View Post
My screenshots are Cr rendered as Y; it's a madVR feature to look at the chroma channels directly.
So if I want to know whether DXVA got fixed, I just use that.
Nope, I'm sticking to real-world tests (except for gradient images).
Especially since we are talking about DXVA scaling for low-end GPUs (it's called a compromise).

Quote:
Originally Posted by huhn View Post
But the real issue is that DXVA is known to ruin the chroma channel on Nvidia cards, and in my test nothing has changed, so it is still bad in my book.
You haven't told me why the blur would be bad, nor why I do not encounter it in my cartoon example, where the blur from DXVA2 decoding is easily visible. Compare with softcubic?

Quote:
Originally Posted by huhn View Post
And now about your "10" bit image: first of all, it is pretty much a solid color, so a chroma scaler adds little if any banding here. Second, the source has banding; you can see it in the 10-bit parts, all 1024 steps, and it gets removed by madVR's debanding, making it a really bad test pattern for banding. The fact that the 8-bit version is rounded is the nail in the coffin: not a fair comparison at all.
Provide a test sample please.
Old 28th August 2017, 15:08   #340  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,013
Quote:
Originally Posted by huhn View Post
But the real issue is that DXVA is known to ruin the chroma channel on Nvidia cards, and in my test nothing has changed, so it is still bad in my book.
That isn't an inherent property of DXVA scaling or anything; it's just an artifact if you try to access the 4:2:0 "untouched", because there is no 4:2:0 texture format in D3D9. But when you use DXVA to upscale chroma (or the entire image) anyway, this doesn't really apply. It's not like the chroma is blurred first and then upscaled; that's not how it works. The method madVR uses to try to extract 4:2:0 subsampled chroma is what causes the blurring.
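The distinction can be shown with a toy NV12 buffer in NumPy (an illustrative sketch, not madVR's or D3D9's actual code paths): the 4:2:0 data itself is intact, and de-interleaving recovers it exactly; corruption only appears if a texture sampler filters across the interleaved CbCr pairs, blending two different channels:

```python
import numpy as np

W, H = 8, 4  # toy frame size

# 4:2:0: full-resolution luma, chroma at half resolution on both axes.
Y = np.arange(W * H, dtype=np.uint8).reshape(H, W)
Cb = np.full((H // 2, W // 2), 100, np.uint8)
Cr = np.full((H // 2, W // 2), 200, np.uint8)

# NV12 layout: the Y plane, then one plane of interleaved CbCr pairs.
uv = np.empty((H // 2, W), np.uint8)
uv[:, 0::2], uv[:, 1::2] = Cb, Cr
nv12 = np.concatenate([Y.ravel(), uv.ravel()])

# Correct extraction: de-interleave the UV plane. Nothing is lost.
uv2 = nv12[W * H:].reshape(H // 2, W)
assert (uv2[:, 0::2] == Cb).all() and (uv2[:, 1::2] == Cr).all()

# The D3D9 problem in miniature: with no native 4:2:0 texture format,
# binding the UV area as an ordinary filterable texture lets the
# sampler blend horizontally adjacent texels, which belong to two
# *different* channels:
mixed = 0.5 * (uv2[:, :-1].astype(float) + uv2[:, 1:].astype(float))
print(mixed[0, 0])  # 150.0: half Cb, half Cr, i.e. corrupted chroma
```

So whether "DXVA blurs chroma" depends entirely on where in the pipeline the interleaved plane gets sampled, which matches nevcairiel's point.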
__________________
LAV Filters - open source ffmpeg based media splitter and decoders

Last edited by nevcairiel; 28th August 2017 at 15:10.