Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Old 10th January 2019, 09:56   #54201  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,882
Quote:
No. I only have one monitor/tv/display hooked up. But I also have an AVR hooked up (but for audio only).
That's a dual-screen setup as far as Windows is concerned, and that can easily produce presentation glitches.
Quote:
What OSD? The TVs?
Ctrl+J: the madVR OSD, where you saw the presentation glitches.
Old 10th January 2019, 10:17   #54202  |  Link
zapatista
Registered User
 
Join Date: Nov 2016
Posts: 16
Can some of you video experts please comment on my question below? I am slowly going mad trying to set up MPC-HC + madVR for playing 4K video files on my HTPC.

Setup: Intel i7-4770K (Haswell), 32 GB RAM, GTX 1060 3GB, with the latest LAV and madVR on Win 8.1 64-bit.
Display: 1080p TV with 10/12-bit RGB 4:4:4 60 Hz input, confirmed as working correctly over its HDMI 1.4 input.

Q: With some of the earlier advice given in this thread I can now play many of my 4K video files fairly well (they will need further tweaking in madVR for quality improvements), e.g. many 4K AVC and 4K HEVC files at various frame rates, bit rates, and bit depths.
- CPU load around 20%, GPU load 50-60%, with usually only 50-60% of the 3 GB VRAM used.

But some 4K video files play very poorly (lots of stutters), and their specs all seem to be 4K AVC 10-bit Rec. 709 with HDR.
- CPU load goes berserk to 98%, but GPU activity drops to 10% and only 20% of the GPU VRAM is used.
- I am aware 4K Blu-ray discs should be using HEVC, but some of the 4K test files I found online seem to be AVC, including official 4K promotion files provided by Samsung, Sony, etc.

Do I have a setup error in the MPC-HC/madVR options that is shifting the workload to the CPU? How can I divert this to the GPU?

Or am I using some "out of spec" test video files (found online) that are not compliant with the video decoder instruction set of my Pascal GPU? These NVIDIA spec sheets would seem to suggest that for 4K AVC video files the limit might be 8-bit:
https://developer.nvidia.com/nvidia-video-codec-sdk
https://developer.nvidia.com/video-e...support-matrix
- The confusing part is that the next, newer NVIDIA series (2070 etc.), according to those same spec sheets, seems to have the same limitation with 4K 10-bit AVC files, and I would have expected newer hardware to gradually add features.

And if my logic so far is correct, does the Pascal GPU then simply default to letting the CPU try to deal with the video file, overloading it? (That would explain the stuttering video on screen, though madVR is not reporting any dropped frames.) And are newer Intel CPUs able to cope with hardware decoding/playing this file format?

Last edited by zapatista; 10th January 2019 at 10:20.
Old 10th January 2019, 10:28   #54203  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,882
AVC 10-bit isn't supported by hardware decoders, and that will almost certainly never change.

So it looks like your CPU can't handle it. AVC decoding scales pretty well with more cores, so a newer high-end consumer CPU may play it in real time.
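To check up front whether a given file will hit this limitation, the codec and bit depth are enough. A minimal sketch; the support table below is distilled from NVIDIA's NVDEC support matrix for Pascal and is illustrative, not exhaustive:

```python
# Which codec/bit-depth combinations Pascal's NVDEC can decode in hardware.
# Illustrative subset of NVIDIA's video codec support matrix.
PASCAL_NVDEC = {
    "h264": {8},          # AVC: 8-bit only; Hi10P always falls back to the CPU
    "hevc": {8, 10, 12},  # HEVC Main / Main10 / Main12
    "vp9":  {8, 10, 12},
}

def hw_decodable(codec: str, bit_depth: int) -> bool:
    """True if the stream is expected to get hardware decoding on Pascal."""
    return bit_depth in PASCAL_NVDEC.get(codec.lower(), set())

# Codec and bit depth can be read from MediaInfo, the LAV Video tray icon,
# or ffprobe (a pix_fmt of yuv420p10le means 10-bit).
print(hw_decodable("h264", 10))  # False: decoded in software, on the CPU
print(hw_decodable("hevc", 10))  # True: NVDEC handles it
```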
Old 10th January 2019, 10:36   #54204  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 9,817
Quote:
Originally Posted by zapatista View Post
The confusing part is that the next, newer NVIDIA series (2070 etc.), according to those same spec sheets, seems to have the same limitation with 4K 10-bit AVC files, and I would have expected newer hardware to gradually add features.
10-bit AVC/H.264 has never been used commercially, and practically no consumer hardware supports it. Additionally, no one is going to invest in improving AVC support now, because if you want 10-bit you use newer codecs like HEVC, VP9, or maybe even AV1, all of which have (or will have) hardware support.
Honestly, if anyone makes 10-bit AVC files with HDR, they are just doing it wrong. Use HEVC or VP9.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
Old 10th January 2019, 11:28   #54205  |  Link
glc650
Registered User
 
Join Date: May 2003
Posts: 44
Quote:
Originally Posted by huhn View Post
That's a dual-screen setup as far as Windows is concerned, and that can easily produce presentation glitches.
Except it didn't with my previous video card. And it doesn't with the new card *IF* HDMI scaling is at 0. And I have to cable my system up this way, as I've had too many issues when using my AVR as an HDMI switch.

Quote:
Originally Posted by huhn View Post
Ctrl+J: the madVR OSD, where you saw the presentation glitches.
Well, now I'm getting a different result with scaling on: lots of dropped frames and high queue times. Before, it was lots of glitches with normal queue times.

https://1drv.ms/u/s!AnsGKXR_EKR0hCfOm_6m0n5aXCFL
Old 10th January 2019, 12:29   #54206  |  Link
chros
Registered User
 
 
Join Date: Mar 2002
Posts: 1,487
Quote:
Originally Posted by tp4tissue View Post
What does this mean ?
I had to read @huhn's answer a couple of times as well to understand it.

Quote:
Originally Posted by tp4tissue View Post
I have a rec709 3dlut (2.4 gamma)
...
Should I make a rec709 lut for 2.2 gamma ?
Exactly, that's why I asked it, thanks!

What target nits setting do you use with the gamma 2.4 3DLUT? And what's the actual brightness of your display? (See below why ...)

Quote:
Originally Posted by sat4all View Post
fhoech (Displaycal): ...

So you should make an rec709 lut for 2.2 gamma.
Thanks for quoting! Perfect answer!

I've watched content with the 2.4-gamma 3DLUT + hdr2sdr pixel shader on an SDR TV for the last 3 months: screen brightness is ~120 nits, and I use target nits 120 (!) in madVR. The result is vivid/alive, and still nothing is too bright with these settings. (The only obvious exception among the titles I watched was The Meg, for which I had to modify the target nits value.)

After reading your answer and creating a 2.2-gamma 3DLUT @ 120 nits for the TV yesterday, I compared a couple of frames with the new and previous settings (although it's not that easy, since you have to look at them on the actual display after changing every setting in madVR and on the TV as well). This is what happened (I compared the 2.2 frames to the 2.4 ones, since I got used to the latter in the past months):
- to reach similar (but not identical!) results with gamma 2.2, I had to raise the target nits to at least 150, and for some frames in the same movie even to 180/200
- images are more alive and have more pop with the 2.4 settings than with the raised 2.2 settings (maybe over-saturation, higher gamma, etc.?)

It's as if gamma 2.4 balanced the brightness of the content better.
I'm not saying this is the correct way to use it; all I'm describing is how it behaves.
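For what it's worth, a pure power curve makes the "more pop" part easy to quantify: at the same 120-nit peak, gamma 2.4 renders shadows and midtones darker than 2.2, which reads as deeper contrast. A small sketch (pure power curves only; a real 3DLUT and the hdr2sdr shader do more than this):

```python
def displayed_nits(signal: float, gamma: float, peak_nits: float = 120.0) -> float:
    """Luminance a pure-power-gamma display emits for a normalized signal."""
    return peak_nits * signal ** gamma

# Same signal levels, same 120-nit peak, two gammas:
for v in (0.1, 0.5, 1.0):
    print(v, round(displayed_nits(v, 2.2), 1), round(displayed_nits(v, 2.4), 1))
# Shadows (0.1) drop from ~0.8 nits to ~0.5 nits when moving from 2.2 to 2.4,
# while peak white stays identical, hence the extra perceived contrast.
```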

Quote:
Originally Posted by huhn View Post
as i said before i'm not a friend of this inconsistent behaviour between HDR and SDR.
Thanks, what do you think would be a better approach?
__________________
Ryzen 5 2600,Asus Prime b450-Plus,16GB,MSI GTX 1060 Gaming X 6GB(v385.28),Win10 LTSB 1607,MPC-BEx64+LAV+MadVR,Yamaha RX-A870,LG OLED65B8(2160p@23/24/25/29/30/50/59/60Hz)

Last edited by chros; 10th January 2019 at 12:33.
Old 10th January 2019, 12:30   #54207  |  Link
zapatista
Registered User
 
Join Date: Nov 2016
Posts: 16
@ nevcairiel & huhn,

thank you for the clarification
Old 10th January 2019, 13:00   #54208  |  Link
mclingo
Registered User
 
Join Date: Aug 2016
Posts: 842
Quote:
Originally Posted by glc650 View Post
Except it didn't with my previous video card. And it doesn't with the new card *IF* HDMI scaling is at 0. And I have to cable my system up this way, as I've had too many issues when using my AVR as an HDMI switch.

Well, now I'm getting a different result with scaling on: lots of dropped frames and high queue times. Before, it was lots of glitches with normal queue times.

https://1drv.ms/u/s!AnsGKXR_EKR0hCfOm_6m0n5aXCFL
It's unusual to have scaling issues with an HDTV these days. Have you got an odd aspect ratio on your TV? What model is it? I assume you are not using two displays?


You may also have better luck using FSE, turning off motion smoothing, and using refresh rate switching.
__________________
OLED EF950-YAMAHA RX-V685-Win101809-4K 444 RGB 60hz-AMD RX580 19.9.2
KODI DS - MAD/LAV 92.14/0.74.1 - 3D / DIRECT3D11 / MADVR 10bit

Last edited by mclingo; 10th January 2019 at 14:18.
Old 10th January 2019, 14:03   #54209  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,882
Quote:
Originally Posted by chros View Post
Thanks, what do you think would be a better approach?
Move it to the color & gamma tab by adding an HDR-to-SDR gamma processing option that is ticked by default, set to pure power curve 2.20, with a warning not to change it. That makes sure you can still properly gamma-process SDR and don't ruin HDR with this option.

This way gamma is only processed in that tab (and, well, by 3D LUTs), so it is consistent, as it should be.

The next thing I would add is information in the 3D LUT about which gamma the 3D LUT targets, so it can be treated like "this display is already calibrated"; then you are done with the 3D LUT and get the correct gamma.

I guess for BT.1886 you take the effective gamma and add that as the information. It's usually between 2.1 and 2.4, and that is better than nothing.
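That BT.1886 "effective gamma" can be computed directly from the display's white and black luminance. A sketch using the standard BT.1886 formula (the 120-nit white / 0.1-nit black figures are just example values):

```python
import math

def bt1886_eotf(v: float, lw: float = 120.0, lb: float = 0.1) -> float:
    """ITU-R BT.1886 EOTF: screen luminance in nits for normalized signal v."""
    g = 2.4
    a = (lw ** (1 / g) - lb ** (1 / g)) ** g
    b = lb ** (1 / g) / (lw ** (1 / g) - lb ** (1 / g))
    return a * max(v + b, 0.0) ** g

def effective_gamma(lw: float = 120.0, lb: float = 0.1, v: float = 0.5) -> float:
    """Pure power-curve exponent that matches BT.1886 at signal level v."""
    return math.log(bt1886_eotf(v, lw, lb) / lw) / math.log(v)

print(round(effective_gamma(lb=0.1), 2))     # ~2.2 with a mediocre black level
print(round(effective_gamma(lb=0.0001), 2))  # approaches 2.4 as black deepens
```

So a single effective-gamma number stored with the 3D LUT would indeed land somewhere in the 2.1-2.4 range for typical displays.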

@glc650
Windows changes the WDDM driver model like they have nothing better to do. Your old card may just have used an older, better-working WDDM version. I haven't looked into this deeply; I just know they change it, and that can produce presentation issues, which are pretty normal with two screens.

Try DX9 FSE. Overlay would be a workaround too, but I doubt AMD added it. It's clearly a presentation issue.
If it comes from the scaling alone and no other combination of settings, you need to send a custom resolution and hope the TV accepts it, but that is an advanced topic. It doesn't look like AMD has a resize feature, or someone would have pointed out its name in this thread already; HDMI scaling and custom resolutions are not the same thing.
Old 10th January 2019, 14:23   #54210  |  Link
glc650
Registered User
 
Join Date: May 2003
Posts: 44
Quote:
Originally Posted by mclingo View Post
It's unusual to have scaling issues with an HDTV these days. Have you got an odd aspect ratio on your TV? What model is it? I assume you are not using two displays?


You may also have better luck using FSE, turning off motion smoothing and using refresh rate switching.
It's 16:9. Mitsubishi LaserView 65". I have a TV attached to the only HDMI port on the card and an AVR attached to one of the display ports with a DP to HDMI adapter (I can't use my AVR as an HDMI switch). FSE doesn't work for me even with scaling at 0. I have to use motion smoothing as my TV doesn't handle 23/24 properly.
Old 10th January 2019, 14:28   #54211  |  Link
mclingo
Registered User
 
Join Date: Aug 2016
Posts: 842
Ah right, so there is nothing you can do on the TV itself to correct the scaling? Have you tried that since replacing the card?
__________________
OLED EF950-YAMAHA RX-V685-Win101809-4K 444 RGB 60hz-AMD RX580 19.9.2
KODI DS - MAD/LAV 92.14/0.74.1 - 3D / DIRECT3D11 / MADVR 10bit
Old 10th January 2019, 15:03   #54212  |  Link
glc650
Registered User
 
Join Date: May 2003
Posts: 44
Quote:
Originally Posted by mclingo View Post
Ah right, so there is nothing you can do on the TV itself to correct the scaling? Have you tried that since replacing the card?
Tried what since replacing the card?
Old 10th January 2019, 15:27   #54213  |  Link
mclingo
Registered User
 
Join Date: Aug 2016
Posts: 842
Have you looked into why your HTPC image isn't fitting on your TV correctly? Is it a driver issue on your PC, or does the TV have an odd quirk which means it cannot show the aspect ratio correctly? I'm guessing you've already cycled through all the aspect ratio options on the TV, as you've no doubt had it a number of years now.
__________________
OLED EF950-YAMAHA RX-V685-Win101809-4K 444 RGB 60hz-AMD RX580 19.9.2
KODI DS - MAD/LAV 92.14/0.74.1 - 3D / DIRECT3D11 / MADVR 10bit
Old 10th January 2019, 15:57   #54214  |  Link
mclingo
Registered User
 
Join Date: Aug 2016
Posts: 842
@madshi - I wonder if you could maybe create a benchmark tool for madVR; it would be interesting for people to properly compare card stats.
__________________
OLED EF950-YAMAHA RX-V685-Win101809-4K 444 RGB 60hz-AMD RX580 19.9.2
KODI DS - MAD/LAV 92.14/0.74.1 - 3D / DIRECT3D11 / MADVR 10bit
Old 10th January 2019, 16:26   #54215  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,122
Quote:
Originally Posted by huhn View Post
the brightness level of the none compressed parts match at gamma 2.2.

so if you want the correct brightness levels you set this screen is already calibrated to your screens response where unlike SDR madVR will automatic change your "gamma" for HDR sources to 2.2.

as i said before i'm not a friend of this inconsistent behaviour between HDR and SDR.
I know one user was getting what looked like correct HDR gamma tracking at 2.20 with madVR set to clipping. But I sent that person a black clipping pattern and they were clipping some black. So I'm not sure 2.20 "correctly" follows the PQ curve in all circumstances.

None of the SDR gamma curves has a response equal to the original PQ curve:

Image: Gamma 2.20 vs. PQ ST.2084

I would assume madVR would have to tell the display to output exactly the right signal level to get a PQ value from an SDR gamma curve, and use all kinds of dithering to fill in the extra shades of gray required at the low end of the curve.

When tone mapping, changing the target nits also radically changes the perceived gamma, so there is more to the gamma response than just choosing 2.20 or 2.40. It is more customizable than that. As long as you can see reference black, you can make the image darker or brighter as required to get a smooth transition from dark to light. The original tone curve will be compressed either way.

I think some users seem to be getting an accurate image with both 2.20 and 2.40. And there are some I know of who get black crush when madVR is set to 2.20. You have to test black clipping or you may not notice any crush.
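The shape difference between the two curves is easy to see numerically. A sketch comparing the two transfer functions (standard ST 2084 constants; the 120-nit peak for the SDR curve is just an example display):

```python
def pq_eotf(e: float) -> float:
    """SMPTE ST 2084 (PQ) EOTF: normalized code value -> absolute nits."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    p = e ** (1 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

def sdr_nits(v: float, gamma: float = 2.2, peak: float = 120.0) -> float:
    """Relative power-law gamma scaled to the display's peak luminance."""
    return peak * v ** gamma

# At half signal, PQ is absolute (~92 nits regardless of display) while the
# SDR curve is relative (~26 nits on a 120-nit display). The low end differs
# even more, which is where dithering would have to make up the missing codes.
print(round(pq_eotf(0.5), 1), round(sdr_nits(0.5), 1))
```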
Old 10th January 2019, 16:57   #54216  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,882
The thing is actually pretty simple: when madVR tone-maps to SDR it will map to 2.2, and there is nothing wrong with that; you have to choose a gamma. The problem starts when you are using gamma processing: HDR can go totally nuts trying to reach "some" gamma, and that's not a good thing. The whole gamma matching with the TV makes total sense, but one totally normal setting in it will break this even when properly used for SDR content. The reason is simple: madVR applies gamma correction at two different spots, differently for HDR and SDR, instead of doing it in one place.

So if you have a properly 3D-LUT-calibrated screen, it doesn't matter whether you switch between 2.4 and 2.2; it will not clip. That's not my problem here anyway. And even if your screen starts clipping for whatever reason, that doesn't mean the 2.2 target is incorrect.
Old 10th January 2019, 17:06   #54217  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,122
2.20 was simply too dark to watch, even if the display was set to 2.20. Near black detail matters a lot when watching HDR content because the image is being tone mapped. SDR content is currently mastered at 2.40 on the most popular mastering displays and then processed to 2.20 by the display. I found the same combination does still work with HDR, even if it isn't considered correct.
Old 10th January 2019, 17:17   #54218  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,882
Then you shouldn't have a problem with this:
Quote:
Move it to the color & gamma tab by adding an HDR-to-SDR gamma processing option that is ticked by default, set to pure power curve 2.20, with a warning not to change it. That makes sure you can still properly gamma-process SDR and don't ruin HDR with this option.
Lying to madVR about what your display is really calibrated to, just to get a different result for HDR, is not optimal.

Using a gamma processing option to get mathematically incorrect results is totally fine by me; it's the user's choice, even if it's just used for the same reason we use different gammas for SDR, for example to make dark parts more visible.
Old 10th January 2019, 17:35   #54219  |  Link
tp4tissue
Registered User
 
 
Join Date: May 2013
Posts: 406
Quote:
Originally Posted by Warner306 View Post
2.20 was simply too dark to watch, even if the display was set to 2.20. Near black detail matters a lot when watching HDR content because the image is being tone mapped. SDR content is currently mastered at 2.40 on the most popular mastering displays and then processed to 2.20 by the display. I found the same combination does still work with HDR, even if it isn't considered correct.
Could that not just be the display?

Which probe are you using? My i1D3 works well, but my Spyder5 messes up dark tones and causes crush.
__________________
Ghetto | 2500k 5Ghz

Last edited by tp4tissue; 10th January 2019 at 21:04.
Old 10th January 2019, 20:59   #54220  |  Link
ryrynz
Registered User
 
 
Join Date: Mar 2009
Posts: 3,214
Quote:
Originally Posted by mclingo View Post
@MADSHI - I wonder if you could maybe create a benchmark tool for MADVR, be interesting for people to properly compare card stats.
It's been asked many times over the years.