Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Old 11th March 2019, 15:12   #81  |  Link
chros
Registered User
 
 
Join Date: Mar 2002
Posts: 2,323
Quote:
Originally Posted by Manni View Post
So I'm back to 398.11, which indeed still passes through HDR metadata, using CRU 1.4.1 and custom res.
@Manni, do/did you have the same behaviour as this?
I always have to switch to 60Hz first for madVR to correctly recognise the custom 23p.
Thanks
__________________
Ryzen 5 2600,Asus Prime b450-Plus,16GB,MSI GTX 1060 Gaming X 6GB(v398.18),Win10 LTSC 1809,MPC-BEx64+LAV+MadVR,Yamaha RX-A870,LG OLED77G2(2160p@23/24/25/29/30/50/59/60Hz) | madvr config
chros is offline   Reply With Quote
Old 11th March 2019, 17:48   #82  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 942
Quote:
Originally Posted by chros View Post
@Manni, do/did you have the same behaviour as this?
I always have to switch to 60Hz first for madVR to correctly recognise the custom 23p.
Thanks
The reporting of 24p for the custom refresh rate instead of 23p in the NCP is a known bug, but it's cosmetic only.

I set the 23p custom rate (created by CRU) as my default, and I don't have the issue you mention.

MadVR switches to 60p (or other rates) if necessary, but I mostly play 23p content and my custom rate is always selected by default (now that I've put the DisplayID custom rate at the top of the list, thanks iSeries!).

But I don't use my HTPC for gaming or anything else, and I used CRU to create the custom refresh rate rather than madVR, because with recent drivers madVR-created custom rates don't work in 12 bits (8 bits is forced in the NCP).

So there are quite a few differences from what you're doing.
__________________
Win11 Pro x64 b23H2
Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33
madVR/LAV/jRiver/MyMovies/CMC
Denon X8500HA>HD Fury VRRoom>TCL 55C805K
Manni is offline   Reply With Quote
Old 13th March 2019, 17:41   #83  |  Link
ashlar42
Registered User
 
Join Date: Jun 2007
Posts: 655
So we're past nine months without new Nvidia drivers that correctly pass through HDR metadata? Am I getting this right? Nvidia really doesn't give a damn about people with HTPC/gaming setups...
ashlar42 is offline   Reply With Quote
Old 14th March 2019, 00:11   #84  |  Link
grendelrt
Registered User
 
Join Date: Sep 2017
Posts: 15
Quote:
Originally Posted by ashlar42 View Post
So we're past nine months without new Nvidia drivers that correctly pass through HDR metadata? Am I getting this right? Nvidia really doesn't give a damn about people with HTPC/gaming setups...
Madshi posted on the official GeForce forums that he had passed it on to a dev friend at Nvidia last month. Wonder if he ever got any confirmation back.
grendelrt is offline   Reply With Quote
Old 14th March 2019, 00:13   #85  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 942
Quote:
Originally Posted by grendelrt View Post
Madshi posted on the official GeForce forums that he had passed it on to a dev friend at Nvidia last month. Wonder if he ever got any confirmation back.
Nevcairiel said earlier in the thread that it was unlikely to be corrected before the next major branch, maybe in April.
__________________
Win11 Pro x64 b23H2
Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33
madVR/LAV/jRiver/MyMovies/CMC
Denon X8500HA>HD Fury VRRoom>TCL 55C805K
Manni is offline   Reply With Quote
Old 14th March 2019, 00:16   #86  |  Link
grendelrt
Registered User
 
Join Date: Sep 2017
Posts: 15
Quote:
Originally Posted by Manni View Post
Nevcairiel said above that it was unlikely to be corrected before the next major branch, maybe in April.
It also affects gaming, so I'm surprised it's taking so long to fix. Probably because most people don't pay attention to the incoming metadata (consoles don't even send any). I will cross my fingers super hard and hope they fix it.
grendelrt is offline   Reply With Quote
Old 14th March 2019, 01:39   #87  |  Link
SamuriHL
Registered User
 
 
Join Date: May 2004
Posts: 5,351
There will be a new branch in April, and it brings some pretty big changes: the end of Kepler support and no more 3D Vision support.

Probably should have added a source for that rather bold statement. LOL Sorry about that.

https://www.engadget.com/2019/03/11/...ision-support/
__________________
HTPC: Windows 11, AMD 5900X, RTX 3080, Pioneer Elite VSX-LX303, LG G2 77" OLED

Last edited by SamuriHL; 14th March 2019 at 01:44.
SamuriHL is offline   Reply With Quote
Old 14th March 2019, 05:21   #88  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,921
Don't forget that, to put it kindly, HDR movies are pretty unimportant to Nvidia; what matters to them is getting HDR games to look as "good" as they want and to work as well as possible.
huhn is offline   Reply With Quote
Old 14th March 2019, 19:39   #89  |  Link
SamuriHL
Registered User
 
 
Join Date: May 2004
Posts: 5,351
I realize we HTPC users are a small segment of their market, but after their crypto nonsense went bust they probably shouldn't completely discount us.
__________________
HTPC: Windows 11, AMD 5900X, RTX 3080, Pioneer Elite VSX-LX303, LG G2 77" OLED
SamuriHL is offline   Reply With Quote
Old 16th March 2019, 11:51   #90  |  Link
chros
Registered User
 
 
Join Date: Mar 2002
Posts: 2,323
Quote:
Originally Posted by chros View Post
@Manni, do/did you have the same behaviour as this?
Quote:
Originally Posted by iSeries View Post
You have to add a DisplayID extension block in CRU, and within that you can add a new detailed resolution with higher limits.
Quote:
Originally Posted by Manni View Post
I set the 23p custom rate (created by CRU) as my default, and I don't have the issue you mention.

MadVR switches to 60p (or other rates) if necessary, but I mostly play 23p content and my custom rate is always selected by default (now that I've put the DisplayID custom rate at the top of the list, thanks iSeries!).

But I don't use my HTPC for gaming or anything else, and I used CRU to create the custom refresh rate rather than madVR, because with recent drivers madVR-created custom rates don't work in 12 bits (8 bits is forced in the NCP).
Thank you, guys. CRU 1.4.1 with DisplayID also solved my above-mentioned issue. Here's what I did:
- wrote down the madVR timings, removed the madVR-created custom resolution, then restarted
- in CRU 1.4.1:
-- added a DisplayID extension block and set it as the 1st entry
-- entered the madVR timings and set the pixel clock and "Native" as well

It's a big help to get rid of a major annoyance: now I can set 23p as the default refresh rate of the TV.

A couple of notes:

The difference between CRU-created and Nvidia-created (by madVR) custom timings:
- Nvidia adds a new timing alongside the already existing ones, which can confuse applications (e.g. madVR) about which one to use
- CRU replaces an existing timing with the custom one at the OS level, so there's no confusion

When you create custom timings with madVR:
- always start from the default (EDID) timing and pay special attention to the "sync" attributes (e.g. +, +):
-- don't select a mode with different sync polarities!
- don't deviate much from the default EDID values
-- remember, this approach modifies only the "back porch" (horizontal and vertical) values!
- getting either of these two settings wrong can easily change the response of the panel (e.g. a modified gamma curve, black crush, etc.)
- if you get >= 10 hours after the first optimisation run, that's already really good; you don't need to do more runs

Examples. Original EDID timing:
Code:
EDID/CTA:
        front   sync    back    blank   active  total   pol
hor     1276    88      296     1660    3840    5500    +
ver     8       10      72      90      2160    2250    +
pixel clock: 296.70 MHz  ->  296.70e6 / (5500 * 2250) = 23.9757575757576 Hz
OSD result: 23.97791Hz (~4.41 minutes until a dropped/repeated frame)
Result after the first optimisation run:
Code:
Custom:
        front   sync    back    blank   active  total   pol
hor     1276    88      356     1720    3840    5560    +
ver     8       10      72      90      2160    2250    +
pixel clock: 299.94 MHz  ->  299.94e6 / (5560 * 2250) = 23.9760191846523 Hz
OSD result: 23.97537Hz (~22 hours - 1 day until a dropped/repeated frame)
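To make the arithmetic behind these tables explicit: the refresh rate is just pixel clock / (horizontal total × vertical total), and the time until a frame has to be dropped or repeated is roughly 1 / |display rate − content rate|. A small sketch (Python; function names are mine, the constants are the totals and pixel clocks from the tables above):

```python
from fractions import Fraction

def refresh_hz(pixel_clock_hz: float, h_total: int, v_total: int) -> float:
    """Vertical refresh rate implied by a video timing: pclk / (htotal * vtotal)."""
    return pixel_clock_hz / (h_total * v_total)

def seconds_per_slip(display_hz: float, content_hz: float) -> float:
    """Approximate time until one frame must be dropped or repeated."""
    return 1.0 / abs(display_hz - content_hz)

content = float(Fraction(24000, 1001))        # the "23p" film rate, ~23.976 Hz

edid   = refresh_hz(296.70e6, 5500, 2250)     # original EDID timing above
custom = refresh_hz(299.94e6, 5560, 2250)     # CRU timing after optimisation

print(f"EDID:   {edid:.10f} Hz, one slip every {seconds_per_slip(edid, content) / 60:.1f} min")
print(f"custom: {custom:.10f} Hz, one slip every {seconds_per_slip(custom, content) / 3600:.1f} h")
```

The OSD-measured rates differ slightly from these nominal values because the GPU's actual pixel clock deviates a little from what the timing requests; that drift is what the optimisation runs are chasing.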

About Nvidia driver versions:
- some of them can block custom timings completely (no matter how they were created)
- so always check in the madVR OSD whether your timing is applied after installing a new driver

Btw, I still use the Display Changer II util to easily manage 2 displays.
__________________
Ryzen 5 2600,Asus Prime b450-Plus,16GB,MSI GTX 1060 Gaming X 6GB(v398.18),Win10 LTSC 1809,MPC-BEx64+LAV+MadVR,Yamaha RX-A870,LG OLED77G2(2160p@23/24/25/29/30/50/59/60Hz) | madvr config

Last edited by chros; 16th August 2019 at 11:28.
chros is offline   Reply With Quote
Old 16th March 2019, 13:21   #91  |  Link
KoD
Registered User
 
Join Date: Mar 2006
Posts: 567
398.82 is the first driver in a very, very long time where they fixed the Dolby Vision switch on LG OLEDs with firmware newer than 4.70.x. The driver release notes say so, and I've confirmed it in Mass Effect: Andromeda: this is the first driver version in a long time where Dolby Vision works again. So, for some gamers, using anything older (like 398.11) is not a good choice. And 398.11 was, in fact, the previously released driver version.

As a side note, it would be nice if madshi would enable switching on Dolby Vision in the renderer for nVidia cards. The DV metadata is preserved in mp4 and mkv files, and the x265 encoder can now store DV metadata in the stream as well. There's documentation from nVidia on how to do that using the NvAPI, as in the slides from this GDC presentation: slides, video. I don't know if there is a similar documented way for AMD graphics cards, other than contacting Dolby and asking for their Dolby Vision gaming SDK (they have a Dolby Developer site).

Btw, I recommend anyone interested to have a look at the video presentation and the slides: it is a nice overview of what HDR is, what color spaces and tone mapping are, how programmers should configure the backbuffer and the swap chains, and what data to use in order to have HDR content displayed. It also shows what the rendering pipeline is in the nVidia driver (have you noticed that the nvidia driver does dithering sometimes? Well, you can see here where it happens). The nVidia HDR whitepaper that can be downloaded from this page is also a great introduction to all this.



@Manni:
I do have one question though: have you tried to see what the HDR metadata values are when you connect an LCD monitor to your graphics card instead of your projector? The driver takes into account the capabilities of the display as reported by EDID. Digital cinema projectors have a max luminance of 48 nits, so the 20 nits average frame luminance value you noticed seems suitable for a projector, and might in fact be the value reported by your projector's EDID. If that's the case, what we see might just be a workaround in the driver to avoid displays misbehaving when presented with values outside their supported range, as displays are supposed to tonemap themselves according to these values and their own capabilities. I agree that's not "passthrough" of the metadata, but it might make the display device not apply any of its custom tone mapping up to that average frame luminance, so it displays the video as it was tonemapped by the source up to that luminance value: a passthrough of the source video through the display device, so to say.
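For what it's worth, a display's HDR luminance limits are advertised in the CTA-861.3 HDR static metadata block of its EDID as code values, not directly in nits. A minimal decoder sketch, assuming the CTA-861.3 formulas (the example code values below are hypothetical, not from any particular display):

```python
def max_luminance_nits(cv: int) -> float:
    """CTA-861.3 'Desired Content Max Luminance' code value -> cd/m^2.
    A code value of 0 means the display does not indicate a value."""
    return 50.0 * 2.0 ** (cv / 32.0)

def min_luminance_nits(cv_min: int, max_nits: float) -> float:
    """CTA-861.3 'Desired Content Min Luminance' code value -> cd/m^2
    (relative to the decoded max luminance)."""
    return max_nits * (cv_min / 255.0) ** 2 / 100.0

# Hypothetical display: code value 96 -> 400 nits peak, and with code
# value 90 for the minimum that gives a black floor of ~0.5 nits.
peak = max_luminance_nits(96)
floor = min_luminance_nits(90, peak)
```

If the driver were clamping its outgoing metadata to these EDID-derived limits, that would be one way to end up with values that no longer match the title's mastering metadata.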

Last edited by KoD; 16th March 2019 at 14:21. Reason: additional notes
KoD is offline   Reply With Quote
Old 16th March 2019, 13:35   #92  |  Link
j82k
Registered User
 
Join Date: Jun 2017
Posts: 155
I knew about the mp4 Dolby Vision remuxes (which work when fed directly to the TV), but I was under the impression that it wasn't possible within an mkv container.
But anyway, if DV playback could be made to work with madVR, that would really be amazing.
j82k is offline   Reply With Quote
Old 16th March 2019, 13:44   #93  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,346
MKV definitely doesn't store out-of-band Dolby Vision metadata. If it's part of the video stream itself, it would work with any container, since remuxes are not supposed to take anything out.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
nevcairiel is offline   Reply With Quote
Old 16th March 2019, 14:12   #94  |  Link
j82k
Registered User
 
Join Date: Jun 2017
Posts: 155
So, theoretically, could at least the DV mp4 remuxes made with mp4muxer (https://github.com/DolbyLaboratories...ree/master/bin) be made playable from a PC with madVR?
The DV metadata is a separate video stream within the mp4 file.
What exactly needs to happen to make this possible? Would this be LAV's or madVR's job?
j82k is offline   Reply With Quote
Old 16th March 2019, 18:02   #95  |  Link
SamuriHL
Registered User
 
 
Join Date: May 2004
Posts: 5,351
It'd be both: LAV needs to know how to decode it, and madVR needs to know how to render it. In theory, I think I saw somewhere that nVidia added preliminary Dolby Vision support to their API, but I wouldn't get all excited about that; it doesn't mean we're anywhere even close.
__________________
HTPC: Windows 11, AMD 5900X, RTX 3080, Pioneer Elite VSX-LX303, LG G2 77" OLED
SamuriHL is offline   Reply With Quote
Old 16th March 2019, 19:59   #96  |  Link
chros
Registered User
 
 
Join Date: Mar 2002
Posts: 2,323
Quote:
Originally Posted by Manni View Post
So I'm back to 398.11
I wanted to try out this version, but I couldn't create a custom resolution with it (not in madVR, not in CRU); I always got ~12 minutes for 23p.
So I'm back on 385.28 and all is good, with ~13 hours for 23p (using CRU).

Quote:
Originally Posted by ChaosKing View Post
Here's also a good tool if you don't want to install too much of the crap that is bundled nowadays in nvidia drivers, such as telemetry: https://www.techpowerup.com/forums/t...-alpha.249085/
Remove your old driver with DDU first
Thanks for this. I also used the NVCleanstall util to install both drivers and selected the Recommended option (only the display/audio drivers and PhysX were selected, and MSI Afterburner works fine for underclocking/undervolting).
__________________
Ryzen 5 2600,Asus Prime b450-Plus,16GB,MSI GTX 1060 Gaming X 6GB(v398.18),Win10 LTSC 1809,MPC-BEx64+LAV+MadVR,Yamaha RX-A870,LG OLED77G2(2160p@23/24/25/29/30/50/59/60Hz) | madvr config

Last edited by chros; 17th March 2019 at 11:43.
chros is offline   Reply With Quote
Old 16th March 2019, 20:02   #97  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 942
Quote:
Originally Posted by KoD View Post
@Manni:
I do have one question though: have you tried to see what the HDR metadata values are when you connect a LCD monitor to your graphics card instead of your projector? Because the driver takes into account the capabilities of the display as reported by EDID. Digital cinema projectors have a max luminance of 48 nits, so that 20 nits average frame luminance value you noticed seems suitable for a projector, and might be in fact the value reported by the EDID of your projector.
This isn't the case, for a few reasons:
1) I have an HD Fury Maestro in the chain that reports the same full-capability EDID to all sources (HDR, BT2020, 600MHz bandwidth, full sound, etc.) and then deals with whatever might need to be done to comply with the different displays connected. So it doesn't matter one bit which display is actually connected. And both my RS2000 and my LG 4K monitor can take the full, unchanged content without requiring any downscaling or conversion.
2) This metadata isn't supposed to be changed in passthrough mode. It's not the business of the source to change the metadata, at least when playing UHD content. Remember that I get the *same* bogus metadata irrespective of the *known* metadata for each title.
3) Digital cinema projectors, especially consumer ones, are not limited to 48 nits. My RS2000 can go up to 200-250 nits and more depending on settings and screen, and its EDID is certainly not reporting a 48-nit max. Even modern digital cinemas go above this, which is the SDR reference white: Dolby Cinema uses a peak of 107 nits with HDR titles, for example.

So for all these reasons (and a few others), I can confirm that the metadata sent in passthrough by the recent drivers is simply bogus. It isn't what it should be, especially when playing UHD Blu-ray content, where the original mastering metadata should *always* be reported, as it's not the business of the source to know about the limitations of the display if it's not doing the tonemapping.
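For context, the static metadata under discussion is only ten numbers: the SMPTE ST 2086 mastering display colour volume plus MaxCLL/MaxFALL from CTA-861.3, and a passthrough renderer should forward all of them untouched. A sketch of the structure (Python; the sample values are merely typical of a UHD Blu-ray, not taken from any specific title):

```python
from dataclasses import dataclass

@dataclass
class HDR10StaticMetadata:
    # SMPTE ST 2086 mastering display colour volume
    primaries: tuple           # R/G/B chromaticity (x, y) pairs
    white_point: tuple         # (x, y)
    max_mastering_nits: float  # peak of the mastering display
    min_mastering_nits: float  # black floor of the mastering display
    # CTA-861.3 content light levels
    max_cll: int               # brightest single pixel in the title
    max_fall: int              # highest frame-average light level

# Illustrative UHD Blu-ray values: P3 primaries inside a BT.2020 container,
# 1000-nit mastering peak, and per-title MaxCLL/MaxFALL.
meta = HDR10StaticMetadata(
    primaries=((0.680, 0.320), (0.265, 0.690), (0.150, 0.060)),  # DCI-P3
    white_point=(0.3127, 0.3290),                                # D65
    max_mastering_nits=1000.0,
    min_mastering_nits=0.0001,
    max_cll=752,
    max_fall=143,
)
```

Since MaxCLL/MaxFALL vary per title, a driver that sends the same values for every disc is demonstrably not passing the stream's metadata through.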
__________________
Win11 Pro x64 b23H2
Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33
madVR/LAV/jRiver/MyMovies/CMC
Denon X8500HA>HD Fury VRRoom>TCL 55C805K

Last edited by Manni; 16th March 2019 at 20:04.
Manni is offline   Reply With Quote
Old 16th March 2019, 20:07   #98  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 942
Quote:
Originally Posted by chros View Post
I wanted to try out this version but I couldn't create custom resolution with it (not in madvr, not in cru), I always got ~12 minutes for 23p.
Are you sure that the custom resolution is actually applied and that the display is active both in CRU and in madVR?

Otherwise it can be a limitation of your display.

After our group debugging session here a few days ago, 398.11 works just as well with CRU as any other version.

Did you try to export your CRU config valid in 385.85 and import it in 398.11, to make sure you rule out any user error in recreating the refresh rate?
__________________
Win11 Pro x64 b23H2
Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33
madVR/LAV/jRiver/MyMovies/CMC
Denon X8500HA>HD Fury VRRoom>TCL 55C805K
Manni is offline   Reply With Quote
Old 17th March 2019, 11:50   #99  |  Link
chros
Registered User
 
 
Join Date: Mar 2002
Posts: 2,323
Quote:
Originally Posted by Manni View Post
Are you sure that the custom resolution is actually applied and that the display is active both in CRU and in madVR?

After our group debugging session here a few days ago, 398.11 works just as well with CRU as any other version.

Did you try to export your CRU config valid in 385.85 and import it in 398.11, to make sure you rule out any user error in recreating the refresh rate?
Thanks Manni. I can't claim it wasn't a user error, but it worked straight away with v385.28 both times...
Anyway, since there's no performance difference between the two driver versions and I'm not affected by any of the known issues (HDR passthrough, 3D, etc.), I'll stick with v385.28 until I change something in my system.

I also created the following custom resolutions for my BenQ GL2450HM 1080p monitor that only exposes 60Hz (in CRU with madvr's help):
1080p47, 1080p48, 1080p50, 1080p59, 1080p60
__________________
Ryzen 5 2600,Asus Prime b450-Plus,16GB,MSI GTX 1060 Gaming X 6GB(v398.18),Win10 LTSC 1809,MPC-BEx64+LAV+MadVR,Yamaha RX-A870,LG OLED77G2(2160p@23/24/25/29/30/50/59/60Hz) | madvr config
chros is offline   Reply With Quote
Old 17th March 2019, 20:29   #100  |  Link
KoD
Registered User
 
Join Date: Mar 2006
Posts: 567
Quote:
Originally Posted by Manni View Post
This isn't the case, for a few reasons:
1) I have an HD Fury Maestro in the chain that reports the same full-capability EDID to all sources (HDR, BT2020, 600MHz bandwidth, full sound, etc.) and then deals with whatever might need to be done to comply with the different displays connected. So it doesn't matter one bit which display is actually connected. And both my RS2000 and my LG 4K monitor can take the full, unchanged content without requiring any downscaling or conversion.
2) This metadata isn't supposed to be changed in passthrough mode. It's not the business of the source to change the metadata, at least when playing UHD content. Remember that I get the *same* bogus metadata irrespective of the *known* metadata for each title.
3) Digital cinema projectors, especially consumer ones, are not limited to 48 nits. My RS2000 can go up to 200-250 nits and more depending on settings and screen, and its EDID is certainly not reporting a 48-nit max. Even modern digital cinemas go above this, which is the SDR reference white: Dolby Cinema uses a peak of 107 nits with HDR titles, for example.

So for all these reasons (and a few others), I can confirm that the metadata sent in passthrough by the recent drivers is simply bogus. It isn't what it should be, especially when playing UHD Blu-ray content, where the original mastering metadata should *always* be reported, as it's not the business of the source to know about the limitations of the display if it's not doing the tonemapping.
I see. The 48 nits value for Digital Cinema was from the nVidia presentation, btw. I do realize that home projectors are capable of more, though.

Then it's either a bug, or working as intended. Initially, HDR was only supported in Windows in fullscreen exclusive mode (where it makes sense to pass the application-set min/max/avg values to the display), but Microsoft then added support for windowed mode as well, both borderless and with borders. This means one application window created for SDR content looks as it did before, while another application window on the same display can show wide-gamut content with high luminosity. To do that, the compositing engine has to work in wide-gamut mode all the time, for everything in Windows.

What this means: if one application says "oh, my min/max/avg luminance is <this>", that's not information that should be passed straight to the display, because the other, SDR application you have on screen at the same time must be rendered properly too.

It could be that Windows decides which min/max/avg luminance values will be used for the tonemapping of the entire Windows desktop, and that's what gets reported to the display. The video player's advertised values would then only be used to render the video player's output properly into this shared desktop backbuffer.
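A simplified sketch of that idea: on an HDR desktop, Windows composites everything in a linear scRGB space where a value of 1.0 corresponds to 80 nits, and SDR windows are rescaled by the user's SDR brightness setting before blending. This is only an illustration of the principle, not the actual DWM code:

```python
def srgb_to_linear(c: float) -> float:
    """Inverse sRGB transfer function for a single channel in [0, 1]."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def sdr_channel_to_scrgb(c: float, sdr_white_nits: float = 200.0) -> float:
    """Place an SDR channel into the scRGB composition space, where
    1.0 corresponds to 80 nits. sdr_white_nits stands in for the
    'SDR content brightness' slider in Windows HD Color settings."""
    return srgb_to_linear(c) * (sdr_white_nits / 80.0)

# Full-brightness SDR white with the slider at 200 nits lands at scRGB 2.5,
# i.e. 200 nits in the shared desktop surface.
white = sdr_channel_to_scrgb(1.0, 200.0)
```

So an SDR window never carries its own metadata to the display; it is simply mapped into the shared HDR composition surface, which is consistent with the driver reporting desktop-wide values rather than the player's.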

Seen like this, the "passthrough" label in madVR is a bit misleading, because madVR does not control what the compositing engine does. Its meaning is more like "madVR will not tonemap the material itself, and will specify the stream's HDR metadata values for the swap chain"; Windows then decides what happens with this.



Regarding Dolby Vision, there's clearly support in the graphics drivers for sending the metadata to displays (TVs, as I don't know of any monitor supporting DV), otherwise all the games that make use of it would not work. It's possible, however, that the DV metadata during a gaming session is sent to the display only by means of a Dolby-supplied library, for which a license is required. The code in the slides I posted shows only how to switch the display to DV mode and how to configure it initially; it does not show how to send DV metadata while the game is running.

Dolby has a Dolby Developer website, as I was saying. Its whitepaper says the metadata and the video can be in a single stream, but there's a two-stream approach as well. It also says they have Dolby Vision decoders and display managers for PCs, mobile devices, game consoles, etc. So they clearly have a way to display this content on PCs. And if you make an account on their Dev site to access the Dolby Vision FAQ, a contact is given for requesting access to the SDK and an example to try out. I would be surprised if the next CyberLink PowerDVD version, which should be announced in a month or two, did not have DV support.

Last edited by KoD; 17th March 2019 at 20:54.
KoD is offline   Reply With Quote