Old 17th January 2018, 11:58   #48401  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 942
@Madshi:

One idea: is there any way to have madVR report SDR BT.2020 in the HDMI stream when doing a conversion? If that were the case, it would allow the Vertex to pick it up and select the correct baseline/calibration in the PJ, without having to deal with profiles/batch files. This is what happens, for example, if I set the UB900 to do the HDR-to-SDR conversion: SDR BT.2020 is detected and the correct calibration is called. If we can also apply madVR's 3D LUT, all the issues are solved in one go, at least for Vertex users. Thanks!
__________________
Win11 Pro x64 b23H2
Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33
madVR/LAV/jRiver/MyMovies/CMC
Denon X8500HA>HD Fury VRRoom>TCL 55C805K
Manni is offline   Reply With Quote
Old 17th January 2018, 13:21   #48402  |  Link
-Hitman-
Registered User
 
Join Date: Jun 2009
Posts: 18
Quote:
Originally Posted by madshi View Post
Uh, that's weird! And is the raw data of all those identifications identical? Try deleting *all* of them. Does that stop the madness?
Yes, the data under each ident is identical and reports the Denon AVR every time.

I do recall this also happening with my old Intel 6700 machine; I had to revert to an older madVR version to resolve it (possibly the version in which HDR support was first added), and that worked fine.

I may have already tried deleting the idents on the old machine to get it working, but every time my Denon/TV display was used a new ident appeared again.

I'll try deleting them again tonight on this machine and report back.

Also, I might add that the Dell display is on a DisplayPort-to-HDMI dongle; could that be contributing to the issue?

I need to use the converter as I only have one DisplayPort output and one HDMI output.

Thanks.
-Hitman- is offline   Reply With Quote
Old 17th January 2018, 13:48   #48403  |  Link
Cinemancave
Registered User
 
Join Date: Dec 2012
Posts: 33
Quote:
Originally Posted by Manni View Post
I just wanted to say "well done!"; the result is significantly better than what I've been able to get with any custom curves, especially regarding saturation in shadow detail and highlights.
I have a similar setup with an X7000 and a Vertex in the chain. So are you saying that, for us JVC owners, it's better to let madVR do the HDR-to-SDR conversion for all regular UHD HDR content and then apply a BT.2020 SDR config in the JVC, instead of using one of the custom curves created by Arve's tool? Should I use madVR's regular settings for the conversion or modify them in any way?
Cinemancave is offline   Reply With Quote
Old 17th January 2018, 16:29   #48404  |  Link
mclingo
Registered User
 
Join Date: Aug 2016
Posts: 1,348
Silly question time: for those with crappy GPUs, is there any way the rendering load can be shared between the GPU and the CPU? Currently, when playing a movie, my GPU is running at 90% and my i7 CPU below 10%.
mclingo is offline   Reply With Quote
Old 17th January 2018, 17:05   #48405  |  Link
Siso
Soul Seeker
 
Siso's Avatar
 
Join Date: Sep 2013
Posts: 716
@madshi

In a case where the EDID of the monitor is not detected correctly (pic: http://imgbox.com/0Ba1cnK7), is it OK if I manually choose 8 bit, which is my monitor's depth? "Auto" seems odd to me.

Last edited by Siso; 17th January 2018 at 17:13.
Siso is offline   Reply With Quote
Old 17th January 2018, 17:42   #48406  |  Link
LUCi5R
Registered User
 
LUCi5R's Avatar
 
Join Date: Jan 2008
Location: The Valley, Southern California
Posts: 20
Quote:
Originally Posted by mytbyte View Post
OK, so GT 1030 does have the VP9 decoder but no profile 2...I'll live without the latter...thx guyz

P.S. just found this screenshot posted way earlier on this forum: http://jpegshare.net/d2/3a/d23a630cf...5056c.jpg.html

seems like profile 2 is covered as well... seems like quite a little beast
Hi there,

I was reading some of your posts and I was wondering if you could help. I've come across posts and people who had difficulty playing back 2160p media using the madVR/MPC-BE combo with a GT 1030 GPU, and then I've come across others who are able to achieve flawless playback using the same.

I don't believe the GT 1030 is "incapable" of UHD HDR (2160p) playback using madVR/MPC-BE, otherwise no one would be able to do it; there must be some golden settings that achieve this.

Unfortunately, I'm experiencing terrible lag/frame-rate issues that make the media practically unplayable/unwatchable.

I'm not doing any upscaling/downscaling, just simple 1:1 2160p playback of UHD HDR media on a 4K HDR display.

The HTPC has an i5-2600K CPU OC'd to 4GHz, 8GB of DDR4 RAM, and a GT 1030 GPU.

Is it possible for you to share your madVR settings or offer any advice?

Thanks!
LUCi5R is offline   Reply With Quote
Old 17th January 2018, 17:52   #48407  |  Link
Clammerz
Registered User
 
Join Date: Aug 2005
Posts: 54
Quote:
Originally Posted by mclingo View Post
is there any way the rendering load can be shared between the GPU and the CPU
No, as that would defeat the purpose of using the hardware that's built for graphical number crunching. Your GPU is leagues ahead of what your CPU can do with graphics.

Quote:
Originally Posted by Manni View Post
A user has posted how to do it in Python in the JVC thread, but I don't know how to call a python script from a batch file.
Code:
python <filename>.py
Or, if you can't use a physical file for some reason:
Code:
python -c "<insert code here>"
(You'll need to escape quotes or use a different quote style in the code.)
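
For madVR's profile "external command", a wrapper batch file could be as simple as this. Just a sketch: the script path and file name below are placeholders, not something posted in this thread.
Code:
@echo off
rem Hypothetical wrapper that madVR's external command could call.
rem Adjust the path to wherever the JVC control python script actually lives.
python "C:\Scripts\jvc_calibration.py"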
Clammerz is offline   Reply With Quote
Old 17th January 2018, 18:10   #48408  |  Link
clsid
*****
 
Join Date: Feb 2005
Posts: 5,647
Quote:
Originally Posted by madshi View Post
Hmmmm... Interesting idea. What is the motivation for such a new option? Wouldn't you get the same result by creating a simple profile based on video resolution (e.g. disable RCA for 4K content, but enable it for lower res content)?
I would like to avoid using profiles just for something like this. There is a big performance difference between regular RCA and the integrated one, which is not something that is obvious to most users. I want to be able to switch to a different NGU variant without having to disable (integrated) RCA.

Quote:
That should already be the case! The only exception is if you have a "?" device, in that case the settings dialog always jumps there, to motivate you to tell madVR which display type it is.
You are right. But this is certainly not obvious. Perhaps you could add a red warning text on that page that says something like "please select the correct device type".
clsid is offline   Reply With Quote
Old 17th January 2018, 18:44   #48409  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
Quote:
Originally Posted by Manni View Post
Thanks, but isn't this assuming the SAME profile/calibration is used in the display for all the 3D LUTs? As I can't do this, I have to use the custom profile external command to enable the correct calibration in the display, so that I can have iris open / high lamp / filter for HDR (SDR BT 2020) and iris closed / low lamp / no filter for SDR, correct?
I'm not sure I understand your problems with the madVR settings. Just expect everything to work the way you'd want it to, and you've probably understood how it works.

Usually, HDR content is BT.2020, and SDR content is BT.709. So it should be easy for you to create a conventional SDR calibration 3DLUT for BT.709 SDR content, as you always did. For HDR content, you can switch your projector into a different mode, open the iris etc and then create a new calibration 3DLUT for that, based on madVR's HDR -> SDR pixel shader math. You would do that by creating an *SDR* calibration 3DLUT for BT.2020 with Calman/LightSpace/ArgyllCMS, with the iris fully open etc. Of course the 3DLUT needs to be made for SDR, because madVR converts the HDR content to SDR before feeding it into the 3DLUT. This way the BT.709 3DLUT would be for iris closed and the BT.2020 3DLUT would be for iris open.

If you want to support SDR BT.2020 content with a closed iris, too, then even that is easily possible. Just make 2 calibration profiles, one for open iris and one for closed iris, auto select the open iris for HDR content and the closed iris for SDR content, then create proper SDR calibration 3DLUTs for BT.709 and BT.2020 with open *and* closed iris (so 4 overall) and place them in the right slots in those 2 profiles. Done. I'm not sure where you see the problems here?
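
As a rough sketch, the auto select rule for those 2 calibration profiles could look something like this (the profile names are placeholders, and this assumes your madVR build exposes the HDR variable in the profile rules):
Code:
if (HDR) "iris open (HDR / BT.2020)"
else "iris closed (SDR)"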

Quote:
Originally Posted by Manni View Post
is there any way to have madVR report SDR BT.2020 in the HDMI stream when doing a conversion?
The OS doesn't give me control over that. Maybe it would be possible with private GPU APIs, but I've not looked into that yet.

Quote:
Originally Posted by -Hitman- View Post
Also, I might add that the Dell display is on a DisplayPort-to-HDMI dongle; could that be contributing to the issue?
I don't see why that would be an issue.

If the problem is still there after you deleted all the identification cards, could you try to find out which exact madVR build introduced the problem? You can download all older builds here:

http://www.videohelp.com/tools/madVR...sions#download

Quote:
Originally Posted by Siso View Post
In a case where the EDID of the monitor is not detected correctly (pic: http://imgbox.com/0Ba1cnK7), is it OK if I manually choose 8 bit, which is my monitor's depth? "Auto" seems odd to me.
I'm not sure why the detection fails; it should work. But anyway, of course you can manually choose whatever value you want. In a case like this, "auto" should default to 8 bit anyway, though.

Quote:
Originally Posted by LUCi5R View Post
I was reading some of your posts and I was wondering if you could help. I've come across posts and people who had difficulty playing back 2160p media using the madVR/MPC-BE combo with a GT 1030 GPU, and then I've come across others who are able to achieve flawless playback using the same.

I don't believe the GT 1030 is "incapable" of UHD HDR (2160p) playback using madVR/MPC-BE, otherwise no one would be able to do it; there must be some golden settings that achieve this.

Unfortunately, I'm experiencing terrible lag/frame-rate issues that make the media practically unplayable/unwatchable.

I'm not doing any upscaling/downscaling, just simple 1:1 2160p playback of UHD HDR media on a 4K HDR display.

The HTPC has an i5-2600K CPU OC'd to 4GHz, 8GB of DDR4 RAM, and a GT 1030 GPU.
First of all you need to get decoding right. Decoding HEVC is very demanding, so your CPU most likely won't be able to handle it. You need to let the GPU do that. So in LAV Video Decoder pick either DXVA Copyback, DXVA Native or (with a newer LAV Nightly build) D3D11 decoding. Then in madVR press Ctrl+J and check the decoder queue during playback. It should be nearly full during playback.

Once you've got the decoder sorted out, next is rendering. You could start by restoring madVR default settings, then activate all the "trade quality for performance" settings. If playback is already smooth that way, you can uncheck the "trade quality" options one by one, starting at the bottom, moving up, until playback starts to judder again.

Quote:
Originally Posted by clsid View Post
I would like to avoid using profiles just for something like this. There is a big performance difference between regular RCA and the integrated one, which is not something that is obvious to most users. I want to be able to switch to a different NGU variant without having to disable (integrated) RCA.
I fully understand. For me a big problem is that there are a LOT of settings where I could add additional options to avoid the need to use profiles. But adding more and more options always makes things more complicated for the average user. Of course using profiles is even more complicated for the average user, though. So it's not easy to decide which options to add and which not.

Any opinions about this, anyone? Would it be a good idea to offer a "use RCA only if it comes for free" option?

Quote:
Originally Posted by clsid View Post
You are right. But this is certainly not obvious. Perhaps you could add a red warning text on that page that says something like "please select the correct device type".
Haha, maybe I should. I thought users experienced with the device manager (? devices are BAD) would know what to do...
madshi is offline   Reply With Quote
Old 17th January 2018, 18:46   #48410  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
Anybody interested in an XySubFilter bugfix for external SRT subtitles with italic text in them?

http://madshi.net/XySubFilter746SrtFix.zip

This is not well tested, and it's a totally unofficial build, so please use it at your own risk. If you do decide to give it a try, let me know if the "move subtitles" option now works properly in all situations.

I've asked cyberbeing to create a new official XySubFilter build with my fix, but it'd be great if some users would test it first.
madshi is offline   Reply With Quote
Old 17th January 2018, 18:52   #48411  |  Link
clsid
*****
 
Join Date: Feb 2005
Posts: 5,647
You could make it a tri-state checkbox. Semi-checked would be NGU only.

I don't usually customize any of the device settings, so I assumed leaving it at "unknown" would not have any negative effects and would just use the same defaults.
clsid is offline   Reply With Quote
Old 17th January 2018, 19:02   #48412  |  Link
brazen1
Registered User
 
Join Date: Oct 2017
Posts: 331
Any opinions about this, anyone? Would it be a good idea to offer a "use RCA only if it comes for free" option?

I think so. Those with strong GPUs and a good understanding of the setting would probably prefer it left as a selectable option, though, otherwise they can't utilize the full potential of their GPU (unless the option were included with ticking RCA on or off as it is now, like a sub-option under it: ticking enable would mean 'only for free' is no longer greyed out).
On the other hand, those with weaker GPUs would be doing more harm than good, so enabling your 'only for free' option is a no-brainer IMO. I have a weak GPU; I'd use it.
__________________
HOW TO-Kodi 2D-3D-UHD (4k) HDR Guide Internal & External Players
W11 Pro 24H2 GTX960-4GB RGB 4:4:4 @Matched Refresh Rates 8,10,12bit
KODI 22 MPC-HC/BE 82" Q90R Denon S720W
brazen1 is offline   Reply With Quote
Old 17th January 2018, 19:36   #48413  |  Link
LUCi5R
Registered User
 
LUCi5R's Avatar
 
Join Date: Jan 2008
Location: The Valley, Southern California
Posts: 20
Quote:
Originally Posted by madshi View Post
First of all you need to get decoding right. Decoding HEVC is very demanding, so your CPU most likely won't be able to handle it. You need to let the GPU do that. So in LAV Video Decoder pick either DXVA Copyback, DXVA Native or (with a newer LAV Nightly build) D3D11 decoding. Then in madVR press Ctrl+J and check the decoder queue during playback. It should be nearly full during playback.

Once you've got the decoder sorted out, next is rendering. You could start by restoring madVR default settings, then activate all the "trade quality for performance" settings. If playback is already smooth that way, you can uncheck the "trade quality" options one by one, starting at the bottom, moving up, until playback starts to judder again.
Super thanks for the advice!! I'll go ahead and set this up and do some testing. Will report back.

Thanks again.
LUCi5R is offline   Reply With Quote
Old 17th January 2018, 19:52   #48414  |  Link
Neo-XP
Registered User
 
Neo-XP's Avatar
 
Join Date: Mar 2016
Location: Switzerland
Posts: 140
Quote:
Originally Posted by madshi View Post
Any opinions about this, anyone? Would it be a good idea to offer a "use RCA only if it comes for free" option?
Yes, because right now it is very complicated for new users to know when RCA comes for free.

To quote you:

Quote:
Originally Posted by madshi View Post
A few details about how RCA/NGU fusion works:
1) Both RCA and NGU can run on just the luma channel or just the chroma channels or both. madVR automatically runs a combined algo if your settings allow it. It's possible that only luma "RCA + NGU" will be fused, or only chroma, or both, depending on your settings.
2) Currently fused RCA+NGU is only supported for NGU Sharp, but not for NGU AA, NGU Standard or NGU Soft.
3) If you have a 4:2:0 or 4:2:2 source and you activate "NGU Sharp" for chroma upscaling, and if you also activate RCA + "process chroma channels, too", then the chroma RCA will run as part of the chroma NGU Sharp upscaling. But if you select any other chroma upscaler, there's no chance for a fused NGU Sharp + RCA, so in that case RCA will still be performed separately for the chroma channels, followed by the chroma upscaler you've selected.
4) If you activate "NGU Sharp" for Image Upscaling/Doubling, then RCA for luma will run as part of NGU Sharp. But if you select a different Image Upscaling algo, or if image upscaling isn't performed at all, then RCA will be run separately.
5) If you select RCA "medium" quality, NGU Sharp quality is automatically upgraded to at least "high" quality. If you select RCA "high" quality, NGU Sharp is automatically upgraded to "very high" quality. It should still be faster overall than separate RCA + NGU Sharp Low/Medium processing.
It would be a quick way to ensure that you do not get a massive drop in performance if you do not use profiles and do not know exactly all the conditions under which RCA comes for free.

This new option would of course be useless for users who already know all this, but useful to some others, I guess.

Last edited by Neo-XP; 17th January 2018 at 20:02.
Neo-XP is offline   Reply With Quote
Old 17th January 2018, 20:08   #48415  |  Link
-Hitman-
Registered User
 
Join Date: Jun 2009
Posts: 18
Quote:
Originally Posted by madshi View Post
If the problem is still there after you deleted all the identification cards, could you try to find out which exact madVR build introduced the problem? You can download all older builds here:

http://www.videohelp.com/tools/madVR...sions#download
I tried deleting the entries; however, active displays/entries cannot be deleted (the Denon AVR HDMI remains active in standby), which left 2 entries under the Dell display after deleting the inactive cards.

I pulled the plug on the Denon to remove it, and then my Dell monitor + the correct Dell EDID was added as a new second Dell entry, e.g. Dell (1).

I was then able to delete the original Dell entry (as it was now inactive) and renamed Dell (1) back to Dell, so far so good.

Then I re-powered the Denon; it was detected, but a second entry appeared under the Dell display containing the Dell EDID etc. There's no sign of the Denon EDID or a separate display entry for it :/

See image of what I have now...

https://i.imgur.com/cNLDzeG.png

I will go back through older versions to try and find the one that causes this weird issue to appear, as you suggest.

Thanks again.

Last edited by -Hitman-; 17th January 2018 at 20:10.
-Hitman- is offline   Reply With Quote
Old 17th January 2018, 20:36   #48416  |  Link
-Hitman-
Registered User
 
Join Date: Jun 2009
Posts: 18
Update... @madshi

OK, madVR version 0.91.11 works fine: two separate displays show up, with the correct ID/EDID under each.

Dell U2413
Denon-AVRHD

After installing the next version up, v0.92.1, only the Dell display is present, with the Denon ID and EDID under it.

Also, with v0.92.1 the Dell display entry does not keep adding idents every time the Denon/TV is used; it just remains a single Dell display entry with one ident containing the Denon info + EDID data.
-Hitman- is offline   Reply With Quote
Old 17th January 2018, 20:37   #48417  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 942
Quote:
Originally Posted by Clammerz View Post
Code:
python <filename>.py
Thanks, that's great.

@Madshi: thanks for the replies above; I think I have all I need to experiment. Creating the calibrations or the 3D LUTs isn't the issue, I'm used to doing that. I'll get back to you to report success, or if I hit any problems trying to apply the correct calibration and 3D LUT.
__________________
Win11 Pro x64 b23H2
Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33
madVR/LAV/jRiver/MyMovies/CMC
Denon X8500HA>HD Fury VRRoom>TCL 55C805K
Manni is offline   Reply With Quote
Old 17th January 2018, 20:38   #48418  |  Link
leeperry
Kid for Today
 
Join Date: Aug 2004
Posts: 3,477
Quote:
Originally Posted by brazen1 View Post
good idea to offer a "use rca only if it comes for free" option?
+1

Last edited by leeperry; 17th January 2018 at 21:33.
leeperry is offline   Reply With Quote
Old 17th January 2018, 20:48   #48419  |  Link
amayra
Quality Checker
 
amayra's Avatar
 
Join Date: Aug 2013
Posts: 285
Does madVR have a performance impact after the Meltdown/Spectre bug patches?
__________________
I love Doom9
amayra is offline   Reply With Quote
Old 17th January 2018, 21:01   #48420  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,926
It's GPU-based.
It's not a data center.

So no.
huhn is offline   Reply With Quote