Old 29th September 2018, 00:47   #52821  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,137
Please, no more discussion here about MicroLED, ABLs, etc., unless it's really madVR-related. Thanks!

Quote:
Originally Posted by YGPMOLE View Post
The value to enter in the "target nit" box in madVR should be exactly the nit value the TV is capable of, am I right?
Not really. Pick any value which looks good to you. Lowering the target nit will make the image brighter, at the cost of losing HDR highlights.
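To make that concrete, here is a minimal sketch of one simple tone mapping curve (extended Reinhard; this is NOT madVR's actual algorithm, and all names are illustrative). It shows how a lower target brightens mid-tones while leaving less headroom for highlight detail:

Code:
# Sketch only, NOT madVR's algorithm: extended Reinhard tone mapping.
def tone_map(nits_in, target_nits, content_peak=1000.0):
    l = nits_in / target_nits             # luminance relative to display white
    l_white = content_peak / target_nits  # content peak on the same scale
    # Extended Reinhard: maps content_peak exactly to 1.0 (display white).
    l_out = l * (1.0 + l / (l_white * l_white)) / (1.0 + l)
    return min(l_out, 1.0)                # fraction of the display's range

# The same 100-nit mid-tone fills more of the display's range when the
# target is lower (looks brighter), but the headroom left above a 500-nit
# highlight shrinks:
for target in (100.0, 200.0, 400.0):
    mid = tone_map(100.0, target)
    headroom = 1.0 - tone_map(500.0, target)
    print(f"target {target:4.0f} nits: mid-tone {mid:.2f}, highlight headroom {headroom:.2f}")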
madshi is offline   Reply With Quote
Old 29th September 2018, 03:57   #52822  |  Link
jmone
Registered User
 
Join Date: Dec 2007
Posts: 613
Great work on the HDR -> SDR conversion; it looks good across a range of screens (JVC x7500, LG OLED, Sony LCD). Also tested and working well with a range of GPUs: a lowly iGPU (NUC), a 970, and a 1070, using some preset settings.bin files that nev just added over at JRiver Media Center for various "levels" of GPU.

Thanks to madshi and all over at AVS.
jmone is offline   Reply With Quote
Old 29th September 2018, 03:58   #52823  |  Link
jmone
Registered User
 
Join Date: Dec 2007
Posts: 613
Is it possible to bring up the madVR Settings GUI from a command line?
jmone is offline   Reply With Quote
Old 29th September 2018, 04:35   #52824  |  Link
alps006
Registered User
 
Join Date: Sep 2018
Posts: 22
Quote:
Originally Posted by SamuriHL View Post
Anyone try 411.70 yet to see if the HDR issues have been resolved? I'm downloading it now.

EDIT: Nope....still a broken mess for HDR. How nice. Back to 399 I go.
Just an FYI: I got really frustrated with Windows 10 having so many issues with Nvidia drivers recently, so I switched back to Windows 8.1. To tell you the truth, with the latest Nvidia driver, playback of 4K HDR is fantastic, without even a single glitch. I get the correct composition rate (in Windows 10 it was always 23.980). HDR -> SDR conversion is also working great. Really happy with the results. Thanks to madshi for his nice work.
alps006 is offline   Reply With Quote
Old 29th September 2018, 06:33   #52825  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,889
When I change the gamma under calibration, it changes the image.
This is unexpected behaviour, because if I calibrated a screen to gamma x, I expect madVR to honor this and not change anything about the image.

An example of why this is a problem:

When I put 2 screens next to each other, one with gamma 2.4 and one with gamma 2.2, I expect images that differ mathematically by a gamma of 0.2.

But that can't be what happens, because if I change the gamma setting under calibration, the image changes; to get the expected result, madVR would have to send both screens a bit-identical image.

The gamma of the image should not be changed merely by the calibration setting; it should only be changed in conjunction with "color & gamma -> enable gamma processing". That's assuming it is just a gamma setting and has nothing to do with the internal HDR calculation. If it does something different for HDR calibration, well... then this is not the correct place to do that.

edit: this is an issue with the HDR -> SDR conversion. SDR is working as expected.

Last edited by huhn; 29th September 2018 at 06:48.
huhn is offline   Reply With Quote
Old 29th September 2018, 06:49   #52826  |  Link
SamuriHL
Registered User
 
Join Date: May 2004
Posts: 4,294
Quote:
Originally Posted by alps006 View Post
Just an FYI. I got really frustrated with Winows 10 having so many issues recently with nvidia drivers. I switched back to Windows 8.1, to tell you the truth, with the latest nvidia driver, playback of 4K HDR is fantastic without even a single glitch. I have correct composition rate (In Windows 10 it was always 23,980). HDR- SDR conversion is also working great. Really happy with the results. Thanks madshi for his nice work.
That's awesome. I wish I could go back to 8.1, but it's not really an option for my HTPC, unfortunately. Glad it's working for you, though! madshi always says to use 8.1.

Sent from my Pixel XL using Tapatalk
__________________
HTPC: Windows 10, I9 9900k, RTX 2070 Founder's Edition, Pioneer Elite VSX-LX303, LG C8 65" OLED
SamuriHL is offline   Reply With Quote
Old 29th September 2018, 08:46   #52827  |  Link
mytbyte
Registered User
 
Join Date: Dec 2016
Posts: 198
@huhn: But you need to be able to tell madVR the target gamma of the TV, and you can set this independently for each screen. Also, HDR -> SDR conversion is itself a kind of "gamma processing", like the one under the "color & gamma" section...

Here I would like to point out that adding an option for BT.1886 would be in order, for TVs with non-zero blacks.

Sent from my GM 5 Plus d using Tapatalk

Last edited by mytbyte; 29th September 2018 at 08:57.
mytbyte is offline   Reply With Quote
Old 29th September 2018, 08:51   #52828  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,889
Quote:
but you need to be able to tell madvr the target gamma of the tv
Not quite sure what you're aiming at... what is the point of telling madVR the gamma if it is not honored?
If madVR changes the image, then your gamma is not your gamma anymore; it is something else.

Quote:
here I would like to note that addition of the option for BT.1886 would be in order.
That's impossible, because BT.1886 can correspond to quite a lot of different effective gammas.
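For reference, a small sketch built straight from the BT.1886 formulas (ITU-R BT.1886, Annex 1) illustrates the point: the effective gamma shifts with the display's measured black level.

Code:
# BT.1886 EOTF: L = a * max(v + b, 0) ** 2.4, where a and b are derived
# from the display's measured white (lw) and black (lb) levels in cd/m2.
from math import log

def bt1886(v, lw=100.0, lb=0.1):
    a = (lw ** (1 / 2.4) - lb ** (1 / 2.4)) ** 2.4
    b = lb ** (1 / 2.4) / (lw ** (1 / 2.4) - lb ** (1 / 2.4))
    return a * max(v + b, 0.0) ** 2.4

# Effective gamma at a 50% signal for three different black levels:
for lb in (0.0, 0.05, 0.5):
    g = log(bt1886(0.5, 100.0, lb) / 100.0) / log(0.5)
    print(f"black {lb:.2f} cd/m2 -> effective gamma ~{g:.2f}")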
huhn is offline   Reply With Quote
Old 29th September 2018, 09:24   #52829  |  Link
mytbyte
Registered User
 
Join Date: Dec 2016
Posts: 198
Quote:
Originally Posted by huhn View Post
What is the point of telling madVR the gamma if it is not honored?
If madVR changes the image, then your gamma is not your gamma anymore; it is something else.



That's impossible, because BT.1886 can correspond to quite a lot of different effective gammas.
On the second point: damn, you're right. In that case madVR should be told the measured black and white levels of the TV, but that actually falls under the "processing" section...

On the first point: I still don't understand why you think it's not honored. There is nothing to be honored: HDR processing needs to know the TV's gamma. SDR video has no gamma defined, so the gamma setting has no effect there, but for HDR you do need to manipulate the gamma.

Sent from my GM 5 Plus d using Tapatalk
mytbyte is offline   Reply With Quote
Old 29th September 2018, 09:43   #52830  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,889
How do you even know what type of gamma a TV has? And why should madVR instantly change something? A change means you don't get your "gamma" anymore.

The HDR -> SDR conversion produces an SDR image.
If you tick the default setting "disable calibration controls for this display", then on a TV calibrated to gamma 2.2 you get a relative gamma of 2.2, on a 2.4 TV a gamma of 2.4, and so on (you know what I mean), because every screen gets a bit-identical image and everything works as it should.

But as soon as you select "this display is already calibrated" with an option that is not 2.2, you get an altered image. So the 2.4-calibrated TV doesn't get a 2.4 gamma relative to a gamma 2.2 calibrated TV. HDR or not, this is incorrect: if 2.4 is not 2.4 relative to 2.2, it is not 2.4. It's that simple.

Don't change the image just because a gamma is set in this option, or users don't get their "gamma" of choice.
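Putting numbers on that argument (just a sketch, not madVR internals):

Code:
v = 0.5                      # identical signal sent to both displays
print(v ** 2.2, v ** 2.4)    # 2.2 vs 2.4 screen: genuinely different results

# But if the renderer re-encodes the signal for the 2.4 display...
v24 = v ** (2.2 / 2.4)
print(v24 ** 2.4)            # ...both screens now show v ** 2.2,
                             # so 2.4 is no longer 2.4 relative to 2.2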
huhn is offline   Reply With Quote
Old 29th September 2018, 10:16   #52831  |  Link
mytbyte
Registered User
 
Join Date: Dec 2016
Posts: 198
@huhn: I think I now understand what you want to say, but default values are for people who don't know their gamma, white and black levels, and that (must be) OK. HDR levels, however, are absolute, and those values won't be converted to absolute brightness levels unless you specify the actual SDR gamma you calibrated your TV to, if you have the possibility to do that. I know, you'll say that top brightness differs from TV to TV (even HDR models) anyway, so why stick to absolute levels; but to my way of thinking, HDR -> SDR conversion is also designed to simulate, on an SDR TV with high brightness, as closely as possible how the content would look on an HDR display, not just loosely convert to SDR, provided you calibrated your SDR TV and know its characteristics. Of course, it means multiple processing steps; quality depends on the TV's gamma precision and LUT bit depth, and thus banding is a risk.

Sent from my GM 5 Plus d using Tapatalk

Last edited by mytbyte; 29th September 2018 at 10:21.
mytbyte is offline   Reply With Quote
Old 29th September 2018, 11:37   #52832  |  Link
Klaus1189
Registered User
 
Join Date: Feb 2015
Location: Bavaria
Posts: 705
Quote:
Originally Posted by alps006 View Post
Just an FYI: I got really frustrated with Windows 10 having so many issues with Nvidia drivers recently, so I switched back to Windows 8.1. To tell you the truth, with the latest Nvidia driver, playback of 4K HDR is fantastic, without even a single glitch. I get the correct composition rate (in Windows 10 it was always 23.980).
You do know it's the Nvidia driver that is broken?
Why does 399.xx work fine for HDR? Because the 399.xx driver is fine.

There's a separate installer for Win 8.1, which is also smaller:

Version: 411.70 WHQL
Release date: 2018.9.27
Operating system: Windows 7 64-bit, Windows 8.1 64-bit, Windows 8 64-bit
File size: 469.19 MB
--------------------------------------------------------------------------------------------
Version: 411.70 WHQL
Release date: 2018.9.27
Operating system: Windows 10 64-bit
File size: 520.35 MB

I share the opinion that Win 8.1 is better suited for HTPC use, but since I have been using Win 10 for over 3 years now, I think Nvidia should be able to deliver a working driver and not break things repeatedly.

I'll stay with Win 10 for the time being.
Klaus1189 is offline   Reply With Quote
Old 29th September 2018, 12:59   #52833  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,137
madVR v0.92.17 released

http://madshi.net/madVR.zip

Code:
* modified/simplified HDR tone mapping settings page
* small HDR tone mapping saturation improvement
* OSD now also shows the measured luminance of the current frame (in addition to the average)
* fixed: render & present queues didn't always fill in Windows 10 build 1803
* fixed: using XySubFilter sometimes resulted in black screen / freeze
* fixed: using HDR "processing" resulted in dark and red-ish image
* fixed: using BT.601/709 gamma curve with HDR tone mapping produced gray-ish image
* fixed: settings dialog sometimes crashed on display mode / custom mode tab
The HDR settings dialog changes *could* maybe introduce some new bugs, but I hope not. I've modified the HDR settings to make them less confusing; I don't want users to think that they somehow lose HDR by doing tone mapping.
madshi is offline   Reply With Quote
Old 29th September 2018, 13:09   #52834  |  Link
magic144
Registered User
 
Join Date: May 2005
Posts: 386
Thanks for the swift XySubFilter-associated fix madshi, very much appreciated!
magic144 is offline   Reply With Quote
Old 29th September 2018, 14:05   #52835  |  Link
jespermart
Registered User
 
Join Date: Mar 2018
Posts: 19
madVR display device

I see all sorts of display devices in madVR. At the moment I have a Yamaha RX2070, Intel Vertex twice, and a Visio M50-E1, and right now only the Visio M50 is active. But I don't own a Visio M50, so what is happening with my devices, and how do I get rid of an active device I don't own?
jespermart is offline   Reply With Quote
Old 29th September 2018, 14:30   #52836  |  Link
ashlar42
Registered User
 
Join Date: Jun 2007
Posts: 417
Quote:
Originally Posted by madshi View Post
No, because you *can* test HDR on your Kuro, by using madVR's HDR -> SDR conversion.


First of all, I never stated that *all* TVs are not smart enough, I'm usually careful enough to talk about "many" or "most" TVs. There may be TVs which are smart enough. Maybe the latest Panasonic OLED could be, I don't know.

If the TV is not smart enough, it will simply apply a compression curve to every pixel. If madVR has already applied tone mapping before, that means the TV will compress the content even further. It might not be a dramatic problem, but it's far from optimal.


What does ABL have to do with tone mapping? I don't think that the tone mapping algo in the OLED TVs is smart enough to consider ABL. As such, there's no reason to think that madVR couldn't provide a better tone mapping result than the internal system.

You don't seem to understand the whole tone mapping concept. What do you think the 2017 LG OLED does internally when you feed it HDR? In case you don't know: It applies tone mapping! Same as what madVR does when you activate HDR -> SDR conversion.

Probably I should rename the options in madVR, because they seem to be confusing for users. People seem to think that HDR TVs can somehow do magic, and letting madVR convert HDR to SDR will produce worse results than if the HDR TV receives the full HDR content. In reality the HDR TV will do the same processing madVR does - only in worse quality.
I fear there's a misunderstanding.

First: I've never doubted that madVR could do a better tone mapping job than what LG OLEDs achieve internally. The internal SoC is most likely no match for a powerful enough GPU. *And* I have more faith in your algorithms than in LG's.

You might be right that LG's algorithm isn't smart enough to take ABL into account. But that's no reason for madVR not to do it, correct? Assuming RTings' values are correct (and they do provide them for many TVs out there), it would be a great option to have, providing even better tone mapping, wouldn't it? Having peak nits values according to the "screen space" occupied by high-nits content. Maybe it's too hard to do that calculation in real time, I don't know. Or maybe it's useless, I don't know.
Although, if it's impossible to disable tone mapping on the TV, from what you state I conclude that it would be better to turn off the option in madVR anyway, am I right?

Lastly, although I know the above was not directed at me, I'm well aware that all OLEDs tone map HDR content. There's 1000-nit mastered content and 4000-nit mastered content. Peak highlights on OLEDs don't even reach 900 nits, so tone mapping is a must.
ashlar42 is offline   Reply With Quote
Old 29th September 2018, 14:55   #52837  |  Link
Ver Greeneyes
Registered User
 
Join Date: May 2012
Posts: 445
Quote:
Originally Posted by madshi View Post
Code:
* fixed: render & present queues didn't always fill in Windows 10 build 1803
Just tested and I can confirm that my problems are gone! Thanks for the great work
Ver Greeneyes is offline   Reply With Quote
Old 29th September 2018, 15:52   #52838  |  Link
thighhighs
Registered User
 
Join Date: Sep 2016
Posts: 50
Quote:
Originally Posted by thighhighs View Post
The windowed-mode OSD (rendering time) is still broken for me. I think Windows 10 1803 (x64) introduced this bug. The OSD shows ~9ms rendering, but that is impossible for my old Kepler GPU. I also get different stats for FSE with the same settings: ~9ms (wrong) vs ~20ms in FSE (which looks like the truth).
This bug is now fixed
thighhighs is offline   Reply With Quote
Old 29th September 2018, 16:11   #52839  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,137
Quote:
Originally Posted by ashlar42 View Post
I fear there's a misunderstanding.

First: I've never doubted that madVR could do a better tone mapping job than what internally LG OLEDs achieve.
Yes, you did! You wrote:

> I doubt that madVR can provide a better tone mapping
> results than the internal system in an OLED screen

Quote:
Originally Posted by ashlar42 View Post
You might be right that LG's algorithm isn't smart enough to take ABL into account. But that's no reason for madVR not to do it, correct? Assuming RTings' values are correct (and they do provide them for many TVs out there), it would be a great option to have, providing even better tone mapping, wouldn't it? Having peak nits values according to the "screen space" occupied by high-nits content. Maybe it's too hard to do that calculation in real time, I don't know. Or maybe it's useless, I don't know.
Taking the ABL into account would only make sense if I knew exactly how the ABL was implemented. But I don't. It could differ from LG to Panasonic to Sony (all using the same LG OLED panel). It could differ from LG generation to LG generation. It could even differ from firmware version to firmware version!

If I don't know the *exact* way the ABL works, then trying to adjust to the ABL may make things worse than better.

RTings might measure some things, but they don't measure everything. E.g. does the ABL react to the brightest subpixel, or to the combined brightness of all subpixels? What happens if there's very bright green, but blue and red are off? Exactly how many pixels have to surpass a specific threshold to activate the ABL? And is it a fixed threshold, or a "fuzzy" logic? I would basically need access to the exact formulas used by the ABL for this to make any sense.

Quote:
Originally Posted by ashlar42 View Post
Although, if it's impossible to disable tone mapping on the TV, from what you state I conclude that it would be better to turn off the option in madVR anyway, am I right?
No. Even when sending HDR to the display, double tone mapping *could* be better than not letting madVR do any tone mapping at all. Or maybe not; it's impossible to say without testing it. Furthermore, you can let madVR tone map and then send the video as SDR to the display. That way, tone mapping in the TV should definitely be disabled.

Quote:
Originally Posted by ashlar42 View Post
Lastly, although I know the above was not directed at me, I'm well aware that all OLEDs tone map HDR content. There's 1000-nit mastered content and 4000-nit mastered content. Peak highlights on OLEDs don't even reach 900 nits, so tone mapping is a must.
That's not true, either. There are several UHD HDR Blu-Rays out there which have a MaxCLL value (brightest subpixel in the whole movie) below what current OLEDs can do. So a good OLED tone mapping implementation could detect this situation and then completely disable tone mapping.
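In other words, a sketch of the decision such an implementation could make (illustrative logic only, not any TV's actual firmware):

Code:
def needs_tone_mapping(maxcll_nits, display_peak_nits):
    # If the brightest subpixel in the whole movie (MaxCLL) already fits
    # within the display's capability, tone mapping can be skipped.
    return maxcll_nits > display_peak_nits

print(needs_tone_mapping(700.0, 750.0))   # False: display the content 1:1
print(needs_tone_mapping(4000.0, 750.0))  # True: compression is needed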

Quote:
Originally Posted by magic144 View Post
Thanks for the swift XySubFilter-associated fix madshi, very much appreciated!
Quote:
Originally Posted by Ver Greeneyes View Post
Just tested and I can confirm that my problems are gone! Thanks for the great work
Quote:
Originally Posted by thighhighs View Post
This bug is now fixed


Quote:
Originally Posted by jespermart View Post
I see all sorts of display devices in madVR. At the moment I have a Yamaha RX2070, Intel Vertex twice, and a Visio M50-E1, and right now only the Visio M50 is active. But I don't own a Visio M50, so what is happening with my devices, and how do I get rid of an active device I don't own?
The devices come from the EDIDs that your GPUs or the OS report. The Yamaha RX2070 sounds like your receiver? What kind of display are you using? It seems the EDID of your display reports it as "Visio M50", for whatever reason. Of course you can manually rename the display in the madVR settings; madVR just sets the names by default, based on what the EDID reports.
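As an aside: names like these typically come from the EDID's "display product name" descriptor. A minimal sketch, assuming the standard 128-byte EDID base block layout (this is not madVR's actual code):

Code:
# Extract the display product name (descriptor tag 0xFC) from a raw
# 128-byte EDID base block. Four 18-byte descriptors sit at offsets
# 54, 72, 90 and 108; the name is up to 13 chars, 0x0A-terminated.
def edid_product_name(edid: bytes):
    for off in (54, 72, 90, 108):
        d = edid[off:off + 18]
        if d[0:3] == b"\x00\x00\x00" and d[3] == 0xFC:
            return d[5:18].split(b"\x0a")[0].decode("ascii", "replace").strip()
    return None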
madshi is offline   Reply With Quote
Old 29th September 2018, 17:16   #52840  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,889
Quote:
Originally Posted by mytbyte View Post
@huhn: I think I now understand what you want to say, but default values are for people who don't know their gamma, white and black levels, and that (must be) OK. HDR levels, however, are absolute, and those values won't be converted to absolute brightness levels unless you specify the actual SDR gamma you calibrated your TV to, if you have the possibility to do that. I know, you'll say that top brightness differs from TV to TV (even HDR models) anyway, so why stick to absolute levels; but to my way of thinking, HDR -> SDR conversion is also designed to simulate, on an SDR TV with high brightness, as closely as possible how the content would look on an HDR display, not just loosely convert to SDR, provided you calibrated your SDR TV and know its characteristics. Of course, it means multiple processing steps; quality depends on the TV's gamma precision and LUT bit depth, and thus banding is a risk.

Sent from my GM 5 Plus d using Tapatalk
I'm not even saying this option should be removed; the problem is the place where it is used.

I can even turn the whole thing on its head:
what about BT.1886, or a 3D LUT? Do we just do nothing in those cases?

Or why is gamma 2.4 used over 2.2 by users in a dark room in the first place? Ambient light sources and many more gamma-related things. None of this changes just because the source is HDR. Even the brightness in an SDR file is absolute, if someone were to follow the spec exactly, and even that doesn't change the fact that a gamma of 2.1 or lower can be quite handy depending on other factors.

And how does any of this help users who don't know their gamma? The calibration tab is not something to guess about.
huhn is offline   Reply With Quote