Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Old 29th September 2018, 09:16   #52821  |  Link
mytbyte
Registered User
 
Join Date: Dec 2016
Posts: 212
@huhn: I think I now understand what you want to say: default values are for people who don't know their gamma, white and black levels, and that's fine. But HDR levels are absolute, and those values won't be converted to absolute brightness levels unless you specify the actual SDR gamma you calibrated your TV to, if you have the means to do that. I know, you'll say that top brightness differs from TV to TV (even among HDR models) anyway, so why stick to absolute levels; but to my way of thinking, HDR-to-SDR conversion is also designed to simulate on a bright SDR TV, as closely as possible, how the content would look on an HDR display, not just loosely convert to SDR, provided you calibrated your SDR TV and know its characteristics. Of course, this means multiple processing steps; quality depends on the TV's gamma precision and LUT bit depth, so banding is a risk.
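(For reference, the "absolute" part of HDR comes from SMPTE ST 2084: a PQ code value maps directly to a luminance in cd/m², independent of the display, unlike relative SDR gamma. A minimal sketch of the PQ EOTF, with the constants taken straight from the spec:)

```python
# SMPTE ST 2084 (PQ) EOTF: normalized code value -> absolute luminance in nits.
# Constants are the exact rationals from the spec.
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_to_nits(e: float) -> float:
    """Map a PQ-encoded value e in [0, 1] to luminance in cd/m^2 (0..10000)."""
    p = e ** (1 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

print(pq_to_nits(1.0))    # 10000.0 (top of the PQ range)
print(pq_to_nits(0.508))  # ~100 nits: SDR reference white sits near mid-signal
```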

Sent from my GM 5 Plus d using Tapatalk

Last edited by mytbyte; 29th September 2018 at 09:21.
Old 29th September 2018, 10:37   #52822  |  Link
Klaus1189
Registered User
 
Join Date: Feb 2015
Location: Bavaria
Posts: 1,666
Quote:
Originally Posted by alps006 View Post
Just an FYI. I got really frustrated with Windows 10 having so many issues recently with Nvidia drivers. I switched back to Windows 8.1, and to tell you the truth, with the latest Nvidia driver, playback of 4K HDR is fantastic without a single glitch. I have the correct composition rate (in Windows 10 it was always 23.980).
You do know it's the Nvidia driver which is broken?
Why is 399.xx working fine for HDR? Because the 399.xx driver is fine.

You get another installer for Win 8.1 which is also smaller:

Version: 411.70 WHQL
Release date: 2018.9.27
Operating system: Windows 7 64-bit, Windows 8.1 64-bit, Windows 8 64-bit
File size: 469.19 MB
--------------------------------------------------------------------------------------------
Version: 411.70 WHQL
Release date: 2018.9.27
Operating system: Windows 10 64-bit
File size: 520.35 MB

I share the opinion that Win 8.1 is better suited for HTPC use, but since I have been using Win 10 for over 3 years now, I think Nvidia should be able to deliver a working driver and not break things repeatedly.

I'll stay with Win 10 for now.
Old 29th September 2018, 11:59   #52823  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
madVR v0.92.17 released

http://madshi.net/madVR.zip

Code:
* modified/simplified HDR tone mapping settings page
* small HDR tone mapping saturation improvement
* OSD now also shows the measured luminance of the current frame (in addition to the average)
* fixed: render & present queues didn't always fill in Windows 10 build 1803
* fixed: using XySubFilter sometimes resulted in black screen / freeze
* fixed: using HDR "processing" resulted in dark and red-ish image
* fixed: using BT.601/709 gamma curve with HDR tone mapping produced gray-ish image
* fixed: settings dialog sometimes crashed on display mode / custom mode tab
The HDR settings dialog changes *could* maybe introduce some new bugs, but I hope not. I've modified the HDR settings to make it less confusing. I don't want users to think that they somehow lose HDR by doing tone mapping.
Old 29th September 2018, 12:09   #52824  |  Link
magic144
Registered User
 
Join Date: May 2005
Posts: 395
Thanks for the swift XySubFilter-associated fix madshi, very much appreciated!
Old 29th September 2018, 13:05   #52825  |  Link
jespermart
Registered User
 
Join Date: Mar 2018
Posts: 22
madVR display device

I get all sorts of display devices in madVR. At the moment I have a Yamaha RX2070, Intel Vertex twice, and a Visio M50-E1,
and only the Visio M50 is active. But I don't own a Visio M50, so what is happening with my devices, and how do I get rid of an active device I don't own?
Old 29th September 2018, 13:30   #52826  |  Link
ashlar42
Registered User
 
Join Date: Jun 2007
Posts: 652
Quote:
Originally Posted by madshi View Post
No, because you *can* test HDR on your Kuro, by using madVR's HDR -> SDR conversion.


First of all, I never stated that *all* TVs are not smart enough, I'm usually careful enough to talk about "many" or "most" TVs. There may be TVs which are smart enough. Maybe the latest Panasonic OLED could be, I don't know.

If the TV is not smart enough, it will simply apply a compression curve to every pixel. If madVR has already applied tone mapping before, that means the TV will compress the content even further. It might not be a dramatic problem, but it's far from optimal.


What does ABL have to do with tone mapping? I don't think that the tone mapping algo in the OLED TVs is smart enough to consider ABL. As such, there's no reason to think that madVR couldn't provide a better tone mapping result than the internal system.

You don't seem to understand the whole tone mapping concept. What do you think the 2017 LG OLED does internally when you feed it HDR? In case you don't know: It applies tone mapping! Same as what madVR does when you activate HDR -> SDR conversion.

Probably I should rename the options in madVR, because they seem to be confusing for users. People seem to think that HDR TVs can somehow do magic, and letting madVR convert HDR to SDR will produce worse results than if the HDR TV receives the full HDR content. In reality the HDR TV will do the same processing madVR does - only in worse quality.
I fear there's a misunderstanding.

First: I've never doubted that madVR could do a better tone mapping job than what LG OLEDs achieve internally. The internal SoC is most likely no match for a powerful enough GPU. *And* I have more faith in your algorithms than in LG's.

You might be right that LG's algorithm isn't smart enough to take into account ABL. But that's no reason for madVR not to do it, correct? Assuming RTings values are correct (and they do provide them for many TVs out there), it would be a great option to have to provide even better tone mapping, wouldn't it? Having peak nits values according to "screen space" occupied by high nits content. Maybe it's too hard to do that calculation in real time, I don't know. Or maybe it's useless, I don't know.
Although, if it's impossible to disable tone mapping on the TV, from what you state I conclude that it would be better to turn off the option in madVR anyway; am I right?

Lastly, although I know the above was not directed to me, I have pretty clear in mind the fact that all OLEDs tone map HDR content. There's 1000 nits mastered content and 4000 nits mastered content. Peak nits highlights in OLEDs don't even reach 900, so tone mapping is a must.
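(To make the "tone mapping is a must" point concrete: any display whose peak sits below the mastering peak has to roll highlights off somehow. The sketch below is a generic soft-knee curve, purely illustrative; it is neither madVR's algorithm nor LG's, and the 700-nit peak and 75% knee are assumed values.)

```python
def tone_map(nits: float, peak_dst: float = 700.0, knee: float = 0.75) -> float:
    """Generic soft-knee tone mapping sketch (illustrative, not madVR's curve).

    Luminance below knee*peak_dst passes through unchanged; everything above
    is compressed into the remaining headroom with a Reinhard-style tail,
    so the output never exceeds peak_dst.
    """
    k = knee * peak_dst          # start of the compression region (525 nits here)
    if nits <= k:
        return nits
    over = nits - k              # how far the input exceeds the knee
    room = peak_dst - k          # headroom left on the display
    return k + room * over / (over + room)

# 1000-nit and 4000-nit masters both land under the 700-nit ceiling,
# while everything below the knee is left untouched:
for n in (100, 525, 1000, 4000):
    print(n, "->", round(tone_map(n), 1))
```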
Old 29th September 2018, 13:55   #52827  |  Link
Ver Greeneyes
Registered User
 
Join Date: May 2012
Posts: 447
Quote:
Originally Posted by madshi View Post
Code:
* fixed: render & present queues didn't always fill in Windows 10 build 1803
Just tested and I can confirm that my problems are gone! Thanks for the great work
Old 29th September 2018, 14:52   #52828  |  Link
thighhighs
Registered User
 
Join Date: Sep 2016
Posts: 70
Quote:
Originally Posted by thighhighs View Post
windowed-mode OSD (rendering time) is still broken for me. I think Windows 10 1803 (x64) introduced this bug. The OSD shows ~9ms rendering, but that's impossible for my old Kepler GPU. I also get different stats for FSE with the same settings: ~9ms (wrong) vs ~20ms in FSE (which looks right).
This bug is now fixed
Old 29th September 2018, 15:11   #52829  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
Quote:
Originally Posted by ashlar42 View Post
I fear there's a misunderstanding.

First: I've never doubted that madVR could do a better tone mapping job than what internally LG OLEDs achieve.
Yes, you did! You wrote:

> I doubt that madVR can provide a better tone mapping
> results than the internal system in an OLED screen

Quote:
Originally Posted by ashlar42 View Post
You might be right that LG's algorithm isn't smart enough to take into account ABL. But that's no reason for madVR not to do it, correct? Assuming RTings values are correct (and they do provide them for many TVs out there), it would be a great option to have to provide even better tone mapping, wouldn't it? Having peak nits values according to "screen space" occupied by high nits content. Maybe it's too hard to do that calculation in real time, I don't know. Or maybe it's useless, I don't know.
Taking the ABL into account would only make sense if I knew exactly how the ABL was implemented. But I don't. It could differ from LG to Panasonic to Sony (all using the same LG OLED panel). It could differ from LG generation to LG generation. It could even differ from firmware version to firmware version!

If I don't know the *exact* way the ABL works, then trying to adjust to the ABL may make things worse than better.

RTings might measure some things, but they don't measure everything. E.g. does the ABL react to the brightest subpixel? Or to the combined brightness of all subpixels? What happens if there's very bright green, but blue and red are off? Exactly how many pixels have to surpass a specific threshold to activate ABL? And is it a fixed threshold, or "fuzzy" logic? I would basically need access to the exact formulas used by the ABL for this to make any sense.
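(For illustration only, the kind of adjustment being discussed would amount to interpolating a peak-nits budget from window-size measurements of the sort RTings publishes. All numbers below are invented for the example, and as the post explains, without knowing the panel's actual ABL formula such a model can easily make things worse rather than better.)

```python
# Hypothetical ABL model: displayable peak luminance vs. percentage of the
# screen that is lit (window size). These points are made up for the example;
# real panels differ per brand, generation, and even firmware version.
ABL_CURVE = [(2, 720.0), (10, 650.0), (25, 400.0), (50, 250.0), (100, 140.0)]

def peak_for_window(pct: float) -> float:
    """Linearly interpolate the displayable peak (nits) for a given
    bright-window percentage. Clamps outside the measured range."""
    pts = ABL_CURVE
    if pct <= pts[0][0]:
        return pts[0][1]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if pct <= x1:
            t = (pct - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)
    return pts[-1][1]

print(peak_for_window(10))    # 650.0 (a measured point)
print(peak_for_window(17.5))  # 525.0 (halfway between the 10% and 25% points)
```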

Quote:
Originally Posted by ashlar42 View Post
Although, if it's impossible to disable tone mapping on the TV, from what you state I conclude that it would be better to turn off the option in madVR anyway; am I right?
No. Even when sending HDR to the display, double tone mapping *could* be better than not letting madVR do any tone mapping at all. Or maybe not. It's impossible to say without testing it. Furthermore, you can let madVR tone map and then send the video as SDR to the display. That way, tone mapping in the TV should definitely be disabled.

Quote:
Originally Posted by ashlar42 View Post
Lastly, although I know the above was not directed to me, I have pretty clear in mind the fact that all OLEDs tone map HDR content. There's 1000 nits mastered content and 4000 nits mastered content. Peak nits highlights in OLEDs don't even reach 900, so tone mapping is a must.
That's not true, either. There are several UHD HDR Blu-Rays out there which have a MaxCLL value (brightest subpixel in the whole movie) below what current OLEDs can do. So a good OLED tone mapping implementation could detect this situation and then completely disable tone mapping.

Quote:
Originally Posted by magic144 View Post
Thanks for the swift XySubFilter-associated fix madshi, very much appreciated!
Quote:
Originally Posted by Ver Greeneyes View Post
Just tested and I can confirm that my problems are gone! Thanks for the great work
Quote:
Originally Posted by thighhighs View Post
This bug is now fixed


Quote:
Originally Posted by jespermart View Post
I get all sorts of display devices in madVR. At the moment I have a Yamaha RX2070, Intel Vertex twice, and a Visio M50-E1,
and only the Visio M50 is active. But I don't own a Visio M50, so what is happening with my devices, and how do I get rid of an active device I don't own?
The devices come from the EDIDs that your GPUs or the OS report. Yamaha RX2070 sounds like your receiver? What kind of display are you using? It seems the EDID of your display reports itself as "Visio M50", for whatever reason. Of course you can manually rename the display in the madVR settings. madVR just sets the names by default, based on what the EDID reports.
Old 29th September 2018, 16:16   #52830  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
Quote:
Originally Posted by mytbyte View Post
@huhn: I think I now understand what you want to say: default values are for people who don't know their gamma, white and black levels, and that's fine. But HDR levels are absolute, and those values won't be converted to absolute brightness levels unless you specify the actual SDR gamma you calibrated your TV to, if you have the means to do that. I know, you'll say that top brightness differs from TV to TV (even among HDR models) anyway, so why stick to absolute levels; but to my way of thinking, HDR-to-SDR conversion is also designed to simulate on a bright SDR TV, as closely as possible, how the content would look on an HDR display, not just loosely convert to SDR, provided you calibrated your SDR TV and know its characteristics. Of course, this means multiple processing steps; quality depends on the TV's gamma precision and LUT bit depth, so banding is a risk.

Sent from my GM 5 Plus d using Tapatalk
i'm not even saying this option should be removed. the problem is where it is used.

i can even turn the whole thing around:
what about bt 1886 or a 3D LUT, so we just do nothing in this case?

or why is gamma 2.4 used in the first place over 2.2 by a user in a dark room? ambient light sources and so many more gamma-related things matter, and none of this changes even if the source is HDR. even the brightness in an SDR file is absolute if someone followed it exactly, and even that doesn't change the fact that a gamma of 2.1 or lower can be quite handy depending on other factors.

and how does any of this help users who don't know their gamma? the calibration tab is not something to guess about.
Old 29th September 2018, 16:32   #52831  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
The only thing the user should know is that this setting will not impact SDR content. Most content today is mastered at 2.40, based on the calibration of the most popular mastering monitors out there. You have to guess at what your calibrated gamma might be for HDR -> SDR because there is no way for madVR to measure it. Even then, you may get the best result with a gamma other than 2.20. For some people (me included), 2.20 crushes blacks when PQ is converted to pure power gamma. Not everyone would understand that they can experiment with this setting without upsetting SDR content.
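(The black-crush effect is easy to reason about numerically. If the tone-mapped output is encoded assuming one pure power gamma but the display actually tracks another, shadows land at lum^(display/encode) instead of lum. A quick sketch, with the 2.2/2.4 pairing only as an example:)

```python
def displayed(lum: float, encode_gamma: float = 2.2, display_gamma: float = 2.4) -> float:
    """Relative luminance the viewer sees when video is encoded for
    encode_gamma but the display decodes with display_gamma.
    With matching gammas this is the identity; mismatches shift shadows."""
    signal = lum ** (1.0 / encode_gamma)   # encoded video level in [0, 1]
    return signal ** display_gamma         # display's decoded light output

# A 1% shadow detail encoded for 2.2 but shown on a 2.4 display:
print(displayed(0.01, 2.2, 2.4))  # ~0.0066: noticeably darker than intended
print(displayed(0.01, 2.2, 2.2))  # ~0.01: gammas match, shadows land where expected
```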
Old 29th September 2018, 16:48   #52832  |  Link
jespermart
Registered User
 
Join Date: Mar 2018
Posts: 22
Quote:
Originally Posted by madshi View Post


The devices come from the EDIDs that your GPUs or the OS report. Yamaha RX2070 sounds like your receiver? What kind of display are you using? It seems the EDID of your display reports itself as "Visio M50", for whatever reason. Of course you can manually rename the display in the madVR settings. madVR just sets the names by default, based on what the EDID reports.
I have a Samsung 59" plasma screen and a JVC X5000, connected via an HDFury Vertex to a Yamaha RX-A2070 receiver
Old 29th September 2018, 16:49   #52833  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
would be nice if gamma were such a simple topic; then bt 1886 wouldn't be the "new" thing.

gamma 2.4 is a "bat cave" gamma, and that's not a good idea in a daylight-filled room. that's one reason there is no generally correct answer: getting the same gamma as a mastering screen does not always get you proper results on your screen.
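(BT.1886 exists precisely because gamma isn't one number: its EOTF folds the display's real black level into the curve, and only collapses to a pure 2.4 power law when black is truly zero, which is why it behaves as "power 2.40 on an OLED". A sketch of the formula from the recommendation, with the 100-nit white and 0.1-nit black defaults as example values:)

```python
def bt1886(v: float, lw: float = 100.0, lb: float = 0.1, gamma: float = 2.4) -> float:
    """Rec. ITU-R BT.1886 EOTF: video level v in [0, 1] -> luminance in nits.

    lw = luminance at white, lb = luminance at black. With lb = 0 this
    reduces to lw * v ** gamma, i.e. a pure 2.4 power curve.
    """
    a = (lw ** (1 / gamma) - lb ** (1 / gamma)) ** gamma             # gain
    b = lb ** (1 / gamma) / (lw ** (1 / gamma) - lb ** (1 / gamma))  # black lift
    return a * max(v + b, 0.0) ** gamma

print(bt1886(0.0))          # ~0.1: the panel's black level, not zero
print(bt1886(1.0))          # ~100.0: peak white
print(bt1886(0.5, lb=0.0))  # ~18.9: identical to 100 * 0.5 ** 2.4
```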
Old 29th September 2018, 17:36   #52834  |  Link
blu3wh0
Registered User
 
Join Date: Feb 2014
Posts: 39
On the topic of gamma, my SDR mode has always been calibrated to BT.1886 (which is power 2.40 on an OLED), and has been the perfect option from a pitch dark to a dim/low light room. As long as there wasn't direct sunlight or glare, details were properly visible. However, HDR mode is forced into power 2.2 on my OLED, without the option to change it. I took this to imply that all HDR content is mastered in 2.2, and the picture is perfect under this gamma. How do the HDR to SDR or just HDR pixel processed modes take this into account? Does SDR have to be calibrated for 2.2 or does madVR process it correctly into BT.1886? How does madVR know the TV gamma if you disable calibration controls (or what is the default)?
Old 29th September 2018, 20:04   #52835  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
HDR mode is always absolute PQ and not 2.20. HDR -> SDR is up to user preference and should match your known calibrated SDR gamma, but you could choose anything you want.

SDR content is mastered to a specific gamma as set by the mastering display, so madVR doesn't have to convert anything. As mentioned previously, the most commonly used mastering monitors today are a perfectly flat 2.40.
Old 29th September 2018, 20:05   #52836  |  Link
mytbyte
Registered User
 
Join Date: Dec 2016
Posts: 212
Quote:
Originally Posted by blu3wh0 View Post
However, HDR mode is forced into power 2.2 on my OLED, without the option to change it. I took this to imply that all HDR content is mastered in 2.2, and the picture is perfect under this gamma. How do the HDR to SDR or just HDR pixel processed modes take this into account? Does SDR have to be calibrated for 2.2 or does madVR process it correctly into BT.1886? How does madVR know the TV gamma if you disable calibration controls (or what is the default)?
Bold: That's weird; how did you come to that conclusion? In HDR mode the PQ curve is in effect, and the TV should inform you so. It has nothing to do with gamma, actually: the signal shouldn't get converted to gamma first in order to be displayed, it goes straight from PQ to the display (digital displays are linear)...

You calibrate to PQ or, ideally, to the BT.2390 curve, which defines tone mapping for TVs that can't reach the required brightness.
MadVR doesn't know the gamma of your TV, but it doesn't need to if your TV is HDR and you feed it an HDR signal.
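(For the curious: the BT.2390 roll-off mentioned above is defined in the PQ domain as a hermite-spline knee. A minimal sketch, working on normalized PQ values and assuming a source black of zero; the full EETF also handles black-level adaptation:)

```python
def bt2390_eetf(e: float, max_lum: float) -> float:
    """BT.2390-style EETF knee, operating on normalized PQ values in [0, 1].

    e        -- source value as a fraction of the source's PQ peak
    max_lum  -- target display peak, same normalization
    Values below the knee pass through unchanged; above it, a hermite
    spline rolls off smoothly so the output never exceeds max_lum.
    """
    ks = 1.5 * max_lum - 0.5          # knee start
    if e < ks:
        return e
    t = (e - ks) / (1.0 - ks)
    return ((2 * t**3 - 3 * t**2 + 1) * ks
            + (t**3 - 2 * t**2 + t) * (1.0 - ks)
            + (-2 * t**3 + 3 * t**2) * max_lum)

# With a target at 70% of the source's PQ range, the knee starts at 0.55:
print(bt2390_eetf(0.30, 0.70))  # 0.3 (below the knee, untouched)
print(bt2390_eetf(1.00, 0.70))  # 0.7 (source peak lands exactly on target peak)
```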
Old 29th September 2018, 20:38   #52837  |  Link
ashlar42
Registered User
 
Join Date: Jun 2007
Posts: 652
Quote:
Originally Posted by madshi View Post
Yes, you did! You wrote:

> I doubt that madVR can provide a better tone mapping
> results than the internal system in an OLED screen
Ok, you got me!

In my defense, I meant it in the context of being unable to take ABL into account (which is something I hope the TV does internally; I don't know for sure, though), while I was asking if that could somehow be implemented in madVR. In any case, I'll stop here. I don't have an OLED screen yet (still plasma for me, the last-model Kuro), so I'm just discussing to stay up to date on where everything is going.
Old 29th September 2018, 21:00   #52838  |  Link
blu3wh0
Registered User
 
Join Date: Feb 2014
Posts: 39
Sorry, my mistake. The configuration is greyed out at 2.2, but I should have taken it as disabled. You can ignore the previous comment.
Old 30th September 2018, 03:04   #52839  |  Link
alps006
Registered User
 
Join Date: Sep 2018
Posts: 22
I don't think the driver size matters as long as it works as expected, and it does work well on Windows 8.1. Just think about why madshi keeps recommending 8.1 for HTPC use; that is for a reason. You can see that most users here and on other forums are having issues with Windows 10. Anyway, all I can say is that 8.1 resolved all the issues I was having on Windows 10.
Quote:
Originally Posted by Klaus1189 View Post
You do know it's the Nvidia driver which is broken?
Why is 399.xx working fine for HDR? Because the 399.xx driver is fine.

You get another installer for Win 8.1 which is also smaller:

Version: 411.70 WHQL
Release date: 2018.9.27
Operating system: Windows 7 64-bit, Windows 8.1 64-bit, Windows 8 64-bit
File size: 469.19 MB
--------------------------------------------------------------------------------------------
Version: 411.70 WHQL
Release date: 2018.9.27
Operating system: Windows 10 64-bit
File size: 520.35 MB

I share the opinion that Win 8.1 is better suited for HTPC use, but since I have been using Win 10 for over 3 years now, I think Nvidia should be able to deliver a working driver and not break things repeatedly.

I'll stay with Win 10 for now.
Old 30th September 2018, 03:53   #52840  |  Link
suanm
Registered User
 
Join Date: Apr 2011
Posts: 121
I played several movies just now with madVR v0.92.17. The visual results of HDR mode and non-HDR mode (I can't call it "HDR -> SDR" mode any more, LOL, because of the renamed option) look indistinguishable when I set peak nits to 1099, 899, 699, 499, 299 and 200 respectively. I guess the non-HDR mode should be devoted to OLED TV sets with lower brightness. Unfortunately, running the highlights recovery algorithm consumes PC resources so heavily that playback of HDR movies stutters badly. I hope the algorithm will improve dramatically before long.
Thanks to super master madshi for your diligent work.