Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Old 21st February 2019, 00:08   #54921  |  Link
tp4tissue
Registered User
 
tp4tissue's Avatar
 
Join Date: May 2013
Posts: 300
Quote:
Originally Posted by Grimsdyke View Post
It's way too dark!! For example, the 'Rabbit in Red' matchbook at around 9:12 is almost completely in the dark.
Unfortunately I can't use copyback because my GPU is not fast enough, so I hope that madshi might fix this soon!! Thx
Measure nits only works with CPU decode, copyback DXVA2, or copyback D3D11.

It does not work with native DXVA or native D3D11, because madVR cannot measure the frame's nits there.

There's no way around this: if you want to use dynamic HDR + highlight recovery, you'll have to upgrade your GPU.

Which GPU do you have?
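For context, the "measure nits" step needs the decoded pixels on the CPU so their PQ code values can be converted to luminance; with native DXVA the frames stay in opaque GPU surfaces. A rough sketch of the PQ (SMPTE ST 2084) conversion itself, as an illustration only (this is not madVR's actual code):

```python
def pq_to_nits(code: int, bits: int = 10) -> float:
    """Convert a PQ (SMPTE ST 2084) code value to luminance in nits (cd/m2)."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    ep = (code / (2 ** bits - 1)) ** (1 / m2)  # normalized signal, E'^(1/m2)
    y = max(ep - c1, 0.0) / (c2 - c3 * ep)
    return 10000.0 * y ** (1 / m1)

# Full-range 10-bit peak white maps to the PQ ceiling of 10000 nits:
print(round(pq_to_nits(1023)))  # 10000
print(round(pq_to_nits(0)))     # 0
```

Measuring a frame's peak/average nits is then just running this over the decoded luma and tracking the max — which is exactly what's impossible when the frame never leaves the GPU.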
__________________
Ghetto | 2500k 5Ghz

Last edited by tp4tissue; 21st February 2019 at 00:11.
Old 21st February 2019, 00:31   #54922  |  Link
el Filou
Registered User
 
el Filou's Avatar
 
Join Date: Oct 2016
Posts: 404
Measure nits works with native DXVA, but black bar detection doesn't, so maybe the two features are interacting in a bad way.
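For readers unfamiliar with the feature: black bar detection of this kind can be sketched as scanning the luma plane for rows that are almost entirely at video black — again something that needs CPU access to the pixels. A hypothetical illustration (the thresholds and function name are my own, not madVR's):

```python
import numpy as np

def detect_black_bars(luma: np.ndarray, thresh: int = 18, frac: float = 0.99):
    """Return (top, bottom) bar heights: leading/trailing rows whose pixels
    are almost all below a near-black threshold (16 = video black in 8-bit
    limited range, so 18 leaves a little noise headroom)."""
    dark = (luma < thresh).mean(axis=1) >= frac   # per-row "is this row black?"
    if dark.all():
        return luma.shape[0], 0
    top = int(np.argmax(~dark))            # first non-dark row from the top
    bottom = int(np.argmax(~dark[::-1]))   # first non-dark row from the bottom
    return top, bottom

# A 2.35:1 frame letterboxed into 16:9 would yield matching top/bottom bars:
frame = np.full((100, 100), 128, dtype=np.uint8)
frame[:10] = 16
frame[-10:] = 16
print(detect_black_bars(frame))  # (10, 10)
```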
__________________
HTPC: W10 1809, E7400, 1050 Ti, DVB-C, Denon 2310, Panasonic GT60 | Desktop: W10 1809, 4690K, HD 7870, Dell U2713HM | MediaPortal 1/MPC-HC, LAV Filters, ReClock, madVR
Old 21st February 2019, 08:53   #54923  |  Link
tp4tissue
Registered User
 
tp4tissue's Avatar
 
Join Date: May 2013
Posts: 300
Quote:
Originally Posted by el Filou View Post
Measure nits works with native DXVA, but black bar detection doesn't, so maybe the two features are interacting in a bad way.
Why does native DXVA cause banding on the RX 580, though?
__________________
Ghetto | 2500k 5Ghz
Old 21st February 2019, 10:15   #54924  |  Link
DMU
Registered User
 
Join Date: Dec 2018
Posts: 24
Quote:
Originally Posted by tp4tissue View Post
Why does native DXVA cause banding on the RX 580, though?
I have the same issue on my Radeon Vega 8. In addition, DXVA (native) is more resource-intensive than copyback.

Last edited by DMU; 21st February 2019 at 10:36.
Old 21st February 2019, 10:21   #54925  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 660
Did anyone notice that 12bits on nVidia seems to distort patterns when using MadTPG as a pattern source (in 4K UHD SDR at 23p) compared to 8bits?
Could anyone with access to calibration software/equipment confirm this?
This is most visible when tracking gamut saturation and luminance linearity, but it also impacts greyscale. White balance isn't impacted significantly.
8bits is very linear and doesn't distort.
12bits breaks linearity and distorts the native gamut significantly.
It would be great if someone could confirm.
I don't use FSE, this is in full screen windowed mode.
If you don't experience this, please specify your OS, nVidia driver version, and the software used to calibrate/measure. Although unlikely, it could be caused by a specific setting in madVR, but I haven't had the time to investigate this yet. I have all the performance settings unchecked.
Details of rig in my sig (except the driver: I was testing 385.28 for 12 bits, as I can't get a custom refresh rate to work with CRU on the new JVC models).
__________________
Win10 Pro x64 b1806 MCE
i7 3770K@4.0Ghz 32Gb@2.18Ghz EVGA GTX 1080 Ti SC2 11Gb@2GHz 430.39 RGB Full 8bits
MPC-BE/LAV/MadVR/jRiver/MyMovies V5.24
Denon X8500H>HD Fury Maestro>JVC RS2000

Last edited by Manni; 21st February 2019 at 10:50.
Old 21st February 2019, 11:04   #54926  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,455
Can you be a bit more precise so I can do the test quicker?
Old 21st February 2019, 11:57   #54927  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 660
Quote:
Originally Posted by huhn View Post
Can you be a bit more precise so I can do the test quicker?
I can't be more precise. I can ask differently, though.

Using MadTPG as a source for patterns, in 4K23p SDR, full screen windowed, using the widest gamut your display can fully cover, preferably DCI-P3:

1) Measure gamut saturation (in 20% or 25% steps) with the nVidia CP set to 8 bits (after checking that levels are correct).
2) Measure gamut saturation (in 20% or 25% steps) with the nVidia CP set to 12 bits (after checking that levels are still correct; they might need to be changed when switching between 8 bits and 12 bits).

Is there any significant difference?

Please report OS, driver, GPU, calibration software used with your results. I used 385.28 for my tests, otherwise rig as in my sig.

Thanks!
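One way to put a number on "any significant difference" is a per-patch colour difference between the two runs. A minimal CIE76 sketch (the patch readings below are invented for illustration; real tools such as DisplayCAL report dE2000, which weights the terms differently):

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 colour difference: Euclidean distance in CIELAB space."""
    return math.dist(lab1, lab2)

# Hypothetical patch read at 8-bit vs 12-bit output (made-up numbers):
reading_8bit = (50.0, 20.0, -10.0)   # (L*, a*, b*)
reading_12bit = (50.3, 20.4, -10.2)
print(delta_e_76(reading_8bit, reading_12bit))  # ~0.54, below the ~1.0 visibility rule of thumb
```

If the 12-bit path really distorts the gamut, the worst-patch delta E between the two runs should jump well above measurement noise.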
__________________
Win10 Pro x64 b1806 MCE
i7 3770K@4.0Ghz 32Gb@2.18Ghz EVGA GTX 1080 Ti SC2 11Gb@2GHz 430.39 RGB Full 8bits
MPC-BE/LAV/MadVR/jRiver/MyMovies V5.24
Denon X8500H>HD Fury Maestro>JVC RS2000

Last edited by Manni; 21st February 2019 at 12:00.
Old 21st February 2019, 12:01   #54928  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,455
Is 'send BT.2020' on or off?
Old 21st February 2019, 13:15   #54929  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 660
Quote:
Originally Posted by huhn View Post
Is 'send BT.2020' on or off?
It's on when I play HDR content, not when I calibrate. And it doesn't matter whether it's 8 bits or 12 bits, as the profile is the same: SDR when calibrating (I use SDR patterns to create my SDR DCI-P3 calibration), HDR when playing content.

In any case, it can't make any difference: I select the calibration manually when I calibrate.

'Send BT.2020' is only a metadata flag in the SDR HDMI stream that I asked madshi to implement to help select a calibration automatically. It doesn't change anything in the content/patterns themselves. The only thing it can achieve is getting the display to switch to a different calibration, that's all.

So as long as the proper calibration is selected when calibrating, no difference can come from that. I know that's the case here, so make sure it's also the case when you run the test.

If you need BT.2020 to be on for your display to select the widest gamut, then use it. Just make sure the same calibration is enabled at 8 bits and 12 bits, so that there shouldn't be any difference when measuring.
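The "only a metadata flag" point matches how HDMI colorimetry signalling works: the source merely sets a couple of bit fields in the AVI InfoFrame, defined in CTA-861, and the pixel data is untouched. A hedged sketch (field packing simplified; the function name is mine):

```python
def avi_colorimetry_bits(bt2020: bool) -> tuple:
    """Return (C1C0, EC2..EC0) AVI InfoFrame colorimetry fields per CTA-861.

    Only these metadata bits change between 'send BT.2020' on/off;
    the video samples themselves are identical either way.
    """
    if bt2020:
        return 0b11, 0b110  # C = extended colorimetry, EC = BT.2020 RGB/YCbCr
    return 0b10, 0b000      # C = ITU-R BT.709

print(avi_colorimetry_bits(True))   # (3, 6)
print(avi_colorimetry_bits(False))  # (2, 0)
```

Whether the sink reacts to those bits (by switching picture modes or gamut mapping) is entirely up to the display — which is exactly the disagreement in this exchange.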
__________________
Win10 Pro x64 b1806 MCE
i7 3770K@4.0Ghz 32Gb@2.18Ghz EVGA GTX 1080 Ti SC2 11Gb@2GHz 430.39 RGB Full 8bits
MPC-BE/LAV/MadVR/jRiver/MyMovies V5.24
Denon X8500H>HD Fury Maestro>JVC RS2000

Last edited by Manni; 21st February 2019 at 13:18.
Old 21st February 2019, 13:23   #54930  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,455
Driver 399.07, GTX 960
DisplayCAL 3.7.1.4
Windows 10 17134
no correction
simulation profile: DCI-P3 D65
'send BT.2020' was not used
screen at its native gamut

Biggest delta E between 8-bit output at 8 bits and 10-bit output at 12 bits: 0.2.
The verification test chart I used only has 50% steps (didn't know that before); I'll do a proper one on the weekend, or maybe today.

BTW, I'm not even aware of an LCD screen that can do 100% of DCI-P3.
Old 21st February 2019, 13:30   #54931  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,455
Quote:
Originally Posted by Manni View Post
It's on when I play HDR content, not when I calibrate. And it doesn't matter whether it's 8 bits or 12 bits, as the profile is the same: SDR when calibrating (I use SDR patterns to create my SDR DCI-P3 calibration), HDR when playing content.

In any case, it can't make any difference: I select the calibration manually when I calibrate.

'Send BT.2020' is only a metadata flag in the SDR HDMI stream that I asked madshi to implement to help select a calibration automatically. It doesn't change anything in the content/patterns themselves. The only thing it can achieve is getting the display to switch to a different calibration, that's all.

So as long as the proper calibration is selected when calibrating, no difference can come from that. I know that's the case here, so make sure it's also the case when you run the test.

If you need BT.2020 to be on for your display to select the widest gamut, then use it. Just make sure the same calibration is enabled at 8 bits and 12 bits, so that there shouldn't be any difference when measuring.
When a screen thinks the input image is BT.2020, it has to change its processing, because it has to clip, roll off, or simply squeeze the gamut in, and that's totally different from BT.709, or from switching the screen to 'native' while it still believes the input is BT.709 (or at least should believe that).

In other words, your device may or may not behave totally differently between 'send BT.2020' and native.

But you don't use this option to calibrate and/or verify, so it shouldn't matter in this case.
Old 21st February 2019, 13:35   #54932  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 660
Quote:
Originally Posted by huhn View Post
Driver 399.07, GTX 960
DisplayCAL 3.7.1.4
Windows 10 17134
no correction
simulation profile: DCI-P3 D65
'send BT.2020' was not used
screen at its native gamut

Biggest delta E between 8-bit output at 8 bits and 10-bit output at 12 bits: 0.2.
The verification test chart I used only has 50% steps (didn't know that before); I'll do a proper one on the weekend, or maybe today.

BTW, I'm not even aware of an LCD screen that can do 100% of DCI-P3.
Thanks a lot for checking this so quickly. Much appreciated.

I'll run some tests in DisplayCAL with your driver version to try to isolate where the issue might be coming from.

I never measure LCD monitors because I don't calibrate them. I only calibrate what I use to watch films, and that's never a monitor or even a TV.

I use a JVC RS2000 projector, and with the P3 filter it can reach 100% of P3 in high lamp. Some units (not mine) can reach up to 108% of P3 with the filter. The Z1/RS4500 (laser) can go even wider than that. On the newer models, such as my previous RS500, there are two filters, one green and one red (well, dichroic filters if you prefer). On earlier models (which could also reach 100% of P3), there was only one green filter. Without the filters, they reach around 85-95% of P3, depending on the unit.
__________________
Win10 Pro x64 b1806 MCE
i7 3770K@4.0Ghz 32Gb@2.18Ghz EVGA GTX 1080 Ti SC2 11Gb@2GHz 430.39 RGB Full 8bits
MPC-BE/LAV/MadVR/jRiver/MyMovies V5.24
Denon X8500H>HD Fury Maestro>JVC RS2000

Last edited by Manni; 21st February 2019 at 13:37.
Old 21st February 2019, 13:41   #54933  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 660
Quote:
Originally Posted by huhn View Post
When a screen thinks the input image is BT.2020, it has to change its processing, because it has to clip, roll off, or simply squeeze the gamut in, and that's totally different from BT.709, or from switching the screen to 'native' while it still believes the input is BT.709 (or at least should believe that).

In other words, your device may or may not behave totally differently between 'send BT.2020' and native.

But you don't use this option to calibrate and/or verify, so it shouldn't matter in this case.
Of course the display will process the content differently depending on whether it's Rec. 709 or BT.2020.

But the BT.2020 flag by itself doesn't change anything in the content. As I explained, the only thing it can change is which display mode the display automatically selects when it detects the flag. It's only a flag in the SDR HDMI stream. So if you select the display mode manually, or if the BT.2020 flag has no effect on your display, the use of the flag doesn't account for any difference in measurements.
__________________
Win10 Pro x64 b1806 MCE
i7 3770K@4.0Ghz 32Gb@2.18Ghz EVGA GTX 1080 Ti SC2 11Gb@2GHz 430.39 RGB Full 8bits
MPC-BE/LAV/MadVR/jRiver/MyMovies V5.24
Denon X8500H>HD Fury Maestro>JVC RS2000
Old 21st February 2019, 15:14   #54934  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,455
Let me say it this way: my screen disagrees, and behaves very differently between native and 'send BT.2020'.
Old 21st February 2019, 16:01   #54935  |  Link
madjock
Registered User
 
Join Date: May 2018
Posts: 202
Quote:
Originally Posted by huhn View Post
Let me say it this way: my screen disagrees, and behaves very differently between native and 'send BT.2020'.
Cool, one more thing to confuse me now.
__________________
Windows 10-1809 | i5-3570k | GTX 1070 Windforce OC Rev2 8GB : 385.28 | Yamaha RX-V377 | Philips 65PUS6703 - 65"
Old 21st February 2019, 16:17   #54936  |  Link
el Filou
Registered User
 
el Filou's Avatar
 
Join Date: Oct 2016
Posts: 404
So does that mean the differences Manni is seeing could be explained by a different processing of 8-bit and 12-bit input by the display?
__________________
HTPC: W10 1809, E7400, 1050 Ti, DVB-C, Denon 2310, Panasonic GT60 | Desktop: W10 1809, 4690K, HD 7870, Dell U2713HM | MediaPortal 1/MPC-HC, LAV Filters, ReClock, madVR
Old 21st February 2019, 16:27   #54937  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,455
Could be, but there are so many more possibilities.
I'm not going to put a finger on it yet, not until I have better readings from more than one person.
Old 21st February 2019, 17:06   #54938  |  Link
tp4tissue
Registered User
 
tp4tissue's Avatar
 
Join Date: May 2013
Posts: 300
Quote:
Originally Posted by Manni View Post
Did anyone notice that 12bits on nVidia seems to distort patterns when using MadTPG as a pattern source (in 4K UHD SDR at 23p) compared to 8bits?
Could anyone with access to calibration software/equipment confirm this?
This is most visible when tracking gamut saturation and luminance linearity, but it also impacts greyscale. White balance isn't impacted significantly.
8bits is very linear and doesn't distort.
12bits breaks linearity and distorts the native gamut significantly.
It would be great if someone could confirm.
I don't use FSE, this is in full screen windowed mode.
If you don't experience this, please specify your OS, nVidia driver version, and the software used to calibrate/measure. Although unlikely, it could be caused by a specific setting in madVR, but I haven't had the time to investigate this yet. I have all the performance settings unchecked.
Details of rig in my sig (except the driver: I was testing 385.28 for 12 bits, as I can't get a custom refresh rate to work with CRU on the new JVC models).

I noticed this in the form of less gradient smoothness.

But I don't think it's nVidia's issue; the TV probably doesn't react very well to 10- or 12-bit input, because its FRC dithering algorithm may have some inherent sharpening or debanding that doesn't allow for a perfectly smooth gradient.
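The gradient-smoothness point is easy to demonstrate: straight quantization of a smooth ramp to 8 bits collapses it into flat steps (banding), while adding sub-LSB noise before rounding — what madVR's dithering and a panel's FRC do, in far more sophisticated forms — keeps the local average close to the original signal. A toy sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
ramp = np.linspace(0.0, 1.0, 1 << 16)   # smooth, high-precision ramp

plain = np.round(ramp * 255.0)          # straight 8-bit quantization -> banding
dithered = np.round(ramp * 255.0 + rng.uniform(-0.5, 0.5, ramp.size))

# The undithered ramp has only 256 flat steps; the dithered one varies
# pixel-to-pixel, trading banding for fine noise while preserving the
# average level of the original signal.
print(np.unique(plain).size)            # 256
```

Any sharpening or debanding the panel applies on top of such noise can easily re-introduce visible structure, which is consistent with what tp4tissue describes.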
__________________
Ghetto | 2500k 5Ghz
Old 21st February 2019, 17:10   #54939  |  Link
YGPMOLE
Registered User
 
Join Date: Nov 2012
Posts: 29
A couple of stupid questions (sorry!).

1) I'm now trying the beta version (.54) with an AMD RX 480 in full RGB 8-bit, LAV in D3D11, madVR at 8-bit, but the on-screen display shows a D3D11 10-bit output that drops to 8-bit only when I open the menu (and goes back to 10 when I close it), even with 8-bit SD or Full HD files as input: is that normal?

2) I have all the display modes entered (from 1080p23 to 2160p60), but I get no switches with 23.976, 24.000 or 25.000 files like I did with the .17 version: still normal?
__________________
Best Regards! Leo!
Old 21st February 2019, 17:10   #54940  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 660
Quote:
Originally Posted by huhn View Post
Let me say it this way: my screen disagrees, and behaves very differently between native and 'send BT.2020'.
Yes, and it should, if you don't have another way to select a calibration manually.

This is not the way it works with all displays.

And again, the way the display behaves has nothing to do with the content. The content is identical even when the 'send BT.2020' flag is sent.

It is, again, only a single metadata flag.

So either your screen reacts to it and selects a different calibration, or your screen doesn't react to it.

You are really making things confusing with your statements: you are confusing the display's response and the content.

The content is 100% the same whether the 'report BT.2020' checkbox is enabled or not.

Enabling it *might* trigger a different calibration/mode in your display.

If it does, then of course the display's response will be different. Hopefully it will be more appropriate to the content. But the content itself will be 100% identical.

So if you can manually select the mode that *might* be selected automatically when the BT.2020 flag is enabled, there should be zero difference, because the content is the same.

Can you please confirm this instead of implying that things might be different? It will help everyone understand what is happening and avoid unnecessary confusion.

The issue I am reporting, which apparently you are not experiencing, has nothing to do with the BT.2020 flag. At least on my display, I can tell you it's 100% unrelated. It simply can't be.

What you are stating, which is that your display doesn't respond the same whether you enable the BT.2020 flag or not, is 100% unrelated. It is to be expected with most displays.

As long as the flag remains the same during measurements at 8 bits and 12 bits, and the levels are the same, it makes zero difference. Enable it if you need it to access your wide/native gamut, disable it otherwise; just make sure it's consistent between the two measurements.

@el Filou: yes, what I'm seeing could be explained either by different processing in the display between 8 bits and 12 bits, or by a bug in the drivers/OS/calibration software. That's why I first need to see if I can get the expected results at 12 bits, the way huhn does, and then see what produces the wrong results, changing one parameter at a time (driver, calibration software). If I still have the issue, then it might be the processing in the display, and I'll have to try with another display until I find the cause.

But it's great to know that it's not an issue for everyone; at least it gives me something to try. So thanks again to huhn for that.

If anyone has a JVC projector and would like to test and report if they see the same behaviour I do (or not), that would be great too.

I don't have the version huhn used, but I'll try with the latest before that, which is 397.93. 398.11 changed a lot of things, so 397.93 should be similar to 398.07.
__________________
Win10 Pro x64 b1806 MCE
i7 3770K@4.0Ghz 32Gb@2.18Ghz EVGA GTX 1080 Ti SC2 11Gb@2GHz 430.39 RGB Full 8bits
MPC-BE/LAV/MadVR/jRiver/MyMovies V5.24
Denon X8500H>HD Fury Maestro>JVC RS2000
Tags
direct compute, dithering, error diffusion, madvr, ngu, nnedi3, quality, renderer, scaling, uhd upscaling, upsampling
