Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Old 31st May 2018, 00:03   #51121  |  Link
Polopretress
Registered User
 
Join Date: Sep 2017
Posts: 28
Quote:
Originally Posted by nevcairiel View Post
Clock deviation makes no real difference. madVR accounts for it when calculating the repeat/drop timer, you just need to play long enough for it to stabilize, and the ultimate test is for every setup to just play a 2 hour movie and check the stats afterwards (clear them after launch to get rid of the first drops during startup).
I agree, but not with the conclusion.
Clock deviation is taken into account in the calculation, but the result will not be "no drop frame expected" when the display is at 23.97602 Hz if the clock deviation is not null.

That means that, with clock deviation, if you want to target "no drop frame expected", you will not be at 23.97602.
(Or if you want to target 23.97602, you will not be at "no drop frame expected".)

And try both players (MPC and PotPlayer) in PCM with the same settings, and you will see that "display" is at the same value while the time counter is different.

So which player is telling me the truth?
For me, it is clear that the value of "display" is the truth, because both players give the same result.
My conclusion is therefore that it is the time counter that is wrong when using MPC-BE with a non-null clock deviation.

That is the reason why I said the judge is the value of "display" and not the time counter when the clock deviation is not null.
So I think that clock deviation is taken into account more in the value of "display" than in the value of the time counter.


If I am wrong, do you mean that I need to follow the time counter instead of "display"?
If that is the case, how do you explain that the value of "display" is the same with a player with no clock deviation and another with a non-null clock deviation?


Quote:
I also assume you are not bitstreaming DTS-HD or TrueHD via HDMI to get 0.000000%, and PotPlayer uses some sort of re-clock, which is no use to people who want to bitstream.
Correct, I am in PCM.
PotPlayer in bitstream works fine, but the clock deviation is not null (like MPC-BE).
Nevertheless, the value of "display" is the same and perfectly adjusted to 23.97602 (but the time counter is different and very far from "no drop frame expected").

Same question as before: which one is the truth?
For me it is "display" and not the counter.

Show me an example with a display at 23.97602 and the counter at "no drop frame expected" while the clock deviation is not null, and I will revise my judgement.

Last edited by Polopretress; 31st May 2018 at 00:15.
Old 31st May 2018, 00:15   #51122  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 9,493
Quote:
Originally Posted by Polopretress View Post
I agree, but not with the conclusion.
Clock deviation is taken into account in the calculation, but the result will not be "no drop frame expected" when the display is at 23.97602 Hz if the clock deviation is not null.

That means that, with clock deviation, if you want to target "no drop frame expected", you will not be at 23.97602.
(Or if you want to target 23.97602, you will not be at "no drop frame expected".)
But that's perfectly fine. The goal is to not have any frame drops or repeats, not to hit some special magical number. It's perfectly expected to require a slightly different refresh rate to compensate for the clock deviation.

The perfect refresh rate would be the one where the display refresh rate, adjusted by the clock deviation, comes out at 23.97602 (without software tricks enabled). For example, if you have a small clock deviation of 0.003%, you would want a refresh rate of 23.9767 instead (because 23.9760 + 0.003% ≈ 23.9767).
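That arithmetic can be sketched like this (an illustrative snippet, not madVR's actual code; the function name is made up):

```python
def compensated_refresh(target_fps, clock_deviation_percent):
    # Raise (or lower) the nominal refresh rate so that, after the
    # display clock's deviation is applied, the effective rate matches
    # the video frame rate.
    return target_fps * (1 + clock_deviation_percent / 100.0)

print(round(compensated_refresh(23.9760, 0.003), 4))  # → 23.9767
```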

And as others have said, in PCM mode PotPlayer just "cheats" by modifying the audio to perfectly match the video clock, i.e. like ReClock. You could get it to match any refresh rate then. But it does modify the audio, so it's no longer "bit-exact". It's probably not something anyone can hear and it usually works just fine, but you need to be aware that your software works around the hardware problem in this manner. The clock deviation does not go away; it's just "hidden" from madVR and compensated for on the audio side instead. And of course this does not work when bitstreaming.

The entire drop/repeat frame mechanic only exists because of audio. If we had no audio, we could just show the video at 23.977 instead of 23.976 and no person in the world would ever notice. But we do have audio, and the audio would lose sync with the video. So to maintain sync, the video renderer has to drop/repeat a frame when appropriate. The alternative solution is to change the audio instead, which is what PotPlayer is doing, and why it claims a zero clock deviation - because it handles that, and madVR does not have to.

The only number that really matters is the number of dropped/repeated frames at the end of the movie. Everything else is just there as information.
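As a back-of-the-envelope illustration (my own arithmetic, not a madVR statistic; the function name is made up): a drop or repeat becomes necessary each time the accumulated clock mismatch adds up to one full frame period, which gives a rough interval between drops:

```python
def seconds_between_drops(fps, clock_deviation_percent):
    # One frame is gained or lost each time the accumulated mismatch
    # between the video and display clocks reaches one frame period.
    mismatch = abs(clock_deviation_percent) / 100.0
    return 1.0 / (fps * mismatch)

# e.g. an uncompensated 0.003% deviation at 23.976 fps:
interval = seconds_between_drops(23.976, 0.003)
print(round(interval / 60, 1))  # → 23.2 (minutes between drops/repeats)
```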
__________________
LAV Filters - open source ffmpeg based media splitter and decoders

Last edited by nevcairiel; 31st May 2018 at 00:46.
Old 31st May 2018, 00:35   #51123  |  Link
el Filou
Registered User
 
el Filou's Avatar
 
Join Date: Oct 2016
Posts: 331
Quote:
Originally Posted by Manni View Post
HDR to SDR, when done well, is the same as HDR. It's just that the source (MadVR here) does the conversion instead of the display.
If that really were the case, then why do TV reviewers who test things like a 2% white window find very different brightness levels in SDR and HDR modes (the last two that come to mind were 400 cd/m² SDR / 700 HDR on an LG OLED, and 750 SDR / 1250 HDR on a Sony LCD)?
__________________
HTPC: W10 1803, E7400, NVIDIA 1050 Ti, DVB-C, Panasonic GT60 | Desktop: W10 1803, 4690K, AMD 7870, Dell U2713HM | Laptop: Insider Slow, i5-2520M | MediaPortal 1/MPC-HC, LAV Filters, ReClock, madVR
Old 31st May 2018, 01:03   #51124  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 549
Quote:
Originally Posted by el Filou View Post
If that really were the case, then why do TV reviewers who test things like a 2% white window find very different brightness levels in SDR and HDR modes (the last two that come to mind were 400 cd/m² SDR / 700 HDR on an LG OLED, and 750 SDR / 1250 HDR on a Sony LCD)?
Is this with MadVR?

Some players do an HDR to SDR conversion that doesn't exploit the whole dynamic range. Others don't even use the wider gamut.

I'm talking about pixel shader with MadVR, given that it's the topic of the thread. I don't know about OLEDs/LCDs, I can only comment about what I see on my JVC projector. Some displays might not be able to provide the SDR BT2020 mode with full brightness that the JVC delivers to be used as a baseline for MadVR's HDR to SDR conversion (pixel shader).
__________________
Win 10 Pro x64 V1803 MCE add-on
i7 3770K@4.2Ghz 16Gb@2.1Ghz EVGA GTX 1080 Ti SC2 11Gb@2GHz 385.28 RGB Full 12bits
MPC-BE / LAV / MadVR / MyMovies V5.24
Denon X8500H>Vertex>JVC RS500/X7000
Old 31st May 2018, 01:34   #51125  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,298
Yes, your projector probably does not have a separate HDR mode in the same way TVs do. All the non-projectors I have seen (not that many) work differently: you cannot use pixel shaders to get a similar image when the display is in its SDR mode. For one thing, it will not go as bright, reasonably so, given that SDR at 700 nits peak white would be pretty bright and the display couldn't handle it. Projectors tend to be a lot dimmer, so maybe they use the same peak brightness for both HDR and SDR.

I think we need to remember that HDR works very differently on projectors compared to backlit displays, and not assume we can transfer our experience directly.
__________________
madVR options explained
Old 31st May 2018, 05:13   #51126  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 935
On the HDR -> SDR for 1080p question:

I have two 1080p displays. One is calibrated to about 150 nits; the other is probably closer to 200 nits. I find a target between 400 and 500 nits achieves adequate brightness.

Set the tone mapping curve to BT.2390.

Gamut mapping can be set to any of the complex scientific modes or to dumb mode - convert gamut late. It can be hard to immediately notice a difference between the gamut mapping algorithms, so this choice may not be critical to you.

Check the rest of the boxes at the bottom.

It is really that simple and not that hard to figure out. Adjust the target nits to your own tastes.

It is an adequate way to show HDR on a non-HDR screen if you accept the limitations of tone mapping. It is certainly a viable way to watch 4K UHD content.

As for the levels thing, I think this is a JVC projector oddity. I was trying to help someone with a JVC projector at AVSForum, and his projector displays the strangest behavior. Every time he would fix something, something else would break. It is the most bizarre machine I have come across that runs madVR. His graphics card is a GTX 1050. He actually posted in this forum, so he might be reading this. I know you haven't had as many problems with your JVC, but it doesn't seem to like Windows or anything output from a GPU.

Last edited by Warner306; 31st May 2018 at 05:17.
Old 31st May 2018, 06:01   #51127  |  Link
ryrynz
Registered User
 
ryrynz's Avatar
 
Join Date: Mar 2009
Posts: 2,970
Is there an HDR test video that one could use for setting the ideal target nits?
Old 31st May 2018, 08:03   #51128  |  Link
sauma144
Registered User
 
Join Date: Sep 2016
Posts: 89
Quote:
Originally Posted by nevcairiel View Post
In PCM mode PotPlayer just "cheats" by modifying the audio to perfectly match the video clock, i.e. like ReClock. You could get it to match any refresh rate then.
Seriously? I never noticed.
Old 31st May 2018, 10:31   #51129  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 549
Quote:
Originally Posted by Warner306 View Post
As for the levels thing, I think this is a JVC projector oddity. I was trying to help someone with a JVC projector at AVSForums and his projector displays the strangest behavior. Every time he would fix something, something else would break. It is the most bizzare machine I have come across that runs madVR. His graphics card is a GTX 1050. He actually posted in this forum, so he might be reading this. I know you haven't had as many problems with your JVC, but it doesn't seem to like Windows or anything output from a GPU.
I was in touch with him by PM. This is because he was using a recent driver (39x.x). As I reported yesterday, each 39x version broke something new, at least here. The best driver to use with a JVC is 385.28. With this, you can get correct levels in RGB Full, 12bits, in SDR and HDR, and he will only get the magenta bug (which can be fixed, as I told him, by disabling "send HDR metadata") at 4K60p.

With a recent driver, he can solve things if he disables "Send HDR Metadata" and uses 8bits in the GPU. But that's not an option for me at this stage (I need the metadata for the Vertex), so I went back to 385.28 for now.

The main downside of 385.28 is 3D (one frame drop/repeat every 3 min vs every 13 min with 397.93), and very minor banding in HDR passthrough (not sure when that was introduced by nVidia, but it's still there with 39x.x). You can solve most of the banding in passthrough by using 9bits instead of 10bits dithering in MadVR; unfortunately that's not possible with the latest MadVR build, so you have to revert to the previous one.

The JVCs do have some quirks, but they can be used just fine if you know how to drive them.
__________________
Win 10 Pro x64 V1803 MCE add-on
i7 3770K@4.2Ghz 16Gb@2.1Ghz EVGA GTX 1080 Ti SC2 11Gb@2GHz 385.28 RGB Full 12bits
MPC-BE / LAV / MadVR / MyMovies V5.24
Denon X8500H>Vertex>JVC RS500/X7000

Last edited by Manni; 31st May 2018 at 11:01.
Old 31st May 2018, 10:52   #51130  |  Link
SuLyMaN
Registered User
 
Join Date: Jul 2007
Posts: 156
Will madVR work with Intel HD 630 onboard graphics? Looking at the requirements, I'd say yes... but you never know. Has anybody tried it?
Old 31st May 2018, 12:27   #51131  |  Link
madjock
Registered User
 
Join Date: May 2018
Posts: 107
Quote:
Originally Posted by SuLyMaN View Post
Will madVR work with Intel HD 630 onboard graphics? Looking at the requirements, I'd say yes... but you never know. Has anybody tried it?
Depends what you mean by "work". Since it is not a dedicated card with its own memory, you will be very limited in what you can do with madVR.

Last edited by madjock; 31st May 2018 at 12:35.
Old 31st May 2018, 15:46   #51132  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 935
Quote:
Originally Posted by Manni View Post
I was in touch with him by PM. This is because he was using a recent driver (39x.x). As I reported yesterday, each 39x version broke something new, at least here. The best driver to use with a JVC is 385.28. With this, you can get correct levels in RGB Full, 12bits, in SDR and HDR, and he will only get the magenta bug (which can be fixed, as I told him, by disabling "send HDR metadata") at 4K60p.

With a recent driver, he can solve things if he disables "Send HDR Metadata" and uses 8bits in the GPU. But that's not an option for me at this stage (I need the metadata for the Vertex), so I went back to 385.28 for now.

The main downside of 385.28 is 3D (one frame drop/repeat every 3 min vs every 13 min with 397.93), and very minor banding in HDR passthrough (not sure when that was introduced by nVidia, but it's still there with 39x.x). You can solve most of the banding in passthrough by using 9bits instead of 10bits dithering in MadVR; unfortunately that's not possible with the latest MadVR build, so you have to revert to the previous one.

The JVCs do have some quirks, but they can be used fine if you know how to drive them
Well I'm glad you helped him. He was using 385.28 when I talked to him and was still having black screens and some oddities with 3D playback, so I'm not sure if it has all been ironed out.

Last edited by Warner306; 31st May 2018 at 16:44.
Old 31st May 2018, 16:20   #51133  |  Link
veggav
Registered User
 
Join Date: Mar 2008
Posts: 76
Quick question:

My TV has the following color spaces to choose from:
sRGB/BT.709
DCI
Adobe RGB
BT.2020

And MadVR has the following options in "the display is calibrated to the following primaries / gamut":
BT.709
SMPTE C
EBU / PAL
BT.2020
DCI-P3

I let madVR change my display resolution to 2160p23 for movies.
So my question is: if I'm playing a Blu-ray that is BT.709 and have DCI selected in my display settings and DCI-P3 in MadVR, do I get better colors?


Also, when using the pure power curve option set to 2.40, would that translate to the gamma option -2 on the display?
I mean, 2.30 = -1 and 2.20 = 0?
Old 31st May 2018, 16:25   #51134  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 935
Quote:
Originally Posted by veggav View Post
Quick question:

My TV has the following color spaces to choose from:
sRGB/BT.709
DCI
Adobe RGB
BT.2020

And MadVR has the following options in "the display is calibrated to the following primaries / gamut":
BT.709
SMPTE C
EBU / PAL
BT.2020
DCI-P3

I let madVR change my display resolution to 2160p23 for movies.
So my question is: if I'm playing a Blu-ray that is BT.709 and have DCI selected in my display settings and DCI-P3 in MadVR, do I get better colors?


Also, when using the pure power curve option set to 2.40, would that translate to the gamma option -2 on the display?
I mean, 2.30 = -1 and 2.20 = 0?
The calibration setting only applies to SDR content like 1080p Blu-ray. Yes, madVR will upconvert a BT.709 source to DCI-P3 or BT.2020 (which is the recommended setting). I'm not sure you will get better colors, but you will get different colors. It depends on whether you like this effect or not.

The gamma setting is only in effect when enable gamma processing is selected. It is generally advised to set the gamma at the display level rather than the media player.
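For intuition (a hypothetical worked example, not a description of any display's gamma presets): a pure power curve maps a normalized signal v to v^gamma, so a higher gamma darkens the mid-tones, which is why switching between 2.2 and 2.4 is visible:

```python
def pure_power(v, gamma):
    # Normalized display luminance (0-1) for signal v under a pure power curve.
    return v ** gamma

# Mid-grey (50% signal) under the two curves:
print(round(pure_power(0.5, 2.2), 3))  # → 0.218
print(round(pure_power(0.5, 2.4), 3))  # → 0.189
```

Whether 2.40 corresponds to a given display's "-2" preset depends entirely on how that display labels its presets, so the mapping can't be assumed without measuring.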
Old 31st May 2018, 16:43   #51135  |  Link
arcspin
Registered User
 
Join Date: Apr 2018
Location: Stockholm, Sweden
Posts: 25
Quote:
Originally Posted by Manni View Post
I was in touch with him by PM. This is because he was using a recent driver (39x.x). As I reported yesterday, each 39x version broke something new, at least here. The best driver to use with a JVC is 385.28. With this, you can get correct levels in RGB Full, 12bits, in SDR and HDR, and he will only get the magenta bug (which can be fixed, as I told him, by disabling "send HDR metadata") at 4K60p.
Hi Manni,
I have a question: when I set the NVIDIA GPU driver to 12 bits, the greens and the blacks/whites are all messed up; when I switch back to 8 bits, all is well.

I have tried this in full screen exclusive mode (to take into account the missing 10 bit feature in the current MadVR).
I have, with a great deal of help from Asmodian in another thread, debugged my system, and the end result is that I have to set the NVIDIA GPU driver to 8 bits to get the colors and blacks/whites to work properly.

The end result from our debugging session is here:
https://forum.doom9.org/showthread.p...08#post1841508

I have my NVIDIA GPU Driver set to:
NVIDIA RGB Full 8bpc > Madvr custom level 16-255 > JVC Enhanced.
I get the same result in: NVIDIA RGB Full 8bpc > Madvr Tv level > JVC Standard
(My projector is professionally calibrated to JVC Enhanced)


This is what I use:
WIN 10 64-bit, version 1709 (no spring update) with NVIDIA GTX 1060 (385.28 driver)
I have also created a custom resolution in the NVIDIA control panel for 3840 x 2160 @ 23.978 Hz to get a low clock deviation in MadVR.
JRiver (24.0.20, 64-bit) with Madvr (0.92.14)
JVC RS420/X5500 projector (4K, capable of receiving 12-bit).



Do you have any idea why I can't get the greens and blacks/whites to work in 12 bits?


Best regards,

//Peter
Old 31st May 2018, 16:52   #51136  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 935
Quote:
Originally Posted by ryrynz View Post
Is there an HDR test video that one could use for setting the ideal target nits?
A colorimeter would do the job of telling you how bright your display is. Maybe you could adjust the target nits with HDR grayscale test patches up to 10,000 nits, but I don't know how that would work. Such patterns might help with adjusting the brightness for lower nits.
Old 31st May 2018, 17:00   #51137  |  Link
veggav
Registered User
 
Join Date: Mar 2008
Posts: 76
Quote:
Originally Posted by Warner306 View Post
The calibration setting only applies to SDR content like 1080p Blu-ray. Yes, madVR will upconvert a BT.709 source to DCI-P3 or BT.2020 (which is the recommended setting). I'm not sure you will get better colors, but you will get different colors. It depends on whether you like this effect or not.

The gamma setting is only in effect when enable gamma processing is selected. It is generally advised to set the gamma at the display level rather than the media player.
Thanks for the information.

About gamma, there's no option in MadVR to disable it - only pure power curve and BT.709.
In the color & gamma tab I have "enable gamma processing" unchecked.
Still, I can see some difference when changing the pure power curve from 2.2 to 2.4 while watching a movie.
I guess if you set calibration to "this display is already calibrated" you need to set the gamma.

The only other option I see is "disable GPU gamma ramps", which is unchecked here. Is this what you are referring to?
Old 31st May 2018, 17:54   #51138  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 549
Quote:
Originally Posted by arcspin View Post
Do you have any idea why I can't get the greens and blacks/whites to work in 12 bits?
No idea. All is fine here with 385.28 in 12bits, but I do use RGB Full > MadVR 16-235 > HDMI Standard on the JVC. I calibrate the JVC with Calman using MadTPG as a pattern source (best as the patterns come from the source I'll be using) and it only works with video levels, not with enhanced levels.

The only other significant difference I can think of is that I use MadVR's refresh rate custom modes, not nVidia's or CRU's. I use the EDID/CTA option and it gives me 50-60 min between frame drops, which is not perfect but good enough for me.

At the moment there is nothing to lose using 8bits though, so I'd just set the GPU to 8bits and MadVR to 8bits dithering and enjoy, as MadVR doesn't support 10bits in windowed mode anymore, and exclusive mode is not usable with the JVCs due to the time it takes to do an HDMI resync every time you use the player's interface.
__________________
Win 10 Pro x64 V1803 MCE add-on
i7 3770K@4.2Ghz 16Gb@2.1Ghz EVGA GTX 1080 Ti SC2 11Gb@2GHz 385.28 RGB Full 12bits
MPC-BE / LAV / MadVR / MyMovies V5.24
Denon X8500H>Vertex>JVC RS500/X7000

Last edited by Manni; 31st May 2018 at 18:03.
Old 31st May 2018, 19:18   #51139  |  Link
x7007
Registered User
 
Join Date: Apr 2013
Posts: 242
For you guys, does the Dynamic Output Range revert back to Full after every Windows restart with 397.93, even when it was set to Limited before the restart? Also, the digital audio always changes to "LG TV" instead of "Turn off audio"... two annoying bugs.
Old 31st May 2018, 19:35   #51140  |  Link
arcspin
Registered User
 
Join Date: Apr 2018
Location: Stockholm, Sweden
Posts: 25
Quote:
Originally Posted by Manni View Post
No idea. All is fine here with 385.28 in 12bits, but I do use RGB Full > MadVR 16-235 > HDMI Standard on the JVC. I calibrate the JVC with Calman using MadTPG as a pattern source (best as the patterns come from the source I'll be using) and it only works with video levels, not with enhanced levels.

The only other significant difference I can think of is that I use MadVR's refresh rate custom modes, not nVidia's or CRU's. I use the EDID/CTA option and it gives me 50-60 min between frame drops, which is not perfect but good enough for me.

At the moment there is nothing to lose using 8bits though, so I'd just set the GPU to 8bits and MadVR to 8bits dithering and enjoy, as MadVR doesn't support 10bits in windowed mode anymore, and exclusive mode is not usable with the JVCs due to the time it takes to do an HDMI resync every time you use the player's interface.
Ok, thanks for answering back.
Yup, I'm all good in 8 bits; it just annoys me a little not to know the cause.

I will test MadVR's refresh rate modes and see if that might do some good.
I got some great results with NVIDIA's custom refresh rates and only have 2 dropped frames in a 2h43m 1080p movie.
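For what it's worth, those numbers imply a very small residual deviation (a back-of-the-envelope sketch with a made-up function name, not a madVR statistic):

```python
def implied_deviation_percent(num_drops, duration_seconds, fps):
    # One drop per (duration / num_drops) seconds means the two clocks
    # drift apart by one frame period over that interval.
    seconds_per_drop = duration_seconds / num_drops
    return 100.0 / (fps * seconds_per_drop)

# 2 dropped frames over a 2h43m movie at 23.976 fps:
dev = implied_deviation_percent(2, 2 * 3600 + 43 * 60, 23.976)
print(round(dev, 5))  # → 0.00085 (percent)
```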


Question:
I don't quite understand where to set MadVR to 8bits dithering. The only place I can find to set bit depth in MadVR is in devices (my JVC) > properties > "the native display bitdepth is:".
What is improved between setting it to "8 bit" or "10 bit or higher" if the GPU is set to 8 bit?

Last edited by arcspin; 31st May 2018 at 19:55.