Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Old 30th May 2018, 22:47   #51081  |  Link
madjock
Registered User
 
Join Date: May 2018
Posts: 219
@Manni

Just been reading this thread
http://www.avsforum.com/forum/24-dig...jector-24.html

Lots of good information, but if anyone asked about anything other than projectors they were told to use the correct thread.

I have not seen a lot of information on using the HDR->SDR options for a standard 1080p HDTV and doing 2160p->1080p.

I don't have fancy calibration equipment and the like, but from all the things you have learned in all the discussion, what would you class as a good standard for HDR->SDR with the latest MadVR? The more I read, the more it seemed to vary from person to person, which I get, but it does seem to be a grey area.

I guess I am also a little confused by HDR->SDR conversion: although I get the comparisons with SDR, is that what everyone aims at? Would we not hope to get it looking better than plain SDR, if that's possible, even on an SDR set?
madjock is offline   Reply With Quote
Old 30th May 2018, 23:04   #51082  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 700
Quote:
Originally Posted by madshi View Post
Just a short comment while still having piles of commercial work on my desk:

Thanks to my great Nvidia driver contact, I managed to get a fix for 1080p23 timings into newer driver versions. So stock 1080p23 timings should be much better now with newer drivers, probably also for 3D. I know, the newest drivers come with their own problems, so many are still using older driver versions. But JFYI...
Just tested 397.93, and I have good news and bad news:

Good news is that your friend delivered (partly) in 1080p23 for 3D (I didn't test 2D). I now get 13min between frame drops, which is a significant improvement compared to the 3min of previous drivers (at least those I had tested).

Bad news is that the levels are still borked (both in SDR and HDR), however I found why I have the issue and many others don't: the levels are only borked in 12bits, not in 8bits. So it looks like between the banding in passthrough and the borked levels, nVidia shows little love for their 12bits setting.

You would think that the solution is easy: use 8bits in the GPU... Well, that works fine for 2D and 3D SDR; unfortunately HDR passthrough is borked in 8bits on the JVCs (not in 12bits!) and you get, at all refresh rates (at least in UHD), the magenta bug that I only have at 4K60p with previous drivers. The only way to get rid of it, like with the 4K60 magenta bug, is to disable "send HDR metadata", which as you know is a no-no for me until we get the ability to switch HDR profiles according to max brightness with pixel shader, as the Vertex relies on this to select my custom curve. It is, however, a viable workaround for those whose display doesn't need HDR metadata.

So pick your poison:

385.28: everything works, but 1 frame drop every 3min in 3D.

397.93: better 3D (13min between frame drops), Asio4all compatibility still broken, levels still borked in 12bits (at least with the JVCs), HDR passthrough borked on JVCs in 8bits.

Aaaargh!

Thanks a lot for your efforts, and your friend's, but I'm back to 385.28 if I don't find a way to get rid of the magenta bug in 8bits, at least until I can switch to pixel shader.

397.93 is probably 100% good news for non JVC users who don't care about Asio4All.
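As a rough illustration of what those intervals imply: the time between drops follows directly from the clock deviation, since one full frame of drift accumulates every 1/deviation frames. A back-of-the-envelope sketch (the helper name is mine, not anything madVR reports):

```python
# Rough sketch: how clock deviation translates into time between frame
# drops/repeats. One frame of drift accumulates after 1/deviation frames.
# Back-of-the-envelope math only, not a value madVR itself displays.

def seconds_between_drops(deviation_percent, fps=23.976):
    """Seconds until one full frame of drift accumulates."""
    deviation = deviation_percent / 100.0
    frames_until_one_frame_drift = 1.0 / deviation
    return frames_until_one_frame_drift / fps

# A drop every ~3 minutes implies a deviation of roughly 0.023%,
# and one every ~14 minutes implies roughly 0.005%:
for dev in (0.023, 0.005, 0.001):
    print(f"{dev}% deviation -> drop every {seconds_between_drops(dev) / 60:.1f} min")
```

So going from a drop every 3 minutes to one every 13 minutes corresponds to the effective deviation shrinking by roughly a factor of four.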

Quote:
Originally Posted by madjock View Post
@Manni
I have not seen a lot of information on using the HDR->SDR options for a standard 1080p HDTV and doing 2160p->1080p.

I don't have fancy calibration equipment and the likes but from all the things you have learned with all the discussion, what would you class as a good standard for HDR->SDR with the latest MadVR, as the more I read the more it seemed to jump between person to person, which I get, but it does seem to be a grey area.
I'll let others reply. The AVS thread is indeed reserved for ongoing work on MadVR's HDR to SDR conversion with pixel shader for projectors, and it's not a support thread for MadVR projector users either, at least not at this stage, as it's a work in progress. HDR10 is a grey area because there is no standard and every display is different. HDR to SDR, when done well, is the same as HDR: it's just that the source (MadVR here) does the conversion instead of the display. Provided you use the correct settings, it can look as good as, or better than, the HDR mode on the display.

So I suggest you post your display model and hope that someone can make suggestions for you.
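To illustrate what "the source does the conversion" involves at the signal level: HDR10 encodes brightness with the SMPTE ST 2084 (PQ) curve, and a renderer doing HDR-to-SDR first decodes PQ signal values back to absolute nits before tone mapping them into the display's range. A minimal sketch of the standard PQ decode (the tone-mapping step itself, e.g. BT.2390, is omitted here):

```python
# Decode an SMPTE ST 2084 (PQ) signal value in [0, 1] to absolute
# luminance in nits. Constants are from the ST 2084 specification.
# This is only the first step of an HDR->SDR conversion; tone mapping
# the resulting nits into the display's range comes afterwards.

M1 = 2610 / 16384          # ~0.1593
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_to_nits(signal):
    e = signal ** (1 / M2)
    y = max(e - C1, 0.0) / (C2 - C3 * e)
    return 10000.0 * y ** (1 / M1)

print(pq_to_nits(1.0))   # 10000 nits (PQ peak)
print(pq_to_nits(0.5))   # ~92 nits: half signal is far below half brightness
```

The steep non-linearity is why the "target nits" choice matters so much: most of the signal range encodes fairly low luminance, and the tone mapper decides how the rare bright highlights get squeezed in.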
__________________
Win10 Pro x64 b1809 MCE
i7 3770K@4.0Ghz 16Gb@2.18Ghz EVGA GTX 1080 Ti SC2 11Gb@2GHz 398.11 RGB Full 8bits
MPC-BE/LAV/MadVR/jRiver/MyMovies V5.25PR2
Denon X8500H>HD Fury Maestro>JVC RS2000

Last edited by Manni; 31st May 2018 at 10:38.
Manni is offline   Reply With Quote
Old 30th May 2018, 23:11   #51083  |  Link
madjock
Registered User
 
Join Date: May 2018
Posts: 219
Quote:
Originally Posted by Manni View Post
Just tested 397.93, and I have good news and bad news:

Good news is that your friend delivered (partly) in 1080p23 for 3D (I didn't test 2D). I now get 13min between frame drops, which is a significant improvement compared to the 3-5min of previous drivers (at least those I had tested).

Bad news is that the levels are still borked, however I found why I have the issue and many others don't: the levels are only borked in 12bits, not in 8bits. So it looks like between the banding in passthrough and the borked levels, nVidia shows little love for their 12bits setting.

You would think that the solution is easy: use 8bits in the GPU... Well, that works fine for 2D and 3D SDR, unfortunately HDR is borked in 8bits on the JVCs and you get at all refresh rates (at least in UHD) the magenta bug that I only have on 4K60p with previous drivers. The only way to get rid of it, like with the 4K60 magenta bug, is to disable "send HDR metadata", which as you know is a no-no for me until we get the ability to switch HDR profiles according to max brightness with pixel shader, as the Vertex relies on this to select my custom curve. It is, however, a viable workaround for those whose display doesn't need HDR metadata.

So pick your poison:

385.28: everything works but 1 frame drop every 3-5min in 3D.

397.93: better 3D (13min between frame drops), Asio4all compatibility still broken, levels borked in 12bits, HDR borked on JVCs in 8bits.

Aaaargh!

Thanks a lot for your efforts and your friends though, but I'm back to 385.28 if I don't find a way to get rid of the magenta bug in 8bits, at least until I can switch to pixel shader.



I'll let others reply. The AVS thread is indeed reserved to ongoing work on MadVR's HDR to SDR conversion with pixel shader for projectors, and it's not a support thread for MadVR projector users either, at least not at this stage, as it's a work in progress. HDR10 is a grey area because there is no standard and every display is different.

So I suggest you post your display model and hope that someone can make suggestions for you.
OK, no problem. I understand all that you have said, but as you say there are so many TV models that the chances may be slim. Also, with some people doing calibrations and others using a default built into MadVR, this muddies the water further.

I was just looking for a middle-of-the-road default that applies to most setups, as the thread was suggesting, but like most of this I guess it boils down to what people like or don't like.

Thanks for replying anyway.
madjock is offline   Reply With Quote
Old 30th May 2018, 23:32   #51084  |  Link
theDongerr
Registered User
 
Join Date: Nov 2016
Posts: 14
Is it safe to install the Windows 10 Spring update? Or did it do something to mess up madVR?
theDongerr is offline   Reply With Quote
Old 31st May 2018, 00:03   #51085  |  Link
Polopretress
Registered User
 
Join Date: Sep 2017
Posts: 28
Quote:
Originally Posted by nevcairiel View Post
Clock deviation makes no real difference. madVR accounts for it when calculating the repeat/drop timer, you just need to play long enough for it to stabilize, and the ultimate test is for every setup to just play a 2 hour movie and check the stats afterwards (clear them after launch to get rid of the first drops during startup).
I agree, but not with the conclusion.
Clock deviation is taken into account in the calculation, but the result will not be "no drop frame expected" when the display is at 23.97602 Hz if the clock deviation is not null.

That means that, with clock deviation, if you want to target "no drop frame expected", you will not be at 23.97602
(or if you want to target 23.97602, you will not be at "no drop frame expected").

And try both players (MPC and PotPlayer) in PCM with the same settings, and you will see that "display" is at the same value while the time counter is different.

So which player is giving me the truth?
For me, it is clear that the value of "display" is the truth, because both players give the same result.
My conclusion is then that it is the time counter that is wrong when using MPC-BE with a clock deviation that is not null.

That's the reason why I said that the judge is the value of "display", and not the time counter, if the clock deviation is not null. I think the clock deviation is taken into account more in the value of "display" than in the value of the time counter.

If I am wrong, do you mean that I need to follow the time counter instead of "display"?
If that is the case, how do you explain that the values of "display" are the same with a player with no clock deviation and another with a clock deviation that is not null?


Quote:
I also assume you are not Bitstreaming DTS-HD or TrueHD via HDMI to get 0.000000% and POT player uses some sort of re-clock, which is no use to people who want to Bitstream.
Exactly, I am in PCM.
PotPlayer in bitstream works fine, but the clock deviation is not null (like MPC-BE).
Nevertheless, the value of "display" is the same and perfectly adjusted to 23.97602 (but the time counter is different and very far from "no drop frame expected").

Same question as previously: which one is the truth?
For me it is "display" and not the counter.

Show me an example with a display at 23.97602 and the counter at "no drop frame expected" while the clock deviation is not null, and I will revise my judgement.

Last edited by Polopretress; 31st May 2018 at 00:15.
Polopretress is offline   Reply With Quote
Old 31st May 2018, 00:15   #51086  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 9,733
Quote:
Originally Posted by Polopretress View Post
I agree, but not with the conclusion.
Clock deviation is taken into account in the calculation, but the result will not be "no drop frame expected" when the display is at 23.97602 Hz if the clock deviation is not null.

That means that, with clock deviation, if you want to target "no drop frame expected", you will not be at 23.97602
(or if you want to target 23.97602, you will not be at "no drop frame expected").
But that's perfectly fine. The goal is to not have any frame drops or repeats, not to hit some special magical number. It's perfectly expected to require a slightly different refresh rate to compensate for the clock deviation.

The perfect refresh rate would be the one where the display refresh rate, adjusted by the clock deviation, comes out at 23.97602 (without software tricks enabled). So for example, if you have a small clock deviation of 0.003%, you would want a refresh rate of 23.9767 instead (because 23.9760 + 0.003% = 23.9767).

And as others have said, in PCM mode PotPlayer just "cheats" by modifying the audio to perfectly match the video clock, i.e. like ReClock. You could get it to match any refresh rate then. But it does modify the audio, so it's no longer "bit-exact". It's probably not something anyone can hear, and it usually works just fine, but you need to be aware that your software works around the hardware problem in this manner. The clock deviation does not go away; it's just "hidden" from madVR and compensated for on the audio side instead. And of course this does not work when bitstreaming.

The entire drop/repeat frame mechanism only exists because of audio. If we had no audio, we could just show the video at 23.977 instead of 23.976 and no person in the world would ever notice. But we do have audio, and the audio would lose sync with the video. So to maintain sync, the video renderer has to drop/repeat a frame when appropriate. The alternative solution is to change the audio instead, which is what PotPlayer is doing, and why it's claiming a zero clock deviation: it handles that, so madVR does not have to.

The only number that really matters is the number of dropped/repeated frames at the end of the movie. Everything else is just there as information.
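The two compensation strategies described above, adjusting the video refresh rate versus stretching the audio clock ReClock-style, amount to the same small multiplication. A minimal sketch (helper names are mine, not madVR or PotPlayer APIs):

```python
# Two ways to absorb a clock deviation, per the discussion above.
# Helper names are illustrative, not actual madVR/PotPlayer functions.

def compensated_refresh(target_fps, deviation_percent):
    """Video-side fix: a refresh rate that, after the clock
    deviation, lands on the true content frame rate."""
    return target_fps * (1 + deviation_percent / 100.0)

def resampled_audio_rate(sample_rate, deviation_percent):
    """Audio-side (ReClock-style) fix: stretch the audio clock
    instead, so video can run at its measured rate."""
    return sample_rate * (1 + deviation_percent / 100.0)

print(round(compensated_refresh(23.9760, 0.003), 4))   # 23.9767
print(round(resampled_audio_rate(48000, 0.003), 1))    # 48001.4
```

Either way the deviation is absorbed somewhere; the audio-side fix just moves it out of madVR's statistics, which is why PotPlayer can report zero.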
__________________
LAV Filters - open source ffmpeg based media splitter and decoders

Last edited by nevcairiel; 31st May 2018 at 00:46.
nevcairiel is offline   Reply With Quote
Old 31st May 2018, 00:35   #51087  |  Link
el Filou
Registered User
 
el Filou's Avatar
 
Join Date: Oct 2016
Posts: 466
Quote:
Originally Posted by Manni View Post
HDR to SDR, when done well, is the same as HDR. It's just that the source (MadVR here) does the conversion instead of the display.
If that really was the case then why would TV reviewers who test stuff like 2% window white find very different brightness levels in SDR and HDR modes (last two that come to mind were 400 cd/m² SDR / 700 HDR on an LG OLED, and 750 SDR / 1250 HDR on a Sony LCD)?
__________________
HTPC: W10 1809, E7400, 1050 Ti, DVB-C, Denon 2310, Panasonic GT60 | Desktop: W10 1809, 4690K, HD 7870, Dell U2713HM | MediaPortal 1/MPC-HC, LAV Filters, ReClock, madVR
el Filou is offline   Reply With Quote
Old 31st May 2018, 01:03   #51088  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 700
Quote:
Originally Posted by el Filou View Post
If that really was the case then why would TV reviewers who test stuff like 2% window white find very different brightness levels in SDR and HDR modes (last two that come to mind were 400 cd/m² SDR / 700 HDR on an LG OLED, and 750 SDR / 1250 HDR on a Sony LCD)?
Is this with MadVR?

Some players do an HDR to SDR conversion that doesn't exploit the whole dynamic range. Others don't even use the wider gamut.

I'm talking about pixel shader with MadVR, given that it's the topic of the thread. I don't know about OLEDs/LCDs, I can only comment about what I see on my JVC projector. Some displays might not be able to provide the SDR BT2020 mode with full brightness that the JVC delivers to be used as a baseline for MadVR's HDR to SDR conversion (pixel shader).
__________________
Win10 Pro x64 b1809 MCE
i7 3770K@4.0Ghz 16Gb@2.18Ghz EVGA GTX 1080 Ti SC2 11Gb@2GHz 398.11 RGB Full 8bits
MPC-BE/LAV/MadVR/jRiver/MyMovies V5.25PR2
Denon X8500H>HD Fury Maestro>JVC RS2000
Manni is offline   Reply With Quote
Old 31st May 2018, 01:34   #51089  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,530
Yes, your projector probably does not have a separate HDR mode in the same way TVs do. None of the non-projector displays I have seen (not that many) work that way. You cannot use pixel shaders to get a similar image when the display is in its SDR mode; for one thing it will not go as bright, reasonably so, given that SDR at 700 nits peak white would be pretty bright and the display couldn't handle it. Projectors tend to be a lot dimmer, so maybe they use the same peak brightness for both HDR and SDR.

I think we need to remember that HDR works very differently in projectors compared to backlit displays, and not to assume we can transfer our experience directly.
__________________
madVR options explained
Asmodian is offline   Reply With Quote
Old 31st May 2018, 05:13   #51090  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,067
For the HDR -> SDR for 1080p question,

I have two 1080p displays. One is calibrated to about 150 nits, the other is probably closer to 200 nits. I find a target nits setting between 400 and 500 seems to achieve adequate brightness.

Set the tone mapping curve to BT.2390.

Gamut mapping can be set to any of the complex scientific modes or to dumb mode - convert gamut late. It can be hard to immediately notice a difference between the gamut mapping algorithms, so this choice may not be critical to you.

Check the rest of the boxes at the bottom.

It is really that simple and not that hard to figure out. Adjust the target nits to your own tastes.

It is an adequate way to show HDR on a non-HDR screen if you accept the limitations of tone mapping. It is certainly a viable way to watch 4K UHD content.
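The suggested starting point above can be summarized as plain data. Note the keys below paraphrase madVR's HDR -> SDR options; they are not exact UI strings or a config format madVR actually reads:

```python
# Summary of the suggested HDR->SDR starting point, as plain data.
# Keys paraphrase madVR's UI labels; they are not exact option strings
# or a configuration file format that madVR itself consumes.

hdr_to_sdr_starting_point = {
    "target_nits": 450,             # 400-500 worked for ~150-200 nit displays
    "tone_mapping_curve": "BT.2390",
    "gamut_mapping": "any",         # hard to tell the algorithms apart
    "remaining_checkboxes": True,   # check the rest at the bottom
}

# Adjust target nits to taste: lower values brighten the overall image,
# higher values preserve more highlight detail at the cost of brightness.
print(hdr_to_sdr_starting_point["target_nits"])
```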

As for the levels thing, I think this is a JVC projector oddity. I was trying to help someone with a JVC projector at AVSForum, and his projector displays the strangest behavior: every time he would fix something, something else would break. It is the most bizarre machine I have come across that runs madVR. His graphics card is a GTX 1050. He actually posted in this forum, so he might be reading this. I know you haven't had as many problems with your JVC, but his doesn't seem to like Windows or anything output from a GPU.

Last edited by Warner306; 31st May 2018 at 05:17.
Warner306 is offline   Reply With Quote
Old 31st May 2018, 06:01   #51091  |  Link
ryrynz
Registered User
 
ryrynz's Avatar
 
Join Date: Mar 2009
Posts: 3,140
Is there an HDR test video that one could use for setting the ideal target nits?
ryrynz is offline   Reply With Quote
Old 31st May 2018, 08:03   #51092  |  Link
sauma144
Registered User
 
Join Date: Sep 2016
Posts: 89
Quote:
Originally Posted by nevcairiel View Post
In PCM mode PotPlayer just "cheats" by modifying the audio to perfectly match the video clock, ie. like ReClock. You could get it to match any refresh rate then.
Seriously? I never noticed.
sauma144 is offline   Reply With Quote
Old 31st May 2018, 10:31   #51093  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 700
Quote:
Originally Posted by Warner306 View Post
As for the levels thing, I think this is a JVC projector oddity. I was trying to help someone with a JVC projector at AVSForums and his projector displays the strangest behavior. Every time he would fix something, something else would break. It is the most bizzare machine I have come across that runs madVR. His graphics card is a GTX 1050. He actually posted in this forum, so he might be reading this. I know you haven't had as many problems with your JVC, but it doesn't seem to like Windows or anything output from a GPU.
I was in touch with him by PM. This is because he was using a recent driver (39x.x). As I reported yesterday, each 39x version broke something new, at least here. The best driver to use with a JVC is 385.28. With this, you can get correct levels in RGB Full, 12bits in SDR and HDR, and he will only get the magenta bug (which can be fixed, as I told him, by disabling "send HDR metadata") at 4K60p.

With a recent driver, he can solve things if he disables "Send HDR Metadata" and uses 8bits in the GPU. But that's not an option for me at this stage (I need the metadata for the Vertex), so I went back to 385.28 for now.

The main downside of 385.28 is 3D (1 frame drop/repeat every 3min vs every 13min with 397.93), and very minor banding in HDR passthrough (not sure when that was introduced by nVidia, but it's still there with 39x.x). You can solve most of the banding in passthrough by using 9bits instead of 10bits dithering in MadVR; unfortunately that's not possible with the latest MadVR build, so you have to revert to the previous one.

The JVCs do have some quirks, but they can be used fine if you know how to drive them.
__________________
Win10 Pro x64 b1809 MCE
i7 3770K@4.0Ghz 16Gb@2.18Ghz EVGA GTX 1080 Ti SC2 11Gb@2GHz 398.11 RGB Full 8bits
MPC-BE/LAV/MadVR/jRiver/MyMovies V5.25PR2
Denon X8500H>HD Fury Maestro>JVC RS2000

Last edited by Manni; 31st May 2018 at 11:01.
Manni is offline   Reply With Quote
Old 31st May 2018, 10:52   #51094  |  Link
SuLyMaN
Registered User
 
Join Date: Jul 2007
Posts: 156
Will madVR work with Intel HD 630 onboard graphics? Looking at the requirements, I'd say yes... but you never know. Has anybody tried it?
SuLyMaN is offline   Reply With Quote
Old 31st May 2018, 12:27   #51095  |  Link
madjock
Registered User
 
Join Date: May 2018
Posts: 219
Quote:
Originally Posted by SuLyMaN View Post
Will madVR work with Intel HD 630 onboard graphics? Looking at the requirements, I'd say yes... but you never know. Has anybody tried it?
Depends what you mean by "work". Since it is not a dedicated card with its own memory, you will be very limited in what you can do with madVR.

Last edited by madjock; 31st May 2018 at 12:35.
madjock is offline   Reply With Quote
Old 31st May 2018, 15:46   #51096  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,067
Quote:
Originally Posted by Manni View Post
I was in touch with him by PM. This is because he was using a recent driver (39x.x). As I reported yesterday, each 39x version broke something new, at least here. The best driver to use with a JVC is 385.28. With this, you can get correct levels in RGB Full, 12bits in SDR and HDR, and he will only get the magenta bug (which can be fixed, as I told him, by disabling "send HDR metadata") at 4K60p.

With a recent driver, he can solve things if he disables "Send HDR Metadata" and uses 8bits in the GPU. But that's not an option for me at this stage (I need the metadata for the Vertex), so I went back to 385.28 for now.

The main downside of 385.28 is 3D (1 frame drop/repeat every 3min vs every 13min with 397.93), and very minor banding in HDR passthrough (not sure when that was introduced by nVidia, but it's still there with 39x.x). You can solve most of the banding in passthrough by using 9bits instead of 10bits dithering in MadVR; unfortunately that's not possible with the latest MadVR build, so you have to revert to the previous one.

The JVCs do have some quirks, but they can be used fine if you know how to drive them
Well I'm glad you helped him. He was using 385.28 when I talked to him and was still having black screens and some oddities with 3D playback, so I'm not sure if it has all been ironed out.

Last edited by Warner306; 31st May 2018 at 16:44.
Warner306 is offline   Reply With Quote
Old 31st May 2018, 16:20   #51097  |  Link
veggav
Registered User
 
Join Date: Mar 2008
Posts: 79
Quick question

My TV has the following color spaces to choose from
sRGB/BT.709
DCI
Adobe RGB
BT.2020

And MadVR has the following options in "the display is calibrated to the following primaries / gamut:
BT.709
SMPTE C
EBU / Pal
BT2020
DCI-P3

I let madVR change my display resolution to 2160p23 for movies.
So my question is: if I'm playing a Blu-ray that is BT.709 and have DCI selected in my display settings and DCI-P3 in MadVR, do I get better colors?

Also, when using the pure power curve option set to 2.40, would that translate to the gamma option -2 on the display?
I mean, 2.30 = -1 and 2.20 = 0?
veggav is offline   Reply With Quote
Old 31st May 2018, 16:25   #51098  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,067
Quote:
Originally Posted by veggav View Post
Quick question

My TV has the following color spaces to choose from
sRGB/BT.709
DCI
Adobe RGB
BT.2020

And MadVR has the following options in "the display is calibrated to the following primaries / gamut:
BT.709
SMPTE C
EBU / Pal
BT2020
DCI-P3

I let madVR change my display resolution to 2160p23 for movies.
So my question is: if I'm playing a Blu-ray that is BT.709 and have DCI selected in my display settings and DCI-P3 in MadVR, do I get better colors?

Also, when using the pure power curve option set to 2.40, would that translate to the gamma option -2 on the display?
I mean, 2.30 = -1 and 2.20 = 0?
The calibration setting only applies to SDR content like 1080p Blu-ray. Yes, madVR will upconvert a BT.709 source to DCI-P3 or BT.2020 (which is the recommended setting). I'm not sure you will get better colors, but you will get different colors. It depends on whether you like this effect or not.

The gamma setting is only in effect when enable gamma processing is selected. It is generally advised to set the gamma at the display level rather than the media player.
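On the pure power curve question specifically: a power gamma just raises the normalized signal to the exponent, so moving from 2.20 to 2.40 mainly darkens mid-tones while black and white stay fixed. Whether a display's -1/-2 gamma presets correspond to 2.3/2.4 exactly is up to the manufacturer; this sketch only shows the math:

```python
# Pure power gamma: output = input ** gamma, with input normalized 0-1.
# Higher gamma darkens mid-tones; the endpoints 0 and 1 never move.
# Whether a display's "-1"/"-2" gamma presets equal 2.3/2.4 power curves
# is manufacturer-specific; this only illustrates the curves themselves.

def pure_power(signal, gamma):
    return signal ** gamma

mid_grey = 0.5
for g in (2.20, 2.30, 2.40):
    print(f"gamma {g}: 50% input -> {pure_power(mid_grey, g):.3f}")
```

So at 50% input, 2.40 outputs roughly 13% less light than 2.20, which is why the two settings should only be applied in one place (display or renderer), never both.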
Warner306 is offline   Reply With Quote
Old 31st May 2018, 16:43   #51099  |  Link
arcspin
Registered User
 
Join Date: Apr 2018
Location: Stockholm, Sweden
Posts: 27
Quote:
Originally Posted by Manni View Post
I was in touch with him by PM. This is because he was using a recent driver (39x.x). As I reported yesterday, each 39x version broke something new, at least here. The best driver to use with a JVC is 385.28. With this, you can get correct levels in RGB Full, 12bits in SDR and HDR, and he will only get the magenta bug (which can be fixed, as I told him, by disabling "send HDR metadata") at 4K60p.
Hi Manni,
I have a question: when I set the NVIDIA GPU driver to 12 bits, the green colors and blacks/whites are all messed up; when I switch back to 8 bits, all is well.

I have tried this in fullscreen exclusive mode (to take into account the missing 10-bit support in the current MadVR).
I have, with a great deal of help from Asmodian in another thread, debugged my system, and the end result is that I have to set the NVIDIA GPU driver to 8 bits to get the colors and blacks/whites to work properly.

The end result of our debugging session is here:
https://forum.doom9.org/showthread.p...08#post1841508

I have my NVIDIA GPU Driver set to:
NVIDIA RGB Full 8bpc > Madvr custom level 16-255 > JVC Enhanced.
I get the same result in: NVIDIA RGB Full 8bpc > Madvr Tv level > JVC Standard
(My projector is professionally calibrated to JVC Enhanced)


This is what I use:
WIN 10 64-bit, version 1709 (no spring update) with NVIDIA GTX 1060 (385.28 driver)
I have also in NVIDIA control panel made a custom resolution for 3840 x 2160 23.978hz to get a low clock deviation in MadVR.
JRiver (24.0.20, 64-bit) with Madvr (0.92.14)
JVC RS420/X5500 projector (4K, capable of receiving 12-bit).



Do you have any idea why I can't get the greens and blacks/whites to work in 12 bits?


Best regards,

//Peter
arcspin is offline   Reply With Quote
Old 31st May 2018, 16:52   #51100  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,067
Quote:
Originally Posted by ryrynz View Post
Is there a HDR test video that one could use for setting ideal target nits?
A colorimeter would do the job of telling you how bright your display is. Maybe you could adjust the target nits with HDR grayscale test patches up to 10,000 nits, but I don't know how that would work. Such patterns might help with adjusting the brightness for lower nits.
Warner306 is offline   Reply With Quote