Old 3rd December 2021, 18:03   #421  |  Link
kolak
Registered User
 
Join Date: Nov 2004
Location: Poland
Posts: 2,843
Those are laptop screens, so there is no easy way to use them as a second screen anyway.
If you want control, you buy a monitor which can be calibrated (or a LUT box), but this is really not 'home' tech and costs serious money.

Mac at least offers some color management, whereas on Windows it's all a mess. At the same time, nothing stops MS from offering color management which lets you easily control your screen. There is something there already, but it never seems to work.

We are talking about slightly different things: you're not really talking about home tech, but more about pro gear (I worked at a top VFX house and had access to Sony HDR OLED and Dolby monitors). Things don't have to be that expensive to be accurate enough, and they can be controlled by the OS. It just needs to finally be written and implemented properly (but it won't be, as there is no real money to be made in it). Specialised tech like SDI is the past, as it can all be achieved far more easily now. Video is nothing more than data, so there is no need to treat it in a special way to pass it from A to B. Metadata is hugely underestimated; without Dolby's (a private company!) initiative and push we would still be watching Rec.709 at 100 nits.
Dolby worked on HDR for a long time and used to show it (behind closed doors) to professionals at many shows, but the reception was poor. The industry is scared of anything new, as it costs money and doesn't necessarily bring money back.

You can write your own player and control HDMI metadata regardless of the OS, no?

Last edited by kolak; 3rd December 2021 at 18:30.
Old 4th December 2021, 02:54   #422  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,925
No, I'm not talking about laptops... they literally have an external device which can't be properly used outside of macOS, by design, not because other OSes can't do it.

Gaming monitors are missing these basic things, but even cheap TVs have fully supported gamut control over HDMI signals, and have for many, many years I should add.
Quote:
You can write your own player and control HDMI metadata regardless of the OS, no?
Yes?
Old 4th December 2021, 15:06   #423  |  Link
kolak
Registered User
 
Join Date: Nov 2004
Location: Poland
Posts: 2,843
I assume so, as there are pro players which allow you to monitor HDR over the GPU's HDMI output and take full control over it.

Can't madVR talk to the GPU directly and pass HDR metadata (regardless of Windows settings)?

Last edited by kolak; 4th December 2021 at 15:11.
Old 4th December 2021, 19:29   #424  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,925
Yes, NVIDIA and AMD have APIs for sending HDMI metadata.
That's the problem: these monitors don't care, because they are not able to handle it correctly. They always use their wide gamut even if BT.709 is sent, or at best have a very bad clamp.
If they support fake HDR they can usually gamut map that, which is just hilarious...

BT.2020 to 95% DCI-P3 D65: yes.
BT.709 to BT.709: nope.
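For illustration, a rough C++ sketch of the NVIDIA side using NVAPI's HdrColorControl entry point. Struct and field names follow the public nvapi.h headers as I remember them, so treat this as a sketch rather than something verified against a current SDK; AMD exposes something comparable through its AGS library.

Code:
// Sketch: switch a display into HDR10 and hand it static metadata via NVAPI.
// Values shown are typical P3-D65 mastering primaries in CTA-861.3 units.
#include <nvapi.h>

bool enable_hdr10(NvU32 displayId)
{
    NV_HDR_COLOR_DATA hdr = {};
    hdr.version = NV_HDR_COLOR_DATA_VER;
    hdr.cmd     = NV_HDR_CMD_SET;
    hdr.hdrMode = NV_HDR_MODE_UHDA;  // HDR10: BT.2020 container + ST 2084 (PQ)
    hdr.static_metadata_descriptor_id = NV_STATIC_METADATA_TYPE_1;

    // Chromaticities in units of 0.00002, luminance per CTA-861.3.
    hdr.mastering_display_data.displayPrimary_x0 = 34000; // red   x = 0.680
    hdr.mastering_display_data.displayPrimary_y0 = 16000; // red   y = 0.320
    hdr.mastering_display_data.displayPrimary_x1 = 13250; // green x = 0.265
    hdr.mastering_display_data.displayPrimary_y1 = 34500; // green y = 0.690
    hdr.mastering_display_data.displayPrimary_x2 =  7500; // blue  x = 0.150
    hdr.mastering_display_data.displayPrimary_y2 =  3000; // blue  y = 0.060
    hdr.mastering_display_data.displayWhitePoint_x = 15635; // D65
    hdr.mastering_display_data.displayWhitePoint_y = 16450;
    hdr.mastering_display_data.max_display_mastering_luminance = 1000; // cd/m2
    hdr.mastering_display_data.min_display_mastering_luminance = 1;    // 0.0001 cd/m2
    hdr.mastering_display_data.max_content_light_level = 1000;         // MaxCLL
    hdr.mastering_display_data.max_frame_average_light_level = 400;    // MaxFALL

    return NvAPI_Disp_HdrColorControl(displayId, &hdr) == NVAPI_OK;
}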
Old 4th December 2021, 22:57   #425  |  Link
kolak
Registered User
 
Join Date: Nov 2004
Location: Poland
Posts: 2,843
Unless you use a good monitor like an Eizo, but even then in most cases you need to switch modes manually.

With Eizo there is a way to control it by connecting a USB cable and sending special commands through it. There is an NLE which can control Eizo monitors this way depending on the project settings you choose. The same could be done in a player.

I don't get why a monitor can't switch itself to the correct mode based on the info which travels over HDMI. All the needed info is there.
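For reference, this is roughly what already travels with every HDR10 signal: the CTA-861.3 Dynamic Range and Mastering InfoFrame, sketched here as a plain C++ struct for readability (the real frame is a packed byte layout).

Code:
// What the CTA-861.3 "Dynamic Range and Mastering" InfoFrame carries.
// Simplified for illustration; on the wire this is a packed byte layout.
#include <cstdint>

struct DRMInfoFrame {
    uint8_t  eotf;                     // 0 = SDR gamma, 2 = ST 2084 (PQ), 3 = HLG
    uint8_t  metadata_descriptor_id;   // 0 = Static Metadata Type 1
    uint16_t primaries[3][2];          // R/G/B chromaticity x,y in 0.00002 units
    uint16_t white_point[2];           // white x,y in 0.00002 units
    uint16_t max_mastering_luminance;  // cd/m2
    uint16_t min_mastering_luminance;  // 0.0001 cd/m2
    uint16_t max_cll;                  // brightest pixel in the stream, cd/m2
    uint16_t max_fall;                 // brightest frame average, cd/m2
};
// On top of that, the AVI InfoFrame flags the colorimetry (BT.709 vs
// BT.2020), so a display really does receive everything it would need
// to pick the right mode by itself.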
Old 5th December 2021, 02:07   #426  |  Link
aufkrawall
Registered User
 
Join Date: Dec 2011
Posts: 1,812
While it would be great if more displays could handle flagged signals accordingly, one big issue remains: lots of them would screw it up anyway, e.g. by locking color or even brightness settings, introducing banding, or achieving only poor sRGB coverage. I wouldn't want to sacrifice the display's native 6500K white point; fixing it via calibration lowers the contrast ratio (at least if it's not an OLED).

Ideally we would be able to feed Windows or the GPU driver a 3D LUT (or it would generate a proper one from the ICC profile itself) and it would do a high-quality conversion before scan-out at all times. For content in other non-wide gamuts like Rec.601, applications could just convert it to Rec.709 (lots of seemingly Rec.601 content is downscaled Rec.709 content without proper flags/conversion anyway).
HDR would still toggle the native wide gamut mode.
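To make it concrete, this is all a 3D LUT application has to do per pixel; on a GPU it maps to a single filtered 3D texture fetch. A minimal CPU sketch with trilinear interpolation, which (as a later post mentions) is also how madVR interpolates:

Code:
// What "applying a 3D LUT" means per pixel: index a small RGB cube and
// trilinearly interpolate between the eight surrounding entries. This CPU
// version just makes the arithmetic explicit; inputs are assumed in [0,1].
#include <algorithm>
#include <cmath>
#include <cstddef>

struct RGB { float r, g, b; };

RGB apply_3dlut(const RGB* lut, int size, RGB in)
{
    // Map [0,1] to LUT coordinates, split into cell index + fraction.
    auto prep = [&](float v, int& i0, int& i1, float& f) {
        float x = v * (size - 1);
        i0 = (int)std::floor(x);
        i1 = std::min(i0 + 1, size - 1);
        f  = x - (float)i0;
    };
    int r0, r1, g0, g1, b0, b1;
    float fr, fg, fb;
    prep(in.r, r0, r1, fr);
    prep(in.g, g0, g1, fg);
    prep(in.b, b0, b1, fb);

    auto at = [&](int r, int g, int b) {
        return lut[((size_t)b * size + g) * size + r];
    };
    auto mix = [](RGB a, RGB b, float t) {
        return RGB{ a.r + (b.r - a.r) * t,
                    a.g + (b.g - a.g) * t,
                    a.b + (b.b - a.b) * t };
    };
    // Interpolate along r, then g, then b: 8 taps, 7 lerps.
    RGB c00 = mix(at(r0, g0, b0), at(r1, g0, b0), fr);
    RGB c10 = mix(at(r0, g1, b0), at(r1, g1, b0), fr);
    RGB c01 = mix(at(r0, g0, b1), at(r1, g0, b1), fr);
    RGB c11 = mix(at(r0, g1, b1), at(r1, g1, b1), fr);
    return mix(mix(c00, c10, fg), mix(c01, c11, fg), fb);
}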
Old 5th December 2021, 04:13   #427  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,925
There is absolutely no issue with TVs doing such things, including entry-class ones.
ICC profiles aren't good enough to gamut map properly, and while a 3D LUT would do the trick, a proper one costs notable performance.

The white point has nothing to do with this; why the white point or brightness or whatever gets locked in sRGB or BT.709 modes is absolutely beyond me.

Don't forget that nearly all programs don't tell the OS what color gamut is used. This is changing now, but old programs will not get updated, so BT.709 has to be guessed.
Old 5th December 2021, 14:06   #428  |  Link
aufkrawall
Registered User
 
Join Date: Dec 2011
Posts: 1,812
Quote:
Originally Posted by huhn View Post
There is absolutely no issue with TVs doing such things, including entry-class ones.
I temporarily had a monitor with an sRGB clamp and it both locked colors (annoyingly cold white point) and introduced banding.
Monitors aren't TVs. Those cheap TVs you mentioned often have absurd input lag.

Quote:
Originally Posted by huhn View Post
ICC profiles aren't good enough to gamut map properly
Nope, mpv can pick up the ICC profile, and when specifying gamma 2.4 it looks exactly like DisplayCAL's 3D LUT with 100% black output offset. There might be mathematical differences, but they don't really matter. There is zero banding, and it works on my crappy Gemini Lake GPU.
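For anyone wanting to reproduce this, the relevant mpv options as I recall them from the mpv manual are below; the exact option used for the gamma 2.4 part isn't named in the post, so it's omitted here.

Code:
# mpv.conf sketch: render through the display's ICC profile.
# Pick up the ICC profile installed for the current display:
icc-profile-auto=yes
# Size of the 3D LUT mpv builds from the profile (default is 64x64x64):
icc-3dlut-size=256x256x256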

Quote:
Originally Posted by huhn View Post
and while a 3D LUT would do the trick, a proper one costs notable performance.
What is "notable"? It's rather cheap in ReShade.

Quote:
Originally Posted by huhn View Post
The white point has nothing to do with this; why the white point or brightness or whatever gets locked in sRGB or BT.709 modes is absolutely beyond me.
It probably is related, because the scalers (or whatever else you would call the component that does color adjustments) in monitors are so limited.
Old 5th December 2021, 15:39   #429  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,925
Quote:
I temporarily had a monitor with an sRGB clamp and it both locked colors (annoyingly cold white point) and introduced banding.
Monitors aren't TVs. Those cheap TVs you mentioned often have absurd input lag.
Or better input lag. They are not gaming displays, but the gamut mapping is not the reason they are slower.
LG OLED screens are some of the fastest screens on the market, if not the fastest, though I'm well aware they are not cheap.

Gaming displays are just bad because they get away with it.
Quote:
Nope, mpv can pick up the ICC profile, and when specifying gamma 2.4 it looks exactly like DisplayCAL's 3D LUT with 100% black output offset. There might be mathematical differences, but they don't really matter. There is zero banding, and it works on my crappy Gemini Lake GPU.
A proper program can use them properly, but that needs work, and mpv put that work in to get results; you cannot expect other program developers to do the same. And if you just let Windows do it, ouch...

Quote:
What is "notable"? It's rather cheap in ReShade.
last time i checked reshade 3D LUTs are very very low end and produce banding. that was when it was very new and dispalcal just added support these 3D LUT are really tiny.
3d LUT in madVR can kill entire entry class GPU just from tonemapping even my gtx 960 didn't have enough processing power to do that reliable. and even if it is doable easily on new GPU you still lose like 5 % performance that's quite a lot.

i can't believe that reshade get's away without losing some FPS in games just calling a shader cost at least 0.2 ms in my experience even if the shader does nothing.
Quote:
It probably is related, because the scalers (or whatever else you would call the component that does color adjustments) in monitors are so limited.
They must really cheap out, but that's just how it is...
Hardware Unboxed tests this, and yes, clamps are usually very badly implemented and need ICC profiles to be of any use, so they are sadly pretty useless. There are some that are not terrible, but I'm not following screens right now. If I needed to buy one right now I would look for one that can't do DCI-P3 D65; sounds dumb, but I like accurate colors.
Old 5th December 2021, 17:40   #430  |  Link
aufkrawall
Registered User
 
Join Date: Dec 2011
Posts: 1,812
Quote:
Originally Posted by huhn View Post
LG OLED screens are some of the fastest screens on the market, if not the fastest, though I'm well aware they are not cheap.
Yeah, they make PC monitors look bad in a lot of regards, if not all.

Quote:
Originally Posted by huhn View Post
And if you just let Windows do it, ouch...
Yes, Windows color management is indeed extra bad. It's even too stupid to apply a 1D LUT without a shipton of banding. That amount of incompetence really is remarkable. Hopefully the Linux Wayland desktop will crush this garbage around 2025, but app support will of course still be an issue, at least partially...


Quote:
Originally Posted by huhn View Post
Last time I checked, ReShade 3D LUTs were very low end and produced banding. That was when it was very new and DisplayCAL had just added support; those 3D LUTs are really tiny.
I suspect the banding isn't necessarily caused by the size of the 3D LUT, but by missing precision for ReShade effects in general, or missing dithering (or both).
ReShade can force D3D applications into a 10-bit frame buffer, and this decreases the banding introduced by the 3D LUT to a really acceptable amount for gaming (and Windows dithers down to 8 bit for 8-bit displays).
Interestingly, the old Mirror's Edge game from 2009 has very clean skybox gradients with just a little banding; I found this to be a good test case.
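The dithering point is easy to illustrate: quantizing a smooth gradient straight to 8 bit produces visible steps, while adding roughly one LSB of noise before rounding trades the steps for fine grain. A toy CPU sketch of the idea (not ReShade's actual shader):

Code:
// Triangular-pdf dithering before 8-bit quantization. Without the noise
// term, a smooth float gradient collapses into visible 1/255 steps.
#include <cstdint>
#include <random>

uint8_t quantize_dithered(float v, std::mt19937& rng)  // v in [0,1]
{
    std::uniform_real_distribution<float> u(-0.5f, 0.5f);
    float noise = u(rng) + u(rng);       // sum of two uniforms = TPDF, +-1 LSB
    float x = v * 255.0f + noise;
    if (x < 0.0f)   x = 0.0f;            // clamp to the 8-bit range
    if (x > 255.0f) x = 255.0f;
    return (uint8_t)(x + 0.5f);          // round to nearest code value
}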


Quote:
Originally Posted by huhn View Post
A 3D LUT in madVR can kill an entire entry-class GPU just from tone mapping; even my GTX 960 didn't have enough processing power to do that reliably. And even if it is easily doable on a new GPU, you still lose something like 5% performance, which is quite a lot.
Yes, but the truth is that development of madVR stopped before madshi implemented more performance optimizations. The Jinc scaler, for example, is also very slow compared to mpv's.

Quote:
Originally Posted by huhn View Post
If I needed to buy one right now I would look for one that can't do DCI-P3 D65; sounds dumb, but I like accurate colors.
I would do the same, though it seems basically every relatively recent LCD panel is DCI-P3 these days. I tested about a dozen >= 144 Hz 1440p devices (mostly IPS, but also some VA) and they were all wide gamut without exception.
I think the first 1440p 144 Hz IPS panels by AU Optronics were still closer to sRGB, but they were of terrible quality otherwise.

Last edited by aufkrawall; 5th December 2021 at 17:48.
Old 9th December 2021, 15:29   #431  |  Link
kolak
Registered User
 
Join Date: Nov 2004
Location: Poland
Posts: 2,843
Quote:
Originally Posted by aufkrawall View Post
While it would be great if more displays could handle flagged signals accordingly, one big issue remains: lots of them would screw it up anyway, e.g. by locking color or even brightness settings, introducing banding, or achieving only poor sRGB coverage. I wouldn't want to sacrifice the display's native 6500K white point; fixing it via calibration lowers the contrast ratio (at least if it's not an OLED).

Ideally we would be able to feed Windows or the GPU driver a 3D LUT (or it would generate a proper one from the ICC profile itself) and it would do a high-quality conversion before scan-out at all times. For content in other non-wide gamuts like Rec.601, applications could just convert it to Rec.709 (lots of seemingly Rec.601 content is downscaled Rec.709 content without proper flags/conversion anyway).
HDR would still toggle the native wide gamut mode.
This is similar to how macOS does it. Any properly tagged video gets converted to the screen profile. If the profile describes the screen accurately, then you have a correct preview regardless of the screen. It all comes down to the screen's capabilities then.

I just did an HDR grade of my holiday video in Resolve on a new MacBook (using the fact that on a Mac the Resolve GUI offers an accurate preview, including HDR). It looks great as HDR. I took this video and played it on an older Mac with an SDR screen. QTX does tone mapping (it uses the HDR metadata, so the file needs to be properly flagged) and converts the preview to the screen profile, so the SDR representation of my HDR grade is correct. It's not as good as my old native SDR grade, but absolutely fine. The best bit: if you take this video to another Mac with a different screen, you get the "same" SDR preview. As long as the profiles are close to the real screen parameters, you get about the same preview on different machines (assuming they all have about 100% Rec.709 gamut), which is exactly how it should work.
Try doing this on Windows.

Of course, the ideal solution today is for every screen to have 100% P3 gamut and 1000-nit HDR support, plus modes for the HD and SD standards. If the video is in one of the official standards, the screen switches to the given profile; if not, you get a good conversion to the 'main' P3 profile. This is how the pro world works, except they don't really rely on any auto-switching (and conversion); it's all set by a human on both sides to the same profile, in the app feeding the video and in the screen itself. Sometimes they forget to switch the mode in the monitor and waste e.g. time on grading which is then useless. It happens.
As long as we keep getting screens which are so far from any standard (and their profiles), none of this will work well.

Last edited by kolak; 9th December 2021 at 19:23.
Old 9th December 2021, 23:53   #432  |  Link
aufkrawall
Registered User
 
Join Date: Dec 2011
Posts: 1,812
Thanks for those Mac insights!

According to this Reddit thread, Eizo recommends staying away from Windows 11 due to brokenness with ICC profiles:
https://www.reddit.com/r/Windows11/c..._icc_profiles/
That likely explains why installing the ICC profile doesn't work for me...

But the linked dwm_lut tool is interesting; it does indeed work. It has some limitations and causes a lot of GPU load, but I can watch 1080p video via streaming apps with proper sRGB gamut mapping.
Old 10th December 2021, 13:42   #433  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,925
Sadly it's not really possible to speed it up by a lot.
3D LUTs are very straightforward in what they do and how they are applied.

Just to make that clear: I didn't say it's impossible.
Old 10th December 2021, 14:32   #434  |  Link
kolak
Registered User
 
Join Date: Nov 2004
Location: Poland
Posts: 2,843
I assume that inside monitors they must have good chips to manage LUTs. Eizo HDR monitors use 24-bit LUTs, so those must be quite demanding.
Old 10th December 2021, 16:49   #435  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,346
LUTs are not particularly expensive to apply; you just need enough memory to store them, otherwise it's basically just one lookup per pixel, something GPUs are good at, and dedicated image processors are even better at. If that DWM LUT tool is particularly slow, it's probably due to the fact that it's an external hook that adds an additional rendering pass and copies all textures around once. And most importantly, it disables DirectFlip, which every typical fullscreen renderer probably otherwise uses to bypass DWM entirely.
Old 10th December 2021, 18:31   #436  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,925
madVR dies to it too; you can easily trigger that with an Intel iGPU, and I could even trigger it with a 960.

The first issue is that a 3D LUT is too "small", for lack of a better word, so it cannot simply look up a color; it has to calculate it (madVR uses trilinear interpolation). This needs processing power; it's simple, but it's not zero.
The next issue, for lack of a better word again, is that they are too "big", which triggers heavy slowdowns with "rare colors", as madshi put it if I remember correctly.
There is also the issue of how big a single texture can be, which can be quite different depending on the GPU. GPU Caps Viewer shows this information under Vulkan.

A 1060 reports this for Vulkan:
maxImageDimension1D: 32768
maxImageDimension2D: 32768
maxImageDimension3D: 16384
maxImageDimensionCube: 32768

The DWM 3D LUT should have a slightly easier job, since its input is 10- or 8-bit data rather than 16-bit float, but on the other hand new GPUs really like FP16 and clearly not int10.
Old 10th December 2021, 18:57   #437  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,346
Quote:
Originally Posted by huhn View Post
There is also the issue of how big a single texture can be, which can be quite different depending on the GPU. GPU Caps Viewer shows this information under Vulkan.

A 1060 reports this for Vulkan:
maxImageDimension1D: 32768
maxImageDimension2D: 32768
maxImageDimension3D: 16384
maxImageDimensionCube: 32768
You do realize how big a 16384x16384x16384 3D LUT texture would be, right?
You'll explode your video memory long before these dimensions become a limit, because you need an x^3 texture equal in all dimensions; even 1024^3 in 8-bit would already be a gigabyte of space per channel.

So most practical 3D LUTs are likely at most 512^3 or smaller, which is where interpolation comes in to fill in the gaps.
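The scaling is easy to check: a 3D LUT needs size^3 entries, so memory explodes with the edge length. A quick calculation (256^3 at 16 bits per channel lands right at the ~100 MB madVR figure in the next post):

Code:
// 3D LUT storage: size^3 texels, times bytes per texel.
#include <cstdio>

int main()
{
    const struct { int size; int bytes; const char* fmt; } c[] = {
        {   64, 8, "RGBA16" },  // ~2 MiB: typical display calibration LUT
        {  256, 6, "RGB16"  },  // ~96 MiB: matches madVR's ~100 MB figure
        { 1024, 3, "RGB8"   },  // ~3 GiB: why nobody ships LUTs this big
    };
    for (const auto& e : c) {
        double mib = 1.0 * e.size * e.size * e.size * e.bytes / (1024 * 1024);
        std::printf("%5d^3 %-6s = %8.1f MiB\n", e.size, e.fmt, mib);
    }
    return 0;
}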
Old 10th December 2021, 19:19   #438  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,925
madVR 3D LUTs are 256^3 (which is the biggest I'm aware of; 65^3 or 64^3 is already considered huge, and monitors easily go down to 16^3), which is about 100 MB, so irrelevant. But that's not the issue; it's the temporarily calculated numbers, usually in the green.

Beware that my 1060's limit is insane compared to the 960, and I have no clue what DX9 has available.

There are render times of 200 ms just from the 3D LUT, and madshi said something about using smaller 3D LUTs; memory is clearly not the issue here.

Feel free to find out what the real issue is. I tried to find the bug report about it, but it was not my report. There will be reports of this from me here on the forum, but they are old at best.

Edit: the input is a floating-point number; you always have to fill in gaps with madVR.
Old 10th December 2021, 21:34   #439  |  Link
e-t172
Registered User
 
Join Date: Jan 2008
Posts: 589
Quote:
Originally Posted by huhn View Post
There are render times of 200 ms just from the 3D LUT, and madshi said something about using smaller 3D LUTs; memory is clearly not the issue here.
Feel free to find out what the real issue is. I tried to find the bug report about it, but it was not my report. There will be reports of this from me here on the forum, but they are old at best.
I did report an issue back in the day where using a 3D LUT with a weak Intel GPU would make render times shoot up, but only in certain scenes. You even commented on it. The main hypothesis was that the GPU would evict parts of the 3D LUT from some cache due to rarely used colors, and would struggle to get them back when those colors happened to be used.
Old 11th December 2021, 00:04   #440  |  Link
aufkrawall
Registered User
 
Join Date: Dec 2011
Posts: 1,812
GPU usage doesn't really increase on an RTX 3060 when using a 3D LUT in madVR.
Edit: Additional VRAM consumption is ~220 MB.

Last edited by aufkrawall; 11th December 2021 at 00:25.