3rd December 2021, 18:03 | #421 | Link |
Registered User
Join Date: Nov 2004
Location: Poland
Posts: 2,843
|
Those are laptop screens, so there is no easy way to use them as a 2nd screen anyway.
If you want control, you buy a monitor which can be calibrated (or a LUT box), but this is really not 'home' tech and costs solid $. Mac at least offers some color management, whereas on Windows it's all a mess. At the same time, nothing stops MS from offering color management which lets you easily control your screen. There is something already there, but it never seems to work.

We are talking about slightly different things: you're not really talking about home tech, but more pro (I worked at a top VFX house and had access to Sony HDR OLED and Dolby monitors). Things don't have to be that expensive to be accurate enough, and they could be controlled by the OS. It just needs to finally be written and implemented properly (but it won't be, as there is no real money to be made in it). Specialised tech like SDI etc. is the past, as it can all be achieved far more easily now. Video is nothing more than data, so there is no need to treat it in a special way to pass it from A to B.

Metadata is hugely underestimated. If not for Dolby's (a private company!) initiative and push, we would still be watching Rec.709 at 100 nits. Dolby worked on HDR for a long time and used to show it (behind closed doors) to professionals at many shows, but the reception was poor. The industry is scared of the new, as it costs them money and doesn't necessarily bring money in.

You can write your own player and control HDMI metadata regardless of OS, no?

Last edited by kolak; 3rd December 2021 at 18:30.
4th December 2021, 02:54 | #422 | Link | |
Registered User
Join Date: Oct 2012
Posts: 7,925
|
No, I'm not talking about laptops... they literally have an external device which can't be properly used outside of macOS - by design, not because of something other OSes can't do.
Gaming monitors are missing these basic things, yet even cheap TVs have fully supported gamut control over HDMI signals - for many, many years, I should add.
4th December 2021, 15:06 | #423 | Link |
Registered User
Join Date: Nov 2004
Location: Poland
Posts: 2,843
|
I assume so, as there are pro players which allow you to monitor HDR over the GPU's HDMI output and give you full control over it.
Can't madVR talk to the GPU directly and pass HDR metadata (regardless of Windows settings)?

Last edited by kolak; 4th December 2021 at 15:11.
4th December 2021, 19:29 | #424 | Link |
Registered User
Join Date: Oct 2012
Posts: 7,925
|
Yes, NVIDIA and AMD have APIs to set HDMI metadata.
That's the problem: these monitors don't care, because they are not able to handle this correctly. They always use the wide gamut even if BT.709 is sent, or at best have a very bad clamp. If they support fake HDR they can usually gamut-map that, which is just hilarious... BT.2020 to 95% DCI-P3 D65, yes; BT.709 to BT.709, nope.
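To make the "metadata" side concrete: HDR10 static metadata is just a handful of integers, the SMPTE ST 2086 mastering-display values plus content light levels, which the player hands to the driver API. A hedged Python sketch of the encoding (the helper name is mine; the unit scaling and the G/B/R/WP/L layout follow the widely used x265 --master-display string convention):

```python
# Encode SMPTE ST 2086 mastering-display metadata the way it appears in an
# HEVC SEI / x265 --master-display string: chromaticities are stored in
# units of 0.00002, luminance in units of 0.0001 cd/m2.

def encode_st2086(primaries, white, max_nits, min_nits):
    """primaries: dict of (x, y) floats for 'R', 'G', 'B'; white: (x, y)."""
    enc = lambda v: round(v / 0.00002)       # chromaticity -> integer
    lum = lambda nits: round(nits / 0.0001)  # cd/m2 -> integer
    g, b, r = primaries["G"], primaries["B"], primaries["R"]
    return "G({},{})B({},{})R({},{})WP({},{})L({},{})".format(
        enc(g[0]), enc(g[1]), enc(b[0]), enc(b[1]),
        enc(r[0]), enc(r[1]), enc(white[0]), enc(white[1]),
        lum(max_nits), lum(min_nits))

# BT.2020 container mastered at 1000 nits with a 0.0001 nit black floor:
bt2020 = {"R": (0.708, 0.292), "G": (0.170, 0.797), "B": (0.131, 0.046)}
master_display = encode_st2086(bt2020, (0.3127, 0.3290), 1000, 0.0001)
```

That last call produces the familiar G(8500,39850)B(6550,2300)R(35400,14600)WP(15635,16450)L(10000000,1) string; whether the display honours any of it is, as noted above, another matter entirely.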
4th December 2021, 22:57 | #425 | Link |
Registered User
Join Date: Nov 2004
Location: Poland
Posts: 2,843
|
Unless you use a good monitor like an Eizo, but even then you in most cases need to switch the mode manually.
With Eizo there is a way to control it by connecting a USB cable and sending special commands through it. There is an NLE which can control Eizo monitors this way, depending on the project settings you choose. The same could be done for a player. I don't get why a monitor can't switch itself to the correct mode based on the info which travels over HDMI. All the needed info is there.
5th December 2021, 02:07 | #426 | Link |
Registered User
Join Date: Dec 2011
Posts: 1,812
|
While it would be great if more displays could handle flagged signals accordingly, one big issue remains: lots of them would screw it up anyway, e.g. by locking color or even brightness settings, introducing banding, or simply scoring low sRGB coverage. I wouldn't want to sacrifice the display's native 6500 K white point either; doing it via calibration lowers the contrast ratio (at least if it's not an OLED).
Ideally we would be able to feed Windows or the GPU driver a 3D LUT (or it would generate a proper one from the ICC profile itself) and it would do a high-quality conversion all the time before scan-out. In case there is content in other non-wide gamuts like Rec.601, applications could just convert it to Rec.709 (lots of seemingly Rec.601 content is downscaled Rec.709 content without proper flags/conversion anyway). HDR would still toggle the native wide gamut mode.
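The Rec.601-to-Rec.709 conversion suggested here is cheap for applications: in linear light it is a single 3x3 matrix, derived by chaining each gamut's RGB-to-XYZ matrix. A sketch in Python/NumPy, assuming the SMPTE-C flavour of 601 primaries and a shared D65 white point (the function names are mine):

```python
import numpy as np

# Derive the linear-light RGB matrix that maps one set of primaries to
# another (here SMPTE-C "601" to BT.709), both with a D65 white point.

def rgb_to_xyz(prim, white):
    """prim: list of (x, y) for R, G, B; white: (x, y) of the white point."""
    xyz = lambda x, y: np.array([x / y, 1.0, (1 - x - y) / y])
    m = np.stack([xyz(x, y) for x, y in prim], axis=1)  # columns = primaries
    s = np.linalg.solve(m, xyz(*white))  # scale primaries so RGB(1,1,1) = white
    return m * s

smpte_c = [(0.630, 0.340), (0.310, 0.595), (0.155, 0.070)]
bt709   = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
d65     = (0.3127, 0.3290)

# 601 RGB -> XYZ -> 709 RGB, collapsed into one matrix:
m601_to_709 = np.linalg.inv(rgb_to_xyz(bt709, d65)) @ rgb_to_xyz(smpte_c, d65)
```

Because both gamuts share the D65 white point, the matrix maps white to white exactly, and the off-diagonal terms stay small, which is part of why mis-flagged 601 content so often goes unnoticed.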
5th December 2021, 04:13 | #427 | Link |
Registered User
Join Date: Oct 2012
Posts: 7,925
|
There is absolutely no issue with doing such things on TVs, including entry-class ones.
ICC profiles aren't good enough to gamut-map properly, and while a 3D LUT would do the trick, a proper one costs noticeable performance. The white point has nothing to do with this; why the white point or brightness or whatever is locked in sRGB or BT.709 modes is absolutely beyond me.

Do not forget that nearly all programs don't tell the OS what color gamut is used. This will change now, but old programs will not get updated, so BT.709 has to be guessed.
5th December 2021, 14:06 | #428 | Link | ||
Registered User
Join Date: Dec 2011
Posts: 1,812
|
Quote:
Monitors aren't TVs, and those cheap TVs you mentioned often have absurd input lag.

Nope, mpv can pick up the ICC profile, and when specifying gamma 2.4 it looks exactly like DisplayCal's 3D LUT with 100% black output offset. There might be mathematical differences, but they don't really matter. There is zero banding, and it works on my crappy Gemini Lake GPU. Quote:
It probably is related, because the scalers (or whatever you would call the component that does the color adjustments) in monitors are so limited.
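For reference, the mpv ICC path mentioned above boils down to a couple of mpv.conf options (option names per the mpv manual; the LUT size shown is just a common default, not necessarily what was used here):

```ini
# mpv.conf -- ICC-based gamut mapping with the GPU video output
icc-profile-auto=yes               # pick up the profile installed for the display
#icc-profile=/path/to/display.icm  # or point at a profile explicitly
icc-3dlut-size=64x64x64            # resolution of the 3D LUT mpv builds from it
```

mpv builds the 3D LUT from the profile and dithers its output, which is consistent with the lack of banding observed, in contrast to Windows' own 1D LUT handling.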
5th December 2021, 15:39 | #429 | Link | ||||
Registered User
Join Date: Oct 2012
Posts: 7,925
|
Quote:
LG OLED screens are among the fastest screens on the market, if not the fastest; well aware they are not cheap. Gaming displays are just bad because they get away with it. Quote:
Quote:
A 3D LUT in madVR can kill an entire entry-class GPU just from tone mapping; even my GTX 960 didn't have enough processing power to do it reliably. And even if it's easily doable on a new GPU, you still lose something like 5% performance, which is quite a lot. I can't believe that ReShade gets away without losing some FPS in games; just calling a shader costs at least 0.2 ms in my experience, even if the shader does nothing. Quote:
Hardware Unboxed tests this, and yes, clamps are usually very, very badly implemented and need ICC profiles to be of any use, so they are sadly pretty useless. There are some that are not terrible, but I'm not following screens right now. If I had to buy one right now, I would look for one that can't do DCI-P3 D65; sounds dumb, but I like accurate colors.
5th December 2021, 17:40 | #430 | Link | ||||
Registered User
Join Date: Dec 2011
Posts: 1,812
|
Quote:
Yes, Windows color management is indeed extra bad. It's even too stupid to apply a 1D LUT without a ton of banding. That amount of incompetence really is remarkable. Hopefully the Linux Wayland desktop will crush this garbage around 2025, but app support will of course still be an issue, at least partially... Quote:
ReShade can force D3D applications into a 10-bit frame buffer, and this reduces the banding introduced by the 3D LUT to a really acceptable amount for gaming (and Windows dithers down to 8 bit for 8-bit displays). Interestingly, the old Mirror's Edge game from 2009 has very clean skybox gradients with only very little banding; I found it to be a good test case. Quote:
Quote:
I think the first 1440p 144 Hz IPS panels by AU Optronics were still closer to sRGB, but they were of terrible quality otherwise.

Last edited by aufkrawall; 5th December 2021 at 17:48.
9th December 2021, 15:29 | #431 | Link | |
Registered User
Join Date: Nov 2004
Location: Poland
Posts: 2,843
|
Quote:
I just did an HDR grade of my holiday video in Resolve on a new MacBook (using the fact that on Mac the Resolve GUI offers an accurate preview, including HDR). It looks great as HDR.

I took this video and played it on an older Mac with an SDR screen. QTX does tone mapping (it uses the HDR metadata, so the file needs to be properly flagged) and converts the preview to the screen profile, so the SDR representation of my HDR grade is correct. It's not as good as my old native SDR grade, but absolutely fine. Best bit: if you take this video to another Mac with a different screen, you get the "same" SDR preview. As long as the profiles are close to the real screen parameters, you get about the same preview on different machines (assuming they all cover about 100% of the Rec.709 gamut), which is exactly how it should work. Try doing this on Windows.

Of course, the ideal solution is for every screen today to have 100% P3 gamut and 1000-nit HDR support, plus modes for the HD and SD standards. If the video is in one of the official standards, the screen switches to the given profile; if not, you get a good conversion to the 'main' P3 profile. This is how the pro world works, except they don't really rely on any auto switching (or conversion); it's all set by a human on both ends to the same profile, in both the app feeding the video and the screen itself. Sometimes they forget to switch the mode on the monitor and waste e.g. grading time, which is then useless. It happens.

As long as we keep getting screens which are so far from any standard (and their profiles), none of this will work well.

Last edited by kolak; 9th December 2021 at 19:23.
9th December 2021, 23:53 | #432 | Link |
Registered User
Join Date: Dec 2011
Posts: 1,812
|
Thanks for those Mac insights!
According to this Reddit thread, Eizo recommends staying away from Windows 11 due to brokenness with ICC profiles: https://www.reddit.com/r/Windows11/c..._icc_profiles/ That likely explains why installing the ICC profile doesn't work for me... But the linked tool dwm_lut is interesting; it indeed works. It has some limitations and causes a lot of GPU load, but I can watch 1080p video via streaming apps with proper sRGB gamut mapping.
10th December 2021, 16:49 | #435 | Link |
Registered Developer
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,346
|
LUTs are not particularly expensive to apply; you just need enough memory to store them. Otherwise it's basically just one lookup per pixel, something GPUs are good at, and dedicated image processors are even better at. If that DWM LUT tool is particularly slow, it's probably because it's an external hook that adds an additional rendering pass and copies all textures around once. And most importantly, it disables DirectFlip, which every typical fullscreen renderer probably otherwise uses to bypass DWM entirely.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders |
10th December 2021, 18:31 | #436 | Link |
Registered User
Join Date: Oct 2012
Posts: 7,925
|
madVR dies to it too; you can easily trigger that with an Intel iGPU, and I can even trigger it with a 960.
The first issue is that a 3D LUT is too "small", for lack of a better word, so it cannot simply look up a color; it has to calculate it (madVR uses trilinear interpolation). This needs processing power; it's simple, but it's not zero. The next issue, for lack of a better word again, is that they are too "big", which triggers heavy slowdowns with "rare colors" (the words used by madshi, if I remember correctly). There is also the issue of how big a single texture can be, which can differ quite a bit depending on the GPU. GPU Caps Viewer shows this information under Vulkan. A 1060 has this for Vulkan:

maxImageDimension1D: 32768
maxImageDimension2D: 32768
maxImageDimension3D: 16384
maxImageDimensionCube: 32768

The DWM 3D LUT should have a slightly easier job by being fed 10- or 8-bit data instead of 16-bit float, but on the other hand new GPUs really like FP16 and clearly not int10.
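The "calculate instead of look up" point is just trilinear interpolation: an arbitrary input colour falls between 8 stored lattice entries and has to be blended from them. A minimal NumPy sketch of that per-pixel work (a GPU does the same thing in its texture units):

```python
import numpy as np

# Trilinear sampling of a 3D LUT: the LUT only has N entries per axis, so
# an arbitrary colour is blended from the 8 surrounding lattice entries.
# That blend is the small-but-nonzero per-pixel cost discussed above.

def sample_3dlut(lut, rgb):
    """lut: (N, N, N, 3) array indexed [r, g, b]; rgb: floats in [0, 1]."""
    n = lut.shape[0]
    pos = np.clip(np.asarray(rgb, float), 0.0, 1.0) * (n - 1)
    lo = np.floor(pos).astype(int)
    hi = np.minimum(lo + 1, n - 1)
    f = pos - lo                           # fractional position per axis
    out = np.zeros(3)
    for dr in (0, 1):                      # blend the 8 surrounding corners
        for dg in (0, 1):
            for db in (0, 1):
                idx = np.where([dr, dg, db], hi, lo)
                w = np.prod(np.where([dr, dg, db], f, 1 - f))
                out += w * lut[idx[0], idx[1], idx[2]]
    return out

# Identity LUT: the stored colour equals the lattice coordinate.
n = 17
axis = np.linspace(0, 1, n)
identity = np.stack(np.meshgrid(axis, axis, axis, indexing="ij"), axis=-1)
```

With the identity LUT the interpolation returns the input unchanged, which makes a handy sanity check before loading a real calibration LUT.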
10th December 2021, 18:57 | #437 | Link | |
Registered Developer
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,346
|
Quote:
You'll explode your video memory long before these dimensions are a limit, because you need an x^3 texture in all dimensions, so even 1024^3 would already be several gigabytes of space in 8-bit. So most practical 3DLUTs are likely at most 512^3 or smaller, which is where interpolation comes in to fill in the gaps.
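The storage math is easy to sanity-check: entries grow with the cube of the per-axis size, so doubling the resolution costs 8x the memory. A quick Python check (sizes and byte widths are illustrative):

```python
# 3D LUT storage cost: n^3 lattice entries, each holding one colour.

def lut_bytes(n, channels=3, bytes_per_channel=1):
    return n ** 3 * channels * bytes_per_channel

mib = lambda b: b / 2**20   # bytes -> MiB

size_64   = lut_bytes(64)                         # typical calibration LUT
size_256  = lut_bytes(256, bytes_per_channel=2)   # large 16-bit LUT
size_1024 = lut_bytes(1024)                       # the hypothetical above
```

Even at only 8 bits per channel, 1024^3 lands at 3 GiB, while a typical 64^3 calibration LUT is under a megabyte, which is why interpolation over a coarse lattice is the only practical option.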
__________________
LAV Filters - open source ffmpeg based media splitter and decoders |
10th December 2021, 19:19 | #438 | Link |
Registered User
Join Date: Oct 2012
Posts: 7,925
|
madVR 3D LUTs are 256^3 (the biggest I'm aware of; 65^3 or 64^3 is already considered huge, and monitors easily go down to 16^3), which is about 100 MB, so irrelevant. But that's not the issue; it's the temporary calculated numbers, usually in the green.
Beware: my 1060's limits are insane compared to the 960, and I have no clue what DX9 has available. There are render times of 200 ms just from the 3D LUT, and madshi said something about using smaller 3D LUTs; the memory is clearly not the issue here. Feel free to find out what the real issue is. I tried to find the bug report about it, but it was not my report. There will be reports of this from me here on the forum, but they are old at best.

Edit: the input is a floating point number, so you always have to fill in gaps with madVR.
10th December 2021, 21:34 | #439 | Link | |
Registered User
Join Date: Jan 2008
Posts: 589
|
Quote: