#241 | Link
Broadcast Encoder
Join Date: Nov 2013
Location: Royal Borough of Kensington & Chelsea, UK
Posts: 2,579
The thing is that a LUT doesn't understand Dolby Vision metadata: it's just a lookup table, numbers in, numbers out, so the results can change. Post a sample and I'll see what I can do. Oh, almost forgot: Merry Christmas!
#242 | Link
Registered User
Join Date: Oct 2022
Posts: 4
Whenever I import the "HLG_BT2020_to_Linear_BT709.cube" LUT into my Resolve, it gives an error; attached is the screenshot. The other LUTs from your collection imported fine. Below are the original sample and the processed videos, as you asked.

Original HLG video from iPhone 12: https://drive.google.com/file/d/1M_O...usp=share_link

Processed SDR video in Resolve with your "HLG_A_to_LinearBT709" LUT: https://drive.google.com/file/d/1RaN...usp=share_link

Processed Dolby Vision Profile 5 video in Resolve: https://drive.google.com/file/d/1L4z...usp=share_link

I am also attaching Resolve's project settings and export settings for the Dolby Vision and SDR exports, for your information.
#243 | Link |
Broadcast Encoder
Join Date: Nov 2013
Location: Royal Borough of Kensington & Chelsea, UK
Posts: 2,579
Happy New Year!
So... let's first take a look at what we have here. The 75% reference white (75% of the 700 mV video range, i.e. roughly 0.52V on the waveform) has overshoots above it in what are supposed to be specular highlights, like the reflections in the tiara and the other pieces of jewelry the girl is wearing. Far from an ideal shot, but I guess we can't expect much from a tiny sensor like the ones in mobile phones (iPhones included).

Although there's a great deal of motion blur at 29.970p, at least the real reference white is correctly positioned at 0.52V, as we can see on the left-hand side of the waveform monitor, so it looks like Apple didn't screw this up completely after all.

This is the conversion with the non-public BBC LUT:

Code:
# decode, move to 16 bit planar RGB, apply the BBC HLG -> BT.709 LUT, then back to YUV 4:2:0
video=LWLibavVideoSource("\\mibctvan000.avid.mi.bc.sky.it\Ingest\MEDIA\temp\Sample.mov")
audio=LWLibavAudioSource("\\mibctvan000.avid.mi.bc.sky.it\Ingest\MEDIA\temp\Sample.mov")
AudioDub(video, audio)
ConvertBits(16)
ConvertToPlanarRGB(matrix="Rec2020")
Cube("C:\Program Files (x86)\AviSynth+\LUTs\8a_HLG_bt709_AC_mode-nar_in-nar_out-nar_nocomp.cube", fullrange=true)
ConvertToYUV420(matrix="Rec709")

Now, I know that the BBC LUT is correct, which means there's something deeply wrong in the way Apple's logic saved the BT.2020 colours, as there's a huge shift towards magenta and red. My HLG_A_to_LinearBT709 LUT was made back in 2021 precisely to address that, and in fact the result is quite pleasing:

Code:
# same chain, but with the HLG_A_to_LinearBT709 LUT instead of the BBC one
video=LWLibavVideoSource("\\mibctvan000.avid.mi.bc.sky.it\Ingest\MEDIA\temp\Sample.mov")
audio=LWLibavAudioSource("\\mibctvan000.avid.mi.bc.sky.it\Ingest\MEDIA\temp\Sample.mov")
AudioDub(video, audio)
ConvertBits(16)
ConvertToPlanarRGB(matrix="Rec2020")
Cube("C:\Program Files (x86)\AviSynth+\LUTs\HLG_A_to_LinearBT709.cube", fullrange=true)
ConvertToYUV420(matrix="Rec709")

The magenta shift is still quite noticeable in your case, though, so I would consider either shifting the colours a bit further or just reducing the saturation in the magenta range only. I think something like this looks much more natural:

Code:
# same chain plus a Tweak that desaturates only the hue range where the magenta cast sits
video=LWLibavVideoSource("\\mibctvan000.avid.mi.bc.sky.it\Ingest\MEDIA\temp\Sample.mov")
audio=LWLibavAudioSource("\\mibctvan000.avid.mi.bc.sky.it\Ingest\MEDIA\temp\Sample.mov")
AudioDub(video, audio)
ConvertBits(16)
ConvertToPlanarRGB(matrix="Rec2020")
Cube("C:\Program Files (x86)\AviSynth+\LUTs\HLG_A_to_LinearBT709.cube", fullrange=true)
ConvertToYUV420(matrix="Rec709")
Tweak(StartHue=75, EndHue=135, sat=0.70, dither=true)

Last edited by FranceBB; 4th January 2023 at 12:18.
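If you want to judge the extra desaturation by eye before settling on a value, you can also put the plain LUT output and the tweaked one side by side. A minimal sketch along the same lines, reusing the source and LUT paths from the scripts above (the Tweak values are simply the ones suggested there, so tune them to taste):

Code:
# same decode + LUT chain as above, audio left out since this is only a visual check
LWLibavVideoSource("\\mibctvan000.avid.mi.bc.sky.it\Ingest\MEDIA\temp\Sample.mov")
ConvertBits(16)
ConvertToPlanarRGB(matrix="Rec2020")
Cube("C:\Program Files (x86)\AviSynth+\LUTs\HLG_A_to_LinearBT709.cube", fullrange=true)
ConvertToYUV420(matrix="Rec709")
plain   = last
tweaked = last.Tweak(StartHue=75, EndHue=135, sat=0.70, dither=true)
# label the two versions and stack them next to each other for comparison
StackHorizontal(plain.Subtitle("LUT only"), tweaked.Subtitle("LUT + Tweak"))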
#244 | Link |
Registered User
Join Date: Oct 2022
Posts: 4
Happy New Year. That's one hell of an analysis, which is certainly above my understanding and requirements. Thanks for the detailed reply, though.
I shoot lots of home videos of my daughter. I normally convert them to Dolby Vision (for my TV) and Rec709 (for sharing). I just wanted a quick LUT that could give me a Rec709 video with colours graded like the Dolby Vision one. I guess I will just use your LUT, which gives decent colours in Rec709.
#245 | Link
Broadcast Encoder
Join Date: Nov 2013
Location: Royal Borough of Kensington & Chelsea, UK
Posts: 2,579
Oh, I didn't know that she was your daughter.
It's nice that you get to spend some time with your family.
The "official" use we have for that LUT is our News channel. Essentially, we get viewers sending us videos from all over the world, 'cause everyone has a smartphone these days, and people started sending us HLG videos shot on the iPhone in early 2021, so we needed to cope with those, especially given that the news channel is still in Full HD BT.709; hence the LUT.

The other use we had for this LUT was during the America's Cup: we couldn't send cameramen physically on the boats as they would slow them down, so we had the crew recording videos on their iPhones in HLG and sending those over (I talked about this earlier in this thread if you're curious).
#246 | Link |
Registered User
Join Date: Feb 2020
Posts: 499
There are now HDR10+ EETFs in libplacebo and in FFmpeg: https://github.com/FFmpeg/FFmpeg/com...2530c2e0d86450
So there's initial support for the base of the Dolby Vision curves too, another part of ST 2094.
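For anyone who wants to try them, a rough command-line sketch (assuming an FFmpeg build with libplacebo enabled and a working Vulkan device; file names are placeholders and the exact option names can differ between FFmpeg/libplacebo versions):

Code:
# tone map a PQ BT.2020 (HDR10/HDR10+) source down to SDR BT.709 with the ST 2094-40 EETF;
# swap st2094-40 for st2094-10 to try the curve Dolby Vision is based on
ffmpeg -init_hw_device vulkan -i hdr_input.mkv \
  -vf "libplacebo=tonemapping=st2094-40:colorspace=bt709:color_primaries=bt709:color_trc=bt709:format=yuv420p" \
  -c:v libx264 -crf 18 -c:a copy sdr_output.mkv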
#247 | Link |
Broadcast Encoder
Join Date: Nov 2013
Location: Royal Borough of Kensington & Chelsea, UK
Posts: 2,579
Any chance of having a CPU-only version of libplacebo in FFmpeg as well as the current GPU one?
I mean, currently it requires GPU acceleration, which is fine on-prem, but not very cloud-friendly given the limits of most AWS EC2 instances... Ideally, if a GPU isn't available, it should fall back to a CPU-only path instead of failing.
#248 | Link
Registered User
Join Date: Jan 2019
Location: Canada
Posts: 545
lavapipe, the CPU-only software Vulkan implementation (packaged on Arch as vulkan-swrast), can be used when there's no GPU: https://archlinux.org/packages/extra...vulkan-swrast/

Following the Arch Wiki example for "Software Vulkan: lavapipe" at https://wiki.archlinux.org/title/Vulkan.

In FFmpeg, vf_libplacebo is currently only supported for Vulkan frames. It's a GPU library, after all.
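For completeness, pointing the loader at lavapipe looks roughly like this (a sketch based on the Arch package above; the ICD path/filename and the preferred environment variable can differ per distro and loader version):

Code:
# force the Vulkan loader onto the CPU-only lavapipe driver from vulkan-swrast
export VK_ICD_FILENAMES=/usr/share/vulkan/icd.d/lvp_icd.x86_64.json
# sanity check: the reported device should be llvmpipe / lavapipe
vulkaninfo --summary
# with that in place, "ffmpeg -init_hw_device vulkan ... -vf libplacebo=..." as in the
# earlier example runs on the software device: slow, but it no longer needs a GPU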
__________________
LG C2 OLED | GitHub Projects

Last edited by quietvoid; 23rd February 2023 at 18:39.
#250 | Link
hlg-tools Maintainer
Join Date: Feb 2008
Posts: 354
I don't remember anything about basket weaving.