Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion. Before you start posting please read the forum rules. By posting to this forum you agree to abide by the rules. |
12th September 2018, 19:59 | #1 | Link |
Registered User
Join Date: Sep 2013
Posts: 22
|
Encoding HDR movie with SDR compatibility?
I copy all of my BluRays into a personal "digital archive", so I'll be able to easily access them later on from a range of different devices. I start by ripping the discs with MakeMKV, and I then re-encode them with ffmpeg, so they won't be quite so huge.
For the first time, I have a 4K BluRay that supports HDR, and I'm not sure what to do with it. I don't want to lose HDR, but I also want the movie to look more-or-less correct on standard screens. I'd also like the movie to be playable with more standard software than madVR. Although I truthfully don't have a specific device in mind, I generally live in Apple's ecosystem, so for the purposes of this question, pretend I'm targeting both the iPad Pro (HDR) and the standard iPad (no HDR).
Without creating two separate streams, is this completely impossible right now? I'm starting to suspect it is, because I haven't found any information on how to do it. I would have expected there to be a metadata field that marks a stream as HDR and provides players with information to guide tone mapping. It seems to me that some day, Apple or Samsung or someone else is going to release a phone that can record HDR video, and people will expect their videos to be viewable on all types of displays without saving out multiple formats. Am I SOL?
To be clear, I know that HDR → SDR tone mapping will never be perfect. I'm just looking for something better than the super washed-out colors you get by default when playing HDR content on SDR screens. Thanks! Again, if this is impossible, that's a fine answer; I just wanted to check! |
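For context on where the washed-out look comes from: SDR players display PQ-encoded values as if they were ordinary gamma, instead of decoding them to absolute light and compressing the range. A rough Python sketch of what a tone-mapping player does conceptually (the PQ constants are from BT.2100; the Hable curve is just one of several common operators, and the 1,000-nit peak is an assumption, not something read from the disc):

```python
import math

# SMPTE ST 2084 (PQ) EOTF constants, from ITU-R BT.2100
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_to_nits(code):
    """Decode a normalized PQ code value (0..1) to absolute luminance in nits."""
    p = code ** (1 / M2)
    num = max(p - C1, 0.0)
    den = C2 - C3 * p
    return 10000.0 * (num / den) ** (1 / M1)

def hable(x):
    """Hable (Uncharted 2) filmic tone-mapping operator."""
    a, b, c, d, e, f = 0.15, 0.50, 0.10, 0.20, 0.02, 0.30
    return ((x * (a * x + c * b) + d * e) / (x * (a * x + b) + d * f)) - e / f

def tonemap_to_sdr(code, peak_nits=1000.0):
    """Map one PQ code value to a 0..1 SDR linear-light value (per-channel sketch)."""
    linear = pq_to_nits(code) / peak_nits        # light relative to assumed peak
    return min(hable(linear) / hable(1.0), 1.0)  # normalize so the peak maps to 1.0
```

Real players (madVR, mpv) do this per pixel in linear RGB with gamut mapping on top, but the shape of the problem is the same: everything above the SDR-ish range gets compressed rather than clipped.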
12th September 2018, 20:40 | #4 | Link | |
Registered User
Join Date: Sep 2013
Posts: 22
|
Quote:
What is the quality loss like when converting PQ → HLG? And how bad will it look on standard 8-bit SDR screens? Last edited by Wowfunhappy; 12th September 2018 at 21:14. |
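The core of the quality question is that PQ is absolute (codes map to 0–10,000 nits) while HLG is relative to the display, so a converter has to decide how to fit PQ's highlight range into HLG's nominal range. A sketch of the two BT.2100 transfer functions in Python, with a deliberately naive hard clip at an assumed 1,000-nit HLG peak (real converters, including the LUT posted below in this thread, roll highlights off instead):

```python
import math

# ITU-R BT.2100 constants: PQ (ST 2084) and HLG
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
A = 0.17883277
B = 1 - 4 * A
C = 0.5 - A * math.log(4 * A)

def pq_to_nits(code):
    """ST 2084 (PQ) EOTF: normalized code value (0..1) -> nits."""
    p = code ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

def hlg_oetf(e):
    """BT.2100 HLG OETF: scene-linear light (0..1) -> HLG signal (0..1)."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)
    return A * math.log(12 * e - B) + C

def pq_to_hlg(code, hlg_peak_nits=1000.0):
    """Naive PQ -> HLG conversion: anything above the assumed HLG peak
    is simply clipped, which is exactly where the quality loss lives."""
    e = min(pq_to_nits(code) / hlg_peak_nits, 1.0)
    return hlg_oetf(e)
```

With this naive version, a 3,900-nit specular highlight and a 10,000-nit one come out identical, which is why the choice of roll-off curve (or LUT) matters so much.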
|
13th September 2018, 14:04 | #6 | Link | |
Broadcast Encoder
Join Date: Nov 2013
Location: Royal Borough of Kensington & Chelsea, UK
Posts: 2,905
|
Quote:
You won't be able to do this on regular Avisynth, but you can do it in Avisynth+ via HDRTools: https://forum.doom9.org/showthread.php?t=175488 Alternatively, you can use a LUT (in attachment): Code:
#Index
FFMpegSource2("Video.m2ts", atrack=-1)

#Bring everything to 16bit
ConvertBits(16)

#Convert to RGB 16bit
ConvertToPlanarRGB()

#Convert HDR PQ to HDR HLG
Cube("PQ_to_HLG.cube")

#Convert to YUV 4:2:0
ConvertToYUV420(matrix="Rec2020")

#Dither down to 10bit with Floyd-Steinberg error diffusion
ConvertBits(bits=10, dither=1)

Last edited by FranceBB; 13th September 2018 at 14:07. |
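For anyone wondering what that Cube() call actually does under the hood: a .cube file is just a 3D grid of RGB output triplets, and each pixel is converted by trilinear interpolation into that grid. A minimal Python sketch using a hypothetical 2-point identity LUT (not the attached PQ_to_HLG.cube, whose contents aren't reproduced here):

```python
def apply_lut_3d(lut, size, r, g, b):
    """Sample a size^3 LUT (lut[ir][ig][ib] = (r, g, b) output triplet)
    at an input colour with each channel in 0..1, using trilinear
    interpolation, as .cube-based tools typically do."""
    def locate(v):
        x = v * (size - 1)
        lo = min(int(x), size - 2)  # clamp so lo+1 stays inside the grid
        return lo, x - lo
    ir, fr = locate(r)
    ig, fg = locate(g)
    ib, fb = locate(b)
    out = [0.0, 0.0, 0.0]
    # Blend the 8 surrounding grid points, weighted by distance
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                w = ((fr if dr else 1 - fr) *
                     (fg if dg else 1 - fg) *
                     (fb if db else 1 - fb))
                corner = lut[ir + dr][ig + dg][ib + db]
                for k in range(3):
                    out[k] += w * corner[k]
    return tuple(out)

# Hypothetical 2x2x2 identity LUT: output equals input
SIZE = 2
identity = [[[(i, j, k) for k in (0.0, 1.0)]
             for j in (0.0, 1.0)]
            for i in (0.0, 1.0)]
```

A real PQ-to-HLG cube stores non-identity triplets, typically at 33 or 65 points per axis, but the lookup mechanics are the same.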
|
15th September 2018, 12:41 | #9 | Link | |
Moderator
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,771
|
Quote:
Also, HLG support in displays and devices isn't as common as HDR-10 support. The HDR on that disc had a separate color grade with different creative intent than the standard Blu-ray SDR grade. An SDR you derive from HDR will typically look quite a bit different than the intended SDR look for that title. So even if HLG works-ish, you still aren't getting the real SDR experience. |
|
16th September 2018, 08:04 | #10 | Link |
Angel of Night
Join Date: Nov 2004
Location: Tangled in the silks
Posts: 9,559
|
It would be interesting to see some HDR-vs-SDR-vs-HLG comparisons from the same source. Of course, I've seen some HDR movies that appear to have been graded in an entirely different way than the original SDR Blu-ray... and then there's The Matrix, which has gotten a completely different regrade with every release since the DVD days, to the point that you might as well just impose your own creative intent on it.
|
16th September 2018, 13:24 | #11 | Link | |
Moderator
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,771
|
Quote:
I’ve seen plenty. HDR-10 looks the best, particularly if there was a lot of creative color work done. HLG is pretty good, especially for stuff like live sports where there isn’t creative color. SDR derived from HDR can look great-for-SDR, but won’t match the creative intent of a dedicated SDR grade. It is generally fine for stuff like live sports/news/concerts, though. One issue with HLG is that the more tuned it is for HDR, the worse it’ll be on SDR, and vice versa. Sent from my iPhone using Tapatalk |
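That HDR/SDR trade-off is actually baked into the BT.2100 math: the HLG reference OOTF applies a system gamma that varies with the display's nominal peak luminance, so the same HLG signal is rendered differently on brighter and dimmer screens. A quick sketch (this is the BT.2100 extended-range formula; behaviour below roughly 400 nits is display-dependent in practice):

```python
import math

def hlg_system_gamma(peak_nits):
    """BT.2100 HLG system gamma for a display with the given nominal
    peak luminance; exactly 1.2 at the 1,000-nit reference display."""
    return 1.2 + 0.42 * math.log10(peak_nits / 1000.0)

# Dimmer displays get a lower gamma, brighter ones a higher gamma
for nits in (500, 1000, 2000):
    print(nits, round(hlg_system_gamma(nits), 3))
```

So HLG's "backwards compatibility" is really the display quietly re-rendering the signal, which is why a grade tuned hard for HDR can drift further from the look you'd want on SDR.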
|
18th September 2018, 08:25 | #12 | Link |
Derek Prestegard IRL
Join Date: Nov 2003
Location: Los Angeles
Posts: 5,989
|
I have to throw in a word of support for the overall Dolby Vision ecosystem. HDR10 is great, has the widest support for sure, and when "good" creative decisions are made it can really wow.
However, I'd argue that DoVi is technically a superior solution - you just have to be okay with everything that comes with it. |
18th September 2018, 11:02 | #13 | Link |
Angel of Night
Join Date: Nov 2004
Location: Tangled in the silks
Posts: 9,559
|
HDR10 has some definite, severe limitations -- but HDR10+ is technically identical to DV when it comes to gamut and limitations, for all practical purposes. The last thing DV has over the free competition is 12-bit, but I've never seen a real-world 12-bit DV release (there's no such thing as 12-bit UHD BD, and no streaming company uses it), so it's basically theoretical at this point. DV did get to dynamic metadata first, so I'm sure it'll hang on with studios eager to maximize their investment for many years, but there's going to have to be a DV+ if it's going to have a future.
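To put rough numbers on the 10-bit vs 12-bit question: in limited ("video") range, the usable luma codes scale with bit depth, and with PQ each extra bit roughly halves the worst-case luminance step between adjacent codes. A small Python sketch (limited-range bounds are the standard 8-bit 16..235 scaled up, per usual video conventions):

```python
def usable_luma_levels(bits):
    """Count of limited ('video') range luma codes at a given bit depth:
    the 8-bit range 16..235 scaled up by 2^(bits - 8)."""
    lo = 16 << (bits - 8)
    hi = 235 << (bits - 8)
    return hi - lo + 1

for bits in (8, 10, 12):
    print(bits, usable_luma_levels(bits))
```

So 12-bit offers roughly four times the codes of 10-bit over the same 0..10,000-nit PQ range, which is where the banding-headroom argument comes from; whether any consumer panel can actually resolve that is a separate question.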
If 12-bit ever makes inroads into the real world, I'm certain there will be an HDR12 to back it up by then. |
18th September 2018, 21:54 | #17 | Link |
Life's clearer in 4K UHD
Join Date: Jun 2003
Location: Notts, UK
Posts: 12,227
|
Until manufacturers start producing projectors and panels for the home market that are actually capable of displaying native 12-bit content, I personally don't see the point of Dolby Vision's version of 'dynamic' 12-bit HDR.
Currently, Dolby Vision for the home seems overly complicated, which is probably why there have been so many issues with it (e.g. black mattes looking grey, colours flashing or being displayed as the wrong colour, blown white levels...). Given that today's projector, panel and video-encoding world revolves around 10 bits, HDR10+ makes much more sense to me.
__________________
| I've been testing hardware media playback devices and software A/V encoders and decoders since 2001 | My Network Layout & A/V Gear |
|
18th September 2018, 22:05 | #18 | Link | |
Cary Knoop
Join Date: Feb 2017
Location: Newark CA, USA
Posts: 397
|
Quote:
Now, in 2018, folks are still enjoying color from '60s productions. |
|
18th September 2018, 22:19 | #19 | Link |
Life's clearer in 4K UHD
Join Date: Jun 2003
Location: Notts, UK
Posts: 12,227
|
Not the same logic at all...
__________________
| I've been testing hardware media playback devices and software A/V encoders and decoders since 2001 | My Network Layout & A/V Gear |
|
18th September 2018, 22:59 | #20 | Link |
Cary Knoop
Join Date: Feb 2017
Location: Newark CA, USA
Posts: 397
|
Then what are you saying? Should you master a movie using Dolby Vision so that a decade from now people can enjoy the 12-bit quality, or should you master using HDR10 because, well, almost no one has the capability right now to see it?
|
|
|