12th August 2020, 01:47   #30
Blue_MiSfit
Derek Prestegard IRL

Join Date: Nov 2003
Location: Los Angeles
Posts: 5,989
Quote:
Originally Posted by wswartzendruber
FUD? You mean Fear, Unbelief, and Doubt?

My background is that I didn't get into HDR at all until HDR10+ was already a thing. So I'm very late to the game. Research and study (various docs from the ITU and BBC) revealed that for some reason, the industry had chosen to begin distributing content in a system that used absolute brightness coupled with metadata. Distribution to end users inherently entails differing viewing environments, which metadata doesn't address at all.

If you are going to point at HLG and call its rationale FUD, then I want you to defend distributing content in PQ instead. What advantages do you receive?
FUD = fear, uncertainty, and doubt

To be honest, I was just trolling a little. I don't have any hard evidence whatsoever that HLG is inferior to PQ, or that using PQ offers some advantage over HLG, aside from my unchallenged point that PQ offers wider device compatibility in the Rec.2020 case, especially on early HDR TVs.

Thankfully, almost everyone seems to support HLG on modern devices now, from what I can tell. The notable exception is Roku set-top boxes (distinct from TVs running Roku's software, like TCL). This is a ___HUGE___ market in the US, actually, and is the single biggest cohort for 10-foot viewing here, so that's a major factor.

Does anyone know of any other new devices that support a PQ format but not HLG?

The creatives I've spoken with prefer PQ, and are in general fans of having an absolute signal that is display-referred. Their standpoint is that with relative luminance, HLG can misrepresent contrast, since the same signal lands at a different absolute light level on every display (the sketch below shows the effect).
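
To put numbers on that, here's a quick Python sketch using the PQ EOTF from SMPTE ST 2084 and the HLG curves from BT.2100. I'm assuming a single achromatic channel (so scene luminance equals the channel value); the constants are the published ones, but treat this as an illustration, not a reference implementation:

Code:
import math

# SMPTE ST 2084 (PQ) constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(signal):
    """PQ is display-referred: a code value means a fixed number of nits."""
    p = signal ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

def hlg_inverse_oetf(signal):
    """BT.2100 HLG inverse OETF: signal -> normalized scene light [0, 1]."""
    a, b, c = 0.17883277, 0.28466892, 0.55991073
    if signal <= 0.5:
        return signal * signal / 3.0
    return (math.exp((signal - c) / a) + b) / 12.0

def hlg_display_nits(signal, peak_nits):
    """HLG is scene-referred: the BT.2100 OOTF scales output to the display
    peak, with a system gamma that itself depends on that peak."""
    gamma = 1.2 + 0.42 * math.log10(peak_nits / 1000.0)
    scene = hlg_inverse_oetf(signal)   # achromatic, so Ys == E
    return peak_nits * scene ** gamma

# PQ reference white (58% signal) is ~202 nits on any compliant display:
print(f"PQ 0.58 -> {pq_eotf(0.58):.0f} nits, on anything")
# HLG nominal reference white (75% signal) moves with the panel:
for peak in (500, 1000, 2000):
    print(f"HLG 0.75 on a {peak}-nit panel -> "
          f"{hlg_display_nits(0.75, peak):.0f} nits")

On a 1000-nit panel both land around 200 nits, but the same HLG signal comes out near 120 nits at 500 nits peak and near 340 nits at 2000, which is exactly the shifting-contrast behavior the display-referred camp objects to.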

However, I absolutely do feel the pain of needing to worry about whether something is a 1000-nit grade or a 2000-nit grade, and the fact that mapping either of those up to a 4000-nit display just isn't defined, even though there are plenty of pretty good solutions for mapping them down to, say, a 600-nit display (a toy example below). I can see the draw of sticking to a relative, scene-referred signal, and having a reasonable degree of faith that it will display nicely on a broad swath of HDR WCG displays without needing to futz with metadata.
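
Down-mapping at least has sane answers. A toy Python example of what I mean: a soft knee in linear light, where the knee fraction and the exponential shoulder are arbitrary choices of mine; real devices tend to use something like the BT.2390 EETF on the PQ-encoded signal instead:

Code:
import math

def tone_map_nits(y, dst_peak=600.0, knee=0.75):
    """Toy soft-knee tone map in linear light (nits in, nits out).
    Below knee * dst_peak the image passes through untouched; above it,
    an exponential shoulder compresses highlights toward dst_peak."""
    k = knee * dst_peak
    if y <= k:
        return y
    # Unit slope at the knee (no visible break), asymptote at dst_peak.
    return k + (dst_peak - k) * (1.0 - math.exp(-(y - k) / (dst_peak - k)))

# A 1000-nit highlight on a 600-nit panel:
print(f"{tone_map_nits(1000.0):.0f} nits")   # ~596, just shy of panel peak

Going the other way (stretching a 1000-nit grade to fill a 4000-nit panel) is the genuinely undefined part.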

My point was that, to be totally pragmatic, many displays totally ignore metadata and just do their own thing. So, like, I wouldn't be surprised if everything would mostly be fine if we had no metadata at all (other than the transfer, primaries, and matrix / colour-differencing flags to identify content as PQ or HLG, and 709 or 2020, of course!), hence the "metadata FUD" troll. The snippet below shows exactly which flags I mean.
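
For the record, those three flags are just the H.273 "CICP" code points. A quick sketch, with the values as I remember them from ITU-T H.273 / ISO/IEC 23091-2 (double-check before shipping anything):

Code:
# H.273 / CICP code points: (colour_primaries, transfer_characteristics,
# matrix_coefficients).  These three integers fully identify the signal
# without any dynamic-range metadata.
CICP = {
    "BT.709 SDR":  (1, 1, 1),
    "BT.2020 PQ":  (9, 16, 9),   # transfer 16 = SMPTE ST 2084 (PQ)
    "BT.2020 HLG": (9, 18, 9),   # transfer 18 = ARIB STD-B67 (HLG)
}

# The same thing as ffmpeg flags, e.g. for a PQ encode:
#   -color_primaries bt2020 -color_trc smpte2084 -colorspace bt2020nc
# and for HLG:
#   -color_primaries bt2020 -color_trc arib-std-b67 -colorspace bt2020nc
for name, (prim, trc, mat) in CICP.items():
    print(f"{name}: primaries={prim} transfer={trc} matrix={mat}")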

Personally I still prefer Dolby Vision Profile 5 for a number of reasons, but that's getting OT pretty fast!

Last edited by Blue_MiSfit; 12th August 2020 at 01:54.