Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Doom9's Forum > Video Encoding > High Efficiency Video Coding (HEVC)

Today, 02:57   #1
Registered User
Join Date: May 2014
Posts: 24
UHD1 (Astra 19.2° East): how to test HLG for actual HDR?


So I recorded a transport stream of the channel UHD1 and I'm wondering whether the material is actually HDR.

The whole channel constantly broadcasts UHD with HLG HDR, and since HLG is backwards compatible with SDR: is there a program that lets me analyze whether the full capabilities of this HDR mapping were used?

I don't even know how to word a Google search that could find one, but at least I tried.

If there is no tool for this (which is my guess at the moment), then it might be possible to extract 10-bit PNGs from the stream (I have a few tools at hand, like ffmpeg and VLC).
How would I manually analyze such images in GIMP or another image tool?

I obviously would need a screenshot that is pretty bright / has a high dynamic range (and that should make use of HLG HDR if it was recorded that way). I can't use dark screenshots because, AFAIK, HLG doesn't change those in order to stay backwards compatible with SDR. SDR devices simply clip when it gets too bright (which might look overexposed, but shouldn't when done right).

What are your thoughts on the matter?

Djfe
Today, 06:13   #2
Broadcast Encoder
FranceBB's Avatar
Join Date: Nov 2013
Location: Germany
Posts: 621
19.2E, huh? That's Astra 1KR, Astra 1L, Astra 1M and Astra 1N.
I can't speak for others, but on Astra 1M you should be getting what we're airing from here at Sky (if you have Sky, of course).

Try 11798 H, tp 69; you should get Sky Sport UHD.
Our signal is basically:

Codec: HEVC H.265
Profile: Main10
GOP: Closed
Resolution: 3840x2160
Sample Type: 4:2:0 planar
Bitrate: 25 Mbit/s VBR
Framerate: 50fps
Scan type: Progressive
Color Curve: HLG (Hybrid Log Gamma)
Colormatrix: BT2020nc
Bit Depth: 10bit
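If you want to confirm those flags on your own recording, ffprobe can dump them. Below is a minimal sketch that parses the kind of JSON ffprobe prints with `-of json`; the embedded JSON is a made-up example of what an HLG stream typically reports, not actual UHD1 output:

```python
import json

# Hypothetical example of ffprobe output for one video stream, e.g. from:
#   ffprobe -v error -select_streams v:0 \
#     -show_entries stream=codec_name,width,height,pix_fmt,color_space,color_transfer \
#     -of json recording.ts
FFPROBE_JSON = """
{
  "streams": [
    {
      "codec_name": "hevc",
      "width": 3840,
      "height": 2160,
      "pix_fmt": "yuv420p10le",
      "color_space": "bt2020nc",
      "color_transfer": "arib-std-b67"
    }
  ]
}
"""

def describe_transfer(ffprobe_json: str) -> str:
    """Return a short verdict on how the first video stream is flagged."""
    stream = json.loads(ffprobe_json)["streams"][0]
    # ffmpeg names the HLG transfer curve after its ARIB standard, STD-B67
    if stream.get("color_transfer") == "arib-std-b67":
        return "flagged as HLG (ARIB STD-B67)"
    return "not flagged as HLG: " + stream.get("color_transfer", "unknown")

print(describe_transfer(FFPROBE_JSON))
```

Note that the flag only tells you what the container claims; it says nothing about whether the grade actually uses the HDR range, which is the OP's real question.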

Quote: "I'm wondering, whether the material is actually HDR."
Yes, it has to be, at least in our workflows.
Sky Italy works in HLG HDR; in the UK, however, they work in BT2020nc SDR, so whenever we get a Premier League fixture we basically have to apply the HLG color curve to the BT2020nc SDR signal.
Of course, it won't be a real HDR signal, but it would be a pain in the butt to dynamically switch it: you'd have to change the way the decoder decodes it, the way we air it, and the "metadata" at the very beginning of the transport stream.
As a matter of fact, if you take a look at the commercials, each and every one of them is in HLG BT2020nc HDR as well, which doesn't actually mean that they were shot with HDR in mind, but we have to air them like that.
Please note that this doesn't mean we air a fake HDR signal; there's no artistic adjustment or anything in the live encoding, it's just a conversion to keep the signal as consistent as possible across different programmes.

Quote: "The whole channel constantly sends UHD with HDR-HLG"
Yep. Dynamically changing that in our videoserver, playout ports, live encoders and home-user decoders would be a nightmare.
Perhaps in a utopian future it will be, but right now it's definitely not doable.
Those things have been available for quite a few years now, but with the equipment we have today (and I'm not only speaking about the company I work for, but in general), doing this kind of thing on live events is already a very resource-intensive task with many things that could potentially go wrong. An example? We may not always have cameras that record above 50fps in 4K using a curve like HLG, or sometimes we do but we don't have enough lines available in our matrix on the field, and so on. Then there's the whole remote-production setup, where every camera feed is sent over fiber and the production happens at the main center rather than on site, and there are potentially many things that could go wrong there as well.
For instance, ultra-slow-motion replays in live sports events are currently done in Full HD and then upscaled live as we air them.

Quote: "is there a program that allows me to analyze whether the full capabilities of this HDR mapping were used?"
Not always, no. The stream is always a proper HLG stream; however, as I said before, sometimes we may get a 4K UHD BT2020nc 10-bit SDR stream, in which case a conversion is applied, but you won't get more than 100-120 nits.
Unfortunately, Avisynth as a frameserver is only slowly adapting to HDR, and its own waveform monitor doesn't tell you anything.
I personally use a professional Tektronix waveform monitor that tells you how many nits there are in an HDR signal: [screenshot]

I asked them to add the option to see the amount of nits, plus proper HDR (HLG and PQ) support, in the monitor back in February 2019, and I'm sure it will be introduced someday. For the moment, though, you can use DaVinci Resolve to check the nits of the signal. If you turn on "HDR Mode" you get a different waveform that tells you how many nits there are, which is what I use at home when I don't have my beloved Tektronix: [screenshot]

As you can see, on the left hand side you have the nits.
This is a PQ source of about 1500 nits which has been brutally clipped to 1000 nits prior to the HLG conversion (as HLG has to stay within 0-1000 nits); anyway, that is something you should never, ever do... (I don't know why they did it.)
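For reference, the nit readings such a waveform monitor shows for an HLG signal can be recomputed from the BT.2100 formulas. A minimal sketch for grey (achromatic) pixels on a nominal 1000-nit reference display with the standard system gamma of 1.2:

```python
import math

# BT.2100 HLG OETF constants
A = 0.17883277
B = 1 - 4 * A                   # 0.28466892
C = 0.5 - A * math.log(4 * A)   # 0.55991073

def hlg_inverse_oetf(e_prime: float) -> float:
    """Map a normalized HLG signal value (0..1) back to scene-linear light."""
    if e_prime <= 0.5:
        return e_prime ** 2 / 3.0
    return (math.exp((e_prime - C) / A) + B) / 12.0

def hlg_signal_to_nits(e_prime: float, peak_nits: float = 1000.0,
                       gamma: float = 1.2) -> float:
    """Displayed luminance of a grey HLG signal on a reference display.

    Applies the BT.2100 OOTF, F_D = peak * Y_s**gamma; for a grey pixel
    the scene luminance Y_s equals the scene-linear value itself.
    """
    return peak_nits * hlg_inverse_oetf(e_prime) ** gamma

print(round(hlg_signal_to_nits(0.50)))  # 51   (half signal is only ~5% of peak)
print(round(hlg_signal_to_nits(0.75)))  # 203  (the usual HLG reference white)
print(round(hlg_signal_to_nits(1.00)))  # 1000
```

So if the brightest pixels of a recording never rise much above 75% signal, the grade is effectively staying near SDR-ish levels even though the stream is flagged HLG.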

Quote: "it might be possible to extract 10bit PNGs from the stream."
Sure; however, you'll lose the HDR metadata, which in HLG only consists of correctly flagging the stream as HLG BT2020nc. That's still important: although you'd retain the original quality, the image viewer you use won't know what kind of image it's looking at and will almost certainly assume it's a plain RGB image with no color curve whatsoever, so it will display it the wrong way.

Quote: "I obviously would need a screenshot which is pretty bright/has a high dynamic range."
Well, not necessarily something completely bright, just something normal. Again, it depends on who is grading what. For instance, even though HLG can go up to 1000 nits, I personally don't grade live sports events to 1000 nits 'cause:

1) I may not have enough stops in the camera to get to real 1000 nits.
2) The more you grade it towards HDR (high amount of nits), the worse it will look for those who have a normal SDR monitor/TV.

I often find 300 nits accommodating, though I sometimes go up to 500 nits. On very rare occasions I've gone to 890 nits, and I've used the full 1000 nits only for offline encoding, whenever a movie was graded in PQ at over 1000 nits. But again, we're talking about live events here, so...
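To see where those grading targets sit on the signal scale, you can invert the same BT.2100 formulas. A sketch under the same assumptions as before (grey pixels, 1000-nit nominal display, system gamma 1.2):

```python
import math

# BT.2100 HLG OETF constants
A = 0.17883277
B = 1 - 4 * A
C = 0.5 - A * math.log(4 * A)

def nits_to_hlg_signal(nits: float, peak_nits: float = 1000.0,
                       gamma: float = 1.2) -> float:
    """Normalized HLG signal level (0..1) of a grey pixel displayed at
    `nits` on a reference display (inverse OOTF, then the OETF)."""
    e = (nits / peak_nits) ** (1.0 / gamma)   # scene-linear light
    if e <= 1.0 / 12.0:
        return math.sqrt(3.0 * e)
    return A * math.log(12.0 * e - B) + C

for nits in (100, 300, 500, 1000):
    print(nits, "nits ->", round(nits_to_hlg_signal(nits), 3))
```

A 300-nit grade already sits around 81% of the signal range, which is part of why such grades still look reasonable on SDR sets: most of the code range is spent on the "SDR-ish" part of the picture.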

Quote: "sdr devices simply cut off when it gets too bright (might look overexposed, but shouldn't when done right)"
Uh... no, it should look dull and lifeless, but not brutally clipped out.
Let's suppose that you have three TVs:

- TV1: 4K UHD BT2020 SDR compatible
- TV2: 4K UHD BT2100 HDR compatible (max 900 nits)
- TV3: 4K UHD BT2100 HDR compatible (max 2000 nits)

Let's suppose that we're gonna get an HDR HLG signal which has been graded at 300 nits.
The signal is gonna look good on the first TV, the SDR one: although it ignores the HLG color curve information, it won't look bad; then again, it won't look exactly as the encoder wanted you to see it either.
The signal is gonna look exactly the way the encoder intended on TV2 and TV3, though, as they will correctly read the HLG color curve information and reproduce the signal exactly as it should be.

Let's suppose that we have just changed channel and that we're now seeing a movie, but we're still using our beloved satellite decoder and the signal we're getting is an HDR HLG one.
Such a signal is probably gonna be 1000 nits.
If you were to look at a screenshot of that signal without any metadata, what you would see is something really gray with very little color. The SDR BT2020-capable TV (TV1) will receive the signal, ignore the HDR HLG color curve metadata and show you the image; however, it will look very dull and lifeless, because the TV is ignoring the color curve and pretending it's a proper SDR signal.
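You can put rough numbers on that dullness. Assuming the SDR set simply decodes the code values with a plain gamma-2.4 curve on a 100-nit panel (both numbers are illustrative assumptions; real TVs differ), the same signal level lands at very different brightness:

```python
import math

# BT.2100 HLG OETF constants
A = 0.17883277
B = 1 - 4 * A
C = 0.5 - A * math.log(4 * A)

def hlg_nits(e_prime: float, peak: float = 1000.0, gamma: float = 1.2) -> float:
    """Intended luminance of a grey HLG signal on a 1000-nit reference display."""
    if e_prime <= 0.5:
        e = e_prime ** 2 / 3.0
    else:
        e = (math.exp((e_prime - C) / A) + B) / 12.0
    return peak * e ** gamma

def sdr_nits(e_prime: float, peak: float = 100.0, gamma: float = 2.4) -> float:
    """Luminance if the same code value is mistaken for gamma-2.4 SDR."""
    return peak * e_prime ** gamma

signal = 0.75  # HLG reference white
print(round(hlg_nits(signal)))  # 203 nits intended
print(round(sdr_nits(signal)))  # 50 nits on the SDR set: dull and lifeless
```

The highlights above reference white get compressed even harder, which is exactly the gray, washed-out look described above.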
Let's go to TV2 now.
TV2 is HDR, therefore it's gonna correctly see that the signal is not a BT2020nc SDR one but an HDR HLG one, as the HLG color curve is applied. Then it's gonna try its best to show it to you properly; however, the signal peaks at 1000 nits while TV2 only reaches 890 nits, so... what is it gonna do? Well, remember when I talked about "metadata" in the transport stream? What I meant by that is the flag that tells the decoder "hey, this is an HLG signal". That's it. There is no other metadata in a transport stream, so our poor TV2 won't know the minimum and maximum amount of nits of that stream and won't be able to re-map them to its own maximum and minimum. Sure, it will try to analyze the signal internally while it's decoding it, but the internal algorithm may not be perfect and you may end up seeing a few details clipped out.
TV3 instead is gonna show it properly.
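What TV2 does above its 890-nit ceiling depends on its internal tone mapper. A toy comparison of the two extremes, hard clipping versus a simple knee roll-off (the knee point and the linear compression below are made up for illustration; real sets use proprietary curves):

```python
def hard_clip(nits: float, display_peak: float = 890.0) -> float:
    """What a naive display does: everything above its peak is lost."""
    return min(nits, display_peak)

def knee_rolloff(nits: float, display_peak: float = 890.0,
                 knee: float = 700.0, source_peak: float = 1000.0) -> float:
    """Toy tone mapper: pass-through below the knee, then compress the
    remaining source range (knee..source_peak) into (knee..display_peak)."""
    if nits <= knee:
        return nits
    scale = (display_peak - knee) / (source_peak - knee)
    return knee + (nits - knee) * scale

for nits in (500, 800, 950, 1000):
    print(nits, "->", round(hard_clip(nits)), "clipped /",
          round(knee_rolloff(nits)), "rolled off")
```

With the hard clip, every detail between 890 and 1000 nits collapses to the same brightness; the roll-off keeps it distinguishable at the cost of slightly darkening the 700-890 range.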

Let's suppose that we're now turning off our beloved decoder and turning on our Blu-ray player, as we've got a copy of a PQ 1500-nit movie we wanna watch.
Sadly, TV1 won't be able to show anything and will tell us that the signal it's receiving over the HDMI cable is not compatible.
Our TV2, however, will receive the signal from the HDMI cable and display it, and here's the difference: thanks to PQ, the TV actually knows the minimum and maximum amount of nits that the signal is going to have, and can therefore re-map it precisely to its own maximum and minimum. Sure, it's not gonna be the exact same image the director, the colorist and the encoder wanted you to see, but it's gonna be the best possible reproduction the TV is capable of, and not so far from what the director intended.
TV3 instead will display it properly.

Take a look at what I mean by this.

Let's suppose we have a low-nit HLG signal, like 300 nits.
This is how an HLG HDR TV would reproduce it: [screenshot]

This is how a BT2020nc SDR TV would reproduce it: [screenshot]

Let's now suppose that we have a higher-nit signal, like 1000 nits.
This is how an HLG HDR TV would reproduce it: [screenshot]

This is how a BT2020nc SDR TV would reproduce it: [screenshot]

As you can see, while the first example gives watchable and considerably good results on both the SDR TV and the HDR HLG-capable one, the second one (which is more HDR-oriented) gives a very "gray" (dull and lifeless) result on the poor SDR TV, which is ignoring the HLG curve and therefore displaying the signal as if it were a normal SDR signal with a considerably lower number of nits.

I hope this helps and clarifies your doubts.

Kind regards,
Broadcast Encoder

Last edited by FranceBB; Today at 06:32.
FranceBB

Tags: hdr, hdr to sdr, hevc, hlg, uhd
