#2
Life's clearer in 4K UHD
Join Date: Jun 2003
Location: Notts, UK
Posts: 12,354
Interesting question!
I more or less understand why HDR would be useful with 8-bit 4:2:0 video sources, which only offer 256 levels per RGB channel (16,777,216 colours in total), but why is it deemed necessary with 10-bit 4:2:0 video sources, which offer 1024 levels per RGB channel (1,073,741,824 colours in total)? Surely there's enough colour range and grey scale for it to be considered HDR already. The same goes for any lossless image format, such as PNG.
__________________
| I've been testing hardware media playback devices and software A/V encoders and decoders since 2001 | My Network Layout & A/V Gear |
Last edited by SeeMoreDigital; 17th November 2024 at 18:58.
#3
Registered User
Join Date: Aug 2024
Posts: 529
Stop flooding the forum with stupid questions.
I have a useful link for you: https://www.google.com/search?q=png+hdr+support
#4
Registered User
Join Date: Apr 2024
Posts: 400
Quote:
#5
Formerly davidh*****
Join Date: Jan 2004
Posts: 2,666
Quote:
And when you say "HDR would be useful with 8-bit", do you mean the pseudo-HDR of blending under- and over-exposed images? Because that isn't real HDR.
Quote:
#6
Life's clearer in 4K UHD
Join Date: Jun 2003
Location: Notts, UK
Posts: 12,354
Okay... "it" refers to HDR...
In other words, why is "HDR" deemed necessary with 10-bit 4:2:0 video sources, which offer 1024 levels per RGB channel (a total of 1,073,741,824 colours)... Surely when you have more than one billion colour/grey-scale levels, that's enough dynamic range to provide HDR already.
__________________
| I've been testing hardware media playback devices and software A/V encoders and decoders since 2001 | My Network Layout & A/V Gear |
Last edited by SeeMoreDigital; 17th November 2024 at 20:58.
#7
SuperVirus
Join Date: Jun 2012
Location: Antarctic Japan
Posts: 1,448
Quote:
__________________
«Your software patents have expired.»
Last edited by filler56789; 17th November 2024 at 20:57. Reason: grammar+clarity
#8
Life's clearer in 4K UHD
Join Date: Jun 2003
Location: Notts, UK
Posts: 12,354
lol...
__________________
| I've been testing hardware media playback devices and software A/V encoders and decoders since 2001 | My Network Layout & A/V Gear |
#9
Broadcast Encoder
Join Date: Nov 2013
Location: Royal Borough of Kensington & Chelsea, UK
Posts: 3,227
Well, when it comes to images, there's Google's Ultra HDR Image Standard, which has become the de facto standard.

The Ultra HDR Image Standard was introduced with Android 14 (we're currently on Android 15) and it's supported by mobile phones running that version of Android or newer, and also by Windows 11. Obviously you also need an app that can open those images and display them appropriately, and luckily Google Chrome supports it, which means that Chromium-based browsers should support it too. It's an open standard, which means that as long as the producer of a decoder wants to support it, they can do it.

Anyway, the way it works is basically by combining a standard SDR JPEG image with a gain map in XMP. Essentially, the camera will shoot raw, create a JPEG version with the highlights dimmed down, and then save the coefficients of the gain map in separate XMP metadata. When a standard SDR display opens it, it will only display the base layer, so the image will look "ok-ish"; however, when an HDR-capable display (on an HDR-capable OS that supports the standard) opens it, it will apply the gain map to the file to create the appropriate HDR version. The gain map metadata consists of a series of values, the most important of which are GainMapMin and GainMapMax, namely the minimum and maximum allowed ratio of the linear luminance of the target HDR rendition relative to that of the SDR image at a given pixel. This is just an example extracted from one of my pictures:

Code:
<x:xmpmeta xmlns:x="adobe:ns:meta/" x:xmptk="XMP Core 5.5.0">
  <rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#">
    <rdf:Description rdf:about=""
      xmlns:hdrgm="http://ns.adobe.com/hdr-gain-map/1.0/"
      hdrgm:Version="1.0"
      hdrgm:GainMapMin="-0.46508971"
      hdrgm:GainMapMax="3.5125889"
      hdrgm:Gamma="1"
      hdrgm:OffsetSDR="0.014513"
      hdrgm:OffsetHDR="0.014513"
      hdrgm:HDRCapacityMin="0"
      hdrgm:HDRCapacityMax="3.5125889"
      hdrgm:BaseRenditionIsHDR="False"/>
  </rdf:RDF>
</x:xmpmeta>

Here's a 4080x3072 picture I shot earlier this August, while I was wandering in the woods, in JPEG with the Ultra HDR Standard: https://photos.app.goo.gl/2nmzwTwooaWXBRxDA
Download link: https://we.tl/t-tGY7dJ8v9I (link valid for 3 days only)

If I open just the base layer, I can see the different rays of light hitting the tree on the right-hand side, but the shadows and the lit parts of the trunk are almost at the same level; we're talking about a very minimal difference, around 0.15 V (i.e. 150 millivolts) in the signal. Unfortunately there's noise, as there wasn't much light going through the trees in the forest, so it's hard to spot.

I don't have a way of showing you how it looks on a proper HDR monitor, but on mine (around 600 nits) it's like watching a completely different picture. The highlights stand out, and there's a clear difference between the various shadows, the rays of light hitting the trees, and the patches of ground. The ground itself, full of sticks, gets some life, and it's actually how it looked in real life rather than the way it's displayed in the SDR version, which looks almost lifeless.

Anyway, for those interested, I strongly suggest reading the documentation about the standard at this link: https://developer.android.com/media/...r-image-format
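To give a rough idea of what a viewer does with those hdrgm:* values on the decode side, here's a minimal sketch in Python/NumPy, loosely following the recovery maths in the Android documentation linked above. The function and argument names are mine, it assumes the SDR base layer and the gain map have already been decoded to linear-light arrays in the 0-1 range, it assumes hdrgm:Gamma is 1 (as in the XMP above), and the display_boost figure is just an illustrative value, so treat it as a sketch rather than a reference implementation.

Code:
import numpy as np

def apply_gain_map(sdr_linear, gain_map, *, gain_map_min, gain_map_max,
                   offset_sdr=0.015625, offset_hdr=0.015625,
                   hdr_capacity_min=0.0, hdr_capacity_max=3.5, display_boost=4.0):
    """Rebuild a linear-light HDR rendition from an SDR base image plus a gain map.

    sdr_linear : HxWx3 array, SDR base layer in linear light, 0..1
    gain_map   : HxW array, decoded gain map samples, 0..1 (hdrgm:Gamma assumed to be 1)
    The keyword arguments mirror the hdrgm:* XMP fields; display_boost is how much
    brighter than SDR white the target display can currently go (a linear ratio).
    """
    # How much of the gain map this display is allowed to apply (0 = SDR, 1 = full HDR).
    weight = (np.log2(display_boost) - hdr_capacity_min) / (hdr_capacity_max - hdr_capacity_min)
    weight = np.clip(weight, 0.0, 1.0)

    # Interpolate each pixel's log2 gain between GainMapMin and GainMapMax.
    log_boost = gain_map_min * (1.0 - gain_map) + gain_map_max * gain_map

    # Apply the per-pixel boost in linear light.
    boost = np.exp2(log_boost * weight)[..., None]
    return (sdr_linear + offset_sdr) * boost - offset_hdr

On a plain SDR display the weight works out to 0, so the maths collapses back to (roughly) the base layer, which is exactly the graceful-fallback behaviour described above.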
#10
Formerly davidh*****
Join Date: Jan 2004
Posts: 2,666
Quote:
Why is HDR "deemed necessary" when you have 10 bits, which is already enough for HDR...? I'm really not clear on what you think "HDR" is here. You're talking as if it's some process which is applied to the data before display.
#11
Registered User
Join Date: Feb 2022
Location: Austin, TX
Posts: 9
I don't think this is a stupid question. PNG technically does support a high dynamic range within its limited suite of bit depths.

SeeMoreDigital's suggestion of using more bits doesn't actually give you more dynamic range out of the box unless you are using a linear light space. Otherwise it just gives you more in-between colors and levels within the same dynamic range (e.g. SDR or HDR). When your pixel/sample values are linear light levels, the higher their value, the brighter they are, and the higher their maximum value can go, the brighter you can go. PNG doesn't really have a way to hint that your pixel values are linear, so you can't just throw more bits at it to increase dynamic range.

Instead, it gives you multiple options to hint at what non-linear curve the pixel values use, and this allows you to specify pixels brighter than where SDR would peak. This is a native feature of PNG via its support for embedding color profiles with this kind of curve. An example of such a curve is PQ (the same one used in HDR10 video). With it, you are supplying pixel values that map to light levels between 0 and 10,000 nits (a big dynamic range) via a transfer curve. The more bits you have, the more levels between 0 and 10,000 you can represent in your image. The technique is described here: https://www.w3.org/TR/png-hdr-pq/

It's going to be up to your image viewer, your operating system, video driver, colorimetry settings, and display whether those higher-than-SDR nits are being blasted to your eyes. If you use Google Chrome on a recent macOS with one of Apple's HDR-capable displays while in a 1000-nits-or-more display preset mode, then you should see this red square popping out radioactively on your display: https://chromium.googlesource.com/ch...es/cicp_pq.png Most folks will only see a red square about as bright as the SDR white around it. Even current Safari will only map it to the SDR portion of the macOS compositor's EDR range.

This is different from Google's Ultra HDR. Google Ultra HDR has you tone map an HDR source image to an image using a common SDR curve like sRGB's for your JPEG, then attaches a second behind-the-scenes image that says how to convert the SDR image values back to linear light with high dynamic range. In some ways this is better, because today most browsers on most displays will only be displaying images in SDR, so having something in BT.709 colors pre-tonemapped to the sRGB curve displays predictably for most viewers, and a few lucky ones will get to see your light levels in HDR.

Last edited by JustinTArthur; 18th November 2024 at 02:25.
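If you want to see what that PQ transfer curve actually does to code values, here's a small sketch in Python/NumPy using the constants published in SMPTE ST 2084 (the curve the W3C note above relies on). The function name is mine and it assumes full-range code values; limited/video-range data would need the usual scale-and-offset first, so take it as an illustration rather than a drop-in converter.

Code:
import numpy as np

# SMPTE ST 2084 (PQ) EOTF constants
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_to_nits(code_value, bit_depth=10):
    """Map full-range integer PQ code values to absolute luminance in cd/m2 (nits)."""
    e = np.asarray(code_value, dtype=np.float64) / (2 ** bit_depth - 1)  # normalize to 0..1
    e_pow = e ** (1 / M2)
    y = (np.maximum(e_pow - C1, 0.0) / (C2 - C3 * e_pow)) ** (1 / M1)
    return 10000.0 * y  # PQ is defined on an absolute 0..10,000-nit scale

# The same integers you could store in a 10-bit sample (or a 16-bit PNG) only become
# "HDR" because the curve assigns them absolute brightnesses up to 10,000 nits.
for cv in (0, 256, 512, 768, 1023):
    print(cv, "->", round(float(pq_to_nits(cv)), 2), "nits")

More bits just means finer steps along that same 0 to 10,000 nit range, which is really the answer to the original question: the dynamic range comes from the transfer curve and the signalling, not from the bit depth.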
#15
Registered User
Join Date: Feb 2022
Location: Austin, TX
Posts: 9
Yes, it supports HDR light levels and the corresponding UHD or digital cinema colors, but it's up to your viewing software whether it gets the hint, and additionally up to your OS, drivers, settings, and monitor whether it gets displayed that way. Click the Chromium Blink web test link to see if yours does.

PNG 3rd Edition will provide the most comprehensive support for it once it's been finalized, and some software (like Chromium and Chrome) already supports those new features in their draft form: https://github.com/w3c/png/blob/main...md#hdr-support

Older viewers will fail somewhat gracefully if you provide enough fallbacks, but you can expect the mapping into SDR to darken things unpredictably. The link I provided earlier provides a PNG 2nd Edition-compliant way to do it and goes into detail on fallbacks: https://w3c.github.io/png-hdr-pq/

Last edited by JustinTArthur; 19th November 2024 at 04:12.
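To make the signalling part less abstract, here's a byte-level sketch of the cICP chunk that the 3rd Edition draft (and Chromium) use to mark a PNG as BT.2020/PQ. It's Python, the helper name is mine, the code points shown (9, 16, 0, 1) assume BT.2020 primaries, the PQ transfer function, RGB (identity matrix) and full range, and it's only meant to show the chunk layout, not to be a complete PNG writer.

Code:
import struct
import zlib

def png_chunk(chunk_type: bytes, data: bytes) -> bytes:
    """Standard PNG chunk framing: 4-byte big-endian length, type, data, CRC-32 over type+data."""
    return (struct.pack(">I", len(data)) + chunk_type + data
            + struct.pack(">I", zlib.crc32(chunk_type + data) & 0xFFFFFFFF))

# The cICP payload is four single-byte code points (same numbering as ITU-T H.273):
#   colour primaries = 9 (BT.2020), transfer function = 16 (PQ),
#   matrix coefficients = 0 (identity/RGB, required for PNG), full-range flag = 1
cicp = png_chunk(b"cICP", bytes([9, 16, 0, 1]))
print(cicp.hex())   # 00000004 63494350 09100001 + CRC

A writer drops this chunk in before the image data, and a viewer that doesn't know cICP simply ignores it (it's an ancillary chunk), which is where the fallback behaviour mentioned above comes from.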
#17
Registered User
Join Date: Aug 2024
Posts: 529
"Support for colors" is an ambiguous expression.

You don't even know what you are asking.
Support RGB? Yes.
Support the pixel values required? Yes.
Support the metadata? Yes.
Can you tell FFmpeg to add the metadata? Yes.
Can FFmpeg pass the metadata through from the source? Yes.
Can FFmpeg add the metadata for you? I'm not sure. Probably not.
What are you asking?

Last edited by Z2697; 22nd November 2024 at 11:14.
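For what it's worth, here's a sketch of what "telling FFmpeg to add the metadata" could look like, wrapped in Python only to keep the example self-contained. The filenames are placeholders, the input is assumed to already contain BT.2020/PQ pixel data, and whether the tag actually ends up in the output PNG (as a cICP chunk) depends on how recent your FFmpeg build is, so check the result rather than taking this as gospel.

Code:
import subprocess

src, dst = "hdr_frame.tif", "hdr_frame.png"   # placeholder filenames

# setparams only tags the frames with the colour properties we claim they already have;
# nothing here converts the pixel values themselves.
subprocess.run([
    "ffmpeg", "-y", "-i", src,
    "-vf", "setparams=color_primaries=bt2020:color_trc=smpte2084:colorspace=bt2020nc",
    "-pix_fmt", "rgb48be",
    dst,
], check=True)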
#18
Registered User
Join Date: Aug 2024
Posts: 529
Quote:
Even if PNG supported A2R10G10B10, the difference in bit depth still isn't the key to HDR support.
#20
Registered User
Join Date: Apr 2024
Posts: 400
Quote:
Tags
hdr, image-quality, png |