16th March 2023, 10:44   #102
DTL
Registered User
Join Date: Jul 2018
Posts: 1,068
"I could make HDR MPEG-2 "

HDR depends on the camera (and the display device), not on the MPEG codec in the chain. The transfer function that compresses the dynamic range can be quantized to 8, 10, or more bits, with simply a different level of quantization noise. But if the camera does not provide the required physical dynamic range, there is no real HDR after the camera. From good broadcast-grade cameras with 600%+ range we have really been watching HDR for decades; it just was not standardized like the HLG transfer. So yes, MPEG-2 HDR has already worked for many years. And yes, with HLG metadata marking it is possible to run 8-bit HLG HDR in MPEG-2. It is only one of infinitely many possible HDR modes in 8-bit MPEG-2. The display device has the right to expand the transfer curve of the near-whites and overwhites as it likes (and as the user likes).
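To put a number on "simply a different level of quantization noise", here is a rough Python sketch. The HLG constants are from BT.2100; full-range quantization and measuring the step right at peak white are just my illustration choices, not anything from a standard.

[CODE]
import math

# BT.2100 HLG OETF constants
A = 0.17883277
B = 1.0 - 4.0 * A
C = 0.5 - A * math.log(4.0 * A)

def hlg_inv_oetf(ep):
    """Non-linear HLG signal in [0,1] back to linear scene light in [0,1]."""
    return ep * ep / 3.0 if ep <= 0.5 else (math.exp((ep - C) / A) + B) / 12.0

# Same transfer curve, same captured dynamic range - only the size of the
# quantization step (i.e. the banding/noise floor) changes with bit depth.
# Full-range quantization for simplicity:
for bits in (8, 10):
    levels = (1 << bits) - 1
    hi = hlg_inv_oetf(1.0)
    lo = hlg_inv_oetf((levels - 1) / levels)
    print(f"{bits}-bit: step between the two top codes "
          f"~ {100.0 * (hi - lo) / hi:.2f}% of peak light")
# -> roughly 2.1% at 8 bit vs 0.5% at 10 bit; the curve (and so the range)
#    is identical in both cases
[/CODE]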

Any SDR display device with an 8-bit decoder (even MPEG-1), if fed professionally designed 8-bit content, could offer a simple AI-driven or even plain LUT-based user control option like:
Simulate HDR-expand mode 1
Simulate HDR-expand mode 2
...
Simulate HDR-expand mode N
Simulate HDR-expand mode HLG-8bit

This works because well-designed 8-bit SDR content already carries some compressed HDR range in its near-whites and superwhites. That has been legal since the beginning of Rec.601 in 1982, which is close to half a century ago now in 2023.
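As a toy example of one such LUT mode (the 8-bit studio-range codes 16-235 plus the 236-254 superwhite headroom follow the Rec.601/709 convention this relies on; the power curve and the 4x highlight gain are made-up illustration values, not any real standardized mode):

[CODE]
def build_expand_lut(highlight_gain=4.0):
    """256-entry LUT for 8-bit luma: identity up to reference white
    (code 235), then a smooth curve pushing the superwhite band toward
    highlight_gain times reference white, in relative linear light."""
    lut = []
    for code in range(256):
        if code <= 235:
            y = max((code - 16) / 219.0, 0.0)   # normalized SDR luma, footroom clipped
            lut.append(y)
        else:
            t = min((code - 235) / 19.0, 1.0)   # 0..1 across codes 236..254
            lut.append(1.0 + (highlight_gain - 1.0) * t * t)
    return lut

lut = build_expand_lut()
print(lut[16], lut[235], lut[245], lut[254])    # 0.0, 1.0, ~1.83, 4.0
[/CODE]

N different "Simulate HDR-expand" modes are then just N such tables; a real display would do something smarter than a 1D luma curve, but the user-facing idea is the same.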

"and display supports 10-bit HEVC decode. "

My poor man's 4K display from China-Philips too frequently refuses to decode 10-bit HEVC (though it is possible to create a playable file for it with x265 and mp4box). So I do not like downloading a 10-bit HEVC release from torrents, paying per GB, only to find out in the end that it is not playable on the display. Typically all Xvid and AVC releases run OK.

" I can't imagine why anyone would practically benefit from using a sub-HEVC codec for HDR."

8-bit has been the end-user standard for display and decoder devices for about a third of a century now. So I expect that 'in a good but still not perfect world' the slow migration from 8-bit to 10-bit in end users' homes (and maybe from MPEG-4 ASP/AVC to HEVC) will take about half a century. And for this long intermediate period of half a century to a century, users can keep the good old 8-bit workflow with some simple additional metadata enhancement for a highlight-expanding (transfer) curve.

"H.264 is definitely the first "after" codec in the before/after of codec evolution, with MPEG-4 pt 2 and VC-1 the last gasp of the classic 8x8 intra block IPB single forward reference no in-loop deblocking non-arithmetic entropy coding era. It was a huge break from from everything before, and HEVC and VVC are refinements and extensions of the H.264 model, based on fundamentally the same core."

So H.264 may be chosen as the last great video-compression product of the current civilization, and we can use it to the end. The end of civilization may come much sooner than even half a century. So there are few reasons to switch to new 264+ (++, +++, ...) versions every several years for a very small enhancement, a very large performance penalty, and new hardware to purchase.

Addition: it is also remarkable how quickly the hype around WCG died. It is typically missed that 10-bit end-user systems are not only about HDR but also about WCG. So the +2 bits over the standard 8 are really divided between the new WCG feature and the HDR feature. If we finally skip the not-very-useful WCG feature, we may find that 8+1=9 bits also covers HDR not too badly.
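A back-of-envelope sketch of that split (the primaries are the standard BT.709/BT.2020 chromaticities; using plain CIE-xy triangle area as the 'gamut size' measure is a crude assumption of mine, not proper colour science):

[CODE]
def triangle_area(p):
    """Area of a gamut triangle given three (x, y) chromaticity points."""
    (x1, y1), (x2, y2), (x3, y3) = p
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2.0

bt709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
bt2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

print(f"gamut area ratio ~ {triangle_area(bt2020) / triangle_area(bt709):.2f}")
# -> ~1.9, i.e. spreading the same codes over BT.2020 eats roughly one
#    of the two extra bits, leaving about 8+1=9 bits for the tonal range
[/CODE]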

WCG looks like it died not only because it is hard to find a natural scene that benefits from WCG at all, but also because the scene-setup group (colour artist, lighting/shading artist, cameraman) must be able to create a visually pleasing scene with WCG, and that may be even harder to find.

Last edited by DTL; 16th March 2023 at 15:19.