16th March 2023, 17:48   #103
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,750
Quote:
Originally Posted by DTL
"I could make HDR MPEG-2 "

HDR depends on the camera (and display device), not on the MPEG codec in the chain. The transfer function that compresses the dynamic range can be quantized to 8, 10, or more bits, simply with different levels of quantization noise. But if the camera does not provide the required physical dynamic range, there is no real HDR after the camera. From good broadcast-grade cameras with 600%+ range we have really been watching HDR for decades; it just was not standardized like the HLG transfer. So yes, MPEG-2 HDR has already worked for many years. And yes, with HLG metadata marking it is possible to run 8-bit HLG HDR in MPEG-2. It is only one of infinitely many possible modes of HDR in 8-bit MPEG-2. The display device has the right to expand the transfer curve of near-whites and overwhites as it likes (and as the user likes).
I don't think anyone has ever done MPEG-2 HDR. I can squint and imagine how I could kind of get it to work, but it'd still be a regression in quality, bitrate (like 4x!), and compatibility. I was also presuming 10-bit MPEG-2 encoding to PQ.
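
To put some numbers on the quantization-noise point, here is a quick Python sketch (my own back-of-the-envelope, using the published ST 2084 constants; the narrow-range code scaling is just for illustration) of the luminance jump between adjacent code values for PQ at 8 bits versus 10 bits:

Code:
# Sketch: luminance step between adjacent narrow-range code values for PQ (ST 2084)
# at 8-bit vs 10-bit. Illustration only.
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_eotf(e):
    """Normalized PQ signal (0..1) -> absolute luminance in cd/m^2 (ST 2084 EOTF)."""
    p = e ** (1 / m2)
    return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

def step_at(nits, bits):
    """Luminance jump between the adjacent narrow-range luma codes nearest 'nits'."""
    lo, hi = 16 << (bits - 8), 235 << (bits - 8)
    code = min(range(lo, hi), key=lambda c: abs(pq_eotf((c - lo) / (hi - lo)) - nits))
    return pq_eotf((code + 1 - lo) / (hi - lo)) - pq_eotf((code - lo) / (hi - lo))

for nits in (1, 10, 100, 1000):
    print(f"{nits:5} nits: 8-bit step ~{step_at(nits, 8):.3f} nits, "
          f"10-bit step ~{step_at(nits, 10):.3f} nits")

The 8-bit steps come out roughly 4x coarser at any given level, which is the banding problem in a nutshell.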

Quote:
Any SDR display device with an 8-bit decoder, even 8-bit MPEG-1, if fed professionally designed 8-bit content, could have a simple AI-driven or even simple LUT-based user control option like:
Simulate HDR-expand mode 1
Simulate HDR-expand mode 2
...
Simulate HDR-expand mode N
Simulate HDR-expand mode HLG-8bit
What device supports user- or metadata-defined LUTs but not HEVC? Let alone real-time AI like that?
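
The LUT itself is the easy part, to be fair. Something like the "HDR-expand" modes described above is only a few lines of code; the knee and curve shape below are numbers I made up purely for illustration, not from any standard or shipping device:

Code:
# Sketch of a LUT-based "HDR expand" of an 8-bit SDR signal: identity below a knee,
# power-curve boost of near-whites above it. Knee/boost/peak are arbitrary
# illustration values, not from any standard or real device.
def build_expand_lut(knee=0.8, boost=4.0, peak=4.0):
    """256-entry LUT from 8-bit code value to linear light, where 1.0 = SDR
    reference white and 'peak' is the maximum highlight expansion."""
    lut = []
    for code in range(256):
        v = code / 255.0
        linear = v ** 2.4                      # crude SDR gamma to linear
        if v > knee:
            t = (v - knee) / (1.0 - knee)      # 0..1 across the highlight region
            linear *= 1.0 + (peak - 1.0) * t ** boost
        lut.append(linear)
    return lut

lut = build_expand_lut()
for code in (128, 200, 230, 255):
    print(f"code {code:3}: {lut[code]:.2f}x SDR reference white")

But "a device could apply this" and "devices actually do apply this, consistently and with content-aware tuning" are very different things.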

Quote:
My poor man's 4K display from China-Philips too frequently refuses to decode 10-bit HEVC (though it is possible to create a playable file for it with x265 and MP4Box). So I do not like downloading a 10-bit HEVC release from torrents, paying per GB, only to find it is not playable on the display. Typically all Xvid and AVC releases run OK.
Does 8-bit HEVC work? If it is a Smart TV, can it play back HDR from streaming services?

Quote:
8-bit has been the end-user standard for display and decoder devices for about a third of a century now.
Display depth and video depth are quite different things; 8-bit full-range RGB can carry more visual information than limited-range 8-bit 4:2:0. And a panel's "native" EOTF is far from the CRT-emulating Rec. 709 gamma. Lots of good HDR has been viewed on 8+2 panels.
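
As a back-of-the-envelope on that (my arithmetic, counting raw code-value capacity per pixel only, which is container capacity rather than perceptual quality):

Code:
# Sketch: code-value capacity per pixel, 8-bit full-range RGB vs 8-bit limited-range
# 4:2:0 (Y 16-235, Cb/Cr 16-240, one chroma pair shared by four pixels).
from math import log2

rgb_full  = 3 * log2(256)                 # three full-range 8-bit channels
y_limited = log2(235 - 16 + 1)            # 220 usable luma codes
c_limited = 2 * log2(240 - 16 + 1) / 4    # 225 codes per chroma plane, shared 4:1

print(f"8-bit full-range RGB : {rgb_full:5.2f} bits/pixel")
print(f"8-bit limited 4:2:0  : {y_limited + c_limited:5.2f} bits/pixel")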

Quote:
So I expect that 'in a good if still not perfect world' we can have a slow migration from 8-bit to 10-bit in end users' homes (and maybe from MPEG-4 ASP/AVC to HEVC) over about half a century. And for this good intermediate period of half a century to a century, users can use the good old 8-bit workflow with some simple additional metadata enhancement for a highlight-expanding (transfer) curve.
Premium content is already exclusively 10-bit for HDR, and a lot is 10-bit for SDR as well. We'll see more than 50% of global eyeball-hours in 10-bit within a few years.

Quote:
So H.264 may be selected as the last great video compression product of the current civilization, and we can use it to the end. The end of civilization may come much sooner than even half a century. So there are few reasons to switch 264+(++, +++, ...) versions every several years for very little enhancement, a very large performance penalty, and new hardware to purchase.
I think MPEG has done well in keeping decoder performance increases per generation quite proportionate to compression efficiency gains. An HEVC decoder block costs WAY less than an MPEG-2 decoder did back when DVD launched.

That said, I have no reason to think the H.264++ era will last tremendously longer than the H.261++ era it replaced. Peering into the future, I can see half-float, linear-light, frequency-domain prediction codecs that use ACES mezzanines next decade.

Quote:
Addition: it is also remarkable how quickly the hype around WCG died. It is typically missed that 10-bit end-user systems are not only about HDR but also about WCG. So the +2 bits over standard 8 are really divided between the new WCG feature and the HDR feature. If we finally skip the not-very-useful WCG feature, we may find that 8+1 = 9 bits can also cover HDR not too badly.

WCG looks like it died not only because it is hard to find a natural scene with any benefit from WCG, but also because the scene setup group (colour artist, lighting/shading artist, cameraman) must be able to create a visually pleasing scene with WCG. That may be even harder to find.
WCG is alive and well. It's just that WCG and PQ are almost always implemented together, as in the HDR10 and Dolby Vision used for HDR streaming and UHD Blu-ray. It turned out that adding HDR on top of WCG was pretty small incremental complexity, so very few WCG-only displays were ever shipped.
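
For a rough sense of scale on the gamut side, here is a sketch comparing the gamut triangle areas of BT.709, DCI-P3, and BT.2020 from their published xy primaries; area in CIE 1931 xy is a crude proxy for gamut size, not a perceptual measure:

Code:
# Sketch: relative gamut triangle areas in CIE 1931 xy from the published primaries.
# xy area is only a rough proxy for "how much wider" a gamut is.
PRIMARIES = {                      # (x, y) chromaticities of R, G, B
    "BT.709":  [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":  [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "BT.2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def triangle_area(pts):
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs((x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1)) / 2

ref = triangle_area(PRIMARIES["BT.709"])
for name, pts in PRIMARIES.items():
    print(f"{name:8}: {triangle_area(pts) / ref:.2f}x the BT.709 xy area")
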
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book