Doom9's Forum > Video Encoding > MPEG-4 ASP
Old 16th March 2023, 05:38   #101  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,653
Quote:
Originally Posted by 102030 View Post
I still convert all movies with Xvid. I have the most experience with this codec (13 years).
I really dislike H.264 deblocking. For me, Xvid's results look better than x264's.
You can always lower or turn off the in-loop deblocking filter in H.264; Gary Sullivan once told me that his biggest regret with H.264 was that the default loop filter strength was a little too high, and that -1,-1 would have been a better default. Even without it, High Profile will still outperform Xvid by a lot due to better entropy coding, better adaptive quantization signaling, multiple reference frames, hierarchical B-frames, and so on.
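For reference, here is a rough sketch (mine, not from the post) of how that weaker deblocking setting maps onto the x264 command line; `--deblock alpha:beta` is a real x264 option with 0:0 as the default, while the filenames are placeholders.

```python
# Sketch: building an x264 invocation with the milder -1:-1 in-loop
# deblocking offsets discussed above. "in.y4m"/"out.264" are placeholders.
def x264_args(alpha=-1, beta=-1):
    """Return an x264 command list with custom deblock alpha:beta offsets."""
    return [
        "x264",
        "--profile", "high",
        "--deblock", f"{alpha}:{beta}",  # 0:0 is x264's default strength
        "-o", "out.264", "in.y4m",
    ]

cmd = x264_args()
print(" ".join(cmd))
```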

H.264 is definitely the first "after" codec in the before/after of codec evolution, with MPEG-4 Part 2 and VC-1 the last gasp of the classic era of 8x8 intra blocks, IPB with a single forward reference, no in-loop deblocking, and non-arithmetic entropy coding. It was a huge break from everything before it, and HEVC and VVC are refinements and extensions of the H.264 model, built on fundamentally the same core.

Plus, x264 is simply a much more psychovisually refined encoder than Xvid, and it was more broadly used, both by hobbyists and commercially by companies who paid for improvements to it.
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book
Old 16th March 2023, 10:44   #102  |  Link
DTL
Registered User
 
Join Date: Jul 2018
Posts: 1,000
"I could make HDR MPEG-2 "

HDR depends on the camera (and the display device), not on the MPEG codec in the chain. The transfer function that compresses the dynamic range can be quantized to 8, 10, or more bits, with correspondingly different levels of quantization noise. But if the camera does not capture the required physical dynamic range, there is no real HDR after the camera. Good broadcast-grade cameras with 600%+ range have effectively given us HDR for decades; it simply was not standardized as a transfer like HLG. So yes, MPEG-2 HDR has already worked for many years. And yes, with HLG metadata marking it is possible to run 8-bit HLG HDR in MPEG-2; that is only one of infinitely many ways to carry HDR in 8-bit MPEG-2. The display device is free to expand the transfer curve of the near-whites and overwhites as it (and the user) likes.
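To put a number on that "same transfer, different quantization noise" point, here is a small sketch (mine, not from the thread) quantizing the HLG OETF at 8 and at 10 bits; the constants are the published ITU-R BT.2100 / ARIB STD-B67 values.

```python
import math

# HLG OETF constants as published in ITU-R BT.2100 / ARIB STD-B67.
A = 0.17883277
B = 1 - 4 * A
C = 0.5 - A * math.log(4 * A)

def hlg_oetf(e):
    """Map linear scene light e in [0, 1] to a non-linear signal in [0, 1]."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)
    return A * math.log(12 * e - B) + C

def quantize(v, bits):
    """Quantize a [0, 1] signal to an integer code at the given bit depth."""
    return round(v * ((1 << bits) - 1))

v = hlg_oetf(0.5)                # the same transfer curve either way...
code8, code10 = quantize(v, 8), quantize(v, 10)
step_ratio = 1023 / 255          # ...only the step size differs, ~4x finer at 10 bits
print(code8, code10, round(step_ratio, 2))
```

The curve itself is bit-depth agnostic; only the rounding step (and hence the banding/noise floor) changes between 8-bit and 10-bit carriage.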

Any SDR display device, even an 8-bit MPEG-1 one, fed professionally designed 8-bit content, could offer a simple AI-driven or even simple LUT-based user control option like:
Simulate HDR-expand mode 1
Simulate HDR-expand mode 2
...
Simulate HDR-expand mode N
Simulate HDR-expand mode HLG-8bit

Because well-designed 8-bit SDR content already carries some compressed HDR range in the near-whites and superwhites. This has been legal since the beginning of Rec.601 in 1982, close to half a century ago now in 2023.

"and display supports 10-bit HEVC decode. "

My poor man's 4K display from China (a Philips) too frequently refuses to decode 10-bit HEVC (though it is possible to create a playable file for it with x265 and mp4box). So I do not like downloading a 10-bit HEVC release from a torrent, paying per GB, only to find it will not play on the display. Typically all Xvid and AVC releases run fine.

" I can't imagine why anyone would practically benefit from using a sub-HEVC codec for HDR."

8-bit has been the end-user standard for display and decoder devices for about a third of a century now. So I expect that 'in a good but still imperfect world' the slow migration from 8-bit to 10-bit in end users' homes (and maybe from MPEG-4 ASP/AVC to HEVC) will take about half a century. And for this long intermediate period of half a century to a century, users can keep the good old 8-bit workflow with some simple additional metadata for a highlight-expanding transfer curve.

"H.264 is definitely the first "after" codec in the before/after of codec evolution, with MPEG-4 Part 2 and VC-1 the last gasp of the classic era of 8x8 intra blocks, IPB with a single forward reference, no in-loop deblocking, and non-arithmetic entropy coding. It was a huge break from everything before it, and HEVC and VVC are refinements and extensions of the H.264 model, built on fundamentally the same core."

So H.264 may be remembered as the last great video compression product of the current civilization, and we can use it to the end. The end of civilization may come much sooner than half a century. So there are few reasons to switch to 264+ (++, +++, ...) versions every several years for very small enhancements, a very large performance penalty, and new hardware to purchase.

Addition: it is also remarkable how quickly the hype around WCG died. It is usually missed that 10-bit end-user systems are not only about HDR but also about WCG, so the +2 bits over the standard 8 are really divided between the new WCG feature and the HDR feature. If we finally skip the not-very-useful WCG feature, we may find that 8+1 = 9 bits also covers HDR not too badly.

WCG seems to have died not only because it is hard to find a natural scene that benefits from WCG at all, but also because the scene-setup group (colour artist, lighting/shading artist, cameraman) must be able to create a visually pleasing scene with WCG, which may be even harder to find.

Last edited by DTL; 16th March 2023 at 15:19.
Old 16th March 2023, 17:48   #103  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,653
Quote:
Originally Posted by DTL View Post
"I could make HDR MPEG-2 "

HDR depends on the camera (and the display device), not on the MPEG codec in the chain. The transfer function that compresses the dynamic range can be quantized to 8, 10, or more bits, with correspondingly different levels of quantization noise. But if the camera does not capture the required physical dynamic range, there is no real HDR after the camera. Good broadcast-grade cameras with 600%+ range have effectively given us HDR for decades; it simply was not standardized as a transfer like HLG. So yes, MPEG-2 HDR has already worked for many years. And yes, with HLG metadata marking it is possible to run 8-bit HLG HDR in MPEG-2; that is only one of infinitely many ways to carry HDR in 8-bit MPEG-2. The display device is free to expand the transfer curve of the near-whites and overwhites as it (and the user) likes.
I don't think anyone has ever done MPEG-2 HDR. I can squint and imagine how I could kind of get it to work, but it'd still be a regression in quality, bitrate (like 4x!), and compatibility. I was also presuming 10-bit MPEG-2 encoding to PQ.

Quote:
Any SDR display device, even an 8-bit MPEG-1 one, fed professionally designed 8-bit content, could offer a simple AI-driven or even simple LUT-based user control option like:
Simulate HDR-expand mode 1
Simulate HDR-expand mode 2
...
Simulate HDR-expand mode N
Simulate HDR-expand mode HLG-8bit
What device supports user or metadata defined LUTs but not HEVC? Let alone real-time AI like that?

Quote:
My poor man's 4K display from China (a Philips) too frequently refuses to decode 10-bit HEVC (though it is possible to create a playable file for it with x265 and mp4box). So I do not like downloading a 10-bit HEVC release from a torrent, paying per GB, only to find it will not play on the display. Typically all Xvid and AVC releases run fine.
Does 8-bit HEVC work? If it is a Smart TV, can it play back HDR from streaming services?

Quote:
Originally Posted by DTL View Post
8-bit has been the end-user standard for display and decoder devices for about a third of a century now.
Display bit depth and video bit depth are quite different things; full-range 8-bit RGB can carry more visual information than limited-range 4:2:0. And a panel's "native" EOTF is far from the CRT-emulating Rec.709 gamma. Lots of good HDR has been viewed on 8+2 panels.
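A rough numeric sketch of that point (my own, not from the post): full-range 8-bit RGB has both more usable code values per channel and more chroma samples per pixel than limited-range 8-bit 4:2:0.

```python
# Full range vs. limited ("video") range, and sample counts per 2x2 block.
def codes_per_channel(full_range):
    """Usable 8-bit codes: full range 0-255 vs. limited luma range 16-235."""
    return 256 if full_range else 220

def samples_per_4_pixels(subsampling):
    """Samples in a 2x2 pixel block: 4 luma + 2 chroma planes."""
    chroma = {"444": 4, "422": 2, "420": 1}[subsampling]
    return 4 + 2 * chroma

# RGB has three full-resolution planes, i.e. the 4:4:4 layout.
print(codes_per_channel(True), samples_per_4_pixels("444"))   # full RGB
print(codes_per_channel(False), samples_per_4_pixels("420"))  # limited 4:2:0
```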

Quote:
So I expect that 'in a good but still imperfect world' the slow migration from 8-bit to 10-bit in end users' homes (and maybe from MPEG-4 ASP/AVC to HEVC) will take about half a century. And for this long intermediate period of half a century to a century, users can keep the good old 8-bit workflow with some simple additional metadata for a highlight-expanding transfer curve.
Premium content is already exclusively 10-bit for HDR, and a lot of it is 10-bit for SDR as well. We'll see more than 50% of global eyeball-hours in 10-bit within a few years.

Quote:
So H.264 may be remembered as the last great video compression product of the current civilization, and we can use it to the end. The end of civilization may come much sooner than half a century. So there are few reasons to switch to 264+ (++, +++, ...) versions every several years for very small enhancements, a very large performance penalty, and new hardware to purchase.
I think MPEG has done well in keeping decoder performance increases per generation quite proportionate to the compression efficiency gains. An HEVC decoder block costs WAY less than an MPEG-2 decoder did back when DVD launched.

That said, I have no reason to think the H.264++ era will last tremendously longer than the H.261++ era it replaced. Peering into the future, I can see half-float linear light frequency-domain prediction codecs that use ACES mezzanines next decade.

Quote:
Addition: it is also remarkable how quickly the hype around WCG died. It is usually missed that 10-bit end-user systems are not only about HDR but also about WCG, so the +2 bits over the standard 8 are really divided between the new WCG feature and the HDR feature. If we finally skip the not-very-useful WCG feature, we may find that 8+1 = 9 bits also covers HDR not too badly.

WCG seems to have died not only because it is hard to find a natural scene that benefits from WCG at all, but also because the scene-setup group (colour artist, lighting/shading artist, cameraman) must be able to create a visually pleasing scene with WCG, which may be even harder to find.
WCG is alive and well. It's just that WCG and PQ are almost always implemented together, as HDR10 and Dolby Vision in HDR streaming and on UHD Blu-ray. It turned out that adding HDR on top of WCG was a pretty small incremental complexity, so very few WCG-only displays ever shipped.
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book
Old 16th March 2023, 21:33   #104  |  Link
FranceBB
Broadcast Encoder
 
Join Date: Nov 2013
Location: Royal Borough of Kensington & Chelsea, UK
Posts: 2,735
Quote:
Originally Posted by DTL View Post
"I could make HDR MPEG-2 "
In theory, yes, but it would be tonemapped BT.709 SDR content from a real logarithmic source (or even something like BT.709 800%, etc.).

In terms of "real" HDR, well... as long as you can "live" with 8-bit banding, specifying BT.2020nc and arib-std-b67 (i.e. HLG) while encoding HLG content did indeed work... beyond any of my expectations, as I really thought it was gonna fail...
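For anyone curious, the tagging described above can be expressed as an ffmpeg command along these lines; the colour flag names (`-colorspace bt2020nc`, `-color_trc arib-std-b67`) are real ffmpeg options, while the filenames and bitrate are placeholders and the actual XDCAM workflow below differed.

```python
# Hypothetical sketch: tagging an 8-bit MPEG-2 encode with BT.2020 / HLG
# colour metadata via ffmpeg. Filenames and bitrate are placeholders.
cmd = [
    "ffmpeg", "-i", "hlg_master.mov",
    "-c:v", "mpeg2video", "-b:v", "50M",
    "-color_primaries", "bt2020",
    "-color_trc", "arib-std-b67",  # HLG transfer characteristics
    "-colorspace", "bt2020nc",     # BT.2020 non-constant matrix
    "test.mxf",
]
print(" ".join(cmd))
```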

Quote:
Originally Posted by benwaggoner View Post
I don't think anyone has ever done MPEG-2 HDR.
Yet...
Look what you made me do, the atrocity... the blasphemy... an MPEG-2 XDCAM-50 in BT2020 HDR HLG

Code:
General
Complete name                            : /home/FranceBB/Share Windows Linux/temp/test.mxf
Format                                   : MXF
Commercial name                          : XDCAM HD422
Format version                           : 1.3
Format profile                           : OP-1a
Format settings                          : Closed / Complete
File size                                : 62.0 MiB
Duration                                 : 10 s 80 ms
Overall bit rate                         : 51.6 Mb/s

Video
ID                                       : 2
Format                                   : MPEG Video
Commercial name                          : XDCAM HD422
Format version                           : Version 2
Format profile                           : 4:2:2@High
Format settings                          : BVOP
Format settings, BVOP                    : Yes
Format settings, Matrix                  : Default
Format settings, GOP                     : M=3, N=12
Format settings, picture structure       : Frame
Format settings, wrapping mode           : Frame
Codec ID                                 : 0D01030102046001-0401020201040300
Duration                                 : 10 s 80 ms
Bit rate mode                            : Constant
Bit rate                                 : 50.0 Mb/s
Width                                    : 1 920 pixels
Height                                   : 1 080 pixels
Display aspect ratio                     : 16:9
Frame rate                               : 25.000 FPS
Color space                              : YUV
Chroma subsampling                       : 4:2:2
Bit depth                                : 8 bits
Scan type                                : Interlaced
Scan order                               : Top Field First
Compression mode                         : Lossy
Bits/(Pixel*Frame)                       : 0.965
Time code of first frame                 : 00:00:00:00
Time code source                         : Group of pictures header
GOP, Open/Closed                         : Closed
Stream size                              : 60.1 MiB (97%)
Color range                              : Limited
Color primaries                          : BT.2020
Transfer characteristics                 : HLG
Matrix coefficients                      : BT.2020 non-constant
Delay_SDTI                               : 36000000

Audio #1
ID                                       : 3-1
Format                                   : Dolby E
Format settings                          : Little
Format settings, wrapping mode           : Frame (AES)
Muxing mode                              : SMPTE ST 337
Codec ID                                 : 0D01030102060300
Duration                                 : 10 s 80 ms
Bit rate mode                            : Constant
Bit rate                                 : 1 291 kb/s
Channel(s)                               : 6 channels
Channel layout                           : L C Ls X R LFE Rs X
Sampling rate                            : 48.0 kHz
Frame rate                               : 25.000 FPS (1920 SPF)
Bit depth                                : 20 bits
Delay relative to video                  : -9 h 59 min
Stream size                              : 1.55 MiB (3%)
Title                                    : ITADE-fatto_Prog0
Delay_SDTI                               : 36000000
Locked                                   : Yes

Audio #2
ID                                       : 3-2
Format                                   : Dolby E
Format settings                          : Little
Format settings, wrapping mode           : Frame (AES)
Muxing mode                              : SMPTE ST 337
Codec ID                                 : 0D01030102060300
Duration                                 : 10 s 80 ms
Bit rate mode                            : Constant
Bit rate                                 : 505 kb/s
Channel(s)                               : 2 channels
Channel layout                           : X X X L X X X R
Sampling rate                            : 48.0 kHz
Frame rate                               : 25.000 FPS (1920 SPF)
Bit depth                                : 20 bits
Delay relative to video                  : -9 h 59 min
Stream size                              : 621 KiB (1%)
Title                                    : ITADE-fatto_Prog1
Locked                                   : Yes

Other #1
ID                                       : 1-Material
Type                                     : Time code
Format                                   : MXF TC
Frame rate                               : 25.000 FPS
Time code of first frame                 : 10:00:00:00
Time code of last frame                  : 10:00:10:01
Time code settings                       : Material Package
Time code, stripped                      : Yes

Other #2
ID                                       : 1-Source
Type                                     : Time code
Format                                   : MXF TC
Frame rate                               : 25.000 FPS
Time code of first frame                 : 10:00:00:00
Time code of last frame                  : 10:00:10:01
Time code settings                       : Source Package
Time code, stripped                      : Yes

Other #3
Type                                     : Time code
Format                                   : SMPTE TC
Muxing mode                              : SDTI
Frame rate                               : 25.000 FPS
Time code of first frame                 : 10:00:00:00

Well... it worked...
MPV is also reading it correctly...

https://i.imgur.com/p26Dz63.jpg

Also, the end result isn't that terrible, but that's only 'cause it's HLG. If I had used a PQ source and encoded it as PQ, it would have been waaaaaay worse and full of banding, as PQ is a truly logarithmic curve and would be pretty hard to preserve. Besides, the lack of logarithmic-curve-aware compression tools in x262 would make it destroy what's left of the image. I know 'cause a while back I saw some pretty bad results with S-Log3 and C-Log3 footage (which are truly logarithmic curves), where blacks started very high and the MPEG-2 compression totally destroyed them, and of course the skies were full of banding as well.


Speaking of which, for anyone reading: PLEASE DO NOT make this a standard. Just because you CAN do it doesn't mean you SHOULD do it. We should be moving forwards, not backwards.
H.264 being used for UHD HDR in common broadcast mezzanine formats like Sony's XAVC and Panasonic's AVC-Ultra is already half a blasphemy, as it blocked the adoption of H.265 as a mezzanine format, and of encoders like x265 which actually have HDR-aware coding tools (so much so that it's generally high-bitrate H.264 re-encoded to consumer-tier-bitrate H.265 only for distribution), but at the very least those formats are 10-bit and 50p. So please, people, don't make MPEG-2 25i HDR a thing, or I might very well lose my mind.
Honestly, if any improvement were to be made to x262 and to libavcodec's MPEG-2 encoder, it should NOT be HDR-aware coding, but rather properly multithreading the existing coding tools and using modern intrinsics instead of ones stuck in the past.

Quote:
Originally Posted by benwaggoner View Post
WCG is alive and well. It's just that WCG and PQ are almost always implemented together, as HDR10 and Dolby Vision in HDR streaming and on UHD Blu-ray.
Absolutely correct. And it's not just PQ, it's also HLG.
Let's not forget that from 2015 to 2017 the broadcasters who actually made the switch to UHD were airing BT.2020 SDR; then, when HDR arrived, they switched to HLG, thus keeping the WCG benefit of BT.2020 in place and adding a hybrid transfer function which allowed much more headroom for highlights.
Old 17th March 2023, 10:50   #105  |  Link
DTL
Registered User
 
Join Date: Jul 2018
Posts: 1,000
Quote:
Originally Posted by FranceBB View Post
specifying BT2020nc
Color primaries : BT.2020
Matrix coefficients : BT.2020 non-constant
I see the WCG hype is dead, but WCG still keeps decreasing quality on the user side in some (many) use cases. Users have gained one more way to make good visual content look worse.

Only if your (typically unnatural, synthetic) content absolutely requires WCG, and really benefits from being produced from a (typically unnatural) WCG scene, is there any reason to use the BT.2020 colour gamut in an 8-bit system. It is only about lossless if you use a float-sample system with practically unlimited precision and very low quantization noise.

In all other standard use cases, typical natural scenes do not have colours outside the standard (narrow/small) gamut, so attempting to compress a natural gamut into the BT.2020 colour gamut in a low-bit integer-sample system only causes a significant increase in quantization noise and a loss of fine natural colour differences.

So a correct 8-bit HDR-legalized system must use the standard (20th-century professional, narrow/small) Rec.709 colour gamut. If your mezzanine source is encoded in the BT.2020 colour gamut, you need to extract the natural (small) colour gamut from the BT.2020 input and encode it into our lovely 8-bit Rec.709 system.

So in AVS, with z_ConvertFormat, it is something like:
=>709:std_b67:709:l

and then simply add
ConvertBits(8), or better ConvertBits(8, dither=1), to benefit from MPEG-friendly dithering.

And mark any MPEG-encoded 8-bit content produced this way as having the HLG HDR transfer. This will be our long-awaited 8-bit MPEG HDR system, with a now legal and standardized-for-display HDR transfer. It can even be MPEG-1 encoded and put on a VideoCD, with a handwritten note on the CD surface about the HLG transfer used, so the user does not forget to select the HLG transfer decoder at playback.
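The dither=1 part matters more than it looks. A tiny sketch of the idea (my own illustration, not AviSynth's actual dithering algorithm): when truncating a smooth, finer-than-8-bit gradient to 8 bits, adding about half an LSB of noise before rounding trades visible banding for fine noise.

```python
import random

random.seed(0)  # reproducible dither noise for this sketch

def to_8bit(v, dither=False):
    """Quantize v in [0, 1] to an 8-bit code, optionally with +-0.5 LSB dither."""
    x = v * 255
    if dither:
        x += random.uniform(-0.5, 0.5)
    return min(255, max(0, round(x)))

ramp = [i / 4096 for i in range(4096)]        # a smooth >8-bit gradient
plain = {to_8bit(v) for v in ramp}            # hard bands of equal codes
dithered = [to_8bit(v, dither=True) for v in ramp]  # band edges broken up
print(len(plain), min(dithered), max(dithered))
```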

Last edited by DTL; 17th March 2023 at 11:16.
Old 17th March 2023, 12:53   #106  |  Link
DTL
Registered User
 
Join Date: Jul 2018
Posts: 1,000
Quote:
Originally Posted by benwaggoner View Post
I don't think anyone has ever done MPEG-2 HDR. I can squint and imagine how I could kind of get it to work, but it'd still be a regression in quality, bitrate (like 4x!),
Because our lovely professional-quality visual systems are 8-bit, we can make 8-bit MPEG HDR, and it will be roughly equal in bitrate to 'classic' 8-bit release titles.

Really, most digital video titles released over roughly the last third of a century already have HDR inside, just with a non-standardized transfer, as was already noted.
Even a classic BT.601/709 'SDR' broadcast-grade video camera is only 'SDR on output', i.e. targeted at an SDR transfer decoder in the display as the 'reference'. Its linear scene light cannot be restored in a single way by the inverse SDR OETF, because the old SDR standards do not cover the HDR-to-SDR compression of highlights that may be applied before the SDR OETF. So all the good old 8-bit MPEG-2 broadcasts already had HDR inside, in non-standardized form; no additional MPEG bits were required. HLG only made one possible HDR-to-system-bit-range transfer curve normative.

"What device supports user or metadata defined LUTs but not HEVC? Let alone real-time AI like that?"

Possibly many higher-end digital-image-processing end-user TV sets from the pre-HEVC years, the ones with 'digital image (improving) processing'; analog TV sets are unlikely to have such advanced processing. TV-set manufacturers already know how broadcast SDR video cameras and film-based content production work, and could offer some HDR expansion to the user as an addition to the product price.

"Does 8-bit HEVC work? If it is a Smart TV, can it play back HDR from streaming services?"

Yes, some 8-bit HEVC files also work. And yes, it is an Android-based 'smart TV' and can probably use network streaming services, but I have never tried. I live in my own self-built home far away from any large city, and it is too expensive to bring a wired or optical broadband connection here to watch network broadcasts. The affordable 3G and 4G wireless providers have already cut their unlimited-traffic options here, and the internet has been working worse and worse lately; the digital civilization is quickly dying here. Also, for many years I have only watched downloaded content, when I have time.
The device specification lists the supported formats and codecs:
https://www.download.p4c.philips.com...12_dfu_eng.pdf
Multimedia
Connections
• USB 2.0
Playback formats
• Containers : 3GP, AVCHD, AVI, MPEG-PS, MPEG-TS,
MPEG-4, Matroska (MKV), Quicktime (MOV, M4V,
M4A), Windows Media (ASF/WMV/WMA)
• Video Codecs : MPEG-1, MPEG-2, MPEG-4 Part 2,
MPEG-4 Part 10 AVC (H264), H.265 (HEVC), VC-1,
WMV9
– MPEG-4 AVC (H.264) is supported up to High
Profile @ L5.1.
– H.265 (HEVC) is supported up to Main / Main 10
Profile up to Level 5.1

It is a 4K end-user TV set with the Rec.709 colour gamut and Rec.709 transfer only. In practice it is nice enough for watching Full HD with a 2x upscale, so you do not see aliasing from the display pixel grid.

"RGB 8-bit full range can have more visual information than a limited range 4:2:0."

8-bit full range is not really nice for high-end sinc-based workflows, because it also clips undershoots below system black level. Yes, using overwhites for HDR compression is not a 100% perfect idea, but real superwhites are less common than the low levels near black.

"Premium content is already exclusively 10-bit for HDR,"

Industry manufacturers, while bolting easy-to-demo showroom features (4K/8K/WCG/HDR/HFR) onto old video systems, still have not fixed the real old design bugs:
1. The 'better quality' above-SD systems are still badly bugged by the ugly old 4:2:0 2:1 chroma compression, with irreversible distortions at the end-user display.
2. The 'better quality' digital visual systems still have no standard at all for spatial image reconstruction from the sampled form.
3. Transfer-function compression also adds distortions (possibly via quantization noise too) to any 'linear' sinc-based workflow, at least in 1D+1D form (if we expect point 2 to be solved with a sinc-based upscaler).

So when I see 'premium' digital video content, I think it must be internally based on linear float (say half-precision 16-bit) RGB, not on the ugly old, highly distortive 4:2:0 compression (additionally bugged by an extra non-linear HDR range-compression transfer), even if it is wrapped in some modern-named MPEG and has +2 bits over the old 8.

So in the 21st century, instead of saying 'Hey, look, we finally fixed all the old bugs of digital video systems and offer a really high-quality system, as clean as PCM audio', the visual industry keeps building new booths (HD/4K/8K/WCG/HDR/HFR) on the old, internally ugly, buggy and incompletely documented basis from the poor past. It is funny to see that nowadays no one on this planet knows how to properly decode moving-picture content, so no 'reference' digital moving-picture display exists yet, and users are still stuck with tons of different upscaling methods (see today's madVR renderer, for example). So I see no 'premium' digital moving-picture content yet, even if something is 10-bit and HEVC.

The closest thing to a 'real premium' digital video system nowadays is digital cinema, released as a sequence of frame files in 4:4:4 and, in the better case, linear float32 RGB form; not something for TV broadcast or IP streaming in some poor 4:2:0 MPEG form.

Last edited by DTL; 17th March 2023 at 13:04.
Old 17th March 2023, 17:47   #107  |  Link
FranceBB
Broadcast Encoder
 
Join Date: Nov 2013
Location: Royal Borough of Kensington & Chelsea, UK
Posts: 2,735
Quote:
Originally Posted by DTL View Post
So when I see 'premium' digital video content, I think it must be internally based on linear float (say half-precision 16-bit) RGB, not on the ugly old, highly distortive 4:2:0 compression (additionally bugged by an extra non-linear HDR range-compression transfer), even if it is wrapped in some modern-named MPEG and has +2 bits over the old 8.
Eh, I think the 4:4:4 time will eventually come one day.
I mean, sure, we're still using 4:2:0, and sure enough when you watch "modern" UHD content it will still be 4:2:0 with the chroma upscaled from 1920x1080 to 3840x2160 to match the luma, but think about it this way (and I'm talking about linear broadcasting around the world):

- at least we're no longer using 8-bit and have moved to 10-bit
- at least we're no longer using 25i and 29.97i but rather 50p and 60p


going forward, there will be 8K and H.266 VVC with the same specs, so still 10-bit and 50p/60p as standard (although some people were calling for the standard to be bumped to 12-bit).

Sure, the fact that we're still using 4:2:0 chroma subsampling is a bit of a pain, but even in the long, distant future I don't really see the world (content producers & broadcasters included) going any further than 16K.
By then, we will probably have 12-bit as a standard at 50p/60p.

At that point, raising the resolution further wouldn't make much sense for a consumer TV watched from the couch in an apartment, so they will probably bump the standard to 100p/120p ('cause there are some contents, like sports, that would benefit from it). At that point it would be hard to find benefits in going from 12-bit to something like 14-bit or 16-bit (forget 32-bit float), so we'll probably move to either 4:2:2 or 4:4:4.

So, my forecast for the distant future is:

16K 100p 4:4:4 HDR PQ BT2020 12bit H.268 for PAL
16K 120p 4:4:4 HDR PQ BT2020 12bit H.268 for NTSC

(Yes, H.268 and not H.267, probably because H.266 VVC will be used for 8K, H.267 for the first implementation of 16K (which will probably still be 50p/60p), and H.268 for the second, high-frame-rate 4:4:4 implementation of 16K.)


Of course there's absolutely nothing official and those are just wild guesses / speculations from my side.

I could be terribly wrong and I have been in the past, so who knows... I guess we'll have to wait and see what the future holds for us.

Last edited by FranceBB; 17th March 2023 at 17:50.
Old 17th March 2023, 19:04   #108  |  Link
DTL
Registered User
 
Join Date: Jul 2018
Posts: 1,000
"going forward, there will be 8K"

It is funny to see how people will go to 4K, 8K and even 16K, completely invisible in typical household viewing conditions, with 4x, 16x and 64x the initial data rate (relative to the 'base' Full HD), but still cannot fix the old design error that needs only 2x the data rate: going from the ugly 4:2:0 to good 4:4:4. And if we really want a nice future with legal, standard HDR, we should go from the 10-bit transfer-distorted HDR domain to the HDR-enough 16-bit half-float linear domain, with only a 16/10 = 1.6x data-rate increase.

So all that is required for a nice, bug-free, finally high-end and actually visible Full HD is a 2x increase from 4:2:0 to 4:4:4 and a 1.6x increase from moving out of the transfer-distorted domain into the linear domain. That is only a 2 * 1.6 = 3.2x increase in data rate, lower than the move from Full HD to invisible 4K. But it is not offered to end users.
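The arithmetic above checks out against a Full HD 4:2:0 baseline (a quick sketch of my own):

```python
# Relative raw data rates against a Full HD 4:2:0 baseline.
def samples_per_pixel(subsampling):
    return {"420": 1.5, "422": 2.0, "444": 3.0}[subsampling]

chroma_fix = samples_per_pixel("444") / samples_per_pixel("420")  # 4:2:0 -> 4:4:4
linear_fix = 16 / 10                       # 10-bit transfer -> 16-bit half float
uhd_step = (3840 * 2160) / (1920 * 1080)   # Full HD -> 4K pixel count

print(chroma_fix, linear_fix, chroma_fix * linear_fix, uhd_step)
# the "fix Full HD" path (3.2x) costs less than the move to 4K (4x)
```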

It looks like a severely degraded civilization.

The really reasonable, good 'high-end'-class visual system to carry us to the (maybe soon enough) end of this civilization is:
- 1920x1080,
- 50p, * <--- we are here *
- linear RGB in 16-bit half-precision floats (yes, it can cover both the BT.2020 colour space and some HDR; for 'full' HDR, 32-bit single-precision floats are highly recommended),
- a finally ITU-standardized, all-regions reference image-reconstruction algorithm from the sampled form, enabling reference grading monitors for production studios and reference displays for end users,
- some not-very-complex MPEG codec with an average per-title (or per hour of runtime) channel data rate of about 10 (5..20) Mbit/s.

And yes, to create a high-end-graded 1920x1080 sample array for distribution (aliasing-free (maximum allowed residual aliasing under test conditions X below Y% of full data range), noise-free (maximum noise level below ___ % of full data range), with >8-bit undistorted quality of residual distortions), 4K, 8K or even 16K shooting equipment is highly recommended. So this does not mean living with only 'poor' Full HD cameras.

Last edited by DTL; 17th March 2023 at 21:00.
Old 17th March 2023, 21:02   #109  |  Link
FranceBB
Broadcast Encoder
 
Join Date: Nov 2013
Location: Royal Borough of Kensington & Chelsea, UK
Posts: 2,735
I'll tell you a secret (well, I guess it won't be a secret any longer xD).
My parents bought a Samsung LED TV in 2009, FULL HD, BT709.
My father is retired and still lives in his home town in Leghorn, Italy, several miles away from where I live, while my mum is from Florence, Italy, but she has been living with my dad in Leghorn ever since they met.
Being from an "old generation", they don't really want to / feel the need to throw away perfectly working hardware.
I'm a Sky employee, so I'm entitled to a free subscription and boxes, but I didn't want them to experience FULL HD 25i yv12 given that we've been consistently airing in UHD for a while now.
What I've done, then, is give my parents a Sky Q box hooked up to the satellite grabbing the UHD 50p 4:2:0 10bit BT2020 HLG signal (which the box converts to BT709 8bit from the settings I picked).
Then, it goes through a hardware downscaler I bought that downscales the luma from 3840x2160 to 1920x1080 while leaving the chroma as it is at 1920x1080, thus creating a fabulous FULL HD 50p 4:4:4 BT709 8bit signal carried through HDMI which goes straight into the 2009 Samsung LED TV.
Needless to say, it brought the TV back to life.
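FranceBB's trick, downscaling only the luma plane so it lands on the grid the chroma already occupies, can be sketched in plain Python on a toy frame. The 2x2 box average here is just a stand-in for whatever kernel the actual hardware scaler applies.

```python
# UHD 4:2:0: luma is 3840x2160 while each chroma plane is 1920x1080.
# Downscaling only the luma by 2x in both directions makes all three
# planes the same size, i.e. an effectively 4:4:4 FULL HD frame with
# no chroma upsampling step at all.

def box_downscale_2x(plane):
    """Average each 2x2 block into one sample (stand-in kernel)."""
    h, w = len(plane), len(plane[0])
    return [[(plane[y][x] + plane[y][x + 1] +
              plane[y + 1][x] + plane[y + 1][x + 1]) / 4.0
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]

# Toy frame with the same 2:1 luma/chroma ratio as UHD 4:2:0:
luma = [[float(x + y) for x in range(8)] for y in range(8)]
cb = [[128.0] * 4 for _ in range(4)]
cr = [[128.0] * 4 for _ in range(4)]

y_small = box_downscale_2x(luma)
# All planes now share one 4x4 grid: a 4:4:4 frame in miniature.
frame_444 = (y_small, cb, cr)
```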
Old 17th March 2023, 21:09   #110  |  Link
DTL
Registered User
 
Join Date: Jul 2018
Posts: 1,000
"Samsung LED TV in 2009, FULL HD, BT709."

To display FullHD content better (more correctly, with less aliasing) we need at least a 4K screen, or better an 8K one, to allow a 2x or 4x upscale (typically SincResize, still not enshrined in any ITU standard). We generally no longer have analog displays, not even ones with analog horizontal line scanning. So it is a pity for old FullHD screens.
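For the curious, a 1-D windowed-sinc (Lanczos-style) interpolator of the kind such an upscaler builds on can be sketched like this. The 3-tap window and the 2x integer upscale are my illustrative choices, not any standardised filter.

```python
import math

def lanczos(x, taps=3):
    """Windowed sinc: sinc(pi*x) * sinc(pi*x/taps) inside the window."""
    if x == 0.0:
        return 1.0
    if abs(x) >= taps:
        return 0.0
    px = math.pi * x
    return taps * math.sin(px) * math.sin(px / taps) / (px * px)

def upscale_2x_1d(samples, taps=3):
    """2x upscale of one scanline: keep originals, interpolate midpoints."""
    out = []
    n = len(samples)
    for i in range(n):
        out.append(samples[i])  # phase-0 sample passes through
        mid = 0.0
        for k in range(-taps + 1, taps + 1):
            src = min(max(i + k, 0), n - 1)  # clamp at the edges
            mid += samples[src] * lanczos(k - 0.5, taps)
        out.append(mid)  # phase-0.5 sample
    return out

ramp = [float(v) for v in range(8)]
up = upscale_2x_1d(ramp)
```

On a linear ramp the interpolated midpoints stay close to the true in-between values; the negative kernel lobes are what produce the over/undershoot discussed later in the thread.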

Consumer laser-projector TVs (which really cover the full saturated BT.2020 colours), with the possibility of analog output after a good DAC at least horizontally, never went mainstream and died out in North America. They never even reached 4K output.

So the MPEG compressor feeding a screen of 4K or more pixels may be simple Xvid with a 1920x1080 system frame.

Last edited by DTL; 17th March 2023 at 21:12.
Old 17th March 2023, 21:11   #111  |  Link
FranceBB
Broadcast Encoder
 
 
Join Date: Nov 2013
Location: Royal Borough of Kensington & Chelsea, UK
Posts: 2,735
Trust me, if I could buy a hardware downscaler which uses SinPowResizeMT() I would, but I can't xD
Old 17th March 2023, 22:08   #112  |  Link
DTL
Registered User
 
Join Date: Jul 2018
Posts: 1,000
" if I could buy an hardware downscaler which uses SinPowResizeMT() I would, but I can't"

That is only part of a solution, not the full one. Also, the UserDefined2 kernel is expected to be better. If you need real-time live AVS processing, I think it is possible with a simple Windows box running a not-too-old Windows with DirectShow and an SDI in/out board (for single home use HDMI may do) that exposes DirectShow interfaces (a good board runs in and out at once; some do only in or only out).

After Windows loads, install ffdshow and AVS, open GraphEdit and construct a graph: hardware input source, ffdshow in RAW processing mode, hardware out. Put the AviSynth script into ffdshow and run the graph. I have been thinking of building something like this for our broadcast company, to run an mvtools degrain before the main broadcast MPEG encoder and so help the MPEG coder with lower-noise content. But the current dying civilization has no interest in quality broadcasting, and nobody complains to us about quality.

So if the in/out SDI board (or IP in/out for an IP-based network) and its driver were good (maybe AJA rather than cheap Blackmagic?) and did not hang or crash after several hours or days of runtime, you could make some part of this planet happier with an mvtools denoise and/or a UserDefined2 (or SinPow) downscale of a 16K/8K/4K source to a better FullHD in live broadcast. Though I think some hardware vendors already use a kernel like the one in SinPow or UserDefined2 resize; I have seen an adjustment like this in a good video camera's setup instructions, where it was called 'peaking adjustment'.

My own tests with AVS, very old DeckLink cards with SD-SDI input, and ffdshow (input only, running a sort of dual-input colour analyser for colour tone grading/alignment across many cameras) showed the frame-update rate dropping after about 2..3 hours of runtime. So it is not suitable for airing to part of the planet for many days without a restart. Maybe better cards, a better DirectShow version, or a patched ffdshow is required.

Last edited by DTL; 17th March 2023 at 22:16.
Old 18th March 2023, 12:17   #113  |  Link
FranceBB
Broadcast Encoder
 
 
Join Date: Nov 2013
Location: Royal Borough of Kensington & Chelsea, UK
Posts: 2,735
I'm totally with you on this.
Although it could be done for home use, when they asked me to experiment with it at work in summer last year I said: absolutely no!
The reason is exactly what you highlighted above. I said "there's no way I'll ever certify something like this", mostly 'cause I'm pretty damn sure that even if I used real-time filters in Avisynth, getting the incoming stream from the sat decoders in our MCR via SDI:

1) It could crash at any moment
2) There's no way it would be able to sustain days of workload without rebooting etc

And of course one could make all the usual arguments, like putting a DFS in front of it and using AJA cards instead of Blackmagic DeckLinks for input/output, but the benefit vs risk still wouldn't cut it.
I mean, I would rather have no filtering and a crappy hardware downscale with Gibbs ringing and whatnot than have perfect quality until Avisynth crashes in the middle of a game, it takes forever to bring the frameserver back up, and people at home complain (and I'm blamed for it).

By the way, it's funny that we tested the same things, 'cause I ran the same tests myself with the very same technology (and with pretty much the same results... )

Last edited by FranceBB; 18th March 2023 at 12:19.
Old 18th March 2023, 12:51   #114  |  Link
DTL
Registered User
 
Join Date: Jul 2018
Posts: 1,000
"16K 100p 4:4:4 HDR PQ BT2020 12bit H.268 for PAL"

Another long read about >FullHD frame sizes:

Yes, 4K is not totally invisible. 4K and higher sample counts per frame can be used for higher-quality digital visual systems, but in the following way:

1. Digital imaging systems have a very important internal design parameter: samples per degree of view. The 'classic industry' SD/FHD/4K/8K/(16K) sizes form a family of system-60spd designs, built around the near-critical spatial sample density of about 60 samples per degree. Below 60spd visual resolution degrades quickly; above 60spd it increases only very slowly. Only 8K starts to move somewhat higher, to about system-80 or more.
With FHD at system-60 we already reach the limit of the 'normal' angle of view, about 30 degrees horizontal. That is even a bit more than normal (about 16 degrees), leaving some side space.
The higher members of the system-60 family, like 4K/8K/16K, are completely unusable for general broadcasting and require special wide-angle production (and display).
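The samples-per-degree bookkeeping above reduces to a single division, which makes the trade-off easy to tabulate (a sketch following the post's own figures):

```python
# Angle of view implied by a given sample density, after the post:
# horizontal degrees covered = horizontal samples / samples-per-degree.

def view_angle_deg(h_samples, samples_per_degree):
    return h_samples / samples_per_degree

for name, w in [("SD", 720), ("FHD", 1920), ("4K", 3840), ("8K", 7680)]:
    print(f"{name}: {view_angle_deg(w, 60):.0f} deg at system-60")

# FHD at system-60 spans 32 degrees, right at the post's ~30-degree
# "normal" viewing angle; 4K at the same density needs a 64-degree
# wide-angle setup, unless the density is doubled to system-120:
assert view_angle_deg(1920, 60) == 32.0
assert view_angle_deg(3840, 120) == 32.0
```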

But system-60 also requires very high-quality engineering of sample usage at both the compression stage (creating the low-sample-count buffer) and the decompression stage (restoring the image from its sampled form). As of today it looks like we still have no perfect 2D linear upsampler mathematics: separable 1D+1D sinc is not perfect for 2D, and 2D-oriented resizers like Jinc/EWA-Lanczos and others are not really perfect either (the elementary dot is not perfectly round at all sizes). So there is nothing mathematically perfect, settled for the ages, to propose to the ITU as the industry-standard upsampler for moving-picture digital imaging on this planet. Maybe even the rectangular sampling grid is not entirely right for 2D imaging and needs replacing with some other type of 2D sampling grid.

But the requirements on the upsampler relax significantly if we stop chasing the peak possible sharpness of digital imaging (limited by Nyquist and Gibbs) and instead spend more samples describing the same detail. To restore displayed visual sharpness we simply raise the design sample density above 60 per degree of view; say, to 120.

So frame sizes above FHD can serve different systems:
1. 4K system-60: wide angle, standard per-degree quality.
2. 4K system-120: normal angle of view, increased per-degree quality.
The same applies to 8K and beyond: 8K may be a 'normal viewing angle' system, or (as with the original FHD) a 'full viewing angle' system-240 with close to no special requirements on the upsampler (simple bilinear, or even direct sample-to-screen-pixel mapping, can be used).

This also connects with the settings of UserDefined2Resize and its adjustable s-param. For system-60 targets it is beneficial to keep some non-linearity and use a low s-value around 2, which makes the halo/over/undershoots thinner but still leaves some residual non-linear distortion. In exchange, it allows a more highly compressed, low-sample-count 1920x1080 frame and a simpler MPEG codec at the same average bitrate and file size.

For higher-quality system-120 and above, an 'unlimited' kernel size (s > 3..5 and more) may be recommended, outputting full-width halo/over/undershoots and getting less residual ringing and fewer other distortions. The visual sharpness of an 'oversampled' system-120 or higher is high enough that the thicker over/undershoots will not degrade it.

The less pleasant side of 4K-and-up systems is the much higher demand on the MPEG codec: 4x more initial datarate to compress for 4K, and 16..64x beyond. So using 4K system-120 amounts to a brute-force route to better quality when the 'initial sampling compression' of the frame is poorly engineered.

It resembles the situation in audio. The old 44.1 kHz systems demanded very precise engineering in every part: an ADC with a very good LPF, complex processing, and a high-oversampling DAC built with good knowledge of sinc(). A modern 192 kHz system runs fine on a very simple LPF and a mediocre DAC, simply because a 192 kHz workflow is very cheap now.

So with the good old MPEG codecs, well-engineered digital visual workflows with small frame sizes can serve 'classic' system-60. This takes extra effort in designing a good sample compressor down to the low-sample-count frame size and a complementary decompressor (upscaler) for display. The higher-sample-count systems like system-120 and system-240 can use a very simple bilinear resize or direct sample-to-pixel mapping and still give a good image, but require a much larger frame size to operate.

"There's no way it would be able to sustain days of workload without rebooting etc"

That is not really in the nature of Windows and DirectShow. I have broadcast experience running a handmade DirectShow file player, with controlled image overlay via a custom-designed DS filter between the file decoder and the renderer, for weeks, months, maybe years without a reboot, and got no complaints about its stability over many years of running, until it was replaced by a more featured solution. So the DirectShow core in Win7 is not bad even for broadcast-grade stability. But the other parts, like ffdshow or AVS, need testing. Running AVS as a DirectShow intermediate RAW processing component may be much simpler than the full ffdshow design, but it again requires designing a new DS filter (or extracting one from ffdshow).

Last edited by DTL; 18th March 2023 at 15:50.
Old 8th April 2023, 06:50   #115  |  Link
orion44
None
 
 
Join Date: Jul 2007
Location: The Background
Posts: 297
Quote:
Originally Posted by 102030 View Post
I still convert all movies with Xvid. I have the most experiences with this codec (13 years).
I really dislike h264 deblocking. The Xvid results are better for me than with x264.
Me too.

I never liked the look that x264 produced; no matter which settings I tried,
I just couldn't watch it. Especially when it was used to encode DVDs.

Last edited by orion44; 6th September 2023 at 12:53.
Old 10th April 2023, 09:50   #116  |  Link
DTL
Registered User
 
Join Date: Jul 2018
Posts: 1,000
" if I could buy an hardware downscaler which uses SinPowResizeMT() I would, but I can't"

You can e-mail any local or global hardware manufacturer about a similar-looking kernel. It may take an engineer only a few minutes to patch the FPGA firmware, and you would get either a downloadable firmware upgrade for the module, a new unit in the mail, or an installable scaler board for the chassis. The math of the kernel is open source on the AVS and JPSDR GitHub. I can ask local SDI hardware manufacturers, but they may not be widely available on the European market. For IP-based solutions it is mostly a simple firmware patch.

One more sad note about HDR from natural scenes: the HDR capability of typical shooting optics is limited to about 8 F-stops for 'high-key' scenes, because of light scattering in the glass itself and flare/glare on surfaces (surface finish, coatings, residual dust). Only for low-key scenes can it be significantly larger; 'standard' scenes sit somewhere in between. Unfortunately, after the HDR hype started, close to no lens manufacturer announced any better HDR lens series. So for general multi-element zooms HDR quality is at its worst, while for some few-element cine primes (produced in clean low-dust environments, with special coatings, special low-internal-scattering glass types, maybe even glass-free reflective or hybrid mirror+glass optics) high-key HDR may reach 10..12+ F-stops. Mirror-only optics are typically fixed-focal primes of very large size, but extreme HDR optics would be expected to be glass-less (mirror-based). Even that is really still within 8-bit SDR range. So all the HDR possible in real hardware amounts to some limited non-clipping of highlights, not any genuinely nice colour expansion in the shadows of high-DR scenes.
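For reference, the F-stop figures convert to scene contrast ratios by simple doubling per stop (standard photographic arithmetic, not anything specific to these lenses):

```python
# Each F-stop doubles the light level, so N stops of scene dynamic
# range correspond to a 2**N : 1 contrast ratio.

def stops_to_contrast(stops):
    return 2 ** stops

for stops in (8, 10, 12):
    print(f"{stops} stops -> {stops_to_contrast(stops)}:1")

# The ~8-stop flare limit for ordinary zooms is only 256:1 of scene
# contrast, close to what 8-bit SDR encodes, which is the post's point;
# the 10..12-stop cine primes reach 1024..4096:1.
```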

Last edited by DTL; 10th April 2023 at 10:13.