Old 16th February 2023, 17:42   #81  |  Link
kurkosdr
Registered User
 
Join Date: Aug 2009
Posts: 304
Quote:
Originally Posted by FranceBB
Honestly, though, we're here discussing Xvid still being around, but can you imagine how long H.264 will stick around?
Forever. It's the "lowest common denominator" video format for the web and hence an essential part of the web standards (at least de facto). I consider H.264 to be the video equivalent of GIF: it doesn't have the bit depth we want it to have, or the compression efficiency we want it to have, but everyone has implemented its "high" profile, so it's the lowest common denominator.

If you want to use something newer than H.264, you have to encode in both VP9 and HEVC (and deal with the patent mess of HEVC), because Apple still plays hardball and doesn't support VP9 in Safari. Or, more realistically, encode in VP9 and give everyone else H.264 until they decide to use a browser that doesn't suck as much as Safari or whatever.

I just hope everyone agrees to implement 10-bit H.264 with HDR support, so that at least FullHD HDR is possible everywhere. But even if not, expect H.264 to be implemented in browsers for the rest of time, like GIF.

Last edited by kurkosdr; 16th February 2023 at 23:35.
Old 16th February 2023, 20:00   #82  |  Link
DTL
Registered User
 
Join Date: Jul 2018
Posts: 1,036
"I consider H.264 to be the video equivalent of GIF: it doesn't have the bit depth we want it to have, or the compression efficiency we want it to have, but everyone has implemented its "high" profile so it's the lowest common denominator."

8-bit SDR really was the standard for high visual art for decades in the great civilization of the previous century. Its 1000:1 dynamic range was chosen very close to the nominal output dynamic range of professional reversal film for artists. That was not a limitation of the reversal film itself - the great old civilization capable of flying to the Moon could easily have reached a 1,000,000:1 display dynamic range, for example by simply combining two 1000:1 layers, but it looks like 1000:1 was good enough.

Also, the input dynamic range for good art images is very narrow too - about 5 'stops' (see the Fuji Velvia datasheet, for example, or any Kodak reversal film for natural scenes). Since each stop is a doubling of light, about 5-6 stops is roughly a 50:1 scene light range. This is because making a scene pleasant to look at requires a fairly complex transfer - expansion in the mid-tones and compression at the edges - so the incoming ~50:1 scene light range is expanded to the ~1000:1 display light range by the film processing. The real artistic transfer curve is also non-linear (a genuine curve, not a straight line), and that too is not a limitation of the old century's chemistry: if a linear 3.0D transfer were required, one could shoot duplication film for reversal stock (which has a 3.0D input range and a 3.0D output range with linear transfer). But good visual artists already discovered in the previous century that linear scene capture over a 'large range' (3.0D is about 10 stops, since 2^10 ≈ 1000:1) is no good for viewers and does not make good money in business.
This applies to images correctly designed by visual artists - not to real-time 'live' broadcasts of random unprepared scenes shot by very cheap personnel without any special visual-arts education.

But as we can see, at least some of the 8-bit Xvid encodings out there are of really well-mastered content (good DVD titles, for example), and for that it is perfectly sufficient. If we use Xvid to compress poorly shot cheap TV shows, that is not an issue of Xvid or of 8 bits - only of the badly shot input content.

So when we have content correctly mastered by a well-educated person and visual artist, we can freely use 8-bit SDR and Xvid to make perfectly good encodings.

"10-bit H.264 with HDR support"

We can assign some HDR decompression profile to standard 8-bit encodings - it may only show more banding in the highlights. HDR itself is not limited to any bit depth - the only limitation is more or less visible banding (if dithering is not applied) at the target physical display brightness. It is close to a 'legal expansion' of the over-whites (the 236..254 code values of the Y channel in 8 bit, or even up to 255, because we are not limited by system-service-reserved words as in SDI) into a standardized HDR format like HLG. Older displays may already do this in customized vendor solutions for 'image enhancement'.

HDR is not the main part of good image design - it is only a small additional way to impress the user a bit more when all the underlying structure of tone and colour in the image is perfectly mastered and the most important part of the range already fits in SDR (and the standard colour gamut). If you cannot view in HDR mode, you can easily skip the HDR addition to perfectly mastered SDR content and you will not lose a significant part.
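Just to put numbers on the over-white idea - a minimal Python sketch (my own toy mapping, not any standard: limited-range 8-bit luma, nominal white at code 235 = 100 nits, over-whites 236..254 expanded linearly to an arbitrary 600-nit peak):

Code:
import numpy as np

# Toy 'legal expansion' of over-white Y codes into HDR highlights.
# Assumptions (illustrative only): limited-range 8-bit luma, code 235
# = nominal 100-nit white, codes 236..254 mapped up to a 600-nit peak.
def y_code_to_nits(y: np.ndarray) -> np.ndarray:
    y = y.astype(np.float64)
    norm = np.clip((y - 16.0) / 219.0, 0.0, None)   # 0..1 at code 235
    sdr = 100.0 * norm ** 2.4                       # plain 2.4-gamma SDR part
    hi = 100.0 + (np.minimum(y, 254.0) - 235.0) / 19.0 * 500.0
    return np.where(y > 235, hi, sdr)

codes = np.array([16, 128, 235, 240, 247, 254])
print(dict(zip(codes.tolist(), np.round(y_code_to_nits(codes), 1).tolist())))
# code 235 -> 100.0 nits (SDR white), code 254 -> 600.0 nits

An SDR decoder that clips everything above code 235 still shows a normal picture; only the highlight detail above nominal white is lost.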

Last edited by DTL; 16th February 2023 at 20:21.
Old 16th February 2023, 20:58   #83  |  Link
FranceBB
Broadcast Encoder
 
 
Join Date: Nov 2013
Location: Royal Borough of Kensington & Chelsea, UK
Posts: 2,869
Quote:
Originally Posted by DTL
We can assign some HDR decompression profile to standard 8-bit encodings - it may only show more banding in the highlights. HDR itself is not limited to any bit depth - the only limitation is more or less visible banding (if dithering is not applied) at the target physical display brightness. It is close to a 'legal expansion' of the over-whites (the 236..254 code values of the Y channel in 8 bit, or even up to 255, because we are not limited by system-service-reserved words as in SDI) into a standardized HDR format like HLG.
Correct.
Sony has been doing 8-bit H.264 BT.2020 HLG HDR in their old A7 III cameras, using the expanded values of full PC range to take advantage of the extra headroom and make it fit better. I've seen plenty of footage shot like that. In most cases it was "good enough" (i.e. it wasn't terrible, but it wasn't great either).

Quote:
Originally Posted by kurkosdr
I just hope everyone agrees to implement 10-bit H.264 with HDR support
Sony is gonna be your best buddy here, then, 'cause they actively support 10-bit H.264 BT.2020 HLG HDR in their cameras, both Intra and Long GOP (i.e. I-P-B) in their XAVC flavour, and of course x264 can encode those and add the right metadata. Unfortunately for you, though, no consumer hardware will play those, only professional hardware playout ports, so it looks like it's not gonna be a thing at the consumer level, as such a stream is almost always re-encoded to H.265... but you know what? Never say never. Most importantly, if all you care about is software playback on computers, you can totally do it even today, as support for HDR metadata was introduced in x264 in 2017.
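For anyone who wants to try it, something along these lines should do it (a hedged sketch: the colour flags are real x264 options, but the file names are just placeholders):

Code:
import subprocess

# Sketch of a 10-bit BT.2020 HLG x264 encode with the HDR signalling
# carried in the VUI. "input.y4m" / "out.264" are placeholder names.
cmd = [
    "x264", "--output-depth", "10",
    "--profile", "high10",
    "--colorprim", "bt2020",
    "--transfer", "arib-std-b67",   # arib-std-b67 = HLG
    "--colormatrix", "bt2020nc",
    "--output", "out.264", "input.y4m",
]
subprocess.run(cmd, check=True)
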
Old 16th February 2023, 21:46   #84  |  Link
DTL
Registered User
 
Join Date: Jul 2018
Posts: 1,036
" 8bit H.264 BT2020 HLG HDR "

Expanding the frame size 2x linearly plus dithering (2x2 samples dithered) may be about the same as 10 bit per sample at the original frame size, in terms of banding (8 bit + 2 bits from the 2x2 dithering) - see the sketch at the end of this post. The only cost is that the 2x frame size plus dithering may produce a somewhat higher MPEG output bitrate.

So when wider compatibility with 8-bit MPEG-4 ASP/AVC decoders plus built-in HDR is required, it may be achieved by increasing the frame size plus dithering (or simply by not downsizing 8K..4K or FullHD sources too much).

Also, the WCG of BT.2020 is not really needed for general public content, so there is no need to squeeze a wide colour gamut into an 8-bit system and risk additional colour banding.
The main point of 'artistic' image creation is not shrinking all the data of the world around us (luma range, chroma range and angle of view) into a single image frame, but finding pleasant-to-view, limited-size scenes and expanding part of the (luma and chroma) range for even better visibility of something pleasant to see (because the viewer has only a limited ability to see small deviations of luma and chroma in natural scenes). So using WCG in a bit-limited 8-bit system is an additional source of banding errors and a loss of valuable low-contrast colour detail when working with natural scenes.

So that attempt to use an 8-bit BT.2020 (+HDR) system was just a temporary marketing puff, an attempt to sell a colour-detail-shrinking system to artists. The sales were probably no better than the shooting results.

So 8-bit dithered MPEG-4 ASP/AVC with BT.709 chroma plus HLG range-compression encoding (mapping) could be a widely usable and backward-compatible 'HDR for the general public' format. It will probably look a bit dimmer on SDR decoders if 75% is used as the nominal white point, as in HLG - a sort of widely compatible 'SDR+' 8-bit format. On good playback devices with 'HDR light power output' and good enough 'AI SDR-to-HDR' processing in the display, it will look closer to 'real 10-bit HDR'.

Though HLG HDR range mapping was already designed to be very easy to downconvert to SDR (for example by applying AUTO KNEE camera processing to content decoded back to linear scene light).

The SDR digital motion-picture system was designed by the great designers of the nice great technical civilization of the past, so it is well balanced in itself. The only good addition (along with finally standardizing the 4:2:0 display UV filtering transfer at decoding to 4:4:4) would be a standard for using the upper part of the Y code values for HDR range expansion.
In the old times, CRT displays were too weak to run over 100..200 nits in home-sized TVs, so the great old designers of the great old civilization were gone before the use of the upper 230..254..255 code values of the 8-bit system for highlight compression up to 600..1000 nits became really usable on end users' displays.
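The sketch mentioned above - a minimal numpy illustration of the 2x-size-plus-dither argument (assumes a simple random dither, not any particular encoder's):

Code:
import numpy as np

rng = np.random.default_rng(0)

# A smooth ramp spanning only a few 8-bit codes, so plain quantization bands.
w = 1024
ramp = np.tile(np.linspace(100.0, 104.0, w), (8, 1))

# (a) straight 8-bit quantization of the original-size frame
banded = np.round(ramp)

# (b) 2x upscale, +-0.5-code random dither, 8-bit quantization,
#     then averaging each 2x2 block back down to the original size
up = ramp.repeat(2, axis=0).repeat(2, axis=1)
dithered = np.round(up + rng.uniform(-0.5, 0.5, up.shape))
down = dithered.reshape(8, 2, w, 2).mean(axis=(1, 3))

print("distinct levels, plain 8-bit:", len(np.unique(banded)))   # 5 bands
print("mean abs error, plain 8-bit:", round(float(np.abs(banded - ramp).mean()), 3))
print("mean abs error, 2x + dither:", round(float(np.abs(down - ramp).mean()), 3))
# Averaging 2x2 dithered samples yields quarter-code levels, i.e. roughly
# two extra bits of effective precision, at the cost of encoding 4x pixels.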

Last edited by DTL; 16th February 2023 at 22:23.
Old 16th February 2023, 21:55   #85  |  Link
kurkosdr
Registered User
 
Join Date: Aug 2009
Posts: 304
Quote:
Originally Posted by FranceBB
Sony is gonna be your best buddy here, then, 'cause they actively support 10-bit H.264 BT.2020 HLG HDR in their cameras, both Intra and Long GOP (i.e. I-P-B) in their XAVC flavour, and of course x264 can encode those and add the right metadata. Unfortunately for you, though, no consumer hardware will play those, only professional hardware playout ports, so it looks like it's not gonna be a thing at the consumer level, as such a stream is almost always re-encoded to H.265... but you know what? Never say never. Most importantly, if all you care about is software playback on computers, you can totally do it even today, as support for HDR metadata was introduced in x264 in 2017.
Consumer HD hardware (aka consumer H.264 hardware - the two are mostly synonymous) will stay SDR-only. It can't do tone-mapping, and even in the rare cases where it can, it's very rare that a manufacturer will ship an update and risk losing sales of new hardware. What I am talking about is HLG HDR for H.264 in browsers. That way, a common format (H.264) that everyone implements and whose royalty structure/patent situation everyone accepts could be used to provide HLG HDR (if only at FullHD, due to the inferior compression efficiency compared to VP9 or HEVC). It's a missed opportunity, as it could instantly enable HLG HDR in almost all browsers. The website can use JavaScript to serve that kind of content only to browsers that understand it and can tone-map it. And if they stick to 8-bit, they can even use existing hardware acceleration, which means that, assuming the CPU can take on the load of tone-mapping, there will be no dropped frames.

Last edited by kurkosdr; 16th February 2023 at 22:01.
Old 16th February 2023, 22:36   #86  |  Link
Balling
Registered User
 
Join Date: Feb 2020
Posts: 538
Quote:
Originally Posted by kurkosdr
If not having a UHD TV is considered "extra poor", then I guess most people are "extra poor" according to that weird definition.
No UHD TV is 100 nits out of the box. So in fact you have to be extra rich and clever: go buy Calman and an X-Rite probe and calibrate (though the LG C2 has a Filmmaker mode that is supposed to be 100 nits, and the LG C9 has a 100-nit mode under Technicolor Expert, right).
Old 16th February 2023, 22:47   #87  |  Link
DTL
Registered User
 
Join Date: Jul 2018
Posts: 1,036
Quote:
Originally Posted by Balling
No UHD TV is 100 nits out of the box. So in fact you have to be extra rich and clever: go buy Calman and an X-Rite probe and calibrate (though the LG C2 has a Filmmaker mode that is supposed to be 100 nits, and the LG C9 has a 100-nit mode under Technicolor Expert, right).
If you buy some UHD-for-the-poor set like the Philips 40PUT6400, you will find it is a 4K VA panel that can even decode some profile of 10-bit H.265, but while eating 100+ watts it can only emit somewhere between 100 and 200 nits. Like the poor old CRTs. Very poor LEDs mounted in the backlight.

I am really impressed at how a business-level sub-$10000 LG display (simple 8-bit SDR, fine for Xvid playback) eats only 400 watts peak and runs at up to 4000 nits full-screen all the time, with a screen size of about 50 inches.
https://www.lg.com/uk/business/digit...ge/lg-49XS4F-B
Really nice LEDs installed in the backlight - maybe well above 100 lm/W. A properly done 8-bit SDR display, running dark blacks at about 4 nits and peak whites at 4000 nits - close to 'full visible colours down to the very deep darks'.

Last edited by DTL; 16th February 2023 at 22:52.
Old 16th February 2023, 23:15   #88  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,738
Apple's HEVC HLG with Dolby Vision dynamic metadata is really a very clever solution as well. Certainly not the headroom possible with native Slog3 or PQ, but about as much as you're going to get while still having decent playback without tone mapping on existing SDR devices.

Getting back to xvid, yes, there are all kinds of ways to make stuff forward compatible. But any device that can handle the above will support at least H.264 High Profile.

I wonder if there are new devices shipping that have dropped MPEG-4 Part 2 decode. I saw that one of last year's GPUs dropped hardware VC-1 decode, for example. And VC-1 was somewhat more advanced than MPEG-4 ASP, with Overlap Transform and (lightweight) in-loop deblocking.
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book
Old 16th February 2023, 23:18   #89  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,738
Quote:
Originally Posted by kurkosdr
Consumer HD hardware (aka consumer H.264 hardware - the two are mostly synonymous) will stay SDR-only. It can't do tone-mapping, and even in the rare cases where it can, it's very rare that a manufacturer will ship an update and risk losing sales of new hardware.
Well, it's inaccurate to say that there isn't any tone mapping. All flat-panel technologies respond a lot differently than a cathode ray tube would. Lots of tone mapping takes place in the panel controller to get it to respond to digital values the way a CRT would to voltage. It's not like an LCD has an intrinsic 2.2 or 2.4 gamma!
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book
Old 16th February 2023, 23:39   #90  |  Link
kurkosdr
Registered User
 
Join Date: Aug 2009
Posts: 304
Quote:
Originally Posted by benwaggoner
Well, it's inaccurate to say that there isn't any tone mapping. All flat-panel technologies respond a lot differently than a cathode ray tube would. Lots of tone mapping takes place in the panel controller to get it to respond to digital values the way a CRT would to voltage. It's not like an LCD has an intrinsic 2.2 or 2.4 gamma!
They don't do tone mapping for HDR, though; it should be clear from the context that that's what I meant.
Old 17th February 2023, 13:37   #91  |  Link
FranceBB
Broadcast Encoder
 
 
Join Date: Nov 2013
Location: Royal Borough of Kensington & Chelsea, UK
Posts: 2,869
Quote:
Originally Posted by kurkosdr
What I am talking about is HLG HDR for H.264 in browsers.
Uhmmm, I'll tell you what, I think it could work even right now. H.264 can transport HDR info in SEI, so it's very easy to create an 8-bit BT.2020 HLG file. Chrome is also able to recognize the desktop colorspace and the normal video stream info (chrome://gpu), so in theory it should be able to recognize an H.264 stream flagged as arib-std-b67 and bt2020-10, just like it's able to recognize those flags on an AV1 stream. I mean, it should actually work out of the box with current technology, without any further update, in theory... but no one is doing it.
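If anyone wants to check whether a test encode is flagged correctly, a quick hedged sketch (assumes ffprobe is on PATH; "out.264" is a placeholder name from the x264 sketch earlier in the thread):

Code:
import json
import subprocess

# Recent FFmpeg builds report HLG as color_transfer=arib-std-b67 and
# BT.2020 as color_primaries=bt2020 / color_space=bt2020nc.
out = subprocess.run(
    ["ffprobe", "-v", "quiet", "-print_format", "json",
     "-show_streams", "out.264"],
    capture_output=True, text=True, check=True,
).stdout
video = json.loads(out)["streams"][0]
for key in ("color_primaries", "color_transfer", "color_space"):
    print(key, "=", video.get(key))
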
Old 17th February 2023, 16:06   #92  |  Link
DTL
Registered User
 
Join Date: Jul 2018
Posts: 1,036
Here is an example of an article on why 8-bit SDR Xvid is always enough for correctly shot content - https://www.kenrockwell.com/tech/highlight-shadow.htm . The extra HDR is mostly a way of bringing 'RAW' shooting content to the viewer, who then needs to do the colour and tone grading himself or suffer non-graded cheap content. For live broadcasting without controllable scene light, it may be some kind of solution.

A short extract:
I was watching some old movies from the 1930s, and noticed how even back in those days that they had perfect shadow and highlight detail in every scene, be it daylight, back light, side light, indoors, moonlight, low light, candle light, or whatever.

Check out Mohawk Valley, shot in Technicolor in 1939, or The Plainsman, shot in black-and-white in 1937. These aren't even technical masterpieces; they just happened to be old Indian movies we were watching as I noticed this.

An indoor scene with a door open to the outside world? Perfect detail everywhere. The same thing, but moonlight outside reflecting off a river seen as through a window while the actors are indoors at night? Again, perfect!

No matter how tough the light, they always had perfect shadows and perfect highlights, be it in color or in black-and-white.

How can this be? There must be at least 15 stops of dynamic range needed, and film was primitive in those days.

Of course the movies have always had perfect highlights and shadows. How? Why? Because they are shot by real photographers, typically ASC members, who know how to light a scene.

When shooting a movie, you spend a couple of days lighting each set and each scene. You bring four generator trucks and eight trucks of lighting and grip equipment, and have at it. You'll scrim, gel, gobo and reflector everything until you go blind.

In the end, you get perfect results, even if it was the 1930s.

Photographers know how to get perfect highlight and shadow detail in every shot, regardless of the light — or lack thereof — while hobbyists freak out and start buying more equipment, like pro digital backs claiming "18 stops dynamic range," which has nothing to do with getting good highlight and shadow detail.

It is always the photographer who is responsible for highlights and shadows, not the Great Spirit inside a new camera.

Last edited by DTL; 17th February 2023 at 16:09.
Old 8th March 2023, 10:05   #93  |  Link
102030
Registered User
 
Join Date: Apr 2011
Posts: 3
I still convert all my movies with Xvid. I have the most experience with this codec (13 years).
I really dislike H.264 deblocking. The Xvid results look better to me than those from x264.

But there is one issue that bothers me: Xvid changes the naturally vivid colours. I think it converts YV12 to RGB.
Is there a way to prevent this problem? YV12 input --> YV12 output

Last edited by 102030; 8th March 2023 at 10:22.
Old 8th March 2023, 10:35   #94  |  Link
Emulgator
Big Bit Savings Now !
 
 
Join Date: Feb 2007
Location: close to the wall
Posts: 1,531
YUV to RGB out of the box? I don't think so - when implementing MPEG-4 ASP, why would one sacrifice the efficiency of chroma subsampling?
mediainfo will tell.
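For example (a hedged sketch - assumes the MediaInfo CLI is installed; "clip.avi" is a placeholder file name):

Code:
import subprocess

# Ask MediaInfo whether an Xvid AVI is really stored as subsampled YUV.
out = subprocess.run(
    ["mediainfo",
     "--Inform=Video;%ColorSpace% %ChromaSubsampling% %BitDepth%",
     "clip.avi"],
    capture_output=True, text=True, check=True,
).stdout.strip()
print(out)   # an Xvid encode typically reports something like: YUV 4:2:0 8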
__________________
"To bypass shortcuts and find suffering...is called QUALity" (Die toten Augen von Friedrichshain)
"Data reduction ? Yep, Sir. We're that issue working on. Synce invntoin uf lingöage..."
Old 8th March 2023, 13:08   #95  |  Link
Selur
Registered User
 
 
Join Date: Oct 2001
Location: Germany
Posts: 7,255
Probably a TV- vs. PC-range scaling or colour matrix issue.
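For what it's worth, a small sketch of how a matrix mismatch alone shifts colours (standard BT.601/BT.709 luma coefficients; the test colour is arbitrary):

Code:
import numpy as np

# Encode a colour to YCbCr with BT.601 coefficients, then decode it
# wrongly with BT.709: hue and saturation shift, which can read as
# "the codec changed my colours". A TV- vs PC-range mismatch washes
# out or crushes contrast in a similarly accidental way.
def rgb_to_ycbcr(rgb, kr, kb):
    r, g, b = rgb
    y = kr * r + (1 - kr - kb) * g + kb * b
    return np.array([y, (b - y) / (2 * (1 - kb)), (r - y) / (2 * (1 - kr))])

def ycbcr_to_rgb(ycc, kr, kb):
    y, cb, cr = ycc
    r = y + 2 * (1 - kr) * cr
    b = y + 2 * (1 - kb) * cb
    g = (y - kr * r - kb * b) / (1 - kr - kb)
    return np.array([r, g, b])

BT601 = (0.299, 0.114)
BT709 = (0.2126, 0.0722)

rgb = np.array([0.8, 0.2, 0.1])            # arbitrary reddish test colour
ycc = rgb_to_ycbcr(rgb, *BT601)            # encoded assuming BT.601
print("decoded with BT.601:", ycbcr_to_rgb(ycc, *BT601).round(3))  # matches
print("decoded with BT.709:", ycbcr_to_rgb(ycc, *BT709).round(3))  # shifted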
__________________
Hybrid here in the forum, homepage
Old 8th March 2023, 13:34   #96  |  Link
102030
Registered User
 
Join Date: Apr 2011
Posts: 3
I have read that Xvid only works with RGB32 input, and DivX only works with RGB24.
Therefore I guess these codecs convert the colours while encoding. (I use only DirectShow, mostly GraphEdit.)
Old 8th March 2023, 16:41   #97  |  Link
filler56789
SuperVirus
 
 
Join Date: Jun 2012
Location: Antarctic Japan
Posts: 1,351
Quote:
Originally Posted by 102030
I have read that Xvid only works with RGB32 input, and DivX only works with RGB24.
Therefore I guess these codecs convert the colours while encoding. (I use only DirectShow, mostly GraphEdit.)
If you use Avisynth with VirtualDub's fast recompress mode,
you can confirm that both DivX and Xvid accept YV12 input.

But if you want or have to use DirectShow anyway, then
1) replace GraphEdit with GraphStudioNext, and
2) don't let the "smartness" of DirectShow add unnecessary colorspace conversions to the graph.
__________________
«Your software patents have expired.»
Old 9th March 2023, 09:56   #98  |  Link
102030
Registered User
 
Join Date: Apr 2011
Posts: 3
Quote:
Originally Posted by filler56789
If you use Avisynth with VirtualDub's fast recompress mode,
you can confirm that both DivX and Xvid accept YV12 input.

But if you want or have to use DirectShow anyway, then
1) replace GraphEdit with GraphStudioNext, and
2) don't let the "smartness" of DirectShow add unnecessary colorspace conversions to the graph.
Thank you, I will try this possible Avisynth solution soon. I did not add any colorspace conversions in my graphs;
I always deactivate the colorspace filter in the Windows registry.

Last edited by 102030; 12th March 2023 at 06:07.
Old 14th March 2023, 15:11   #99  |  Link
DTL
Registered User
 
Join Date: Jul 2018
Posts: 1,036
MPEG-4 is still not dead!

A new idea is in progress to make ASP/AVC releases look better - https://forum.doom9.org/showthread.p...72#post1984472 . It is a sort of backporting of some features from H.265/HEVC to the lower-numbered MPEGs, so that more widely compatible moving-picture files may be created.
Old 16th March 2023, 05:28   #100  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,738
Quote:
Originally Posted by FranceBB
Uhmmm, I'll tell you what, I think it could work even right now. H.264 can transport HDR info in SEI, so it's very easy to create an 8-bit BT.2020 HLG file. Chrome is also able to recognize the desktop colorspace and the normal video stream info (chrome://gpu), so in theory it should be able to recognize an H.264 stream flagged as arib-std-b67 and bt2020-10, just like it's able to recognize those flags on an AV1 stream. I mean, it should actually work out of the box with current technology, without any further update, in theory... but no one is doing it.
HLG is HDR-lite, however, and nominally requires 10-bit. The only reason anyone ever uses HLG is that the same bitstream can provide reasonably decent SDR and HDR. But bang-for-the-bit is always better doing native SDR or native PQ HDR if universal compatibility isn't required.

HEVC is the current standard for HDR because:
  1. HDR typically comes with 4K, and HEVC is >2x as efficient as H.264 at 4K resolutions. These are big files, so the lower bitrate is really helpful.
  2. Real (PQ) HDR needs some clever QP offset tuning for optimal results, which has only been well documented for HEVC. That's what x265's --hdr10-opt implements.
  3. All HDR capable CE devices support 10-bit HEVC decode, even though many of those only support H.264 up to 8-bit.

Sure, wacky things can be done with full-range mapping and dithering and all that. Heck, I could make HDR MPEG-2 with sufficient motivation and time. But in the end it's a lot of work and a lot more bits for worse results, with no real reason to bother. Essentially everything with an HDR tone mapper and display supports 10-bit HEVC decode. We'll upgrade to better things in the future, like VVC and/or AV1. But I can't imagine why anyone would practically benefit from using a sub-HEVC codec for HDR.
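For comparison with the HLG x264 sketch earlier in the thread, the PQ/HDR10 path in x265 looks roughly like this (a hedged sketch: the flags are current x265 options, but the mastering-display and MaxCLL numbers and the file names are just example values):

Code:
import subprocess

# Sketch of a 10-bit PQ (HDR10) x265 encode with static HDR metadata.
cmd = [
    "x265", "--output-depth", "10",
    "--colorprim", "bt2020",
    "--transfer", "smpte2084",      # smpte2084 = PQ
    "--colormatrix", "bt2020nc",
    "--master-display",
    "G(13250,34500)B(7500,3000)R(34000,16000)WP(15635,16450)L(10000000,1)",
    "--max-cll", "1000,400",
    "--hdr10", "--hdr10-opt",       # the QP-offset tuning mentioned above
    "input.y4m", "-o", "out.hevc",
]
subprocess.run(cmd, check=True)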
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book