Old 25th July 2020, 19:15   #1  |  Link
YaBoyShredderson
Registered User
 
Join Date: Jul 2020
Posts: 76
Transparent encodes of fake 4K Blu-rays

My goal when encoding is for the result to be indistinguishable from the source under my viewing conditions, with a bit of quality buffer should those conditions change (closer seating, larger TV, etc.).

I haven't started encoding my 4K Blu-rays just yet, only 1080p, as I want to get a Ryzen 4000 processor first to speed things up a tad. But I wanted to know: should I downsample them to 1080p?

Obviously true 4K I should leave as is, but with fake 4K (stuff that was mastered at 2K and then upscaled to 4K), should I leave that at the full 4K? Or will it be the same quality at 1080p, since that's roughly what it was mastered at? I would likely save a whole lot more space doing this.
Old 27th July 2020, 02:21   #2  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,750
Resolution above 1080p doesn't matter so much with SDR, but it absolutely can with HDR, where it lets sharp specular highlights be preserved.

There isn't always a hard line between fake and real 4K, as titles may combine 4K elements with 2K VFX shots. The amount of grain and motion blur has a big impact on potential detail as well.
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book
Old 27th July 2020, 07:25   #3  |  Link
scharfis_brain
brainless
 
 
Join Date: Mar 2003
Location: Germany
Posts: 3,653
2K-upscaled UHD Blu-rays also benefit from better chroma resolution (near 4:4:4 relative to the 2K image).

When downscaling a UHD Blu-ray with 2K content to 1080p, you reduce the chroma to a quarter of its original resolution.
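For concreteness, the plane arithmetic behind that claim, as a quick Python check:

Code:
# 4:2:0 chroma planes are half the luma resolution in each dimension
uhd_chroma = (3840 // 2, 2160 // 2)   # (1920, 1080) on a UHD disc
fhd_chroma = (1920 // 2, 1080 // 2)   # (960, 540) after a 1080p downscale

ratio = (fhd_chroma[0] * fhd_chroma[1]) / (uhd_chroma[0] * uhd_chroma[1])
print(ratio)   # 0.25 -- a quarter of the chroma resolution the disc carried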
__________________
Don't forget the 'c'!

Don't PM me for technical support, please.
Old 29th July 2020, 07:23   #4  |  Link
YaBoyShredderson
Registered User
 
Join Date: Jul 2020
Posts: 76
OK, makes sense; leaving them at 4K is best for me then. Thanks.
Old 1st August 2020, 10:11   #5  |  Link
foxyshadis
ангел смерти ("angel of death")
 
 
Join Date: Nov 2004
Location: Lost
Posts: 9,558
Before you make a choice that will hugely impact both encoding time and storage space, just watch a few titles downsampled to 1080p and then upscaled again to see if you can tell the difference. Some people can, many can't, and there's not much point if you can't.
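For anyone who wants to run that test, here is a minimal VapourSynth sketch of the round trip. The file name, the ffms2 source filter, and the Spline36 kernel are placeholder assumptions, not recommendations:

Code:
import vapoursynth as vs
core = vs.core

src = core.ffms2.Source('uhd_title.mkv')         # 3840x2160 source

down = core.resize.Spline36(src, 1920, 1080)     # the would-be 1080p encode
back = core.resize.Spline36(down, 3840, 2160)    # upscaled again for display

# Alternate original and round-tripped frames for A/B stepping in a previewer
core.std.Interleave([src, back]).set_output()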
Old 1st August 2020, 14:09   #6  |  Link
Boulder
Pig on the wing
 
 
Join Date: Mar 2002
Location: Finland
Posts: 5,718
Quote:
Originally Posted by foxyshadis View Post
Before you make a choice that will hugely impact both encoding time and storage space, just watch a few titles downsampled to 1080p and then upscaled again to see if you can tell the difference. Some people can, many can't, and there's not much point if you can't.
This. I use a method that relies on the clever tool Zopti to determine optimal (well, at least better than using the same values for all sources) b and c parameters for BicubicResize when downsampling. There's a huge difference compared to just using Spline36 or Lanczos. For native 4K I go to 1440p, and upscaled 2K goes down to 1080p.
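For reference, the same idea in VapourSynth terms, where AviSynth's BicubicResize b and c map to filter_param_a and filter_param_b. The values below are made-up placeholders; Zopti's whole point is to search for them per source:

Code:
import vapoursynth as vs
core = vs.core

src = core.ffms2.Source('native_4k.mkv')   # 3840x2160; path is a placeholder

# Hypothetical per-source values that a Zopti run might return
b, c = -0.4, 0.2
down = core.resize.Bicubic(src, 2560, 1440, filter_param_a=b, filter_param_b=c)
down.set_output()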
__________________
And if the band you're in starts playing different tunes
I'll see you on the dark side of the Moon...
Old 7th August 2020, 21:09   #7  |  Link
K.i.N.G
Registered User
 
Join Date: Aug 2009
Posts: 90
Scaling down to 1440p might be a good middle ground.
Old 11th August 2020, 01:13   #8  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,750
And note that even "uprezzed" 2K-to-4K titles don't use some fire-and-forget algorithm like bicubic. There is a lot of shot- and scene-based tweaking of parameters to get the best results. For major titles from major studios, it's more like a lightweight remastering. And if it was a DCI-P3 master getting converted to HDR, it really IS a remaster. What you'd get scaling back down to 1080p isn't going to be the same as the original 2K master (which was 2048 pixels wide, not 1920; ~14% more source pixels).

Note that a cinema "4K" projector is 4096 pixels wide, not the 3840 of home video/broadcast.
Old 14th August 2020, 23:37   #9  |  Link
FranceBB
Broadcast Encoder
 
 
Join Date: Nov 2013
Location: Royal Borough of Kensington & Chelsea, UK
Posts: 2,883
Quote:
Originally Posted by scharfis_brain View Post
2K-upscaled UHD Blu-rays also benefit from better chroma resolution (near 4:4:4 relative to the 2K image).

When downscaling a UHD Blu-ray with 2K content to 1080p, you reduce the chroma to a quarter of its original resolution.
Not necessarily. If you don't care about playback compatibility, the idea with a reverse-upscale filter is to invert the kernel on the luma (i.e. Debilinear, Debicubic, etc.) back to the original resolution while leaving the chroma as it is. You end up with a Full HD 4:4:4 10-bit file from an upscaled UHD 4:2:0 10-bit source, which is a win-win.
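A minimal VapourSynth sketch of that workflow, assuming the studio upscale was bilinear (the real kernel has to be verified per title, e.g. with a tool like getnative) and that the descale plugin is installed; the file name is a placeholder:

Code:
import vapoursynth as vs
core = vs.core

src = core.ffms2.Source('fake4k_uhd.mkv')        # 3840x2160 YUV420P10

# Luma: descale in 32-bit float, then dither back down to 10 bit
y = core.std.ShufflePlanes(src, planes=0, colorfamily=vs.GRAY)
y = core.resize.Point(y, format=vs.GRAYS)
y = core.descale.Debilinear(y, 1920, 1080)
y = core.resize.Point(y, format=vs.GRAY10, dither_type='error_diffusion')

# Chroma: the 4:2:0 planes of a 2160p clip are already 1920x1080,
# so they are reused untouched as full-resolution chroma for the 1080p clip
u = core.std.ShufflePlanes(src, planes=1, colorfamily=vs.GRAY)
v = core.std.ShufflePlanes(src, planes=2, colorfamily=vs.GRAY)

out = core.std.ShufflePlanes([y, u, v], planes=[0, 0, 0], colorfamily=vs.YUV)
out.set_output()                                 # 1920x1080 YUV444P10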
Old 20th August 2020, 02:22   #10  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,750
Quote:
Originally Posted by FranceBB View Post
Not necessarily. If you don't care about playback compatibility, the idea with a reverse-upscale filter is to invert the kernel on the luma (i.e. Debilinear, Debicubic, etc.) back to the original resolution while leaving the chroma as it is. You end up with a Full HD 4:4:4 10-bit file from an upscaled UHD 4:2:0 10-bit source, which is a win-win.
It would take VERY unusual content for there to be a visible difference between downscaling to 4:4:4 and 4:2:0: something like very sharp red text on a blue background. I don't believe I've seen actual TV/movie content where 4:2:0 at 1080p and above wasn't adequate.

Screen recordings, sure. Turn off ClearType before doing screen recordings, people! It messes up the chroma for people with different types of panels.
Old 20th August 2020, 07:50   #11  |  Link
scharfis_brain
brainless
 
 
Join Date: Mar 2003
Location: Germany
Posts: 3,653
Quote:
Originally Posted by benwaggoner View Post
I don't believe I've seen actual TV/movie content where 4:2:0 at 1080p and above wasn't adequate.
Actually, I see it all the time when the content is crisp enough. It bothers me.
That's why I try to use madVR's chroma reconstruction/upsampling most of the time I use the PC to play back movies.

Anyway, YUV 4:4:4 files won't be very compatible with smart TVs or Blu-ray players, so we're stuck with 4:2:0 chroma.
Old 24th August 2020, 18:40   #12  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,750
Quote:
Originally Posted by scharfis_brain View Post
Actually, I see it all the time when the content is crisp enough. It bothers me.
Can you give some examples where you see this?

Quote:
That's why I try to use madVR's chroma reconstruction/upsampling most of the time I use the PC to play back movies.
Is the problem perhaps in the display pipeline, not the encode? There are obviously different algorithms that could be used to go optimally from 4:2:0 to the display's RGB; nearest-neighbor is going to be a lot worse than even a basic bilinear.
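An easy way to see how much that pipeline choice can matter: convert the same 4:2:0 clip to RGB twice, with nearest-neighbor vs. bicubic resampling doing the chroma upsample, and step through the frames. A VapourSynth sketch as a crude stand-in for different display pipelines; the file name is a placeholder:

Code:
import vapoursynth as vs
core = vs.core

src = core.ffms2.Source('clip_420.mkv')   # any 4:2:0 source

# The resizer chosen for the YUV->RGB conversion also performs the chroma upsample
rgb_nearest = core.resize.Point(src, format=vs.RGB24, matrix_in_s='709')
rgb_bicubic = core.resize.Bicubic(src, format=vs.RGB24, matrix_in_s='709')

core.std.Interleave([rgb_nearest, rgb_bicubic]).set_output()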

Quote:
Anyway, YUV 4:4:4 files won't be very compatible with smart TVs or Blu-ray players, so we're stuck with 4:2:0 chroma.
Chroma sampling above 4:2:0 has never been used in a broadly supported distribution format, disc or streaming. 4:4:4 doubles memory buffer and bandwidth requirements, and increases bitrate some, and demonstrated gains have never been worth the extra cost.

Put another way, 1440p 4:2:0 is fewer samples-per-second than 1080p 4:4:4, and will look better for 4K source content.
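The per-frame sample counts behind that comparison:

Code:
# Samples per frame (Y plane + two chroma planes)
s_1440p_420 = 2560 * 1440 + 2 * (1280 * 720)   # 5,529,600
s_1080p_444 = 3 * (1920 * 1080)                # 6,220,800
print(s_1440p_420 < s_1080p_444)               # True: 1440p 4:2:0 is fewer samples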
Old 24th August 2020, 18:55   #13  |  Link
Cary Knoop
Cary Knoop
 
 
Join Date: Feb 2017
Location: Newark CA, USA
Posts: 397
Quote:
Originally Posted by benwaggoner View Post
4:4:4 doubles memory buffer and bandwidth requirements, and increases bitrate some, and demonstrated gains have never been worth the extra cost.
I would challenge the assertion that not using chroma subsampling increases the bitrate for the same perceptual quality.

It would be much more efficient for a codec to determine which perceptual reductions to make than to be fed an already chroma-subsampled source.

There are five things I believe need to change in video processing:

1. Move to a float32 code-value standard.
2. Abandon the difference between full and limited range (no clipping, while float 0.0 to 1.0 is always the visible range).
3. No more chroma subsampling (but allow it for legacy sources).
4. No more interlaced sources (but allow interlaced for legacy sources).
5. Allow multiple frame rates in the same source (each frame gets an absolute duration marker).

This will make life a lot simpler!

Last edited by Cary Knoop; 24th August 2020 at 19:07.
Old 24th August 2020, 22:49   #14  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,750
Quote:
Originally Posted by Cary Knoop View Post
I would challenge the assertion that not using chroma subsampling increases the bitrate for the same perceptual quality.

It would be much more efficient for a codec to determine which perceptual reductions to make than to be fed an already chroma-subsampled source.
I agree. And thanks to Moore's Law, memory becomes less expensive year on year.

Quote:
There are five things I believe need to change in video processing:

1. Move to a float32 code-value standard.
2. Abandon the difference between full and limited range (no clipping, while float 0.0 to 1.0 is always the visible range).
3. No more chroma subsampling (but allow it for legacy sources).
4. No more interlaced sources (but allow interlaced for legacy sources).
5. Allow multiple frame rates in the same source (each frame gets an absolute duration marker).

This will make life a lot simpler!
1-3 would be a HUGE increase in memory and memory-bandwidth requirements. Today, 8-bit 4:2:0 video is 12 bits/pixel; going to fully sampled float32 is an 8x increase in memory requirements!

Doing 1080p30 in your proposed format would have the same memory requirements as 4Kp60 today. Even 10-bit HDR would still be 6x more efficient using 4:2:0.
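The bits-per-pixel arithmetic, spelled out:

Code:
bpp_420_8bit  = 8 * (1 + 2 * 0.25)    # 12 bits/pixel: full-res Y + two quarter-res planes
bpp_444_f32   = 32 * 3                # 96 bits/pixel, fully sampled float32
bpp_420_10bit = 10 * (1 + 2 * 0.25)   # 15 bits/pixel
print(bpp_444_f32 / bpp_420_8bit)     # 8.0x
print(bpp_444_f32 / bpp_420_10bit)    # 6.4x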

4 is pretty much accomplished at this point. There is no interlacing happening at resolutions beyond 1080i or with HDR, and H.264 was the last codec really engineered for interlaced efficiency (HEVC doesn't have MBAFF, for example). Interlaced is already a shrinking legacy technology, thank goodness.

5. We've had media formats that just give each frame a duration for ages and ages; supporting a fixed frame rate was itself an innovation that didn't get broadly implemented until about 15 years ago. The real challenge with variable frame rate video is that technologies like HDMI don't handle it gracefully. Over a 60p connection, switching between 24 and 60 has intrinsic judder. Having everything 120 Hz would be a lot easier for NTSC countries; PAL countries end up mixing 24/25/30/50/60. The least common multiple of all the standard frame rates is 600. And that's not accounting for the NTSC 29.97/30 timing headache.
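Checking that multiple (math.lcm needs Python 3.9+; the 1001-denominator NTSC rates share no integer common multiple with the rest):

Code:
from math import lcm

print(lcm(24, 25, 30, 50, 60))   # 600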
Old 25th August 2020, 00:47   #15  |  Link
Cary Knoop
Cary Knoop
 
 
Join Date: Feb 2017
Location: Newark CA, USA
Posts: 397
Quote:
Originally Posted by benwaggoner View Post
1-3 would be a HUGE increase in memory and memory-bandwidth requirements. Today, 8-bit 4:2:0 video is 12 bits/pixel; going to fully sampled float32 is an 8x increase in memory requirements!
Not really. The floats are converted to an 8- or 10-bit stream, depending on the destination bit depth, during (or right after) decoding. Practically speaking, no commercial monitor can display more than 10 bits of accuracy, so there is no need to actually send float values to the monitor.

I believe it should be possible, when someone makes a documentary with mixed-framerate footage, not to worry about conforming everything to a single overall framerate, but to allow multiple segments. There is absolutely no technical reason why modern monitors cannot switch framerates on the fly (or get a change signal a few frames ahead).
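For illustration only, one way to picture that per-frame timing model: exact rational durations per frame, so 24 fps and 60 fps segments can coexist on one timeline (containers like Matroska already allow per-frame timestamps):

Code:
from fractions import Fraction
from itertools import accumulate

interview = [Fraction(1, 24)] * 48    # 2 s of 24 fps footage
b_roll    = [Fraction(1, 60)] * 120   # 2 s of 60 fps footage
durations = interview + b_roll

# Absolute presentation time of each frame = running sum of durations
pts = [Fraction(0), *accumulate(durations)]
print(float(pts[-1]))                 # 4.0 -- total timeline length in seconds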

HDMI is an awful, protectionist, and very limited technology.
Old 25th August 2020, 02:13   #16  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
Adaptive sync for gaming is not a special feature anymore; HDMI can do it, but it's rarely used with HDMI displays. Usually only a range of 48-60 Hz is present, which means a practical range of 50-58 Hz; in short, not useful at all yet.

HDMI 1.3 was always able to do 1080p 60 Hz at 12 bit, TVs were able to accept that type of signal, and it was used on PCs with consumer-grade hardware.

HDMI 2.0 is the first widely used version that couldn't do the target refresh rate/resolution at 12 bit; for 23p, 12 bit is a default feature.

If I take into consideration that every TV I have tested in my life, except one, performs much worse in terms of banding when 10 or more bits are sent to the display instead of 8 bit, I don't even know what the point of it is. Guess what: my new X900H gets "destroyed" when 10 bit is sent to it instead of 8 bit, and I wasn't expecting anything else.

Hasn't it been proven that downscaling is a terrible compression algorithm, and that 4:4:4 with a higher chroma QP offset setting is "generally" better for quality than 4:2:0, even for luma, because more bandwidth is spent on luma?

The chroma scalers in madVR weren't created for fun, and they were discussed with real-world examples. If I really have to I can hunt some down... or we can just take an esports game; gaming is already bigger than Hollywood, so there's lots and lots of content where 4:4:4 matters a lot.

Pretty much every new non-low-end LG TV can do, or is planned to do, 4K 120 Hz 4:4:4.

Just for fun: NVIDIA can do 4:4:4 hardware encoding. It's limited, but it can do it.

Modern computers are very good at 16-256 bit, so I don't see a huge issue here. And aren't people already doing this with AviSynth and VapourSynth? We are talking about processing at 16-32 bit, right, not encoding?
Old 26th August 2020, 18:34   #17  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,750
Quote:
Originally Posted by Cary Knoop View Post
Not really. The floats are converted to an 8- or 10-bit stream, depending on the destination bit depth, during (or right after) decoding. Practically speaking, no commercial monitor can display more than 10 bits of accuracy, so there is no need to actually send float values to the monitor.
It is still memory and bandwidth required in the decoder itself, even if the output is always normalized to 10-bit.

Quote:
I believe it should be possible, when someone makes a documentary with mixed-framerate footage, not to worry about conforming everything to a single overall framerate, but to allow multiple segments. There is absolutely no technical reason why modern monitors cannot switch framerates on the fly (or get a change signal a few frames ahead).
Variable-frame-rate monitors for gaming have been around for a few years, and we're seeing those techs make it into some televisions. But that only works when all the devices support it. I don't know of any media playback solution that's tried to leverage variable frame rate at all. For pretty much all consumer delivery, the masters have to be conformed to a specific frame rate to be used broadly.

It's a fine idea, and I could see something like the PS5 or XBox Series X being able to support it eventually. It might "just work" under a few specific high-end Windows gaming setups. But that'd be <0.1% of the installed base at best, today.

Sounds like a potential SMPTE spec.

Quote:
HDMI is an awful, protectionist, and very limited technology.
And startlingly undertested for performance. Most interoperability testing stops at "I saw and heard something," not "Did 4K HDR with Atmos work correctly without having to fiddle with menu settings?"
Old 26th August 2020, 20:00   #18  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
Does 4:4:4 decoding really cost more silicon when a hardware decoder can do 8K 4:2:0 while 4:4:4 is limited to 4K in this case?

BTW, FreeSync and G-Sync screens work out of the box with pretty old hardware, and even entry-level screens often support them (though on TVs it's mostly useless).
And players seem to support it too; a high-end gaming PC is clearly not needed.
https://github.com/mpv-player/mpv/issues/6137

https://forums.blurbusters.com/viewtopic.php?t=3509
Old 27th August 2020, 18:03   #19  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,750
Quote:
Originally Posted by huhn View Post
Does 4:4:4 decoding really cost more silicon when a hardware decoder can do 8K 4:2:0 while 4:4:4 is limited to 4K in this case?
I don't think it'd be a material difference. Although any potential visible benefit of more than 4:2:0 is going to be even smaller at 4K.

Quote:
BTW, FreeSync and G-Sync screens work out of the box with pretty old hardware, and even entry-level screens often support them (though on TVs it's mostly useless).
And players seem to support it too; a high-end gaming PC is clearly not needed.
https://github.com/mpv-player/mpv/issues/6137

https://forums.blurbusters.com/viewtopic.php?t=3509
Well, that is cool. I will look more into it.

But the broader ecosystem challenge is that there are plenty of devices that can't play the content back, and not much content exists that could truly take advantage of a variable frame rate. Most titles that used sources with different frame rates had those assets conformed to the project's fps before editing. Getting end-to-end VFR working would require big overhauls to editing and content creation software upstream and then a new generation of content using those technologies.

Still, it'd make it possible to have the 48p versions of The Hobbit movies work on home video, which hasn't been possible to date as 48 fps support is far from universal, and isn't a broadcast or HDMI standard.
Old 27th August 2020, 20:33   #20  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
For 1080p it's pretty easy to show a "major" difference between chroma scalers.

For UHD there is no 4:4:4 source I know of, so I have to take game footage to show the difference, which again is not that hard. Games are simply different from movies.

Tags
2160p, bluray, staxrip, upscale, x265
