Old 26th March 2019, 19:54   #1  |  Link
Forteen88
Herr
 
Join Date: Apr 2009
Location: North Europe
Posts: 556
x265: When to lower the resolution?

I wonder: if I encode video with x265 and want to keep the image quality as good as possible, how do I know when it's better to use a lower resolution, e.g. dropping from 1080p to 720p, to save bitrate? To put it another way: isn't it better, for example, to make a 720p encode than a bitrate-starved 1080p encode (both encodes at the same bitrate)?

Can I use "Avg QP" in x265 to determine that?
Thanks

Old 26th March 2019, 20:14   #2  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
I do it by eye now: if it starts looking over-compressed, I lower the resolution.

Finding an Avg QP that normally looks OK to you is probably a good rule of thumb, but nothing beats looking at it.
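
For example, something along these lines (just a sketch in Python; it assumes the standalone x265 CLI on your PATH, a Y4M test clip, and a completely made-up QP threshold of 30, so substitute whatever value normally looks OK to you):

Code:
# Encode a short sample at the target bitrate and read the "Avg QP" value
# x265 prints in its end-of-encode summary line.
# Assumptions: standalone x265 binary, a Y4M clip, and an illustrative
# QP threshold of 30; tune it to your own eyes.
import re
import subprocess

TARGET_KBPS = 2500
QP_THRESHOLD = 30.0  # hypothetical "starts to look over-compressed" point

cmd = ["x265", "--input", "sample_1080p.y4m",
       "--bitrate", str(TARGET_KBPS), "--output", "test.hevc"]
result = subprocess.run(cmd, capture_output=True, text=True)
log = result.stdout + result.stderr  # the summary goes to the console

match = re.search(r"Avg QP:\s*([\d.]+)", log)
if match:
    avg_qp = float(match.group(1))
    print(f"Avg QP at {TARGET_KBPS} kbps: {avg_qp:.2f}")
    if avg_qp > QP_THRESHOLD:
        print("QP looks high for this content; consider trying 720p instead.")
else:
    print("Could not find 'Avg QP' in the x265 output.")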
__________________
madVR options explained
Old 27th March 2019, 22:02   #3  |  Link
Blue_MiSfit
Derek Prestegard IRL
 
Join Date: Nov 2003
Location: Los Angeles
Posts: 5,989
In general, yes, there comes a point where you're better off reducing resolution.

However, with HEVC that point is often lower than you think, because some of its features (like much bigger block sizes in general) make it degrade more gracefully at high resolutions.

Careful testing is the only way to figure out what works for you. Netflix's approach is to use their VMAF metric for content-adaptive encoding, dynamically adjusting the ABR ladder based on source complexity. A user doing an encode for offline playback could apply the same principle, but it would probably be overly complicated.
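
As a rough sketch of that idea for a single title (assuming an ffmpeg build with libvmaf and libx265; the file names, bitrate, and candidate resolutions are just placeholders), you could encode the same clip at two resolutions for the same bitrate, upscale both back to the source resolution, and keep whichever scores higher on VMAF:

Code:
# Compare a 1080p and a 720p encode of the same source at the same bitrate
# using VMAF, then keep whichever scores higher.
# Assumptions: ffmpeg built with libvmaf and libx265, illustrative file
# names and bitrate.
import json
import subprocess

SOURCE = "source_1080p.mkv"
BITRATE = "2500k"

def encode(height: int, out: str) -> None:
    subprocess.run([
        "ffmpeg", "-y", "-i", SOURCE,
        "-vf", f"scale=-2:{height}",
        "-c:v", "libx265", "-b:v", BITRATE,
        "-an", out,
    ], check=True)

def vmaf(distorted: str, log_path: str) -> float:
    # Upscale the distorted encode back to 1080p before comparing to the source.
    subprocess.run([
        "ffmpeg", "-i", distorted, "-i", SOURCE,
        "-lavfi",
        "[0:v]scale=1920:1080:flags=bicubic[d];"
        f"[d][1:v]libvmaf=log_fmt=json:log_path={log_path}",
        "-f", "null", "-",
    ], check=True)
    with open(log_path) as f:
        # JSON layout matches recent libvmaf builds; adjust if yours differs.
        return json.load(f)["pooled_metrics"]["vmaf"]["mean"]

encode(1080, "enc_1080p.mkv")
encode(720, "enc_720p.mkv")
print("1080p VMAF:", vmaf("enc_1080p.mkv", "vmaf_1080.json"))
print(" 720p VMAF:", vmaf("enc_720p.mkv", "vmaf_720.json"))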

Eyeball it.

I'd generally drop down to 720p when you get below 2-3 Mbps, but that's just a very broad starting point.
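
To put a very rough number on that (purely illustrative arithmetic, not a standard threshold): at the same bitrate, dropping from 1080p to 720p gives each pixel about 2.25x as many bits per frame:

Code:
# Bits per pixel per frame at a given bitrate: a crude way to compare how
# "starved" different resolutions are. The numbers below are illustrative.
def bits_per_pixel(kbps: float, width: int, height: int, fps: float) -> float:
    return kbps * 1000 / (width * height * fps)

for w, h in [(1920, 1080), (1280, 720)]:
    bpp = bits_per_pixel(2500, w, h, 24)
    print(f"{w}x{h} @ 2500 kbps, 24 fps: {bpp:.3f} bpp")
# Roughly 0.050 bpp at 1080p vs 0.113 bpp at 720p for the same 2.5 Mbps.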
Old 29th March 2019, 10:16   #4  |  Link
K.i.N.G
Registered User
 
Join Date: Aug 2009
Posts: 90
Also, for those backing up their UHD discs: many of those are upscaled from 2K masters, so you might be better off scaling them back down to 1080p. Depending on your bitrate, it won't look much different.
Old 29th March 2019, 21:04   #5  |  Link
chainring
Registered User
 
Join Date: Sep 2006
Posts: 176
Quote:
Originally Posted by K.i.N.G View Post
Also, for those backing up their UHD discs: many of those are upscaled from 2K masters, so you might be better off scaling them back down to 1080p. Depending on your bitrate, it won't look much different.
Timely, as I was just looking around for a definitive (read: reliable) source that lists true 4K movie releases vs. upscaled.

Also, would 2K be considered Cinema 2K, which is 2048 on the horizontal? If so, that's barely better than 1920x1080, which seems kinda silly.
Old 30th March 2019, 05:44   #6  |  Link
Blue_MiSfit
Derek Prestegard IRL
 
Join Date: Nov 2003
Location: Los Angeles
Posts: 5,989
Most VFX movies are mastered at 2K (even if the live action is acquired at 4K or more) because rendering VFX is so insanely expensive that it's just not reasonable to spend 4x (or more) the compute to render at a higher resolution.

So, they do 2k rendering, master in 2k, and then upscale using very good tools. The results are good, but... yeah. There are some real 4k movies out there though.

https://4kmedia.org/real-or-fake-4k/

I'm sure this will change eventually, but for now, if the VFX budget for a film gets increased, they'd (wisely) spend it on higher-quality VFX rather than higher-resolution VFX.
Old 30th March 2019, 19:36   #7  |  Link
chainring
Registered User
 
Join Date: Sep 2006
Posts: 176
Just as rendering VFX at 4K is expensive, both in time and money, converting 4K discs to 4K for personal use may not be worth the compute time if the source was 2K.

I'm using an Nvidia Shield hooked up to a 55" LG B8, and a really good 1080p conversion looks great. With that in mind, I'm going to try some 4K > 2560 (on the horizontal) conversions and see how they look. Why 2560? Well, 2K Cinema seems barely worth the increase over 1080p, and I'm coming down from 4K, so 2560 may be a good middle-ground compromise between resolution, conversion speed, and file size. I've already seen a speed-up of around 3x doing this, and I can raise the CRF and still end up with a massively reduced file size. Experiments... begin!
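
In case it's useful to anyone, here's roughly what that experiment looks like as an ffmpeg/libx265 call (just a sketch; the file names, CRF value, and preset are placeholders, and HDR metadata handling is left out):

Code:
# Downscale a 2160p source to 2560 pixels wide (2560x1440 for 16:9) and
# encode with libx265. Assumptions: ffmpeg built with libx265, and
# illustrative file names / CRF / preset.
import subprocess

subprocess.run([
    "ffmpeg", "-i", "movie_2160p.mkv",
    "-vf", "scale=2560:-2:flags=lanczos",  # keep aspect ratio, even height
    "-c:v", "libx265", "-crf", "20", "-preset", "slow",
    "-c:a", "copy",                        # pass the audio through untouched
    "movie_1440p.mkv",
], check=True)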

Old 1st April 2019, 03:52   #8  |  Link
Blue_MiSfit
Derek Prestegard IRL
 
Join Date: Nov 2003
Location: Los Angeles
Posts: 5,989
1440p is indeed a great compromise IMO
Old 2nd April 2019, 01:14   #9  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,771
Quote:
Originally Posted by Blue_MiSfit View Post
1440p is indeed a great compromise IMO
I fully agree, although there are some early UHD TVs that don't support 1440p.
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book
Old 2nd April 2019, 03:44   #10  |  Link
Blue_MiSfit
Derek Prestegard IRL
 
Join Date: Nov 2003
Location: Los Angeles
Posts: 5,989
Wouldn't basically any client output a 2160p signal from a 1440p ABR layer, including a built-in smart TV app?
Old 2nd April 2019, 16:42   #11  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,771
Quote:
Originally Posted by Blue_MiSfit View Post
Wouldn't basically any client output a 2160p signal from a 1440p ABR layer, including a built-in smart TV app?
If it's a Smart TV app, the output of the decoder goes straight to the compositor, so if 1440p isn't supported somewhere along that path, it won't work. There were some devices in the early UHD days that didn't support it; I don't have a sense of how common this problem still is.
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book
Old 2nd April 2019, 21:51   #12  |  Link
Blue_MiSfit
Derek Prestegard IRL
 
Join Date: Nov 2003
Location: Los Angeles
Posts: 5,989
Got it, that does make sense.