26th March 2019, 19:54 | #1 | Link |
Herr
Join Date: Apr 2009
Location: North Europe
Posts: 556
|
x265: When to lower the resolution?
I wonder: if I encode video with x265 and want to keep the image quality as good as possible, how do I know when it's better to use a lower resolution, e.g. 720p instead of 1080p, to save bitrate? To put it another way: isn't it better, for example, to make a 720p encode than a bitrate-starved 1080p encode (both encodes at the same bitrate)?
Can I use "Avg QP" in x265 to determine that? Thanks
Last edited by Forteen88; 28th March 2019 at 09:56. Reason: spell-correction |
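For anyone who wants to script the Avg QP check: x265 prints an "Avg QP" figure in its console summary at the end of an encode, and a small sketch like this could pull it out for comparison against a personal threshold. The sample log line and the regex are assumptions about the summary format, which can vary by build:

```python
import re

def parse_avg_qp(log_text):
    """Extract the Avg QP value from an x265 console summary.
    Assumes the usual 'Avg QP:xx.xx' wording; may vary by build."""
    m = re.search(r"Avg QP:\s*([0-9]+(?:\.[0-9]+)?)", log_text)
    return float(m.group(1)) if m else None

# Hypothetical summary line, for illustration only:
summary = "encoded 240 frames in 12.34s (19.45 fps), 2500.10 kb/s, Avg QP:31.25"
avg_qp = parse_avg_qp(summary)
```

If the Avg QP of a 1080p test encode comes out well above the value that normally looks OK to you, that could be a signal to try 720p at the same bitrate.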
26th March 2019, 20:14 | #2 | Link |
Registered User
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
|
I do it by eye now: if it starts looking over-compressed, I lower the resolution.
Finding an Avg QP that normally looks OK to you is probably a good rule of thumb, but nothing beats looking at it.
__________________
madVR options explained |
27th March 2019, 22:02 | #3 | Link |
Derek Prestegard IRL
Join Date: Nov 2003
Location: Los Angeles
Posts: 5,989
|
In general, yes, there comes a point where you're better off reducing resolution.
However, with HEVC that point is often lower than you'd think, because some of its features (like much bigger block sizes) make it degrade less badly at high resolution. Careful testing is the only way to figure out what works for you.
The approach taken by Netflix is to use their VMAF metric for content-adaptive encoding, dynamically adjusting the ABR ladder based on source complexity. A user doing an encode for offline playback could apply the same principle, but it would probably be overly complicated. Eyeball it. I'd generally drop down to 720p when you get below 2-3 Mbps, but that's just a very broad starting point. |
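The "eyeball it" advice plus the 2-3 Mbps rule of thumb can be sketched as a bits-per-pixel check. Note that the ladder and the 0.045 bpp floor are illustrative assumptions, tuned only to roughly match the 2-3 Mbps suggestion at 24 fps; they are not an established standard:

```python
def bits_per_pixel(bitrate_bps, width, height, fps):
    """Average bits available per pixel per frame."""
    return bitrate_bps / (width * height * fps)

def suggest_resolution(bitrate_bps, fps=24.0, min_bpp=0.045):
    """Walk a simple resolution ladder top-down and keep the first
    rung whose bits-per-pixel stays above an assumed quality floor."""
    ladder = [(1920, 1080), (1280, 720), (960, 540)]
    for w, h in ladder:
        if bits_per_pixel(bitrate_bps, w, h, fps) >= min_bpp:
            return (w, h)
    return ladder[-1]  # bitrate-starved even at the lowest rung
```

With these numbers, 5 Mbps stays at 1080p while 1.5 Mbps drops to 720p. In practice the floor depends heavily on content and encoder settings, which is why careful testing (or a metric like VMAF) still wins.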
29th March 2019, 21:04 | #5 | Link | |
Registered User
Join Date: Sep 2006
Posts: 176
|
Quote:
Also, would 2K be considered Cinema 2K, which is 2048 on the horizontal? If so, that's barely better than 1920x1080, which seems kinda silly. |
|
30th March 2019, 05:44 | #6 | Link |
Derek Prestegard IRL
Join Date: Nov 2003
Location: Los Angeles
Posts: 5,989
|
Most VFX movies are mastered at 2K (even if the live action is acquired at 4K or more) because rendering VFX is so insanely expensive that it's just not reasonable to spend 4x (or more) the compute on rendering.
So they render in 2K, master in 2K, and then upscale using very good tools. The results are good, but... yeah. There are some real 4K movies out there, though: https://4kmedia.org/real-or-fake-4k/
I'm sure this will change eventually, but for now, if the VFX budget for a film gets increased, they'd (wisely) spend it on higher-quality VFX instead of higher-resolution VFX. |
30th March 2019, 19:36 | #7 | Link |
Registered User
Join Date: Sep 2006
Posts: 176
|
Just as rendering VFX at 4K is expensive, both in time and money, it seems the compute time to convert 4K discs to 4K for personal use may not be worth it if the source was 2K.
I'm using an Nvidia Shield hooked up to an LG B8 55", and a really good 1080p conversion looks great. With that in mind, I'm going to try some 4K > 2560 (on the horizontal) conversions and see how they look. Why 2560? Well, 2K Cinema is barely worth the increase over 1080p, and I'm coming down from 4K, so 2560 may be a good middle-ground compromise between resolution, conversion speed, and file size. I've already seen a speedup of around 3x doing this, and I can raise the CRF and still end up with a massively reduced file size. Experiments... begin!
Last edited by chainring; 1st April 2019 at 18:47. Reason: spelling correction |
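For a 16:9 source, the 4K-to-2560 downscale described above works out to 2560x1440. A quick sketch of the arithmetic (the rounding to a multiple of 2 is there because 4:2:0 encodes need even dimensions):

```python
def scale_to_width(src_w, src_h, target_w, mod=2):
    """Scale to a target width, preserving aspect ratio and rounding
    the height down to a multiple of `mod` (even dims for 4:2:0)."""
    h = round(target_w * src_h / src_w)
    return target_w, h - (h % mod)

def pixel_ratio(src_w, src_h, dst_w, dst_h):
    """How many times fewer pixels the destination frame has."""
    return (src_w * src_h) / (dst_w * dst_h)
```

Here `scale_to_width(3840, 2160, 2560)` gives `(2560, 1440)`, i.e. 2.25x fewer pixels than 3840x2160, which goes some way toward explaining the roughly 3x encode speedup reported above.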
2nd April 2019, 16:42 | #11 | Link |
Moderator
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,771
|
If it's a Smart TV app, the output of the decoder goes straight to the compositor. If for some reason 1440p isn't supported somewhere, it won't work, and there were some devices that didn't support it in the early UHD days. I don't have a sense of how common this problem still is.