Old 18th March 2015, 08:06   #1  |  Link
mike23
Registered User
 
Join Date: May 2003
Posts: 107
Better quality at same bitrate: higher vs. lower resolution?

Assume I want to rip/re-encode a Blu-ray movie to a smaller video file (e.g. *.mkv or *.mp4).

The original movie has a resolution of about 2560x1440.

I decide that a target bitrate of 2000 kbps should be sufficient for the resulting video.

Ok so far.

Now I wonder which target resolution I should choose:
leave it at the original resolution, or shrink it to 1920x1080.

Keep in mind the bitrate is fixed!

If I choose the bigger resolution, then each pixel gets a smaller share of the bits as content information. Therefore the quality per pixel would be worse than at the smaller resolution.
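For a rough sense of the numbers behind this intuition, here is the bits-per-pixel arithmetic as a small Python sketch (the 24 fps frame rate is an assumption; the post doesn't state one):

Code:
# Bits per pixel at a fixed 2000 kbps budget for the candidate resolutions.
# Assumes 24 fps film content (an assumption; not stated in the post).
BITRATE = 2000 * 1000  # bits per second
FPS = 24

for w, h in [(2560, 1440), (1920, 1080), (1280, 720)]:
    bpp = BITRATE / (w * h * FPS)
    print(f"{w}x{h}: {bpp:.3f} bits per pixel")

# 2560x1440: 0.023 bits per pixel
# 1920x1080: 0.040 bits per pixel
# 1280x720:  0.090 bits per pixel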

So what is recommended in general:
encode more pixels (= higher resolution) at worse per-pixel quality, or fewer pixels (= lower resolution) at better quality (and upscale the video later at playback time)?

Is there much difference at all?
mike23 is offline   Reply With Quote
Old 18th March 2015, 08:31   #2  |  Link
stax76
Registered User
 
stax76's Avatar
 
Join Date: Jun 2002
Location: On thin ice
Posts: 6,837
What's the point of using a fixed bitrate here instead of CRF? For something as low as 2000 kbps, 720p should do much better.
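In ffmpeg/libx264 terms, the two rate-control modes stax76 is contrasting look roughly like this (a minimal sketch; the file names and the CRF value of 20 are illustrative, not from the thread):

Code:
# Minimal sketch of the two rate-control approaches via ffmpeg/libx264.
# File names and the CRF value are illustrative, not from the thread.
import subprocess

SRC = "source.mkv"  # hypothetical input file

# CRF: pick a quality level and let the file size land where it may.
subprocess.run(["ffmpeg", "-y", "-i", SRC, "-c:v", "libx264",
                "-preset", "slow", "-crf", "20", "crf.mkv"], check=True)

# Fixed average bitrate: two passes so the 2000 kbps budget is spent well.
rate = ["-c:v", "libx264", "-preset", "slow", "-b:v", "2000k"]
subprocess.run(["ffmpeg", "-y", "-i", SRC] + rate +
               ["-pass", "1", "-an", "-f", "null", "/dev/null"], check=True)
subprocess.run(["ffmpeg", "-y", "-i", SRC] + rate +
               ["-pass", "2", "abr.mkv"], check=True)
# (On Windows, write the first pass to NUL instead of /dev/null.)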
stax76 is offline   Reply With Quote
Old 18th March 2015, 10:03   #3  |  Link
mike23
Registered User
 
Join Date: May 2003
Posts: 107
What does 720p mean?

You mean a lower resolution?
mike23 is offline   Reply With Quote
Old 18th March 2015, 11:02   #4  |  Link
kypec
User of free A/V tools
 
kypec's Avatar
 
Join Date: Jul 2006
Location: SK
Posts: 826
Quote:
Originally Posted by mike23 View Post
What does 720p mean?

You mean a lower resolution?
Yes, 720p generally means a frame height of up to 720 pixels; with a 16:9 aspect ratio and square pixels you'd get a video resolution of 1280x720 pixels.
I wonder, though, what Blu-ray source could give you a 2560x1440 resolution... current BD standards (no HEVC yet) support resolutions only up to 1920x1080.
kypec is offline   Reply With Quote
Old 18th March 2015, 14:55   #5  |  Link
Sharc
Registered User
 
Join Date: May 2006
Posts: 3,997
Just to add that the "p" stands for "progressive" (as opposed to "i" which would mean "interlaced").
Sharc is offline   Reply With Quote
Old 18th March 2015, 19:11   #6  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,750
Quote:
Originally Posted by mike23 View Post
Assume I want to rip/re-encode a Blu-ray movie to a smaller video file (e.g. *.mkv or *.mp4).

The original movie has a resolution of about 2560x1440.
I'm confused. If it's something you're ripping from a Blu-ray video disc, it'll be 1920x1080 max. Or are you trying to make a Blu-ray? I don't know of any format or device that supports 2560x1440 as a media format. Many UHD TVs don't handle 2560x1440 well, even if they do 3840x2160 perfectly. It would probably play on a PC or tablet with the right hardware, but that would be iffy; 2560x1440 requires Level 5.0, and so has the same HW decode requirements as full UHD.

Are you just talking about BD-ROM as a storage medium? In which case "ripping" isn't the right term.

In general, a 2 Mbps ABR would be better suited to 720p than 1080p for typical film/video content. Especially if you need BD compatibility, which imposes some other restrictions on stream parameters. I've done anime down to 1080p24 at a 2 Mbps ABR with good results, though.
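The Level 5.0 point can be checked against the frame-size limit in the H.264 level table. A sketch using the MaxFS values from Table A-1 of the spec (it judges by frame size alone; the per-second macroblock limits also pass at 24 fps for these resolutions):

Code:
# Which H.264 level a frame size needs, judged by MaxFS alone: the
# maximum frame size in macroblocks from Table A-1 of the H.264 spec.
# (Levels also cap macroblocks per second; at 24 fps these resolutions
# fit the same levels, so MaxFS is the binding limit here.)
MAXFS = {"4.0": 8192, "4.1": 8192, "4.2": 8704, "5.0": 22080, "5.1": 36864}

def macroblocks(width, height):
    # A macroblock covers 16x16 luma samples; dimensions round up.
    return ((width + 15) // 16) * ((height + 15) // 16)

for w, h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    mbs = macroblocks(w, h)
    lowest = min(lvl for lvl, cap in MAXFS.items() if mbs <= cap)
    print(f"{w}x{h}: {mbs} macroblocks -> needs level {lowest}")

# 1920x1080:  8160 macroblocks -> needs level 4.0
# 2560x1440: 14400 macroblocks -> needs level 5.0
# 3840x2160: 32400 macroblocks -> needs level 5.1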
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book
benwaggoner is offline   Reply With Quote
Old 18th March 2015, 19:19   #7  |  Link
CarlEdman
Registered User
 
Join Date: Jan 2008
Posts: 185
Quote:
Originally Posted by mike23 View Post
Encode with more pixels (=resolution) in worse quality or less pixels (=resolution) in better quality (and expanding/scaling video later at watch time)?
As a general rule, I would recommend always staying with the source resolution and picking your other encoding settings to reach the bit-rate target by which you are bound.

The reason is that down-sampling is a form of lossy compression, just a very dumb one--effectively throwing away the high-frequency components of your visual signal. H.264 in general (and x264 in particular) is smart compression. It can at times and places throw out high-frequency components, just like down-sampling would, but it also has many other options that will often preserve detail that would be lost by plain down-sampling. Usually, you are better off doing the entire compression with one smart algorithm, rather than strapping a dumb compression algorithm (down-sampling) in front of your smart compression algorithm (H.264).

Sure, you will lose some fine detail by picking a lower encode setting. But so you will when down-sampling, and usually more.

The only exceptions to this general rule that I can think of are:

(a) Situations where you know that the output resolution will be lower anyway, so nothing would be gained by keeping the higher-frequency components.

(b) Situations where the encode or decode is so severely time-constrained that you cannot satisfactorily run x264 on the full sample.

(c) Extreme low-bit-rate situations, where the H.264 per-block overhead becomes significant.

In all other situations, run the full image through x264 and deal with bitrate elsewhere.
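As an A/B experiment under the same 2000 kbps budget, the two strategies could be set up like this (a sketch; the file names are made up, and Lanczos is just one reasonable resampler choice):

Code:
# The two strategies compared above, at the same 2000 kbps budget.
# File names are made up; Lanczos is one reasonable resampler choice.
import subprocess

SRC = "source.mkv"  # hypothetical input file
rate = ["-c:v", "libx264", "-preset", "slow", "-b:v", "2000k"]

# (1) Let x264, the "smart" compressor, see the full-resolution picture.
subprocess.run(["ffmpeg", "-y", "-i", SRC] + rate + ["full_res.mkv"],
               check=True)

# (2) "Dumb" lossy step first: downsample to 720p, then encode.
subprocess.run(["ffmpeg", "-y", "-i", SRC,
                "-vf", "scale=1280:720:flags=lanczos"] + rate +
               ["downscaled.mkv"], check=True)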
CarlEdman is offline   Reply With Quote
Old 19th March 2015, 02:08   #8  |  Link
xooyoozoo
Registered User
 
Join Date: Dec 2012
Posts: 197
3GPP has done some subjective testing of resolution vs. bitrate.

xooyoozoo is offline   Reply With Quote
Old 22nd March 2015, 11:04   #9  |  Link
Boulder
Pig on the wing
 
Boulder's Avatar
 
Join Date: Mar 2002
Location: Finland
Posts: 5,718
Quote:
Originally Posted by CarlEdman View Post
As a general rule, I would recommend always staying with the source resolution and picking your other encoding settings to reach the bit-rate target by which you are bound.

The reason is that down-sampling is a form of lossy compression, just a very dumb one--effectively throwing away the high-frequency components of your visual signal. H.264 in general (and x264 in particular) is smart compression. It can at times and places throw out high-frequency components, just like down-sampling would, but it also has many other options that will often preserve detail that would be lost by plain down-sampling. Usually, you are better off doing the entire compression with one smart algorithm, rather than strapping a dumb compression algorithm (down-sampling) in front of your smart compression algorithm (H.264).

Sure, you will lose some fine detail by picking a lower encode setting. But so you will when down-sampling, and usually more.

The only exceptions to this general rule that I can think of are:

(a) Situations where you know that the output resolution will be lower anyway, so nothing would be gained by keeping the higher-frequency components.

(b) Situations where the encode or decode is so severely time-constrained that you cannot satisfactorily run x264 on the full sample.

(c) Extreme low-bit-rate situations, where the H.264 per-block overhead becomes significant.

In all other situations, run the full image through x264 and deal with bitrate elsewhere.
There are many cases in which the source material doesn't have any detail to lose when downsizing to 720p, apart from some loss of sharpness when upsizing back to 1080p for display on the TV. I always do a comparison (with slight sharpening to compensate), and there are not that many cases that have remained 1080p.
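Boulder's round-trip comparison could be set up along these lines (a sketch; the unsharp strength is a guess, since the post gives no exact settings):

Code:
# Round-trip through 720p, lightly sharpen on the way back up, and keep
# the output lossless so only the scaling and sharpening show.
import subprocess

SRC = "source_1080p.mkv"  # hypothetical input file

subprocess.run(["ffmpeg", "-y", "-i", SRC, "-vf",
                "scale=1280:720:flags=lanczos,"
                "scale=1920:1080:flags=lanczos,"
                "unsharp=5:5:0.5",              # mild sharpening (guessed)
                "-c:v", "libx264", "-qp", "0",  # lossless x264 output
                "roundtrip.mkv"], check=True)
# Compare roundtrip.mkv against the source; if they are hard to tell
# apart, little detail would be lost by encoding at 720p.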
__________________
And if the band you're in starts playing different tunes
I'll see you on the dark side of the Moon...
Boulder is offline   Reply With Quote
Old 22nd March 2015, 12:17   #10  |  Link
kuchikirukia
Registered User
 
Join Date: Oct 2014
Posts: 476
Quote:
Originally Posted by CarlEdman View Post
The reason is that down-sampling is a form of lossy compression, just a very dumb one--effectively throwing away the high-frequency components of your visual signal. H.264 in general (and x264 in particular) is smart compression. It can at times and places throw out high-frequency components, just like down-sampling would, but it also has many other options that will often preserve detail that would be lost by plain down-sampling. Usually, you are better off doing the entire compression with one smart algorithm, rather than strapping a dumb compression algorithm (down-sampling) in front of your smart compression algorithm (H.264).

Sure, you will lose some fine detail by picking a lower encode setting. But so you will when down-sampling, and usually more.
If the lower resolution is sufficient, it doesn't get much smarter than that.
Encoders do not always pick the best bit distribution, and the resulting peaks and valleys in quality can be worse than a uniform decrease in resolution. Downscaling may remove some places where the encoder would have been smart, but it also takes away places where it would have been stupid.

Unneeded resolution is just noise.
kuchikirukia is offline   Reply With Quote
Old 30th December 2019, 21:39   #11  |  Link
HEVCer
Registered User
 
Join Date: Dec 2019
Posts: 1
Hello everybody,

First of all, sorry for replying to this old thread, but it came up when I googled "bitrate" and "resolution", and I think that mike23's question will never become obsolete.

I'd just like to say that I agree with CarlEdman 100%. Every situation is different, but personally, I always keep the same resolution; I never downscale or upscale. When you downscale, you lose some detail that you will never be able to recover no matter which codec you use, whereas when you keep the same resolution, you have a chance to preserve some detail with the proper codec (e.g. x265) or x264 settings.

I made a test 5 years ago on a 4-hour 1080i @ 13 Mbps tennis match and got better results when encoding it to 1080p @ 2 Mbps than to 720p @ 2 Mbps (same AVC settings, only the resolution changed).

PS: Tennis matches give great results at high resolutions and low bitrates because there is not a lot of motion. Most of the time only the ball and the two players move; the rest of the screen is static.
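That A/B test could be reproduced roughly like this (a sketch; the file names are made up, and yadif is just one common deinterlacer for the 1080i source):

Code:
# Deinterlace the 1080i source (yadif), then encode at 2 Mbps at both
# resolutions. File names are made up.
import subprocess

SRC = "match_1080i.ts"  # hypothetical input file
rate = ["-c:v", "libx264", "-preset", "slow", "-b:v", "2000k"]

subprocess.run(["ffmpeg", "-y", "-i", SRC, "-vf", "yadif"] + rate +
               ["tennis_1080p.mkv"], check=True)
subprocess.run(["ffmpeg", "-y", "-i", SRC, "-vf",
                "yadif,scale=1280:720:flags=lanczos"] + rate +
               ["tennis_720p.mkv"], check=True)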
HEVCer is offline   Reply With Quote