Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion. Before you start posting please read the forum rules. By posting to this forum you agree to abide by the rules.
|
|
18th March 2015, 08:06 | #1 | Link |
Registered User
Join Date: May 2003
Posts: 107
|
Better quality at same bitrate: higher vs. lower resolution?
Assume I want to rip/re-encode a Blu-ray movie to a smaller video file (e.g. *.mkv or *.mp4).
The original movie has a resolution of about 2560 x 1440. I decide that a target bitrate of 2000 kbps should be sufficient for the resulting video. OK so far. Now I wonder which target resolution I should choose: leave it at the original resolution, or shrink it to 1920 x 1080? Keep in mind the bitrate is fixed! If I choose the bigger resolution, then every pixel gets a smaller share of the bits as content information, so the quality per pixel would be worse than at the smaller resolution. So what is recommended in general: encode more pixels (= higher resolution) at worse per-pixel quality, or fewer pixels (= lower resolution) at better per-pixel quality (and upscale the video later at watch time)? Is there much difference at all?
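The "smaller share of the bits per pixel" intuition can be made concrete with simple arithmetic. A minimal sketch (the 24 fps frame rate is an assumption; the post doesn't state one):

```python
# Bits per pixel (bpp) at a fixed bitrate: bitrate / (width * height * fps).
def bits_per_pixel(bitrate_kbps, width, height, fps=24.0):
    return bitrate_kbps * 1000 / (width * height * fps)

# The same 2000 kbps budget spread over the two candidate resolutions:
bpp_1440p = bits_per_pixel(2000, 2560, 1440)  # ~0.023 bpp
bpp_1080p = bits_per_pixel(2000, 1920, 1080)  # ~0.040 bpp
```

So at the same bitrate, each 1080p pixel gets roughly 1.8x the bit budget of a 1440p pixel; whether that beats keeping the extra pixels is exactly what the replies below debate.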
18th March 2015, 11:02 | #4 | Link |
User of free A/V tools
Join Date: Jul 2006
Location: SK
Posts: 826
|
Yes, 720p generally means a frame height of up to 720 pixels; with a 16:9 aspect ratio and square pixels you'd get a video resolution of 1280 x 720 pixels.
I wonder, though, what Blu-ray source could give you a 2560 x 1440 resolution... Current BD standards (not HEVC yet) only support resolutions up to 1920 x 1080.
18th March 2015, 19:11 | #6 | Link | |
Moderator
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,770
|
Quote:
Are you just talking about BD-ROM as a storage medium? In that case "ripping" isn't the right term. In general, a 2 Mbps ABR is better suited to 720p than to 1080p for typical film/video content, especially if you need BD compatibility, which imposes some other restrictions on stream parameters. I've done anime down to 1080p24 at a 2 Mbps ABR with good results, though.
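The 720p-vs-1080p recommendation follows from the pixel counts alone; a quick sketch:

```python
# Pixel counts for the two candidate frame sizes discussed in the thread.
pixels_1080p = 1920 * 1080  # 2,073,600 pixels per frame
pixels_720p = 1280 * 720    #   921,600 pixels per frame

# At the same 2 Mbps budget, each 720p pixel receives this many times
# more bits than a 1080p pixel would:
ratio = pixels_1080p / pixels_720p  # 2.25
```

A 2.25x denser bit budget per pixel is why 720p usually survives a 2 Mbps ABR more gracefully, unless the content is easy to compress (as in the anime example).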
|
18th March 2015, 19:19 | #7 | Link | |
Registered User
Join Date: Jan 2008
Posts: 185
|
Quote:
The reason is that down-sampling is a form of lossy compression, just a very dumb one: it effectively throws away the high-frequency components of your visual signal. H.264 in general (and x264 in particular) is smart compression. It can, at times and in places, throw out high-frequency components just like down-sampling would, but it also has many other options that will often preserve detail that plain down-sampling would lose.

Usually you are better off doing the entire compression with one smart algorithm, rather than strapping a dumb compression algorithm (down-sampling) in front of your smart compression algorithm (H.264). Sure, you will lose some fine detail by picking a lower encode setting, but you will also lose detail when down-sampling, and usually more.

The only exceptions to this general rule that I can think of are:
(a) Situations where you know that the output resolution will be lower anyway, so nothing would be gained by keeping the higher-frequency components.
(b) Situations where the encode or decode is so severely time-constrained that you cannot satisfactorily run x264 on the full frame.
(c) Extreme low-bitrate situations, where the per-block overhead of H.264 becomes significant.

In all other situations, run the full image through x264 and deal with bitrate elsewhere.
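Exception (c) can be roughly quantified: H.264 codes each frame in 16x16-pixel macroblocks, and every block carries some fixed signalling cost. A sketch of how the average per-block bit budget shrinks with resolution (24 fps is an assumed frame rate, not from the post):

```python
# H.264 partitions each frame into 16x16-pixel macroblocks;
# dimensions are rounded up to whole blocks (1080 rows -> 68 block rows).
def macroblocks(width, height, mb_size=16):
    return -(-width // mb_size) * -(-height // mb_size)

# Average bits available per macroblock per frame at a fixed bitrate.
def bits_per_macroblock(bitrate_kbps, width, height, fps=24.0):
    return bitrate_kbps * 1000 / (macroblocks(width, height) * fps)

# At a 2 Mbps budget, a 16x16 block of a 2560x1440 frame averages far
# fewer bits than the same block at 1920x1080:
b_1440p = bits_per_macroblock(2000, 2560, 1440)  # ~5.8 bits/block
b_1080p = bits_per_macroblock(2000, 1920, 1080)  # ~10.2 bits/block
```

When the average budget per block approaches the fixed per-block overhead, the encoder has almost nothing left for actual picture data, which is when dropping resolution starts to pay off.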
|
22nd March 2015, 11:04 | #9 | Link | |
Pig on the wing
Join Date: Mar 2002
Location: Finland
Posts: 5,731
|
Quote:
__________________
And if the band you're in starts playing different tunes I'll see you on the dark side of the Moon... |
|
22nd March 2015, 12:17 | #10 | Link | |
Registered User
Join Date: Oct 2014
Posts: 476
|
Quote:
Encoders do not always pick the best bit distribution, and those variances and valleys can be worse than a fixed decrease in resolution. Downscaling may remove some places where the encoder would have been smart, but it also takes away places where it would have been stupid. Unneeded resolution is just noise.
|
30th December 2019, 21:39 | #11 | Link |
Registered User
Join Date: Dec 2019
Posts: 1
|
Hello everybody,
First of all, sorry for replying to this old thread, but it came up when I googled "bitrate" and "resolution", and I think that Mike23's question will never become obsolete.

I'd just like to say that I agree with CarlEdman 100%. Every situation is different, but personally I always keep the same resolution; I never downscale or upscale. When you downscale, you lose details that you will never be able to recover no matter which codec you use, whereas when you keep the same resolution, you have a chance to preserve those details with the proper codec (e.g. x265) or the right x264 settings. I ran a test 5 years ago on a 4-hour 1080i @ 13 Mbps tennis match and got better results encoding it to 1080p @ 2 Mbps than to 720p @ 2 Mbps (same AVC settings, only the resolution changed).

PS: Tennis matches give great results at high resolution and low bitrates because there is not a lot of motion. Most of the time only the ball and the two players move; the rest of the screen is static.