Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.
31st October 2010, 10:41 | #1
the Interrogator
Join Date: Oct 2001
Location: DownUnder
Posts: 664
Why 720p?
When transcoding BD to x264, the most common resolutions seem to be 1080p and 720p.
Using 1080p, one isn't losing anything by way of resolution downscaling, but I have always wondered: why 720p? Linearly it's two thirds of 1080p, so by area it's 44.4%. Obviously using 720p gives a significantly smaller encode than 1080p. Is there anything "special" about 720p which gives a better visual result than another resolution reduction, say to 50% (approx. 70% linearly), or to 64% (80% linearly), when the reduction in encode size is taken into account?
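The area arithmetic in the question is easy to sanity-check in a couple of lines of Python:

```python
# Pixel counts of the two common encode targets (square pixels, 16:9 frame).
full = 1920 * 1080   # 1080p
half = 1280 * 720    # 720p

# Linearly, 720/1080 = 2/3; by area that squares to (2/3)^2 = 4/9 ~ 44.4%.
ratio = half / full
print(f"720p is {ratio:.1%} of 1080p by area")  # -> 44.4%
```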
31st October 2010, 11:13 | #3
Registered User
Join Date: Dec 2005
Posts: 1,460
Why downscale?
Because it saves space, and most Blu-rays don't have enough detail to make 1080p necessary anyway. There is hardly more detail than could be kept with 720p on the majority of Blu-rays. All you get from the extra resolution is a more accurate representation of the noise/grain in the source. Besides, if you re-encode instead of just remuxing the source, you want lower filesizes, and noise/grain is the first thing you lose anyway.

Why 720p? Because it's a standard resolution: big enough to keep most of the detail, and small enough that you can encode within HP@3.1 restrictions, which is the limit of some portable devices. It is a good compromise between compatibility, quality and filesize; a couple of percent more or fewer pixels wouldn't matter much for the latter two. Also, I guess when people say they resize to 720p they actually resize to 1280*xxx, with the height depending on aspect ratio, for 16/9 and wider ARs.
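A small sketch of that last point: for a 1280-wide encode, the stored height after cropping the letterbox depends on the film's aspect ratio. This assumes the common convention of rounding the height down to a multiple of 16 (a codec-friendly block size); actual releases sometimes use mod-8 or mod-4.

```python
def cropped_height(width, aspect, mod=16):
    """Height of a width-wide frame at the given aspect ratio,
    rounded down to a multiple of `mod`."""
    return int((width / aspect) // mod) * mod

# Typical "720p" encodes at various theatrical aspect ratios:
for ar in (16 / 9, 1.85, 2.35, 2.40):
    print(f"{ar:.2f}:1 -> 1280x{cropped_height(1280, ar)}")
# 16:9 gives 1280x720; 2.35:1 scope gives 1280x544.
```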
31st October 2010, 19:51 | #4
Registered User
Join Date: Mar 2009
Location: Germany
Posts: 5,769
Most people who encode for themselves use 1080p. The others use 720p because it saves bandwidth. It's not even 720p but ~544p, as any Google search can confirm.
And since these encodes are generally made for PC playback, the display's physical resolution plays no role.
31st October 2010, 23:27 | #5
Join Date: Mar 2006
Location: Barcelona
Posts: 5,034
WTF? If I connect my PC to a 72 inch HD display the resolution is not important? Are you sure you are taking the right meds?
1st November 2010, 13:54 | #6
Registered User
Join Date: Mar 2009
Location: Germany
Posts: 5,769
Obviously, people downconverting to 720p do not have your setup. |
1st November 2010, 14:41 | #9
Banned
Join Date: Apr 2008
Posts: 723
A 1280x720 widescreen image might be 1280x544 after you decide not to encode the black bars, but it still displays at full width on a 1280x720, 16:9 screen. The video itself is no different in resolution from the original 1280x720 video with the black bars left in... yet it's no longer HD?

A 720x576 (given you had to use PAL rather than NTSC) standard-definition DVD, even with its anamorphic pixels stretched out, would at best be equivalent to 1024x576 square pixels. Crop off those same black bars and it's only equivalent to 1024x432 square pixels. In what way could a square-pixel encode at 1280x544 be considered to have less resolution than a 720x432 anamorphic encode, which when displayed would still only be 1024 square pixels wide?
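The anamorphic arithmetic here checks out; a quick sketch, assuming the commonly quoted 64:45 pixel aspect ratio for a PAL 16:9 DVD (ITU cropping conventions give slightly different figures):

```python
from fractions import Fraction

# PAL 16:9 DVD: 720x576 stored, stretched on display by the PAR.
par = Fraction(64, 45)            # assumed pixel aspect ratio
stored_w, stored_h = 720, 576

display_w = stored_w * par        # square-pixel equivalent width
print(display_w)                  # 1024 -- exactly, with this PAR

# Crop the 2.35:1 letterbox and roughly 432 of the 576 lines remain,
# so the DVD is at best comparable to 1024x432 square pixels --
# less than a cropped 1280x544 HD encode in both dimensions.
```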
I've not done any comparisons myself, but I've read quite a few reviews of LCD TVs, and the general consensus from those who should know is that when it comes to watching video on a typical-size TV, 1080p video doesn't offer a substantial improvement over 720p even if the TV has a 1080p resolution.

720p vs. 1080p: "We still believe that when you're dealing with TVs 50 inches and smaller, the added resolution has only a very minor impact on picture quality. In our tests, we put 720p (or 768p) sets next to 1080p sets, then feed them both the same source material, whether it's 1080i or 1080p, from the highest-quality Blu-ray player. We typically watch both sets for a while, with eyes darting back and forth between the two, looking for differences in the most-detailed sections, such as hair, textures of fabric, and grassy plains. Bottom line: It's almost always very difficult to see any difference--especially from farther than 8 feet away on a 50-inch TV."
1st November 2010, 15:18 | #10
Join Date: Mar 2006
Location: Barcelona
Posts: 5,034
If you keep doing this you're going to lose your credibility very quickly. I wonder if you are even aware of this very annoying habit.
1st November 2010, 15:22 | #11
Registered User
Join Date: Mar 2009
Location: Germany
Posts: 5,769
720p has 720 lines. However, the video was cropped, so on a PC one would see only, for example, 544 lines. I am not talking about displaying on a TV set, because that would mean adding matting back, as 1280x544 is not a legal resolution for any HD TV set. People who crop obviously do it for watching on a PC display.
However, I cannot stop commenting that your mathematics can bring nice results, too. 720 is just 1.25x PAL, and 1080 is just 1.5x 720. It follows that there should be fewer visible differences between progressive PAL and 720p than between 720p and 1080p. But video is not all about pixels.
1st November 2010, 15:54 | #12
Registered User
Join Date: Dec 2005
Posts: 1,460
And why would cropping imply that one watches on a PC? I crop, and I watch my stuff on an HD TV. Why do I crop? Because there is no reason to keep the black bars: they don't contain any information, and the player will add them back on playback. The only reason not to crop when you re-encode is that you need the file to be some standard resolution, for example when you want to be Blu-ray compatible.

Yeah, that's true if you ignore the difference in width. In reality, for 16/9 content, 720p is 2.22 times PAL, and 1080p is 2.25 times 720p, or 5 times PAL, in terms of actual data (number of pixels). So ideally there should be about the same difference between the resolutions. What actually happens is that you get diminishing returns.
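Those pixel-count ratios are easy to verify, using the stored PAL frame of 720x576 (DVD is anamorphic, so the stored frame is the same for 4:3 and 16:9 content):

```python
# Stored-pixel counts behind the ratios quoted above.
pal   = 720 * 576      # PAL DVD frame (anamorphic)
p720  = 1280 * 720     # 720p
p1080 = 1920 * 1080    # 1080p

print(p720 / pal)      # ~2.22  (720p vs PAL)
print(p1080 / p720)    # 2.25   (1080p vs 720p)
print(p1080 / pal)     # 5.0    (1080p vs PAL)
```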
1st November 2010, 18:18 | #13
Registered User
Join Date: Mar 2009
Location: Germany
Posts: 5,769
For both issues: I'm not the one who created the standards.
For both issues again: it's your brain that sees that the top and bottom lines are black. There was initially a recognisable image there; it has just been artificially matted. An SD image won't become an HDTV one just because someone surrounded it with black bars to pad it up to 1920x1080/1088, but for the purposes of the standard it would be. And you already said that if you want to observe the standards, you don't crop.
1st November 2010, 19:19 | #14
Registered User
Join Date: Dec 2005
Posts: 1,460
The next time you argue a purely technical point, please make it more explicit; that would have saved me a lot of typing. The way I read it, I thought you were saying that there is an actual difference between the cropped and uncropped video.
1st November 2010, 20:55 | #16
Banned
Join Date: Apr 2008
Posts: 723
For the rest you had no answers.
Maybe not, but it's certainly not semantics that sees HD become SD simply because it's been cropped, when the picture itself contains the same number of pixels or scanlines either way.

Yeah, but if it's an HD image that's been cropped, and if it's re-encoded with the black bars added back, it's still an HDTV image, and it then conforms to the standard. Which has you arguing semantics where encoded black bars count towards a definition of HD while artificial matting does not. It's a silly argument.
2nd November 2010, 09:09 | #17
Registered User
Join Date: Mar 2009
Location: Germany
Posts: 5,769
However, the CNET testing of 720p vs 1080p is not an objective one. I said this a long time ago (you may check my posts). In short, the comparison needs a native chain for each resolution, from acquisition to display, which is hard to achieve, as no 1080p live cameras existed (in 2009) and no 720p scans of film movies. So one of the chains would have to display a resampled movie, which would disadvantage one of the two.
I know that your mind cannot accommodate the idea that the black bars (the matting) are actually part of the picture, which is sometimes covered with subtitles (this might help you understand, because one cannot draw subtitles over something that doesn't exist).
2nd November 2010, 14:07 | #18
Registered User
Join Date: Dec 2002
Location: UK
Posts: 1,673
Tangential to the discussion, but important...
http://en.wikipedia.org/wiki/Optimum...ewing_distance and http://carltonbale.com/1080p-does-matter

FWIW, I think the absolute limit for being able to detect a single pixel as separate from the adjacent one is 1 pixel = 0.5 arcminute. This follows from the construction of the human eye: 120 cones (detectors) in 1 degree, and from mapping (at very best) 1 pixel to 1 cone. Put another way, 1 pixel on screen needs to fill 1.45 x 10^-4 times the distance from the screen (tan(1/120 degree)). Bigger pixels could be visible as discrete pixels; smaller pixels would be wasted. e.g. 1920 pixels * 3m away from the screen * 1.45 x 10^-4 = 84cm wide screen.

If you convert this to feet and inches, and compare it with this graph... http://s3.carltonbale.com/resolution_chart.html ...you'll find that the "full benefit" lines on that graph give half the viewing distance I've calculated. They probably used 1 arcminute as the reference, equivalent to mapping 1 pixel onto 2 cones (detectors) in your eye, which pretty much makes sure you get the benefit. The "benefit starts to become noticeable" areas are kind of arbitrary, because the benefit might become noticeable even earlier.

Even so, I think the numbers on that graph are far more applicable in real life than the theoretical limit for detecting pixels if the source is clean. Once you throw in some lossy coding artefacts, there's some justification for using the stricter 0.5 arcminute measure, because being able to just about resolve a single pixel means you can probably see any visible coding artefacts. Whether you care about these in motion when watching an interesting movie is a different thing entirely!

Cheers, David.
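The 0.5-arcminute rule above can be sketched in a few lines (the function name is mine, purely for illustration):

```python
import math

def max_screen_width(pixels_across, distance_m, arcmin_per_pixel=0.5):
    """Widest screen (in metres) at which each of `pixels_across`
    pixels still subtends no more than `arcmin_per_pixel` of arc
    at the given viewing distance."""
    # Size one pixel may occupy: distance * tan(angle per pixel).
    px_size = distance_m * math.tan(math.radians(arcmin_per_pixel / 60))
    return pixels_across * px_size

# The worked example from the post: 1920 pixels viewed from 3 m.
print(f"{max_screen_width(1920, 3.0):.2f} m")  # ~0.84 m, i.e. ~84 cm
```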