Old 31st October 2010, 10:41   #1  |  Link
JimmyBarnes
the Interrogator
 
Join Date: Oct 2001
Location: DownUnder
Posts: 664
Why 720p?

When transcoding BD to x264, the most common resolutions seem to be 1080p and 720p.

Using 1080p, one isn't losing anything by way of resolution downscaling, but I have always wondered "Why 720p?" Linearly it's two thirds of 1080p so by area it's 44.4 %. Obviously using 720p gives a significantly smaller encode than 1080p.

Is there anything "special" about 720p which gives a better visual result than another resolution reduction, say to 50 % (approx. 70 % linearly), or to 64 % (80 % linearly), when the reduction in encode size is taken into account?
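
For reference, the area arithmetic works out as follows; a quick Python sketch (assuming a 1920x1080 source frame, which is not stated above but is the usual BD case):

Code:
# Area fractions for a few linear downscales of a 1920x1080 source.
SRC_W, SRC_H = 1920, 1080

def scaled(linear_factor):
    # Width, height and area fraction after scaling both dimensions.
    w, h = round(SRC_W * linear_factor), round(SRC_H * linear_factor)
    return w, h, (w * h) / (SRC_W * SRC_H)

for factor in (2 / 3, 0.7, 0.8):
    w, h, area = scaled(factor)
    print(f"{factor:.3f} linear -> {w}x{h} = {area:.1%} of the original area")

# 0.667 linear -> 1280x720 = 44.4% of the original area
# 0.700 linear -> 1344x756 = 49.0% of the original area
# 0.800 linear -> 1536x864 = 64.0% of the original area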
JimmyBarnes is offline  
Old 31st October 2010, 11:10   #2  |  Link
GodofaGap
Registered User
 
Join Date: Feb 2006
Posts: 823
I would think the main reason is compatibility. 1280x720 is defined in the Blu-ray standard.
GodofaGap is offline  
Old 31st October 2010, 11:13   #3  |  Link
nurbs
Registered User
 
Join Date: Dec 2005
Posts: 1,460
Why downscale?
Because it saves space and most Blu-rays don't have enough detail to make 1080p necessary anyway. On the majority of Blu-rays there is hardly any detail that couldn't be kept at 720p; all the extra resolution gives you is a more accurate representation of the noise/grain in the source.
Besides, if you re-encode instead of just remuxing the source you want lower file sizes, and noise/grain is the first thing you lose anyway.

Why 720p?
Because it's a standard resolution, it's big enough to keep most of the detail, and small enough that you can encode within HP@3.1 restrictions, which is the limit of some portable devices. It is a good compromise between compatibility, quality and file size; a couple of percent more or fewer pixels wouldn't matter much for the latter two.
Also, I guess that when people say they resize to 720p they actually resize to 1280*xxx, with the height depending on the aspect ratio, for 16:9 and wider ARs.
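
A minimal sketch of how that 1280*xxx height might be derived for a given (cropped) source, in Python; the mod-2 rounding and the helper name are my assumptions, not anything from the post:

Code:
# Hypothetical helper: pick a "1280 x something" target, assuming square
# pixels in the source and a height rounded down to a multiple of 2.
def target_720p_size(src_w, src_h, width=1280, mod=2):
    height = round(width * src_h / src_w)
    height -= height % mod
    return width, height

print(target_720p_size(1920, 1080))  # (1280, 720)  plain 16:9
print(target_720p_size(1920, 800))   # (1280, 532)  ~2.40:1 source after cropping the bars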
nurbs is offline  
Old 31st October 2010, 19:51   #4  |  Link
Ghitulescu
Registered User
 
Join Date: Mar 2009
Location: Germany
Posts: 5,769
Most people that encode for themselves use 1080p. The others use 720p as they save bandwidth. It's not even 720p but ~544p as any google search can confirm.

And since these encodings are generally made for PC play, the optical resolution plays no role.
Ghitulescu is offline  
Old 31st October 2010, 23:27   #5  |  Link
Groucho2004
 
Join Date: Mar 2006
Location: Barcelona
Posts: 5,034
Quote:
Originally Posted by Ghitulescu View Post
It's not even 720p but ~544p as any google search can confirm.
How is 720p not 720p? I suppose you are talking about the resolution after cropping but how is this relevant? If it were relevant, how does it not apply to 1080p??


Quote:
Originally Posted by Ghitulescu View Post
And since these encodings are generally made for PC play, the optical resolution plays no role.
WTF? If I connect my PC to a 72 inch HD display the resolution is not important? Are you sure you are taking the right meds?

Last edited by Groucho2004; 1st November 2010 at 00:51.
Groucho2004 is offline  
Old 1st November 2010, 13:54   #6  |  Link
Ghitulescu
Registered User
 
Join Date: Mar 2009
Location: Germany
Posts: 5,769
Quote:
Originally Posted by Groucho2004 View Post
How is 720p not 720p? I suppose you are talking about the resolution after cropping but how is this relevant? If it were relevant, how does it not apply to 1080p??
Because cropping 1080 would still land in the HD realm, while 544 can no longer be HD, as it's below regular PAL SD.

Quote:
Originally Posted by Groucho2004 View Post
WTF? If I connect my PC to a 72 inch HD display the resolution is not important? Are you sure you are taking the right meds?
The higher the screen size, the higher the needed optical/stored resolution.
Obviously, people downconverting to 720p do not have your setup.
Ghitulescu is offline  
Old 1st November 2010, 14:05   #7  |  Link
GodofaGap
Registered User
 
Join Date: Feb 2006
Posts: 823
Quote:
Originally Posted by Ghitulescu View Post
Because cropping 1080 would still land in the HD realm, while 544 can no longer be HD, as it's below regular PAL SD.
Not for ~2.35:1.
GodofaGap is offline  
Old 1st November 2010, 14:31   #8  |  Link
nurbs
Registered User
 
Join Date: Dec 2005
Posts: 1,460
Quote:
Originally Posted by Ghitulescu View Post
Because cropping 1080 would still land in the HD realm, while 544 can no longer be HD, as it's below regular PAL SD.
Not really. 1280*544 has 68% more pixels than 720*576 and that's ignoring that a 2.35:1 PAL DVD would only have around 720*436 pixels containing the movie.
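
The figures are easy to verify, e.g. in Python:

Code:
# Pixel counts behind the comparison above.
hd_crop  = 1280 * 544   # cropped ~2.35:1 "720p" encode
pal_full = 720 * 576    # full PAL SD frame
pal_crop = 720 * 436    # approximate 2.35:1 picture area on a PAL DVD

print(hd_crop / pal_full - 1)  # ~0.68 -> 68% more pixels than a full PAL frame
print(hd_crop / pal_crop - 1)  # ~1.22 -> well over double the cropped PAL picture area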

Last edited by nurbs; 1st November 2010 at 14:36.
nurbs is offline  
Old 1st November 2010, 14:41   #9  |  Link
yetanotherid
Banned
 
Join Date: Apr 2008
Posts: 723
Quote:
Originally Posted by Ghitulescu View Post
Because cropping 1080 would still land in the HD realm, while 544 can no longer be HD, as it's below regular PAL SD.
I'd been wondering what your logic would be. I even came up with a few theories. I never would have guessed that one.

A 1280x720 wide screen image might be 1280x544 after you decide not to encode the black bars, but it's still displaying at full width on a 1280x720, 16:9 screen. The video itself is still no different in resolution to the original 1280x720 video without the black bars removed.... yet it's no longer HD?

A 720x576 (given you had to use PAL rather than NTSC) standard definition DVD, even when its anamorphic pixels are stretched out, would still at best be equivalent to 1024x576 square pixels. Then crop off those same black bars and it's only equivalent to 1024x432 square pixels.

In what way could a square-pixel encode at 1280x544 be considered to have less resolution than a 720x432 anamorphic encode, which when displayed at its original resolution would still only be 1024 square pixels wide?
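
Putting numbers on that (a rough Python sketch, assuming a 16:9 display aspect ratio for the full anamorphic frame):

Code:
# Square-pixel equivalents of the anamorphic PAL example above.
DAR = 16 / 9
stored_w, stored_h = 720, 576
square_w = round(stored_h * DAR)   # 1024: width once the anamorphic pixels are stretched

print(square_w, stored_h)          # 1024 576 -> full frame
print(square_w, 432)               # 1024 432 -> after cropping the black bars
print(1280 * 544, square_w * 432)  # 696320 442368 -> the cropped 1280x544 encode still has more pixels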

Quote:
Originally Posted by Ghitulescu View Post
The higher the screen size, the higher the needed optical/stored resolution.
Obviously, people downconverting to 720p do not have your setup.
Maybe the people converting to 720p are doing so according to what they can see and not according to a screen size related theory.
I've not done any comparisons myself but I've read quite a few reviews of LCD TVs and the general consensus from those who should know is that when it comes to watching video on a typical size TV, 1080p video doesn't offer a substantial improvement over 720p even if the TV has a 1080p resolution.

720p vs. 1080p
"We still believe that when you're dealing with TVs 50 inches and smaller, the added resolution has only a very minor impact on picture quality. In our tests, we put 720p (or 768p) sets next to 1080p sets, then feed them both the same source material, whether it's 1080i or 1080p, from the highest-quality Blu-ray player. We typically watch both sets for a while, with eyes darting back and forth between the two, looking for differences in the most-detailed sections, such as hair, textures of fabric, and grassy plains. Bottom line: It's almost always very difficult to see any difference--especially from farther than 8 feet away on a 50-inch TV."
yetanotherid is offline  
Old 1st November 2010, 15:18   #10  |  Link
Groucho2004
 
Join Date: Mar 2006
Location: Barcelona
Posts: 5,034
Quote:
Originally Posted by Ghitulescu View Post
Because cropping 1080 would still land in the HD realm, while 544 can no longer be HD, as it's below regular PAL SD.
And yet again (and as almost always) you're diverting from your original statement which was "It's not even 720p but ~544p". No mention of HD.

If you keep doing this you're going to lose your credibility very quickly. I wonder if you are even aware of this very annoying habit.

Last edited by Groucho2004; 1st November 2010 at 15:21.
Groucho2004 is offline  
Old 1st November 2010, 15:22   #11  |  Link
Ghitulescu
Registered User
 
Join Date: Mar 2009
Location: Germany
Posts: 5,769
Quote:
Originally Posted by nurbs View Post
Not really. 1280*544 has 68% more pixels than 720*576 and that's ignoring that a 2.35:1 PAL DVD would only have around 720*436 pixels containing the movie.
It's not the number of pixels, it's the number of active scanlines that is the decisive issue. Everything that has more scanlines than PAL is HDTV. For economy reasons, these resolutions have been standardized to 720p and 1080p, and soon more will be added.

720p has 720. However, the video was cropped, so on a PC one would see only, as an example, 544 lines. I am not talking about displaying on a TV set, because that would mean adding matting, as 1280x544 is illegal for any HD TV set. People that crop do this obviously for watching on a PC with a PC display.

Quote:
Originally Posted by yetanotherid View Post
I'd been wondering what your logic would be. I even came up with a few theories. I never would have guessed that one.

A 1280x720 wide screen image might be 1280x544 after you decide not to encode the black bars, but it's still displaying at full width on a 1280x720, 16:9 screen. The video itself is still no different in resolution to the original 1280x720 video without the black bars removed.... yet it's no longer HD?

A 720x576 (given you had to use PAL rather than NTSC) standard definition DVD, even when its anamorphic pixels are stretched out, would still at best be equivalent to 1024x576 square pixels. Then crop off those same black bars and it's only equivalent to 1024x432 square pixels.

In what way could a square-pixel encode at 1280x544 be considered to have less resolution than a 720x432 anamorphic encode, which when displayed at its original resolution would still only be 1024 square pixels wide?


Maybe the people converting to 720p are doing so according to what they can see and not according to a screen size related theory.
I've not done any comparisons myself but I've read quite a few reviews of LCD TVs and the general consensus from those who should know is that when it comes to watching video on a typical size TV, 1080p video doesn't offer a substantial improvement over 720p even if the TV has a 1080p resolution.

720p vs. 1080p
"We still believe that when you're dealing with TVs 50 inches and smaller, the added resolution has only a very minor impact on picture quality. In our tests, we put 720p (or 768p) sets next to 1080p sets, then feed them both the same source material, whether it's 1080i or 1080p, from the highest-quality Blu-ray player. We typically watch both sets for a while, with eyes darting back and forth between the two, looking for differences in the most-detailed sections, such as hair, textures of fabric, and grassy plains. Bottom line: It's almost always very difficult to see any difference--especially from farther than 8 feet away on a 50-inch TV."
I don't trust CNET's reviews, and for the rest you have the answers above.

However, I cannot help commenting that your mathematics can bring nice results, too. 720 is just 1.25x PAL, and 1080 is just 1.5x 720. It follows that there should be less visible difference between PAL progressive and 720p than between 720p and 1080p.

But video is not all pixels.
Ghitulescu is offline  
Old 1st November 2010, 15:54   #12  |  Link
nurbs
Registered User
 
Join Date: Dec 2005
Posts: 1,460
Quote:
Originally Posted by Ghitulescu View Post
720p has 720. However, the video was cropped, so on a PC one would see only, as an example, 544 lines. I am not talking about displaying on a TV set, because that would mean adding matting, as 1280x544 is illegal for any HD TV set. People that crop do this obviously for watching on a PC with a PC display.
What's that supposed to mean? If I take a 720p (2.35:1) video and crop the black bars it turns into SD, but as soon as I add them back it's HD again?
And why would cropping imply that one watches on a PC? I crop. I watch my stuff on an HD TV. Why do I crop? Because there is no reason to keep the black bars. They don't contain any information and the player will add them back on playback.
The only reason not to crop when you re-encode is that you need the file to be some standard resolution, such as when you want to be Blu-ray compatible.

Quote:
Originally Posted by Ghitulescu View Post
However, I cannot help commenting that your mathematics can bring nice results, too. 720 is just 1.25x PAL, and 1080 is just 1.5x 720. It follows that there should be less visible difference between PAL progressive and 720p than between 720p and 1080p.
Yeah, that's true if you ignore the difference in width. In reality, for 16:9 content, 720p is 2.22 times PAL and 1080p is 2.25 times 720p, or 5 times PAL, in terms of actual data (number of pixels). So ideally there should be about the same difference between the resolutions; what actually happens is that you get diminishing returns.
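
Counting stored pixels, the ratios check out, e.g.:

Code:
# Pixel-count ratios for 16:9 content, counting stored (anamorphic) PAL pixels.
pal   = 720 * 576
p720  = 1280 * 720
p1080 = 1920 * 1080

print(p720 / pal)    # 2.22... -> 720p is ~2.22x PAL
print(p1080 / p720)  # 2.25    -> 1080p is 2.25x 720p
print(p1080 / pal)   # 5.0     -> 1080p is 5x PAL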
nurbs is offline  
Old 1st November 2010, 18:18   #13  |  Link
Ghitulescu
Registered User
 
Join Date: Mar 2009
Location: Germany
Posts: 5,769
For both issues: I'm not the one who created the standards.

For both issues again: it's your brain that sees that the top and bottom lines are black. There was initially a "recognisable" image there, but it has simply been artificially matted. An SD image won't become an HDTV one just because one pads it with black bars up to 1920x1080/1088, but for the purposes of the standard it would be.

And you already said that if you want to observe the standards you don't crop.
Ghitulescu is offline  
Old 1st November 2010, 19:19   #14  |  Link
nurbs
Registered User
 
Join Date: Dec 2005
Posts: 1,460
The next time you argue a purely technical point, please make it more explicit; it would have saved me a lot of typing. The way I read it, I thought you were saying that there is an actual difference between the cropped and uncropped video.
nurbs is offline  
Old 1st November 2010, 19:36   #15  |  Link
Ghitulescu
Registered User
 
Join Date: Mar 2009
Location: Germany
Posts: 5,769
Visually no.
Ghitulescu is offline  
Old 1st November 2010, 20:55   #16  |  Link
yetanotherid
Banned
 
Join Date: Apr 2008
Posts: 723
Quote:
Originally Posted by Ghitulescu View Post
It's not the number of pixels, it's the number of active scanlines that is the decisive issue. Everything that has more scanlines than PAL is HDTV. For economy reasons, these resolutions have been standardized to 720p and 1080p, and soon more will be added.
Pixels or scanlines, it doesn't matter, the principle is exactly the same.

Quote:
Originally Posted by Ghitulescu View Post
720p has 720. However, the video was cropped, so on a PC one would see only, as an example, 544 lines. I am not talking about displaying on a TV set, because that would mean adding matting, as 1280x544 is illegal for any HD TV set. People that crop do this obviously for watching on a PC with a PC display.
I'd crop regardless, but your argument is silly. The actual image itself still only uses 544 scanlines on a TV set.

Quote:
Originally Posted by Ghitulescu View Post
I don't trust CNET's reviews, and for the rest you have the answers above.
Well, I could provide more links to other reviewers saying the same thing. Feel free to provide one which contradicts CNET. Feel free to offer your own experiences when comparing a typically sized 1080p and 720p television, if you've ever done so.
For the rest you had no answers.

Quote:
Originally Posted by Ghitulescu View Post
However, I cannot help commenting that your mathematics can bring nice results, too. 720 is just 1.25x PAL, and 1080 is just 1.5x 720. It follows that there should be less visible difference between PAL progressive and 720p than between 720p and 1080p.
I've no idea how "it follows". I can scan a picture at 600 DPI and then again at 1200 DPI, but it doesn't necessarily follow there will be visible differences between them. It depends how the images are viewed or the size at which they're printed etc.

Quote:
Originally Posted by Ghitulescu View Post
But video is not all pixels.
Maybe not, but it's certainly not semantics which sees HD become SD simply because it's been cropped, when the picture itself contains the same number of pixels or scanlines either way.

Quote:
Originally Posted by Ghitulescu View Post
An SD image won't become an HDTV one just because one pads it with black bars up to 1920x1080/1088, but for the purposes of the standard it would be.
Yeah, but if it's an HD image that's been cropped and then re-encoded with the black bars added back, it's still an HDTV image and it conforms to the standard, which leaves you arguing semantics where encoded black bars count towards a definition of HD while artificial matting does not. It's a silly argument.
yetanotherid is offline  
Old 2nd November 2010, 09:09   #17  |  Link
Ghitulescu
Registered User
 
Join Date: Mar 2009
Location: Germany
Posts: 5,769
Quote:
Originally Posted by yetanotherid View Post
Pixels or scanlines, it doesn't matter, the principle is exactly the same.
You want to start another polemic. However, before going into a debate, please take your time and read the standards. After reading (and understanding) them you'll probably realise that I was right: it's not the principle, it's not common sense, it's not your imagination; it's the standard that decides what HDTV is.
Quote:
Originally Posted by yetanotherid View Post
I'd crop regardless, but your argument is silly. The actual image itself still only uses 544 scanlines on a TV set.
No, you're wrong. Read my statement a few posts above. Your brain sees 544 lines there; the TV knows there are 720.
Quote:
Originally Posted by yetanotherid View Post
Well I could provide more links of other reviewers saying the same thing. Feel free to provide one which contradicts CNET. Feel free to offer your own experiences when comparing a typically sized 1080p and 720p television, if you've ever done so.
I make up my mind by reading real magazines, those that actually measure using laboratory tools; they are quite independent and not market-oriented.
However, the CNET testing of 720p vs 1080p is not an objective one. I said this a long time ago, you may check my posts. In short, the comparison needs two native chains, one for each resolution, from acquisition to display, which is hard to achieve, as no 1080p live cameras existed (in 2009) and no 720p scans of film movies exist. So one of the chains would have to display a resampled movie, which would disadvantage one of the two.
Quote:
Originally Posted by yetanotherid View Post
I can scan a picture at 600 DPI and then again at 1200 DPI, but it doesn't necessarily follow there will be visible differences between them. It depends how the images are viewed or the size at which they're printed etc.
How wrong can you be. -> http://www.andromeda.com/people/ddye...-transfer.html or http://clarkvision.com/articles/scan...html#testarea1. There is a noticeable difference in the detail a native 4000 lpi scanner yields vs a native 1200. I'm talking about the details that are implicitly there, not the desired print size (for a poster or a website).
Quote:
Originally Posted by yetanotherid View Post
Maybe not, but it's certainly not semantics which sees HD become SD simply because it's been cropped, when the picture itself contains the same number of pixels or scanlines either way.
You failed to understand and your logic is faulty: either it's the same number of pixels, or one image has been cropped (fewer pixels, fewer scanlines).
Quote:
Originally Posted by yetanotherid View Post
Yeah, but if it's a HD image that's been cropped, and if it's re-encoded with the black bars added back, it's still a HDTV image and it then conforms to the standard which has you arguing semantics where encoded black bars count towards a definition of HD while artificial matting does not. It's a silly argument.
You failed again to see the point. This is mostly due to your wrong perception of things. This is why standards come into play: to avoid people guessing or interpreting things as they want or need.
I know that your mind cannot accommodate the idea that the black bars (the matting) are actually a part of the picture, which is sometimes covered with subtitles (this might help you understand, because one cannot draw subtitles over something that doesn't exist).
Ghitulescu is offline  
Old 2nd November 2010, 14:07   #18  |  Link
2Bdecided
Registered User
 
Join Date: Dec 2002
Location: UK
Posts: 1,673
Tangential to the discussion, but important...

http://en.wikipedia.org/wiki/Optimum...ewing_distance
and
http://carltonbale.com/1080p-does-matter

FWIW I think the absolute limit for being able to detect a single pixel separate from the adjacent one is 1 pixel = 0.5 arcminute. This follows from the construction of the human eye: 120 cones (detectors) in 1 degree; and from mapping (at very best) 1 pixel to 1 cone. Put another way, 1 pixel on screen needs to fill 1.45 x 10^-4 times the distance from the screen (tan of 1/120 of a degree). Bigger pixels could be visible as discrete pixels. Smaller pixels would be wasted.

e.g. 1920 pixels * 3m away from the screen * 1.45 * 10^-4 = 84cm wide screen.
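
The same calculation as a small Python sketch (0.5 arcminute per pixel is the assumption stated above, not a universal figure):

Code:
import math

# Widest screen for which, at a given viewing distance, one pixel still
# subtends no more than 0.5 arcminute (1/120 of a degree).
def max_useful_width_m(h_pixels, viewing_distance_m, arcmin_per_pixel=0.5):
    pixel_angle = math.radians(arcmin_per_pixel / 60)  # 0.5 arcmin in radians
    return h_pixels * viewing_distance_m * math.tan(pixel_angle)

print(max_useful_width_m(1920, 3.0))  # ~0.84 m -> the 84 cm figure above
print(max_useful_width_m(1280, 3.0))  # ~0.56 m for a 1280-pixel-wide image at the same distance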

If you convert this to feet and inches, and compare it with this graph...
http://s3.carltonbale.com/resolution_chart.html
...you'll find that the "full benefit" lines on that graph give half the viewing distance I've calculated; they probably used 1 arcminute as the reference, equivalent to mapping 1 pixel onto 2 cones (detectors) in your eye, which pretty much makes sure you get the benefit. The "benefit starts to become noticeable" areas are kind of arbitrary, because the benefit might become noticeable even earlier. Even so, I think the numbers on that graph are far more applicable in real life than the theoretical limit for detecting pixels if the source is clean.

Once you throw in some lossy coding artefacts, there's some justification for using the stricter 0.5 arcminute measure, because being able to just about resolve a single pixel means you can probably see any visible coding artefacts. Whether you care about these in motion when watching an interesting movie is a different thing entirely!

Cheers,
David.

Last edited by 2Bdecided; 2nd November 2010 at 14:25. Reason: typos
2Bdecided is offline  
Old 2nd November 2010, 14:14   #19  |  Link
Ghitulescu
Registered User
 
Join Date: Mar 2009
Location: Germany
Posts: 5,769
Thank you, David, for bringing the charts; I knew about them but couldn't find them in time for my reply.
Ghitulescu is offline  
Old 2nd November 2010, 14:26   #20  |  Link
Sulik
Registered User
 
Join Date: Jan 2002
Location: San Jose, CA
Posts: 216
You'll also notice that both 1080p and 720p have a convenient square pixel aspect ratio for a 16:9 display (1080x16/9=1920, 720x16/9=1280).
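
A trivial check of that arithmetic in Python:

Code:
# Both standard HD rasters give an exact 16:9 width with square pixels.
for height in (720, 1080):
    width = height * 16 / 9
    print(height, width, width.is_integer())  # 720 1280.0 True / 1080 1920.0 True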
Sulik is offline  