Old 1st January 2010, 17:50   #21  |  Link
knutinh
Registered User
 
Join Date: Sep 2006
Posts: 42
Quote:
Originally Posted by Manao View Post
And even so, what legacy ? Is there an analog 1080i ? No. So there is no legacy to preserve here.
Broadcast, storage and interfaces usually do not support 1080p50/60. In other words, 720p60, 1080p30 and 1080i60 are the available options.

Especially for 24fps movie content, the case could be made for a 60i container. But it introduces an infinite list of possible screw-ups for engineers and content producers...
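
For reference, a minimal numpy sketch of the classic 3:2 pulldown that carries 24p film in a 60i stream. It only shows the field cadence; field order, colour and real field timing are ignored, and the function name is just mine:
Code:
import numpy as np

def telecine_32(frames):
    """Map 4 progressive film frames onto 10 fields (5 interlaced frames): 3:2 pulldown.

    frames: sequence of 4 progressive luma frames (H x W numpy arrays).
    Returns 5 (top_field, bottom_field) pairs following the classic
    AA, AB, BC, CC, DD cadence.
    """
    a, b, c, d = frames
    top = lambda f: f[0::2]   # even lines -> top field
    bot = lambda f: f[1::2]   # odd lines  -> bottom field
    return [
        (top(a), bot(a)),
        (top(a), bot(b)),   # "dirty" frame mixing two film frames
        (top(b), bot(c)),   # another mixed frame
        (top(c), bot(c)),
        (top(d), bot(d)),
    ]

# toy run: four 8x8 frames filled with their own index
fields = telecine_32([np.full((8, 8), i, dtype=np.uint8) for i in range(4)])
print(len(fields), "interlaced frames from 4 film frames")
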
Quote:
Especially since the tradeoff is actually worse than people think. On paper, interlacing may sound good: you get the full vertical resolution when there is no motion, and the full temporal resolution when it moves. So 1080p60 and 1080i60 are supposed to be comparable, with 1080i60 saving perhaps 25% bitrate after compression, and reducing the decoding needs.

That's on paper only. As it happens:[*]you don't get the full vertical resolution. Oh, sure, there are 1080 rows of pixels, so when the video is still, you're supposed to be looking at a 1080p video. And you do. Except that video has been lowpassed vertically, so you are actually looking at content that only has 600 or so rows of pixels of actual information.
I have heard information to the contrary: SD interlacing includes a vertical lowpass filter, while HD interlacing should not. I do not claim to know this for a fact.

I would have guessed that a high-end interlacer could be content-adaptive, filtering only moving parts of the scene?

Anyways, using interlacing as an extra layer of lossy compression makes little sense. If interlacing is/was a good way of removing bits while keeping quality, then MPEG/ITU codecs would do interlacing internally on progressive signals. And then there would be complete end-to-end control of what had been done and how it should be converted. The same can be said about colorspace conversion and decimation, though.
Quote:
[*]You're sending an interlaced signal to the TV, so somebody has to deinterlace it. Guess what, deinterlacing isn't cheap. When accumulated with the cost of mbaff, I think we reach the computational cost of 1080p60. But I cheat a bit here, because for legacy purposes, you would have needed a deinterlacer for SD content (but not for HD).
Sony and Philips have invested heavily in deinterlacing. One might suspect that they have an interest in keeping legacy formats that other companies do not handle equally well.

I remember reading a Philips paper in which they compared 25p, 50i and 50p when encoding as bitrate-constrained MPEG2. The conclusion was that MPEG2 + 50i + HQ deinterlacing had the best rate-distortion characteristics. Perhaps because MPEG2 lacked deblocking?

Quote:
Now, I may be biased on the subject, and I might miss some arguments in favor to interlacing. But I don't see which ones.
There is a case for interlacing in sensors. If you are bandwidth/heat-constrained, then 60i may be better than 30p, especially if you can tailor OLPF and deinterlacing for the task. I would deinterlace as early as possible though.

There are some physical aspects of sensors that I do not understand very well: integration time and bandwidth, for instance.

Last edited by knutinh; 1st January 2010 at 17:57.
knutinh is offline   Reply With Quote
Old 1st January 2010, 18:10   #22  |  Link
Manao
Registered User
 
Join Date: Jan 2002
Location: France
Posts: 2,856
Quote:
I have heard information to the contrary: SD interlacing includes a vertical lowpass filter, while HD interlacing should not. I do not claim to know this for a fact.
That's interesting. I must admit that I noticed there was lowpassing when I took 720p50 content and tried to turn it into 576i50. It was unwatchable without (strong) lowpassing, so I concluded that lowpassing was necessary for interlacing, and extrapolated that the same was happening in HD. I was confirmed in that opinion when I read that document, which explains how a 2160p video was transformed into a 1080i one (a 540p field is created by averaging four lines), but perhaps that document is incorrect.
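
A small numpy sketch of that kind of progressive-to-interlaced conversion, with the (debated) vertical prefilter as an option. The [0.25, 0.5, 0.25] kernel is only a placeholder of my choosing, not what any real interlacer necessarily uses:
Code:
import numpy as np

def progressive_to_interlaced(frames, lowpass=True):
    """Turn a progressive sequence into one field per frame, optionally applying a
    vertical lowpass first.

    frames: iterable of 2-D luma arrays (H x W), H even. Yields fields of H/2 lines,
    alternating parity frame by frame.
    """
    for n, frame in enumerate(frames):
        f = frame.astype(np.float64)
        if lowpass:
            # simple vertical lowpass before subsampling (placeholder kernel)
            padded = np.pad(f, ((1, 1), (0, 0)), mode="edge")
            f = 0.25 * padded[:-2] + 0.5 * padded[1:-1] + 0.25 * padded[2:]
        parity = n % 2
        yield f[parity::2]       # keep every other line -> one field

# toy run: ten 576-line progressive frames -> ten 288-line fields
src = np.random.rand(10, 576, 720) * 255
fields = list(progressive_to_interlaced(src))
print(fields[0].shape)   # (288, 720)
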
Quote:
I remember reading a Philips paper in which they compared 25p, 50i and 50p when encoding as bitrate-constrained MPEG2. The conclusion was that MPEG2 + 50i + HQ deinterlacing had best rate-distortion characteristics. Perhaps because MPEG2 lacked deblocking?
There have been a lot of studies regarding 1080ixx vs 1080pxx. AFAIK, all of them but one (EBU's) concluded 1080i was better. I don't really know what to make of that, and I would have liked to see 720pxx thrown in the lot too.
__________________
Manao is offline   Reply With Quote
Old 1st January 2010, 18:42   #23  |  Link
knutinh
Registered User
 
Join Date: Sep 2006
Posts: 42
Quote:
Originally Posted by Manao View Post
That's interesting. I must admit that I noticed there was lowpassing when I took 720p50 content and tried to turn it into 576i50. It was unwatchable without (strong) lowpassing, so I concluded that lowpassing was necessary for interlacing, and extrapolated that the same was happening in HD. I was confirmed in that opinion when I read that document, which explains how a 2160p video was transformed into a 1080i one (a 540p field is created by averaging four lines), but perhaps that document is incorrect.
Thanks for the link! Repeating it here for the discussion:
Quote:
Originally Posted by ebu
a) Interlacing to 1080i
For interlacing, every second 2164p-frame was shifted vertically two lines downwards. After deleting the first two and last two lines in the frames that were shifted (and the last four lines for the frames that were not shifted) to get 2160 lines again, each frame was filtered to 540 lines by line averaging (using Shake's Box Filter). Horizontal filtering from 3840 to 1920 columns was performed using Shake's Sinc Filter to benefit in perceived sharpness from the "oversampled" master. The two 540-line fields were then weaved into one single 1080-line interlaced frame. This process resembles the process in any video camera performing interlace in the basic default 'Field Integration Mode' – i.e. like a 2160-line video camera sensor reading out the average of the sensor's lines 1+2+3+4 to Field 1; lines 3+4+5+6 to Field 2; lines 5+6+7+8 to Field 1, etc.
http://sci.tech-archive.net/Archive/.../msg00026.html
Quote:
Originally Posted by jens
Normally, if you have interlaced scanning with a video (NTSC) signal,
and do field integration, you put lines 1+2, 3+4, 5+6... together for
the first half-videoframe, and 2+3, 4+5, 6+7... for the second half-
videoframe.

This leads to a somewhat lower vertical resolution because of the
interpolation, but more than 240 lines.

In frame integration you normally get lines 1,3,5... for the first
half, and 2,4,6 for the second. So you get a better vertical resolution
(no interpolation), but you lose half of the charge, which results in
higher noise.

Jens
http://www.damtp.cam.ac.uk/lab/digimage/cameras.htm
Quote:
To complicate matters further, different cameras construct the two video fields in different manners. In some cameras the even field corresponds to the even lines of pixels in the CCD chip, and the odd field to the odd lines of pixels in the CCD chip.
...
Slightly better are cameras which produce an average of the even lines and the preceding odd lines for the even field, and the odd lines and the preceding even lines for the odd field.
To me it seems that both ways of producing interlaced content are feasible, and might(?) be found in different video cameras?

There will typically be an optical lowpass filter in front of the sensor that smears out detail to some degree (one has to show some respect to Nyquist), and optics seldom have a perfect spatial frequency response either.
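
To make the two readout modes in the quotes above concrete, here is a small numpy idealisation of both. It is only a sketch of the descriptions quoted, not modelled on any particular camera:
Code:
import numpy as np

def frame_integration_fields(sensor):
    """'Frame integration': field 1 = odd sensor lines, field 2 = even sensor lines.
    Full vertical detail per field, but more aliasing (and, in a real sensor,
    only half the charge per output line)."""
    return sensor[0::2], sensor[1::2]

def field_integration_fields(sensor):
    """'Field integration': each field line averages two adjacent sensor lines.
    Field 1 averages lines (1,2), (3,4), ...; field 2 averages (2,3), (4,5), ...
    This acts as a 2-tap vertical boxcar prefilter: less aliasing, less detail."""
    f1 = 0.5 * (sensor[0::2] + sensor[1::2])
    f2 = 0.5 * (sensor[1:-1:2] + sensor[2::2])   # one line fewer at the bottom edge
    return f1, f2

sensor = np.random.rand(1080, 1920)
a1, a2 = frame_integration_fields(sensor)
b1, b2 = field_integration_fields(sensor)
print(a1.shape, b1.shape, b2.shape)   # (540, 1920) (540, 1920) (539, 1920)
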
Quote:
Originally Posted by Manao
There have been a lot of studies regarding 1080ixx vs 1080pxx. AFAIK, all of them but one (EBU's) concluded 1080i was better. I don't really know what to make of that, and I would have liked to see 720pxx thrown in the lot too.
I think that finding a suitable "success metric" is going to be difficult and political.

Do you use PSNR, SSIM, or real people?

Do you average across all codecs and deinterlacer implementations (optimizing mean viewer experience), or only for some idealized reference implementation?

I think that 1080p60 with good lossy compression will be best on a quality vs bandwidth benchmark. But is that all? How do you factor in quality vs price?
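
The objective half of such a metric is at least easy to write down; a minimal PSNR helper for two frames of equal size is below (the function name and test data are mine). SSIM and real viewing panels are another matter, and a 1080i-vs-1080p comparison still has to pick a deinterlacer/scaler before frames are comparable at all:
Code:
import numpy as np

def psnr(reference, test, peak=255.0):
    """Peak signal-to-noise ratio in dB between two equally sized frames."""
    err = reference.astype(np.float64) - test.astype(np.float64)
    mse = np.mean(err ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

ref = np.random.randint(0, 256, (1080, 1920)).astype(np.float64)
degraded = np.clip(ref + np.random.normal(0, 5, ref.shape), 0, 255)
print(round(psnr(ref, degraded), 2), "dB")
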

BBC concluded that 720p was enough for the UK public as long as screen sizes did not go much beyond 50".

Last edited by knutinh; 1st January 2010 at 18:56.
knutinh is offline   Reply With Quote
Old 2nd January 2010, 00:47   #24  |  Link
MfA
Registered User
 
Join Date: Mar 2002
Posts: 1,075
I know what to conclude, European standards agencies are superior ... it's like ITU vs MPEG, hardly a contest
MfA is offline   Reply With Quote
Old 2nd January 2010, 08:35   #25  |  Link
kieranrk
Registered User
 
Join Date: Jun 2009
Location: London, United Kingdom
Posts: 707
Quote:
Originally Posted by knutinh View Post
BBC concluded that 720p was enough for the UK public as long as screen sizes did not go much beyond 50".
It's a shame that they once published information that actually made sense.

Quote:
Originally Posted by knutinh View Post
I have heard information to the contrary: SD interlacing includes a vertical lowpass filter, while HD interlacing should not. I do not claim to know this for a fact.
I always thought SD and HD were treated the same, like Manao did.

Last edited by kieranrk; 2nd January 2010 at 08:39.
kieranrk is offline   Reply With Quote
Old 2nd January 2010, 14:47   #26  |  Link
MfA
Registered User
 
Join Date: Mar 2002
Posts: 1,075
HD displays are all progressive, so the old type of flicker for static thin lines (fine text, for instance) doesn't exist anymore when the deinterlacer works well (of course deinterlacers don't always work well). Motion-dependent aliasing is actually more likely for HD, though (whenever something moves vertically near a multiple of 1 pel per field, the vertical bandwidth is halved).
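
A tiny numpy thought experiment along those lines, using a 1-D vertical pattern moving exactly 1 line per field with no prefilter. It is idealised and only meant to show the sampling argument:
Code:
import numpy as np

# A vertical test pattern with detail at all frequencies, moving exactly
# 1 line per field while being captured interlaced (no prefilter).
rng = np.random.default_rng(0)
pattern = rng.random(64)

def capture_field(t):
    """Field t sees the pattern shifted down by t lines and keeps only
    the lines of its own parity."""
    scene = np.roll(pattern, t)                 # vertical motion: 1 line/field
    lines = np.arange(t % 2, pattern.size, 2)   # this field's line parity
    return lines, scene[lines]

seen = set()
for t in range(8):
    lines, _ = capture_field(t)
    seen.update(((lines - t) % pattern.size).tolist())   # back to pattern coordinates

print(f"pattern samples ever observed: {len(seen)} of {pattern.size}")
# -> 32 of 64: at this motion speed the capture never sees half of the vertical
#    samples, i.e. the effective vertical bandwidth is halved.
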

A sports broadcast with interlacing and without a flicker filter would be interesting ... the grass would look lovely I bet.
MfA is offline   Reply With Quote
Old 4th January 2010, 14:45   #27  |  Link
knutinh
Registered User
 
Join Date: Sep 2006
Posts: 42
Quote:
Originally Posted by kieranrk View Post
It's a shame that they once published information that actually made sense.
Do you think that the report does not make sense, or do you think that they have degraded since?

http://downloads.bbc.co.uk/rd/pubs/w...les/WHP092.pdf

-k

Last edited by knutinh; 4th January 2010 at 14:48.
knutinh is offline   Reply With Quote
Old 5th January 2010, 15:15   #28  |  Link
2Bdecided
Registered User
 
Join Date: Dec 2002
Location: UK
Posts: 1,673
All UK HD broadcasts are 1080i - there's no 720p - not "even" from the BBC.

And the official line is that while some old tests showed 720p looked better at lower bitrates than 1080i, encoders have improved and 1080i now looks better.

Cheers,
David.
2Bdecided is offline   Reply With Quote
Old 5th January 2010, 15:38   #29  |  Link
knutinh
Registered User
 
Join Date: Sep 2006
Posts: 42
Quote:
Originally Posted by 2Bdecided View Post
All UK HD broadcasts are 1080i - there's no 720p - not "even" from the BBC.

And the official line is that while some old tests showed 720p looked better at lower bitrates than 1080i, encoders have improved and 1080i now looks better.

Cheers,
David.
Do you have sources on this?

For 24p-originating content I think that it is feasible to prove that 1080i is "best" for most display devices. I think that it is difficult to prove the same for general content (e.g. sport) given that there are many bad deinterlacers out there. My VideoSeven lcd contains one of them :-)

The question includes "interlacers", lossy encoders, bitrate, and deinterlacers/scalers. Many variables...

-k
knutinh is offline   Reply With Quote
Old 5th January 2010, 16:00   #30  |  Link
scharfis_brain
brainless
 
scharfis_brain's Avatar
 
Join Date: Mar 2003
Location: Germany
Posts: 3,653
But most broadcasters just take a 1080i feed and bob-deinterlace it to 720p.
Their deinterlacers aren't the best either (stair-stepping etc.).
Then your display will upscale this again to 1080p.
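
A toy sketch of that chain, with a deliberately cheap bob deinterlacer and crude linear scalers. It is nothing like a real broadcast chain, just a way to make the steps explicit (all names and sizes here are mine):
Code:
import numpy as np

def bob_field(field, parity):
    """Bob deinterlace one field: put its lines on rows of matching parity and
    fill the missing rows by averaging the known neighbours (edges copied).
    Cheap and simple - hence the stair-stepping complained about above."""
    h, w = field.shape
    out = np.zeros((2 * h, w), dtype=np.float64)
    out[parity::2] = field
    if parity == 0:   # top field: each missing row sits below a known line
        out[1::2] = 0.5 * (field + np.vstack([field[1:], field[-1:]]))
    else:             # bottom field: each missing row sits above a known line
        out[0::2] = 0.5 * (np.vstack([field[:1], field[:-1]]) + field)
    return out

def resize_vertical(img, new_h):
    """Crude separable vertical resize by linear interpolation - a stand-in for
    the broadcaster's 1080->720 scaler and the TV's 720->1080 scaler."""
    h, w = img.shape
    ys = np.linspace(0, h - 1, new_h)
    return np.array([np.interp(ys, np.arange(h), img[:, x]) for x in range(w)]).T

field = np.random.rand(540, 16) * 255            # one toy 1080i field (narrow, for speed)
frame_1080 = bob_field(field, parity=0)          # bob to 1080 lines
frame_720 = resize_vertical(frame_1080, 720)     # broadcaster's downscale to 720p
on_screen = resize_vertical(frame_720, 1080)     # display's upscale back to 1080p
print(on_screen.shape)                           # (1080, 16)
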

I think the content should be left in its original format.
__________________
Don't forget the 'c'!

Don't PM me for technical support, please.
scharfis_brain is offline   Reply With Quote
Old 5th January 2010, 16:16   #31  |  Link
2Bdecided
Registered User
 
Join Date: Dec 2002
Location: UK
Posts: 1,673
Quote:
Originally Posted by knutinh View Post
Anyways, using interlacing as an extra layer of lossy compression makes little sense. If interlacing is/was a good way of removing bits while keeping quality, then MPEG/ITU codecs would do interlacing internally on progressive signals.
Of course they don't, even though interlacing does (at least partly) achieve the gains it's supposed to. That's why it's used. It's not a conspiracy, and it's not a mistake - it actually works (i.e. gives better quality / lower bitrates). Even with H.264 (if the encoder handles interlacing well enough).

The other reason is that 1080 is a bigger number than 720 (and does look sharper on most TVs - even the previously common 768-line ones) - but the technology isn't out there to do 1080p50 yet, so you're stuck with interlacing.


It does make logical sense that packaging the (adaptive) interlacing and (adaptive) deinterlacing into the encoder should make it work better than externally - but it's more complexity: more tuning in the encoder; more work in the decoder. Has anyone ever done it?

Cheers,
David.
2Bdecided is offline   Reply With Quote
Old 5th January 2010, 16:27   #32  |  Link
Manao
Registered User
 
Join Date: Jan 2002
Location: France
Posts: 2,856
Quote:
It does make logical sense that packaging the (adaptive) interlacing and (adaptive) deinterlacing into the encoder should make it work better than externally - but it's more complexity: more tuning in the encoder; more work in the decoder. Has anyone ever done it?
It would be easier. If it moves: resize and encode at 720p. If it's static: keep 1080p and halve the framerate. You could even do that somewhat adaptively per macroblock. It would still be easier than mbaff.
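
A rough sketch of that per-block decision, with an arbitrary mean-absolute-difference threshold standing in for a real motion test. The block size, metric, threshold and labels are all just illustrative, not anything a codec actually does:
Code:
import numpy as np

def classify_blocks(prev, cur, block=16, thresh=4.0):
    """Toy per-block decision in the spirit of the suggestion above: blocks that
    move get the 720p-style treatment (spatial downscale, full rate), static
    blocks the 1080p-style one (full resolution, half rate)."""
    h, w = cur.shape
    decisions = {}
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            mad = np.mean(np.abs(cur[y:y + block, x:x + block].astype(np.float64)
                                 - prev[y:y + block, x:x + block].astype(np.float64)))
            decisions[(y, x)] = "downscale_spatially" if mad > thresh else "halve_framerate"
    return decisions

prev = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)
cur = prev.copy()
cur[:256, :256] = np.random.randint(0, 256, (256, 256), dtype=np.uint8)  # fake a moving region
d = classify_blocks(prev, cur)
moving = sum(v == "downscale_spatially" for v in d.values())
print(moving, "of", len(d), "blocks classified as moving")
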
__________________
Manao is offline   Reply With Quote
Old 5th January 2010, 18:23   #33  |  Link
knutinh
Registered User
 
Join Date: Sep 2006
Posts: 42
Quote:
Originally Posted by 2Bdecided View Post
Of course they don't, even though interlacing does (at least partly) achieve the gains it's supposed to. That's why it's used. It's not a conspiracy, and it's not a mistake - it actually works (i.e. gives better quality / lower bitrates). Even with H.264 (if the encoder handles interlacing well enough).
I am not suggesting that it is a conspiracy, I am using it as an argument that you are wrong :-) Can you offer some references that h264 with interlacing has better PSNR/SSIM/subjective quality than h264 without?

For your statements to be generally right, I think one would expect that compressing any original 1080p50 sequence at:
1)1080@50p, h264, X mbps
2)1080@50i, h264, X mbps
3)720@50p, h264, X mbps

Would (on average) be best for 2) for any bitrate X. I highly doubt that to be true, but I have read Philips white-papers suggesting that they could make 2) true if they used:
A)Philips' advanced deinterlacing
B)MPEG2 without deblocking filtering
C)At constrained bitrates

I think that B) was suggested as an important explanation.

The standardization bodies are competitive about compression gain. If integrating interlacing/deinterlacing in the codec resulted in improved PQ for a given bitrate and a given implementation cost, surely someone would suggest it and have it implemented in the standard?
Quote:
It does make logical sense that packaging the (adaptive) interlacing and (adaptive) deinterlacing into the encoder should make it work better than externally - but it's more complexity: more tuning in the encoder; more work in the decoder. Has anyone ever done it?
Things such as deblocking filters and B-frames (framerate upconversion) have been integrated into codecs, even though they initially seem to have come from outside the codec. The reason seems to be that they had good PQ to bitrate/complexity ratios and could do better inside the codec than outside.

I think that all sense indicates that if the source is progressive (not always true), then doing interlacing within the codec will give major benefits for image quality and possibly total complexity as opposed to doing it externally. Advanced deinterlacers do all kinds of "artificial intelligence" that they should not have to do given precise signalling on how the content was actually produced. Motion vectors could be jointly optimized for tracking motion and describing candidates for filling in lines, saving a lot of cycles and having the luxury of optimizing for the ground-truth in the encoder.
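
To illustrate the kind of guessing meant here, a toy per-pixel weave/bob decision without any side information from the encoder. The threshold and the "motion" test are arbitrary choices of mine; real deinterlacers are far more elaborate:
Code:
import numpy as np

def motion_adaptive_deinterlace(prev_field, cur_field, parity, thresh=10.0):
    """Toy per-pixel weave/bob decision, illustrating the guessing a deinterlacer
    does without any signalling about how the interlaced signal was produced.

    cur_field  occupies the rows of 'parity' (0 = top) in the output frame.
    prev_field is the previous field of opposite parity, used for weaving.
    Where the weave and the spatial estimate disagree strongly (taken as motion)
    the missing rows are interpolated spatially; elsewhere they are woven.
    """
    h, w = cur_field.shape
    out = np.zeros((2 * h, w), dtype=np.float64)
    out[parity::2] = cur_field

    if parity == 0:   # missing rows sit below each known line
        spatial = 0.5 * (cur_field + np.vstack([cur_field[1:], cur_field[-1:]]))
    else:             # missing rows sit above each known line
        spatial = 0.5 * (np.vstack([cur_field[:1], cur_field[:-1]]) + cur_field)

    temporal = prev_field.astype(np.float64)          # plain weave candidate
    moving = np.abs(temporal - spatial) > thresh      # crude "motion" test
    out[1 - parity::2] = np.where(moving, spatial, temporal)
    return out

cur = np.random.rand(540, 32) * 255
prev = cur + np.random.randn(540, 32)                 # nearly static content
print(motion_adaptive_deinterlace(prev, cur, parity=0).shape)   # (1080, 32)
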


It might be that I/we are setting the wrong background for the discussion. 1080p50 is not generally the source, and if one made 1080p50 cameras, they would have worse noise performance. If that is the case, then interlacing could be a reasonable technology in the camera to overcome sensor limitations. It may then be that deinterlacing in the camera to 1080p50 does not increase quality/bitrate sufficiently, but does increase complexity considerably. I don't know.

-k

Last edited by knutinh; 5th January 2010 at 18:53.
knutinh is offline   Reply With Quote
Old 5th January 2010, 22:18   #34  |  Link
MfA
Registered User
 
Join Date: Mar 2002
Posts: 1,075
Quote:
Originally Posted by 2Bdecided View Post
look sharper on most TVs - even the previously common 768-line ones
It can look sharper in theory, but only if you allow aliasing ... if you remove all the aliasing you are essentially going to be halving the vertical resolution (which is why generally some aliasing is left and they reduce vertical bandwidth by ~70%).
MfA is offline   Reply With Quote
Old 6th January 2010, 00:44   #35  |  Link
knutinh
Registered User
 
Join Date: Sep 2006
Posts: 42
Quote:
Originally Posted by MfA View Post
It can look sharper in theory, but only if you allow aliasing ... if you remove all the aliasing you are essentially going to be halving the vertical resolution (which is why generally some aliasing is left and they reduce vertical bandwidth by ~70%).
I guess one way of looking at it - assuming perfect content-adaptive interlacers and/or deinterlacers (preferably exchanging metadata) - would be a system capable of 1920x540@50p <-> 1920x1080@25p and anything in between, on a spatial/temporal as-needed basis. When it does something like a seemingly perfect 1920x1080@50p linear pan, that would be based on (possibly sensible) assumptions about how scenes are captured, and could still make bad errors.

Superresolution systems depend on aliased input. I think that they overlap a lot with interlacing (at least in theory).

Do you think that it is possible to use correlation to estimate whether the interlacer used "field integration mode" or "frame integration mode" (or possibly some 70% vertical filtering), and use that information to select between different levels of deinterlacer aggressiveness?
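
One crude way such a correlation test could look on (near-)static material, purely as an illustration of the idea and not a validated detector: measure how well each line of one field is predicted by the average of its neighbouring lines in the other field. A line-averaged ("field integration") source should leave a smaller residual than a raw line-subsampled ("frame integration") one, because the averaging has already removed the vertical detail that causes the mismatch. Everything below (names, scene, threshold-free comparison) is my own toy construction:
Code:
import numpy as np

def interfield_residual(top, bottom):
    """Mean absolute error when each bottom-field line is predicted from the
    average of the two neighbouring top-field lines. Assumes a static scene."""
    predicted = 0.5 * (top[:-1] + top[1:])
    return float(np.mean(np.abs(predicted - bottom[: predicted.shape[0]])))

# toy static scene with plenty of vertical detail (a random walk down the columns)
scene = np.cumsum(np.random.randn(1080, 64), axis=0)

# "frame integration": raw line subsampling
fr_top, fr_bot = scene[0::2], scene[1::2]

# "field integration": each field line is the average of two sensor lines
fi_top = 0.5 * (scene[0:-2:2] + scene[1:-1:2])   # lines (1,2), (3,4), ...
fi_bot = 0.5 * (scene[1:-1:2] + scene[2::2])     # lines (2,3), (4,5), ...

print("frame integration residual:", round(interfield_residual(fr_top, fr_bot), 3))
print("field integration residual:", round(interfield_residual(fi_top, fi_bot), 3))
# The averaged ("field integration") source leaves a clearly smaller residual,
# which is the kind of statistic a deinterlacer could use to pick how
# aggressively to filter.
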

-k
knutinh is offline   Reply With Quote
Old 6th January 2010, 12:20   #36  |  Link
2Bdecided
Registered User
 
Join Date: Dec 2002
Location: UK
Posts: 1,673
Quote:
Originally Posted by MfA View Post
It can look sharper in theory, but only if you allow aliasing ... if you remove all the aliasing you are essentially going to be halving the vertical resolution (which is why generally some aliasing is left and they reduce vertical bandwidth by ~70%).
Well, if you have a 1366 x 768 display, then obviously a 1920 width source will look sharper than a 1280 width source in the horizontal direction. There can be no argument there.

Whether the 1080 line interlaced version or 720 line progressive version looks sharper depends on the factors you mention. At best, 1080 can look sharper (by as much as the 1080:720 ratio suggests), with dumb deinterlacing they're quite similar, but the 1080i version will visibly bob. It's rare for interlaced signals to be filtered to half the vertical resolution, so suggesting you'll get 540 vs 720 isn't realistic.

Cheers,
David.
2Bdecided is offline   Reply With Quote
Old 6th January 2010, 12:35   #37  |  Link
2Bdecided
Registered User
 
Join Date: Dec 2002
Location: UK
Posts: 1,673
Quote:
Originally Posted by knutinh View Post
For your statements to be generally right, I think one would expect that compressing any original 1080p50 sequence at:
1)1080@50p, h264, X mbps
2)1080@50i, h264, X mbps
3)720@50p, h264, X mbps
Like this...

http://www.ebu.ch/CMSimages/en/tec_e...tcm6-46693.pdf

In 2006, the EBU (inc the IRT, SVT, etc) tried very hard to convince everyone that 1080i wasn't worth it. A year earlier, the IRT were doing demonstrations of this, intentionally using the worst MPEG-4 codec they could find wrt interlacing capability!

Yet when it came to launching HD across Europe, broadcasters chose 1080i. The reason they give is that with newer encoders and full HD displays, 1080i is the current sweet spot.

Part of the problem is probably that they don't have 1080p easily available as a delivery format yet.


It's easier to do the test at SD resolutions, and just as valid. If interlacing is useless, then 720x576p50 at a given bitrate should always look better than 720x576i50 at the same bitrate.

With x264, I think that might be true. But broadcasters are saying that the hardware encoders they have available don't give this result.

I haven't seen any published tests that match this - quite the opposite...
http://ip.hhi.de/imagecom_G1/assets/..._hdtv_2008.pdf
...either the broadcasters are lying - or encoders have changed since that paper was written. Note: they create a 1080i50 signal using about the worst possible method in that paper.

Cheers,
David.
2Bdecided is offline   Reply With Quote
Old 6th January 2010, 12:50   #38  |  Link
Manao
Registered User
 
Join Date: Jan 2002
Location: France
Posts: 2,856
Quote:
It's easier to do the test at SD resolutions, and just as valid.
Not really. SD on an HD TV looks ugly, and on a CRT, you can't show 576p.
__________________
Manao is offline   Reply With Quote
Old 6th January 2010, 12:52   #39  |  Link
knutinh
Registered User
 
Join Date: Sep 2006
Posts: 42
Quote:
Originally Posted by 2Bdecided View Post
Well, if you have a 1366 x 768 display, then obviously a 1920 width source will look sharper than a 1280 width source in the horizontal direction. There can be no argument there.
Agreed. Smart displays could even benefit from a little subpixel scaling.

Quote:
Whether the 1080 line interlaced version or 720 line progressive version looks sharper depends on the factors you mention. At best, 1080 can look sharper (by as much as the 1080:720 ratio suggests), with dumb deinterlacing they're quite similar, but the 1080i version will visibly bob. It's rare for interlaced signals to be filtered to half the vertical resolution, so suggesting you'll get 540 vs 720 isn't realistic.

Cheers,
David.
The camera info that I found suggested that native 1080i capture will either be:
1) Non-filtered (at least electronically), meaning that you let through all the aliasing allowed by the transfer function of the optics and the optical lowpass filter.
2) Sensor lines #1 and #2 are averaged to produce line #1 of field #1; lines #2 and #3 are averaged to produce line #1 of field #2. I believe this to be a vertical 2-tap boxcar pre-filter: it has a null at fs/2, lets through significant aliasing between fs/4 and fs/2, and attenuates some passband detail below fs/4 (a quick check of that response is sketched below).
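
A minimal check of that claim, evaluating the DTFT of the idealised 2-tap averaging kernel [0.5, 0.5] implied above. This is just the textbook filter response, with no claim about what any specific camera implements:
Code:
import numpy as np

# Frequency response of the 2-tap boxcar h = [0.5, 0.5] implied by averaging two
# sensor lines into one field line.
f = np.linspace(0.0, 0.5, 5)                      # frequency in cycles per sensor line
H = np.abs(0.5 * (1 + np.exp(-2j * np.pi * f)))   # DTFT magnitude of [0.5, 0.5]

for fi, mag in zip(f, H):
    db = 20 * np.log10(max(mag, 1e-9))            # floor only to keep the log finite
    print(f"f = {fi:5.3f} fs  |H| = {mag:5.3f}  ({db:7.1f} dB)")

# Output shows ~-3 dB already at fs/4 (passband detail attenuated) while the
# response between fs/4 and fs/2 stays well above zero (aliasing only partly
# suppressed), with the null landing exactly at fs/2.
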

For cases where interlacing is applied digitally on a progressive source, there should be many more options: either tailor-made static filtering, or a scene-adaptive filter cutoff. Do you know anything about what they actually do?

For embedding 1080@24p inside 1080@60i (or 1080@25p inside 1080@50i), I think that they should employ no vertical filtering.

BTW, nice to see that hydrogenaudio-members are into video as well.

-k
knutinh is offline   Reply With Quote
Old 6th January 2010, 13:00   #40  |  Link
knutinh
Registered User
 
Join Date: Sep 2006
Posts: 42
Quote:
Originally Posted by 2Bdecided View Post
Like this...
http://www.ebu.ch/CMSimages/en/tec_e...tcm6-46693.pdf
I only found a single page describing a setup. Was there supposed to be any results?

A good one. Thank you.

Quote:
Yet when it came to launching HD across Europe, broadcasters chose 1080i. The reason they give is that with newer encoders and full HD displays, 1080i is the current sweet spot.
But they are not academics. If the market responds more positively to "1080" than "720", they will offer it, no matter if it is technically "better", won't they?
Quote:
Part of the problem is probably that they don't have 1080p easily available as a delivery format yet.
Why is it a problem to use 720p?
Quote:
It's easier to do the test at SD resolutions, and just as valid. If interlacing is useless, then 720x576p50 at a given bitrate should always look better than 720x576i50 at the same bitrate.
I think that you are right. By having a display that has far higher resolution than the content, we can effectively "factor it out".

It might be that tests at 576i/576p/384p should be carried out at larger distances/smaller displays to be representative of 1080i/1080p/720p.


I believe that the TV industry is quite conservative. Where IT changes equipment and mindset every 3 years, these guys tend to have 20-year cycles. They have invested heavily in editing equipment and interfaces that are limited to 1080i. The big manufacturers have an interest in differentiating themselves through superior deinterlacing. For cameras, there seems to be a potential advantage to doing native interlaced capture. For 24p content, they have a working (sort of) channel using 60i/50i.

-k

Last edited by knutinh; 6th January 2010 at 13:15.
knutinh is offline   Reply With Quote