Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Old 30th August 2017, 15:24   #1  |  Link
Sci-Fi-Fan
Registered User
 
Join Date: Oct 2002
Location: England
Posts: 44
Interlaced Video

Can any of you gurus out there explain to me why a TV network here in the United Kingdom is still using interlaced video in 2017?

As far as I understand it, interlacing was used decades ago to double horizontal resolutions for ancient CRT TVs that could only display 256 horizontal scan lines.

But with modern HD and 4K progressive digital displays that limitation has long since gone the way of the dinosaur.

So I just cannot fathom why a studio is still using such obsolete technology in our modern digital world.

Can anyone make sense of this?
Sci-Fi-Fan is offline   Reply With Quote
Old 30th August 2017, 17:28   #2  |  Link
Katie Boundary
Registered User
 
Katie Boundary's Avatar
 
Join Date: Jan 2015
Posts: 771
Quote:
Originally Posted by Sci-Fi-Fan View Post
Can any of you gurus out there explain to me why a TV network here in the United Kingdom is still using interlaced video in 2017?
Because whatever program they're showing was originally made that way and deinterlacing it would be more trouble than it's worth?

Quote:
As far as I understand it, interlacing was used decades ago to double horizontal resolutions for ancient CRT TVs that could only display 256 horizontal scan lines.
Then you understand incorrectly. NTSC sets like the ones used in the USA and UK have 480 scanlines, and nothing about CRT technology inherently prevented us from making a TV standard with 100,000,000 scanlines or any other number we wanted. The real reason for interlacing was to maximize the screen refresh rate without increasing bandwidth usage.
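As a rough illustration of that last point (my numbers, not Katie's — PAL's 576 visible lines used purely for arithmetic), interlacing halves the line rate needed for a given refresh rate:

```python
# Back-of-envelope: line rate (a proxy for analogue bandwidth) needed
# for a 50 Hz screen refresh with and without interlacing, using the
# 576 visible lines of the later PAL standard.
visible_lines = 576
refresh_hz = 50  # screen updates (flicker rate) per second

# Progressive: every refresh redraws all the lines.
progressive_line_rate = visible_lines * refresh_hz        # 28800 lines/s

# Interlaced: every refresh redraws only half the lines (one field).
interlaced_line_rate = (visible_lines // 2) * refresh_hz  # 14400 lines/s

print(progressive_line_rate / interlaced_line_rate)  # 2.0
```

So for the same transmitted line rate, interlacing doubles how often the screen is refreshed.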
__________________
If I ask "How do I do X?" or "what happens if I do X?", and X is a very bad thing that no one would ever normally do, assume that I already know this, and that I have Katie reasons for asking anyway.
Katie Boundary is offline   Reply With Quote
Old 30th August 2017, 17:28   #3  |  Link
hello_hello
Registered User
 
Join Date: Mar 2011
Posts: 4,442
My understanding is interlacing is also a way to squeeze more out of the available bandwidth.

It's true that very early CRTs could only scan a limited number of lines per second (as opposed to displaying them), so for interlaced video they'd refresh every even line, then every odd line, which also helped reduce flicker and increase the smoothness of motion. At 50Hz the limit was originally a little over 200 scan lines per 1/50th of a second, so the UK system began as a 405-line system and later increased to 625 lines as technology improved. I don't know when the UK switched, but I'm pretty sure the very early episodes of Doctor Who were 405 line.

I'd imagine the motivation for interlacing today would still be bandwidth/compression related, but 1080i has a higher temporal resolution than 720p at the same frame rate, so I assume it'd be more likely to be used for sporting events where there's lots of motion. Or maybe broadcasters think 1080i sounds more impressive than 720p, although I'd prefer the latter most of the time.
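To put some illustrative numbers on the 1080i50 vs 720p50 trade-off (my arithmetic, uncompressed sample counts only — real broadcast bitrates depend entirely on the codec):

```python
# Uncompressed luma samples per second for the two common HD broadcast
# formats. Both deliver 50 motion samples per second; 1080i spends its
# budget on a denser grid, 720p on full vertical resolution per sample.
px_1080i50 = 1920 * 540 * 50   # 50 fields/s, each field is 540 lines
px_720p50  = 1280 * 720 * 50   # 50 full frames/s

print(px_1080i50)  # 51840000
print(px_720p50)   # 46080000
```

Against 720p25 (25 frames/s), 1080i50's 50 fields per second is where the "higher temporal resolution" claim comes from.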

In Australia the switch from mpeg2 to mpeg4 has been slow because many early HD TVs and set top boxes can't decode mpeg4. I watch almost no free-to-air TV, but I think some broadcasters are using mpeg4 for some non-primary channels these days, while the rest is still overly-compressed mpeg2 crap. The latest "Freeview certified" devices only have to support 576p, 720p and 1080i, although mpeg4 is mandatory. We might have 4K TVs, but I can't see free-to-air broadcasts being 4K for a long time. I don't know what the situation is in the UK.

Last edited by hello_hello; 30th August 2017 at 18:12.
hello_hello is offline   Reply With Quote
Old 30th August 2017, 17:43   #4  |  Link
hello_hello
Registered User
 
Join Date: Mar 2011
Posts: 4,442
Quote:
Originally Posted by Katie Boundary View Post
Then you understand incorrectly. NTSC sets like the ones used in the USA and UK have 480 scanlines, and nothing about CRT technology inherently prevented us from making a TV standard with 100,000,000 scanlines or any other number we wanted. The real reason for interlacing was to maximize the screen refresh rate without increasing bandwidth usage.
Not true. Aside from the fact that the UK used PAL sets, not NTSC, a CRT picture is "drawn" one line at a time by an electron beam, and there were lots of limitations to overcome, hence the original UK system only being 405 line.
https://en.wikipedia.org/wiki/Interlaced_video#History

Edit: Or depending on how far back you go, as I discovered here, the BBC briefly broadcast using a 240-line progressive system, and even a 30-line system before that.

Last edited by hello_hello; 30th August 2017 at 18:09.
hello_hello is offline   Reply With Quote
Old 30th August 2017, 18:15   #5  |  Link
Sci-Fi-Fan
Registered User
 
Join Date: Oct 2002
Location: England
Posts: 44
Thank you hello_hello

Ok, from a broadcast/bandwidth POV it has some merits, but for a brand new Blu-ray filmed in 2016 and broadcast and released in 2017, surely it would be filmed and mastered in 1080p.
Then again, Doctor Who didn't move from standard definition to HD until 2010.

Interlacing may use less bandwidth, but imo the loss of quality is not worth the compromise.
Sci-Fi-Fan is offline   Reply With Quote
Old 30th August 2017, 18:19   #6  |  Link
kolak
Registered User
 
Join Date: Nov 2004
Location: Poland
Posts: 2,475
Quote:
Originally Posted by Sci-Fi-Fan View Post
Can any of you gurus out there explain to me why a TV network here in the United Kingdom is still using interlaced video in 2017?

As far as I understand it, interlacing was used decades ago to double horizontal resolutions for ancient CRT TVs that could only display 256 horizontal scan lines.

But with modern HD and 4K progressive digital displays that limitation has long since gone the way of the dinosaur.

So I just cannot fathom why a studio is still using such obsolete technology in our modern digital world.

Can anyone make sense of this?
Because they have tons of equipment which supports only 50i, and they're not going to invest millions to replace it. TV is very much a legacy business.
Movies are actually transmitted in progressive mode (at least on some channels), so it's not true that everything is shown interlaced. It will also depend on the original nature of the content; there is still plenty of interlaced-shot content around. The next step would be to move to 50p (not 25p, as that is problematic), which requires quite big hardware changes. They're just not willing to invest, which is why Netflix etc. were first to stream 4K/HDR. Technology is changing fast, and broadcast is very slow to adapt because it requires a lot of investment.

Last edited by kolak; 30th August 2017 at 18:24.
kolak is offline   Reply With Quote
Old 30th August 2017, 18:22   #7  |  Link
poisondeathray
Registered User
 
Join Date: Sep 2007
Posts: 4,485
Quote:
Originally Posted by Sci-Fi-Fan View Post

Ok, from a broadcast/bandwidth POV it has some merits, but for a brand new Blu-ray filmed in 2016 and broadcast and released in 2017, surely it would be filmed and mastered in 1080p.
Then again, Doctor Who didn't move from standard definition to HD until 2010.
Ok, but was it a "film", actually filmed? A theatrical piece?

Or was it "video", like you originally stated?

Because one common reason is that BD doesn't support 1080p50 or 1080p60, so if you need the temporal resolution, titles are released as "i". Although 720p50/720p60 are valid for BD, marketing says "1080" sounds better than "720".

And if it's a 24p theatrical film converted to "25p", the actual content is most likely 25p, not interlaced.
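For anyone unfamiliar with that 24p-to-25p conversion: the usual approach ("PAL speedup") just plays the same frames about 4% faster. A quick worked example (my numbers, for illustration):

```python
# PAL speedup: a 24 fps theatrical film shown at 25 fps keeps every
# frame, so the running time shrinks by a factor of 24/25.
frames = 24 * 60 * 120           # a 120-minute film at 24 fps
runtime_at_25 = frames / 25      # seconds when played back at 25 fps

print(frames)                    # 172800
print(runtime_at_25 / 60)        # 115.2 (minutes)
print(25 / 24 - 1)               # ~0.0417, i.e. ~4.2% faster
```

Which is why each frame really is a single moment in time — 25p, not interlaced, even if the stream is flagged "i".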
poisondeathray is offline   Reply With Quote
Old 30th August 2017, 18:27   #8  |  Link
kolak
Registered User
 
Join Date: Nov 2004
Location: Poland
Posts: 2,475
Movies are actually transmitted as progressive (at least that's what my TV is telling me), so no problem there.
kolak is offline   Reply With Quote
Old 30th August 2017, 18:33   #9  |  Link
hello_hello
Registered User
 
Join Date: Mar 2011
Posts: 4,442
Quote:
Originally Posted by Sci-Fi-Fan View Post
Thank you hello_hello

Ok from a broadcast / bandwidth pov it has some merits, but on a brand new Blu-Ray filmed in 2016, broadcast and released in 2017, surely it would be filmed and mastered in 1080p.
I'd agree, but according to this, it's not just a UK phenomenon. 1080i is a worldwide thing for free-to-air TV.
https://en.wikipedia.org/wiki/1080i#Broadcast_standard
Worldwide, most HD channels on satellite and cable broadcast in 1080i. In the United States, 1080i is the preferred format for most broadcasters, with Discovery Communications, Viacom, Time Warner, Comcast owned networks broadcasting in the format; along with most smaller broadcasters. Only 21st Century Fox-owned television networks and Disney-owned television networks, along with MLB Network and a few other cable networks use 720p as the preferred format for their networks; A+E Networks channels converted from 720p to 1080i sometime in 2013 due to acquired networks already transmitting in the 1080i format.

Would it necessarily be a bad thing if the content is progressive though? The progressive segmented frame (PsF) system allows progressive video to be treated as interlaced, and I assume the fields only need to be recombined again to get a progressive frame. If it's broadcast that way though, I don't know what happens at the TV end: whether the fields are just combined rather than de-interlaced, whether there's a way to distinguish it from real interlaced video, and whether a TV would de-interlace blindly or only when combing is detected. I assume the chroma subsampling is the same as for interlaced, but I've not really thought about any of that before. Someone else will probably know more, but in a perfect world PsF would be distinguishable from interlaced (EDIT: going by what kolak has said, it appears that's the case).
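On the "only when combing is detected" idea: a toy sketch of such a detector (my own illustration, not any TV's actual algorithm) just measures how much adjacent lines disagree — weaving two fields captured at different times produces a sawtooth on moving edges, while a PsF frame stays smooth:

```python
# Toy combing metric over a 1-pixel-wide "frame" (one value per line).
def comb_energy(frame):
    """Sum of |line[i] - line[i+1]| down the frame."""
    return sum(abs(a - b) for a, b in zip(frame, frame[1:]))

progressive = [10, 20, 30, 40, 50, 60]      # smooth gradient (PsF-like)
# Same gradient, but the odd lines come from a field captured later,
# after the object moved by +100:
interlaced = [10, 120, 30, 140, 50, 160]

print(comb_energy(progressive))  # 50
print(comb_energy(interlaced))   # 510
```

A real deinterlacer is far more sophisticated, but the principle is the same: a big line-to-line sawtooth suggests genuine interlacing; its absence suggests the fields can simply be woven back together.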

Quote:
Originally Posted by Sci-Fi-Fan View Post
Then again, Doctor Who didn't move from standard definition to HD until 2010.
I'm not sure if it's changed, but 576p (as opposed to 576i) was/is considered to be HD, at least with respect to government regulations in Australia. I remember 576p being pretty common and being pretty peeved about it. For all I know, it still is.

Last edited by hello_hello; 30th August 2017 at 18:53.
hello_hello is offline   Reply With Quote
Old 30th August 2017, 18:49   #10  |  Link
Sci-Fi-Fan
Registered User
 
Join Date: Oct 2002
Location: England
Posts: 44
Ok, poisondeathray has hit the nail on the head. The UK PAL standard of 25fps is not Blu-ray compliant at 1080p resolution, so it has to be encoded as interlaced, or, in the case of the Blu-ray I'm referring to, as MBAFF.
So in all likelihood it was filmed as progressive but encoded as interlaced/MBAFF to be compliant with the Blu-ray spec.

I guess when they created the Blu-ray spec they didn't take the PAL 25fps standard into consideration :/

Kolak also makes a very valid point about the cost of upgrading/replacing outdated equipment.
Sci-Fi-Fan is offline   Reply With Quote
Old 30th August 2017, 18:55   #11  |  Link
kolak
Registered User
 
Join Date: Nov 2004
Location: Poland
Posts: 2,475
Problem is that some older broadcast equipment doesn't even support PsF, as it was introduced a bit later than the original interlaced specs.

This is probably how movies are transmitted: as PsF when they leave the transmission room. At the consumer end (in the TV's decoder) it's treated as progressive, and it's indistinguishable from "real" progressive.
kolak is offline   Reply With Quote
Old 30th August 2017, 19:12   #12  |  Link
hello_hello
Registered User
 
Join Date: Mar 2011
Posts: 4,442
Quote:
Originally Posted by Sci-Fi-Fan View Post
Ok, poisondeathray has hit the nail on the head. The UK PAL standard of 25fps is not Blu-ray compliant at 1080p resolution, so it has to be encoded as interlaced, or, in the case of the Blu-ray I'm referring to, as MBAFF.
So in all likelihood it was filmed as progressive but encoded as interlaced/MBAFF to be compliant with the Blu-ray spec.
According to the info here, for Blu-ray compatibility 1080p25 has to be encoded with x264 using its "fake interlaced" option, which I've never really understood completely.
The info here says it simply marks a progressive stream as interlaced, but given interlaced and progressive video are different in respect to chroma subsampling, I don't understand how that affects (or doesn't affect) the way it'd be decoded.
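For reference, an invocation using that option might look roughly like this (a sketch only — a genuinely compliant Blu-ray encode needs many more options such as VBV, GOP and SAR settings, and the filenames here are made up; check your build's `x264 --fullhelp`):

```shell
# Encode a 1080p25 source as Blu-ray-style "fake interlaced":
# the stream is coded progressive but flagged for interlaced handling.
x264 --bluray-compat --fake-interlaced --fps 25 --level 4.1 \
     --output out.264 input_1080p25.y4m
```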
hello_hello is offline   Reply With Quote
Old 30th August 2017, 19:19   #13  |  Link
kolak
Registered User
 
Join Date: Nov 2004
Location: Poland
Posts: 2,475
It's just treating 25p as interlaced, but with no time offset between the fields. Because there is no offset in time between them, they can be perfectly put back together into a whole frame.
At the end you get the same (progressive) picture. It's just a "hack", as 25p was not included in the BD spec for whatever reason!
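That losslessness is easy to demonstrate (a toy model, treating a "frame" as a list with one value per line): splitting a progressive frame into two fields and weaving them back reproduces it exactly, because both fields come from the same instant.

```python
# Split a progressive frame into top/bottom fields, then weave them
# back. With no temporal offset between fields, the round trip is exact.
def split_fields(frame):
    return frame[0::2], frame[1::2]   # top field, bottom field

def weave(top, bottom):
    out = []
    for t, b in zip(top, bottom):
        out.extend([t, b])
    return out

frame = [10, 11, 12, 13, 14, 15]      # six "lines" of a progressive frame
top, bottom = split_fields(frame)
print(weave(top, bottom) == frame)    # True
```

With real interlaced capture, the two fields sample different moments, so the woven result shows combing on motion; with 25p-as-interlaced it's a perfect reconstruction.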

Last edited by kolak; 30th August 2017 at 19:22.
kolak is offline   Reply With Quote
Old 30th August 2017, 19:48   #14  |  Link
hello_hello
Registered User
 
Join Date: Mar 2011
Posts: 4,442
How does the decoder know it's fake interlaced though, as opposed to the real thing, and how would the player/TV know not to de-interlace it but treat it as progressive?

The same questions arise in relation to chroma subsampling. The way I understand it, the chroma for each field is subsampled individually for interlaced video, given each field is a different moment in time, but how would the decoder know the "fake interlaced" video has progressive chroma subsampling rather than interlaced chroma subsampling? I'm thinking it should matter, but maybe not?
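To make the question concrete, here's a toy model (my own simplification: one chroma value per line, subsampling modelled as averaging pairs of lines) of why the two schemes can keep different values from the same pixels:

```python
# 4:2:0 halves chroma vertically. Progressive coding pairs frame lines
# (0,1), (2,3), ...; interlaced coding pairs lines within each field,
# i.e. frame lines (0,2) for the top field and (1,3) for the bottom.
chroma_lines = [0, 100, 0, 100]   # one chroma value per line

progressive = [(chroma_lines[i] + chroma_lines[i + 1]) / 2
               for i in range(0, 4, 2)]                    # lines (0,1),(2,3)
top_field    = [(chroma_lines[0] + chroma_lines[2]) / 2]   # lines 0 and 2
bottom_field = [(chroma_lines[1] + chroma_lines[3]) / 2]   # lines 1 and 3

print(progressive)              # [50.0, 50.0]
print(top_field, bottom_field)  # [0.0] [100.0]
```

So on this alternating pattern, progressive subsampling blurs the lines together while per-field subsampling preserves them — which is why decoding with the wrong assumption could plausibly shift or smear the chroma.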

I assume for PsF, how it's done is part of the specification (although I can't find definitive information on how PsF's chroma-subsampling works) but "fake interlaced" seems like it'd be a different story.
hello_hello is offline   Reply With Quote
Old 30th August 2017, 20:16   #15  |  Link
sneaker_ger
Registered User
 
Join Date: Dec 2002
Posts: 5,570
x264's "fake interlaced" is PAFF. In the bitstream there are flags to tell the decoder which frames are progressive and which are made of two fields. This is allowed. x264 then simply doesn't mark any frame as interlaced.
sneaker_ger is offline   Reply With Quote
Old 30th August 2017, 20:42   #16  |  Link
kolak
Registered User
 
Join Date: Nov 2004
Location: Poland
Posts: 2,475
Which should translate to a perfect progressive decode.
PsF is related to SDI, and handling it is defined by the strict SDI spec. PsF doesn't really exist outside the SDI world. A PsF SDI signal is translated by many video cards to a progressive HDMI signal (sometimes to an interlaced one).
It's all legacy.

Last edited by kolak; 30th August 2017 at 20:45.
kolak is offline   Reply With Quote
Old 30th August 2017, 20:47   #17  |  Link
Katie Boundary
Registered User
 
Katie Boundary's Avatar
 
Join Date: Jan 2015
Posts: 771
Sorry, I just checked and the UK uses PAL, not NTSC. I could have sworn that I once saw a standards map that showed the UK as NTSC.

I stand by everything else that I said.
__________________
If I ask "How do I do X?" or "what happens if I do X?", and X is a very bad thing that no one would ever normally do, assume that I already know this, and that I have Katie reasons for asking anyway.
Katie Boundary is offline   Reply With Quote
Old 30th August 2017, 22:08   #18  |  Link
hello_hello
Registered User
 
Join Date: Mar 2011
Posts: 4,442
Quote:
Originally Posted by Katie Boundary View Post
I stand by everything else that I said.
Well if you're going to be wrong, why not be proud of it?
hello_hello is offline   Reply With Quote
Old 30th August 2017, 22:17   #19  |  Link
hello_hello
Registered User
 
Join Date: Mar 2011
Posts: 4,442
Quote:
Originally Posted by sneaker_ger View Post
x264 "fake interlaced" is PAFF. In the bitstream there are flags to show the decoder which frames are progressive and which are made of two fields. This is allowed. And then x264 simply doesn't mark any frame as interlaced.
Ahh..... that makes sense.

Cheers.
hello_hello is offline   Reply With Quote
Old 30th August 2017, 22:26   #20  |  Link
StainlessS
HeartlessS Usurer
 
StainlessS's Avatar
 
Join Date: Dec 2009
Location: Over the rainbow
Posts: 8,860
PAL
Code:
Afghanistan
Algeria
Argentina (N)
Austria
Australia
Bangladesh
Belgium
Brazil (M)
China
Denmark
Finland
Germany
Hong Kong
Iceland
India
Indonesia
Iraq
Ireland
Israel
Italy
Jordan
Kenya
Kuwait
Liberia
Malaysia
Netherlands
Nigeria
Norway
New Guinea
Pakistan
Singapore
South Africa
South W. Africa
Sudan
Sweden
Switzerland
Thailand
Turkey
Uganda
United Kingdom
United Arab Emirates
Yugoslavia
Zambia
NTSC
Code:
Canada
Chile
Costa Rica
Cuba
Dominican Republic
Ecuador
Japan
Mexico
Nicaragua
Panama
Peru
Philippines
Puerto Rico
South Korea
Taiwan
U.S.A.
NTSC, rumoured to stand for "Never Twice the Same Colour".
__________________
I sometimes post sober.
StainlessS@MediaFire ::: AND/OR ::: StainlessS@SendSpace

"Some infinities are bigger than other infinities", but how many of them are infinitely bigger ???
StainlessS is online now   Reply With Quote