View Full Version : Interlacing and channel bandwidth


Rinehart
07-30-2012, 03:34 PM
I've come across a curious statement in Albert Abramson's Zworykin: Pioneer of Television. Discussing the invention of odd-line interlacing to reduce flicker, he says "In addition [to reducing image flicker] doubling the field rate, which cut the number of lines in each field in half, afforded a considerable saving in channel bandwidth." (p. 118) But why would it? The number of lines scanned per second doesn't change, only the order does, and presumably this has no effect on the number of picture elements per line. Donald Fink in Television Engineering doesn't mention this as an advantage. On the contrary, he says "It must be understood that increasing the downward...velocities to twice the values they would have in progressive scanning does not mean that any more lines are scanned in the complete pattern." (p. 47, 1st edition, 1940) What am I missing here?

bob91343
07-30-2012, 05:24 PM
That is wrong; the bandwidth is unaffected by the scanning pattern.

I just hate it when someone is allowed to publish without intelligent editing.

Opcom
07-30-2012, 09:45 PM
Two hypothetical progressive scan cases where the horizontal pixel resolution and the number of H lines in a complete image on the CRT are the same as with an interlaced system:

If the 525 H lines are scanned progressively 60 times a second, the term "field rate" would have no meaning except as a V deflection rate. The frame rate would double from 30 Hz to 60 Hz, the H freq. would double from about 15.75 kHz to about 31.5 kHz, and the required pixel rate or pixel clock, a form of bandwidth representation, would also double if the horizontal resolution were to remain the same. One solution would be to use more RF bandwidth.

The above would be similar to the non-interlaced 640x480 VGA 'standard' running at about 60 Hz V and 31.5 kHz H.

If the 525 H lines are scanned progressively 30 times a second, the term "field rate" would again have no meaning except as a V deflection rate. The frame rate would be the same at 30 Hz, the H freq. would remain the same at about 15.75 kHz, and the required pixel rate or pixel clock, for the same H resolution, would remain the same.

There would likely be an annoying flicker due to the 30 Hz vertical rate. It would be like 24-frame film in a theater, except that a CRT display is a bit brighter, so the effect would be more pronounced; this is partly offset by the slightly higher 30 Hz V rate. One solution would be to use longer-persistence phosphors.


Therefore the interlacing scheme represents a compromise solution taking into account the relationship between these two factors:
1.) the pixel clock frequency
and
2.) the refresh rate for a given volume of pixels.

In most analog video systems, bandwidth in MHz translates directly into picture elements per unit time:
resolution = pixels / time
bandwidth = information / time

The NTSC scheme interlaces half the image every 1/60 second; it takes 1/30 second to present the complete information. The first case above presents all of the information in 1/60 second. The second case presents the information in 1/30 second, but in a progressive manner.
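
A quick sanity check of the above in Python (the 700 pixels per active line is just an assumed round figure for illustration, not any standard's number):

LINES = 525
H_RES = 700   # assumed pixels per active line, purely for illustration

def rates(frames_per_sec):
    h_freq = LINES * frames_per_sec   # lines scanned per second
    pixel_rate = h_freq * H_RES       # picture elements per second
    return h_freq, pixel_rate

for name, fps in [("525/60 interlaced (30 full frames/s)", 30),
                  ("525/60 progressive", 60),
                  ("525/30 progressive", 30)]:
    h, p = rates(fps)
    print(f"{name}: H = {h / 1e3:.2f} kHz, pixel rate = {p / 1e6:.2f} MHz")

The interlaced and 30 Hz progressive cases land on the same pixel rate; only the 60 Hz progressive case doubles it.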

The author is correct in what he seems to have meant, but not in what was said. He may not have explained it completely or properly. It is possible that the clarity of his statement relies upon information presented elsewhere in the volume.

If what I have said is wrong, I am willing to consider rebuttals or corrections.

NewVista
07-31-2012, 01:52 AM
Isn't that the very reason TPTB chose interlace: to halve the channel size?

old_tv_nut
07-31-2012, 05:34 PM
Interlace improves the TRADEOFF among flicker, bandwidth, and spatial resolution; so the author's statement is correct - it just mentions one side of this three-legged stool.

By the way, movie projectors do not operate at 24 Hz because the flicker would be intolerable even at lower brightness. They always (at least) double-shutter to get a 48 Hz flicker rate.

ppppenguin
08-01-2012, 01:19 AM
For stationary pictures, as Opcom has explained, interlace does halve the bandwidth needed for a given resolution and refresh rate*. For moving pictures it's more complex. Yes, there is better temporal resolution for moving objects, though vertical resolution in those objects is reduced. Also when you try to de-interlace the picture, as required for LCD panels etc, you soon find out that it's not easy to do well.

It is now simple to do the TV equivalent of multiblade shutters as used in movie projectors. Framestores were a long way in the future in the 1930s:)

*It's not exactly half. There are additional artefacts caused by interlace that give a lower perceived vertical resolution than you might expect. The Kell factor is used to quantify this.

old_tv_nut
08-01-2012, 11:01 AM
...*It's not exactly half. There are additional artefacts caused by interlace that give a lower perceived vertical resolution than you might expect. The Kell factor is used to quantify this.

The "Kell Factor" relates to having scan lines (sampling in the vertical direction); there is an additional degradation if the lines are interlaced. But since all widely used TV standards were interlaced, the term Kell factor was applied to the net effect in interlaced pictures. This is taken to be roughly 0.7. This is a subjectively determined number and not a law of nature, and can vary greatly depending on the brightness of the picture, the viewing distance, the contrast of the test pattern details, and the refresh rate.

When non-interlaced sampling was considered (the horizontal pixel sampling in digital versions of 525- and 625-line systems, or the vertical resolution of a progressively scanned system), a higher factor could be applied. For the horizontal sampling, SMPTE and ITU standardized on filters that are 3 dB down at 0.85 of the Nyquist rate. With these specs, the system was judged to be transparent to the analog signal. The limiting resolution is probably about 0.9 of Nyquist.
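
As rough arithmetic (a minimal Python sketch; 0.7 is the subjective figure discussed above, and 13.5 MHz is the BT.601 SD luma sampling rate):

def effective_tvl(active_lines, kell=0.7):
    # perceived vertical resolution in TV lines, using the subjective 0.7 figure
    return active_lines * kell

print(effective_tvl(480))        # ~336 TVL perceived for 525/60 (480 active lines)
print(effective_tvl(576))        # ~403 TVL perceived for 625/50 (576 active lines)

fs = 13.5e6                      # BT.601 SD luma sampling rate
print(0.85 * fs / 2 / 1e6)       # horizontal filter -3 dB point: ~5.74 MHz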

ppppenguin
08-02-2012, 02:29 AM
AFAIK there's nothing very scientific about the Kell factor. As oldtvnut says, it's determined subjectively without a great deal of theoretical backup. My feeling, FWIW, is that it depends significantly on the vertical scanning aperture. In tube cameras this is Gaussian, which gives a falling vertical spatial frequency response, usually corrected, at least partially, by vertical aperture correction. Likewise in CRT receivers, but without such correction. LCD displays and CCD cameras have a very different vertical aperture, much squarer.

The Kell factor has been used to justify decisions about choosing H bandwidth wrt the number of lines. Has any good experimental work been done with modern cameras and displays? In any case all HD systems we use have square pixels, so if there is still a Kell effect there is a shortfall of vertical resolution.

old_tv_nut
08-02-2012, 02:36 PM
...In any case all HD systems we use have square pixels, so if there is still a Kell effect there is a shortfall of vertical resolution.

Not sure what you mean here. "Square pixels" is a lousy term, as it does not refer to the display elements having a certain shape. This phrase is used to mean that the centers of the pixels have the same spacing vertically and horizontally. Pixels are point values of color and do not have shape or dimensions.

If you mean the display elements have square shapes, then, yes, this implies a certain vertical and horizontal spatial frequency response, different from that with a Gaussian CRT spot.

ppppenguin
08-03-2012, 02:03 AM
Not sure what you mean here. "Square pixels" is a lousy term.....

This is a commonplace term to indicate that the pixel spacing is equal on the H and V axes, by contrast to SD digital systems where the spacing is not equal.

As oldtvnut correctly points out, in all sampling theory the idealised sample is infinitesimal in length. (Dirac delta function if anyone is that interested). In TV this is generalised to 2 dimensions rather than one. Practical pixels have a finite size and shape. For LCD displays and CCD cameras this ideally approaches a square having the same dimensions as the pixel spacing. This gives a zero order hold function and hence a loss of HF response on both axes which follows a sin(x)/x curve.

The point I am trying to make is that the assumptions which underpin Kell factor stem from the days when H scanning was a continuous function while vertical scan was sampled. These assumptions may well not apply when the picture is inherently sampled at the sensor on both axes. As a thought experiment consider a sensor and/or display where each pixel can be individually addressed. They can then be read or written in an arbitrary sequence*. I can conceive that this might affect motion portrayal (motion above a very slow rate is aliased in TV systems) but I cannot see how it might affect our perception of H and V resolution. Hence the Kell factor of a progressively scanned system using modern techniques should be unity.

I may have overlooked something here. For example unless there is some kind of optical filter before the sensor there can be H and V aliasing. Or there may be performance problems of the sensor that affect the axes differently.

*In doing this thought experiment I was influenced by BBC Research Report 1991/4 "Image Scanning using a Fractal Curve" by John Drewery. http://www.bbc.co.uk/rd/publications/rdreport_1991_04.shtml John Drewery had a superb understanding of scanning, sampling and spectra. Back in about 1975 I remember him demonstrating the 3 dimensional spectrum of TV signals (PAL in this case) using some wonderful models that he had the BBC Research Dept workshop make from pieces of coloured PTFE. Nowadays this would have been done by computer graphics.

NewVista
08-03-2012, 09:32 AM
There was some talk a while back that Europe, by delaying HD adoption,
intended to avoid any interlaced format in their new (1080?) standard.
What became of this?

old_tv_nut
08-03-2012, 03:02 PM
...Hence the Kell factor of a progressively scanned system using modern techniques should be unity.

I may have overlooked something here. For example unless there is some kind of optical filter before the sensor there can be H and V aliasing. ...


*In doing this thought experiment I was influenced by BBC Research Report 1991/4 "Image Scanning using a Fractal Curve" by John Drewery. http://www.bbc.co.uk/rd/publications/rdreport_1991_04.shtml John Drewery had a superb understanding of scanning, sampling and spectra. Back in about 1975 I remember him demonstrating the 3 dimensional spectrum of TV signals (PAL in this case) using some wonderful models that he had the BBC Research Dept workshop make from pieces of coloured PTFE. Nowadays this would have been done by computer graphics.

The Kell factor relates to how close to unity you can come, even with progressive scan. You can't get unity because the phase of sampling is restricted to the positions of the sampling points. That is, if the details lie exactly on the sampling points, you get full amplitude, but if they lie halfway between sampling points, you get zero amplitude. The sampling becomes equivalent to a synchronous detector. So, the Kell factor says how close you can come to unity and still perceive a repetitive pattern correctly given that the filtering in the system consists of the sensor spot or element shape, the display spot or element shape, and the human eye optical function. Interlace makes it worse, but it's not unity for progressive.
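
A tiny numerical illustration of the phase effect (a minimal sketch: a test grating at exactly the Nyquist frequency of the line structure, sampled at two different phases):

import numpy as np

n = np.arange(8)                            # the sampling (scan-line) positions
on_points = np.cos(np.pi * n)               # detail peaks land on the points
in_between = np.cos(np.pi * n + np.pi / 2)  # peaks land halfway between points

print(np.round(on_points, 3))    # alternates +1/-1: full amplitude recovered
print(np.round(in_between, 3))   # all ~0: the same detail vanishes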

For a full explanation of 3-dimensional spectra resulting from scanning, I also recommend an out-of-print book by Pearson:
http://www.amazon.com/Transmission-Display-Pictorial-Information-Pearson/dp/0727321013/ref=sr_1_1?s=books&ie=UTF8&qid=1344020526&sr=1-1&keywords=pearson+television+transmission

old_tv_nut
08-03-2012, 03:11 PM
Another note: Dr. Schreiber at MIT proposed random scan (random pixel sequence) as part of a high-definition TV system in the late 80s/early 90s. With proper frequency pre-emphasis / de-emphasis, channel degradations would appear as an increased noise level near edges, where it would be masked by the human visual system. Still images looked promising, but I don't recall if full high-def motion was ever achieved. Such partially analog systems (Zenith also proposed one) were overtaken by the development of all-digital systems using MPEG compression.

ppppenguin
08-04-2012, 02:03 AM
You can't get unity because the phase of sampling is restricted to the positions of the sampling points. That is, if the details lie exactly on the sampling points, you get full amplitude, but if they lie halfway between sampling points, you get zero amplitude.

This would apply equally to both X and Y axes in a sampled system, so resolution is degraded equally on both axes. It is also a misunderstanding of sampling theory, which works equally well in space as in time. If you sample a signal using infinitesimal-size samples at more than the Nyquist limit, the original signal can be reconstructed exactly. This is also applicable to 2-dimensional sampling, as noted by Mertz and Gray in their famous 1934 paper. If the sampling aperture is finite you get a falling frequency response, the exact response depending on the shape of the sample aperture. For example a square aperture completely filling the sample pitch would give a sin(x)/x response with nearly 4 dB of drop as you approach the Nyquist limit. In the real world it is difficult to put a sharp-cutoff optical filter ahead of the sensor so there will be aliasing. This suggests using a sensor with more pixels than needed for your TV system and filtering the output. This is equivalent to using oversampling ADCs. It is also one reason why some of the best SD pictures are obtained by downsampling an HD input.
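
A quick check of the aperture figure (minimal Python, assuming a full-pitch square aperture):

import math

def square_aperture_db(f_rel):
    # f_rel = frequency as a fraction of the Nyquist limit; a full-pitch
    # square aperture gives a sin(x)/x response with x = pi * f * pitch
    x = math.pi * f_rel / 2
    return 20 * math.log10(math.sin(x) / x)

print(round(square_aperture_db(0.5), 2))   # about -0.91 dB at half Nyquist
print(round(square_aperture_db(1.0), 2))   # about -3.92 dB at the Nyquist limit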

NewVista
08-04-2012, 02:26 AM
This suggests using a sensor with more pixels than needed for your TV system and filtering the output.

This should be a no-brainer, yet only movie people use 4k cameras!
Always wondered why HDTV cameras are only 2k. Naive?

ppppenguin
08-04-2012, 06:32 AM
This should be a no-brainer, yet only movie people use 4k cameras!
Always wondered why HDTV cameras are only 2k. Naive?

Depends on the tradeoffs when you're making the sensors. In the early days of CCDs the makers were struggling to get them to work at all; you got as many pixels as you could reasonably make work. There are also two tradeoffs that may well be fundamental. For a given image size on the chip, if you have more pixels each one is smaller and hence collects less light. Also the fill factor, the fraction of the chip's surface that's sensitive to light, goes down. While a theoretically ideal sampler has infinitesimally small pixels, it would also have infinitesimal sensitivity. Hence the sensor designer strives to fill as much of the space as possible with pixels and leave minimum space between them.
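
Rough numbers for the light-gathering tradeoff (a minimal sketch; the 9.6 x 5.4 mm active area assumed for a 16:9 2/3-inch sensor is approximate):

W_MM, H_MM = 9.6, 5.4   # assumed active area of a 16:9 2/3-inch sensor

for cols, rows in [(1280, 720), (1920, 1080), (3840, 2160)]:
    pitch_um = W_MM / cols * 1000
    area_um2 = (W_MM * 1000 / cols) * (H_MM * 1000 / rows)
    print(f"{cols}x{rows}: pitch ~{pitch_um:.1f} um, "
          f"~{area_um2:.2f} um^2 per pixel (before fill-factor losses)")

Each doubling of resolution per axis quarters the area, and hence the light, per pixel.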

Image size is important. For cine camera replacement you want to be able to use your existing stock of 35mm prime lenses. Hence the sensor size needs to replicate 35mm film area. For TV the sensors are smaller.

I haven't looked at what size sensors Super Hi-Vision uses but the fundamental resolution is about 8k x 4k. I saw a demonstration a few days ago at BBC Broadcasting House, some recordings from the Olympics. NHK and the BBC have worked together to televise parts of the Olympics on this new system. Only 3 cameras, so a refreshing return to old-fashioned production values: lots of lingering wide shots, minimal pans or zooms. You don't need closeups when you have that much resolution available. From my seat, about 30 feet from a 25-foot screen, the pictures were perfectly detailed and flawless, even under difficult lighting conditions such as fireworks.

The pictures were also being relayed to Bradford, Glasgow, Washington DC, Tokyo and Fukushima so some of you may have had a chance to see them.

NewVista
08-04-2012, 10:07 AM
..For a given image size on the chip, if you have more pixels each one is smaller and hence collects less light..

I hadn't considered that issue. HD clearly needs to migrate to larger-format chips, and cinema camera innovations will drive this change.

Have not noticed any motion artifacts in the Olympics coverage with BBC-originated HD at 50 Hz - but Hi-Vision is probably 60 Hz?

The EBU needs to continue to push for 1080p in 7 and 8 MHz channels and a dual 50/60 Hz standard.

NewVista
08-04-2012, 10:16 AM
Perhaps the EBU need not worry about interlace given the present availability of 50/60p cameras:
with an alternate method of interlace generation, motion artifacts could be avoided in a similar manner to film scanning (see diagram)

ppppenguin
08-05-2012, 03:36 AM
In digital broadcasting there isn't really any such thing as "an interlaced channel", just a coding standard and a bit rate. I don't have a citation to hand but it's probably easier to get a good picture at a lower bit rate when you start with a progressive source. As 1080/50p (and 60p) equipment becomes more readily available I think it will become standard for originating material, hence removing the compromise between 1080/50i, 720/50p and 1080/25p, and the 60Hz-related equivalents.

In the set of SMPTE standards for handling full-bandwidth HD digits there are 2 basic bit rates, 1.5 Gb/s and 3 Gb/s. The latter is needed for 1080/50p and 1080/60p, where the pixel clock is 148.5 MHz. All the others (1080/50i, 720/60p and lots more) fit happily in 1.5 Gb/s with a 74.25 MHz pixel clock. To add to the proliferation of standards, all the 30 Hz and 24 Hz related standards have a variant with the pixel clock multiplied by 1000/1001 to fit with the 59.94 Hz NTSC field rate. This has always caused trouble with timecode. Now that NTSC is just about officially dead for broadcasting I can't see any reason for originating programme material on these standards.
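
The arithmetic behind the two rates (a minimal Python sketch; total raster sizes, including blanking, are as I recall them from SMPTE 274M/296M):

# total raster (including blanking) x frame rate = pixel clock;
# x 2 samples per pixel (Y, alternating Cb/Cr) x 10 bits = serial bit rate
formats = [("1080/50i", 2640, 1125, 25),
           ("720/60p",  1650,  750, 60),
           ("1080/50p", 2640, 1125, 50),
           ("1080/60p", 2200, 1125, 60)]

for name, w, h, fps in formats:
    clk = w * h * fps
    print(f"{name}: pixel clock {clk / 1e6:.2f} MHz, "
          f"serial rate {clk * 20 / 1e9:.3f} Gb/s")

The first two come out at 74.25 MHz and 1.485 Gb/s (the nominal "1.5G"), the progressive pair at 148.5 MHz and 2.97 Gb/s (the nominal "3G").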

NewVista's sketch is not unlike 1080/24PsF. This is effectively 48 Hz interlaced in the channel but carrying 24 Hz progressive. This allows material originated at 24 Hz to be displayed on a CRT monitor without undue flicker.

Standards, don't you just love them? So let's have lots of them :no:

NewVista
08-05-2012, 11:11 PM
All right, I see they have a name for it: "30PsF". So my sketch,
which I thought of a while back, will not earn a patent :D

But it would obviate the need for complex & flawed motion compensation in deinterlacers if broadcasts could somehow be flagged to switch off/bypass the motion processor, as for film-sourced programming.

ppppenguin
08-06-2012, 02:00 AM
30PsF isn't actually in the list of SMPTE standards:) Implicitly 25PsF has been used for years in Europe for film material, where 24fps film has traditionally been shown at 25fps. The 4% faster running speed is just accepted as normal; the sound pitch likewise, or it can be corrected.

The US has suffered badly from its standards. 59.94 Hz fouls up timecode for the production people. 3:2 pulldown fouls up transmission of 24fps film. This can be overcome with advanced standards converters. These can convert 24 Hz material to 30 Hz without significant quality loss. They can also recognise and remove 3:2 artefacts.

NewVista
08-07-2012, 12:43 PM
.. 59.94 Hz fouls up timecode for the production people..

Were there problems editing with dropframe timecode?

ppppenguin
08-08-2012, 03:29 AM
Drop frame doesn't work too badly on short programmes, as all editing software has been designed to cope with its peculiarities. Generating it in the long term requires all sorts of corrections as there is no simple relationship between clock time and dropframe TC.
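
A rough sketch of why (minimal Python, assuming the standard drop-frame rule: frame numbers 0 and 1 are skipped each minute, except every tenth minute):

ACTUAL_FPS = 30000 / 1001                 # NTSC frame rate, ~29.97 fps

real = ACTUAL_FPS * 3600                  # frames in one clock hour: ~107892.1
counted = 30 * 3600                       # what straight 30 fps counting assumes
print(counted - real)                     # ~107.9 frames (about 3.6 s) of drift

dropped = 2 * (60 - 6)                    # 2 numbers/minute, 54 minutes/hour
print(dropped)                            # 108: close to the drift, but not exact

The residual fraction of a frame per hour is why long-term generation needs those extra corrections.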

It's all due to a decision made at the start of NTSC colour, long before TC was invented. The relationship between colour SC and sound carrier had to be set to minimise certain crossmodulation problems. It was felt that moving the sound carrier by 0.1% would upset too many existing receivers so they moved the H, V and SC frequencies instead. We've been living with the consequences ever since TC was invented.

cbenham
08-18-2012, 02:44 AM
Drop frame doesn't work too badly on short programmes, as all editing software has been designed to cope with its peculiarities. Generating it in the long term requires all sorts of corrections as there is no simple relationship between clock time and dropframe TC.

It's all due to a decision made at the start of NTSC colour, long before TC was invented. The relationship between colour SC and sound carrier had to be set to minimise certain crossmodulation problems. It was felt that moving the sound carrier by 0.1% would upset too many existing receivers so they moved the H, V and SC frequencies instead. We've been living with the consequences ever since TC was invented.

The original color subcarrier frequency tested by RCA was 3.583125 MHz, which worked perfectly with 60/15750 on color receivers. However, it caused visible moiré patterns in received images on B&W intercarrier receivers during the tests because it beat against the 4.5 MHz audio subcarrier.

Changing the H & V frequencies slightly to 15734.26 Hz and 59.94 Hz and reducing the color subcarrier to 3.579545 MHz resolved the problem for the B&W sets, although it was never a problem for the new color sets.
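
The ratios work out exactly (a minimal Python sketch of the relationships described above):

from fractions import Fraction

SOUND = 4_500_000              # 4.5 MHz intercarrier sound offset, left untouched
H = Fraction(SOUND, 286)       # new line rate: 4.5 MHz / 286
SC = Fraction(455, 2) * H      # colour subcarrier: 455/2 times the line rate
V = H / Fraction(525, 2)       # field rate: line rate / 262.5

print(float(H))                # 15734.265... Hz (was 15750)
print(float(SC))               # 3579545.45... Hz (the familiar 3.579545 MHz)
print(float(V))                # 59.9400... Hz (was 60)
print(455 / 2 * 15750)         # 3583125.0: RCA's original subcarrier choice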

Much later, this change proved to be the undoing of videotape editing, special effects, standards conversion, and digital video.

Life in the engineering department would have been so much easier if they'd left it alone.
Cliff

NewVista
08-28-2012, 04:20 PM
More on the Hi-Vision experiment
http://www.bbc.com/news/technology-19370582
The article says UK HD is 25 fps.
With 8 MHz channels they should have followed the EBU with 50p,
especially since they have 12.5% more bandwidth than the Continent
--and 25% more bandwidth than a US channel (which can do 720 @ 60p),
as 25 fps is a really poor motion sampling rate.

Penthode
11-29-2012, 01:17 PM
Some good points here. I recall John Watkinson wrote a paper in 1998 on video oversampling. He stated that because of the optical filtering ahead of the sensor, it is not necessary to use so many lines to deliver HD. If, on the other hand, the number of pixels on the sensor is substantially higher, followed by the optical low-pass filter, rescaling to fewer lines will not result in loss of spatial resolution. The only caveat is that oversampling only really works with non-interlaced video.

I believe we now underestimate the resolution of Image Orthicon video, since resolution was limited by the structure of the target element and not by a digital imager's pixel array. Hence higher-pixel-count imagers, rescaling and progressive scan are the future.

Nevertheless, I would have liked to have seen what a 4" IO tube could yield in terms of spatial resolution.

old_tv_nut
11-30-2012, 10:06 AM
Otto Schade did research in which the raster was reduced in size. With a fine beam, it showed that the target was capable of finer resolution.

In pickup tubes, it is necessary to have the beam wide enough to produce complete discharge of the target in one field (not one frame). Otherwise, there is an inverse raster of unread charge remaining on the target. The next scan will then produce a coarse moiré or flicker as it scans slightly out of register with the original scan. This can occur for either interlace or progressive. Of course, for interlace, the beam must be twice as wide as for progressive with the same number of total lines per frame. In old recordings, you can sometimes see this flicker in areas where the beam focus was too good. A similar effect could occur on kinescope recordings when they were rebroadcast, but there the moiré was more often a higher-frequency swirly line pattern because the rescan had a much worse match to the original scan than the two fields in a camera.

In CCD pickups, it was found that using every other line of elements for interlaced scanning resulted in far too much vertical resolution, producing excessive interline flicker in interlaced CRT displays. Therefore, rows were combined (with a coefficient that was adjustable to set the vertical peaking in high-end gear). This was the equivalent of the wide scanning beam in tube pickups.
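
A toy sketch of that row combining (minimal numpy; the 0.5 coefficient is only an illustrative peaking setting):

import numpy as np

rows = np.arange(12.0)   # stand-in for the charge read from 12 sensor rows
k = 0.5                  # combining coefficient (illustrative; adjustable in high-end gear)

# field 1 reads pairs (0,1), (2,3), ...; field 2 reads pairs (1,2), (3,4), ...
field1 = (rows[0:10:2] + k * rows[1:11:2]) / (1 + k)
field2 = (rows[1:11:2] + k * rows[2:12:2]) / (1 + k)
print(field1)   # each output line is a weighted average of two sensor rows -
print(field2)   # a two-tap vertical low-pass, the wide-beam equivalent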

NewVista
12-12-2012, 05:33 PM
More on "PsF" (http://en.wikipedia.org/wiki/Progressive_segmented_frame), now being used more widely for 30fps!

ppppenguin
12-13-2012, 03:03 AM
I hadn't realised that PsF was proliferating. I know about 24PsF, which is used to dress up progressive film material to look like interlace. Now they're doing it for 25, 29.97 and 30 Hz.

I thought that we were trying to abolish interlace but it's a hardy weed.

NewVista
12-13-2012, 10:48 PM
I think this is welcome news given the ubiquity of 1080/60i, with even ABC switching to this after a decade of 720p - and maybe this is why: they now have more confidence in artifact-free images now that PsF can be originated from new-generation cameras? (It's all done in the camera)

Penthode
12-14-2012, 10:54 AM
I think this is welcome news given the ubiquity of 1080/60i, with even ABC switching to this after a decade of 720p - and maybe this is why: they now have more confidence in artifact-free images now that PsF can be originated from new-generation cameras? (It's all done in the camera)

Artifact free? The low temporal resolution at 30 fps will always show artifacts. The artifacts are intentionally hidden with temporal smearing. As spatial resolution rises, so must the temporal resolution.

It is sad that 720p has had such a bad rap since it is capable of conveying higher resolution video than 1080i. Most of the problem stems from poor image rescaling in not only consumer equipment but also broadcast equipment. The point is that if the broadcast chain is properly set up so that production video is oversampled (1080p) and properly rescaled to 720p for delivery and then properly rescaled to 1080p for display, the image will knock the socks off 1080i. Interlace is archaic and it cannot be further condoned. And the temporal refresh rate has been ignored too long: it must be higher than 30 frames per second.

And where did you hear that ABC was departing from 720p? It doesn't make sense for ABC to go from the higher overall resolution of 720p down to 1080i, unless it is to overcome the poor engineering practices of many current broadcasters.

lnx64
12-14-2012, 11:49 AM
ABC in a few areas are in fact in 1080i now. For example, in Boston it's 1080i now, and has been for the past couple years.

In my area, it's reported as 720p, but I'll need to double check to see what I'm actually receiving.

Penthode
12-14-2012, 12:56 PM
ABC in a few areas are in fact in 1080i now. For example, in Boston it's 1080i now, and has been for the past couple years.

In my area, it's reported as 720p, but I'll need to double check to see what I'm actually receiving.

Ahh... yes, I have seen a number of affiliated stations change. But I expect network origination to maintain 720p until superseded by 1080p. But the issue is moot, as linear scheduled over-the-air program distribution is being superseded by other methods of delivery.

The important thing to remember is that today's commercialization of video media is through multi-platform delivery. This means the content must at some point be rescaled to another format. Interlace is totally irrelevant and is detrimental to the new world of multi-platform delivery.

NewVista
12-14-2012, 03:31 PM
Not sure what the latest ABC policy is re 720p, but my set indicates the local affiliate has switched to 1080.
They must think 1080 looks better for local news & commercials, or it could be expedient for equipment standardization.

But since 1080i/60 PsF = 1080/30p (not bad) (no intra-frame motion smear)
I think 720/60p is in trouble

Penthode
12-14-2012, 11:59 PM
But since 1080i/60 PsF = 1080/30p (not bad) (no intra-frame motion smear)

Certainly, since temporal sampling is not spread over two fields, rescaling should be easier. That is, unless adaptive interpolation in converters recognizes as much.

I think 720/60p is in trouble

Maybe. Newer 1080p imager cameras rescale to 720p well, which brings about a substantial improvement in the resolution of the picture. Some of the earlier 720p cameras were really quite bad and made the picture soft.

But it is now late in the game. The broadcasters are now, after many years, entrenched in their formats. Only ABC affiliates are rescaling at emission. I think broadcasters will mostly remain as they are until the next big change to 1080p and beyond.

NewVista
12-15-2012, 05:57 AM
.. Newer 1080p imager cameras rescale to 720p well, which brings about a substantial improvement in the resolution of the picture..

You raise a good point about how 720p benefits, Kell-factor-wise, from 2 MP cameras & displays at home.
And deriving 60p from the same imaging chips (which would also send 30 PsF for 1080 - we hope :scratch2:)

NewVista
04-19-2014, 11:10 PM
Just noticed this feature of latest studio camera:

"Newly developed 2/3-inch progressive CCD
Newly developed LSI (16bits A/D)
3G transmission
1080/60p/50p, 1080i/720p and all Psf signal format"
http://pro.sony.com/bbsc/ssr/cat-broadcastcameras/cat-hdstudio/product-HDC2000W/

Cool, Psf, definitely the way to go Sony!