Videokarma.org TV - Video - Vintage Television & Radio Forums (http://www.videokarma.org/index.php)
-   Television Broadcast Theory (http://www.videokarma.org/forumdisplay.php?f=182)
-   -   How good was broadcast NTSC/PAL in practice? (http://www.videokarma.org/showthread.php?t=273229)

pidade 09-11-2020 09:46 AM

How good was broadcast NTSC/PAL in practice?
 
It's tough to say, since all the old analog recordings we have are degraded and, if not, weren't up to broadcast-spec anyway, so there's really no good reference for how live TV actually looked.

There's a lot of discussion already about hue errors, but in terms of resolution, was video bandwidth generally 4.2MHz (NTSC) at the receiver? What was the typical SNR of a broadcast signal?

Electronic M 09-11-2020 10:39 AM

Get a mid-range to high-end '90s CRT set (available free if you search Craigslist long enough) and an LD player; good local reception and/or cable TV looked like that.

Luminance bandwidth varied with TV manufacturer. From 1954 to the mid-'70s, comb filters were not practical for consumer TVs, so luminance was rolled off in the receiver around 3.5 MHz with a notch filter. In the late '70s, high-end sets started getting comb filters, which allowed luminance to reach into the 4 MHz range without dot crawl, and their adoption gradually spread.
Some monochrome sets didn't limit luminance bandwidth.
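The comb-filter idea above can be sketched numerically. This is a toy Python model (not from the thread, and the exact sample rates are illustrative assumptions): the NTSC colour subcarrier inverts phase between successive scan lines, so a 1H comb filter averages two adjacent lines to cancel chroma (recovering luma) and differences them to recover chroma, assuming the picture content is the same on both lines.

```python
import numpy as np

# Toy model: two adjacent NTSC scan lines carrying identical luma plus a
# 3.58 MHz chroma subcarrier whose phase inverts from line to line.
fs = 14.318e6                   # sample rate: 4x the NTSC colour subcarrier
fsc = fs / 4                    # 3.579545 MHz subcarrier
n = np.arange(910)              # one scan line's worth of samples at 4*fsc
luma = 0.5 + 0.3 * np.sin(2 * np.pi * 1e6 * n / fs)  # 1 MHz luma detail
chroma = 0.2 * np.sin(2 * np.pi * fsc * n / fs)

line_a = luma + chroma          # composite, line k
line_b = luma - chroma          # composite, line k+1 (subcarrier inverted)

# 1H comb filter: the sum cancels chroma, the difference cancels luma.
y = (line_a + line_b) / 2       # recovered luma, full bandwidth
c = (line_a - line_b) / 2       # recovered chroma

assert np.allclose(y, luma)
assert np.allclose(c, chroma)
```

A notch filter instead simply removes everything near 3.58 MHz, taking fine luma detail with it, which is why comb-filtered sets could resolve more without dot crawl.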

Modern LCD sets tend to handle NTSC noise worse than CRT sets did. I have played VHS and LD side by side on a decent CRT set and a modern HDTV and observed that the picture looked fine on the CRT but objectionably noisy and blurry on the LCD.

old_tv_nut 09-11-2020 10:58 AM

Regarding noise, there were two main sources:

1) RF noise - just depended on how strong the signal was
2) Camera noise - just acceptable with image orthicon cameras at first, improved a bit with installation of solid state preamps; Plumbicon cameras had greatly reduced noise levels compared to image orthicons. There was also a difference in the character of the noise. Image orthicon noise had a flat spectrum, consisting mainly of shot noise. Plumbicon noise had a triangular spectrum rising toward higher frequencies. This made it less visible for a given total r.m.s. value.
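The point about a rising (triangular) noise spectrum being less visible at equal r.m.s. can be illustrated with a rough sketch. This is my own toy model, not anything from the thread: a simple low-pass cut stands in for the eye's reduced sensitivity to fine noise (an assumption, not the actual CCIR noise-weighting curve).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1 << 16
white = rng.standard_normal(n)          # flat-spectrum (shot-like) noise

# Shape a triangular (rising-with-frequency) spectrum, then normalize so
# both noises have the same total r.m.s. value.
spec = np.fft.rfft(rng.standard_normal(n))
ramp = np.linspace(0, 1, spec.size)
tri = np.fft.irfft(spec * ramp, n)
tri *= np.std(white) / np.std(tri)

# Crude stand-in for visual weighting: keep only the bottom quarter of
# the spectrum (an arbitrary cutoff for illustration).
def lowpass_rms(x, frac=0.25):
    s = np.fft.rfft(x)
    s[int(frac * s.size):] = 0
    return np.std(np.fft.irfft(s, len(x)))

# With equal total rms, far less of the triangular noise's power sits in
# the low frequencies the eye weights most heavily.
print(lowpass_rms(white) > lowpass_rms(tri))  # True
```

The qualitative result matches the post: Plumbicon-style triangular noise concentrates its power where it is least objectionable.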

Colly0410 09-11-2020 12:09 PM

I was used to 625-line PAL here in England, and after having it drilled into me in technical books how much better 625/PAL was than 525/NTSC, I was expecting imperfect pictures when I got to our rented apartment in Miami in 1989. Got there, turned on the TV, and a perfectly good colour picture appeared that looked just as good as what I was used to in England. My wife, who has epilepsy, preferred the NTSC pictures on CRT TVs as they flickered less than PAL. When I went to Orlando last year the pictures seemed identical to here...

pidade 09-11-2020 12:25 PM

Quote:

Originally Posted by Electronic M (Post 3227444)
Get a mid-range to high-end '90s CRT set (available free if you search Craigslist long enough) and an LD player; good local reception and/or cable TV looked like that.

Luminance bandwidth varied with TV manufacturer. From 1954 to the mid-'70s, comb filters were not practical for consumer TVs, so luminance was rolled off in the receiver around 3.5 MHz with a notch filter. In the late '70s, high-end sets started getting comb filters, which allowed luminance to reach into the 4 MHz range without dot crawl, and their adoption gradually spread.
Some monochrome sets didn't limit luminance bandwidth.

Modern LCD sets tend to handle NTSC noise worse than CRT sets did. I have played VHS and LD side by side on a decent CRT set and a modern HDTV and observed that the picture looked fine on the CRT but objectionably noisy and blurry on the LCD.

I was thinking LD might be a decent reference, though it had 425 lines of horizontal resolution vs. 330 with broadcast NTSC. Not sure how SNR compares; I typically see 48-52 dB quoted for LD players.
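For what it's worth, those line counts are roughly consistent with the bandwidth figures under the usual conversion. A quick sketch using the standard approximation (the active line time and formula are textbook values, not from this thread):

```python
# Horizontal resolution in TV lines per picture height (TVL) relates to
# luma bandwidth B roughly as:
#     TVL ~= 2 * B * t_active / aspect
# t_active ~ 52.7 us of active NTSC line time; aspect = 4/3 because TVL
# is counted per picture *height*. All figures are approximate.
def tvl_from_bandwidth(b_hz, t_active=52.7e-6, aspect=4 / 3):
    return 2 * b_hz * t_active / aspect

print(round(tvl_from_bandwidth(4.2e6)))  # 332 -> the ~330 TVL broadcast figure
print(round(tvl_from_bandwidth(5.4e6)))  # 427 -> close to LD's quoted ~425 TVL
```

So LD's ~425 TVL implies a luma bandwidth in the low-5 MHz range, comfortably above broadcast NTSC's 4.2 MHz.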

Chip Chester 09-11-2020 07:01 PM

I have a Sony PVM-20M4U and a couple of BVW-65 decks in my office. Without a TV station in the middle, master tapes look quite good. The same tapes on a higher-res LCD monitor allow the tapes' deficiencies to be seen more clearly. Still not awful, but way better than anything VHS, as it should be. DVDs look good on each monitor. During the analog transition, it was pretty common for SD source material to suffer a bit. That got better quickly once there was an A/B comparison available to engineering, and the public.

pgnl 04-23-2021 12:01 PM

I think in reality both systems were capable of a very good picture, with the line structure being visible only on larger screens. Flesh tones were perhaps more accurate with PAL, but the flicker was more obvious.

On broadcast, one thing that hasn't been mentioned is ghosting, which occurred when the broadcast signal bounced off buildings etc. and showed as duplicate images, normally to the right of the actual image. Obviously this wouldn't happen with cable or DVD/LD, but it was something we just had to put up with. I think the Japanese invented a way of eliminating ghosting on broadcast TV, but it never reached the UK.

I remember staying in a motel in Nevada in 1983 and thinking how good the NTSC picture was, but it was the exception; it was quite common to see green-tinted faces on the US TV sets I watched.

I remember comparing Region 1 and Region 2 DVDs back in the day. Here in the UK the player was connected using an RGB SCART plug, and the pictures were very good on both. Of course PAL DVDs were sped up from 24 to 25 fps, which is another issue. Flicker was the main difference on the Region 2 discs, but in Europe we were used to it.

No flicker today of course as flatscreens don't suffer with it...

old_tv_nut 04-23-2021 01:18 PM

Quote:

Originally Posted by pgnl (Post 3233184)
...I think the Japanese invented a way of eliminating ghosting on broadcast TV, but it never reached the UK.
...

Philips was the major innovator in analog ghost canceling:
https://www.edn.com/edn-12-21-95-phi...ion-technolog/

Unfortunately, it was incapable of 100% cancellation in many situations, and it was decided that it wasn't worth it unless it was very nearly perfect. Customers would not take well to buying a "ghost cancelling" set and still seeing ghosts. "Ghost reducing" was not salable.

In analog, if a ghost causes a very deep notch in the frequency response, you can't restore that totally missing information. You would only introduce random noise.

In digital, because of the redundant forward error correction (FEC) bits that are scattered about the spectrum and bit stream, some amount of data can be totally lost and the signal can still be reconstructed. Hence the digital "cliff effect" of either a perfect picture or no picture. In analog, the improvement was by degrees, rather than working/not working.
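The "deep notch" point above can be made concrete with a small sketch (my own illustration, with arbitrary sample rate and delay): modeling a ghost as a delayed, attenuated echo of the signal and computing the resulting channel frequency response shows periodic dips that approach total nulls as the echo strength nears the direct signal.

```python
import numpy as np

# A ghost is a delayed, attenuated echo: y[n] = x[n] + a * x[n - d].
# The channel's magnitude response |1 + a*exp(-j*2*pi*f*d/fs)| dips every
# fs/d Hz; as a -> 1 the dips become notches where information is simply
# gone, so an analog canceller cannot restore it without huge noise gain.
fs = 10.7e6                  # hypothetical sample rate, for illustration
d = 16                       # echo delay in samples
a = 0.9                      # echo amplitude relative to the direct signal
f = np.fft.rfftfreq(4096, 1 / fs)
H = np.abs(1 + a * np.exp(-2j * np.pi * f * d / fs))

print(round(H.min(), 3))     # 0.1  -> a near-total notch (1 - a)
print(round(H.max(), 3))     # 1.9  -> adjacent frequencies boosted (1 + a)
```

Inverting that 0.1 dip would require 20 dB of gain at those frequencies, amplifying noise by the same factor, which is exactly the analog limitation described above.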

mr_rye89 04-23-2021 03:26 PM

I remember in 2001 my dad/step mom renting a friend's apartment that had a nice big Trinitron and broadcast TV looked great on it. It was in town so no noise, and it probably had the fancy comb filtering too. I normally lived 30 mi out of town so there was noise/snow even with an outdoor antenna. Noise went away in '09 with the digital switch, but those pesky mpeg2 artifacts :nono:

etype2 04-23-2021 03:35 PM

EARLY NTSC IN AMERICA.

Man on the street or in my case, teenager observations.

I was around to see '50s NTSC color, starting in 1956. During the '50s, sets often displayed green-tinted faces. There was no color consistency from program to program; continual color adjustment was required. I saw comet trails and flares, tuner drift, and low resolution. Films looked better than live studio or network broadcasts. I got the feeling color was a work in progress. By 1960 things had improved, and by the mid-'60s, much improved.

pgnl 04-23-2021 04:36 PM

I remember in my teens and early twenties here in the UK, there was a lot of mainstream US TV on the four channels. There is less today; most is on pay TV. US comedies like the Mary Tyler Moore Show, Rhoda, The Banana Splits and The Partridge Family, detective shows like Cannon and The Rockford Files, and later Dallas and Dynasty, plus loads more; it was very popular.

Generally speaking, the picture quality of American shows was very good, presumably because they were shot on film and broadcast as such here. Then in the mid-eighties, if I remember rightly after about three seasons of Dallas, the picture quality suddenly deteriorated. I assumed at the time it was because the show was being shown as video rather than film. From about then on, all US TV was of inferior quality to domestic stuff. You probably didn't notice it in the US, but to me it was glaringly obvious here.

Shows like Friends were originally shown here in US video quality, but were clearly shot on film, as they are now being shown in HD and in widescreen; presumably they were converted to NTSC video for export originally.

Electronic M 04-23-2021 11:55 PM

Quote:

Originally Posted by pgnl (Post 3233184)

I remember staying in a Motel in Nevada in 1983 and thinking how good the NTSC picture was, but it was the exception, it was quite common to see green tinted faces on US TV sets i watched.

If that was in the solid state era you either watched a lot of the Incredible Hulk or you encountered a lot of broken or badly misadjusted TVs.
It's not uncommon for sets here to look a little different from each other side by side, but most look fine on their own, unless broken or misadjusted.

ppppenguin 04-24-2021 12:54 AM

Quote:

Originally Posted by mr_rye89 (Post 3233197)
.......Noise went away in '09 with the digital switch, but those pesky mpeg2 artifacts :nono:

Remember how striped shirts caused lurid colour patterns in PAL? In NTSC you got comb filtering before we did, simply because it was a lot easier.

pgnl 04-24-2021 06:50 AM

Quote:

Originally Posted by Electronic M (Post 3233213)
If that was in the solid state era you either watched a lot of the Incredible Hulk or you encountered a lot of broken or badly misadjusted TVs.
It's not uncommon for sets here to look a little different from each other side by side, but most look fine on their own, unless broken or misadjusted.

Well, I think it's fair to say more care would be taken to adjust a TV at home; of course my experience was based on hotel TVs. Plus, folk with an interest in technology will always take more care to adjust it properly. My sister can't be bothered with HD - standard definition (via satellite) seems fine to her.

Flesh tones looked as good on PAL as with modern MPEG encoding, so I didn't expect to see people with green-tinted faces. It was just an observation.

nasadowsk 04-24-2021 10:44 AM

1 Attachment(s)
Good god, even by the '70s you could get pretty good color on an NTSC set. They didn't really drift much once warmed up. '60s sets tended to be a bit worse, but still.

Motel sets were a great target for kids to adjust while bored...

This is a CTC-15 (Sylvania, really, but a clone). The convergence kinda sucks, but the color's not all green and distorted...

Growing up in the 80's, I don't remember anyone with a TV that was off in "Incredible Hulk " mode all the time.



Powered by vBulletin® Version 3.8.4
Copyright ©2000 - 2024, Jelsoft Enterprises Ltd.
©Copyright 2012 VideoKarma.org, All rights reserved.