View Full Version : Standardisation of video levels and impedances


ppppenguin
06-10-2013, 03:32 AM
I have placed a similar thread on the UK VRAT forum: http://vintagetvandradio.myfreeforum.org/viewtopic.php?f=5&t=5449&p=56930#p56930 but I'd like to get the USA point of view too.

As I understand it video in the USA has been carried as 140 units peak to peak on a 75R co-axial circuit. This greatly simplifies connecting different equipment. The size of the unit has changed over time, from 10mV (giving 1.4V p-p) down to about 7.2mV currently. Not sure how many units have been allocated to pedestal/setup and whether this has always been present on 525 signals. These values are a little different to those used in Europe, but not sufficiently different to cause much difficulty.
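
Just to sanity-check my own figures (this is back-of-envelope arithmetic on my part, not anything quoted from a standard), the two unit sizes work out roughly like this:

```python
# Back-of-envelope check of the figures above (my own arithmetic, not from a standard):
# 140 units peak to peak, with the size of one unit shrinking over the years.
UNITS_PP = 140

for unit_mv in (10.0, 1000.0 / 140):  # the old 10 mV unit vs. the ~7.14 mV unit of a 1 V p-p signal
    total_v = UNITS_PP * unit_mv / 1000.0
    print(f"{unit_mv:5.2f} mV/unit -> {total_v:.2f} V p-p into 75R")

# 10.00 mV/unit -> 1.40 V p-p into 75R
#  7.14 mV/unit -> 1.00 V p-p into 75R
```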

Going back to 1936 and the BBC TV station at Alexandra Palace the Black Book shows that a variety of much higher levels were used. Much of the kit was directly connected without going to the trouble of matching to co-ax.

Leaving aside the fact that 1V to 1.4V p-p into 75R is not a terribly convenient level for valve kit, when and how did this standard emerge? I have a copy of Fink's 1952 Television Engineering but I haven't yet found a reference to baseband video levels, despite extensive discussion of the transmitted video waveform.

NewVista
06-13-2013, 09:55 AM
Thanks for link, definitely worth a Lurk (I mean Lark)
I just came across bound annual collections of PRACTICAL TELEVISION from the late 1950s - intriguing.

They cost 1/3 (one shilling and threepence) a copy in '58

old_tv_nut
06-20-2013, 11:18 PM
The video voltage units are called IRE units because they were standardized by the IRE (now the IEEE). The original setup spec for black and white was looser and tended toward zero, but the NTSC color spec set it at 7.5 IRE. However, it was not stated in IRE units in 1953, but rather as a percentage of RF modulation, so the IRE scale may have come later, though some time before the IRE and AIEE merged to become the IEEE (1963).
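
If I'm remembering the envelope figures correctly (sync tip 100%, blanking 75%, reference white 12.5% of peak carrier - my recollection, not a quotation from the 1953 document), the IRE scale and the percent-of-modulation scale are just a linear re-labeling of each other, which is roughly why the spec could be written either way:

```python
# Assumed System M negative-modulation envelope figures (my recollection, not quoted
# from the 1953 spec): sync tip = 100 %, blanking = 75 %, reference white = 12.5 %
# of peak carrier.  IRE scale: sync tip = -40, blanking = 0, reference white = +100.
def ire_to_carrier_percent(ire):
    # Linear map fixed by the two points (-40 IRE, 100 %) and (0 IRE, 75 %)
    return 75.0 - 0.625 * ire

for level, ire in [("sync tip", -40), ("blanking", 0), ("7.5 IRE setup", 7.5), ("ref white", 100)]:
    print(f"{level:14s} {ire:6.1f} IRE -> {ire_to_carrier_percent(ire):.2f} % of peak carrier")

# sync tip        -40.0 IRE -> 100.00 % of peak carrier
# blanking          0.0 IRE ->  75.00 % of peak carrier
# 7.5 IRE setup     7.5 IRE ->  70.31 % of peak carrier
# ref white       100.0 IRE ->  12.50 % of peak carrier
```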

I think setup was ill-advised because it resulted in objectionable variations in black level when it was not carefully maintained. Specifying zero setup as Europe did would have served the industry much better, IMO.

[Edit - oh yes, thanks for the link!]

ppppenguin
06-21-2013, 01:40 AM
To modern eyes setup looks ill-conceived. It serves no useful purpose in the studio and just wastes transmitter power. AFAIK its only purpose was to minimise flyback-time artifacts on the screens of receivers.

I haven't trawled the documents but I think that 405 line (System A) varied in its use of setup, finally settling on not using it.

NewVista
06-21-2013, 02:30 AM
405 line (System A) varied in its use of setup, finally settling on not using it.

Like the protracted abandonment of £ s d

ppppenguin
06-22-2013, 01:44 AM
and the even more protracted US abandonment of degrees Fahrenheit, feet, inches and a rather undersized gallon.

I was looking at the 1954 IRE papers on NTSC, in particular the fateful decision to tweak the line and frame rates rather than move the sound carrier. Nobody knew it at the time, but the hassle that would cause for timecode was huge and continues to this day. It's a problem we don't have this side of the pond :-)
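
For anyone who hasn't met it, the timecode nuisance boils down to 29.97 being 0.1% slower than 30, so whole-frame labels drift against the wall clock. Rough arithmetic (my own, using the usual drop-frame figures):

```python
# Rough arithmetic behind drop-frame timecode (my own figures, nothing official quoted):
nominal_fps = 30.0
actual_fps = 30.0 / 1.001           # ~29.97 frames per second after the NTSC colour tweak

frames_counted_per_hour = nominal_fps * 3600   # 108000 frame numbers per labelled hour
frames_shown_per_hour = actual_fps * 3600      # ~107892 frames actually transmitted
excess = frames_counted_per_hour - frames_shown_per_hour
print(f"Labels outrun real time by about {excess:.0f} frame numbers per hour")

# Drop-frame timecode skips 2 frame numbers per minute, except every 10th minute:
skipped = 2 * (60 - 6)
print(f"Drop-frame skips {skipped} numbers per hour, which roughly closes the gap")
```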

NewVista
06-25-2013, 06:49 AM
rather than move the sound carrier.

By how much? Would existing TVs still work without modification?

old_tv_nut
06-25-2013, 10:30 AM
The required change is one tenth of one percent. This works out to 4500 Hz, which could affect different sets in different ways:
1) how sharp are the audio traps?
2) does the set have split sound (in which case the user could tune to center the audio IF frequency), or intercarrier sound (factory tuned to 4.5000... MHz)?

4500 Hz is a significant portion of the full sound carrier deviation (+/- 25 kHz), so intercarrier sets could experience increased distortion. The tight tolerance on the broadcast signal was meant to allow all the variations in tuning to be in the receiver for reasons of economy.
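
To put rough numbers on the two alternatives (my own back-calculation from the 4.5 MHz intercarrier and the 455/2 subcarrier relationship, not figures quoted from the 1954 IRE papers):

```python
# The two ways the subcarrier / sound-carrier relationship could have been satisfied
# (my own back-calculation, not quoted from the IRE papers):
sound_intercarrier = 4.5e6                 # Hz, fixed in existing intercarrier receivers

# What was actually done: derive the scan rates from the 4.5 MHz intercarrier
line_rate = sound_intercarrier / 286       # ~15734.27 Hz (was 15750 Hz)
field_rate = 2 * line_rate / 525           # ~59.94 Hz (was 60 Hz)
subcarrier = 455 / 2 * line_rate           # ~3.579545 MHz
print(f"line {line_rate:.2f} Hz, field {field_rate:.3f} Hz, subcarrier {subcarrier/1e6:.6f} MHz")

# The alternative: keep 15750 Hz / 60 Hz and move the sound carrier 0.1 % instead
alt_sound = 15750 * 286                    # 4.5045 MHz
print(f"sound carrier would have moved by {alt_sound - sound_intercarrier:.0f} Hz")
```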

NewVista
06-25-2013, 01:25 PM
They didn't want to tamper with compatibility.
As Yves Faroudja (1987 SMPTE David Sarnoff Gold Medal Award) once said:
"I learned the lesson (after failed product) not to do anything that isn't backward compatible".

ppppenguin
06-26-2013, 02:32 AM
I can understand them being shy of shifting the sound carrier after the whole CBS sequential colour business. There was an easy fix available and nobody could reasonably have foreseen the trouble it would cause later.

Subsequently, at least in the UK there have been changes that have required changes for all viewers. I'm not talking about digital switchover which we've all had, but the start of UK national Channel 5. This was carried on UHF channels that were generally used for connecting VCRs etc to TVs. There was a lot of scope for interference, so Channel 5 had to fund a national programme of retuning, which potentially involved visiting every household in the Channel 5 service area, in other words much of the country.

wa2ise
06-26-2013, 03:43 PM
It was easier to shift the vertical and horizontal scan frequencies, as all consumer TV sets had user adjustments for locking onto these. In contrast, the FM intercarrier was set at 4.5MHz with no consumer adjustment possible. Aunt Tilly isn't going to get out a diddle stick and retune the sound IF transformers in her TV set... Most TV sets didn't even make the consumer adjust the vertical and horizontal holds anyway.

cbenham
07-28-2013, 05:17 AM
Leaving aside the fact that 1V to 1.4V p-p into 75R is not a terribly convenient level for valve kit, when and how did this standard emerge? I have a copy of Fink's 1952 Television Engineering but I haven't yet found a reference to baseband video levels, despite extensive discussion of the transmitted video waveform.

The US video standard is detailed in Harold Ennes' 'Principles and Practices of Telecasting Operations, 1953, First Edition'. The topic includes the only explanation for the use of 'setup' I've ever found by a credible author.
Pages 131-133 attached.

ChrisW6ATV
07-29-2013, 12:00 AM
Thank you for posting that excerpt.

ChrisW6ATV
07-29-2013, 12:06 AM
Subsequently, at least in the UK there have been changes that have required changes for all viewers. I'm not talking about digital switchover which we've all had, but the start of UK national Channel 5. This was carried on UHF channels that were generally used for connecting VCRs etc to TVs. There was a lot of scope for interference so Channel 5 had to fund a national programme of retuning which potentially involved visiting every household in the Channel 5 service area, in other words much of the country.
Considering the massive amount of whining and delays with our digital-TV switchover in the USA, despite all of the widely-spread notices of the changes and the nearly-free tuners available to millions of viewers, I can only imagine the outcry we would have here if some change required individual visits to homes by anyone.

ppppenguin
07-30-2013, 02:56 AM
The US video standard is detailed in Harold Ennes' 'Principles and Practices of Telecasting Operations, 1953, First Edition'. The topic includes the only explanation for the use of 'setup' I've ever found by a credible author.
Pages 131-133 attached.

That's a great reference. It's a really good explanation. Thanks.

soundman2
08-01-2013, 07:45 AM
I just want to comment about the Channel 5 retuning service. Here in the Bilsdale transmitter area they set all the gear to accept Channel 5 on broadcast channel UHF 37, but it actually appeared on UHF 35. Whoops! UHF 37 was the output from Emley Moor in West Yorkshire. D'OH!