View Full Version : What happened to I/Q?
nasadowsk 03-31-2005, 06:15 PM So, how come I/Q demod in TVs died off? I know it was complex, and in the early years I could see why they'd kill it. But by the '60s or '70s, you'd think they would have tried it again. But AFAIK, all TVs are still narrowband X-Z. Is it just an issue of 'datasheet engineering' (i.e., copying someone's datasheet schematic or a prior design), or was there a real reason for not doing it? Does it even matter anymore?
reeferman 03-31-2005, 10:43 PM Only one thing I can think of as primary reason: $$$$$$$$$$$. I doubt the mfgrs. were concerned about service/reliability.
lynnm 03-31-2005, 11:00 PM What was I/Q...enlighten me.
Chad Hauris 04-01-2005, 06:13 AM I am not an expert on this topic but I think the I/Q system used a wider bandwidth color demodulator to give better color rendition in small details. The small fine details of the picture are the highest frequencies of the video signal...regular sets use a fairly narrow bandwidth for the color signal and hence do not have true color in the very small areas of the picture. There may be more to the system but I think this is the most important detail (feel free to correct me if wrong!)
The I/Q system was used in the CTC-2 RCA sets (CT-100, 21-CT-55).
It seemed like there were also some '70s era sets that used it.
old_tv_nut 04-01-2005, 06:22 AM I will post a long note on I/Q demodulation including my experiences implementing it experimentally for a research project at Zenith in the 80's if anyone is interested. (Will take me a while to write it). Basically, the NTSC system uses I/Q axes of chroma modulation so that the I channel (orange/cyan color differences) can have wider bandwidth to preserve those colors in small details. I can give you a longer explanation about why it was chosen and why it wasn't very practical.
Yes, RCA did produce one chassis later on that used I/Q, but the extra bandwidth was used at only a low amplitude (I think because of practical difficulties in the IF design) and did not really enhance the color detail much.
Pete Deksnis 04-01-2005, 07:37 AM Quadrature modulation of course is a technique that allowed two chroma signals to be stuffed into the existing black and white television channel.
The word was that Philco got really ticked about RCA's declarations of parenthood for the fledgling NTSC color system. It was Philco who first implemented quadrature modulation for the developing compatible color specifications. They did it in 1951.
Not everyone used I/Q demodulation in 1954. The CT-100 and the CBS-Columbia Model 205 did. The Westinghouse H840CK15 did not. By late 1954, when RCA brought out the 21-CT-55 (CTC2B chassis), their designers had dropped I/Q demodulation for an unusual Q/R-Y axis demodulation.
Years later, Donald Fink, an early video engineer, was reported to have said that the reasons the second NTSC committee accepted I/Q demodulation did not withstand the test of time.
There is no doubt that $$$$$, as Reeferman suggested, was the incentive for dropping I/Q demodulation in the CTC4. But an intriguing question for me is whether there was also a legal force behind the dumping of I/Q so quickly in the CTC2B.
http://home.att.net/~pldexnis/potpourri/rca-separateY_12-5-1950.html
Pete Deksnis 04-01-2005, 07:50 AM I guess it has not been said yet, but I stands for In-phase and Q stands for Quadrature, the name given to the two chroma signals that were stuffed into the existing black and white television channel.
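To make the "two signals on one subcarrier" idea concrete, here is a minimal Python sketch (not from the thread; the sample rate and test amplitudes are arbitrary assumptions for illustration). Two baseband values ride on carriers 90 degrees apart, and a synchronous detector with locked carriers recovers each one independently:

```python
import math

FSC = 3.579545e6  # NTSC color subcarrier frequency, Hz
FS = 8 * FSC      # sample rate for this sketch (hypothetical choice)

def quadrature_modulate(i_val, q_val, n_samples):
    """Put two baseband values on one subcarrier, 90 degrees apart."""
    return [i_val * math.cos(2 * math.pi * FSC * n / FS)
            + q_val * math.sin(2 * math.pi * FSC * n / FS)
            for n in range(n_samples)]

def synchronous_demodulate(chroma):
    """Multiply by locked carriers and average (a crude low-pass filter).

    Because cos*sin averages to zero over whole cycles, each product
    detector sees only its own axis -- that is the whole trick.
    """
    n = len(chroma)
    i_sum = sum(s * math.cos(2 * math.pi * FSC * k / FS)
                for k, s in enumerate(chroma))
    q_sum = sum(s * math.sin(2 * math.pi * FSC * k / FS)
                for k, s in enumerate(chroma))
    # Factor of 2 restores full amplitude (cos^2 averages to 1/2).
    return 2 * i_sum / n, 2 * q_sum / n
```

In a real receiver the "locked carriers" come from a crystal oscillator synchronized to the color burst; the sketch just assumes perfect lock.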
lynnm 04-01-2005, 10:26 AM Thanks guys!
John Folsom 04-01-2005, 12:39 PM To this day, the NTSC signal is still broadcast using the wideband I and narrowband Q subcarrier quadrature modulation.
It was realized early that narrowband demodulation could save a tube or two ($$$), and that the loss of color detail was not very noticeable. Westinghouse was demonstrating narrowband R-Y/B-Y (red and blue color difference signals) demodulation in prototype sets in 1952. This became the method most manufacturers chose to use, and still use to this day.
It is also true that wideband demodulation has some additional undesirable visual artifacts which the narrowband schemes do not.
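The I/Q and R-Y/B-Y schemes demodulate the same subcarrier, just along different axes: I/Q are the scaled color-difference axes rotated by 33 degrees. A small Python sketch of that relationship (the 0.877/0.493 weights and 33 degree angle are the standard NTSC values; the function names are mine):

```python
import math

ANGLE = math.radians(33.0)  # NTSC I/Q axis rotation

def ry_by_to_iq(r_minus_y, b_minus_y):
    """Form I/Q from the color-difference signals (NTSC weighting)."""
    v = 0.877 * r_minus_y   # scaled R-Y
    u = 0.493 * b_minus_y   # scaled B-Y
    i = v * math.cos(ANGLE) - u * math.sin(ANGLE)
    q = v * math.sin(ANGLE) + u * math.cos(ANGLE)
    return i, q

def iq_to_ry_by(i, q):
    """Inverse rotation: what an R-Y/B-Y demodulator recovers."""
    v = i * math.cos(ANGLE) + q * math.sin(ANGLE)
    u = -i * math.sin(ANGLE) + q * math.cos(ANGLE)
    return v / 0.877, u / 0.493
```

So a narrowband R-Y/B-Y set isn't ignoring I and Q; it simply demodulates along rotated axes and band-limits both, giving up the extra orange/cyan detail the wideband I channel carries.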
frenchy 04-06-2005, 11:49 PM From my old 1957 "Color Television" annual magazine (the one with LUCY on the cover!), it said only the color in small details is reduced. Maybe it was color detail that, while it might have been noticeable up close to the screen, would not really have been distinguishable at normal viewing distances, especially on little 21 inch screens. I.e., the same logic behind NTSC, where large colored objects get full color, smaller objects get only a few colors, and the smallest objects are just black and white, due to what the eye is capable of seeing in the first place. And it was decided it was not worth the extra tube and capacitors, etc.?
Frenchy