#1
HERESY ... My CT-100 Color TV TUNER
I am currently watching glorious I-Q color TV being demodulated by my RCA CT-100 color tuner and displayed on a 13-inch, pretty-good-quality Sony monitor. It's a bit too bright, of course, and has a bit of ringing, but it's very, very watchable. It's FAR better than later "tubized NTSC demodulators". Really. The dead 15GP22 is of course disconnected.

What I did was build an impedance converter/reducer with three channels, each using one FET and one BJT. This will take a bit of tweaking because there is just too much red signal compared to green and blue. Also, the amplitude of the signals is being reduced using resistor pairs with a 1 megohm resistor and a 30 to 47K resistor, so I suspect they will need capacitors in parallel to get the response correct. Finally, the Sony has its own DC restoration, so that feature of the RCA does nothing ... when there is a fade to black, it really goes black.

Last edited by dtvmcdonald; 05-25-2019 at 08:35 AM.
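The 1 meg / 30-47K pairs behave like uncompensated scope probes: stray capacitance across the shunt resistor rolls off the top of the video band unless a trimmer across the series resistor balances it (R1·C1 = R2·C2). Here is a minimal numerical sketch of that compensation condition; all component values are illustrative assumptions, not measurements from the actual converter:

```python
import numpy as np

def divider_response(f, r1=1.0e6, r2=39e3, c1=1e-12, c2=25e-12):
    """Complex gain of a divider: R1 || C1 in series, R2 || C2 to ground.
    f is frequency in Hz; component values are hypothetical examples."""
    s = 2j * np.pi * f
    z1 = r1 / (1 + s * r1 * c1)   # series arm: 1 meg with trimmer cap
    z2 = r2 / (1 + s * r2 * c2)   # shunt arm: load resistor with stray cap
    return z2 / (z1 + z2)

f = np.logspace(3, 7, 5)          # 1 kHz .. 10 MHz, spanning the video band

# Uncompensated: an assumed stray 30 pF across the shunt arm droops the highs
droop = divider_response(f, c1=0.0, c2=30e-12)

# Compensated: choose the trimmer so R1*C1 == R2*C2, and the gain goes flat
c2 = 30e-12
flat = divider_response(f, c1=c2 * 39e3 / 1.0e6, c2=c2)
print(np.round(np.abs(flat) / np.abs(flat[0]), 4))   # ~1.0 at every frequency
```

This is the same math as a 10:1 scope probe; the trimmers in the box are doing exactly the probe-compensation adjustment.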
#2
It took a bit of work to get things right. I had to put trimmer caps in my box to get the video response right. I had to adjust the red gain in the monitor to avoid overload on some scenes. Now the gray scale and color balance are excellent. Interestingly, Dorothy's dress is now actually white!
Here are pictures. It's interesting that they are better than with the original real 15GP22, which did have a mildly weak red gun that clipped highlights. The monitor, interestingly, works best with the default settings of all controls, user and setup, except red gain. Blue and green gain are best adjusted on the CT-100, as is contrast. The CT-100 cannot control brightness at all; the control has zero effect.
http://www.videokarma.org/attachment...1&d=1557520910 http://www.videokarma.org/attachment...1&d=1557520910
#3
I looked at all my normal test patterns and test pictures, and a couple of whole (and colorful!) vacation slide shows (played on a Sony Blu-ray player). I also watched Adventures of Robin Hood and The Hobbit I and some live shows. Results are all near perfect, and better than a real CT-100 on its CRT because of perfect DC restoration.

A schematic of the converter is attached. It looks and is trivial, but it took quite a bit of work to get all the levels (both voltage and bias current) working right. It uses a 12-15 volt wall wart. It's intended for 75 ohm loads. Note that I'm using 1 watt film resistors for the 1 meg ones. The capacitance of these matters a great deal ... thus, the trimmers.
http://www.videokarma.org/attachment...1&d=1557588331
#4
After a bit of experimentation with a new type color test pattern, I've
decided that the original RCA decision to use unequal bandwidths was just plain stupid, to the eye. It would have been better to use the same bandwidth for I as for Q. This was much easier to see on my new and much brighter display. My modulator provides equal-bandwidth color, as does the DVD player. All that's needed is to unplug the Q demodulator and apply two color squares, Q and -Q (green/magenta), touching horizontally. Above I is black, below it, white. Above -I is white, below it, black. The amount of feedthrough from Q into I looks unnoticeable, and in fact is unnoticeable on program material, especially compared to the blur of Q with narrow bandwidth. For the CT-100 this would, however, require an additional triode section to make the Q amplifier circuit exactly the same as I. I.e., more $. Later sets of course used cheaper demods, but they would still look lots better, as long as they were broadband. Compensation in the $ department: a much shorter delay line needed for I. I just love playing with things like this!
#5
I'm surprised no one has commented on your work yet, but I for one am impressed with the engineering you've applied to this and think it's just great! To see good old-fashioned electronic experimentation like this is one of my reasons for hanging around here at VK.

PS: In my opinion it's only heresy if you can't put the TV back to its original condition when you're done. Since your mods aren't irreversible, I see no reason for the torches & pitchforks.
#6
From everything I've read, the Q channel was bandwidth-limited to reduce crosstalk and distortion. It might matter to your eye, but the average consumer who saw color in those days was probably amazed it worked at all.
__________________
Evolution...
#7
You can't fit 10 lbs of shit in a 5 lb bag. Using wideband I and Q wasn't feasible, and using a narrower bandwidth for both was felt to be inferior to using the unequal standard that was ultimately adopted in 1953. Any commercially adopted standard must be a collection of various compromises.
#8
The bandwidth was also selected to match the characteristics of the eye. We see the most detail in monochrome, followed by the orange-cyan I axis (there were actually two color films made using only I-axis colors), then followed by RGB ... they gave the most bandwidth to what we see the most detail in.

Aside from the nicety of an I/Q demod still displaying semi-passable color with one demodulator dead, if they hadn't taken the eye's detail perception into account we probably would have had an equal-bandwidth R-Y/B-Y transmission/reception system specified in the NTSC standard to simplify TX and RX circuits.
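For reference, here is a sketch of the standard NTSC luma/chroma weighting; the bandwidth figures in the comments are the nominal NTSC values:

```python
import numpy as np

# Standard FCC NTSC RGB -> YIQ matrix (1953 coefficients)
RGB_TO_YIQ = np.array([
    [0.299,  0.587,  0.114],   # Y: luma, transmitted at full ~4.2 MHz
    [0.596, -0.274, -0.322],   # I: orange-cyan axis, ~1.3 MHz
    [0.211, -0.523,  0.312],   # Q: green-magenta axis, ~0.4 MHz
])

def rgb_to_yiq(rgb):
    """Convert an RGB triple (0..1 range) to its Y, I, Q components."""
    return RGB_TO_YIQ @ np.asarray(rgb, dtype=float)

# White carries no chroma at all: Y = 1, I = Q = 0 (to rounding)
print(np.round(rgb_to_yiq([1.0, 1.0, 1.0]), 3))

# A skin-tone-ish orange (values picked arbitrarily for illustration)
# sits almost entirely on the I axis, which is why faces survived
# narrow-Q receivers so well
y, i, q = rgb_to_yiq([0.8, 0.4, 0.2])
```

The ordering of the rows mirrors the post above: most bandwidth to Y, less to I, least to Q.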
__________________
Tom C. Zenith: The quality stays in EVEN after the name falls off! What I want. --> http://www.videokarma.org/showpost.p...62&postcount=4
#9
My point is that using my modulation system, which is wideband in both I and Q on the low-frequency side, the bleedthrough of the wideband Q into the I is quite harmless. You can see it on test patterns but not on programs. Increasing the Q bandwidth a bit (you can't increase it a lot because either the gain gets too low or you get a timing mismatch) results in a better picture. So I wonder ... did the NTSC people back in 1951-1953 ever actually try 1 or 1.5 MHz on the lower side for both axes? That's not counting their PAL or PAF efforts. The PAF at 3.89 MHz looked pretty good. I'd like to see that set working again and the color circuits up to the tweak-up quality currently seen in my CT-100 or the ETF's one as seen at the convention. Remember that the modulation system at the ETF is probably also wideband on both axes on the low side.

Edit: the restoration thread is at http://www.videokarma.org/showthread...ht=dtvmcdonald I add this to find it easily.

Last edited by dtvmcdonald; 05-21-2019 at 12:19 PM.
#10
The asymmetrical wideband quadrature demodulator would provide better chroma detail at the expense of quadrature crosstalk. Because the crosstalk would occur at the higher chroma frequencies (where only the lower I and Q sidebands are present), the effect would be strange colors in fine detail and at horizontal transitions. I recall an excellent article on quadrature crosstalk in an early-sixties RCA Review. I will look for it.

The crude use of passive tuned filters made utilization of the capabilities of proper I/Q demodulation impractical until more sophisticated hardware became available in the 1990s. It is curious that the NTSC were aware of the problem in 1952, hence trying to circumvent it with Phase Alternate Line and Phase Alternate Field demodulation techniques. It is also more amazing that the British chose an 8 MHz channel for 625-line broadcasting in the early sixties to accommodate full double-sideband R-Y and B-Y quadrature-demodulation NTSC in 1966, before making the last-moment move to PAL to fall in line with most of Europe.

Last edited by Penthode; 05-24-2019 at 10:05 PM.
#11
Not only were there lots of compromises, but also unspecified things in the NTSC standard that produced problems in the introduction of I/Q demodulation in the 90s, when it was possible to produce a good I/Q receiver.
One biggie was that all transmitters and receivers had an audio trap at the equivalent baseband sound frequency of 4.5 MHz. So, the upper I and Q sidebands were sharply curtailed at +920 kHz. But the NTSC never specified that the Q channel should have a symmetrical trap at minus 920 kHz, either after the Q modulator or in the Q baseband. As a result, a receiver with full wideband I response would get quadrature distortion (or not) depending on the particular brand and model of color encoder. I went through a series of experiments at Zenith when RCA briefly brought out their solid-state I/Q set, to see if it was worthwhile. We could not understand why RCA had the high-frequency I content turned down so much until we tried it at full amplitude and saw the problem: with a strong Q-axis color, the quadrature I interference was noticeable, even if it was only 10% of the main signal in the Q.

There was another problem even with proper I/Q filtering. The idea that the eye can't see Q-axis color detail is true enough for grating patterns of green/purple. But a single narrow object is like a pulse of color and therefore contains all frequencies, high and low. This means that the typical Technicolor yellow titles would turn noticeably orange on the vertical strokes of small letters, while being yellow on the broad strokes. This effect would happen on any color that contained both I and Q. (Of course, faces weren't affected, since their color is mostly on the I axis.) When I showed this to engineering management, they said no way, it looks too startlingly different. Again, a reason not to turn up the color detail very much.

I suspect that the NTSC committees convinced themselves that things were acceptable because their test material did not contain much strong color detail that was off the I axis (remember, their repeatable material all would have been from film, which already was biased toward the I axis to prevent off-color skin tones), and because most of the viewing was on small screens. Edit: Hypothetically, the single-sideband part of the I signal should be boosted 6 dB by a "shelf filter" in the receiver to regain full detail amplitude, but I don't know if anyone actually implemented this.

Then there was the change in IF design over the years from the flat, sharp cutoff to a "haystack" response, where the NTSC assumed the former and even enshrined a phase-correction filter into all color transmitters that was meant to compensate a flat receiver IF response. By the 90s, surface-wave IF filters were available that could have the "haystack" amplitude response but with the exact phase response that the NTSC had assumed back in 1954, so that problem actually could have been fixed.

In the experiments, the quality went like this. Separate baseband Y, I, Q with I and Q filtered to 1.5 MHz and 0.5 MHz: the best, but some titles and some objects that were between I and Q color phase would jump out as the wrong color on narrow parts and edges. Composite with I and Q demods but no RF modulation or sound trap: like the above, but with more cross-color than equiband demods. RF-modulated composite with I and Q demods: quadrature distortion pops up, especially if the encoder Q filter did not have the 920 kHz trap. In all these stages of experiment, there was improvement in the detail saturation of red objects, which was deemed worthwhile, but not with the other artifacts that crept in. Management's judgement was that it was better to lose color saturation in the details without introducing artifacts. Since the home viewer didn't have the original studio scene to compare to, this was deemed less objectionable than obvious changes in the hue of edges and narrow objects.

It must be noted that receiver color bandpass designs were simplified over the years, and simplified LC filtering means more gradual cutoff of the bandpass. So, equiband sets produced a bit more color detail than a sharp 500 kHz cutoff would. This extra detail contained quite a bit of quadrature distortion above 500 kHz, but the extended bandwidth was low in amplitude, and therefore took advantage of the eye's reduced hue discrimination that the NTSC originally touted. At normal viewing distance this gave some apparent increase in chroma resolution, and when people used composite or S-video inputs from DVDs, there was a visible increase in color resolution because there was no sound trap messing up the upper sidebands.

Last edited by old_tv_nut; 05-24-2019 at 11:35 PM.
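The quadrature-distortion mechanism described above can be demonstrated numerically: quadrature modulation separates I and Q perfectly only while both sidebands survive, and once a sound trap removes the upper sidebands beyond roughly +600 kHz from the subcarrier, a sharp Q transition leaks into a wideband I detector. A rough sketch follows; the brick-wall FFT filters, the bin-snapped subcarrier, and the cutoff numbers are simplifying assumptions, nothing like real receiver filters:

```python
import numpy as np

fs = 20e6                  # simulation sample rate, Hz (assumed)
n = 4096
t = np.arange(n) / fs
fsc = 733 * fs / n         # ~3.5791 MHz, snapped to an FFT bin for a clean demo

def band(x, lo, hi):
    """Keep only spectral content between lo and hi Hz (crude FFT brick wall)."""
    X = np.fft.rfft(x)
    f = np.fft.rfftfreq(n, 1 / fs)
    X[(f < lo) | (f > hi)] = 0
    return np.fft.irfft(X, n)

# Test scene: no I at all, one sharp Q edge (think green -> magenta),
# with the encoder's Q channel allowed a "wideband" 1.5 MHz
i_sig = np.zeros(n)
q_sig = band(np.where(t > t[n // 2], 0.5, -0.5), 0, 1.5e6)

chroma = (i_sig * np.cos(2 * np.pi * fsc * t)
          + q_sig * np.sin(2 * np.pi * fsc * t))

def detect_i(rx):
    """Synchronous I-axis detection with a wideband (1.5 MHz) output filter."""
    return band(2 * rx * np.cos(2 * np.pi * fsc * t), 0, 1.5e6)

clean = detect_i(chroma)                        # both sidebands intact
trapped = detect_i(band(chroma, fsc - 1.5e6,    # sound trap removes upper
                        fsc + 0.6e6))           # sidebands past +600 kHz

# Symmetric sidebands: the I detector rejects Q essentially perfectly.
# Asymmetric (trapped) sidebands: the Q edge leaks into I as a hue error.
print(np.max(np.abs(clean)), np.max(np.abs(trapped)))
```

The leakage appears only around the transition, which matches the observation that edges and narrow strokes, not broad areas, took on the wrong hue.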
#12
I only understood about 10% of this, but it's really cool to be in the presence of people who can have detailed discussions on these topics. I'm assuming you guys know what you're talking about; I wouldn't know if it were mostly gibberish you made up to screw with the newbies.
Last edited by AlanInSitges; 05-25-2019 at 07:08 AM. |
#13
Back in '66-'67, RCA field reps would visit their dealers periodically to discuss technical/service issues. A question came up about why the color had such low resolution compared to the luma signal. The rep explained in newbie-friendly terms that the chroma signal didn't need to be high-rez because color is simply "painted into" the B&W picture.
#14
On a different topic ... I've been running my RCA-Sony setup with the full HV system running in the CT-100, just disconnected from the CRT, with the AC input at 108 volts to reduce heat. This works just fine (the thing runs well even down to 100 volts). But I was thinking ... would it be better to simply unplug the HV and focus rectifier tubes? Would this reduce stress on the horizontal output tube and transformer? Or would lowering the load cause the voltage to rise, causing even more stress? I could try it, but what's your opinion? Note that this is not the same as doing it on a B&W set, because color sets have essentially constant HV current from the HV rectifier due to shunt regulation.
#15
With our 8 MHz channel we had enough space for wideband U and V, hence no need for the phase-shifted I and Q axes. That's true for both PAL and NTSC. There was some unease about the differential phase and gain problems of NTSC, and also about the need for a hue control, with potentially very inaccurate colours as a result of user adjustment. I think we rightly ruled out SECAM, as it was a nightmare in the studio: you can't fade or mix a SECAM signal without a lot of hassle and quality loss. Good-quality, low-cost 64 µs quartz delay lines came at just the right moment (from Philips) to make PAL a very strong contender. NTSC would have made the UK the odd one out among all the PAL and SECAM countries, though transcoding PAL<>NTSC on the same line standard isn't too hard, just as NTSC<>PAL-M is also fairly easy. Was PAL the right decision? Probably, at the time. In retrospect it was much harder to get clean decoding of PAL than NTSC, but this has now been largely overcome.
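For anyone following along: U/V and I/Q describe the same color plane, with I and Q simply being U and V rotated by 33 degrees, which is why an equiband U/V system needs no extra axis machinery. A quick numerical check using the standard weighting coefficients:

```python
import numpy as np

def rgb_to_uv(rgb):
    """Equiband colour-difference axes (standard U/V weights)."""
    r, g, b = rgb
    y = 0.299 * r + 0.587 * g + 0.114 * b
    return 0.492 * (b - y), 0.877 * (r - y)          # U, V

def uv_to_iq(u, v):
    """NTSC's I/Q axes are U/V rotated by 33 degrees."""
    a = np.deg2rad(33.0)
    return v * np.cos(a) - u * np.sin(a), v * np.sin(a) + u * np.cos(a)

# Saturated red: rotating its U/V lands on the familiar NTSC I/Q values
i, q = uv_to_iq(*rgb_to_uv([1.0, 0.0, 0.0]))
print(round(i, 3), round(q, 3))   # -> 0.596 0.211
```

So the choice between the two was never about the colours themselves, only about which axes got how much bandwidth.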