#46
I hope we can take it as true that, with reasonably modern broadcast kit and reasonable engineering standards, both PAL and NTSC will give good results.
There are artefacts with both systems. Some relate to the underlying scan rates; others, such as lurid patterns on fine detail, are a side effect of the NTSC and PAL coding itself. These cross-colour and cross-luminance effects can be minimised by comb-filter decoders, which were much simpler for NTSC than for PAL and hence much more common in NTSC TVs. NTSC has lower chroma bandwidth, so transitions between highly saturated colours are a little worse in NTSC; this is readily seen on the green/magenta transition on colour bars. PAL and NTSC have different dot-crawl effects, primarily visible on monochrome sets; since the PAL subcarrier is a higher frequency, they are probably less visible in PAL. Phase errors should be minimal with decent kit and reasonable engineering.
Now wind the clock back to the early 1960s, or even to the 1950s. It is obvious from the work at Hazeltine Labs, Telefunken and others that colour phase problems were of great concern. NTSC broadcast kit needed a lot of engineering attention to give consistent colour, and the TVs weren't much better: you needed a hue control, which can readily be misadjusted by viewers. The idea of colour phase alternation (CPA) as a solution to this was first raised at Hazeltine Labs (c1955?) but was judged impractical then. CPA could be done on a dot, line or field basis; the latter two were totally out of reach back then. Bruch picked up the CPA idea, did it on a line-by-line basis, and invented PAL.
At the time (late 1950s to mid 1960s) BBC engineers wanted to use NTSC and tried both 405- and 625-line NTSC systems. They reckoned they could work to high enough standards to keep phase errors acceptable, aided of course by much more modern kit than was available in the US in 1954. At the same time the French were pushing SECAM as a solution: totally hideous in the studio, and not really capable of being improved by better comb filters and suchlike. PAL was seen as the best answer AT THE TIME.
Looking back, 625-line NTSC would likely have worked perfectly well. Hindsight is gloriously 20:20. In the US, the coming of NTSC colour brought the decision to offset the line and frame rates by a harmless-looking fraction of a percent, to avoid moving the sound carrier by a similar amount. Who was to know back then the sheer amount of grief that would cause for broadcasters once timecode was invented. That grief continues to this day, as all the 1080 and 720 systems have widely used options for 59.94Hz and other field rates with a 1000/1001 offset.
The whole PAL/NTSC debate is now well behind us. For some years nobody (I'm sure somebody will find me an example of a small station in Africa that still uses PAL) has been producing new material in a composite format. High quality decoders are available to decode PAL and NTSC to their components with excellent results. Almost nobody is even radiating PAL or NTSC now.
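The arithmetic behind that 1000/1001 timecode grief is easy to show. Here is a minimal sketch (Python, purely illustrative; drop-frame compensation details are omitted):

```python
from fractions import Fraction

# NTSC colour field rate: nominal 60 Hz offset by the 1000/1001 factor.
FIELD_RATE = Fraction(60) * Fraction(1000, 1001)   # 59.94... Hz
FRAME_RATE = FIELD_RATE / 2                        # 29.97... fps

# A "non-drop" timecode counter labels frames as if the rate were exactly
# 30 fps. Over one hour of real time the labels drift behind the wall clock:
real_frames_per_hour = FRAME_RATE * 3600           # ~107892.1 frames
labelled_frames_per_hour = 30 * 3600               # 108000 frames
drift_frames = labelled_frames_per_hour - real_frames_per_hour
drift_seconds = drift_frames / FRAME_RATE

print(float(FRAME_RATE))       # 29.97002997...
print(float(drift_seconds))    # 3.6 seconds of drift per hour
```

That 3.6 seconds per hour is exactly the error drop-frame timecode was invented to paper over, by skipping frame labels on a fixed schedule.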
#47
Don't you just wish the Beeb had decided to go with 405 line NTSC in 1956... all those glorious shows we might have in colour!
__________________
____________________________ ........RGBRGBRGB ...colour my world
#48
Actually NTSC is wider: 1.3MHz vs 1MHz.
With regard to monitoring in PAL-S... come to think of it, some monitors had a PAL-S/PAL-D switch for convenient signal evaluation (clever).
#49
Quote:
It is interesting that the new 625-line UK standard had wider bandwidth to accommodate full double-sideband R-Y and B-Y. And in the 625 standard the video-to-audio carrier spacing was set, as in NTSC, so that the aural carrier would be an integer multiple of the horizontal scan frequency, to facilitate proper chroma-luma interleaving.
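The integer-multiple relationship described above is easy to check numerically. A small sketch (Python, purely illustrative; the constants are the standard published figures):

```python
from fractions import Fraction

# NTSC: the colour line rate was derived from the 4.5 MHz vision-to-sound
# spacing, which is exactly the 286th harmonic of the horizontal frequency.
f_h_ntsc = Fraction(4_500_000, 286)        # ~15734.27 Hz (vs 15750 Hz mono)
f_sc_ntsc = Fraction(455, 2) * f_h_ntsc    # chroma at 455/2 x line rate

# 625-line UK System I: the 6.0 MHz vision-to-sound spacing is likewise an
# integer (384th) multiple of the 15625 Hz line frequency.
harmonic_625 = Fraction(6_000_000, 15_625)

print(float(f_h_ntsc))     # 15734.2657...
print(float(f_sc_ntsc))    # 3579545.4545...
print(harmonic_625)        # 384
```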
#50
Quote:
The NTSC originally pursued CPA because of the limited bandwidth available for the interleaved chroma channel: CPA would facilitate full vestigial-sideband R-Y/B-Y operation with quadrature crosstalk cancellation. Unfortunately the electronic technology still had a long way to go before CPA could be used effectively, and ultimately vestigial-sideband I and double-sideband Q were adopted. That gave reduced chroma bandwidth but produced superior pictures at the time. The NTSC made the right decision: forty years later the electronic technology could use the standard far more effectively, and hue errors had become a thing of the past. It is interesting to consider that 50s-designed NTSC sets today display consistently much better pictures than they did when new, simply because the signal source is now consistently much better.
Audiokarma
#51
The pseudo-PAL generated by a DVD player or video game will still have the 4.43MHz colour subcarrier. True 525/60 PAL would have a colour subcarrier of 3.57MHz or thereabouts.
Quote (dr.ido):
If you want to see 525/60 PAL for yourself, check the options menu in your DVD player. Some have an option to output 525/60 PAL when playing back NTSC-encoded DVDs. This mode is compatible with most PAL-only TVs while avoiding some of the artifacting caused when the player converts an NTSC-encoded disc to 625/50 PAL.
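The subcarrier figures being compared here can be derived from the line rates. A small sketch (Python, purely illustrative; the constants are the standard ones):

```python
from fractions import Fraction

# Standard 625/50 PAL subcarrier: (1135/4 + 1/625) x the 15625 Hz line rate.
f_sc_pal = (Fraction(1135, 4) + Fraction(1, 625)) * 15_625
print(float(f_sc_pal))            # 4433618.75 Hz: the familiar 4.43 MHz

# 525/60 NTSC subcarrier for comparison: 455/2 x (4.5 MHz / 286).
f_sc_ntsc = Fraction(455, 2) * Fraction(4_500_000, 286)
print(round(float(f_sc_ntsc)))    # 3579545 Hz, i.e. ~3.58 MHz
# (PAL-M, the true 525/60 PAL, sits nearby at about 3.575611 MHz.)
```

So "pseudo PAL" (PAL-60) keeps 525/60 timing but the 4.43MHz subcarrier, which is why a PAL-M set and a PAL-60 signal are not the same thing.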
#52
Yes, I'd forgotten about the subcarrier. I guess that leaves models that actually have an option for PAL-M, for the country or two that uses (used?) it, or a couple of consoles that had to have a 4.43MHz crystal fitted to output PAL.
Either way, the end result is probably more a comparison of consumer encoder ICs than of PAL and NTSC themselves. I guess the closest to seeing real NTSC, for those of us who have never been to an NTSC country, would be an NTSC Laserdisc? They're analogue NTSC composite on the disc itself?
#53
Just watching a Poirot in glorious NTSC on my "kerbside find" 1990s Panasonic. The set handles NTSC from DVD well via composite... but my technical knowledge of the exact form of signal the DVD player produces is limited.
I have set the player up as if it were connected to an NTSC set. The Tint control becomes active, screen logos change size and so on, and the TV needs its brightness and contrast adjusted to reflect the change in black level. And the reds change!
#54
If I were watching PAL, I'd modify the decoder for PAL-S (the way Telefunken originally intended it); that way you get back the vertical chroma resolution the delay line robs you of!
A colour system should really not give the public a Tint control, because setting it up properly requires a test signal. Even my 1954 GE hides the Hue (Tint) control on the back panel (not sure that was a good idea back then), but my mid-60s RCA has it on the front panel with no correct-setting detent (was the all-tube chassis not stable enough for that?).
#55
As I understand it, DVD video is an MPEG-2 stream which decompresses into digital component. The player encodes this as NTSC or PAL depending either on the flags set on the disc itself or on the user settings. Most players default to auto and set the output according to the disc; most also have settings to override this and output either PAL or NTSC for all discs.
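That selection logic might be sketched like this (a hypothetical illustration in Python; the setting names and function are invented for the example, not any particular player's firmware):

```python
from enum import Enum

class Standard(Enum):
    NTSC = "525/60 timing, 3.58 MHz subcarrier"
    PAL = "625/50 timing, 4.43 MHz subcarrier"
    PAL60 = "525/60 timing, 4.43 MHz subcarrier"  # the "pseudo PAL" mode

def output_standard(disc_is_ntsc: bool, user_setting: str) -> Standard:
    """Pick the composite output standard from the disc flag and user menu.

    user_setting is 'auto', 'pal', 'ntsc' or 'pal60' (illustrative names)."""
    if user_setting == "auto":
        # Default: follow the flags on the disc itself.
        return Standard.NTSC if disc_is_ntsc else Standard.PAL
    if user_setting == "pal60" and disc_is_ntsc:
        # NTSC disc to a PAL-only TV: keep 525/60 timing but re-encode the
        # chroma on the 4.43 MHz PAL subcarrier, avoiding a 50 Hz conversion.
        return Standard.PAL60
    # Explicit override: force one standard for all discs.
    return Standard.NTSC if user_setting == "ntsc" else Standard.PAL

print(output_standard(True, "auto").name)    # NTSC
print(output_standard(True, "pal60").name)   # PAL60
```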
#56
Quote:
SMPTE 170M (1993) gives more or less the same figures as for PAL, but the upper sideband isn't transmittable in a standard System M channel. 170M notes the earlier NTSC standard where Q is 2dB down at 0.4MHz. A lot of NTSC coding has been done with narrowband 600kHz U/V axes rather than the complication of I/Q. This is discussed in SMPTE EG27; I would attach a copy but it's SMPTE copyright. Here's a quote from EG27: Quote:
#57
Quote:
http://www.snellgroup.com/documents/...des/edecod.pdf
"...there are no modern [NTSC] receivers that utilize the theoretically possible wideband I demodulation.."
What about premium TVs like the RCA Dimensia (which touted full chroma bandwidth in its ads), ProScan, Sony Wega and the incredible progressive-scan Panasonic XR series?
#58
Strictly, that's John Watkinson's paper, published by S&W. JW is a very well respected engineer here in the UK. His books include "The Art of Digital Audio" and "The Art of Digital Video"; both are always to hand by my desk.
He covers historic practice as stated in the SMPTE docs, and then correctly states that modern NTSC coders often use 1.3MHz chroma. This too is correct: my own designs do, as do many others. I don't bother to switch filters when changing between PAL and NTSC, which is fine in the studio. However, the upper sideband of a 1.3MHz chroma signal will be heavily mauled by a System M transmitter. Strictly the coders maintain 1.3MHz for U and V, not I and Q; though if U and V are both 1.3MHz, I and Q will be too.
Poynton, in "A Technical Introduction to Digital Video" pp187-190, takes a similar view to Watkinson. He notes that SMPTE 170M encourages the use of wideband (1.3MHz) chroma in the studio, but also says that the practical broadcast chroma bandwidth is only about 600kHz. The subtleties of I/Q coding have been largely ignored in practice: most broadcast coders simply encode on the U/V axes and bandwidth-limit before the transmitter. Hence even a receiver with full chroma bandwidth and I/Q demodulation will find no benefit on virtually all material. Any claims like this are marketing puff.
Last edited by ppppenguin; 09-08-2014 at 12:30 PM.
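The point that equal-bandwidth U and V imply equal-bandwidth I and Q follows from the textbook relationship that I and Q are simply the U and V axes rotated by 33 degrees. A minimal sketch (Python; gain and filtering details ignored):

```python
import math

def uv_to_iq(u: float, v: float) -> tuple[float, float]:
    """Rotate the (U, V) chroma pair by 33 degrees to get (I, Q).

    Because this is a pure rotation, any band-limiting applied equally to
    U and V (e.g. 1.3 MHz in a studio coder) applies equally to I and Q,
    which is why wideband-I demodulators find nothing extra to recover."""
    a = math.radians(33)
    i = -u * math.sin(a) + v * math.cos(a)
    q = u * math.cos(a) + v * math.sin(a)
    return i, q

# A pure-V input (U = 0) lands mostly on the I axis:
i, q = uv_to_iq(0.0, 1.0)
print(round(i, 3), round(q, 3))   # 0.839 0.545
```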
#59
Quote:
It is worth noting that differences in color vision among viewers with normal color discrimination can account for up to a +/-20% difference in the ratio of red and green needed to match a yellow, and therefore a corresponding difference in flesh-tone matching. This should probably be corrected by an adjustment to white balance for each viewer, but that has never been contemplated because: 1) you can't do it for different simultaneous viewers; and 2) you would never be able to teach non-technical viewers how to make the adjustment. Even with the white adaptation that occurs in all viewers, they will still see differences in flesh tones, and a hue control lets at least one person in the room compensate the rendition for his/her vision.
See: Hirsch, Charles J., "A study of the need for color controls on color TV receivers in a color TV system operating perfectly", Radio Corporation of America, Princeton, N.J., IEEE Transactions on Broadcast and Television Receivers, Vol. BTR-10, No. 3, Nov. 1964, pp. 71-86.
#60
Quote:
So if I ran a TV station back in the 60s and 70s, I would have low-pass filtered the luma to remove anything above 3MHz, then mixed in the chroma subcarrier and transmitted that, producing far fewer artifacts on viewers' TV sets. People would say that my station looked cleaner... B&W sets made after NTSC color was introduced low-pass filtered the luma as well, so those viewers would not see a lack of fine detail either.
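That low-pass-then-add idea can be sketched numerically. The following is illustrative only, not broadcast practice: it assumes sampling at four times the NTSC subcarrier and uses a simple windowed-sinc filter to show that luma detail at the subcarrier frequency (the cause of cross-colour) is removed by a 3MHz cut.

```python
import math

FS = 14_318_180.0          # sample rate: 4x the NTSC subcarrier
F_SC = FS / 4              # 3.579545 MHz
CUTOFF = 3_000_000.0       # luma low-pass, as described above
N_TAPS = 63

def lowpass_taps(cutoff: float, fs: float, n: int) -> list[float]:
    """Windowed-sinc FIR low-pass (Hamming window), unity DC gain."""
    fc = cutoff / fs
    m = n - 1
    taps = []
    for k in range(n):
        x = k - m / 2
        h = 2 * fc if x == 0 else math.sin(2 * math.pi * fc * x) / (math.pi * x)
        w = 0.54 - 0.46 * math.cos(2 * math.pi * k / m)  # Hamming window
        taps.append(h * w)
    s = sum(taps)
    return [t / s for t in taps]

def convolve(signal: list[float], taps: list[float]) -> list[float]:
    """Centered FIR filtering with zero padding at the edges."""
    half = len(taps) // 2
    out = []
    for n in range(len(signal)):
        acc = 0.0
        for k, t in enumerate(taps):
            j = n + half - k
            if 0 <= j < len(signal):
                acc += t * signal[j]
        out.append(acc)
    return out

# Test line: luma with fine detail right at the subcarrier frequency,
# exactly the content that excites colour decoders on receivers.
t = [n / FS for n in range(512)]
luma = [0.5 + 0.3 * math.sin(2 * math.pi * F_SC * ti) for ti in t]
clean = convolve(luma, lowpass_taps(CUTOFF, FS, N_TAPS))

# After filtering, the 3.58 MHz detail is strongly attenuated, so mixing
# the chroma subcarrier back in no longer collides with luma energy there.
ripple = max(clean[100:400]) - min(clean[100:400])
print(ripple < 0.05)   # True: the subcarrier-frequency detail is gone
```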