View Full Version : "OTA" analog on a new LCD TV


dtvmcdonald
06-10-2013, 10:05 AM
I was finishing calibration of my 9T246 last night and decided to feed
it from a Blu-ray player playing a calibration HDTV DVD-ROM I made from
a file I found on the web (plain DVDs can do HDTV, just short ones)
rather than the Digital Video Essentials DVD.
This meant taking the audio/video outputs, feeding them to a very cheap,
very modern VCR (VCR only, no DVD) and using its RF output.
This DVD has a really good set of patterns.

When I finished, I decided to look at the disc on my very recent
46" Sony Bravia ... through the RF input, which I had never done
before, always before using HDMI. The pictures were stunningly
good! The very last pattern on the disc is a standard color-bars pattern
which alternates "flag versus no flag" (does anybody know what that means?
There is no visible difference with the flag or without). Of course using the HDMI output
this pattern is absolutely perfect, no color fringing, etc. But on the RF
input it was incredibly good: there was some color fringing, especially when viewed through the usual Video Essentials color filters, but far, far less
than any I have ever seen before (except of course the CPA prototype
at the ETF convention).

How does this happen? I presume that the Blu-ray player is outputting
plain double-sideband (not vestigial) chroma on the video output. The
luma extends to 5 MHz. But the modulator must be cutting
it off somehow, as the sound was not bothered by color patterns.

Does anybody know how these modern cheapie devices are doing
it so well ... especially the cheapie modulator?
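The sound-carrier point above can be put in numbers. A minimal sketch, assuming the standard NTSC channel figures (the exact filter corner in any given cheap modulator is my assumption, not something from this thread):

```python
# Why a cheap RF modulator must low-pass the baseband video before modulating:
# standard NTSC channel figures (MHz, relative to the picture carrier).
CHROMA_SUBCARRIER = 3.579545    # color subcarrier offset
SOUND_CARRIER = 4.5             # aural carrier offset
BROADCAST_LUMA_LIMIT = 4.2      # nominal broadcast baseband video limit

# Assumed figure: the player's composite output can carry luma out to ~5 MHz.
dvd_luma_bandwidth = 5.0

# Any luma energy above ~4.2 MHz lands close to the 4.5 MHz sound carrier
# and would buzz the audio, so the modulator has to filter it off.
overlap = dvd_luma_bandwidth - BROADCAST_LUMA_LIMIT
print(f"Top {overlap:.1f} MHz of luma must be filtered to protect "
      f"the sound carrier at +{SOUND_CARRIER} MHz")
```

That would explain why the color patterns leave the sound untouched: the offending part of the spectrum never reaches the channel.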

When finished with the patterns I decided to do something very
odd on the 9T246, so I watched not Wizard of Oz in glorious
9" B&W, but the Harry Potter I Blu-ray all the way through in B&W.
I must admit that B&W sometimes made seeing things a bit difficult,
even though Harry Potter is rather low in the color department,
especially as compared to the Wizard. But very oddly the dialog
was much easier to understand in mono than on my super-duper
5.0 surround system. Maybe it was a different track on the Blu-ray.
(Note that my Blu-ray player won't output the program video on composite if the HDMI
connector is plugged in, so it is aware it's driving composite. It does
output a composite signal in that case, but it's just a message telling you to
unplug the HDMI if you want the program on composite.)

Doug McDonald

ChrisW6ATV
06-18-2013, 01:33 AM
on my very recent
46" Sony Bravia ... through the RF input... The pictures were stunningly
good!

How does this happen?
Does anybody know how these modern cheapie devices are doing
it so well ... especially the cheapie modulator?
As much as some people "like to dislike it", the answer is... because it is DIGITAL! :)

Seriously, the disc itself has pure-digitally-generated test signals, though they are converted to analog at the outputs. The modulator does not do anything but modulate, so there is nothing much there to go wrong. The Sony LCD flat panel demodulates the NTSC RF, and then (hee hee!) converts it to digital data for the rest of the processing and scaling to the LCD panel. Digital signal processing can be utterly amazing when done right, and you have seen a very good example.

Chip Chester
06-18-2013, 09:20 AM
The flag/no flag issue might be an aspect ratio signal, to tell the display device the aspect ratio of the source, so it can correctly apply whatever settings you've entered. There's also a copy flag, and likely numerous others buried within DTV...

In your test-signal scenario, keep in mind too that you don't truly have an analog "spread" of possible signal values. You have an analog representation of stairstepped digital levels. Take for example "banding", commonly seen on sunsets or graduated backgrounds. When coming from a truly analog source, there is a smooth gradient of values from light to dark. To decide which of two neighboring digital values to display, decisions have to be made by the circuitry, and compromises made. With a digitally-originated signal such as yours, there are discrete values, and no "middle" values between the possible digital steps. The decoding circuitry doesn't have to figure anything out; it just displays what's there, which already "fits" the display. (Kind of a "no dithering" approach, although I'm not sure dithering is the right term.)
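The stairstep point can be shown with a toy example of my own (not from the thread): quantizing an idealized smooth ramp to 8 bits collapses many nearby analog values onto the same digital code, which is where visible banding comes from.

```python
import numpy as np

# An idealized smooth analog gradient from black to white.
analog_ramp = np.linspace(0.0, 1.0, 10000)

# 8-bit quantization: every analog value snaps to one of 256 codes.
digital_ramp = np.round(analog_ramp * 255) / 255

# Only 256 distinct "stairsteps" survive, no matter how smooth the input was.
print(np.unique(digital_ramp).size)
```

A digitally-originated test pattern starts out on those steps already, so there is nothing for the display's circuitry to second-guess.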

A good comparison would be a true analog signal, compared with a digital signal of the same image. Not an easy pair of signals to come up with "in the wild" though. You'll wind up testing the cameras more than the display.

Chip

dtvmcdonald
06-18-2013, 10:08 AM
Comments on the two reply posts.

First, I suspect that the Sony does NOT demodulate the NTSC
and then digitize it! I suspect ... and have heard that sets do this ...
that it digitizes the IF somewhat undersampled (i.e. for a 45 MHz IF,
at say 15 MHz), just as it would do for ATSC, and uses a DSP to
find and phase-lock the carrier(s) and then demodulate them
digitally. Whether it really does that is, in fact, the question!
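The undersampling scheme described above works because of aliasing. A small sketch, using the standard NTSC IF carrier frequencies with the 15 Msps rate as an assumed example figure:

```python
def alias_frequency(f_signal, f_sample):
    """Where a carrier lands after (under)sampling: fold into [0, fs/2]."""
    f = f_signal % f_sample
    return f if f <= f_sample / 2 else f_sample - f

# Standard NTSC IF carriers, undersampled at an assumed 15 Msps:
# the carriers alias down to low frequencies where a DSP can
# phase-lock and demodulate them entirely in software.
print(alias_frequency(45.75, 15.0))  # video IF carrier -> 0.75 MHz
print(alias_frequency(41.25, 15.0))  # sound IF carrier -> 3.75 MHz
```

As long as the IF bandwidth fits within one Nyquist zone, nothing is lost by sampling far below the carrier frequency.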

As to analog/digital differences in quality ... 256 levels are absolutely
enough for TV, zero quibble, given the gamma used. And a test
DVD (HDTV, that is, even if not Blu-ray) is going to have an absolutely
great super-high bitrate that is going to show no banding ... and this one
does not show bands. The information content of these test discs is very, very low indeed! Remember that except for the motion patterns there is ZERO
interpolation to do between frames; all the bits can be
concentrated in the full I-frames.
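The "256 levels is enough" point can be backed with the usual rule of thumb (my own arithmetic, not from the thread): ideal quantization SNR is about 6.02*N + 1.76 dB for N bits.

```python
def quantization_snr_db(bits):
    """Ideal quantization SNR rule of thumb: ~6.02*N + 1.76 dB."""
    return 6.02 * bits + 1.76

# 8 bits (256 levels) gives roughly 50 dB, comfortably above what
# consumer analog video chains typically delivered.
print(f"{quantization_snr_db(8):.2f} dB")
```

And that is before gamma, which spends the available codes where the eye is most sensitive.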

ChrisW6ATV
06-19-2013, 11:59 AM
Ah, if it does IF DSP, then it can do an even more amazing job than starting with DSP at baseband video. Very cool.