The method used to transmit high definition television (HDTV) in the US apparently has little or no forward error correction included. As a result, little glitches in the signal can cause large hunks of the picture to drop out. This is because the picture is encoded using methods similar to JPEG and
MPEG files, and a small dropout can ruin the encoding for large rectangular blocks of the picture. (As opposed to a small glitch across a line or so as you would have in an analog TV signal).
For the next few years, US broadcast stations are going to be transmitting both an analog and an HDTV signal side by side as we transition to the new format nationwide. At some point, the analog channels will be dropped and reassigned to other uses. (This is currently scheduled for 2006, but frankly I wouldn't bet money that it happens on time). However, cable TV, because of deregulation, is not required to cut over entirely to HDTV. So cable hookups will probably have analog signals for some time to come.
I propose that an intelligent HDTV receiver could take advantage of the presence of these analog signals to make the blocky MPEG dropouts less noticeable. In the background, it could be decoding the analog version of the same channel, and up-converting the low-res picture into the equivalent resolution being transmitted on the HDTV signal. Then, when a dropout occurs, instead of showing rectangular blocks of random or previous pixels on the screen, show the equivalent data from the analog signal.
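Roughly, the substitution step might work something like this (a Python sketch with made-up names; it assumes the decoder can flag which macroblocks were corrupted, and that the analog picture has already been up-converted and time-aligned to the HD frame):

    import numpy as np

    BLOCK = 16  # MPEG macroblock size

    def conceal_dropouts(hd_frame, analog_upconverted, bad_blocks):
        # hd_frame: decoded HD luma plane, e.g. shape (1080, 1920)
        # analog_upconverted: the NTSC picture already scaled to the same size
        # bad_blocks: list of (block_row, block_col) the decoder flagged as corrupted
        patched = hd_frame.copy()
        for br, bc in bad_blocks:
            y0, x0 = br * BLOCK, bc * BLOCK
            patched[y0:y0 + BLOCK, x0:x0 + BLOCK] = \
                analog_upconverted[y0:y0 + BLOCK, x0:x0 + BLOCK]
        return patched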
The glitches would still be visible (since the substituted blocks would be at lower resolution), but far less noticeable. Essentially it would be treating the combination of the analog NTSC and digital HDTV signals as if it were a progressive encoding of a single signal.
You'd need a pretty hefty DSP to do this work, and synchronizing the frames between the two pictures would be a hairy problem. Also, it wouldn't help you with dropouts that occur near the right and left edges of the frame, since NTSC is 4:3 and HDTV is 16:9. But it might be very useful in areas where over-the-air reception of HDTV is poor.
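For the aspect-ratio limitation, if the 4:3 analog picture happens to be a straight center cut of the 16:9 frame (just an assumption; a broadcaster could letterbox or pan-and-scan instead), then scaling it to a 1080-line HD format gives a 1440-pixel-wide band centered in the 1920-pixel frame, with 240 pixels on each side that have nothing to fall back on. A rough sketch of that bookkeeping (same hypothetical Python setup as above):

    import numpy as np

    HD_W, HD_H = 1920, 1080      # assuming the 1080-line HD format
    NTSC_W, NTSC_H = 720, 480    # digitized NTSC frame (assumption)

    COVER_W = HD_H * 4 // 3                # 1440: width of the 4:3 picture at HD height
    X_OFFSET = (HD_W - COVER_W) // 2       # 240: uncovered band on each side

    def block_has_coverage(x0, block=16):
        # True if an HD macroblock starting at column x0 can be patched from analog
        return x0 >= X_OFFSET and x0 + block <= X_OFFSET + COVER_W

    def upconvert_center_cut(ntsc_frame):
        # Nearest-neighbour scale to 1440x1080 (a real box would filter properly),
        # padded with zeros where the analog signal never carried any picture.
        ys = np.arange(HD_H) * NTSC_H // HD_H
        xs = np.arange(COVER_W) * NTSC_W // COVER_W
        scaled = ntsc_frame[ys][:, xs]
        out = np.zeros((HD_H, HD_W), dtype=ntsc_frame.dtype)
        out[:, X_OFFSET:X_OFFSET + COVER_W] = scaled
        return out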
Low Bandwidth Correction of High Bandwidth Broadcasts
http://www.halfbake...dwidth_20Broadcasts previous discussion of the problem by [st3f] [krelnik, Oct 04 2004, last modified Oct 05 2004]
Annotation:
How about taking the feed from another digital channel? In bad weather you could watch a Venezuelan soccer game where Caracas takes on the Playboy channel.
One difficulty with this sort of approach is that the two signals will rarely line up in time, space, or 'quality' parameters (brightness, contrast, etc.). Unless there were some way of synchronizing them, the attempted cure could be as bad as the disease.
Yeah, I realize implementing this would be non-trivial, hence the need for a really high-powered DSP chip. But I'm guessing that correctly received frames could be compared against each other to adjust things like contrast, brightness, and such every so often.
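Something along these lines, maybe (a very rough Python sketch; the simple gain-and-offset model and all the names here are just placeholders, not a claim about how a real receiver would do it):

    import numpy as np

    def estimate_gain_offset(hd_good_pixels, analog_pixels):
        # Fit hd ~= gain * analog + offset over co-located pixels from frames
        # that were received correctly, updated every so often.
        gain, offset = np.polyfit(analog_pixels.ravel().astype(float),
                                  hd_good_pixels.ravel().astype(float), 1)
        return gain, offset

    def match_levels(analog_block, gain, offset):
        # Apply the correction before substituting the block into the HD picture.
        return np.clip(gain * analog_block.astype(float) + offset, 0, 255)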
What will happen after 2006? Will the TV crowd get used to dropouts, just as the computer crowd is now used to system crashes?
An optimist would say that the broadcasters will have worked out the kinks in their signals by then and receivers will be better, reducing the incidence of dropouts.
A realist would agree with you: we'll just have to get used to it.
Even those of us with rabbit ears still sometimes get dropouts if the network affiliate's feed gets iffy.