You don’t say the source of your HD programming. If it’s cable or satellite, my WAG would be that the bandwidth simply gagged. If it’s over the air, I got nothin’.
True, but it’s really not so simple. JPEGs are non-redundant - delete one byte of data, and the whole file is unreadable. Over-the-air HD, however, is incredibly redundant. You can lose 40% of the data and still get a perfect picture. Lose 41%, though, and the picture drops out (numbers not exact).
Analog TV quality decreases fairly linearly. Digital TV quality stays constant long after analog TV looks like crap - but then it drops to nothing very sharply.
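To make the cliff concrete, here’s a toy Python sketch - my own simplification, not anything out of an actual ATSC/DVB spec. Assume the error correction can repair a frame as long as the fraction of corrupted bits stays below some limit (echoing the “40%” figure above), and fails outright past it, while analog just fades in proportion to the noise:

```python
import random

CORRECTION_LIMIT = 0.40   # hypothetical FEC capacity (illustrative only)
BITS_PER_FRAME = 10_000   # bits simulated per noise level

def analog_quality(noise: float) -> float:
    """Analog picture degrades roughly linearly with noise."""
    return max(0.0, 1.0 - noise)

def digital_quality(noise: float) -> float:
    """Digital picture is perfect while the FEC keeps up, then collapses."""
    corrupted = sum(random.random() < noise for _ in range(BITS_PER_FRAME))
    return 1.0 if corrupted / BITS_PER_FRAME <= CORRECTION_LIMIT else 0.0

print(f"{'noise':>6} {'analog':>7} {'digital':>8}")
for pct in range(0, 60, 5):
    noise = pct / 100
    print(f"{noise:6.2f} {analog_quality(noise):7.2f} {digital_quality(noise):8.2f}")
```

Run it and you’ll see the analog column slide down smoothly while the digital column sits at 1.00 right up to the limit, then drops straight to 0.00 - which is exactly the “perfect picture, then nothing” behavior being described.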
Is that true? If you lost the last 1K of a progressive JPEG, you’d hardly notice the difference. With a normal (baseline) JPEG you’d just lose the picture from the point of the loss onward. Only if the header information were hit would the file be useless.
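This is easy to test yourself. Here’s a quick sketch using Pillow (the filename and the “keep” fractions are just placeholders I picked): truncate a JPEG at various points and see what still decodes. A baseline JPEG loses everything below the cut; a progressive one usually just comes out blurrier:

```python
import io
from PIL import Image, ImageFile

ImageFile.LOAD_TRUNCATED_IMAGES = True  # let Pillow decode partial files

# Chop increasing amounts off the end of a JPEG and try to decode the rest.
data = open("photo.jpg", "rb").read()  # any local JPEG will do
for keep in (1.0, 0.9, 0.5, 0.1):
    cut = data[: int(len(data) * keep)]
    try:
        img = Image.open(io.BytesIO(cut))
        img.load()  # force a full decode of whatever data survived
        print(f"kept {keep:.0%}: decoded, size {img.size}")
    except OSError as err:  # header gone -> not even recognizable as a JPEG
        print(f"kept {keep:.0%}: failed ({err})")
```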
He means the signal quality dropped, and with digital signals there’s a very (very) sharp dropoff between clear picture and no picture at all.
When a digital signal drops, you see the screen break up into squares and freeze or go black, rather than the familiar gradual noise buildup you get with the old analogue signals.
If you have a DVR, and this happened while watching a recorded program, it’s also possible that the hard drive is starting to fail; the picture and sound can sort of pixellate, “seize up”, and refuse to play any more.
My experience with satellite TV is pretty crappy. Great picture unless there is a storm around; then the picture will show artifacts or just black out. I live in south FL, so this happens quite a bit. I will be going to digital cable very soon.