I wanted to figure out what it means to have a “progressive scan” DVD player. From my research, I learnt that TV displays video at 60 fields per second (30 frames per second, with two fields per frame). It is my understanding that one frame is split into two “fields,” one shown after the other within a fraction of a second, so that together they look like the original image. My question is: why bother with all of this interlacing business? Why can’t just one full frame be shown at a time on TV?
Interlacing was developed back when the phosphors (the coating on the glass that emits light) on TV sets would not stay bright long enough for the beam to draw the whole screen and then come back and refresh that spot.
So, by interlacing, the beam retraces that general area of the screen twice as often, and thus your screen appears to stay a nice constant brightness.
Also, by displaying new information 60 times per second (fields) instead of 30 (full frames), you get much smoother motion. You sacrifice perceived resolution (e.g., a 640x480 display drawn interlaced will appear to have approximately 70% of the resolution of the same display drawn progressively), and you introduce some jaggy lines when there is horizontal motion.
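As a back-of-the-envelope check on that figure (the ~70% is just the estimate quoted above, not an exact constant):

```python
# Rough arithmetic only: effective vertical resolution of an interlaced
# display, using the ~70% figure from the post above (the exact factor
# is debated and depends on content).
total_lines = 480
interlace_factor = 0.7                 # assumption taken from the post
print(total_lines * interlace_factor)  # 336.0 -> roughly what 640x480
                                       # interlaced "looks like" vertically
```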
The new HDTV modes have lots of support for progressive displays. 480i/p, 720p, and 1080i are the current broadcast standards in the US, I believe. It is a shame that the highest resolution (1080i, which is 1920x1080 interlaced) is only interlaced and not progressive.
For openers, you can’t think of a TV as some sort of movie projector: it isn’t just shining a light through a tiny image and blowing it up onto the screen. In its simplest, black-and-white form, there’s just a single beam of electrons being fired from the back of the tube. An electromagnetic coil around the “neck” of the tube then steers that beam across the back side of the “screen.” Where the beam hits, the phosphors light up.
So, it can’t show the whole “frame” at once. It has to start in one corner, go across, drop down a row, come back across, drop down a row, etc. That’s “progressive scan.” As Diddly has explained, the scanning rate wasn’t fast enough to produce a convincing image this way - the phosphors near the top of the screen would stop glowing by the time the beam got to the bottom of the screen, which produced an uneven effect. So they used “interlaced scan,” instead. With this technique, the beam is steered across rows 1, 3, 5, 7, etc., until the bottom of the screen, at which point it comes back and moves across rows 2, 4, 6, and so on.
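If it helps to see that line ordering spelled out, here’s a toy sketch in Python (just lists of scan lines, nothing to do with any real video API; the six-line frame is made up) of splitting a frame into its two fields and weaving them back together:

```python
# Toy sketch: a "frame" is just a list of scan lines.
def split_into_fields(frame):
    """Split a frame into the odd field and the even field."""
    odd_field = frame[0::2]    # lines 1, 3, 5, ... (drawn on the first pass)
    even_field = frame[1::2]   # lines 2, 4, 6, ... (drawn on the second pass)
    return odd_field, even_field

def weave(odd_field, even_field):
    """Re-interleave the two fields back into one full frame."""
    frame = []
    for odd_line, even_line in zip(odd_field, even_field):
        frame.extend([odd_line, even_line])
    return frame

frame = ["line1", "line2", "line3", "line4", "line5", "line6"]
odd, even = split_into_fields(frame)
print(odd)    # ['line1', 'line3', 'line5'] -> first field
print(even)   # ['line2', 'line4', 'line6'] -> second field, 1/60 s later
print(weave(odd, even) == frame)  # True
```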
A progressive scan DVD player will only give you a better picture if you also have an HDTV or some other kind of progressive-scan monitor. And even then, you probably won’t get any benefit when you watch TV series on DVD, because they’re interlaced on the disc, and combining two interlaced fields into one progressive frame would make the motion choppier and introduce “tearing” artifacts.
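To make the “tearing”/combing point concrete, here’s a rough, made-up illustration (just numbers standing in for where a moving object sits on each scan line, with the two fields captured 1/60 s apart; this isn’t any real deinterlacer):

```python
# Each field "sees" the object at a different horizontal position because
# the fields were captured 1/60 s apart. All numbers are invented.
def capture_field(num_lines, object_x):
    """Pretend every line of this field records the object at object_x."""
    return [object_x] * num_lines

odd_field = capture_field(3, 10)    # object at x=10 when the first field is shot
even_field = capture_field(3, 14)   # object at x=14 a sixtieth of a second later

# Naively weave the two fields into one progressive frame: adjacent lines now
# disagree about where the object is, which shows up as jagged "comb" teeth
# along moving edges.
woven = []
for odd_line, even_line in zip(odd_field, even_field):
    woven.extend([odd_line, even_line])
print(woven)  # [10, 14, 10, 14, 10, 14]
```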
[nitpick] NTSC TV signals actually draw 29.97 frames per second
I do 3D animation and video work, and I still have a lot of issues with what format I should choose and which field should be displayed first. Oh, the confusion!
There is at least one, maybe two, TVs that support 1080p. No broadcasts in that format or anything, but damn, just thinking about it makes me want to drool.
I don’t think so. I do not think phosphors fading too fast was an issue at all. You can see plenty of oscilloscopes from that era with really persistent phosphors. WWII-era radar scopes had very long persistence. I believe the problem was not with the phosphor at all but with the electronics and the bandwidth they could handle.
Early TV sets used the mains frequency for picture sync, so the picture frequency had to be a multiple or submultiple of the mains. Using 60 frames/sec would have been ideal, but with the bandwidth available it would only yield a picture of about 250 lines, which was not enough. Using 30 fps and 525 lines gave enough flicker that some people found it unacceptable. So interlacing is a trick to effectively send 30 FPS and yet refresh 60 times/s to avoid the flicker, which is caused not by the phosphor but by the refresh rate. Try using a computer monitor today at under 60 Hz and a lot of people will complain, no matter what the phosphor.
In Europe they did the same interlace trick but with their lower 50 Hz mains frequency they could get more lines into the picture (625) using the same bandwidth. Note that 50 × 625 is about the same as 60 × 525. Bandwidth was the limiting factor, not phosphor.
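For anyone who wants to check that arithmetic (nominal figures, ignoring blanking and sync overhead):

```python
# Field rate x total lines per frame comes out nearly the same for both
# systems, which is why they fit in roughly the same channel bandwidth.
# (This product is twice the actual line rate, since each field carries
# only half the lines; overhead is ignored here.)
europe = 50 * 625    # 31250
usa = 60 * 525       # 31500
print(europe, usa)   # within about 1% of each other
```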
BTW, I believe movie projectors do (sort of) the same thing and refresh the projected image 48 times/second, twice the 24 FPS at which it was recorded, because many people find a refresh rate of 24 FPS objectionable.
Sailor, if they do that (not sure), it’d be the same frame shown in its entirety twice, as opposed to interlaced, which shows half the frame followed by the other half (like weaving the fingers of each hand together, Venetian-blind style).
It’s why video (60 fields/second) looks so much smoother than film (24 frames/second). I prefer the look of film, and not just for the higher resolution, color and contrast. The framerate has a lot to do with it. Video just feels too… “newsy.”
PS - for those of you not really familiar with the different looks, watch a rerun of All in the Family (shot on video) or a soap opera (also shot on video) and compare it to something like Seinfeld, which is shot on film, or any major movie other than the 2nd episode of Star Wars. 28 Days Later was also shot on video. I know there are other 80’s sitcoms that were shot on video, but can’t remember which ones off the top of my head. But once you recognize the differences, they are blatantly obvious.
I thought interlacing was chosen to reduce the required bandwidth, not because of any phosphor decay issues.
Yes, and. . .? Maybe I am missing something, but I do not see how your post contradicts, modifies, or expands on what I posted or what others posted. To me it appears to be a non sequitur to my post, but you addressed it to me, so maybe you can explain what I may be missing.
Actually, both explanations are correct. Back in the early 1930s you could counter the phosphor decay problem (the phosphor glow not lasting long enough), and its associated flicker, by increasing the field scan rate. But if you increased the field scan rate, you had to jack up the bandwidth. Solution: line interlacing.
Interlacing, like DVD and digital audio artifacting, is something you learn about only to wish you hadn’t. Once you notice it, it’s much easier to notice, and eventually all interlaced video looks interlaced.
Practically all DVD players, even the cheapest, are now progressive-scan, but most TVs are still not progressive-scan capable. So progressive scan on a DVD player becomes the kind of feature commissioned salespeople like to use to tell you how great a product is when you can’t really use it. (warning: OT story about commissioned salespeople) The satellite-TV monopoly here is currently attacking the digital-cable monopoly with a “hey, where’s my digital?” campaign, and I recently heard a commissioned-salesperson try to sell satellite TV to someone on the premise that digital cable ‘isn’t really all digital’. Like progressive scan on DVD players, this only matters for people with HDTVs (except that the non-digital channels don’t have 5.1 Dolby Digital signals), but obviously the salesperson won’t mention that. The rebuttal argument made by the digital-cable monopoly is that satellite TV stops working in bad weather.
I don’t think so. Flicker has almost everything to do with scan rate and almost nothing to do with image persistence. You just cannot get a good image with a low scan rate no matter what. A phosphor with long persistence (and I believe they were easily available at the time) will not solve the problem for two reasons: (1) too much persistence will make rapidly moving images leave trails and become blurry, and (2) movement is achieved by changing the pattern of illuminated pixels, so with every image you are lighting up pixels that were previously dark, and those new pixels turn on at the frequency of the scan rate, which makes for flicker.

I believe the reason for alternating scan lines has everything to do with dealing with limited bandwidth and nothing to do with persistent phosphors not being available. Oscilloscopes and radar scopes of the period had far more persistence than a TV set required. Not to mention that a movie projected in a theater has zero persistence and has the same problem: flicker becomes noticeable at slower frame rates. Phosphor persistence is not necessary because the retina provides all the persistence that is needed. What you need is a high frame rate, not high persistence. Long screen persistence and low frame rates give flicker and lousy images with movement. Short persistence (or no persistence, as in theaters) and high frame rates give good images.
Wow, does that sound a bit snippy to other people, or just me?
Anyway you had written:
“…movie projectors do (sort of) the same thing and refresh the projected image 48 times/second, twice the 24 FPS at which it was recorded…”
I wanted to clarify that to other readers because you didn’t mention the double exposure per frame, and one obviously doesn’t want to project half a frame followed by the other half. If something is shot at 24 FPS and projected back at 48 FPS, then you’re basically watching a movie at double speed. I apologize if my explanation was a waste of your reading time.
Sailor didn’t write that movies were projected as 48 frames per second, he wrote that 48 images are seen per second. Although his wording was awkward, he is correct, with an explanation. For screens larger than 45 feet in width, or when extra light is desired, a double shutter for the projector is recommended to reduce flicker. The double shutter has two blades instead of one 180-degree blade, and thus shows each frame twice.