I know that more is generally better, but is there any real, noticeable difference in the picture on a 50in 1080i set and a 50in 1080p set? If so, does it depend on the source material (e.g., DVD vs. Blu-ray)?
Yes. You’ll be able to tell the difference. It’s getting pretty subtle at that point, though. For some things (fast sports), even 720p will produce a noticeably (again, subtly) better picture than 1080i.
And yes, the source matters. Broadcast TV is never 1080p, because of bandwidth limitations. Local devices (DVD players, game consoles, DVRs) can produce it, but the source needs to have the data available. DVDs, for example, are upsampled; you won’t see much difference, because even at 1080i the player has already gone beyond the information present on the disc. With a Blu-ray/HD-DVD, the information is present, and you’ll get the best image at 1080p.
Would I pay even $100 extra for a 1080p set vs. a 1080i one? Probably not, although it’s rare to have to make that distinction. I would (and did) pay extra for a 1080 set that actually had 1080 lines of pixels - a surprisingly large number of sets that claim 1080 don’t actually have that many pixels.
In such cases what does the 1080 mean?
It means that they can receive a 1080 signal (i or p) and do something with it (usually, downrez it to fit their lesser-resolution display).
1080 lines of vertical resolution. The i and p stand for interlaced (the image is scanned in alternating lines, two fields per frame) and progressive (each frame is drawn as a single image). Interlacing gives you twice the effective field rate for the same bandwidth, but can result in flicker, particularly if the image changes dramatically between frames, so progressive scan is almost universally regarded as superior. You can (with high-definition media and bandwidth) see a definite difference between the i and p protocols; however, you’d be hard pressed to distinguish much difference between 720p and 1080p at standard home theater distances and screen sizes; for the human eye to resolve a difference between them at normal viewing distance would require something like an 80" diagonal screen. If your choice is between 1080i and 1080p, you’re better off buying a 720p screen and spending the difference on a better quality HD-DVD/Blu-ray player or audio system.
Stranger
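The viewing-distance claim above can be sanity-checked with a bit of arithmetic. Here's a rough Python sketch; it assumes the commonly quoted ~1 arcminute figure for normal visual acuity and a 16:9 screen (both assumptions on my part, and real eyes and real content vary):

```python
import math

ARCMIN_RAD = math.radians(1 / 60)  # ~1 arcminute, a common estimate of eye acuity

def max_resolving_distance_in(diagonal_in, lines, aspect=(16, 9)):
    """Farthest viewing distance (inches) at which one pixel row still
    subtends at least 1 arcminute, i.e. the eye could in principle
    resolve individual scan lines."""
    w, h = aspect
    height_in = diagonal_in * h / math.hypot(w, h)  # screen height from diagonal
    pixel_in = height_in / lines                    # height of one pixel row
    return pixel_in / math.tan(ARCMIN_RAD)

for diag in (50, 80):
    for lines in (720, 1080):
        d = max_resolving_distance_in(diag, lines)
        print(f'{diag}" {lines}-line: resolvable out to ~{d / 12:.1f} ft')
```

Under those assumptions, a 50" 1080-line screen's pixels stop being individually resolvable somewhere around 6-7 feet, while an 80" screen pushes that out past 10 feet - roughly consistent with both the 80" claim and the magazine article cited below.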
It means it’s capable of actually accepting the 1080 signal and generating a picture from it (via downsampling) at whatever resolution is actually available. Deceptive marketing at its finest.
Stranger, we’re talking about TVs that advertise 1080<whatever> capability, but don’t actually have 1080 vertical lines of pixels (which seems to be most of them, these days). Also, to address your comments: one of the advantages of high def is that viewing distances can be smaller. On a 50" screen (not excessively large) at 12 feet, the difference between 720 and 1080 is pretty clear, although you’d have to be pickier than I am to care.
I remember an article in a high end video magazine saying flatly that the average viewer would be unable to distinguish between a 1080i and a 1080p picture at a standard viewing distance, regardless of the quality of the set or player.
A 720 picture may be noticeably different. Above that we’re talking increments too small to care about.
I would not doubt this at all, though I’d be curious whether any studies have been done to confirm it.