What exactly is the difference between 720p, 1080i and 1080p?

This is an excellent suggestion; my wife and I did something similar when we were installing built-in shelves and cabinets around the fireplace in our family room back in Seattle. We knew that we wanted to buy a flat-panel TV to go over the fireplace, but were torn between a 50" and a 42" set.

Eventually, we looked up the dimensions of the two on the company’s web site, then with a T-square and tape measure drew two rectangles of the appropriate size on the old panelling (which was coming down as part of the renovation anyway). Then we got chairs out, and sat in various places in the room to see how large the TV would appear.

Although we’d been leaning towards the 42", after this little experiment we settled on the 50", and it worked out great.

Oddly enough, the sets often look larger at the store, because the displays are designed to draw your attention to them. Once installed at home (especially if they’re wall-mounted), they don’t seem nearly as large.

As a follow-up: we just had a 42" installed in the bedroom of our new house, and it’s not at all intrusive. Of course, it replaced an old 27" CRT that was balanced atop a dresser, so that probably has something to do with it :wink:

I don’t think so, but SD programs, especially analog ones, often don’t look as good on an HD TV as they do on an SD TV. If possible, I usually watch SD programming on the networks’ HD channels instead; it’s a better picture. You don’t have the option of stretching it to fill the screen, but I actually prefer it unstretched. This is something that comes as a surprise to many people when they get their first HD TV.

To some extent they do; that is, they have to adapt an image of one resolution to a screen of another. However, there are a couple of differences:

  1. On a computer screen the conversion is relatively simple: an algorithm just doubles the height or width of some pixels and not others (although this may depend on the brand of screen; some may use the next method).

  2. Screens designed for video use what’s called a scaler (or, on older HD RPTV sets, a line doubler). What this does is interpolate the missing lines (in the case of going from, say, 720 to 1080) by figuring out what it thinks should go there based on the surrounding lines; a crude sketch of the idea follows this list. Depending on the resolutions being converted from and to, they may also look at the previous frame and the next frame.
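
To make that a bit more concrete, here’s a crude Python sketch of the idea (my own illustration only; real scaler chips use much fancier multi-tap filters than the simple two-line blend below):

```python
# Purely illustrative: real video scalers use multi-tap polyphase filters,
# not simple averaging. This just shows the basic idea of "inventing"
# the missing lines from the lines around them.

def upscale_lines(frame, target_height):
    """frame is a list of rows of pixel values; returns a taller list of rows."""
    src_height = len(frame)
    out = []
    for y in range(target_height):
        # Map this output line back onto a (fractional) source line.
        pos = y * (src_height - 1) / (target_height - 1)
        lower = int(pos)
        upper = min(lower + 1, src_height - 1)
        frac = pos - lower
        # Blend the two nearest source lines, weighted by distance.
        out.append([(1 - frac) * a + frac * b
                    for a, b in zip(frame[lower], frame[upper])])
    return out

# e.g. stretch a 720-line frame to 1080 lines:
frame_720 = [[y] * 1280 for y in range(720)]
frame_1080 = upscale_lines(frame_720, 1080)
assert len(frame_1080) == 1080
```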

Note that even with a scaler, computer text will not look nearly as good as when the image resolution exactly matches the screen’s native resolution. The scaling artifacts are less apparent with photos, and less still with moving video, which is why, in most cases, the average viewer doesn’t notice them at all.

:dubious:

What television do you have? I suspect that it is not deinterlacing a 1080i signal properly, due to poor algorithms or a cheap deinterlacing chipset.

Could someone explain this, please? Why shouldn’t the 1080 display, with 50% more vertical resolution, look noticeably better than the 720 display, assuming an incoming 1080 image, and a suitably close seating position?

I keep hearing people saying this, and I just can’t see how it could be true.

We actually took a laptop with us to Best Buy and tried five LCD panels and three plasmas. We researched for months. The set we picked had the best computer text reproduction by far, a full 170° viewing angle, and an excellent energy rating. The picture is perfect. The difference in text quality between VGA and DVI is pronounced.

We’ve had it about two years now.

It is a Westinghouse 37" LCD TV/monitor, model LVM-37w1.

Well, I was talking about the TV signal. That is only a 1080i signal, not 1080p. 1080i and 720p are very close in quality. The 1080p will look better if you feed it from a PC via HDMI or DVI, or from an HD DVD.

Wouldn’t 1080i be the same as 540p? Maybe I am getting my eggs mixed up. :confused:

I found my own answer at Wikipedia:

Still, I have to think that with many images, 1080i will look better.

No.

1080i has the exact same resolution as 1080p on material at 30fps and below (which includes both television (30fps) and film (24fps) material).

The difference between 1080i and 1080p is how the video frames are sent to the display. In 1080i (interlaced), the frames are split up into two alternating fields and sent to the display separately. 1080p is sent full-frame.
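
A toy way to see why the field splitting doesn’t cost any resolution on static or film material (my own little Python illustration, not how an actual broadcast chain or deinterlacer is built): split a full frame into its even and odd fields, weave them back together, and every one of the original 1080 lines comes back.

```python
# Toy illustration of interlacing: split a progressive frame into two
# fields (even/odd lines) and weave them back together. For a static
# image the weave reconstructs the full frame exactly.

def split_fields(frame):
    top_field = frame[0::2]      # lines 0, 2, 4, ...
    bottom_field = frame[1::2]   # lines 1, 3, 5, ...
    return top_field, bottom_field

def weave(top_field, bottom_field):
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.extend([top_line, bottom_line])
    return frame

frame_1080p = [[y] * 1920 for y in range(1080)]   # dummy 1920x1080 frame
top, bottom = split_fields(frame_1080p)
assert len(top) == len(bottom) == 540             # each field carries 540 lines
assert weave(top, bottom) == frame_1080p          # ...but nothing was lost
```

Motion is where it gets harder: if the two fields were captured 1/60 of a second apart, they no longer line up, and the deinterlacer has to do something smarter than a simple weave.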

Interlacing is a holdover from CRT displays, which use an electron beam to “paint” the picture by exciting phosphors on the tube. The phosphors “fade out” after a short time, so if you were to “paint” the screen line by line from top to bottom, by the time the picture was complete the top would have already begun to fade, and you would have some seriously massive flicker, and likely a headache in short order. They solved this problem by alternating lines so that either the even or the odd lines were refreshed with each scan of the electron gun. This cuts down on the flicker dramatically (though if you look closely enough, you can still discern it).

Fixed-pixel displays like LCDs have to display whatever signal they are sent in their native format. So if you have an LCD panel with a native resolution of 1920x1080, and you send the panel a 720p signal, the panel has to upconvert the 720p signal to match the native resolution of the panel.
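
Just to put numbers on that upconversion step (nothing here beyond the standard 1280x720 and 1920x1080 pixel counts):

```python
# 720p source vs. a 1920x1080 native panel: a non-integer 1.5x stretch
# in each direction, and 2.25x as many pixels to fill.
src_w, src_h = 1280, 720        # a 720p frame
panel_w, panel_h = 1920, 1080   # the panel's native resolution

print(panel_w / src_w, panel_h / src_h)        # 1.5 1.5
print(src_w * src_h, panel_w * panel_h)        # 921600 2073600
print((panel_w * panel_h) / (src_w * src_h))   # 2.25
```

Every source pixel has to be smeared across one and a half panel pixels in each direction, which is exactly the job the scaler described earlier is doing.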

If you send a 1080i signal to the same panel, and the panel utilizes a quality upconversion chipset, you should see little to no discernible difference between a 1080i signal and a 1080p signal.

Wait, what?

You said earlier, and I quote:
"I use my 1080p LCD panel as both my family room TV and as an occasional use monitor off a computer. Via a DVI output, I push 1080p signal and in looking at text, it is actually very easy to see difference between 1080p and 1080i. "

1080i/1080p does not have anything to do with whether you connect your laptop via an analog (VGA) or a digital (DVI) connection.

I suspect what you meant to say was:

"I use my 1080p LCD panel as both my family room TV and as an occasional use monitor off a computer. Via a DVI output, I configured my display settings to 1920x1080 and in looking at text, it is actually very easy to see difference between DVI and VGA "

Which actually makes sense and is correct. Is this what you meant?

I guess so. In the reading I had done leading up to the purchase, I had seen that VGA’s max signal was 1080i and DVI/HDMI’s was 1080p. It sounds like that was an approximation and not technically correct? Now I am confused.

Jim

Pardon. I was always under the assumption that DLP (when compared to same-resolution LCD or plasma) was head and shoulders above the other technologies as far as picture clarity goes when viewed from optimal seating distances, no matter the size.