Monitor questions: DVI vs. VGA connector?

So I got a new computer and monitor a couple weeks back, and I’m very happy with them. However, the monitor came with a standard connector cable, which required an adapter to plug into the video card (which only has DVI outputs). I decided to look around for an actual DVI cable to make the most of the monitor’s and card’s capabilities. I browsed around Best Buy, and the only option they had was a 10-foot cable for $50. First off, I need maybe 4 feet. Second off: OMG, sticker shock!

I asked the tech guy there how much of a difference the cable would make over what I was using, and he gave the comparison of “DVD vs. VCR”. Well…ok. “Or Monster Cable vs. generic cable”, which twitched my BS-meter. Some browsing around on Amazon and Newegg turned up cables more in line with the price I was expecting. But I figured I’d get a second opinion, so I’m bringing this to the gathered genius of the Dope:

[ul][li]Monitor: 22" flat panel LCD. Available inputs are two 15-pin D-sub (one of which I’m using now), and a DVI-D with HDCP. Running at 1680x1050 and (I think) a 60Hz refresh rate.[/li][li]Video card: nVidia 8800 GTS. Available outputs are two dual-link DVI-I connectors.[/li][li]Currently using a converter (came with the card) to connect to the monitor cable (came with the monitor).[/li][li]Main uses for the computer are web surfing, games, and some text editing. I don’t do any graphic design or photo manipulation work (at least nothing that requires spot-on accuracy on the monitor).[/li][/ul]

All the tech specs are from the manuals; I really have no idea what they mean. What’s the difference between DVI-I and DVI-D? Is “15-pin D-sub” the same as “VGA” (which is what I think of when I see those pinned monitor ports)? And most importantly: is there a real, visible difference between the DVI and non-DVI connections? Yes, I understand that digital -> analog conversion can lead to sampling artifacts, but is it noticeable under normal usage circumstances?

I’m sure others will come in and provide the details, but I’ll start off the anecdotes. I’m currently running a dual-monitor setup, one of which is DVI and the other standard VGA. Personally, I don’t see a difference whatsoever. Both screens look the same to me.

I do a lot of graphic design, so I’m obviously an expert on this! :wink:

*This isn’t to say there aren’t any differences, just that IMO they might not be visual. Perhaps the DVI connection provides less latency?

Also anecdotally, my colleague is a visual engineer and I asked him the same thing not so long ago.
15-pin D-sub (VGA) will go blow for blow with DVI up to WQXGA (2560×1600).
After that it has to be DVI … I don’t remember the reasons :frowning:
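A back-of-envelope sanity check on that cutoff (the blanking overheads below are typical reduced-blanking figures I’m assuming, not exact timings): single-link DVI is capped at a 165 MHz pixel clock, and 2560×1600 at 60Hz blows well past that, which is why dual-link DVI exists at those resolutions.

```python
# Rough pixel-clock arithmetic (hedged: h_blank/v_blank defaults are
# typical CVT reduced-blanking overheads, not a real timing calculation).
def pixel_clock_mhz(h_active, v_active, refresh_hz, h_blank=160, v_blank=30):
    """Approximate pixel clock in MHz for a given display mode."""
    h_total = h_active + h_blank
    v_total = v_active + v_blank
    return h_total * v_total * refresh_hz / 1e6

# Single-link DVI tops out at a 165 MHz pixel clock.
print(round(pixel_clock_mhz(1680, 1050, 60)))   # the OP's monitor mode
print(round(pixel_clock_mhz(2560, 1600, 60)))   # WQXGA
```

The OP’s 1680×1050 @ 60Hz comes out around 119 MHz, comfortably inside a single link; WQXGA @ 60Hz is well over 165 MHz.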

Seconded, the difference is trivial.

Is analogue VGA accurate enough for ClearType to work well?

DVI-I is “Integrated” - it has both VGA (Analog) and Digital signals on the connector. DVI-D has only the Digital signals, and can’t be used with a VGA monitor.

15-pin D-sub is the standard VGA connector.
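To summarize the DVI alphabet soup in one place (the connector names are standard; the lookup itself is just my sketch, and single- vs. dual-link is a separate distinction on the digital pins):

```python
# Which signal types each DVI connector variant carries.
DVI_VARIANTS = {
    "DVI-I": {"digital", "analog"},  # "Integrated": both, so VGA adapters work
    "DVI-D": {"digital"},            # digital only; no passive VGA adapter
    "DVI-A": {"analog"},             # analog only; rare, electrically VGA
}

def vga_adapter_works(variant):
    """A passive DVI-to-VGA adapter needs the analog pins to be present."""
    return "analog" in DVI_VARIANTS[variant]

print(vga_adapter_works("DVI-I"))  # the OP's card output: adapter works
print(vga_adapter_works("DVI-D"))  # digital-only port: it doesn't
```

Which is exactly why the adapter that came with the OP’s card works: the card’s DVI-I ports still carry the analog VGA signal.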

I find that DVI provides perceptibly better performance than VGA. There always seems to be a slight shimmer on VGA, whereas DVI is rock-solid.

There is a slight but definite difference between digital and analog displays. On analog, you’ll see a slight fuzziness or moiré effect, especially with text. You can use the auto-adjust, and you may be able to use the monitor’s on-screen settings to fine-tune the display, but that usually just moves the fuzziness somewhere else on the screen. With DVI, all these adjustments are rendered moot; the display is rock-solid.
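A toy model of where that analog fuzziness comes from (my own illustration; real VGA sampling is more involved): the LCD’s ADC has to re-sample the continuous analog signal, and a slight pixel-clock phase error blends adjacent pixels together, which is exactly what auto-adjust is trying to minimize.

```python
# Toy illustration: sampling an "analog" pixel stream with a phase error
# averages neighboring pixels, turning crisp black/white edges into gray.
def sample(levels, phase):
    """Linearly interpolate source pixel levels at phase-shifted points."""
    out = []
    for i in range(len(levels) - 1):
        out.append(levels[i] * (1 - phase) + levels[i + 1] * phase)
    return out

crisp = [0, 255, 0, 255, 0]    # alternating black/white pixels (text-like)
print(sample(crisp, 0.0))      # perfect phase: values come through unchanged
print(sample(crisp, 0.25))     # phase error: in-between gray values (fuzz)
```

With DVI the pixel values arrive as numbers, so there is no sampling step to misalign in the first place.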

Ok, so the DVI-I output on the video card means it can push either digital or analog data, while the DVI-D on the monitor means that port will only accept a pure digital signal (while the VGA input ports will obviously be accepting analog). That makes sense. And I haven’t noticed any shimmer or fuzziness, but then again I haven’t really been looking for any (and didn’t have the DVI cable to compare).

Given that I’m able to get cables from newegg for ~$20 (and can purchase a length that won’t leave 6 feet of slack lying around to trip over), I think I’ll make the switch.

Thanks for the help.

In my experience there’s no fuzziness with a good-quality analog signal. Unless you have a poorly shielded or very thin, long VGA cable, a crappy digital-to-analog conversion, or are displaying a non-native resolution on an LCD monitor, it should be nearly impossible to tell the difference between DVI and VGA.

That said, there’s absolutely no reason you should have to pay that much for a DVI cable. It seems these days buying cables of any sort at a big-box retailer is just asking to be fleeced. I just googled up a 6’ DVI cable on Newegg for 10 bucks. That’s the price range you should be looking at.

When I bought Supreme Commander, I bought an LCD to use as a secondary monitor the same day, and like the OP, it came with only a VGA cable, so I went all over town looking for a store that carried DVI cables (after 5:00 on a Saturday). I finally found one at Radio Shack… for $70! That was the clearance price; the original price was more than $100. I decided I wasn’t in that much of a hurry to play, and picked one up at another store the next day for $20.

Mail order is your friend. You can get a DVI-I cable for £5.