So I got a new computer and monitor a couple of weeks back, and I’m very happy with them. However, the monitor came with a standard connector cable, which required an adapter to plug into the video card (it only has DVI outputs). I decided to look around for an actual DVI cable to make the most of the monitor’s and card’s capabilities. I browsed around Best Buy, and the only option they had was a 10-foot cable for $50. First off, I need maybe 4 feet. Second off: OMG, sticker shock!
I asked the tech guy there how much of a difference the cable would make over what I was using, and he gave the comparison of “DVD vs. VCR”. Well…ok. “Or Monster Cable vs. generic cable”, which twitched my BS-meter. Some browsing around online on Amazon and Newegg came up with cables more in line with the price I was expecting. But I figured I’d get a second opinion, so I’m bringing this to the gathered genius of the Dope:
[ul]
[li]Monitor: 22" flat-panel LCD. Available inputs are two 15-pin D-sub connectors (one of which I’m using now) and a DVI-D with HDCP. Running at 1680x1050 and (I think) a 60Hz refresh rate.[/li]
[li]Video card: nVidia 8800 GTS. Available outputs are two dual-link DVI-I connectors.[/li]
[li]Currently using a converter (came with the card) to connect to the monitor cable (came with the monitor).[/li]
[li]Main uses for the computer are web surfing, games, and some text editing. I don’t do any graphic design or photo manipulation work (at least nothing that requires spot-on accuracy on the monitor).[/li]
[/ul]
All the tech specs are from the manuals; I really have no idea what they mean. What’s the difference between DVI-I and DVI-D? Is “15-pin D-sub” the same as “VGA” (which is what I think of when I see those pinned monitor ports)? And most importantly: is there a real, visible difference in image quality between the DVI and non-DVI connections? Yes, I understand that digital-to-analog conversion can lead to sampling artifacts, but is it noticeable under normal usage circumstances?