How do I know what LCD monitors with digital inputs I can use with my PC?
I have a Dell 4600, bought in June of '03, whose graphics card is a “128 MB DDR ATI Radeon 9800 with TV-out and DVI for Dimension Systems” according to the order documentation. It has the ordinary analog VGA-style connector I am using with a CRT monitor, plus a white connector I think of as a “DVI” connector. This graphics card was an upgrade when I bought it, chosen so I could run two monitors, including one with a digital input.
Now I see there are DVI-D, DVI-I, and DVI-A input monitors for sale. I am not sure there are any monitors that are just DVI. Is this one of those things they have messed up so horribly that I should not even try?
Also, how do I know whether the card can support a given monitor's native resolution? The Desktop > Properties > Settings > Advanced dialog lists 13 modes beyond the 1024×768 I am using now. If I dig around I find a dialog listing about 80 modes (resolutions × refresh rates × color depths). Should I try to write them all down before I go shopping?
Jeez. I found Wikipedia and other articles and am reading page after page of this stuff. I seem to have a DVI-I connector on my graphics card, and since DVI-I carries both the digital and the analog signals, I think I should be able to buy any DVI-D or DVI-I input monitor, probably with a DVI-D cable, and it will work. But I’m hardly certain, and I still don’t know how to make sure the card can support the monitor’s native resolution mode unless I write all 80 of those modes out ahead of time. Any advice would be appreciated!
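For what it’s worth, rather than writing all 80 modes down by hand, I gather you can make the machine cough up the list itself. Here’s a rough sketch of how, assuming Python with the pywin32 package installed (both of those are my assumptions, not anything that came with the Dell):

    import win32api

    # Walk the driver's mode list for the primary display adapter.
    # EnumDisplaySettings raises win32api.error once the index runs
    # past the last mode, so loop until that happens.
    modes = set()
    i = 0
    while True:
        try:
            dm = win32api.EnumDisplaySettings(None, i)
        except win32api.error:
            break
        modes.add((dm.PelsWidth, dm.PelsHeight, dm.BitsPerPel, dm.DisplayFrequency))
        i += 1

    # One line per mode: width x height, color depth, refresh rate.
    for w, h, bpp, hz in sorted(modes):
        print(f"{w}x{h}  {bpp}-bit  {hz} Hz")

Then it’s just a matter of checking that whatever the monitor’s spec sheet calls its native mode shows up in the printout.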
FWIW I’ve used LCD monitors at other than their native resolution before, on other systems, and to me it looks like absolute hell, so I am sure I want to use the native resolution.
Well, I sure don’t know what happened, but I went to Best Buy and bought a 22" Envision LCD monitor with a DVI input, and plugged it in, and it worked. It’s beautiful. And here my stomach was in knots all the way home, because I was sure that this would make my entire system malfunction and I wouldn’t have any idea what my bank accounts looked like for the entire month before Christmas…
Basically, any of them would have worked. There are only two common ports, and a monitor usually comes with an adapter. Please do me a favor: set your monitor to its maximum or native resolution, and turn the refresh rate up to the max.
ChrisBooth12, I’m only too happy to comply. Set my monitor to the max or native resolution? I set my graphics adapter to the monitor’s native resolution, if that’s what you mean. I also set the new monitor to a modest refresh rate (because LCDs are always on and don’t flicker) and set the old CRT to 72 Hz (because I can’t see it flicker at 72 whereas I mind it at 60). And I told Windows not to “smooth” the type. I don’t know why others appear to like it, but to me building type out of mixed shades of pixels looks like something just broke.
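Since ChrisBooth12 brought up refresh rates: for anyone who would rather script the change than dig through the Advanced dialog, here’s a sketch along the same Python-plus-pywin32 lines as above (still my assumptions, and the 72 Hz figure is just what I use on the CRT):

    import win32api
    import win32con

    # Start from the current mode so only the refresh rate changes.
    dm = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)

    dm.DisplayFrequency = 72                    # target rate in Hz
    dm.Fields = win32con.DM_DISPLAYFREQUENCY    # mark only this field as changed

    # CDS_TEST asks the driver to validate the mode without applying it.
    result = win32api.ChangeDisplaySettings(dm, win32con.CDS_TEST)
    if result == win32con.DISP_CHANGE_SUCCESSFUL:
        win32api.ChangeDisplaySettings(dm, 0)   # looks valid, actually switch
    else:
        print("The driver says 72 Hz won't work in the current mode.")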
There are still some weirdnesses. I set the new DVI LCD monitor to be the primary monitor, but when the system comes back from blanking the screens it has swapped monitors 1 and 2, so each monitor is using the settings I intended for the other one, and their positions on my physical desk are the reverse of their positions on my Windows desktop. I don’t know how to fix this, so I changed my definition of which one is the Primary.
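In case it helps anyone untangle the same thing, a quick sketch (same Python-plus-pywin32 assumption as above) will at least show which device Windows currently considers the primary:

    import win32api
    import win32con

    # List every display device and flag the one Windows treats as primary.
    i = 0
    while True:
        try:
            dev = win32api.EnumDisplayDevices(None, i)
        except win32api.error:
            break
        flags = dev.StateFlags
        primary = bool(flags & win32con.DISPLAY_DEVICE_PRIMARY_DEVICE)
        active = bool(flags & win32con.DISPLAY_DEVICE_ATTACHED_TO_DESKTOP)
        print(dev.DeviceName, dev.DeviceString,
              "(primary)" if primary else "", "(active)" if active else "")
        i += 1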
And only about half of the new monitor’s adjustment choices are available on its menu. I can’t adjust the brightness or contrast, for example.
And there’s a little bit of a dialog box visible at the top edge of the new monitor. I can’t drag it onto the screen. Maybe I’ll move the screen up in Properties to see if I can reach it…
And after I rearranged the two boxes on my desk, I find a funny pinkish-purple cast in the upper-right corner of the CRT screen. This looks to me like a magnetic effect, as if maybe I disrupted its shielding and need to degauss it. The cast is there no matter what the screen is trying to show, including when Windows is not yet running.