My old Radeon 9200 broke, so I went and picked up a new GeForce 6200 today.
I wasn’t paying attention and the store clerk handed me a version of the 6200 with no DVI connection. I realized this upon unpacking at home, and thought, OK, no biggie, my new LCD monitor does both analogue and digital.
I have just spent the last two hours trying to get the stupid POS to work at the correct resolution (1680x1050) and a refresh rate higher than 60 Hz. It seems the card will do every resolution at every refresh rate EXCEPT the right one.
Any resolution other than 1680x1050 gives me a hideous washed-out picture, but at that resolution the 60 Hz refresh rate gives me hideous screen flicker. Since the new 6200 is supposedly a more advanced card, I can think of no reason for this other than some limitation on analogue connections that rules out that particular combination. I know refresh rates aren’t really relevant for an LCD monitor, and in any case the card will happily do HIGHER resolutions at HIGHER refresh rates, so I don’t understand what’s going on here.
So should I go back to the store and get one with a DVI connection, and will that solve my problem? It will cost me $10 extra, but I’m not really worried about the cost so much as the hassle of going to the store during work hours (busy work week ahead), so if there is some easy software solution, I’d be happy to hear it.