Is there a limit on refresh rates for analogue monitor connections?

My old Radeon 9200 broke, so I went and picked up a new Geforce 6200 today.

I wasn’t paying attention and the store clerk handed me a version of the 6200 with no DVI connection. I realized this upon unpacking at home, and thought, OK, no biggie, my new LCD monitor does both analogue and digital.

I have just spent the last two hours trying to get the stupid POS to work at the correct resolution (1680x1050) and a refresh rate higher than 60 Hz. It seems the card will do every resolution at every refresh rate EXCEPT the right one.

Running at any resolution other than 1680x1050 gives me a hideous washed-out picture, but at that resolution the refresh rate gives me hideous screen flicker. Since the new 6200 is supposedly a more advanced card, I can think of no reason for this other than some limitation on analogue connections that doesn't allow that particular combination. I know refresh rates really have no relevance for an LCD monitor, and in any case the card will happily do HIGHER resolutions at HIGHER refresh rates, so I don't understand what's going on here.

So should I go back to the store and get one with a DVI connection, and will that solve my problem? It will cost me $10 extra, but I'm not worried about the cost so much as the hassle of getting to the store during work hours (got a busy work week ahead), so if there is some easy software solution, I'd be happy to hear it.

There is a limit to the maximum refresh rate on both analogue and digital connections: the bandwidth of the RAMDAC, to be precise.
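To put a rough number on that: the pixel clock the RAMDAC has to generate scales with resolution, refresh rate, and blanking overhead. A minimal back-of-the-envelope sketch (the 30% horizontal and 5% vertical blanking figures are ballpark assumptions, not the exact VESA CVT formula):

```python
def pixel_clock_mhz(width, height, refresh_hz,
                    h_blank=0.30, v_blank=0.05):
    """Rough pixel-clock estimate for an analogue (VGA) mode.

    h_blank/v_blank are assumed blanking overheads, not exact
    VESA timings, so treat the result as an order-of-magnitude figure.
    """
    h_total = width * (1 + h_blank)    # visible pixels + horizontal blanking
    v_total = height * (1 + v_blank)   # visible lines + vertical blanking
    return h_total * v_total * refresh_hz / 1e6

print(round(pixel_clock_mhz(1680, 1050, 60)))  # → 144 (MHz)
print(round(pixel_clock_mhz(1680, 1050, 75)))  # → 181 (MHz)
```

Even the higher 75 Hz figure sits well under the 400 MHz RAMDACs typically quoted for cards of that generation, so raw bandwidth is unlikely to be the bottleneck here.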

I would say your problem is more down to pushing an analogue signal at such a high resolution. Feeding a digital LCD from an analogue output means the monitor has to sync to the incoming analogue signal and convert it back to map onto the digital panel. One side effect, which you have seen, is a washy, wiggly picture.

Take it back and get a DVI version - this will give you a rock-steady picture. I was going to talk about the refresh rate you are pushing into the LCD, but the end result is the same as before - you need a DVI card.


You’ve got an LCD monitor, not a CRT, so refresh rate shouldn’t matter. I wonder if the problem is that the 6200 can’t really cope with the odd screen size. For an Nvidia card, I’d suggest getting one of the 7 series rather than the 6 series. But why not get a direct replacement ATI card?

I think it really was an issue with the monitor’s analogue-to-digital conversion. I just plugged in the new 6200 with DVI, and the picture was perfect. I didn’t change anything; the “refresh rate” is still set at 60 Hz.

They had 7300s for $129; the 6200 with DVI was $69 with a $10 rebate. I’m trying to do things on the cheap here, and the rest of the computer isn’t exactly a hot rod, so any more graphics power would be wasted anyway.