Monitor refresh rates: what's the deal?

So, for reasons I don’t need to go into here, I spent half of yesterday backing up my files and doing a fresh install of WinXP on my computer. Anyway, all my programs are up and running again, and everything seems great.

I was playing around with my display settings, and noticed that my monitor’s refresh rate is set at 75Hz. It also has options to set it at 65, 85, and 100.

Now, I’ve heard that the higher the rate, the less flickering there is. What I’m wondering is whether there is any downside to choosing the 100Hz rate. I tried selecting it, and the only thing that appeared to change was that my desktop no longer stretched to the edge of the screen, which required some fiddling with my monitor controls.

Anyway, if anyone has any advice or insight about refresh rates, I’d be glad to hear it. In case it matters, my display specs are:

WinXP Home
nVidia GeForce FX 5200 video card
Samsung 19" 997DF monitor
1024 x 768 screen resolution

The only input I have is the following.

Higher rates are definitely better (easier on the eyes), but they are more difficult for the monitor to maintain. Better monitors support higher refresh rates at each resolution they can display.

However, beyond about 75Hz the difference is hard to notice. You will notice it if you set it to 60Hz.
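
If you’re curious which rates your particular card and monitor combination actually offers at each resolution, you can list them yourself. Here’s a minimal C sketch using the standard Win32 EnumDisplaySettings call (a rough example, not tested against your exact setup; link against user32):

#include <windows.h>
#include <stdio.h>

/* Print every display mode (resolution, color depth, refresh rate)
 * that Windows reports for the primary display adapter. */
int main(void)
{
    DEVMODE dm;
    DWORD i;

    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);

    for (i = 0; EnumDisplaySettings(NULL, i, &dm); i++) {
        printf("%4lu x %-4lu  %2lu-bit  %3lu Hz\n",
               dm.dmPelsWidth, dm.dmPelsHeight,
               dm.dmBitsPerPel, dm.dmDisplayFrequency);
    }
    return 0;
}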

I don’t think humans can see flicker at anything above 70Hz, so I’m not sure why you would bother with the higher rates. The beam that draws the screen has to travel faster at the higher rates, and at high resolutions I’d expect there to be some smearing if the monitor doesn’t have enough bandwidth (and bandwidth = $). If yours looks equally sharp up to 100Hz, then there’s no reason not to use that, but then again there’s no reason to.
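
To put rough numbers on the bandwidth point: the pixel clock the card and monitor have to sustain scales directly with the refresh rate. A quick back-of-the-envelope in C, assuming roughly 30% blanking overhead (a guess on my part, not an exact VESA timing):

#include <stdio.h>

/* Rough pixel-clock estimate: visible pixels per frame, times refresh
 * rate, times an assumed ~30% allowance for blanking intervals.
 * Real VESA/GTF timings differ somewhat, so treat these as ballpark. */
static double pixel_clock_mhz(int width, int height, int refresh_hz)
{
    const double blanking = 1.3;   /* assumed overhead, not a spec value */
    return (double)width * height * refresh_hz * blanking / 1e6;
}

int main(void)
{
    int rates[] = { 60, 75, 85, 100 };
    int i;

    for (i = 0; i < 4; i++)
        printf("1024x768 @ %3d Hz -> roughly %.0f MHz pixel clock\n",
               rates[i], pixel_clock_mhz(1024, 768, rates[i]));
    return 0;
}

Going from 75Hz to 100Hz asks for about a third more bandwidth from the same electronics, which is where the smearing concern comes from.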

Thanks for the replies. It looks fine at 75Hz, so I guess I’ll leave it there unless anyone can convince me that there’s a reason not to.

I am human, and I can see flicker at any refresh rate below about 90Hz; I’m also less likely to get headaches at 100Hz or higher. The disadvantage is that a higher rate does seem to affect the PC’s performance and use more memory. If you can’t see the flicker, keep the rate as low as you can stand.

Sometimes you get used to the flicker, though.

I had a crappy old PC and monitor with a refresh rate of 60Hz, and I had no idea that was “bad.” Since I started using better computers and monitors, and especially since I started using an LCD monitor (which doesn’t have the same flicker issues as a CRT), I can’t look at a CRT, even one with a high refresh rate, without noticing the flicker (and being bothered by it, if the rate is low enough). It’s not my imagination, either.

Anything under 100Hz drives me crazy; it makes my eyes feel tired, and I get headachy.

The refresh rate is the number of times per second that the screen is updated. For a CRT, the higher the better, as it reduces eye strain. The standard refresh rate is 60Hz, at which only about 4% of the population will notice flicker, IIRC. I’m one of that 4%. I prefer refresh rates of 85Hz or better, and I run this monitor at 1600x1200 with 16M colors at 100Hz.
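
To put that definition in concrete terms, the gap between screen repaints is just one divided by the rate. A tiny C sketch of the arithmetic:

#include <stdio.h>

/* Milliseconds between screen repaints at a few common refresh rates. */
int main(void)
{
    int rates[] = { 60, 75, 85, 100 };
    int i;

    for (i = 0; i < 4; i++)
        printf("%3d Hz -> %.1f ms between repaints\n",
               rates[i], 1000.0 / rates[i]);
    return 0;
}

At 60Hz the phosphor has nearly 17 ms to fade between passes, which is why flicker is easier to see there than at 100Hz, where each repaint comes only 10 ms after the last.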

Some games won’t run with certain refresh rates. I don’t know why, but I had the hardest time finding the right resolution and refresh settings for the latest Vampire Chronicles game.
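
One possible explanation (just a guess on my part): fullscreen games often ask the driver for an exact mode, refresh rate included, and if that exact combination isn’t in the driver’s list the switch simply fails. You can see what the driver would say, without actually changing anything, by passing CDS_TEST to the Win32 ChangeDisplaySettings call; a rough C sketch:

#include <windows.h>
#include <stdio.h>

/* Ask the driver whether it would accept 1024x768, 32-bit, 100 Hz,
 * without actually switching modes (CDS_TEST only validates). */
int main(void)
{
    DEVMODE dm;
    LONG result;

    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize             = sizeof(dm);
    dm.dmPelsWidth        = 1024;
    dm.dmPelsHeight       = 768;
    dm.dmBitsPerPel       = 32;
    dm.dmDisplayFrequency = 100;   /* the refresh rate under test */
    dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT |
                  DM_BITSPERPEL | DM_DISPLAYFREQUENCY;

    result = ChangeDisplaySettings(&dm, CDS_TEST);
    if (result == DISP_CHANGE_SUCCESSFUL)
        printf("Driver says this mode is OK.\n");
    else
        printf("Driver rejected this mode (return code %ld).\n", result);
    return 0;
}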