Hells yeah, 60 and 72Hz cause me so much annoyance and eye-strain. 75 is alright, but 85 (or better) is best.
60Hz and below bug me, but not excessively so. I grew up through computer eras where your monitor was a television set. When I moved into the 16-bit era and eventually got a monitor for that, it had a maximum refresh rate of 60Hz, and when imported PAL games were played it would drop down to 50Hz.
These were low-resolution days though, so you weren’t exactly examining highly detailed images that required extra focus on the screen. Once I moved into the PC realm, I couldn’t stand 60Hz refresh rates. I had to set it at 75Hz or above. Last Christmas I went LCD when I finally found one that met my high standards (no ghosting/afterimage, no bleeding, no sub-pixel masking issues, high contrast, high resolution, accurate scaling, bright backlight, clean, crisp image, etc.) and I couldn’t possibly look back now.
Low refresh rates get to me too. 75 Hz is about my minimum, with 85-100 Hz being better. If memory serves, about 96% of people are supposed to be okay with a 60 Hz refresh rate. In the places I’ve worked, I’d put it at 70% or so, but still a clear majority. But LCDs are vastly better: I gave up a 22" CRT (1600x1200@100Hz) for a 19" LCD at home, and a 20" CRT for two 17" LCDs at work. I’ve got two LCDs at home now.
Yup, definitely makes my eyes bleed. I really can’t even look at a screen with a flicker like that, and I’ve made people change their refresh rate if they want me to help them out with something that requires me to look at their twitchy screen.
Mr. Athena can’t tell the difference. I had to change the refresh rate on his new monitor once, and he absolutely could not see what I was seeing.
I’d be interested to know why some people have issues with this and others are completely oblivious.
One more vote for hating anything below 75 Hz on a CRT and preferring even faster.
I don’t perceive much difference in an LCD, but I also haven’t really tried low refresh rates on LCD machines; all our gear is now good enough that I don’t have to trade a low refresh rate for high resolution or high color depth.
I truly believe this is something that will be of fundamental interest to researchers in vision processing. It seems as clear-cut to me as a color-blindness test. To those of you who can’t see what we’re talking about: older-style CRT monitors (not the flat panels getting more popular every day) flicker in an obvious way when they are set to their lowest refresh rate of 60 Hz, which is often the default on many set-ups. Changing the setting isn’t an incremental difference. It makes the problem go away almost completely, like an ON/OFF switch, and some people can see it and others can’t.
If you want to visualize it for yourself, a slightly exaggerated version of the effect shows up whenever a computer screen is filmed and shown on TV, such as on the news: the camera picks up an obvious flicker that (hopefully) anyone can see.
Not only do I see no difference between 60 and 75, but 85 (the highest this monitor can handle) actually induces nausea.
But, this monitor and my eyes both suck, so…
Can’t abide anything below 85 Hz, though I could probably get used to 75 Hz if I had to. I seem to be very sensitive to it, even more so than my husband.
LCDs don’t work quite the same way CRT monitors do. There’s no cathode ray gun and no phosphor-coated tube for the guns to excite into displaying something.
CRTs use cathode ray guns to excite the phosphors to varying degrees to display an image, one pixel at a time, left to right, top to bottom. This is known as raster scanning. The beam takes a particular amount of time to sweep from the top of the screen to the bottom, and it also takes time to move back up to the top and start its next scan – a period known as the vertical blanking interval (VBI) – and during all of this the image drawn on the last pass is fading as the phosphors dim. The number of times per second the beam can make that trip from top to bottom is the screen’s refresh rate, and many people can detect it at around 60 Hz because of the contrast between the freshly drawn frame and the fading remnants of the last one. The lower the refresh rate, the fewer times the raster scan occurs per second, the more the image fades before it is redrawn, and the more noticeable the flicker.
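To put rough numbers on that, here’s a small Python sketch (just illustrative arithmetic, not measured phosphor data) showing how much time passes between repaints at common CRT refresh rates – the longer the gap, the more the phosphors dim before the beam comes back around:

    # Rough arithmetic: time between successive repaints at common CRT refresh rates.
    # The phosphors start dimming as soon as the beam moves on, so a longer frame
    # period means more visible fade (flicker) before the next pass.
    for hz in (50, 60, 75, 85, 100):
        frame_ms = 1000.0 / hz  # milliseconds between repaints of any given spot
        print(f"{hz:>3} Hz -> {frame_ms:5.1f} ms between repaints")

At 60 Hz that gap is about 16.7 ms; at 85 Hz it drops to under 12 ms, which is why the flicker seems to switch off rather than merely lessen for people who can see it.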
Modern LCD monitors work differently, though. There is no raster line as such. Instead, the screen is composed of millions of discrete thin-film transistors (TFTs) that form a grid over the screen. An image is constructed by the controller in the monitor taking pixel-by-pixel information from the video card and sending a signal of a specific strength to each corresponding transistor, which in turn drives the liquid crystal at that grid location to display a certain colour. The update sweeps across the grid a bit like a raster line, but it’s much, much quicker. If anything, poor performance from an LCD monitor shows up in the opposite way to a CRT: where a CRT’s low refresh rate results in noticeable flicker because the image fades before the next frame can be scanned, an LCD’s problem appears when the crystals don’t “turn off” quickly enough, resulting in “ghosting” – an afterimage from the last frame (or few frames) of image data because the crystals can’t change over fast enough. That figure – the response time quoted for LCD monitors – is measured in milliseconds, and with LCDs, lower is better.
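As a rough illustration of why a slow panel ghosts, here’s a small Python sketch (the 60 Hz signal rate and the example response times are assumptions picked for illustration, not specs of any particular panel) showing what fraction of each frame a pixel spends mid-transition:

    # Rough arithmetic: fraction of each frame an LCD pixel spends changing state,
    # for a few example response times. If the transition takes longer than the
    # frame itself, the old image visibly smears into the new one ("ghosting").
    signal_hz = 60
    frame_ms = 1000.0 / signal_hz            # ~16.7 ms between incoming frames

    for response_ms in (25, 16, 8, 4):       # example quoted panel response times
        fraction = min(response_ms / frame_ms, 1.0)
        print(f"{response_ms:>2} ms response -> pixel in transition "
              f"{fraction:4.0%} of each frame")

A 25 ms panel never finishes changing before the next frame arrives, which is exactly the smeary afterimage people complain about; a 4 ms panel spends only about a quarter of each frame in transition, so ghosting is far less visible.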