Because they really bother me. It’s like looking at a strobe light and trying to work on a spreadsheet in the middle of it. But I have four coworkers around me who all agreed that they couldn’t tell the difference between 60Hz and 75Hz on a computer after one got a new monitor. Not only can I tell a difference, but I can’t look at a 60Hz refresh rate without getting a monster headache. They think I’m crazy.
On a similar note, what is the difference between people who can see the difference and those who can’t? Is one characteristic indicative of “better eyesight” than the other?
You are in for a world of hurt (hertz?) this Christmas season, as cheap LED Christmas lights become more prevalent. Most of these have a 60Hz flicker that is guaranteed to make your head ache. I’ve had to build a normalizer circuit to make ours flicker-free.
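For anyone curious, here is my guess at how such a “normalizer” might work; the poster didn’t give a schematic, so the approach and values below are illustrative, not his. Cheap strings run the LEDs more or less straight off the mains, so the light drops out every half-cycle. A bridge rectifier plus a smoothing capacitor across the string fills in the gaps, and the capacitor gets sized with the standard ripple formula:

    C ≈ I / (2 · f · ΔV)

So a 20mA string on 60Hz mains, tolerating 5V of ripple, needs roughly 0.02 / (2 × 60 × 5) ≈ 33 microfarads.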
60Hz is plain evil. 72 is the lowest acceptable to me, although we are a wholly LCD house now.
This is a topic that I know well. I work in a consulting company with about 1200 employees in my location and a rather open floor plan that allows you to see hundreds of computers just by walking to the bathroom. 60Hz refresh rates drive me insane. I have a theory (I am not really kidding) that people who aren’t that bright can’t process information fast enough to see the difference. My job requires me to assist many different types of coworkers, and I ALWAYS mention it when I go to their desk and their monitor is set too low. I can’t stand looking at the screen.
As I fix it, I tell them to watch carefully and ask them to see if they can tell a difference, and I usually don’t tell them what the difference should be exactly. The really bright ones don’t have it set that way to begin with, but sometimes someone will say that it is much smoother and more stable and that they didn’t know it could be any other way. Others pretend like they get it but can’t really describe what is different, so they are probably lying. At least half say they have no idea what I am talking about, even as I have them watch while I switch back and forth. They tend not to be the brightest bulbs in general, but I have also “outed” a few dimwits that surprised me as well.
It seems like a colorblindness test to me, and the difference is clear as day. It is testing something, I tell you.
I’ve gone LCD for my monitor at home, and it’s a godsend. But at work, it’s terrible. We’re the only department in the whole friggin’ company that still has to use CRTs, but whoever it was that designed our support pages and the systems we use made them run only at 1280x1024, a resolution at which, for some reason, our monitors only manage 60Hz, with the occasional one running at 72Hz. I’ve had to leave mine at 1024x768 so I can get 85Hz (the only truly comfortable refresh rate for me to use for extended periods), and learn how to either keyboard-shortcut through the screens or get used to constantly scrolling/rearranging screens so I can see what I need.
Assholes.
Hubby doesn’t have a problem, but he’s got much worse eyesight than mine (though mine’s hardly a paragon of ocular spectacularity).
Absolutely. I have poor vision and a fairly large blind spot (as in, I can’t see the red light switch on power strips at night if I stare straight at them), but this drives me nuts. I can’t see the difference if I am looking directly at the center of the screen, but if I move my center of vision slightly to the side, I can see the screen refreshing at anything below 75Hz. Since it requires that I actually look away from the screen to consciously tell, I often go for periods of 10-15 minutes using a foreign computer before noticing. My first warning is getting very pissed off and a huge headache without knowing why, and since I’m already pissed off, it takes me even longer to calm down enough to remember to check the refresh rate.
Huh. Just tried mine, which had been set at 60, and it does seem a little more vibrant. I could be imagining the difference, though. Would it matter that I’m using an LCD screen?
Yes. LCDs don’t refresh in the same way, so they’re “stable” at pretty much any refresh rate. Many DVI connections (almost always used for LCD monitors) can’t do anything other than 60Hz anyway.
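For what it’s worth, on Windows you can check what you’re actually running with one API call. A minimal sketch (standard Win32, nothing the posters above mentioned, just how I’d verify it):

    #include <windows.h>
    #include <stdio.h>

    /* Print the current display mode, including refresh rate.
       On an LCD over DVI this will almost always say 60 Hz. */
    int main(void)
    {
        DEVMODE dm;
        ZeroMemory(&dm, sizeof(dm));
        dm.dmSize = sizeof(dm);

        if (!EnumDisplaySettings(NULL, ENUM_CURRENT_SETTINGS, &dm)) {
            fprintf(stderr, "EnumDisplaySettings failed\n");
            return 1;
        }
        printf("%lux%lu at %lu Hz\n",
               dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency);
        return 0;
    }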
Put me down in the “hate it” camp. I’m all LCD now, but I occasionally have to take a test or something at one of those “Prometric” places. They all seem to have the displays set at 60Hz or worse. I’ve “fixed” this for a number of people over the years, all of whom said something like “Wow, I thought I just had a crappy monitor!”
This is strange. I suddenly started getting bad flicker in some games I had played for a long time. It started after a disastrous upgrade around the daylight saving time change. The company suggested that I change the refresh rate on my monitor (it was set at 60Hz), and all the flickering went away (I changed it to 72Hz).
I personally don’t see any difference except that the flickering during the game has gone away.
Anything less than 75Hz and I get nausea after a short time. A friend of mine hated computers for years and could never concentrate at all when working at home, until I suggested he buy a new monitor that could handle a high refresh rate. Haven’t heard a complaint since.
Currently I have mine set to 85Hz, and I can go as high as 100Hz, but the optimal setting for me seems to be 85 (no sounds from the monitor, and crystal clear to me).
I think so. LCDs and CRTs have different persistence; phosphor persistence can be set by chemical content during manufacturing, I believe, but as things stand an LCD pixel holds its image until the next update, while a CRT phosphor fades almost immediately, so an LCD would seem to flicker less.
I base this on the observation that “ghost” images behind moving objects on large LCD screens are common, and the challenge seems to be to reduce the persistence of those screens; otherwise sports events on big-screen LCD TVs look poor.
Perhaps someone else has more technical information, and I could be wrong.
On a CRT, anything below 75Hz is a very noticeable flicker for me, and gives me serious eyestrain and headaches after a short while. It is even distracting and annoying when it is on a neighboring desk’s computer in my periphery. I prefer 90Hz or above.
If I walk over to someone’s desk to help them on their machine and they have a low refresh rate like 60Hz, I reset it before even beginning to work over their shoulder.
I’ve even gone through the office floor and done stealth-configures to bump up refresh rates on people’s PCs when they stepped away from their desks.
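In case anyone wants to automate the stealth-configure, here’s my sketch (not what I actually used, and 85Hz is just an example target):

    #include <windows.h>
    #include <stdio.h>

    /* Ask the driver to switch the current mode to 85 Hz.
       CDS_TEST validates the mode without changing anything. */
    int main(void)
    {
        DEVMODE dm;
        ZeroMemory(&dm, sizeof(dm));
        dm.dmSize = sizeof(dm);
        if (!EnumDisplaySettings(NULL, ENUM_CURRENT_SETTINGS, &dm))
            return 1;

        dm.dmDisplayFrequency = 85;        /* desired rate in Hz */
        dm.dmFields = DM_DISPLAYFREQUENCY; /* change only this field */

        if (ChangeDisplaySettings(&dm, CDS_TEST) != DISP_CHANGE_SUCCESSFUL) {
            fprintf(stderr, "85 Hz not supported at this resolution\n");
            return 1;
        }
        ChangeDisplaySettings(&dm, 0);     /* apply for this session only */
        return 0;
    }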
I once got into a very heated argument with a games graphics programmer (for a flight simulator) who had read a study saying that the human eye could not perceive anything greater than 30Hz, and so was going to make that the game’s maximum frame rate. The study he had read was actually written in the '60s and was about film projector shutters; film runs at only 24 frames per second, but the projector flashes each frame two or three times precisely because a bare 24Hz flicker is visible, which says nothing about what frame rates the eye can distinguish in motion.
Anything less than 75Hz drives me batty - headaches, eyestrain, nausea, the works. 85 is even better, and I don’t see any difference going up from there. That’s all on a CRT, of course. On an LCD, 60 is fine.
I’m fine with 75Hz on a CRT, but 60Hz is very annoying. The flicker bothers my eyes and would give me a headache if I let it go on very long. But, like others who have posted, if I sit at a CRT monitor and it’s at 60Hz, I change it almost immediately. I’ve had a few people say “Wow, thanks!” and some say “Don’t look no different to me.”
I think much of the problem comes from working under fluorescent lights (as in most office environments). The flicker is somewhat less noticeable to me if the room is lit with incandescents. Even modern fluorescent lights flicker with the 60Hz mains (strictly, the light pulses at 120Hz, twice per cycle, though it’s not nearly as noticeable as it was forty or fifty years ago), and if the monitor is refreshing in step with it, it drives me bugfuck.
Fortunately, I don’t have to work with CRT monitors much these days.
Did he never play games himself? It’s extremely easy to see the framerate difference between 30, 60, and 100 FPS. I’m actually quite surprised that someone who worked in graphics didn’t experiment beforehand to see the differences.
Also, on a somewhat related note, I hate how often I’ve seen people with good computer systems save some money by ‘getting a good deal’ on a cheap monitor. Shouldn’t the monitor be one of the pieces of your brand-new system that you’re willing to pay more for, to ensure excellent quality?