Can you see the difference between refresh rates?

60 Hz refresh rates look like they're flashing to me, but 75 Hz and above look okay.

But I read on this board that TVs only have about a 25 Hz refresh rate. Yet, TVs don’t look to me like they are flashing.

Any reason that would be?

-FrL-

I think that 25 Hz is a bit low for a refresh rate. I would be really surprised if that's what most TVs run at.

According to this site, the average TV runs at about 60 Hz, which sounds much more reasonable to me.

On a side note, 60 Hz usually looks fine to me, except when I see the screen with my peripheral vision. For some reason, I see the flashing then.

LilShieste

I think you're confusing refresh rate with frame rate. The frame rate of the tape of a TV show is 24; the refresh rate of the TV will be much higher.

And yes, I do notice the difference between 60 and 75. The other day I realized I’d had my monitor set on 60. I set it to 75 and it was a lot easier to look at.

WAG, but normal electric lighting is effectively refreshing at twice the mains frequency, so either 100 or 120 Hz depending on where you are. I presume there will be interference between this and the refresh rate, which might be what makes the flicker visible.

I have mine at home set at 75Hz, and I can really tell the difference when I use the school computer lab computers. The flicker is really hard to look at. I don’t know what they’re set at, but I can’t change them anyways, so I deal. I can see how some people complain of eye problems and headaches when they use the computer. I can spend hours at my home computer with no problems, but not the ones at school.

Some people are confusing Easter with arse.

The strict answer is that the TV frame refresh rate is 30 Hz in the USA and 25 Hz in Europe.

Refresh rates under about 50 Hz, even with high-persistence phosphor, will cause discomfort.

For TV, the bandwidth required to refresh the whole frame at that rate was too high unless you made the picture unacceptably small. The solution was to refresh the picture in two halves: you divide the frame into two interlaced fields and refresh each field at 30 Hz (25 in Europe). The actual refresh rate of the full frame is 30 Hz, but the alternate lines are staggered by half a frame period, so part of the screen is being refreshed 60 times a second.

60 Hz gives me a headache after a few minutes, and I can see the difference right away. 85 Hz seems perfectly stable, so that's what I use for all resolutions.

All standard broadcast TVs in North America are interlaced, and repaint half the screen, every other line, 60 times a second. The odd lines are repainted, then the evens, and so on. Each of these passes is called a field. Combine the two fields and the whole picture is repainted; this is called a frame. Thus TV refreshes at 60 fields per second, or 30 frames per second. VHS tapes work this way also, repainting 60 fields and 30 frames per second.
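If it helps, here's a quick Python sketch of that field order. The line count is made up for the example; a real NTSC frame has 525 scan lines, around 480 of them visible.

```python
LINES = 10  # pretend frame height; a real NTSC frame has 525 scan lines

odd_field = list(range(1, LINES + 1, 2))   # scan lines 1, 3, 5, ...
even_field = list(range(2, LINES + 1, 2))  # scan lines 2, 4, 6, ...

print("field 1 (one 1/60 s pass):", odd_field)
print("field 2 (next 1/60 s pass):", even_field)
# Two fields together repaint every line once: 60 fields/s = 30 frames/s.
```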

Thus, broadcast television has a refresh rate of 60 Hz interlaced. This would indeed be hard on the eyes on a computer screen, but on a TV it doesn't bother you. This is for two reasons. First, you sit much closer to a computer screen: typically about a foot from the screen, compared to 5 or 10 feet from a TV. Second, a television displays a nearly constantly moving image. Display a static image and sit very close to a TV, and you'll experience the same headache-inducing flicker you get from a 60 Hz interlaced signal on a computer monitor. DVDs have more lines of resolution, but mostly operate at 60 Hz interlaced.

Movies on DVD have 24 frames on them for every second, each frame split up into two fields, so there are actually 48 fields to be displayed. This doesn't match the 60 fields per second of video, so every fourth field, or the second field of every other frame, is repeated, creating 60 fields per second to match the TV's refresh rate.
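Here's a little Python sketch of that field-repeat scheme. The frame labels and the pulldown_fields helper are made up for illustration, and real players use a 3:2 cadence that shuffles which field repeats, but the count comes out the same:

```python
def pulldown_fields(frames):
    """Expand film frames into the field sequence sent to the TV."""
    fields = []
    for i, frame in enumerate(frames):
        fields += [f"{frame}-top", f"{frame}-bottom"]
        if i % 2 == 1:                        # every other film frame...
            fields.append(f"{frame}-bottom")  # ...repeats its second field
    return fields

film = [f"F{n}" for n in range(24)]   # one second of film: 24 frames
print(len(pulldown_fields(film)))     # 48 + 12 repeats = 60 fields per second
```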

Progressive-scan DVD players, played through a TV that can display a progressive signal, work slightly differently. They refresh the whole screen at once (like a computer monitor), not in two separate fields, at 60 frames (not fields) per second. But there are only 24 frames on the DVD to be displayed, each split into two fields. The progressive-scan DVD player integrates the two fields of each frame before displaying it, then displays the odd frames twice and the even frames three times. That means frame A gets played twice, frame B three times, and so on, which creates exactly 60 frames displayed in one second.
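Same idea in sketch form; the frame count works out like this (again, names made up for the example):

```python
def progressive_output(frames):
    """Repeat frames 2-3-2-3... to turn 24 fps film into 60 fps output."""
    shown = []
    for i, frame in enumerate(frames):
        shown += [frame] * (2 if i % 2 == 0 else 3)  # odd positions twice, even thrice
    return shown

film = [f"F{n}" for n in range(24)]
print(len(progressive_output(film)))  # 12*2 + 12*3 = 60 frames per second
```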

HDTV works a little differently. The most common resolution is 1080i, which works somewhat like standard TV, but with roughly twice the vertical and three times the horizontal resolution. It refreshes alternating fields, one every 1/60th of a second, to create 30 full frames each second, but with about 6 times the detail of standard TV. 720p, the second most common HDTV resolution, is progressive, in that it redraws the whole screen every 1/60th of a second. Technically, this means it's possible to have 60 unique frames every second, but in practical terms it doesn't work that way. Most HD video is recorded in 1080i, and for stations that use 720p, the signal is downconverted. This means that if you're watching a 720p signal, it most likely has only 30 unique frames shown per second, with each frame shown twice. It still looks fantastic, though.
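For the "6 times" figure, the rough pixel math (using 640x480 as a ballpark for standard def, which is an approximation on my part, not an exact broadcast spec):

```python
sd = 640 * 480          # standard-definition TV (approximate visible area)
hd_1080i = 1920 * 1080  # a full 1080i frame
hd_720p = 1280 * 720    # a 720p frame

print(hd_1080i / sd)    # ~6.75, i.e. roughly 6-7x the detail of standard def
print(hd_720p / sd)     # ~3x
```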

I suffer from the same thing, essentially. I can still see the flicker if I look straight on, but out of the corner of my eye it's just maddening…