LCD and MHz

OK, when LCDs first came out, dpi was touted as the measuring stick. The more dpi, the better the picture. Now I see commercials where they mention an LCD’s MHz. OK, so what does this have to do with the TV?

The refresh rate. How often the picture changes per second. Higher = better. CRTs have always had better refresh rates, even among HDTVs; LCD and plasma can just be made bigger. A higher refresh rate will also reduce eye strain.

It has to do with the refresh rate of the picture. The higher the MHz, the smoother and more eye-pleasing the picture. Less strobe-y.

MHz? For refresh rate? I think someone’s confused here.

Probably just an extra “M” tacked on the front there. Some new TVs are being made with 120 Hz refresh rates, instead of the standard 60 Hz. Primarily it’s a way of differentiating between top-end and mid-level screens, since all but the cheapest are already 1080p. Not much sense in increasing the resolution again, might as well bump the frame rate…

Though that’s not to say there’s no room for improvement on that front. I can see it making a difference for, say, sports broadcasts or action movies.

Actually, the new benchmark is 240 Hz. Frankly, I just don’t see the point anymore…

I have to admit, I never understood the high ‘refresh rate’ on an LCD TV. I thought HDTV signals only went up to 60 Hz? And doesn’t each pixel continue to display the same value until refreshed? What’s the point of refreshing the screen more frequently than that?

It’s to combat motion blur (or motion smear) - it’s meant to be better for watching sport or action movies, but it’s not clear how much difference it actually makes in the real world.

I have no idea if this is how it works or not, but imagine if, rather than displaying an image at 60 Hz exactly when it is broadcast, an HDTV delayed the image by some small amount, say 0.1 seconds. That would not be very noticeable to the viewer, but it allows the TV to process the signal and know in advance what the pixels need to change to. So rather than changing a pixel from red to blue in 1/60th of a second (real-time signal), it can start to fade the red pixel and fire up the blue pixel in 1/240th-second increments (the processed signal). The result is that in 1/60th of a second the pixel changed from red to blue, but it started the change 3/240ths of a second earlier, rather than instantaneously. Or maybe some other sort of manipulation of the digital signal.
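Just to put that guess into concrete terms (and it really is only a guess at how the processing might work), here's a rough Python sketch: buffer one frame of delay, then step each pixel toward its upcoming value in four 1/240-second increments instead of one 1/60-second jump. The function name and the single-pixel "frames" are made up purely for illustration.

def intermediate_frames(current, upcoming, substeps=4):
    # Frames shown between two 60 Hz frames; each frame is a list of pixel values.
    frames = []
    for step in range(1, substeps + 1):
        blend = step / substeps  # 0.25, 0.50, 0.75, 1.00
        frames.append([(1 - blend) * c + blend * u
                       for c, u in zip(current, upcoming)])
    return frames

# A single pixel's red channel fading from full (255) to off (0)
# over 1/60 s, in 1/240 s increments:
print(intermediate_frames([255], [0]))
# [[191.25], [127.5], [63.75], [0.0]]

The point is just that with one frame of look-ahead, the panel can spread the transition over several sub-steps instead of slamming from one value to the next.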

The big win for 120 Hz systems is playing 24 fps movies, which don't fit evenly into 60 Hz: Display motion blur - Wikipedia
(See the section titled “100 Hz+”.)
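The arithmetic behind that: 120 divides evenly by both 24 and 60, so each film frame can be held for exactly five refreshes, while a 60 Hz panel has to alternate two and three repeats per frame (the familiar 3:2 pulldown), which is where the judder in slow pans comes from. A quick Python sketch of the cadence, with a made-up function name just for illustration:

def repeats_per_frame(source_fps, refresh_hz, frames=6):
    # How many panel refreshes each of the first few source frames gets.
    boundaries = [i * refresh_hz // source_fps for i in range(frames + 1)]
    return [b - a for a, b in zip(boundaries, boundaries[1:])]

print(repeats_per_frame(24, 60))   # [2, 3, 2, 3, 2, 3] -> uneven pulldown cadence
print(repeats_per_frame(24, 120))  # [5, 5, 5, 5, 5, 5] -> every frame held equally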