Is there any reason not to use the highest possible refresh rate on a monitor? Why is it usually not the default? Does it use more electricity? Do some people enjoy watching the screen jump?
No, there’s no reason not to use the maximum rate. But monitors and video cards both have maximum refresh rates, and you should be careful not to exceed either limit. A video card usually won’t let you drive it faster than its own limit, but it is possible to damage a monitor by feeding it a higher refresh rate signal than it’s designed for. (Or so I’m told; I’d guess it’s rare for a monitor to actually be damaged.)
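For what it’s worth, on Windows you can ask the video card driver which modes it considers valid using the Win32 EnumDisplaySettings call (it dates back to the Win95 era). Here’s a minimal C sketch that finds the highest refresh rate the driver reports at the current resolution; keep in mind the list only reflects the monitor’s real limits if the driver actually knows what monitor is attached:

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        DEVMODE cur = {0};
        cur.dmSize = sizeof(cur);
        /* ENUM_CURRENT_SETTINGS asks for the mode in use right now */
        EnumDisplaySettings(NULL, ENUM_CURRENT_SETTINGS, &cur);

        DEVMODE m = {0};
        m.dmSize = sizeof(m);
        DWORD best = cur.dmDisplayFrequency;

        /* Walk every mode the driver reports and keep the highest
           refresh rate at the current resolution and color depth */
        for (DWORD i = 0; EnumDisplaySettings(NULL, i, &m); i++) {
            if (m.dmPelsWidth  == cur.dmPelsWidth  &&
                m.dmPelsHeight == cur.dmPelsHeight &&
                m.dmBitsPerPel == cur.dmBitsPerPel &&
                m.dmDisplayFrequency > best)
                best = m.dmDisplayFrequency;
        }
        printf("Current: %lu Hz, highest the driver reports: %lu Hz\n",
               (unsigned long)cur.dmDisplayFrequency, (unsigned long)best);
        return 0;
    }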
If the default is slower than the maximum refresh rate of the monitor, it’s probably because the computer (specifically, the video card driver) doesn’t know what kind of monitor it’s connected to, so it defaults to something like 60 Hz, which is very unlikely to damage any type of monitor. In Win95/98 you can set the monitor type in the Display applet in Control Panel. (Sorry, I can’t give you the exact path; I don’t have access to an English language version of Windows.)
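Once the monitor type is set up, the safe way to change the rate programmatically is to let the driver validate the mode before applying it. Here’s a sketch using ChangeDisplaySettings with the CDS_TEST flag (85 Hz is just a hypothetical target; substitute whatever rate you’re after):

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        DEVMODE dm = {0};
        dm.dmSize = sizeof(dm);
        EnumDisplaySettings(NULL, ENUM_CURRENT_SETTINGS, &dm);

        dm.dmDisplayFrequency = 85;        /* hypothetical target rate */
        dm.dmFields = DM_DISPLAYFREQUENCY; /* only the rate is changing */

        /* CDS_TEST asks the driver to validate the mode without applying it */
        if (ChangeDisplaySettings(&dm, CDS_TEST) == DISP_CHANGE_SUCCESSFUL) {
            ChangeDisplaySettings(&dm, CDS_UPDATEREGISTRY); /* apply and persist */
            printf("Switched to 85 Hz.\n");
        } else {
            printf("Driver rejected 85 Hz; keeping the current rate.\n");
        }
        return 0;
    }

If the driver rejects the mode, that’s usually a sign it doesn’t believe the monitor can handle it, which is exactly the protection you lose if the monitor type is set wrong.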