I have a question relating to the time clock on computers - why is it that every computer I’ve ever had has a slow clock? It seems that every computer that I’ve ever had at home (my work one seems fine, which may be because it’s on a network) has a clock that will fall behind by about 5-6 minutes after a couple of months.
Do you think this might have to do with the fact that I’m running it on the same processor that runs my programs, so the clock might get slowed down slightly just like a normal program? That’s my only possible explanation. Otherwise, ya got me.
It’s usually just bad quality control by the manufacturers of the clock component. It reminds me of my first 4-head VCR years ago that gained 9 minutes per month. The VCR was “repaired” under warranty, but the dealer had to go through 4 or 5 chipsets until one was within the manufacturer’s specs. It lost 7 minutes per month, but that was within the tolerances allowed for the clock. PC clock makers are just trying to keep costs down by making “good enough” products instead of investing in quality. One of my home PCs loses about 5 minutes a month; the other two stay right on target.
While lack of quality in the components is a common cause of a slow clock with a good battery, the OS and some apps can also cause problems. Certain MS OSes with certain settings can make clocks run slow. There are MS technotes about these problems (but I have better ways of dealing with it :)).
Almost all PCs (I would say all, but then someone would find an exception) have a real-time clock circuit that runs independently of the CPU, any programs, the OS, etc. In fact, it runs even when the computer isn’t turned on.
If it is not keeping good time, this circuit is most certainly the problem.
At one place I worked, they set up their own time server, which synched to one of the public time servers. The work PCs could then synch to the work time server. interface2x’s work PC could be set up to synch automatically to a work time server or directly to a public server.
On my home PC, the clock drifts around one minute per month, and I synch it once a month.
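For anyone curious what “synching to a time server” actually involves under the hood: an NTP client sends a 48-byte UDP request to port 123 and reads the server’s transmit timestamp out of the reply. Here’s a rough Python sketch of just the response-parsing half (the function name is my own invention; the 48-byte layout and the 1900-based epoch come from the NTP protocol):

```python
import struct
from datetime import datetime, timezone

# Seconds between the NTP epoch (1900-01-01) and the Unix epoch (1970-01-01).
NTP_UNIX_DELTA = 2208988800

def parse_ntp_transmit_time(packet: bytes) -> datetime:
    """Pull the transmit timestamp out of a raw 48-byte NTP response."""
    if len(packet) < 48:
        raise ValueError("NTP packet must be at least 48 bytes")
    # Transmit timestamp lives at offset 40: 32-bit seconds + 32-bit fraction.
    seconds, fraction = struct.unpack("!II", packet[40:48])
    unix_seconds = seconds - NTP_UNIX_DELTA + fraction / 2**32
    return datetime.fromtimestamp(unix_seconds, tz=timezone.utc)
```

A real client would also compensate for network round-trip delay before setting the clock, but for fixing a few minutes of drift a month the raw timestamp is plenty close.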
Computers (meaning IBM PC architecture, since the 286 days) have a real-time clock chip. When the operating system starts up (DOS, Linux, Windows), it reads the current time out of the chip. From then on, while the computer is running, it updates its time based on the timer tick interrupt. So you have two quartz crystals involved: one in the real-time clock chip and one used to generate the system clock (which actually drives a bunch of different timers, including that stupid original PC speaker). When you shut down the computer, Linux stores the current time back into the real-time clock. Not sure about Windows. DOS doesn’t.
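The two-clock scheme described above can be sketched in a few lines. This is a toy model, not real OS code: the class and method names are mine, but the logic (seed from the RTC at boot, then count timer ticks at the classic PC rate of 1193182 / 65536 ≈ 18.2065 per second) matches the DOS-era design:

```python
# Toy model of how a DOS-era OS keeps time: read the battery-backed RTC
# once at boot, then count timer-tick interrupts.

PIT_HZ = 1193182 / 65536  # classic PC timer tick rate, ~18.2065 Hz

class SystemClock:
    def __init__(self, rtc_seconds: float):
        # Boot: seed the software clock from the real-time clock chip.
        self.boot_time = rtc_seconds
        self.ticks = 0

    def timer_interrupt(self):
        # Fired ~18.2 times a second by the timer chip.
        self.ticks += 1

    def now(self) -> float:
        # Current time = boot value + elapsed ticks converted to seconds.
        return self.boot_time + self.ticks / PIT_HZ
```

Notice that once booted, the running clock’s accuracy depends entirely on the system-clock crystal, not the RTC crystal, so the machine can drift at one rate while running and another while switched off.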
The accuracy of the clock depends on the manufacturing tolerance of the crystals. No crystal is anywhere near as accurate as an atomic clock of course, but the more accurate the crystal the more it costs, which is why PC clocks always tend to be a little off. Could be faster or slower, but it’s not likely to be exactly right.
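Crystal error is usually quoted in parts per million (ppm), and the original poster’s numbers are easy to convert. Taking 5 minutes lost over roughly 2 months (the 60-day figure below is my round number for illustration):

```python
def drift_ppm(drift_seconds: float, elapsed_days: float) -> float:
    """Express clock error as parts per million of elapsed time."""
    elapsed_seconds = elapsed_days * 86400
    return drift_seconds / elapsed_seconds * 1e6

# The original poster's figure: ~5 minutes lost over ~2 months.
ppm = drift_ppm(5 * 60, 60)  # roughly 58 ppm
```

For comparison, a garden-variety 20 ppm watch crystal would drift under a minute per month, so a clock losing 5–6 minutes every couple of months really is on the cheap end of the tolerance range.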