I have two cheap Casio digital watches, slightly different (one has a blue backlight and is a bit thicker).
One of them (at least!) is inaccurate. I synchronised them, but a week later they were 4 seconds apart. Is this normal? I thought quartz digitals were supposed to be accurate to within a few seconds a year, so how can one be so far out in so short a time?
I don’t know the technical details of why this happens but I have a lot of watches, and one of the most accurate ones I’ve ever had was a Mickey Mouse watch I bought at Disney World about 10 years ago for under $50. I also have a Swiss Army chronograph and an Invicta left-handed[sup]1[/sup] chronograph that are both very accurate, within a couple of seconds a month measured against my radio-synched clocks. The Swiss Army cost about double what the Invicta cost. I have other quartz watches that are all over the map in terms of accuracy.
So cost is not a reliable predictor of accuracy.
BTW I have a Rolex Oyster Perpetual Datejust (mechanical, not quartz) that is one of the least accurate watches I own, the only one that requires regular maintenance, and the most expensive.
A left-handed watch, you say? I am left-handed and wear watches on my right wrist, and I normally have to take them off to change the date or adjust the time. The Invicta has the stem and buttons on the left side, so I can adjust it without taking it off.
The “quartz” is a quartz crystal, which oscillates at a very stable frequency. The microprocessor divides that frequency down to drive a stable seconds counter.
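A sketch of that divide-down step (not any specific watch's circuitry): watch crystals typically run at 32,768 Hz precisely because 32,768 = 2^15, so a chain of fifteen divide-by-two stages lands on exactly one pulse per second.

```python
# Typical watch crystal frequency: 32,768 Hz = 2**15.
CRYSTAL_HZ = 32_768

# Simulate a binary counter chain: each flip-flop stage
# halves the frequency until we reach 1 pulse per second.
freq = CRYSTAL_HZ
stages = 0
while freq > 1:
    freq //= 2
    stages += 1

print(f"{stages} stages -> {freq} Hz")  # 15 stages -> 1 Hz
```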
0.6 seconds/day is (0.6 / 86,400) × 100 = 0.00069% error.
Show me ANY stand-alone measuring device in your possession that even remotely approaches that level of accuracy! Scales, thermometers, measuring cups, rulers, calipers - you would be lucky to achieve 0.1% accuracy with any of these, and you think that a $25 watch that achieves 0.00069% is inaccurate! You are one tough customer.
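The same arithmetic, spelled out from the OP's 4-seconds-per-week figure (the projection to a year is my extrapolation, not anything the OP measured):

```python
SECONDS_PER_DAY = 86_400

# OP's observation: 4 seconds of drift in one week.
drift_per_day = 4.0 / 7                           # ~0.57 s/day
error_pct = drift_per_day / SECONDS_PER_DAY * 100 # fractional error as a percent
drift_per_year = drift_per_day * 365.25           # extrapolated, assuming constant drift

print(f"{error_pct:.5f}% error, ~{drift_per_year:.0f} s/year")
```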
For the record, the British government paid out what might be the largest prize in history (in constant pounds) for the first clock that could achieve roughly 2 sec/day accuracy.
No. You could make a number of different types of oscillator. The advantage of the quartz crystal is that it is accurate and cheap, and pretty much all microprocessor-driven consumer electronics use one. There are ceramic resonators and RC oscillators as well, but they don’t have as good stability over time and temperature as quartz. If it has a clock, it most likely has a quartz crystal.
Quartz has an interesting property: it is piezoelectric, which means that it develops an electric field when flexed, and flexes when placed in an electric field. This leads to its use as a resonator, where it stabilizes the frequency of oscillation by permitting only a very specific frequency to be amplified - all other frequencies are damped out. The frequency of oscillation is almost 100% determined by the physical shape of the crystal, so as long as the temperature is stable (like, skin temperature), the oscillations can be very, very precise.
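To put a rough number on the temperature point: 32 kHz tuning-fork crystals typically drift parabolically away from a turnover point near 25 °C. The -0.034 ppm/°C² coefficient below is a typical datasheet figure, not measured from any particular watch.

```python
TURNOVER_C = 25.0   # turnover temperature, roughly body/skin temperature
COEFF = -0.034      # ppm per degC squared (assumed typical value)

def drift_ppm(temp_c: float) -> float:
    """Frequency error in parts per million at a given temperature."""
    return COEFF * (temp_c - TURNOVER_C) ** 2

SECONDS_PER_DAY = 86_400
for temp in (30, 5):  # on the wrist vs. sitting in a cold drawer
    ppm = drift_ppm(temp)
    print(f"{temp} degC: {ppm:.2f} ppm -> {ppm * 1e-6 * SECONDS_PER_DAY:+.2f} s/day")
```

Which is one reason a watch left on a shelf can drift noticeably more than the same watch worn daily.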
Effectively, yes. All clocks are based on a mechanism that does something very regularly or very steadily. It could be the sun moving across the sky, water emptying out of a jug, a spring unwinding, a pendulum swinging back and forth, or, in this case, a quartz crystal vibrating in an electrical circuit. The vibrations also take the form of electrical pulses - in this case, a very precisely fixed number of pulses per second. Use circuitry to divide them down to one pulse per second, use that to increment the display, and you have a watch.
(As Rain Soaked said, you could stick a capacitor and a resistor in series to make an RC circuit, where the capacitor charges and then discharges very steadily, at a rate controlled by the resistor, but Rain Soaked also explained why that’s not as good as quartz.)
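A back-of-envelope comparison of the two approaches (the tolerances here are illustrative ballpark figures, not from any datasheet): an RC oscillator's frequency is only as good as its component tolerances, while a watch crystal is trimmed to within tens of parts per million.

```python
SECONDS_PER_DAY = 86_400

rc_tolerance = 0.05         # +/-5%: garden-variety resistor/capacitor tolerance
crystal_tolerance = 20e-6   # +/-20 ppm: a trimmed 32 kHz watch crystal

# Worst-case timekeeping error each could cause per day:
print(rc_tolerance * SECONDS_PER_DAY)       # up to 4320 s/day off
print(crystal_tolerance * SECONDS_PER_DAY)  # up to ~1.7 s/day off
```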
If you don’t make the crystal very well, you get a watch that loses four seconds a week. Compare that to a badly-made mechanical watch.
It doesn’t have to have a quartz crystal, but those are both accurate & cheap.
The other common design is to count the cycles of the AC power line (usually quite accurate at 60 cycles per second over the long term) and use that as the time control.
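The mains-counting approach is simple enough to sketch: the clock just tallies line cycles and carries a second every 60 of them (every 50 in 50 Hz countries).

```python
MAINS_HZ = 60  # North American line frequency; 50 Hz in much of the world

def seconds_from_cycles(cycles: int) -> int:
    """Convert a tally of AC line cycles into elapsed whole seconds."""
    return cycles // MAINS_HZ

print(seconds_from_cycles(MAINS_HZ * 90))  # 5,400 cycles counted -> 90 s
```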
Some more advanced electronics (TVs, etc.) may actually have a shortwave radio receiver built into them that monitors the National Bureau of Standards radio time broadcasts.
Electromechanical clocks did that as well at one time; in fact, some of them did it so long ago that they were geared for 50 cycles per second, our old AC standard, and had to be re-geared by hand when the power frequency switched over.