Seeking information to support experimenting with subsecond time-of-day measurement, including:
- the PC clock (NTP; software or methods for investigating clock drift; methods for timestamping events on a PC, for example messages received over serial, USB, or Ethernet);
- GPS time signals, such as NMEA output strings and PPS (pulse per second) outputs;
- WWVB clocks (and antennas to improve reception, so they don't have to be located on the north wall of the upstairs guest room closet to work right);
- standalone clocks with means to trigger alarms or mark events;
- the tools amateur astronomers use for occultation timing;
- unusual special-interest web sites such as leapsecond.com;
- and indeed anything that falls into the range better than 1 second but cheaper than true atomic clocks and Stratum 1 timeservers.
My need: 1) I’ve always been fascinated, and 2) I keep finding myself trying to extend the accuracy of coordinated physical experiments from, say, 1 or 10 seconds to perhaps 1 or 10 ms.
I’m not sure what you are looking for, here, but I’m a “precision time freak”, too, so if you can clarify, I may be able to post you some links you would be interested in. What exactly are you after?
If you use NTP, you can get your PC's clock to within a few tens of ms of true time. Then you need to worry about the granularity of your ticker, and the latency between reading the time and taking your measurements.
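You can gauge that ticker granularity empirically. A minimal Python sketch that spins until the wall clock visibly ticks over (the actual resolution depends on your OS and hardware):

```python
import time

def measure_tick(samples: int = 200) -> float:
    """Estimate the smallest observable step of time.time()
    by spinning until the reported value changes."""
    deltas = []
    for _ in range(samples):
        t0 = time.time()
        t1 = time.time()
        while t1 == t0:          # spin until the clock ticks over
            t1 = time.time()
        deltas.append(t1 - t0)
    return min(deltas)

# Compare what the OS claims against what we actually observe.
print("reported resolution:", time.get_clock_info("time").resolution)
print("measured tick:      ", measure_tick())
```

On some systems the measured tick is far coarser than the float suggests (classically ~15.6 ms on older Windows), which matters if you want millisecond timestamps.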
Rather than just resynchronizing the clock with NTP, can you also get an estimate of how wrong it was before the resynchronization? Are there any utilities that probe this error?
I believe there are Unix tools for doing this (`ntpq -p` shows the current estimated offset for each peer, and `ntpdate -q server` queries a server without setting the clock), but I haven't used any myself.
But, you could certainly write something that took the current time, and then sync’d to NTP and recorded the difference.
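The arithmetic NTP itself uses is simple enough to do by hand once you have the four timestamps (client send, server receive, server send, client receive). A sketch of the standard offset/delay calculation, with invented timestamps purely for illustration:

```python
def ntp_offset_delay(t1: float, t2: float, t3: float, t4: float) -> tuple:
    """Standard NTP clock-offset and round-trip-delay estimates.

    t1: client transmit time (client clock)
    t2: server receive time  (server clock)
    t3: server transmit time (server clock)
    t4: client receive time  (client clock)
    """
    offset = ((t2 - t1) + (t3 - t4)) / 2.0   # how far the client clock is off
    delay = (t4 - t1) - (t3 - t2)            # round trip, minus server hold time
    return offset, delay

# Illustration: client clock 50 ms slow, 10 ms each way on the wire,
# server holds the packet for 1 ms.
off, dly = ntp_offset_delay(100.000, 100.060, 100.061, 100.021)
# off -> 0.050 s, dly -> 0.020 s
```

Logging that offset each time you sync, rather than just applying it, would give you exactly the "how wrong was I?" history you're after.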
Napier - something tells me that you have [cough] too much time on your hands…
Cheshire Human, if you are a precision time freak, what I am after might be whatever interests you on the subject. I keep coming back to it wanting more. But here is some of what I am working on this week:
Since I have been running an experiment on a machine that extends over a larger distance than one datalogger can practically cover, I am using 6 dataloggers of 3 different designs, plus sometimes up to 8 other single-channel dataloggers. Yet I would like to be able to reconcile their results to better than a second after the fact. My first decision was to declare my corporate laptop the master clock, but I wish I knew more about how its clock works. It's set up to sync itself regularly via NTP to a timeserver on our corporate network, in another building a couple of miles away, and that timeserver is just a computer of some sort that uses NTP to set itself from some internet timeserver. How stable does all that make my laptop's clock? By how much might it jump during one of these syncs? How do I know when the syncs occur?
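One way to catch those syncs in the act is to log the difference between the wall clock and a monotonic clock: smooth NTP slewing shows up as a slow ramp, while a step sync shows up as a jump. A Python sketch (the poll interval and 50 ms jump threshold are arbitrary choices for illustration):

```python
import time

def watch_for_steps(duration_s: float = 60.0, poll_s: float = 0.5,
                    threshold_s: float = 0.050) -> list:
    """Return a list of (wall_time, jump_seconds) entries recorded
    whenever wall-clock-minus-monotonic changes by more than
    threshold between polls -- i.e. the wall clock was stepped
    rather than ticking smoothly."""
    steps = []
    last = time.time() - time.monotonic()
    end = time.monotonic() + duration_s
    while time.monotonic() < end:
        time.sleep(poll_s)
        diff = time.time() - time.monotonic()
        if abs(diff - last) > threshold_s:
            steps.append((time.time(), diff - last))
        last = diff
    return steps
```

Left running on the laptop for a day, something like this would tell you both when the syncs happen and how big each correction is.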
I have a $13 Casio wristwatch that keeps amazingly good time: when I leave it in a drawer for a year or two, I find it is within something like a minute, using a handheld GPS as a reference. For my experiment, I declared it the portable transfer standard between my laptop and the various dataloggers and the experimenters taking various physical steps at designated times. I got the Casio to match my laptop's displayed time to about 1/3 second (though this is a bit tricky, because the laptop's displayed seconds obviously do not advance at regular 1-second intervals). Yesterday I was trying to confirm some things about a part of the experiment we had done 4 days earlier, and was alarmed to see a 7-second discrepancy between the Casio and the laptop. I don't think the Casio could have drifted that much, but I'm surprised a network-sync'd laptop could do it either, and I'm mulling how to resolve this, either for yesterday's data or for the next phase of the experiment.
One of the datalogger designs is very powerful and versatile, and would let me log ASCII strings on an RS-232 line as one of its input channels. It can also watch a digital input and, rather than recording its status every 100 ms, trigger on its transitions and timestamp them to the millisecond. So I am thinking of writing a program for this logger (I mean, using its own simple but beautiful programming language to write a program internal to the logger) to exploit one of my Garmin receivers (the no-display kind they mount on a staff on a boat). This receiver can spit out NMEA 0183 standard messages, including a timestamp plus a time-dilution-of-precision message, and it also has a 1 PPS line that transitions upward on the second and downward a programmable fraction of a second later, with a specified 1 microsecond accuracy. So I could program this thing, I think, to use the Garmin to set its internal time-of-day clock to something like millisecond absolute accuracy. I could also program it to timestamp further PPS signals, so I could analyze its drift after the fact. And I could add features that let me use it as a transfer standard, for example having it spit out analog and digital signals that I connect to each of the other loggers in turn, to reconcile them all in postprocessing.
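For what it's worth, the NMEA side is easy to prototype on a PC before committing it to the logger's own language. A Python sketch that verifies a sentence's XOR checksum and pulls the UTC time out of a $GPRMC message (the sample sentence below is a standard textbook example, not output from your Garmin):

```python
def nmea_checksum_ok(sentence: str) -> bool:
    """Verify the XOR checksum of an NMEA 0183 sentence:
    XOR of every character between '$' and '*', compared
    against the two hex digits after '*'."""
    body, _, cksum = sentence.strip().lstrip("$").partition("*")
    calc = 0
    for ch in body:
        calc ^= ord(ch)
    return f"{calc:02X}" == cksum.upper()

def gprmc_utc(sentence: str) -> str:
    """Extract hhmmss UTC from field 1 of a $GPRMC sentence."""
    t = sentence.split(",")[1]
    return f"{t[0:2]}:{t[2:4]}:{t[4:]}"

msg = "$GPRMC,123519,A,4807.038,N,01131.000,E,022.4,084.4,230394,003.1,W*6A"
print(nmea_checksum_ok(msg), gprmc_utc(msg))   # True 12:35:19
```

The same checksum-then-parse structure should port straightforwardly to the logger's internal language.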
Well, this stuff all just goes on and on, and I would like to tinker with it some at home, too. I am looking for all manner of inputs on these things, but to start, would really like some simple utilities that give me more access to the PC clock and its accuracy and adjustment history.
I might suggest poking around on this website: http://www.ieee-uffc.org/main/index.asp
Once again, I’ve lost the ability to create a freaking link. Damned senior moments and all.
Beowulff, that IS cute, isn’t it?! Now for the price to come down.
LouisB, the IEEE frequency control folks are interesting, too. Reminds me of a crystal design project I always thought would be fun: design the most predictable frequency source for a slender purse. I mean, undo all the other compromises. Choose a center frequency and cut specifically for high Q and stable operation, maybe low drift but more importantly a drift that is predictable, and no way to trim the frequency. Dealing with all those issues should happen in software downstream of the XO.
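The "handle it in software downstream" part is mostly bookkeeping: once you've characterized the oscillator, each raw timestamp can be corrected with a model. A sketch assuming the simplest useful model, a constant frequency offset plus linear aging (the coefficients here are invented for illustration, not from any real crystal):

```python
def corrected_time(raw_t: float, f_offset_ppm: float,
                   aging_ppm_per_day: float) -> float:
    """Correct a raw elapsed time (seconds since calibration) for a
    constant fractional-frequency offset plus linear aging.

    Integrated time error over [0, t] for error e(t) = e0 + a*t is
    e0*t + a*t^2/2, i.e. t*(e0 + a*t/2).
    """
    days = raw_t / 86400.0
    frac_error = (f_offset_ppm + aging_ppm_per_day * days / 2.0) * 1e-6
    return raw_t - raw_t * frac_error

# One day at a constant +1 ppm offset accumulates 86.4 ms of error:
print(corrected_time(86400.0, 1.0, 0.0))   # 86399.9136
```

That's the appeal of an uncompensated but predictable cut: the whole correction collapses to two or three calibration coefficients applied after the fact.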