How does a computer keep accurate time?

Specifically, the following scenario:

I burn a DVD from machine A.

I then play it in both machines B and C, starting the DVDs at exactly the same time, using the exact same software to play the DVDs in a continuous loop.
If machines B and C are the exact same make and model would they play the looped DVDs in exact synchrony?

Would the machines need to be plugged into the same power source (IOW, does the input power affect the timing of the machines)?

Lastly, if B and C were different makes of computers, would they be able to play the loop in synchrony?

None of it should matter. DVD drives have their own quartz clocks they use to regulate the playing of the disk.

In other words, no, they wouldn’t play in exact synchrony on different machines. But for all practical purposes it would very likely be close enough. Maybe one would play a millisecond longer than the other, for example. You wouldn’t notice it.

The time shown by the computer can be set from either or both of the NIST and USNO time services. The internal clocks in any computer are close to, but not precisely, in step!

As Alfred E. Neuman said, more than once: “What? Me worry?”

Surprisingly, the answer to your question is no. By the end of a long movie, if you have the two computers side by side, both playing audio, you will likely notice a faint echo/reverberation in the sound. The difference in the video will probably be too small to see, but you will definitely hear the out-of-sync audio. The problem will only get worse as time goes on, unless you stop and resynchronize the two computers.

It doesn’t matter if you use the same exact model, or different ones. The power source doesn’t have anything to do with it, either - everything inside the computer runs on DC, not AC.

There are protocols specially designed to correct this problem and synchronize media playback exactly across multiple devices. I couldn’t tell you much about them, though.

It’d be close enough for humans not to know. Assuming you’ve actually burned two discs from machine A to play on B and C simultaneously, there will be differences in the discs (i.e., errors) that might make one take a tiny bit longer to decode a particular frame. On average, the two should end at the same time.

If you’re getting into analog video processing, you’d need to genlock the whole system, so everything’s running on the exact same video sync signal.

Computer clocks are surprisingly inaccurate, so if it was relying on the computer clock, there would be significant variation over time.

The internal clocks in computers are often in error by 100 PPM (parts per million) or more. They are designed to be cheap, not accurate. An internal quartz crystal oscillator is the typical timing reference. Some old computers used the power line frequency as their clock source, but that is very rare today.

If you want to keep more accurate time on a computer than its own clock can give you, use something like NTP (Network Time Protocol). The link has all kinds of technical information on it, but the gist of it (as far as I understand it) is this:

There’s a unidirectional hierarchy of networked computers, each of which is keeping its own version of time. At the top are relatively few, expensive, and incredibly accurate atomic clocks. At the bottom are random computers owned by chumps like you and me who just connect up so they don’t have to reset the clock by a few minutes a year. In the middle are various institutions that either need to have a pretty accurate clock or are in the business of helping distribute reasonably accurate time.

Every once in a while, a computer at some level n away from the top will send a message to a computer at the n-1 level, asking what time it is. Actually, it’ll probably send a similar message to a few of 'em.

The exchange will go something like this (parenthetical info is not sent, but computed/stored on the level-n side):

“Hey, what time is it?” (I think it’s x o’clock)
“It’s time y.”
“Oh, right. Ok, now what time is it?” (Gosh! I was off by y - x! Where does the time go…)
“It’s time z.”
“Gotcha. Thanks.” (Hmm, it took z - y = delta for the round trip, so I should probably set my clock to, say, z + delta/2)

Obviously, it’s more complicated than that, but that’s the important part.
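For the curious, here’s roughly what that bookkeeping looks like in code. This is a minimal Python sketch of the standard four-timestamp version of the same idea, not real NTP; `ask_server` is a made-up stand-in for one request/response round trip, and real NTP adds multiple samples, filtering, and gradual clock slewing on top of this.

```python
import time

def estimate_offset(ask_server):
    # One NTP-style exchange. ask_server (hypothetical) does a single
    # request/response and returns the server's clock readings: when the
    # request arrived and when the reply was sent.
    t0 = time.time()        # our clock when the request leaves
    t1, t2 = ask_server()   # server's clock: request received, reply sent
    t3 = time.time()        # our clock when the reply arrives

    # Round-trip delay, minus the time the server spent handling us.
    delay = (t3 - t0) - (t2 - t1)
    # How far off our clock is, assuming the network delay is symmetric.
    offset = ((t1 - t0) + (t2 - t3)) / 2
    return offset, delay
```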

100 ppm is pretty typical for computer clocks (the crystal in a DVD player is probably similar). For a 2-hour movie, two different computers with crystals 100 ppm apart would be off from each other by 0.72 seconds at the end of the movie. The longer the movie plays, the farther apart the two PCs will be. 0.72 seconds would be very noticeable - weird echo effects would be audible much earlier.

It’s not much more expensive to use 50 ppm crystals, which would cut the error in half, but they would still be very noticeably out of sync at the end of the movie.
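The arithmetic is just playing time multiplied by the frequency mismatch; here’s a quick back-of-the-envelope check in Python:

```python
movie = 2 * 60 * 60                  # a 2-hour movie, in seconds

for ppm in (100, 50):
    drift = movie * ppm / 1_000_000  # ppm = parts per million
    print(f"{ppm} ppm apart -> {drift:.2f} s out of sync after 2 hours")

# 100 ppm apart -> 0.72 s out of sync after 2 hours
# 50 ppm apart -> 0.36 s out of sync after 2 hours
```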

Arjuna34

Don’t buffers account for this?

I am pretty certain all DVD and CD players fill a buffer before they output to the screen/speakers. This is the trick in portable CD players meant for the car that don’t skip. They have some vibration damping to help, but mostly it is a larger buffer, so if the disc skips you never know it… the drive is actually reading ahead of what you are listening to.
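As a toy illustration of that read-ahead idea (all names here are invented; this isn’t how any particular player is written):

```python
from collections import deque

class AntiSkipBuffer:
    def __init__(self, capacity):
        # Holds frames read ahead of the current playback position.
        self.frames = deque(maxlen=capacity)

    def top_up(self, read_frame):
        # Drive side: read ahead whenever possible. A skip just means
        # read_frame() returns None for a while.
        while len(self.frames) < self.frames.maxlen:
            frame = read_frame()
            if frame is None:       # the drive skipped; retry later
                break
            self.frames.append(frame)

    def next_frame(self):
        # Playback side: always pull from the buffer, never the drive.
        # Output only glitches if the buffer runs completely dry.
        return self.frames.popleft() if self.frames else None
```

Note, though, that a buffer only papers over momentary read hiccups; playback still drains it at a rate set by the local crystal, so two machines would still drift apart.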

NTP will do nothing about the issue with video playback drift mentioned here. It will ensure that your operating system clock is never more than a few fractions of a second off, but that’s it.

The problem is that DVD decoders aren’t driven by the operating system clock, but rather by the same underlying mechanism that the operating system clock uses: the hardware clock, or system clock. And this is where the inaccuracy lies.

There are protocols out there that operate like NTP, but are intended to synchronize media playback instead of the OS clock:

“Hey, how far into the movie are you?”
“Twenty minutes, 7.5829 seconds.”
“Okay, now how far are you?”

And so forth. Like I said, though, I don’t know of any offhand.
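To make that concrete, here’s a hypothetical sketch of what the correction step might look like; `player` and `peer_position` are invented stand-ins, not any real library’s API:

```python
TOLERANCE = 0.010   # seconds of drift small enough to ignore
MAX_NUDGE = 0.01    # never change playback speed by more than 1%

def resync(player, peer_position):
    # Nudge our playback rate toward the peer's position instead of
    # jumping, so the correction itself stays inaudible.
    gap = peer_position() - player.position()   # positive: we're behind
    if abs(gap) < TOLERANCE:
        player.set_rate(1.0)        # close enough; play at normal speed
        return
    nudge = max(-MAX_NUDGE, min(MAX_NUDGE, gap))
    player.set_rate(1.0 + nudge)    # run slightly fast or slow to catch up
```

Called every few seconds, something like this would keep the two players within a few milliseconds of each other.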

If the computers are identical, AND (and this is a very important “and”) they are both not connected to a network, then the two computers will stay fairly close together. Arjuna34’s numbers (0.72 seconds in 2 hrs) are about what I’d expect. The human mind tends to blend things together at about 0.05 seconds, so you’re well above the threshold where your brain would have a hard time noticing the difference. Even before then, your ears would hear that something is off.

There are guitar effects pedals which delay sound by a tiny fraction of a second and then mix it in with the original sound (called a flanger, since it mimics pressing something against the flange of a tape deck reel back in the ancient days of recording), so it’s definitely a noticeable thing if you have a sound plus a delayed version of the sound reaching your ears. I’m not sure what your eyes would do with the video image.

If you have two different machines, though, things could really go to hell in a handbasket quickly. I was playing around with multi-tracking software (it allows you to record multiple audio tracks and then play them simultaneously, like what you’d see in a music recording studio) and found to my surprise that, with a particular cheap sound card, the computer would not keep the data between two audio tracks in sync for more than a few seconds. With two different machines, using possibly similar inaccurate hardware, you could be off by a second or more after just a couple of minutes of playing your DVD.