We know that the faster a person goes, the more time slows down for them. Presumably this happens at any speed, but the difference isn’t noticeable until you get close to the speed of light. How fast would you need to go to noticeably affect the passage of time compared to folks who are, say, on Earth? Like, if you travel at that speed for an hour, how fast would you need to be going to lose a second, a minute, an hour, etc.?
Time dilation is given by the factor 1/sqrt(1-v^2/c^2), where v is your velocity and c is the speed of light. So, if you want to know how fast you’d need to go to lose 1 second per hour, you solve for the v that makes this factor equal to 3600/3599. That comes out to about 2.4% of the speed of light.
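If you want the other cases too, it’s easy to invert the formula: v/c = sqrt(1 − (t_traveler/t_Earth)^2). Here’s a minimal sketch in Python, just plugging numbers into the equation above (the function name and the sample cases are mine, not anything official):

```python
import math

C = 299_792_458  # speed of light, m/s

def speed_to_lose(seconds_lost: float, interval: float = 3600.0) -> float:
    """Fraction of c needed for a moving clock to fall behind by `seconds_lost`
    over `interval` seconds of Earth time: inverts gamma = 1/sqrt(1 - (v/c)^2)."""
    ratio = (interval - seconds_lost) / interval   # traveler time / Earth time = 1/gamma
    return math.sqrt(1.0 - ratio ** 2)             # v/c

for lost, label in [(1, "1 second"), (60, "1 minute"), (1800, "30 minutes")]:
    frac = speed_to_lose(lost)
    print(f"lose {label} per hour: v ≈ {frac:.4f} c ≈ {frac * C / 1000:,.0f} km/s")
```

That gives roughly 2.4% of c to lose a second, about 18% of c to lose a minute, and about 87% of c to lose half an hour. Losing the full hour is the degenerate case: the factor only goes to zero as v approaches c, so no finite speed does it.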
Slightly related: this Wikipedia article says that a person having spent 747 days on space station Mir experienced a total dilation of 20 milliseconds. The Apollo astronauts (going faster but for a shorter time) experienced a dilation smaller than 0.025 milliseconds.
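For reference, that 20 ms figure is roughly what the special-relativity formula alone gives for a low orbit. A rough sketch, assuming an orbital speed of about 7.7 km/s for Mir (my assumption, not a number from the article):

```python
import math

C = 299_792_458          # m/s
V_ORBIT = 7_700.0        # assumed orbital speed for Mir, m/s
DAYS = 747

gamma = 1.0 / math.sqrt(1.0 - (V_ORBIT / C) ** 2)
elapsed = DAYS * 86_400                      # seconds on the ground
lag = elapsed * (1.0 - 1.0 / gamma)          # seconds the orbiting clock falls behind
print(f"SR-only lag over {DAYS} days: {lag * 1000:.1f} ms")
```

That comes out to roughly 21 ms, in line with the figure quoted.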
When you’re talking about astronauts in orbit, you need to consider General Relativity, not just Special. When you do so, the dominant consideration is just the fact that they’re in free-fall (which has an effect not only larger than the one from speed, but in the opposite direction). The Mir astronauts were in free-fall for longer than the Apollo astronauts, so the effect is larger.
To the original question, it depends on how much of an effect you consider “significant”. With sensitive enough measurements, you can detect relativistic effects at even very low speeds. For instance, you can make an electromagnet out of a couple of feet of wire wrapped around a nail and connected to a battery: The magnetic field thus produced is a relativistic effect from the movement of the electrons in the wire. But the speed of those electrons (assuming 2 feet of copper wire and a 1.5 volt battery) is only a centimeter per second.
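For anyone curious where a number like that comes from, the average electron drift speed is v = I / (n·e·A). A rough sketch with illustrative assumptions (a few amps through thin 30 AWG wire; the current and gauge here are my guesses, not from the post), which lands in the millimeters-to-a-centimeter-per-second range:

```python
import math

E = 1.602e-19        # electron charge, C
N_CU = 8.5e28        # conduction electrons per cubic meter in copper

def drift_velocity(current_a: float, wire_diameter_m: float) -> float:
    """Average electron drift speed v = I / (n * e * A)."""
    area = math.pi * (wire_diameter_m / 2) ** 2
    return current_a / (N_CU * E * area)

# Illustrative assumptions: ~3 A through thin (30 AWG, ~0.25 mm diameter) wire.
v = drift_velocity(current_a=3.0, wire_diameter_m=0.25e-3)
print(f"drift speed ≈ {v * 1000:.2f} mm/s")
```

The exact value depends on the numbers you assume, but it is always absurdly slow compared to c.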
EDIT: I just checked the Wikipedia link, and they’re only doing the SR calculation, not the correct GR one. The number they get is therefore meaningless. This is a disturbingly common mistake, repeated even in physics textbooks.
GPS satellites have to correct for special and general relativity; if they didn’t, they would accumulate 10 km of error per day, rapidly rendering them useless. They orbit at an altitude of ~12,000 miles, making two orbits per day; if I did the math right, that’s about 8,400 MPH. To be fair, the correction for gravity (general relativity) is quite a bit larger than the correction for velocity (special relativity).
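A back-of-the-envelope check of those numbers (the orbital radius below is the standard GPS figure of about 26,600 km, which is my assumption, not something from this thread):

```python
import math

G = 6.674e-11            # gravitational constant
M_EARTH = 5.972e24       # kg
R_EARTH = 6.371e6        # m
C = 299_792_458          # m/s
R_GPS = 2.66e7           # assumed GPS orbital radius, m (~12,550 mi altitude)

v = math.sqrt(G * M_EARTH / R_GPS)                   # circular orbital speed
sr = -0.5 * (v / C) ** 2                             # SR: moving clock runs slow
gr = G * M_EARTH / C**2 * (1 / R_EARTH - 1 / R_GPS)  # GR: higher clock runs fast
day = 86_400
print(f"orbital speed ≈ {v * 2.23694:,.0f} mph")
print(f"SR offset ≈ {sr * day * 1e6:+.1f} microseconds/day")
print(f"GR offset ≈ {gr * day * 1e6:+.1f} microseconds/day")
print(f"net       ≈ {(sr + gr) * day * 1e6:+.1f} microseconds/day")
```

With those assumptions the speed comes out around 8,700 mph, the velocity term is about -7 microseconds per day, and the gravity term about +46 microseconds per day, so the gravity correction really is the bigger of the two.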
I remember doing some simple experiments in high school, involving colored ball bearings, which clearly demonstrated that the movements of the particles can be much slower than the movement of the wave that they propagate. Is that at all relevant to the situation? Your electrons are going slow, I grant you that, but their wave might still be quite fast.
Is this scalable? Would someone living at around 30º latitude, traveling 840 mph, lose 1 km per day?
10 km per day is almost 7 meters per minute. You can drop the word “rapidly”.
My favourite answer is this video.
Not exactly fast at all. Let me guess: you filled up a tube with ball bearings and pushed one in at the end? Not a terrible analogy, and indeed the waves are much faster (in fact, the speed of light, since they’re electromagnetic waves), but in this case it really is the (very slow) speed of the electrons themselves that is relevant.
The key is that, at those speeds, because relativistic effects are very small, the magnetic field would be much smaller than the electric field, and so ordinarily would be lost in it… except that, for electrons in a wire, the electric fields all cancel out almost perfectly, but the magnetic fields don’t cancel.
It is not the speed of the wave, but precisely the speed of the electron itself, as small as it is, that creates the “magnetic field”. Magnetic interactions are only the relativistic effects of moving charged particles on the overall electric field.
Einstein wrote: “What led me more or less directly to the special theory of relativity was the conviction that the electromotive force acting on a body in motion in a magnetic field was nothing else but an electric field.”
See Relativistic electromagnetism - Wikipedia - although it gets a bit technical.
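One way to see how small the “magnetic” piece is compared to the (cancelled) electric piece is to put numbers on both; for parallel currents the ratio works out to (v_drift/c)^2. A rough sketch with made-up but typical numbers (1 A in each wire, 1 cm apart, millimeter-per-second drift speed; all three are my assumptions):

```python
import math

MU0 = 4 * math.pi * 1e-7     # vacuum permeability
EPS0 = 8.854e-12             # vacuum permittivity
C = 299_792_458

I = 1.0          # assumed current in each wire, A
d = 0.01         # assumed separation, m
v_drift = 1e-3   # assumed electron drift speed, m/s

# Magnetic force per meter between two parallel currents.
f_mag = MU0 * I**2 / (2 * math.pi * d)

# Electric force per meter IF the electrons' charge weren't cancelled by the
# positive lattice: line charge density lambda = I / v_drift.
lam = I / v_drift
f_elec = lam**2 / (2 * math.pi * EPS0 * d)

print(f"magnetic force:            {f_mag:.2e} N/m")
print(f"uncancelled electric force: {f_elec:.2e} N/m")
print(f"ratio: {f_mag / f_elec:.2e}  (compare (v/c)^2 = {(v_drift / C)**2:.2e})")
```

The uncancelled electric force is something like 10^23 times bigger, which is exactly why it takes a near-perfect cancellation of the electric fields for the tiny relativistic leftover, the magnetic force, to be the thing you actually notice.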
I don’t recall the exact numbers, but I heard someone on Star Talk say that a NASCAR/Indy/F1-type driver going about 200 MPH for a few hours loses something like a billionth of a billionth of a billionth of a second due to time dilation. Easily calculable with the formulas we have, hardly noticeable,* and nowhere near a full second, much less a minute or an hour.
*Plus, you have to figure, for the vast majority of us, it’ll all average out. At some point that NASCAR driver will probably be standing still while I’m in motion for long enough for me to catch up (fall behind…for him to catch up?).
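For what it’s worth, the formula from the first reply makes this easy to check. A quick sketch assuming 200 mph sustained for three hours (both numbers are just round guesses):

```python
C = 299_792_458          # m/s
MPH_TO_MS = 0.44704

v = 200 * MPH_TO_MS      # assumed race speed, m/s
t = 3 * 3600             # assumed race duration, s

# At speeds this low, 1 - 1/gamma is essentially (1/2)(v/c)^2.
lag = t * 0.5 * (v / C) ** 2
print(f"clock lag over a 3-hour race at 200 mph: {lag * 1e9:.2f} nanoseconds")
```

That comes out to roughly half a nanosecond for the race, so whatever the exact figure, it is a very long way from anything noticeable.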
I got sort of sucked into helping test this once. At the time, I was an instructor at the USCG LORAN-C school, and the class I taught covered the care and operation of the cesium frequency standards that we used. I was between classes and got TDYed (sent on Temporary Duty) to Kennedy Space Center via the USN Observatory in Alexandria, VA with a cesium standard that had its phase and one-pulse-per-second output synchronized as closely as possible with another back at the observatory. There were some one-off (well, two-off) devices plugged into them. Eventually, this cesium was loaded into one of the shuttles and flew a mission. On its return, it was unloaded, taken back to Alexandria, and compared with the homebody. I don’t recall the numbers, but there was a measurable time difference that was pretty close to what had been predicted. Nowhere near microseconds, but in the nanoseconds or high picoseconds. This was in the early ’80s, and I presume they can measure more accurately now.
Well, this is a thread about relativity; I’d say a GPS receiver that’s off by 10km in a single day has become useless relatively rapidly.
Here’s an article about a homebrew experiment to measure the effect of gravity on the passage of time: An adventure in relative time-keeping
TL;DR: Guy collects “atomic” clocks. He takes his family - and some clocks - on a vacation to Mt Rainier and measures the difference in the rate at which the clocks run at high altitude. The measured difference of 22 nanoseconds is very close to the value predicted by GR.
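The prediction itself is back-of-the-envelope stuff: the fractional rate difference between the clocks is roughly g·Δh/c². A sketch with illustrative numbers (roughly 1,600 m of elevation gain and about 40 hours spent at altitude - my guesses, not figures from the article):

```python
G_ACCEL = 9.81           # m/s^2
C = 299_792_458          # m/s

dh = 1_600.0             # assumed elevation gain, m
hours_at_altitude = 40.0 # assumed time spent up the mountain

rate_diff = G_ACCEL * dh / C**2               # fractional rate difference, ~g*dh/c^2
gain = rate_diff * hours_at_altitude * 3600   # seconds gained by the higher clocks
print(f"predicted gain: {gain * 1e9:.0f} nanoseconds")
```

With those guesses it comes out around 25 nanoseconds, the same ballpark as the 22 ns he measured.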
No, for two reasons.
The first reason is that relativistic effects don’t scale linearly with speed. They follow the 1/sqrt(1-v^2/c^2) factor (see the first response in this thread for the equation), which stays vanishingly close to 1 until you get near light speed.
The second reason is that I think you misunderstand the claim. The GPS satellites wouldn’t be displaced by 10 km per day. The output of their positioning would drift by 10 km per day. They rely on careful calibration of precise clocks and radio signals to calculate position, and the clocks are off just a bit from clocks on the ground due to relativity. The very small error in the clocks is magnified greatly by the calculations required to determine position, so the final result would drift that much if they didn’t correct for the different clock rates.
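To put a number on how the tiny clock error gets magnified: position in GPS comes from multiplying signal travel times by the speed of light, so microseconds of clock error become kilometers of position error. A minimal sketch, using the roughly 38 microseconds per day of net uncorrected drift from the orbital calculation earlier in the thread:

```python
C = 299_792_458          # m/s

clock_error_per_day = 38e-6               # seconds of uncorrected relativistic drift per day
position_error = clock_error_per_day * C  # the ranging error that clock error turns into
print(f"position drift: {position_error / 1000:.1f} km per day")
```

That lands at about 11 km per day, which is where the commonly quoted “roughly 10 km per day” figure comes from.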