I was just thinking about how time is affected by gravity, speed, and mass. A second on Earth is different from a second on Jupiter.

So, is there such a thing as a standard speed of time at a point affected by no meaningful mass or gravity, and at no speed (I realize this last one is impossible to determine, with galaxies spiraling and space expanding, etc.)?

I realize the speed of time is only meaningful in comparison to another place relative to it, but let’s assume we pick a point in the universe that’s thousands of light years away from any meaningful source of mass (stars, planets, etc.) and assume that the only motion it experiences is moving outward with the expansion of the universe. In other words, can we zero all the variables that affect time and say something about how fast it would go in comparison to a second on Earth (at sea level at the equator, if you want to get picky about it)?

Can we say one second on Earth goes x times slower than one second at a point not affected by gravity or motion? If so, can we know what x actually is?
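For what it’s worth, you can put a rough number on at least the gravity part of x. Counting only Earth’s own gravity well (and ignoring the Sun and the galaxy, whose potentials actually contribute more), a back-of-the-envelope sketch using the simple Schwarzschild formula:

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8          # speed of light, m/s
M_EARTH = 5.972e24   # Earth's mass, kg
R_EARTH = 6.371e6    # Earth's mean radius, m

def surface_rate(mass, radius):
    """Clock rate at a body's surface relative to a distant,
    gravity-free observer (non-rotating Schwarzschild estimate)."""
    return math.sqrt(1 - 2 * G * mass / (radius * C**2))

rate = surface_rate(M_EARTH, R_EARTH)
print(f"Earth surface clock rate:  {rate:.12f}")
print(f"Fractional slowdown:       {1 - rate:.2e}")  # ~7e-10
```

So by this estimate an Earth clock runs slow by roughly seven parts in ten billion compared to a clock far from any mass, i.e. x is about 1 + 7×10⁻¹⁰, or around 60 microseconds per day.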

I remember hearing one explanation of time dilation that everything in the space/time continuum is really moving at the speed of light. Those of us who aren’t moving very fast in space are moving forward in time at nearly that full speed.

Objects that are crossing at a good portion of the speed of light are also moving forward in time, but more slowly, so that the ‘diagonal sum’ of their space movement and time movement still equals c. (Like if you’re travelling northwest so that your speed north is 10 mph and your speed west is 10 mph, your total northwest speed is something like 14 mph.)
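That analogy can actually be made quantitative: if you treat c as 1, the rate of proper time is √(1 − (v/c)²), and the space speed and time rate form the legs of a right triangle whose hypotenuse is always exactly 1. A quick sketch:

```python
import math

def time_rate(v_frac):
    """Rate of proper time per coordinate time at speed v_frac = v/c."""
    return math.sqrt(1 - v_frac**2)

# The 'diagonal sum': space speed and time rate combine like
# perpendicular components, and the hypotenuse is always c (here, 1).
for v in (0.0, 0.5, 0.866, 0.99):
    t = time_rate(v)
    diagonal = math.hypot(v, t)
    print(f"v/c = {v:.3f}   time rate = {t:.3f}   diagonal = {diagonal:.3f}")
```

At v = 0 the diagonal is all time; at v near c it is nearly all space, and the clock almost stops, which is the usual time-dilation result.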

I have no idea if this is accurate or not, but it seemed slightly relevant.

Not sure if this answers the OP, but there is Planck time, which is the time it takes light to travel one Planck length. If time is discrete, this would probably be the quantum of time. If time is continuous, it would be the smallest possible measurable duration, and so time would appear to us to be atomized.

Planck time, as I understand it, is the smallest unit into which time can be divided. But it’s still a measurement of time. Something extremely, ridiculously small: about 5.4 × 10⁻⁴⁴ seconds, far smaller even than a trillionth of a nanosecond.

Yes, there is. The “standard speed of time” is what you see measured by any ordinary clock that is in the same frame of reference as you are. It doesn’t matter what that frame of reference is with respect to gravity or motion, you will always observe it as being the same. This is also the only sense in which the question is even meaningful. There is no such thing as “absolute time” against which this can be measured, just as there’s no such thing as absolute motion or absolute rest, and for the same reason.

Planck time is a calculated time interval that depends on fundamental constants. There is no compelling evidence, at least yet, that time is quantized with a smallest possible amount. Nor is there any evidence that if it is quantized, the smallest possible amount is Planck time. (Of course, if time is quantized, there’s no reason it shouldn’t be quantized at the Planck amount, and that duration at least has the advantage of not being arbitrary.)

I’m pretty sure that all Planck units are still arithmetic and not physics. They are just combinations of fundamental constants such that the units (like seconds) disappear. I don’t believe there is any physics yet that says any of the Planck units have any specific physical meaning as smallest or largest anything.
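Right, and here’s what that arithmetic looks like: each Planck unit is just the combination of ħ, G, and c that makes the dimensions come out. A sketch with rounded constants (so the results are approximate):

```python
import math

H_BAR = 1.055e-34   # reduced Planck constant, J s
G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8         # speed of light, m/s

planck_time   = math.sqrt(H_BAR * G / C**5)   # ~5.4e-44 s
planck_length = math.sqrt(H_BAR * G / C**3)   # ~1.6e-35 m
planck_mass   = math.sqrt(H_BAR * C / G)      # ~2.2e-8 kg

print(f"Planck time:   {planck_time:.3e} s")
print(f"Planck length: {planck_length:.3e} m")
# Sanity check: light covers one Planck length in one Planck time.
print(f"length / c:    {planck_length / C:.3e} s")
```

Note there’s nothing in the derivation that says anything physical happens at these scales; the units just cancel, which is the point being made above.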

Inside any reference frame, time always proceeds at one second per second. Always. This is invariant, under any speed, location, or acceleration. There are no variables that affect time itself; they only affect the tools we use to measure its duration.

It’s certainly true that if you compare reference frames you can find that a different total number of seconds have elapsed since an agreed-upon start time. But each one of those seconds is the same as any other.

That seems to make the rest of what you say impossible. (Besides the fact it implies a privileged reference frame, which is itself impossible.) Time does not and can not have a speed.

Doesn’t a second on Jupiter (due to its immense gravity) go slower than a second on Earth? Certainly slower than in a spot 1000 light years from the closest chunk of mass bigger than a piece of space dust. I agree that in that reference frame time will always go one second per second. But if 100 seconds on Earth are equal to one second near a black hole, then you can make a comparison. If a different number of seconds have elapsed since an agreed-upon start time, those seconds can’t all be the same. They’ll appear the same regardless of where you are.
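You can estimate how big the Jupiter effect is with the same simple Schwarzschild formula (a rough sketch: it treats each body as a non-rotating sphere and uses the cloud-top “surface” radius for Jupiter):

```python
import math

G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8     # speed of light, m/s

def surface_rate(mass_kg, radius_m):
    """Clock rate at a body's surface relative to a far-away observer."""
    return math.sqrt(1 - 2 * G * mass_kg / (radius_m * C**2))

earth   = surface_rate(5.972e24, 6.371e6)    # Earth mass and radius
jupiter = surface_rate(1.898e27, 6.9911e7)   # Jupiter mass and radius

# Seconds a Jupiter clock falls behind an Earth clock per Earth day:
loss_per_day = (earth - jupiter) * 86400
print(f"Jupiter clock loses ~{loss_per_day * 1e3:.1f} ms per day")  # ~1.7 ms/day
```

So yes, the Jupiter second is measurably slower, but only by a couple of milliseconds a day, not the dramatic black-hole factors.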

Why couldn’t there be one privileged reference frame where there’s no mass, gravity or speed? At least assume mass, gravity and speed approach zero.

Couldn’t you at least plug the numbers into some formula and get a hypothetical default speed of time (it would have to be defined in reference to an earth second)?

Not really. If you’re there observing, you have mass and gravity.

And how much this clock differs from a reference clock in a different frame depends on how far away the clock is from you, since gravitational effects also depend on distance.

Speed is also relative. Speed approaching zero means you already have some preferred inertial frame of reference in mind. You are trying to use a privileged reference frame to prove a privileged reference frame is consistent with relativity. That’s the same as assuming your conclusion.

The problem is that no one can simultaneously observe a second in two locations.

What would it mean to say that something has no speed? If you imagine a depth of space where you feel motionless and can only see one other object moving away from you, that object could feel motionless and see you moving away from it.

Aren’t you just talking about earth seconds then? That’s the speed of a clock.

I think the actual technical term is “proper time.”

The phrase “default speed of time” doesn’t really make sense, because you can only compare the passage of time between two places; otherwise you don’t have anything to measure against.

You could compare the passage of time here vs. a point in space with no mass-energy around, but that would only give the speed of time here relative to there.

Since time doesn’t exist for a photon traveling at light speed, is it possible that all the photons we see are just the same photon buzzing about and ‘drawing’ the universe like an electron beam on a TV?

Well, that’s done it. Now the superbeings up there in their invisible space ship, watching our universe as a reality TV show, will realise that they have been sussed and terminate the whole thing.

Photons don’t experience time, but they are limited by the speed of light. For one photon to cross the universe twice would take twice as long as the universe has existed. The lag required to “draw” the entire universe even once would take many powers of ten longer than the universe has existed.

What if the moon had a clock that beamed out the time. You could tune your radio to a certain frequency and hear something like “beep moon time 12:00:00 pm Wednesday, Nov 19th, 2014… beep moon time 12:00:01 …” Here on Earth we could synchronize our clocks to moon time, but over time that would drift, right? Because the gravity on the moon is less than on Earth, would our Earth clocks tick slower than the moon clock? Would we think the moon clock is going fast?
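A back-of-the-envelope sketch of the gravitational part of that drift (ignoring the Moon’s orbital motion, which would trim off a few microseconds, and using rounded constants):

```python
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
C2 = (2.998e8) ** 2    # speed of light squared, m^2/s^2

# Surface gravitational potentials (magnitudes), J/kg
phi_earth = G * 5.972e24 / 6.371e6    # Earth mass / radius
phi_moon  = G * 7.342e22 / 1.7374e6   # Moon mass / radius

# Deeper potential -> slower clock, so the Earth clock lags the Moon clock.
frac = (phi_earth - phi_moon) / C2
gain_per_day_us = frac * 86400 * 1e6
print(f"Moon clock gains ~{gain_per_day_us:.0f} microseconds per Earth day")
```

So yes: relative to the broadcast, our Earth clocks would tick slow and the moon clock would seem to run fast, by very roughly 60 microseconds a day in this gravity-only estimate.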

We need to make adjustments for both special and general relativity (both gravity and speed) for GPS signals.
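The standard GPS figures fall out of a short calculation (circular-orbit approximation with rounded constants, so treat the numbers as approximate):

```python
import math

C = 2.998e8           # speed of light, m/s
GM = 3.986e14         # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6     # Earth's radius, m
R_GPS = 2.6571e7      # GPS orbital radius, m
V_GPS = math.sqrt(GM / R_GPS)   # orbital speed, ~3.9 km/s

sr = -V_GPS**2 / (2 * C**2)               # special relativity: moving clock slow
gr = GM * (1/R_EARTH - 1/R_GPS) / C**2    # general relativity: higher clock fast

us_per_day = lambda frac: frac * 86400 * 1e6
print(f"SR:  {us_per_day(sr):+.1f} us/day")        # about -7
print(f"GR:  {us_per_day(gr):+.1f} us/day")        # about +46
print(f"Net: {us_per_day(sr + gr):+.1f} us/day")   # about +38
```

The gravitational term wins, so without corrections a GPS satellite clock would run fast by roughly 38 microseconds a day, enough to wreck position fixes within hours.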

That’s fine as far as it goes, but it doesn’t really help us set a universal standard time. We know there will be a difference in what those clocks tell us, but that doesn’t tell us which one is the “truth”.

We can only really measure it relative to some earth standard (even time on earth varies depending on altitude and latitude). That’s useful in itself, but it’s a bit different from what the OP was asking.

Sorry, not “latitude”. That’s a bizarre brain fart to have. Any latitude effect is essentially cancelled out by the oblateness of the earth. Or you can think of it like that, anyway.

The key is acceleration: the famous Twin Paradox is solved by the fact that one twin experiences acceleration and the other doesn’t. So a clock that has drifted deep in intergalactic space far from any gravity wells would have counted off the most time since the Big Bang.

(Or is that true? Do all observers in our universe see the Big Bang happening the same (local) amount of time ago?)

In a practical sense, time is nothing more than change. If nothing changes, how would you know that time has passed? Since electrons can change shells instantaneously, there is no maximum speed of change, so no minimum speed for time.