Beam me up, Scotty, my cerebellum blew a fuse.
That’s a good question. I wondered about it myself.
Here’s my wild guess; hopefully some brainiac who really understands this stuff will correct me if I’m wrong.
Your speed not only changes your perception of time, but also your perception of spatial measurements (length, distance). When you’re travelling fast in one direction, things shrink in that direction, according to the Lorentz transformation (IIRC, and way fuzzy). So anyone moving towards Andromeda sees it as closer. In other words, as you say, they’d disagree on how far away it was. Of course, since the velocities are small, the disagreement would be small, probably vanishingly small compared to the error in the calculated distance (rough numbers below).
They’d also see different color light, but that’s true even for Newtonian physics.
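Rough numbers, just to show how vanishingly small the effect is. These are my own illustrative assumptions: a relative speed of about 30 km/s (roughly Earth’s orbital speed) and the usual ~2.5 million light-years to Andromeda:
[code]
# Back-of-the-envelope length contraction of the Earth-Andromeda distance
# for an observer moving toward Andromeda at ~30 km/s.
# All numbers are rough assumptions for illustration only.
import math

c = 299_792_458.0      # speed of light, m/s
v = 30_000.0           # assumed relative speed toward Andromeda, m/s
D_ly = 2.5e6           # assumed distance to Andromeda, light-years

beta = v / c
gamma = 1.0 / math.sqrt(1.0 - beta**2)

shrinkage_ly = D_ly * (1.0 - 1.0 / gamma)   # how much the distance contracts
print(f"gamma - 1 = {gamma - 1.0:.3e}")
print(f"the 2.5-million-light-year distance shrinks by ~{shrinkage_ly:.4f} ly "
      f"(~{shrinkage_ly * 365.25:.1f} light-days)")
[/code]
A few light-days out of two and a half million light-years, so yes, completely swamped by the uncertainty in the measured distance.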
No it’s not the same. Unlike the classic “twin paradox”, neither twin needs to stop and turn around. They both stay in a single reference frame the entire trip, so it’s arbitrary which one is considered to be jetting off. If the stay-at-home twin is moving WRT the preferred frame, and the twin that jets off selects the preferred frame, it’s the stay-at-home twin that ages less. (I should have written it this way in the first place, to emphasize the difference.)
One could debate, though, whether SR truly holds, as global Lorentz invariance is broken.
But of course, we have no experimental evidence that global Lorentz invariance holds.

I know that the original idea required an infinitely long cylinder, but I had been told that this is only an ideal, and that, in the region of a finite rotating cylinder (of absolutely stupendous and gargantuan mass) trajectories could be defined that went back in time.
I don’t think so (though it’s not my area of expertise, and I’d be happy to be corrected). IIRC, Hawking proved a theorem to the effect that in a spacetime meeting some general criteria, a time machine always needs some negative energy density/exotic matter to work. Tipler’s infinite cylinder does not meet these criteria (and therefore eludes the theorem), but a finite one would (I believe).
(I agree with you when it comes to wormholes depending on “negative energy.” And yet… as I understand this (and that ain’t saying much!), “negative energy” could be accumulated. It isn’t absolutely impossible…)
The possibility is remote, but it’s there, yes. But I think a warp drive wouldn’t even be the coolest thing to use negative matter (matter of negative mass) for: you could build a ‘diametric drive’, which provides unbounded acceleration without any fuel cost! Basically, the positive mass gets pushed away from the negative one, while the negative mass falls toward the positive and chases it, so the whole pair just zips off through space (little toy sketch below).
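Here’s that runaway behaviour as a toy sketch, using plain Newtonian gravity and made-up unit masses (my own illustration; the proper relativistic treatment is more involved):
[code]
# Toy 1-D "diametric drive": one positive and one negative mass under Newtonian gravity.
# The positive mass is pushed away, the negative mass chases it, and the pair
# self-accelerates forever while total momentum stays zero. Illustration only.
G = 1.0
m_pos, m_neg = 1.0, -1.0        # equal magnitude, opposite sign (assumed)
x_pos, x_neg = 1.0, 0.0         # positive mass starts out in front
v_pos, v_neg = 0.0, 0.0
dt = 0.001

for _ in range(10_000):
    r = x_pos - x_neg
    # each body's acceleration depends only on the *other* body's mass
    a_pos = G * m_neg * (x_neg - x_pos) / abs(r)**3   # away from the negative mass
    a_neg = G * m_pos * (x_pos - x_neg) / abs(r)**3   # toward the positive mass
    v_pos += a_pos * dt
    v_neg += a_neg * dt
    x_pos += v_pos * dt
    x_neg += v_neg * dt

print(f"separation stays ~constant : {x_pos - x_neg:.3f}")
print(f"both velocities keep growing: v_pos = {v_pos:.2f}, v_neg = {v_neg:.2f}")
print(f"total momentum stays zero   : {m_pos * v_pos + m_neg * v_neg:+.2e}")
[/code]
Because the negative mass carries negative momentum, the books still balance even though both bodies speed up without limit.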
Or go one step further, to imaginary mass: that stuff, you’d have to put in energy to slow down, and even then, you can never get it below lightspeed! So getting on and off an imaginary-mass spacecraft might prove challenging, but once you’re in, just slam on the brakes, and off you go!
Of course, there’s the slight drawback that this kind of stuff tends to destabilize the vacuum, but then, there always seems to be a little bit of environmental risk involved with breakthrough technologies…
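And on the imaginary-mass bit above, here’s the usual bookkeeping with made-up numbers (units where c = 1, and an arbitrary mass parameter mu; just a sketch of why slowing down costs energy):
[code]
# Toy numbers for an imaginary-rest-mass ("tachyonic") particle, where the usual
# relativistic energy E = m*c**2 / sqrt(1 - v**2/c**2) turns into
# E = mu*c**2 / sqrt(v**2/c**2 - 1) for real mu, defined only for v > c.
# Purely illustrative; mu = 1 in units where c = 1.
import math

mu, c = 1.0, 1.0

for v in (10.0, 2.0, 1.1, 1.01, 1.001):
    E = mu * c**2 / math.sqrt((v / c) ** 2 - 1.0)
    print(f"v = {v:>6} c  ->  E = {E:8.2f}")

# Energy *grows* as v drops toward c (so you pump in energy to slow down),
# and it would take infinite energy to reach c from above: the thing is stuck
# being faster than light, just as ordinary matter is stuck being slower.
[/code]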

I think it was a Larry Niven essay where he talked about time travel (and in particular the invention of a time machine). He went through various scenarios of how people would go back in time and change things, modifying the timeline, until eventually the only stable universe is one in which a time machine is never invented.
Yes, that’s more or less what I was thinking about; I’ll have to look up the essay. I was thinking of a slightly different proposal, in which ultimately the only stable universe is one in which it is physically impossible to build a time machine, since any possibility tends to become actualized given enough time, and time is something universes seem to have a lot of. But it would perhaps be even more interesting if time travel were counterfactually possible, yet the universe evolved under the constraint that it is never actualized, so that most (almost all?) possible evolutions of the universe get weeded out… Maybe that could be a way of culling string theory’s 10[sup]500[/sup] - 1 superfluous universes!
I tend to think of Asimov’s The End of Eternity as an early example of Niven’s law of chronological protection in action: Eternity has time travel, but it causes too many problems, so it is retroactively uninvented.

I don’t think so (though it’s not my area of expertise, and I’d be happy to be corrected). IIRC, Hawking proved a theorem to the effect that in a spacetime meeting some general criteria, a time machine always needs some negative energy density/exotic matter to work. Tipler’s infinite cylinder does not meet these criteria (and therefore eludes the theorem), but a finite one would (I believe).
A quick Googling seems to bear this out. Tipler apparently thought that a finite cylinder could be made to work if it were massive enough (and the numbers are beyond insane: galactic masses), but Hawking came along and said nuh-uh.
Way back in college, I wrote a paper for a Mathematical Modeling class using parallel timelines (as in the Many Worlds interpretation), just playing with the idea. For instance, the classical Grandfather Paradox is like a singularity: a naked contradiction in the fabric of space/time/if. But near the grandfather paradox (e.g., you go back and shoot him, but only injure him severely) you can get zones of “if turbulence.” The future affects the past, even changes it, but not so much as to prevent it. It was all totally speculative, not an ounce of real science in it, but the model worked, and the paper got an A.
Why couldn’t the grandfather paradox just be like a bistable multivibrator? (Yeah, that’s a flip-flop).
A guy goes into the past, kills his grandfather. Grandfather never has kids, so nobody goes to the past to kill him. So he lives, and his grandson goes into the past, and kills him … and so on. In other words, there are two timelines, depending on whether GF was killed or not.
Traveling back in time bifurcates the timeline.
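For what it’s worth, the bistable-multivibrator picture really is just a two-state toggle; a trivial toy loop (pure illustration, no physics in it) makes that concrete:
[code]
# Toy "bistable multivibrator" grandfather paradox: iterate the story and
# watch the timeline flip between its two self-excluding states.
# Pure illustration, just the logic of the loop.
grandfather_alive = True
for trip in range(6):
    if grandfather_alive:
        # Grandson exists, travels back, and kills Grandfather.
        grandfather_alive = False
    else:
        # No grandson was ever born, so nobody goes back; Grandfather lives.
        grandfather_alive = True
    print(f"pass {trip + 1}: grandfather_alive = {grandfather_alive}")

# Output alternates False, True, False, True, ...: the two timelines,
# depending on whether Grandfather was killed or not.
[/code]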
It gets messier when there are multiple players with different time machines. But the basic idea is that you never actually change your past, you just create a new one (or better yet, modify an existing copy). Seems a bit crazy, with a lot of excess universe copies, but it’s certainly less overhead than the infinite multiverses theory, which a few people take seriously, and is probably more believable than time travel!

It gets messier when there are multiple players with different time machines. But the basic idea is that you never actually change your past, you just create a new one (or better yet, modify an existing copy). Seems a bit crazy, with a lot of excess universe copies, but it’s certainly less overhead than the infinite multiverses theory, which a few people take seriously, and is probably more believable than time travel!
I’ve always thought that this basic idea could be the fundamental explanation of Quantum Mechanics, i.e. that QM is nothing other than an effective description of an extremely complicated tangle of closed time-like loops, where perhaps every particle is an extremal black hole whose horizon leads to different times/branches, and where the laws of physics result from consistency requirements. The basic idea of “QM from GR” was pursued by Einstein for a long time, but eventually, especially after Bell’s theorem, people seemed to give up. Given that closed time-like loops are non-local, I don’t think Bell’s theorem applies, so I’ve always been curious why there wasn’t a larger interest in the idea. I think the main problem might be its sheer difficulty. I know I run into a mental block when trying to consider more than one or two particles going back and forth in time and interacting… if one thing is clear to me, it is that this is NOT “less overhead than the infinite multiverses theory”. Most multiverse theories, such as the string landscape, require almost zero overhead; that’s why they are there: no work had to be done to “trim away” branches…
nm

I know I run into a mental block when trying to consider more than one or two particles going back and forth in time and interacting… if one thing is clear to me, it is that this is NOT “less overhead than the infinite multiverses theory”.
I think I once asked here about something like a grandfather-type paradox involving 3 or more “particles”, so to speak. IIRC I didn’t get much input (or even any).
The time travel thing brings up an interesting question. Let’s say we eventually really refine our knowledge of physics in the distant future. We are pretty darn sure we know how things actually work. And IN THEORY we know we can build a time machine and that it WILL fuck up causality.
But the machine is totally unbuildable in any even remotely practical sense.
What would/do we think about that? Do we go “meh” or do we spend sleepless nights worrying about the implications?

The time travel thing brings up an interesting question. Let’s say we eventually really refine our knowledge of physics in the distant future. We are pretty darn sure we know how things actually work. And IN THEORY we know we can build a time machine and that it WILL fuck up causality.
But the machine is totally unbuildable in any even remotely practical sense.
What would/do we think about that? Do we go “meh” or do we spend sleepless nights worrying about the implications?
If how things actually work allows for the possibility of fucking up causality, then causality is already fucked up. It isn’t like the universe depends on us not pushing the “destroy causality” button.

If how things actually work allows for the possibility of fucking up causality, then causality is already fucked up. It isn’t like the universe depends on us not pushing the “destroy causality” button.
You could appeal to a form of superdeterminism… in other words the initial conditions of all the particles in the universe are the way they are because they ensure that causality will not be violated.

You could appeal to a form of superdeterminism… in other words the initial conditions of all the particles in the universe are the way they are because they ensure that causality will not be violated.
But if God wanted to fuck with things, he could, and still “obey the rules of the game.”

I’ve always thought that this basic idea could be the fundamental explanation of Quantum Mechanics, i.e. that QM is nothing other than an effective description of an extremely complicated tangle of closed time-like loops, where perhaps every particle is an extremal black hole whose horizon leads to different times/branches, and where the laws of physics result from consistency requirements. The basic idea of “QM from GR” was pursued by Einstein for a long time, but eventually, especially after Bell’s theorem, people seemed to give up. Given that closed time-like loops are non-local, I don’t think Bell’s theorem applies, so I’ve always been curious why there wasn’t a larger interest in the idea. I think the main problem might be its sheer difficulty. I know I run into a mental block when trying to consider more than one or two particles going back and forth in time and interacting… if one thing is clear to me, it is that this is NOT “less overhead than the infinite multiverses theory”. Most multiverse theories, such as the string landscape, require almost zero overhead; that’s why they are there: no work had to be done to “trim away” branches…
Yeah. There was theoretical work on a “toy model” of time travel, just to see whether consistent outcomes were possible. Instead of finding that no consistent solutions exist when time travel is allowed (which would suggest time travel is impossible), or that exactly one solution exists, they found that an infinity of solutions is possible (here’s the study, by the way: Billiard balls in wormhole spacetimes with closed timelike curves: Classical theory - CaltechAUTHORS). That makes me think that time travel at some scale is a way to create quantum uncertainty in a classical scenario.
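That “many consistent solutions instead of none” outcome is easy to mimic with a toy self-consistency check. This is my own made-up discrete stand-in, not the billiard-ball model itself: demand that whatever gets sent back around the loop come back equal to itself, and count the fixed points.
[code]
# Toy stand-in for Novikov-style self-consistency (not the actual billiard-ball model):
# a single bit goes around a closed time-like loop, the loop applies some rule to it,
# and only histories where the bit comes back equal to itself are "allowed".

def consistent_histories(rule):
    """Return the bit values that survive the self-consistency requirement."""
    return [b for b in (0, 1) if rule(b) == b]

# Rule 1: the loop flips the bit (the "kill your grandfather" rule).
# No fixed points: no consistent history at all, i.e. a genuine paradox.
print(consistent_histories(lambda b: 1 - b))   # -> []

# Rule 2: the loop leaves the bit alone (a harmless round trip).
# Two fixed points: more than one consistent history, and the rule
# alone can't tell you which one actually happens.
print(consistent_histories(lambda b: b))       # -> [0, 1]
[/code]
The second case is the analogue of the paper’s result: consistency doesn’t forbid the loop, it just leaves the outcome underdetermined.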
Very interesting…
I only skimmed the thread, so forgive me if this link has already been posted. Warp Drives
Came to post the same thing.
So how do warp bubbles avoid backward causation? (Or do they?)

If how things actually work allows for the possibility of fucking up causality, then causality is already fucked up. It isn’t like the universe depends on us not pushing the “destroy causality” button.
That is how we could build a kind of Improbability Drive. Just arrange things such that if my ship doesn’t quantum teleport to Alpha Centauri in the next ten seconds, the time machine will screw with causality.

Came to post the same thing.
So how do warp bubbles avoid backward causation? (Or do they?)
They can’t, not completely. In certain limited situations a wormhole could avoid backward causation, but for a warp bubble I’m reasonably sure that you will always be able to find a frame of reference in which causality would be breached.
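Here’s the standard arithmetic behind that, with made-up numbers (a signal covering ground at 10c in one frame, viewed from a frame moving at 0.5c the same way):
[code]
# Why any faster-than-light signal can be made to arrive before it left:
# Lorentz-transform the departure and arrival events of a signal moving at
# u = 10c in frame S into a frame S' moving at v = 0.5c along the same axis.
# (Illustrative numbers only; the effect kicks in whenever v > c**2/u.)
import math

c = 1.0          # work in units where c = 1
u = 10.0         # signal speed in frame S (superluminal)
v = 0.5          # speed of frame S' relative to S

gamma = 1.0 / math.sqrt(1.0 - v**2)

# Event A: signal sent at (t=0, x=0). Event B: received at (t=1, x=u*1).
t_A, x_A = 0.0, 0.0
t_B, x_B = 1.0, u * 1.0

# Standard Lorentz transformation: t' = gamma * (t - v*x / c**2)
t_A_prime = gamma * (t_A - v * x_A / c**2)
t_B_prime = gamma * (t_B - v * x_B / c**2)

print(f"in S : sent at t={t_A}, received at t={t_B}")
print(f"in S': sent at t'={t_A_prime:.2f}, received at t'={t_B_prime:.2f}")
# t_B' < t_A': in S' the signal is received *before* it is sent.
[/code]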

I’m not sure I’m understanding this correctly. I get that the idea is that a tiny difference in relative motion, magnified by millions of light-years, can lead to a significant difference in simultaneity. But this would apparently mean that two observers on opposite sides of the Earth, one rotating toward a distant galaxy and one rotating away, could disagree by days on the timing of an observable event such as a supernova or the light curve of a variable quasar. I’m sure I would have heard if that were the case.
Sorry about the zombie thread, but I recently came across a good answer to this question that I posed.
The answer is that both observers here on Earth, though in relative motion, agree (within the limits of Special Relativity) that they both saw the light of the supernova at the same time. The real disagreement is over exactly how far away each one would measure the distant galaxy to be. If you draw the classic world-line diagram, the light is arriving “now” for both, but with potentially light-days of difference in how far the light had to travel.
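To put some rough numbers on the scale of the effect (these are entirely my own illustrative assumptions): the first-order disagreement about what counts as “now” at a distant galaxy goes like v·D/c², while the moment the light actually hits your eye is one shared local event.
[code]
# First-order relativity-of-simultaneity shift for a distant event:
# observers in relative motion at speed v disagree about the time coordinate
# of an event at distance D by roughly delta_t ~ v * D / c**2.
# The numbers below are rough assumptions, just to get a feel for the scale.
c = 299_792_458.0                   # m/s
year_s = 365.25 * 24 * 3600.0       # seconds per year
D = 2.5e6 * c * year_s              # ~2.5 million light-years, in metres (assumed)

for label, v in [("walking pace (~1 m/s)", 1.0),
                 ("opposite sides of the rotating Earth (~930 m/s)", 930.0)]:
    delta_t_years = v * D / c**2 / year_s
    print(f"{label}: 'now' at the galaxy shifts by ~{delta_t_years:.3g} years "
          f"(~{delta_t_years * 365.25:.1f} days)")

# Big shifts in what counts as "now" out at the galaxy, but the *arrival* of the
# light at Earth is a single local event, so both observers still see it together.
[/code]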