It’s just floating-point error. Adding the tiny energy of a CMB photon to the enormous energy of the cosmic ray just yields the energy of the cosmic ray, because there aren’t enough least-significant bits to represent the tiny difference. The universe can’t abide an actual violation of conservation of energy, so the interaction doesn’t happen and both the CMB photon and the particle go on their merry ways.
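Tongue-in-cheek as the above is, the arithmetic actually checks out in IEEE-754 double precision. A quick sketch (the specific energies are assumptions for illustration: an OMG-scale ~3.2×10^20 eV proton and a typical ~6×10^-4 eV CMB photon):

```python
import math

# In double precision (~16 significant digits), a CMB photon's energy
# is far below one ULP of a UHECR's energy, so the sum rounds away.
E_cr = 3.2e20     # illustrative UHECR proton energy, eV
E_cmb = 6.3e-4    # illustrative CMB photon energy, eV (~2.7 K blackbody)

total = E_cr + E_cmb
print(total == E_cr)       # True: the photon's energy vanishes entirely

# The spacing between representable doubles at this magnitude:
print(math.ulp(E_cr))      # 65536.0 eV, about 10^8 times the photon energy
```

So in double precision the "conservation violation" would indeed be silently rounded away, some eight orders of magnitude below the representable resolution.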
Even if the Universe were a simulation, doing the collision math in the locally-comoving cosmological frame is doing it the hard way. You’d first transform to the zero-momentum frame of the collision, and neither the coordinate transformation nor the calculation of the collision in that frame requires particularly high precision.
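To sketch why the invariant-frame calculation never needs heroic precision: the physics is governed by the Lorentz-invariant s = (p_p + p_γ)², which for a head-on collision expands to m_p² + 2E_γ(E_p + p_p), so no step ever adds a tiny energy to a huge one. A rough back-of-the-envelope (the energies and the exactly head-on geometry are illustrative assumptions):

```python
# Head-on proton–CMB-photon collision, energies in eV.
m_p   = 938.272e6   # proton rest mass-energy
E_p   = 3.2e20      # illustrative UHECR proton energy; p_p ≈ E_p here
E_cmb = 6.3e-4      # illustrative head-on CMB photon energy

# Invariant mass squared: s = m_p^2 + 2*E_cmb*(E_p + p_p) ≈ m_p^2 + 4*E_cmb*E_p.
# Both terms are comparable in size, so no catastrophic cancellation occurs.
s = m_p**2 + 2 * E_cmb * (2 * E_p)
sqrt_s = s**0.5

# GZK photopion threshold: sqrt(s) must exceed m_p + m_pi.
m_pi = 139.570e6
print(sqrt_s / 1e9)           # ~1.30 GeV in the zero-momentum frame
print((m_p + m_pi) / 1e9)     # ~1.08 GeV threshold: comfortably exceeded
```

The two terms in s are within a factor of a few of each other, so ordinary double precision handles the whole thing without breaking a sweat.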
Though some folks have posited that the existence of super-GZK cosmic rays is evidence that the Universe is not actually completely frame-agnostic.
Setting aside the issue that the “simulation hypothesis” is at best pseudoscience, with the exception of UHECRs the ‘normal’ cosmic ray flux drops off in energy at just the threshold predicted by the GZK limit.
Stranger
My understanding is that the GZK limit is based on a calculation that assumes the particle in question is a proton. If it were a heavier particle, it would be able to have more energy. Wikipedia says that we don’t know what the OMG particle was, but “assumes” it was a proton. I think that’s a very poor assumption.
Right. I was joking of course, but ultimately “floating-point error” is one possible form of Lorentz violation. And most of the Lorentz-violating theories could be characterized as the universe having some type of granularity to it, which could be seen as a precision thing. I wonder if any of them take an information-based approach; i.e. that you can only pack so many bits into so small a space, and that this might limit the types of interactions that can happen.
What’s not pseudoscience, though, is that the behavior of the universe is already known to have strong ties to information theory (e.g., the Bekenstein bound). It would not be too surprising (and does not require a simulation hypothesis) if other interactions might be information-based in nature, or that physically interesting quantities might all be discrete rather than continuous, and that larger systems might have to limit something along the lines of “bit density.” All of which could cause some form of Lorentz violation.
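For concreteness, the Bekenstein bound mentioned above caps the information in a region at I ≤ 2πRE/(ħc ln 2) bits for a sphere of radius R containing energy E. A quick sketch with illustrative inputs (the 1 kg / 1 m example is my own, not from the thread):

```python
import math

HBAR = 1.054571817e-34   # reduced Planck constant, J*s
C    = 2.99792458e8      # speed of light, m/s

def bekenstein_bits(radius_m, energy_j):
    """Upper bound on bits storable in a sphere of given radius and energy."""
    return 2 * math.pi * radius_m * energy_j / (HBAR * C * math.log(2))

# Example: 1 kg of mass-energy confined within a 1 m radius.
E = 1.0 * C**2
print(f"{bekenstein_bits(1.0, E):.2e}")  # ~2.6e43 bits
```

That’s a finite (if absurdly large) ceiling on “bit density,” which is the sense in which the bound ties physics to information theory without any simulation baggage.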
A gap in the GZK spectrum would be predicted by “precision issues,” though. With smaller energy differences, there is enough precision and the GZK limit works as expected. But beyond a certain further limit, interactions get suppressed and some particles slip through.
Or, as glowacks said, maybe it’s just heavy ions.
That is true, but an HZE ion at that kinetic energy would experience giant resonance and photodisintegration at about the same threshold due to interaction with the CMB. Heavy ions are a component of cosmic rays, but most are solar particles, with only a tiny fraction among galactic cosmic rays (GCRs). I don’t think that extragalactic heavy ions are a good hypothesis for the OMG or Amaterasu particles.
Stranger
Oh, I think there is reason to believe that some version of a mathematical universe hypothesis is a fundamental basis for reality (although I think Tegmark is wrong in speculating that the mathematical structure is based upon computable functions, or at least ones computable without our universe), and that information thermodynamics may actually delineate fundamental interactions and the ‘resolution’ at which information, and thus particle interactions, can be resolved. In fact, I’d go so far as to say that this is essentially axiomatic if we’re assuming that the fabric of reality has a mathematical structure. But I don’t think this is a likely explanation for UHECR; I suspect we’ll eventually find some explanation that is both more physically prosaic and more scientifically interesting, and perhaps gives insight into the nature of supposed ‘dark matter’. If there are “bit density”-type issues with high energy interactions, I think they would only be seen much closer to the Planck scale.
Stranger
I fail to understand why you would think that. A proton is the most stable hadron and much more stable than a heavier atomic nucleus. The estimate for OMG was, according to that article, about as close to c as a non-photon could get. A proton crossing many millions of light-years is much more likely to survive those ridiculous speeds than nearly anything else.
Agreed, mostly. As energetic as the OMG particle was, it’s still far from the Planck scale. Although perhaps not as far as one would naively think: it’s “only” about a factor of 10^8 away. So it’s possible that one might start to see anomalies here. Probably not enough to get around the GZK limit, but who knows. If UHECRs are produced at a prodigious enough rate, perhaps anomalies in the effective cross-section could allow a few particles to slip through.
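Sanity-checking that “factor of 10^8” with the standard numbers (Planck energy ~1.22×10^19 GeV, OMG particle ~3.2×10^20 eV):

```python
# Order-of-magnitude check: how far is the OMG particle from the Planck scale?
E_PLANCK = 1.22e28   # Planck energy in eV (~1.22e19 GeV)
E_OMG    = 3.2e20    # estimated OMG particle energy in eV

ratio = E_PLANCK / E_OMG
print(f"{ratio:.2e}")  # ~3.8e7, i.e. within an order of magnitude of 10^8
```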
Then again, maybe the GZK limit doesn’t apply at all due to the source being local, and instead it’s a cosmic topological defect whipping around and colliding with itself, or something worse…