Antimatter-matter equality questions

It’s often cited as one of the key unknowns of cosmology / particle physics: Why does our universe have so much matter, and so little antimatter, when all (?) particle transformations are symmetric with respect to matter and antimatter?

However, various science popularizers treat this a bit differently so I am left a bit confused.

Some, like Sabine Hossenfelder (on one of her better days) say there is no real mystery here, because there is nothing about the symmetry of particle creation that entails anything about what the initial state of our universe should be.

Others present the theory that for every billion or so antimatter particles, there would be 1 billion + 1 matter particles, and everything we see comes from that slight imbalance. Where did this specific number come from? And of course, that energy would not disappear, so is this part of how this number was arrived at?

And can we rule out that matter and antimatter were just not mixed? So there are two huge volumes with a bunch of explosions between the two outside of our visible range? Or can we already rule that out from what we see in the CMB?

The plain fact is that physicists spend a lot of time worrying over your question. Just today, I came across this from Scientific American online:

Unfortunately, it may be behind a paywall. Here is a relevant paragraph:
" Today physicists at the Large Hadron Collider (LHC)’s LHCb experiment published a paper in the journal Nature announcing that they’ve measured CP violation for the first time in baryons—the class of particles that includes the protons and neutrons inside atoms. Baryons are all built from triplets of even smaller particles called quarks. Previous experiments dating back to 1964 had seen CP violation in meson particles, which unlike baryons are made of a quark-antiquark pair. In the new experiment, scientists observed that baryons made of an up quark, a down quark and one of their more exotic cousins called a beauty quark decay more often than baryons made of the antimatter versions of those same three quarks."

The article goes on to say that these asymmetries are not enough to solve the problem. And in this experiment, the matter decays faster than the antimatter, which is the wrong direction. Still they will go on looking for other asymmetries.

Yay, I’m not the only one who still uses “Truth” and “Beauty”!

The transformations aren’t purely symmetric, but the asymmetries seen so far are not anywhere near large enough to generate what is observed cosmologically.

You’d have to cite a specific rant of hers for me to comment further here about what she might be click-baiting about.

Some past discussion here and here. I’ll pull out some primary points:

Annihilation products. The boundaries between matter and antimatter regions are not completely devoid of material and should show evidence of matter-antimatter annihilation. Such annihilation products are not seen, and based on the distributions of matter in the universe and constraints on matter densities in the intergalactic medium, current data is sufficient to exclude the presence of matter/antimatter boundaries in the observable universe.

Cosmic rays. Our little pocket of space is bombarded by cosmic rays. These come from numerous processes at solar system, galactic, and intergalactic scales. Most of these cosmic rays are boring things like electrons and protons, but antimatter can also appear. Barring new physics, this antimatter comes from collisions of ordinary cosmic rays with other regular matter in the universe or in violent processes like supernovae and gamma ray bursts. The more complicated the antimatter gets, though, the harder it is to make using mundane physics. Antiprotons? Not that hard to make. Antideuterons? Sure. Antihelium-3? Pretty rare. Antihelium-4? Very rare. Experiments look for unexplained excesses of these light anti-nuclei, as evidence perhaps of large pockets of antimatter out there.

Early universe dynamics. If you try to separate matter and antimatter into distinct regions, you need a mechanism to do that. There is a great corpus of cosmological data that gives insight into the dynamics of (anti)matter in the early universe. Cosmic microwave background radiation, large-scale structure (e.g., superclusters, filaments), and more have well-measured and highly detailed patterns that match exceedingly well what is expected if the random, entropic processes in the early universe, together with the structure-forming influence of gravity, proceeded as we understand them. In other words, there is no evidence that anything other than normal large-scale dynamics was present. One specific conclusion is that, if there were antimatter domains, they would have to still be commingled with matter domains at least until a time well after “recombination” (the birth of the CMB photons), and they would be quite noticeable then.

Baryon/photon ratio and abundances of light elements. These closely related quantities, measurable through cosmological observations, point to two key conclusions: there is a massive deficit of baryons relative to photons in the universe, indicating significant early (anti)baryon annihilation, and there was not “late”-time antimatter present to mess up the balance of primordial light elements, for which observations match expectations quite well.

It is essentially the baryon-to-photon ratio mentioned above, which connects to your intuition that the energy doesn’t disappear even if the baryons mostly annihilate away. But to be sure: while that’s a fairly direct line to the numerical imbalance, the imbalance is consistent with a host of observations.
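To make the “billion and one” picture concrete, here is a toy back-of-the-envelope sketch in Python. The values of N and k are purely illustrative, and in reality the photon count is measured from the thermal CMB bath rather than tallied from annihilations, but the arithmetic shows how a one-in-a-billion excess maps onto the observed baryon-to-photon ratio of a few times 10^-10:

```python
# Toy arithmetic for the "billion and one" picture. Numbers are illustrative,
# not a cosmological calculation: start with N antibaryons and N + k baryons
# in some comoving volume, annihilate every antibaryon with a baryon (each
# annihilation yields ~2 photons), and compare survivors to the photon bath.

N = 1_000_000_000   # antibaryons (illustrative)
k = 1               # the slight matter excess ("billion + 1")

baryons_left = (N + k) - N   # baryons surviving annihilation
photons = 2 * N              # ~2 photons per baryon-antibaryon annihilation

ratio = baryons_left / photons
print(f"surviving baryons per photon ~ {ratio:.1e}")  # ~ 5e-10
```

That lands in the same ballpark as the measured baryon-to-photon ratio (~6 × 10^-10), which is where the popularizers’ “one in a billion” number comes from.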

The LHCb result did not find a new source of matter-antimatter asymmetry. Rather, they observed the already predicted effect for the first time in these specific particles. It’s an experimental tour de force, but the “not enough to solve the problem” part is the exact same “not enough” as before (i.e., it’s the known asymmetry that stems from how quarks see the so-called “weak” force.)

The next place to keep your eyes peeled for new information is likely the neutrino sector, wherein a matter-antimatter asymmetry could be present in a relatively large amount but just hasn’t been measured yet. In turn, this asymmetry could connect to the early universe given how neutrino masses might be generated. Current and near-term experiments are looking for this potential new source of matter-antimatter asymmetry.

I am by no means a fan of hers (and holy shit, I see she has just uploaded a video titled “Why physicists are afraid of Eric Weinstein […]”).
In fact, let’s forget I mentioned her. I think I’ve seen other science popularizers also say they see no reason why we would assume matter and antimatter to be balanced at the start of the universe.

However, thinking about it, it’s the business of science to keep extending our explanatory power. There’s no reason for us to stop at “The ratio just is what it is”. We should keep looking, especially while the ratio sits so close to the near-perfect symmetry of particle transformations.

I was probably unclear: I mean, whether the CMB can allow us to rule out a boundary outside of the observable universe. I am aware of the various reasons there can’t be a boundary within the observable universe.

@Pasta addressed that, too, but to summarize: That would be an extremely large-scale structure to the Universe, and we have no indication that the Universe is capable of having structure that large-scale.

I’ve never heard of those. Are these alternate names for the top and bottom quark? If so, I approve, and would support the official names being changed :grinning_face:.

Yup. I think they were the original names for t and b. But regardless of which came first, they’re clearly superior.

The problem with “Truth” and “Beauty” as names for quarks is that in Greece they often confused one for the other…

In an infinite universe, structures of any size may arise merely by chance. We could be inside a bubble of matter-dominated space-time that is much larger than the observable universe, and which is one of an infinite number of similar fluctuations of various sizes.

I mean, we can’t rule it out, but it seems highly unlikely.

The CMB and other windows into the early universe (e.g., light isotope abundances spanning many orders of magnitude) all point to a need for antimatter-matter asymmetry local to our observable universe already. That is: the problem isn’t that there is matter here and maybe antimatter somewhere else. If you try that, then there isn’t nearly enough matter here. You need the imbalance to play out in our observable universe itself. So, “hiding” the antimatter outside the observable universe doesn’t solve the problem and just makes new ones.

As above.

In the 1970s, some European groups put forth “truth” and “beauty” while the rest of the world was adopting “top” and “bottom”. The latter basically won out, but you still see the whimsical versions from time to time. (Recalling the other quark names – up, down, charm, strange – it becomes a matter of whether you want to match the theme of the “up/down” naming or the “charm/strange” naming.)

And for a while, when only three quarks were known, there were some who referred to the s quark as “sideways”.

Nope; I can’t accept that. If we are embedded in a chance bubble of matter-dominated space-time, that bubble may be arbitrarily large; the visible universe is only 45.7 billion light-years in radius; the bubble of matter-dominated spacetime we might be embedded in may be trillions or quadrillions of light years wide, and we would never see any evidence from outside that volume. Infinity is big.

In order to explain the matter-antimatter asymmetry without resorting to chance, we need a mechanism of some sort that favours one over the other everywhere, and we don’t have that.

We do have such mechanisms. The antimatter-matter asymmetry well-measured in quarks is qualitatively the mechanism you need, even if quantitatively it can’t be the full story. Moreover, similar mechanisms are allowed by theory (and being sought experimentally) in leptons without going outside the Standard Model, and any theory that you try to write down that could be relevant in the high energies of the early universe has such mechanisms in it natively.

For random bubbles like you envision, the size distribution would be ludicrously biased to smaller sizes, so the probability is ludicrously high that, if we were in a bubble, it would be evidently so at distances large enough not to affect anthropic considerations but much, much smaller than the observable universe. But the latter is seen to be immaculately homogeneous. In other words, “infinity allows all” is fine to play around with, but that infinity shows up in numerators and denominators, so it doesn’t provide an out.

(As an analogy: Infinite monkeys will eventually write the play “Hamlet”, but they will also write the play “Steve” – the same play with a differently named titular character – around 10^700 times in the process. If all of those are anthropically viable universes, you don’t get to just pick the impossibly unlikely version out of the hat as the one we’re living in.)

Yes, you are probably right. Not definitely, but probably, and that’s all we have to go on, until the full story becomes clear (if it ever does).

Good thing the first two quarks weren’t the ones named “top” and “bottom”, or else the s quark would naturally be labeled “switch”.

Since a couple of posts touched on whether there are known processes that treat matter and antimatter differently, I wanted to add a little more on that.

Such processes exist in the Standard Model and have a rich experimental history. The matter-antimatter symmetry-violating processes that have been observed so far all stem from a single aspect of the Standard Model, which I’ll describe here.

There are three families of quarks: the “up/down” pair, the “charm/strange” pair, and the “top/bottom” pair. I’ll use the standard symbols u, d, c, s, t, b.

For almost all purposes, these pairings are fine. However, the weak force (one of the fundamental forces of nature) can choose different pairings, and it in fact does so. For the weak force, u doesn’t pair with d, nor c with s, nor t with b. Instead, there is a quantum mechanical mixing that jumbles things up in a set way. (Note: this freedom in the pairing is a natural piece of the quantum mechanics involved. It’s not something overly bespoke just tacked on at the end.)

So, the weak force sees each of u, c, and t paired with superpositions of d, s, and b. It’s a tiny mixing, though. For the u case, the paired “partner quark” in the weak force (call it dw instead of d) is a 94.9%, 5.1%, and 0.0014% superposition of d, s, and b. So, the weak force’s native u-dw pairing is mostly the same as u-d, but not purely so.

Fundamentally it’s not the percentages that are specified in the theory but rather a different set of quantities that involve complex numbers (i.e., numbers with real and imaginary parts.) For instance, the u-b part of the pairing probabilities above (0.0014%) is actually the square of the magnitude of the so-called u-b “matrix element” – the more fundamental quantity – which has the complex value 0.00135 + 0.00348i.
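As a quick sanity check (a Python sketch using only the approximate values quoted above, not precision CKM fits), squaring the magnitude of that complex matrix element does reproduce the 0.0014% pairing probability, and the three probabilities sum to roughly 100%:

```python
# Check that the quoted pairing probability is the squared magnitude of the
# quoted complex matrix element. Values are the approximate ones from the
# text, not precision fits.

V_ub = complex(0.00135, 0.00348)   # the u-b "matrix element" quoted above

prob = abs(V_ub) ** 2              # probability that u's weak partner "is" b
print(f"{prob:.4%}")               # ~ 0.0014%

# The three probabilities for u's weak partner (d, s, b) should sum to ~1:
probs = [0.949, 0.051, prob]
print(sum(probs))                  # ~ 1.0
```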

These (complex) matrix elements are the things that actually show up in calculations of particle interaction rates or decay rates or whatnot. And when you switch between matter and antimatter cases, these complex numbers get “conjugated”, meaning i is changed to -i everywhere.

In most cases, this conjugation doesn’t do anything significant because at the end of the calculation you take the magnitude (squared) of everything, and complex numbers have the same magnitude before or after conjugation. That is, a+ib and a-ib both have squared magnitude a^2+b^2.

But, if a particle decay or interaction of interest can proceed via two different underlying mechanisms, then the calculation requires first adding two independent products of complex numbers (one from each mechanism) and then getting the magnitude of that sum. The complex conjugation (i → -i) for the matter-to-antimatter switchover applies only to the matrix element pieces of each term in the sum, so the magnitude of the sum is changed in general when switching between matter and antimatter in such cases.
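A toy numerical sketch (Python, with made-up amplitude values – only the structure matters, not the numbers) shows both halves of this: conjugating the matrix element of a single-mechanism amplitude leaves its magnitude unchanged, while conjugating only the matrix-element factors in a two-mechanism sum changes the resulting rate:

```python
# Toy demonstration of how a complex phase makes matter and antimatter rates
# differ, but only when two mechanisms interfere. All numbers are made up.

# Each mechanism's amplitude = (matrix element) x (other dynamical factor).
ckm1, dyn1 = complex(1.0, 0.0), complex(0.8, 0.1)
ckm2, dyn2 = complex(0.2, 0.3), complex(0.5, -0.4)

def rate(antimatter: bool) -> float:
    """Conjugate only the matrix-element factors, sum amplitudes, THEN square."""
    c1 = ckm1.conjugate() if antimatter else ckm1
    c2 = ckm2.conjugate() if antimatter else ckm2
    return abs(c1 * dyn1 + c2 * dyn2) ** 2

matter, antimatter = rate(False), rate(True)
print(matter, antimatter)   # two-mechanism rates differ

# With only ONE mechanism, conjugation changes nothing, since
# |z*w| = |conj(z)*w|:
single = abs(ckm2 * dyn2) ** 2
single_bar = abs(ckm2.conjugate() * dyn2) ** 2
print(abs(single - single_bar) < 1e-12)   # True
```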

To summarize:

  • The quarks are paired up, but…
  • The weak interaction chooses a different set of pairings, and…
  • The mixing up of the pairings involves complex numbers, and…
  • Matter and antimatter treat those complex numbers differently (i.e., conjugated), and…
  • Particle decays or interactions that can proceed at a significant level via two (or more) mechanisms will “leak” those differences into the resulting probabilities, and thus…
  • Matter and antimatter behave differently.

Due to mathematical constraints (like the probabilities summing to 1, for instance), there is actually only a single numerical value in the Standard Model that encodes all the imaginary aspects of the matrix elements. This single source of asymmetry manifests itself in many observable processes. It was originally observed in decays of particles called “kaons” but has since been measured in lots of other places – and so far always in line with the Standard Model prediction and always due to just this one underlying source of asymmetry. The article linked upthread by @Hari_Seldon is a recent example.
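For the curious: that single value is often packaged as the so-called Jarlskog invariant J, which in the standard (Wolfenstein) parametrization of the quark mixing matrix is roughly A²λ⁶η̄. A quick estimate with commonly quoted approximate parameter values (ballpark figures, not a precision fit) shows just how small it is:

```python
# The "single numerical value" encoding the imaginary aspects of the quark
# mixing matrix is often packaged as the Jarlskog invariant J. In the
# Wolfenstein parametrization, J ~ A^2 * lambda^6 * eta_bar. The parameter
# values below are rough, commonly quoted approximations.

lam, A, eta_bar = 0.225, 0.82, 0.35

J = A**2 * lam**6 * eta_bar
print(f"J ~ {J:.1e}")   # a few times 10^-5
```

That smallness (a few parts in 10^5) is one way to see why this known source of asymmetry falls so far short of what cosmology needs.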

There are other places in the Standard Model from which truly distinct sources of matter-antimatter asymmetry could stem. One relates to the strong force, but it has not been observed despite very precise measurements. This apparent lack of symmetry violation in the strong force is a big unsolved problem, with interesting possible solutions that even connect to dark matter.

The other place where a new source of matter-antimatter asymmetry could live is in neutrinos. Here the constraints are extremely poor still, so a lot remains to be learned experimentally.