Yes, thanks for pointing that out, I forgot about this. Not a good problem. There are consistency bounds on the maximum number of generations, but for all I know 4 or 5 generations would be okay theoretically; it’s just in conflict with experiment.
Sabine’s view is that very few questions about “why is this number what it is” in physics are good ones (maybe none at all). They’re just numbers you determine by experiment and plug into your theory.
I’m not totally sure I agree, but she has a point that the record for these sorts of investigations isn’t so great.
Okay, so something I don’t understand then. One of the candidates for dark matter is WIMPs, Weakly Interacting Massive Particles.
Technically, a neutrino is a weakly interacting massive particle. So, if dark matter does turn out to be a WIMP, how would that be different from a heavier generation of neutrino?
If a neutrino is heavy enough, in the 100 GeV range, that it doesn’t undergo oscillation, then it wouldn’t have any effect on the neutrinos that we do detect. It would just be a particle that only interacts with gravity and the weak force.
How do we know that? Can we tell from the cosmic microwave background? If you have any article suggestions on that I’d be more than happy to read them. I appreciate your patience explaining stuff to me over the years, like ‘why aren’t GR and QM compatible mathematically’.
Sabine is great. I don’t agree with everything she says, but she has a very contrarian take toward the current state of physics, and is worth listening to.
I’ve enjoyed her iconoclastic view on the current state of particle physics but I’m starting to wonder if she’s taking empiricism too far. If you dismiss all questions about physics by saying “there’s no guarantee there’s anything deeper to be inferred, maybe it just happens to be that way” then there wouldn’t have been any such thing as science to begin with.
I’ve been listening to a podcast called Daniel and Jorge Explain the Universe. Daniel Whiteson is a particle physicist at CERN. In the episode about neutrinos he explains that the difference is that neutrinos interact with both gravity and the weak nuclear force, while dark matter interacts only with gravity. They both ignore the strong nuclear force and electromagnetism.
Well, maybe? One could certainly have something that interacts with the weak nuclear force in some way, but not electromagnetism, that would be more or less consistent with what we know of dark matter. But not all neutral weakly-interacting particles are neutrinos.
This is from the wiki article on dark matter. Any thoughts?
"Although the scientific community generally accepts dark matter’s existence,[16] some astrophysicists, intrigued by specific observations that are not well-explained by ordinary dark matter, argue for various modifications of the standard laws of general relativity. These include modified Newtonian dynamics, tensor–vector–scalar gravity, or entropic gravity. These models attempt to account for all observations without invoking supplemental non-baryonic matter."
True, but I think her take is a little more pragmatic than that, and it would be unfair to take it to the extreme logical conclusion. I think she’s complaining more about the common approach of:
1. The Standard Model is wrong because it doesn’t [explain dark matter, account for quantum gravity, predict the particle masses, explain the difference between the cosmological constant and the vacuum energy, etc.].
2. Come up with an extension to the Standard Model that predicts particle X.
3. Build a particle X detector.
4. There is no particle X.
5. Repeat ad infinitum.

If the process actually gave results, there would be no problem with it, but it just doesn’t seem to be working. Therefore they should try something else.
Wikipedia is correct to say that some astrophysicists are pursuing other explanations. None of them have really been able to make anything work, though, and I would argue that Occam’s razor is against them as well: The existence of a type of particle that doesn’t interact electromagnetically is really a very small conceptual leap, and if such a particle existed, of course it would be very difficult for us to detect. In fact, I would find it more remarkable if the sorts of particles we can easily detect were the only ones that exist.
I will grant that the MOND proponents have found some interesting patterns in galactic structure that are not predicted by standard dark matter theory, but they also aren’t contradicted by it, either. Probably what it means is that there’s some emergent phenomenon in distributions of dark matter that we haven’t yet explained.
I guess that’s where my question comes from: shouldn’t they be?
We define particles by their properties, by which forces they interact with. One particle interacts only with the electromagnetic field and gravity, and we call that a photon. One particle interacts only with the weak field, the electromagnetic field, and gravity, and we call that a lepton. One particle interacts with the strong field, the weak field, the electromagnetic field, and gravity, and we call that a quark.
So, the particle that interacts with the weak field and gravity is what we call a neutrino. If we find another particle that interacts only through the weak force and gravity, wouldn’t that be a neutrino as well?
Huh, that brings up another question. Do we know that there are no quarks that don’t carry a charge? A particle that interacts with only the weak and strong forces? Would they still be a quark? If they did exist, would we be able to detect them? And are they considered to be a candidate for dark matter?
No, neutrinos are leptons in spite of not having an electric charge. Without them, lepton number would not be conserved. For example, beta decay produces an electron or positron together with an antineutrino or neutrino carrying the opposite lepton number. It was in fact such conservation, along with conservation of mass/energy, spin, etc., that caused neutrinos to be postulated decades before they were experimentally observed.
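The bookkeeping above can be sketched as a toy check. The lepton-number assignments below are the standard ones; the particle names and the code itself are just illustrative:

```python
# Toy lepton-number bookkeeping for beta-minus decay: n -> p + e- + anti-nu_e.
# Standard assignments: leptons carry +1, antileptons -1, baryons 0.
LEPTON_NUMBER = {
    "proton": 0,
    "neutron": 0,
    "electron": +1,
    "positron": -1,
    "neutrino": +1,
    "antineutrino": -1,
}

def total_lepton_number(particles):
    """Sum the lepton numbers of a list of particle names."""
    return sum(LEPTON_NUMBER[p] for p in particles)

before = total_lepton_number(["neutron"])
after = total_lepton_number(["proton", "electron", "antineutrino"])
assert before == after == 0  # conserved only because the antineutrino is there

# Without the antineutrino, conservation would fail:
broken = total_lepton_number(["proton", "electron"])
print(before, after, broken)  # 0 0 1
```

Running the same check without the antineutrino in the final state gives a total of 1, which is exactly the mismatch that led Pauli to postulate the particle.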
I should not have used “lepton” as shorthand for electron, muon, and tau, but there is no single-word term that refers to those three particles. It was lazy and incorrect of me.
In supersymmetric models, a candidate WIMP would be the lightest superpartner of the Standard Model particles, which would be a neutralino. As for the difference to the neutrino, particles are identified by their quantum numbers. One of those is the lepton number, which for neutrinos is 1, for a neutralino (not being a lepton) 0.
If particles were identified solely by which interactions they’re subject to, then the three charged leptons would all be the same particle. But they’re not.
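As a sketch of why interaction content alone underdetermines identity, here is a toy grouping. The particle-to-force table is a simplification I’m assuming for illustration (it ignores quantum numbers like mass, lepton number, and flavor, which is exactly the point):

```python
from collections import defaultdict

# Label each particle only by the set of interactions it feels.
INTERACTIONS = {
    "electron": frozenset({"electromagnetic", "weak", "gravity"}),
    "muon":     frozenset({"electromagnetic", "weak", "gravity"}),
    "tau":      frozenset({"electromagnetic", "weak", "gravity"}),
    "neutrino": frozenset({"weak", "gravity"}),
    "photon":   frozenset({"electromagnetic", "gravity"}),
}

# Group particles by their interaction set.
by_forces = defaultdict(list)
for particle, forces in INTERACTIONS.items():
    by_forces[forces].append(particle)

# The three charged leptons collapse into one bucket, so interaction
# content alone is not enough to identify a particle:
print(by_forces[frozenset({"electromagnetic", "weak", "gravity"})])
# ['electron', 'muon', 'tau']
```

The same collapse would happen for a neutrino and any hypothetical non-neutrino that happened to feel only the weak force and gravity, which is why quantum numbers (lepton number, etc.) are part of the definition.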
I’m late to the party, but I’ll sprinkle in some contributions below.
Indeed, it’s not a prediction but rather part of how the SM is constructed given the observations. However, the number three has at least a germ of an explanation via the anthropic principle. In particular, a universe that followed the SM’s general rules (i.e., same fundamental fields and gauge symmetries) but that only had one or two generations of quarks and leptons could not have the discrete symmetry violations required to generate a matter-dominated universe. Stated more casually: if there were only two or one generations, the early universe couldn’t have led to an excess of matter over antimatter and we wouldn’t be here to observe the universe. Since we are here, we must have at least three generations, and entropic arguments could be made that a three-generation universe is the most probable observable number to have come about (versus higher numbers).
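One concrete piece of that argument is standard parameter counting: an N-generation CKM-like mixing matrix has (N−1)(N−2)/2 physical CP-violating phases after unphysical quark phases are absorbed, so a complex phase (and hence this kind of CP violation) first appears at N = 3. A quick sketch:

```python
# Standard parameter counting for an N-generation CKM-like mixing matrix:
# after absorbing unphysical quark phases, there remain
#   N*(N-1)/2 mixing angles and (N-1)*(N-2)/2 CP-violating phases.
def mixing_angles(n):
    return n * (n - 1) // 2

def cp_phases(n):
    return (n - 1) * (n - 2) // 2

for n in (1, 2, 3, 4):
    print(n, "generations:", mixing_angles(n), "angles,", cp_phases(n), "CP phases")
# With 1 or 2 generations there are zero CP phases; the first complex
# phase (the CKM phase) only appears once there are 3 generations.
```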
There are also models beyond the SM that lead to hard mathematical inconsistencies if there are fewer than three generations. Of course, those extended models may or may not be relevant to our actual universe.
Sort of. These limits apply to any matter that stays relativistic much longer than everything else before eventually becoming non-relativistic once the universe has cooled enough. The number of such particles and their total mass leave a strong imprint on large-scale galactic structure and cosmic microwave background patterns. But non-neutrinos can in principle contribute here, and sufficiently heavy neutrinos would not, so it isn’t actually a constraint on neutrinos as such. Indeed, it’s all kinematic – the constraints don’t care about a neutrino as a neutrino, just as a chunk of four-momentum. However, if we take the Standard Model as the starting point, then neutrinos are the only candidate particles for this piece of the cosmological picture, so one usually talks about these as neutrino constraints. And they are, but the measurements relate a priori only to the light neutrinos (including sterile ones).
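As a rough order-of-magnitude sketch of that kinematics (the numbers below are approximate assumptions, not from the thread): a thermal-relic neutrino goes non-relativistic roughly when its mean momentum, about 3.15 times its temperature, drops below its mass, and temperatures redshift as (1 + z):

```python
# Rough kinematics only: the neutrino background temperature today is
# about 1.95 K ~ 1.68e-4 eV, and the mean thermal momentum is ~3.15 * T.
T_NU_TODAY_EV = 1.68e-4  # ~1.95 K expressed in eV (approximate)

def z_nonrelativistic(mass_ev):
    """Approximate redshift at which a relic of the given mass goes non-relativistic."""
    return mass_ev / (3.15 * T_NU_TODAY_EV) - 1.0

# A ~0.1 eV neutrino goes non-relativistic at z of a few hundred,
# late enough to imprint on structure formation and the CMB:
print(round(z_nonrelativistic(0.1)))

# A 100 GeV particle goes non-relativistic absurdly early, i.e. it is
# effectively cold for the whole observable history:
print(f"{z_nonrelativistic(100e9):.1e}")
```

This is why the cosmological bounds bite on light neutrinos but say essentially nothing about a hypothetical very heavy one.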
The SM’s weak interaction – the one connected to the Z and W bosons – needn’t be the only weak interaction at play. Dark matter could be weakly interacting via a completely separate interaction. But even if not, a new particle that did interact via the SM’s weak interaction needn’t behave like a neutrino (for instance, does it live in a so-called weak doublet with a so-far-unknown charged lepton partner, or is it a singlet doing its own thing? If the latter, it wouldn’t be walking and talking like a neutrino).
In short: we don’t define “neutrino” based on just that one thing. Other aspects of the SM’s structure would need to also be right to have a neutrino.
To be sure, the process is eliminating allowed parameter space for the dark matter, starting with either (1) the most well-motivated possibilities or (2) the easiest to get at experimentally. Unfortunately, our puny mortal capabilities have only allowed us to scan a relatively small piece of allowed parameters for dark matter, so it’s not really “not working” as much as it is “a long road”.
Electrically neutral quarks would cause all sorts of problems in light of experimental data unless they were exceedingly massive and also brought along other baggage to ensure a self-consistent model. There definitely aren’t neutral quarks at masses comparable to the known ones. Note that the strong force is, well, strong, so taking away the electrical charge doesn’t help this hypothetical particle hide very well!
Granted, but if the rate of chipping away at the parameter space is very low, this may not be the most efficient use of physicist time. Ideally, one would launch a research project with the goal of learning how to chip away at the space more efficiently. But I don’t know if that’s a reasonable thing to ask.
Everyone in the field is always trying to figure out how to chip away at the parameter space more efficiently. What little success we’ve had at that thus far is the little bites we’ve been taking. It’s just plain a really hard problem to solve.
I think Sabine would disagree with that. Instead, a great deal of time is spent on predicting novel particles and then trying to detect them. That’s not a terrible idea in and of itself. But given the lack of success, maybe it’s worth taking a step back and spending time trying to figure out how big the parameter space even is. And, especially, stop bothering with extensions that predict numbers that may simply be empirical.
Consider, say, the Axion Dark Matter Experiment:
The axion was invented to solve the strong CP problem, which isn’t actually a problem at all: just a question about fine-tuning. So what if it’s fine-tuned? It doesn’t cause a contradiction in the theory.
But then, the axion was suggested to be a dark matter candidate, which is a real, open problem. But that’s a very roundabout way of going about things. Why not try to work out the most likely candidates directly, instead of looking for leftovers from an unrelated theory that tries to solve a non-problem?
Well, the experiment hasn’t found anything. I guess that means they’re narrowing the parameter space. But only because they increased the parameter space when they proposed the axion to begin with.