Unobservably tiny departures from physical constants

Inspired by the “do neutrinos have mass” thread. My question is: which physical constants have to have their presumed exact values, because even the tiniest departure would have observable consequences, and which could be off by some infinitesimally tiny amount? Take the famous assumption of Relativity that there is exactly zero variance in the measured speed of light: could it be that objects in relative motion do show a difference in the speed of light by something like one divided by Graham’s Number, so that in principle there are preferred reference frames? Or would that contradict all the observations that have confirmed Relativity’s predictions?

I don’t think there’s anything in physics where we can truly rule out a slight deviation from the theoretical value. The closest I can think of would be things that are inherently an integer value, but even there, my Particle Data Booklet lists the number of quark generations as 3 ± some very small number.

And no, I don’t know why they list it that way, rather than as a (very high) confidence level that it’s not 4.

2 + 2 = 5 for very large values of 2

Mainly I was wondering: if there were any variance at all in the measured speed of light, and therefore non-equivalence of reference frames, would that mean there shouldn’t be any such thing as relativistic “mass” (momentum), foreshortening, mass-energy equivalence, etc. at all, or would it just mean that all those things would be off by some infinitesimally tiny amount?

There are two constants related to electricity that appear in the formula for the speed of light, c = 1/√(ε₀μ₀): the permittivity (ε₀) and permeability (μ₀) of the vacuum. Prior to the redefinition of the SI units to rest solely on constants of nature rather than physical artifacts, the latter was effectively defined to be 4 * pi in a certain unit system. That was simply because of how the ampere was defined, which required a non-physical (infinitely long) setup to realize precisely, such that when you did the math on that setup, you got a permeability of 4 * pi in units related to the setup.

After the redefinition, where an ampere is defined as a coulomb per second and a coulomb is a specific number of elementary charges, the permeability comes out fairly close to the old value, but slightly off from it; I’d have thought they’d have defined things such that it wouldn’t change to any significant digit. However, it now inherits the experimental error bars of other constants that have to be measured rather than defined outright, so its best-known value can sit a little off the previous value, and can shift as our experiments get more accurate.
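To put rough numbers on that: in the revised SI, h, e, and c are fixed exactly, the fine-structure constant α is measured, and the permeability falls out as μ₀ = 2αh/(e²c), inheriting α’s error bar. A quick sketch in Python (plugging in the CODATA 2018 value of α, so treat the trailing digits as illustrative):

```python
import math

# Exact defining constants of the revised SI (post-redefinition)
h = 6.62607015e-34        # Planck constant, J*s (exact by definition)
e = 1.602176634e-19       # elementary charge, C (exact by definition)
c = 299_792_458.0         # speed of light, m/s (exact by definition)

# Fine-structure constant: measured, so it carries the error bar
alpha = 7.2973525693e-3   # CODATA 2018 value, illustrative

# Vacuum permeability is now a derived, measured quantity: mu0 = 2*alpha*h/(e^2*c)
mu0_new = 2 * alpha * h / (e**2 * c)
mu0_old = 4 * math.pi * 1e-7          # the old exact definition, in N/A^2

print(mu0_new)                        # ~1.2566370621e-06
print(mu0_old)                        # ~1.2566370614e-06
print((mu0_new - mu0_old) / mu0_old)  # about 5 parts in 10^10
```

So it sits about half a part per billion above the old defined 4π × 10⁻⁷, and that offset comes entirely from where the measured α happens to land.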

Wait, when did this happen? I know there was a push to redefine the kilogram, but I don’t think that’s even finished yet, and I hadn’t heard about any of the other base units being changed.

I don’t recall the PDG publishing quark counts that way (what year’s PDG are you using?), but they do report the number of light neutrinos that way. That’s because the nominally integer number can be inferred from various continuous quantities, most precisely from the invisible partial width of the Z boson (which is related to its lifetime, sort of), but also from cosmological data. It’s rather heartening when such inferred values, treated as continuous quantities, land almost exactly on an integer.
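For the curious, the arithmetic behind the neutrino count is basically a ratio of two widths: the measured invisible width of the Z divided by the Standard Model width for a single νν̄ channel. A toy sketch with ballpark LEP-era numbers (illustrative only, not the current PDG fit):

```python
# Counting light neutrino species from the Z boson's invisible width.
# Ballpark LEP-era numbers, purely for illustration (not the current PDG fit).

gamma_invisible = 499.0   # MeV: measured width for Z decaying to nothing visible
gamma_one_nu    = 167.2   # MeV: Standard Model width for a single nu-nubar channel

n_nu = gamma_invisible / gamma_one_nu
print(n_nu)   # ~2.98, a continuous measurement landing almost exactly on the integer 3
```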

For quarks, limits on additional generations are usually reported either as mass limits on a 4th generation or as tests of the unitarity of the quark mixing matrix.

But, yeah, anything that isn’t fundamentally counting something always has a loophole for making a tiny nudge, and lots of people spend their careers looking for evidence of those nudges.

Physicists (at least this one) have a name for one over Graham’s Number. It is called “Zero”. Mathematicians wouldn’t like that, but it is so tiny that it cannot make any conceivable difference in the real world.
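Ordinary double-precision arithmetic agrees, for what it’s worth: anything much below about 5 × 10⁻³²⁴ already rounds to exactly zero, and one over Graham’s Number is unimaginably far below that. A toy check in Python:

```python
import sys

# Smallest positive numbers a 64-bit float can hold
print(sys.float_info.min)   # ~2.2e-308, smallest "normal" double
print(5e-324)               # roughly the smallest subnormal double

# Anything smaller silently underflows to exactly zero
print(1e-400 == 0.0)        # True
# One over Graham's Number is unimaginably smaller still, so in any
# floating-point computation it really is indistinguishable from zero.
```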

IANA expert by any means.

But just on a logical basis ISTM there’d be no reason for all the relativistic voodoo to completely stop happening just because of a smidgen of natural variability in the 33rd decimal place.

The fact that the math works even though we have error bars on our knowledge of the underlying constants strongly suggests that the math would work for any actual constant value within those error bars. And it would work even for a constant that wasn’t in fact constant, but varied within those error bars, or some small sub-range of them.
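That’s easy to sanity-check numerically. Here’s a toy sketch: nudge c by a purely hypothetical fractional amount and watch what happens to the Lorentz factor. The relativistic effects don’t switch off; they just shift by something of the same order as the nudge.

```python
import math

def gamma(v, c):
    """Lorentz factor 1/sqrt(1 - v^2/c^2)."""
    return 1.0 / math.sqrt(1.0 - (v / c) ** 2)

c   = 299_792_458.0   # m/s, the defined value
eps = 1e-12           # hypothetical fractional wobble in c, purely illustrative
v   = 0.9 * c         # comfortably relativistic

g0 = gamma(v, c)
g1 = gamma(v, c * (1.0 + eps))

print(g0)              # ~2.294: time dilation, length contraction, etc. as usual
print((g1 - g0) / g0)  # ~ -4e-12: the effects shift by a whisper, they don't vanish
```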

I think this may not be resolvable to a factual answer, because the inexorable march of science keeps turning up more reasons for physical constants to have specific relationships to one another.

It might be that all the physical constants have to have the values they actually do, and our attempts to make our estimates more precise and uncover more relationships could converge on a set of exact relationships and an understanding that it couldn’t have been otherwise. In fact this has always been my default guess (and I’m actually a physicist, albeit a pretty mediocre one). In this case the answer would become “all physical constants have to have the exact values they do, though our estimates of those values vary a bit from where we put them in 2023”. But we’re far enough from figuring this out that for me it can only be conjecture.

But then of course there’s the multiverse question: perhaps we just happen to be living in a universe where they have those particular values.

I don’t know how seriously real physicists take the multiverse idea… I’m just an interested spectator.
It feels like a cop-out to me, but I guess it may be just the way reality actually is?

Sometime in the late 90s. And I don’t think I’ve gotten it out in a decade, so I’m not sure I could even find it to double-check. I might be misremembering the neutrino one.

There are many different multiverse ideas, and how seriously physicists take them varies likewise.

Years ago it was said about cosmology that there are two schools of thought: that the number of theories about the ultimate fate of the universe will expand forever, and that the number of theories will eventually start to contract again.

Maybe something similar is going on here.

Or… somewhere.

That’s meta enough that my head might asplode. :exploding_head:

I guess I was mostly thinking about the string/brane flavor, which last I heard suggested about 10^500 universes.

Of course we are still waiting for a testable prediction from that school of thought, as far as I know…?

10^500 is about the number of distinct models that you can get out of strings, but there’s no implication that all of those models in any sense “actually exist”.