A few Universe/Big Bang questions

Wow, thank you to all the members, and especially to Chronos for a thorough and clear explanation.

Following Chronos’ answer:

Yes, I know of the Hubble constant. My thought was that the “constant” isn’t really constant: it was large during inflation and grew smaller after inflation (with the expansion only speeding up again much later via dark energy). So you can’t just run the constant backwards to get a precise time zero. If we account for all of this in the MAP data and our models, then I’m sure we can get close on the age.
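Just to check my own understanding, here is a rough Python sketch (my own illustration, with approximate flat Lambda-CDM numbers rather than any exact fit): the naive “Hubble time” 1/H0 is not the same as the age you get by integrating the actual expansion history backwards.

# Rough sketch: the age of the universe from the expansion history, not just 1/H0.
# Assumes a flat Lambda-CDM model; H0, Omega_m, Omega_L below are approximate,
# illustrative values, not exact fitted parameters.
import numpy as np
from scipy.integrate import quad

H0 = 67.7                        # Hubble constant today, km/s/Mpc
Omega_m, Omega_L = 0.31, 0.69    # matter and dark-energy fractions (flat universe)

# Convert H0 to 1/Gyr (1 Mpc ~ 3.086e19 km, 1 Gyr ~ 3.156e16 s)
H0_per_Gyr = H0 / 3.086e19 * 3.156e16

def H(a):
    # Hubble parameter at scale factor a, in 1/Gyr (matter + Lambda only)
    return H0_per_Gyr * np.sqrt(Omega_m / a**3 + Omega_L)

# Age = integral of da / (a * H(a)) from the big bang (a -> 0) to today (a = 1)
age, _ = quad(lambda a: 1.0 / (a * H(a)), 1e-10, 1.0)

print(f"Naive Hubble time 1/H0: {1.0 / H0_per_Gyr:.1f} Gyr")   # roughly 14.4 Gyr
print(f"Integrated age:         {age:.1f} Gyr")                # roughly 13.8 Gyr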

As other members pointed out too, I am envisioning an exploding singularity as the big bang. This is why I can’t get past the “infinite” part. I have no problem with space being infinite; it’s our own universe [the matter in it], which originated from the big bang, that shouldn’t be infinitely dispersed. It should have an end somewhere - just outside our own universe. It seems as though the answer here is that the size of the universe is infinite at any time after time 0. Like the size is (Infinity * time). Bad analogy since Infinity is not a number, I know.

On the explanation of the video game and the sides: isn’t that the flat vs. curved universe argument? I thought we were pretty sure we lived in a flat universe.

On my last question, now that I read Chronos’ reply, my REAL original question was “How do we know how much baryonic matter we have?” - I just lacked the vocabulary. My thought was that there could be a bunch of “dark” baryonic matter in the form of dust, planets, black holes, etc. in galaxies that takes the place of, and eliminates the need for, exotic dark matter that we cannot find. This seems to me to be Occam’s Razor vs. exotic matter that we can’t detect. At least we can detect neutrinos. So how do we know our models of the amount of baryonic matter are accurate?

I thought the Wikipedia article linked by Asymptomatically Fat was nice in that it addressed a lot of your questions, in addition to my CMB question. Here is a repeat link to the start of the article.

Well, we never know for sure whether any model is right. But with the model of big bang nucleosynthesis, it helps a lot that the model is overdetermined but consistent. That is to say, just knowing, say, the ratio of helium-3 to deuterium in the Universe would in principle be enough to give a figure for the amount of baryonic matter. But we also have the abundances of various lithium and beryllium isotopes, which can also tell us the amount of baryonic matter, and all of these ratios point to the same amount of baryonic matter (to within reasonable error). If we really do have enough baryonic matter to account for the dark matter, then something else not only threw off all of those measurements, but threw them all off in a way that exactly mimics there being less baryonic matter, and now we again find ourselves forced to posit the existence of something we don’t understand. Might as well just posit the nonbaryonic dark matter at that point.
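To put rough numbers on what those isotope measurements buy you (a quick Python sketch; the ~273.9 conversion factor and the baryon-to-photon ratio below are approximate figures from the literature, used here purely for illustration):

# Rough sketch of the BBN bookkeeping: the measured light-element abundances pin
# down the baryon-to-photon ratio eta, which converts to a baryon density.
# All numbers are approximate and for illustration only.

eta10 = 6.1              # baryon-to-photon ratio in units of 1e-10 (roughly the BBN/CMB value)

# Standard approximate conversion between eta and the baryon density parameter
omega_b = eta10 / 273.9  # Omega_b * h^2
h = 0.68                 # dimensionless Hubble parameter (assumed)
Omega_b = omega_b / h**2

print(f"Omega_b h^2 ~ {omega_b:.4f}")   # ~0.022
print(f"Omega_b     ~ {Omega_b:.3f}")   # ~0.05: baryons are only ~5% of the critical density,
                                        # far short of the ~30% total matter inferred elsewhere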

There are three aspects to the answer:
(1) We just don’t see the extra baryonic matter we would need.
(2) Non-baryonic dark matter explains, in one go, a lot of very different experimental data, while baryonic dark matter doesn’t.
(3) Non-baryonic dark matter isn’t that exotic a thing to introduce.

Each in turn…
(1) We just don’t see the extra baryonic matter we would need.
We know a lot about how baryons behave, so even if there were a dark excess, we could still look for those baryons in other ways. Larger chunks like neutron stars or brown dwarfs would leave gravitational microlensing signatures. Gas clouds can be measured via absorption spectra (e.g., the Lyman-alpha forest). Clouds of matter would also interact in measurable ways during galaxy cluster collisions. These tools have all been used to infer the baryonic matter density directly, and it all agrees with the standard (i.e., cold dark matter) cosmological picture.
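To give a flavor of the microlensing argument (a back-of-the-envelope Python sketch of my own, not the surveys’ actual analysis; numbers are rough):

# If the Milky Way's dark halo were made entirely of compact baryonic objects
# (MACHOs: dead stars, black holes, brown dwarfs...), the microlensing optical
# depth toward the LMC would be of order (v_circ / c)^2. Rough numbers only.

v_circ = 220e3    # Galactic circular velocity, m/s (approximate)
c = 3.0e8         # speed of light, m/s

tau_full_macho_halo = (v_circ / c) ** 2
print(f"Optical depth for an all-MACHO halo: ~{tau_full_macho_halo:.0e}")   # ~5e-7

# i.e. at any given moment roughly one in a couple of million LMC stars would be
# measurably lensed. The surveys saw far fewer events than that, so compact
# baryonic objects can make up at most a small fraction of the halo.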

(2) Non-baryonic dark matter explains, in one go, a lot of very different experimental data, while baryonic dark matter doesn’t.
This is sort of the indirect version of item (1). The observed ratios of light isotopes in the universe, large-scale galaxy structure formation, and the CMB power spectrum all agree beautifully with a CDM picture and horribly with a purely baryonic dark matter picture. For one example of this, here’s the Planck CMB power spectrum, with both the data (points) and the best Lambda-CDM prediction (curve) shown. You have to be doing something right to explain all those wiggles. In fact, those wiggles are due precisely to the acoustic oscillations of baryons and photons interacting with one another in the presence of whatever dark matter is around. The data and the model agree spectacularly, and if you try to remove dark matter from the picture – even just a little – the agreement falls apart.
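If you want to see this for yourself, here is a rough sketch using the publicly available camb Python package (the parameter values are approximate, Planck-like choices of mine, not a published best fit): compute the predicted temperature spectrum once with a realistic cold dark matter density and once with almost none, and compare.

# Sketch: predicted CMB temperature power spectra with and without cold dark
# matter, using the CAMB Boltzmann code (pip install camb). Parameter values
# are approximate, illustrative choices.
import camb

def tt_spectrum(omch2):
    pars = camb.CAMBparams()
    pars.set_cosmology(H0=67.5, ombh2=0.0224, omch2=omch2)  # baryons fixed, CDM varied
    pars.InitPower.set_params(As=2.1e-9, ns=0.965)
    pars.set_for_lmax(2000, lens_potential_accuracy=0)
    results = camb.get_results(pars)
    powers = results.get_cmb_power_spectra(pars, CMB_unit='muK')
    return powers['total'][:, 0]   # TT spectrum, D_ell in muK^2, indexed by ell

with_cdm = tt_spectrum(0.120)      # roughly the Planck-era CDM density
without_cdm = tt_spectrum(0.005)   # token amount of CDM, essentially baryons only

# Near the first acoustic peak (ell ~ 220) the two predictions differ wildly;
# overlaying the Planck data points would show the low-CDM curve missing them badly.
print(with_cdm[220], without_cdm[220])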

(3) Non-baryonic dark matter isn’t that exotic a thing to introduce.
In the standard model of particle physics, we’ve got particles that participate in all three of the strong, weak, and electromagnetic forces. We’ve got particles that participate in only two of them. We’ve got particles that participate in only one. We’ve got three families of quarks and three families of leptons, for some reason. We’ve got dozens of parameters that have to be set by hand. We’ve got masses that inexplicably span 12 orders of magnitude. Adding a new particle that interacts only weakly just isn’t so crazy, and doing so can even solve some open problems in particle physics.

So, it’s the inclusion of non-baryonic dark matter that satisfies Occam. Without it, you’d have to add layers of hacks to make everything work. This is in part because the inferred density of dark matter is so much larger than the inferred density of baryonic matter. Rather than dark matter being a subtle addition needed to make things line up a little better, it’s a dominant part of the story, and the story works well across many types of observations.
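For a sense of scale (approximate Planck-era density parameters, my own illustrative numbers):

# Approximate density parameters (Omega * h^2), for illustration only
omega_b = 0.022   # baryons
omega_c = 0.120   # cold dark matter

print(f"dark matter / baryons     ~ {omega_c / omega_b:.1f}")              # ~5.5
print(f"baryon fraction of matter ~ {omega_b / (omega_b + omega_c):.2f}")  # ~0.15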

It seems like every time a particle physicist proposes some new particle to solve some problem or another in particle physics, they also include a section at the end of the paper pointing out how this new particle could account for the cosmological dark matter. If even a small fraction of these proposals were true, we’d have the opposite dark matter problem: There’s not enough to account for all of the new particles the particle physicists want.

Just to pick up a few points:

General relativity tells you how the scale factor (and hence the Hubble parameter) evolves. FWIW, in a Universe with a Hubble parameter that is constant in time, there is no big bang singularity and the spacetime extends infinitely into the past (this is called de Sitter spacetime).
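A minimal worked version of that statement, assuming $H$ is exactly constant:

\[
\frac{\dot a}{a} = H \quad\Longrightarrow\quad a(t) = a_0\, e^{H (t - t_0)},
\]

which is strictly positive for every finite $t$, so the scale factor never reaches zero no matter how far back in time you go.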

The key point is to realize the singularity itself isn’t part of the description, and we can’t assign it physical properties such as volume. If you were to try to extend the definition of volume to the singularity itself, you could find reasons for assigning it zero volume and reasons for assigning it infinite volume, but it’s moot.

What Hari Seldon and Chronos are talking about is the spatial topology of the Universe rather than its geometry (i.e. whether it is positively curved, negatively curved or flat). However, the conditions that the Universe be both spatially homogeneous and isotropic mean that the geometry also fixes the topology. Hari Seldon describes a homogeneously flat and compact (i.e. finite-volume) topology that would be described as exotic, as it lacks strict isotropy.
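A concrete example of such a flat, compact topology (my own illustration, assuming it’s the sort of construction being described): take ordinary flat space and identify points under

\[
(x,y,z) \sim (x+L,\,y,\,z),\qquad (x,y,z) \sim (x,\,y+L,\,z),\qquad (x,y,z) \sim (x,\,y,\,z+L),
\]

which gives a flat 3-torus of finite volume $L^3$. The geometry is flat everywhere, but the three identification directions are globally singled out, which is why strict isotropy is lost.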

As I said earlier, baryonic dark matter doesn’t fit with the evidence, which Pasta goes into in more detail above.

While it’s true that nontrivial topologies are not quite isotropic, they can still be isotropic enough. In any event, though, if the Universe does have a nontrivial topology, it appears to be on scales larger than we have the capability of observing.