If everything was hot, dense, and super close-together at the Big Bang, what kept us from collapsing into a singularity?
The Big Bang is one of the most counterintuitive ideas out there. If you think about taking all the matter and energy in the Universe, and starting it off in a tiny region of space, doesn’t it seem rather unlikely that it would expand at the exact rate needed to give us the Universe we see today? Wouldn’t it be far more likely to simply collapse, gravitationally, into the densest type of object the Universe can contain: a black hole? Clearly, that didn’t happen. But understanding why that didn’t happen might just be one of the most profound questions you can ask to help make sense of the Universe we inhabit.
If you knew, from first principles, what the laws of physics were everywhere and at all times in our Universe, that still wouldn’t be enough for you to come up with the prediction that the Universe as we see it ought to exist. Because while the laws of physics set the rules for how a system evolves over time, it still needs a set of initial conditions to get started. Somehow, the way that the fabric of the Universe was expanding at the earliest moments we can conceive of balanced out this tendency of the matter and energy to gravitate and collapse. In order to see how this all works, let’s go back to the birth of our most successful theory of gravity — general relativity — some 100 years ago.
Prior to Einstein, Newton’s Law of Universal Gravitation was the accepted theory of gravity. It described all of the gravitational phenomena in the Universe, from the acceleration of masses on Earth to the orbits of moons around planets to the planets themselves revolving around the Sun. Objects exerted equal-and-opposite gravitational forces on one another, they accelerated in inverse proportion to their mass, and the force obeyed an inverse-square law. By the time the 1900s rolled around, the theory had been incredibly well-tested, with thousands upon thousands of successes to its credit and almost no exceptions.
But to the astute and those who paid great attention to detail, there were a couple of problems:
- At very fast speeds — that is, at speeds approaching the speed of light — Newton’s ideas about absolute space and absolute time didn’t hold anymore. Radioactive particles lived longer, distances contracted, and “mass” didn’t appear to be the fundamental source of gravitation: that honor looked like it went to energy, of which mass is only one form.
- In the strongest gravitational fields — which, among our Solar System’s planets, means Mercury, the planet orbiting closest to the Sun — the Newtonian prediction for the gravitational behavior of objects is slightly but noticeably off from what we observe. It’s as though, when you get very close to a very massive source, there’s an extra attractive force that Newtonian gravity doesn’t account for.
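The particle-lifetime effect mentioned above is easy to quantify: a clock moving at speed v runs slow by the Lorentz factor γ = 1/√(1 − v²/c²). Here is a minimal sketch; the muon’s lifetime and speed are standard textbook values used for illustration:

```python
import math

def lorentz_gamma(v_frac_c):
    """Time-dilation factor for a particle moving at v = v_frac_c * c."""
    return 1.0 / math.sqrt(1.0 - v_frac_c**2)

# A muon at rest decays with a mean lifetime of about 2.2 microseconds.
MUON_LIFETIME_US = 2.2

# A cosmic-ray muon moving at 99.5% of the speed of light lives ~10x longer
# as measured in the lab frame, which is why so many survive to reach the ground.
gamma = lorentz_gamma(0.995)
dilated = gamma * MUON_LIFETIME_US
print(f"gamma = {gamma:.1f}, lab-frame lifetime = {dilated:.1f} microseconds")
```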
In the aftermath of this, there were two developments that paved the way for a new theory to supersede Newton’s brilliant, but centuries-old, conception of how the Universe worked.
The first major development was that space and time, previously treated as a separate three-dimensional space and a linear quantity of time, were united in a mathematical framework that created a four-dimensional “spacetime.” This was accomplished by Hermann Minkowski, who declared in his famous 1908 lecture:
The views of space and time which I wish to lay before you have sprung from the soil of experimental physics, and therein lies their strength. […] Henceforth space by itself, and time by itself, are doomed to fade away into mere shadows, and only a kind of union of the two will preserve an independent reality.
This worked only for flat, Euclidean space, but the idea was incredibly powerful mathematically, as it led to all the laws of special relativity as an inevitable consequence. When this idea of spacetime was applied to the problem of Mercury’s orbit, the Newtonian prediction under this new framework came a little closer to the observed value, but still fell short.
But the second development came from Einstein himself, and it was the idea that spacetime was not flat at all, but was curved. And the very thing that determined the curvature of spacetime was the presence of energy in all of its forms, including mass. Published in 1915, Einstein’s framework was incredibly difficult to calculate in, but presented scientists everywhere with the tremendous potential to model physical systems to a new level of accuracy and precision.
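That central idea, matter and energy telling spacetime how to curve, is captured compactly in Einstein’s field equations, shown here schematically:

```latex
G_{\mu\nu} \equiv R_{\mu\nu} - \tfrac{1}{2} R \, g_{\mu\nu} = \frac{8\pi G}{c^{4}} T_{\mu\nu}
```

The left-hand side encodes the curvature of spacetime; the right-hand side, the stress-energy tensor, encodes the matter and energy present within it.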
Minkowski’s spacetime corresponded to an empty Universe, or a Universe with no energy or matter of any type.
Einstein was able to find a solution for a Universe with one single, solitary point mass in it, valid so long as you remained outside of that point. It reduced to the Newtonian prediction at great distances, but predicted a stronger attraction at closer distances. These results not only agreed with the observations of Mercury’s orbit that Newtonian gravity failed to predict, but also yielded new predictions about the deflection of starlight that would be visible during a total solar eclipse, predictions that were later confirmed during the solar eclipse of 1919.
But there was another solution — a surprising and interesting one — that came out just weeks after Einstein published his general theory of relativity. Karl Schwarzschild had worked out further details of what happens to a configuration with a single, solitary point mass of arbitrary magnitude, and what he found was remarkable:
- At large distances, Einstein’s solution held, reducing to Newton’s results in the far-field limit.
- But very close to the mass — at a very specific distance (R = 2GM/c², or simply R = 2M in natural units) — you reach a point where nothing can escape from it: an event horizon.
- Moreover, inside that event horizon, everything that enters inevitably collapses towards a central singularity, which is unavoidable as a consequence of Einstein’s theory.
- And finally, any initial configuration of stationary, pressureless dust (i.e., matter that has zero initial velocity and does not interact with itself), regardless of the shape or density distribution, will inevitably collapse down to a stationary black hole.
This solution — the Schwarzschild metric — was the first complete, non-trivial solution to general relativity ever discovered.
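For a sense of scale, that event-horizon radius, R = 2GM/c² in conventional units, is easy to evaluate. A minimal sketch, using approximate textbook values for the constants:

```python
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8        # speed of light, m/s

def schwarzschild_radius(mass_kg):
    """Event-horizon radius R = 2GM/c^2 (the 'R = 2M' of natural units)."""
    return 2.0 * G * mass_kg / C**2

M_SUN = 1.989e30     # kg
M_EARTH = 5.972e24   # kg

print(f"Sun:   {schwarzschild_radius(M_SUN):.0f} m")           # ~2.95 km
print(f"Earth: {schwarzschild_radius(M_EARTH) * 1000:.1f} mm")  # ~8.9 mm
```

In other words, the Sun would need to be compressed into a sphere under 3 km in radius to become a black hole, and the Earth into one smaller than a marble.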
So, with that in mind, what about the hot, dense, early Universe, where all the matter-and-energy presently strewn across some 92 billion light-years worth of space was contained in a volume of space no bigger than our own Solar System?
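To see why this question has teeth, consider a rough, illustrative calculation; the mass figure below is an order-of-magnitude assumption, not a precise measurement. The Schwarzschild radius corresponding to all the matter in the observable Universe utterly dwarfs a Solar-System-sized region, so a static configuration of that density would sit deep inside its own would-be event horizon:

```python
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8            # speed of light, m/s
LIGHT_YEAR = 9.461e15  # m

# Rough, illustrative figure for the mass of all matter in the observable
# Universe (ordinary plus dark matter); estimates vary, so treat this as
# an order-of-magnitude assumption.
M_UNIVERSE = 1.5e53    # kg

r_s = 2.0 * G * M_UNIVERSE / C**2   # Schwarzschild radius of that mass
SOLAR_SYSTEM_RADIUS = 1.5e13        # m, very roughly, for comparison

print(f"Schwarzschild radius of that mass: {r_s / LIGHT_YEAR:.1e} light-years")
print(f"Ratio to a Solar-System-sized radius: {r_s / SOLAR_SYSTEM_RADIUS:.0e}")
```

The horizon radius comes out to tens of billions of light-years, some thirteen orders of magnitude larger than the Solar System. Static intuition says such a configuration must collapse; the resolution lies in the fact that the early Universe was anything but static.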
The thing you must wrap your mind around is that, much like Minkowski’s spacetime, Schwarzschild’s solution is a static one, meaning that the metric of space does not evolve as time progresses. But there are plenty of other solutions — de Sitter space, for one, and the Friedmann-Lemaître-Robertson-Walker metric, for another — that describe spacetimes that either expand or contract.
If we had started off with the matter-and-energy our Universe had in the early stages of the Big Bang, and didn’t have a rapidly expanding Universe, but a static one instead, and one where none of the particles had pressure or a non-zero velocity, all of that energy would have formed a Schwarzschild black hole in extremely short order: practically instantaneously. But general relativity has another important caveat in it: not only does the presence of matter and energy determine the curvature of your spacetime, but the properties and evolution of everything in your space determines the evolution of that spacetime itself!
What’s most remarkable about this is that we know, from the moment of the Big Bang onwards, that our Universe only seems to have three possible options, dependent on the matter-and-energy present within it and the initial expansion rate:
- The expansion rate could have been insufficiently large for the amount of matter-and-energy present within it, meaning that the Universe would have expanded for a (likely brief) time, reached a maximum size, and then recollapsed. It’s incorrect to say that it would collapse into a black hole (although this is a tempting thought), because space itself would collapse along with all the matter-and-energy, giving rise to a singularity known as the Big Crunch.
- On the other hand, the expansion rate could have been too large for the amount of matter-and-energy present within it. In this case, all the matter and energy would be driven apart at a rate too rapid for gravitation to ever bring the components of the Universe back together, and in most models, the Universe would expand too quickly to ever form galaxies, planets, stars, or even atoms or atomic nuclei! A Universe where the expansion rate was too great for the amount of matter-and-energy contained within it would be a desolate, empty place indeed.
- Finally, there’s the “Goldilocks” case, or the case where the Universe is right on the bubble between recollapsing (which it would do if it had just one more proton) and expanding into oblivion (which it would do if it had one fewer proton), and instead just asymptotes to a state where the expansion rate drops to zero, but never quite turns around to recollapse.
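These three fates can be illustrated with a toy, matter-only Friedmann model, a sketch that ignores dark energy and radiation and works in units where today’s scale factor and expansion rate are both 1. Setting the expansion rate to zero in the Friedmann equation picks out the turnaround point, if one exists:

```python
def turnaround_scale_factor(omega_m):
    """Toy matter-only Friedmann model: a_dot^2 = H0^2 * (omega_m/a + 1 - omega_m).

    Setting a_dot = 0 gives the maximum scale factor a = omega_m/(omega_m - 1)
    when omega_m > 1 (recollapse). Returns None when expansion never stops
    (omega_m < 1), including the critical case omega_m == 1, where the
    expansion rate only asymptotes to zero.
    """
    if omega_m <= 1.0:
        return None
    return omega_m / (omega_m - 1.0)

for omega in (0.5, 1.0, 1.01, 2.0):
    a_max = turnaround_scale_factor(omega)
    fate = "expands forever" if a_max is None else f"recollapses after reaching a = {a_max:g}"
    print(f"Omega = {omega:>4}: {fate}")
```

Note how sharply the outcome depends on the density: at Omega = 1.01, just one percent over critical, the Universe still turns around, merely 101 times larger than today.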
As it turns out, we live almost in the Goldilocks case, with just a tiny bit of dark energy thrown in the mix, making the expansion rate just slightly larger, and meaning that eventually all the matter that isn’t gravitationally bound together already will be driven apart into the abyss of deep space.
What’s remarkable is the degree of fine-tuning required: for the Universe’s expansion rate and matter-and-energy density to match well enough that we neither recollapsed almost immediately nor failed to form even the basic building-blocks of matter, they had to balance to something like one part in 10²⁴. That’s akin to taking two human beings, counting the number of electrons in each, and finding the counts identical to within a single electron. In fact, if we go back to a time when the Universe was just one nanosecond old, we can quantify how finely-tuned the density and the expansion rate needed to be.
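That quantification can be sketched with a rough back-of-envelope calculation, using the standard textbook scalings for how any deviation from the critical density grows over time; the epoch times below are approximate, round-number assumptions:

```python
# In a radiation-dominated universe, |Omega - 1| grows roughly in proportion
# to t; in a matter-dominated one, roughly as t^(2/3). (Standard scalings.)

T_START = 1e-9       # s: one nanosecond after the Big Bang
T_EQUALITY = 1.6e12  # s: approximate matter-radiation equality (~50,000 years)
T_TODAY = 4.3e17     # s: approximate age of the Universe (~13.8 billion years)

growth_radiation = T_EQUALITY / T_START                  # growth during radiation era
growth_matter = (T_TODAY / T_EQUALITY) ** (2.0 / 3.0)    # growth during matter era
total_growth = growth_radiation * growth_matter

print(f"Total growth factor of |Omega - 1|: ~{total_growth:.0e}")
# For the deviation to still be small today, |Omega - 1| at t = 1 ns had to
# be of order 1/total_growth: roughly one part in 10^24 or better.
```

Even this crude estimate lands on the quoted figure: any imbalance at one nanosecond gets amplified by a factor of order 10²⁴ by today.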
The level to which the expansion rate and the overall energy density must balance is insanely precise; a tiny change back then would have led to a Universe vastly different than the one we presently observe. And yet, this finely-tuned situation very much describes the Universe we have, which didn’t collapse immediately and which didn’t expand too rapidly to form complex structures. Instead, it gave rise to all the wondrous diversity of nuclear, atomic, molecular, cellular, geologic, planetary, stellar, galactic and clustering phenomena we have today. We’re lucky enough to be around right now, to have learned all we have about it, and to engage in the enterprise of learning even more: the process of science. The Universe didn’t collapse into a black hole because of the remarkably balanced conditions under which it was born, and that might just be the most remarkable fact of all.
Ethan Siegel is the author of Beyond the Galaxy and Treknology. You can pre-order his third book, currently in development: the Encyclopaedia Cosmologica.