Very interesting. The 2nd cited article offers two closely related possible explanations:
While these views (Idiocracy?) may have the “ring of truth,” ISTM there’s a much simpler possible explanation. As I understand it, the large size of the human brain is a major obstacle to childbirth and a major cause of maternal death in childbirth. Might not mutations that packed the same brain power into a smaller volume have been advantageous?
We are 7B people today vs maybe 10-50k back then. There is far more diversity today than you would have seen at that time, and any individual from then would fit easily within the expected range of variation in human populations today.
A problem I have with comparing the loss of brain size in domesticated animals to the loss of brain size in humans is that domesticated animals have humans to do their thinking for them, and to do it better - even a smart dog or cat isn’t nearly an intellectual match for even a stupid human. Humans don’t have any such smarter species they can lean on, and in fact have a long history of exploiting their stupider fellow humans.
And the whole “Idiocracy” hypothesis is pretty clearly an example of thinly veiled classism/racism, as well as a call for eugenics & mass murder. It was just less subtle back in the day with The Marching Morons.
Seriously? I’ve lived more than 50 years. That’s 10% to 25% of the span you mention. You think that (1957 to 2012) times 4 is going to equal omniscient and godlike?
If we continue to advance faster and faster, I wouldn’t be surprised if we became essentially as advanced as it’s possible to be in that time or less, barring accomplishments that require longer time periods due to laws-of-physics limitations like the speed of light.
Personally I prefer the old term “arbitrarily advanced civilization” to “godlike”; godlike tends to imply abilities that are impossible or probably impossible. Like the previously mentioned omniscience. Or more prosaically, something like FTL travel; if the laws of physics don’t allow faster than light travel, then we’ll never be able to travel faster than light regardless of how advanced we get or how long we last. “Godlike” has more the implication that you can do as you please, regardless of any pesky physical laws.
It depends. We can’t know what kinds of technologies will exist, but it may be possible to bend the natural laws someday, or, failing that, we may be able to create alternate universes where we can bend the laws of nature.
Failing that, virtual reality environments that are as real as the real world can always be created.
As far as faster than light travel, it may be possible if space/time can be bent around a ship.
So in instances like that, even if the natural law cannot be broken perhaps it can be gotten around.
What is the maternal mortality rate? 2% in nature? Clearly, if the head is too big, nature might select for different utero-vaginal characteristics to make birth more manageable just as easily as she might make the brain smaller. That explanation just makes no sense; maternal mortality is simply a natural cull.
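Just to put a rough number on how strong a “cull” of that size is - this is a back-of-the-envelope sketch of my own, with made-up figures, not anything from the article - a survival difference on the order of 2% versus 1% is enough for whichever mitigating variant happens to arise, wider pelvis or smaller head, to spread quickly on evolutionary timescales:

```python
# Toy calculation (my own numbers, nothing from the thread): how fast a variant
# that cuts maternal mortality from ~2% to ~1% - roughly a 1% fitness edge -
# could spread, whether that variant changes pelvic shape or brain size.
def frequency_after(generations: int, start_freq: float, fitness_edge: float) -> float:
    """Crude one-locus model: the favoured variant's share is reweighted by
    (1 + fitness_edge) each generation."""
    p = start_freq
    for _ in range(generations):
        p = p * (1 + fitness_edge) / (p * (1 + fitness_edge) + (1 - p))
    return p

# Starting from 1% of the population, after 1,000 generations (~25,000 years):
print(f"{frequency_after(1000, 0.01, 0.01):.3f}")   # ~0.995, i.e. near fixation
```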
Thing is, about 90% (an arbitrarily large value) of what the brain does is non-cognitive or otherwise not related to reasoning. I just took a drink of my coffee, which used a large portion of my brain to accomplish, from my hand finding the cup to the action (and how much to sip) to the taste response to setting it down again. Go out for a walk and pay attention to the enormous amount of available sensory input you block out (Huxley’s “reducing valve”). From this perspective, the environmental-threat theory makes a lot of sense – it is not about how big the brain is but how you use it.
Although some of our descendants might be Matrioshka Brains, there is a non-trivial possibility that some of them will be recognisably human.
The great thing about a technological civilisation is that it can retain information for an arbitrary period - data can be recorded in several different locations and compared at intervals to see if any errors have crept in. There is no reason that any particular piece of data need be lost - if the civilisation persists, that data could last to the heat death of the universe.
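A minimal sketch of the record-in-several-locations-and-compare-at-intervals idea - assuming nothing about real archival systems; the copy count, the bit-flip error model, and the repair interval are all arbitrary illustrative choices:

```python
# A toy model of long-term data preservation by redundancy: several copies,
# periodic comparison, and bytewise majority-vote repair. The error model
# (random bit flips) and all the numbers here are illustrative assumptions.
import random

def corrupt(copy: bytearray, p: float) -> None:
    """Flip each bit of each byte independently with probability p."""
    for i in range(len(copy)):
        for bit in range(8):
            if random.random() < p:
                copy[i] ^= 1 << bit

def repair(copies: list[bytearray]) -> None:
    """Overwrite every copy with the bytewise majority across all copies."""
    for i in range(len(copies[0])):
        votes = [c[i] for c in copies]
        majority = max(set(votes), key=votes.count)
        for c in copies:
            c[i] = majority

original = b"GATTACA" * 4                    # stand-in for archived genome data
copies = [bytearray(original) for _ in range(5)]

for _ in range(1000):                        # many storage intervals
    for c in copies:
        corrupt(c, 1e-4)                     # small chance of damage per interval
    repair(copies)                           # compare and fix at each interval

print(all(bytes(c) == original for c in copies))   # almost always True
```

The point of the sketch is only that errors accumulate per copy but are repaired faster than they can agree across copies, so as long as the comparisons keep happening the data never degrades.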
Three billion years from now, or a trillion, the data in the human genome could be reactivated and modern humans could be reconstituted, if this is ever deemed desirable. There could be humans present when the Earth dies, and when the last star in the Milky Way falls into the central black hole.
A large part of why we have big brains is to navigate complex social situations; no other species has the ability to do what we do socially. Our cognitive intelligence is piggybacked onto our social intelligence. And before modern medicine, in western cultures, childbirth was the #1 killer of women (supposedly fire was #2, since women wore large dresses and were constantly heating and cooking with open flames).
The fact that we haven’t been wiped out yet isn’t proof that we can’t be wiped out. We might be able to deflect asteroids by the end of this century - and even if we can, there’s probably going to be an upper limit on how much mass such a system is capable of deflecting. And giant rocks are far from the deadliest thing that could come at us from space. A lot of that could be avoided by getting out of our solar system, but we don’t know if that’s really possible yet - it’s never been done, and we really only have the haziest ideas of how it might be possible. And those ideas might all be wrong.
Hundreds of years ago, civilization survived without a good knowledge of disease prevention, but hundreds of years ago, it would take a couple of months to travel between major cities. We have better hygiene and pharmaceuticals, but we also have far more disease vectors and much stronger bugs. We’re unlikely to be exterminated by a disease, but a sufficiently virulent plague could gut our civilization, vastly slowing or even reversing our scientific progress.
And there’s the possibility that we’ll get hit by multiple catastrophes at once - ecological disaster from climate change, followed by an especially nasty contagion, might leave us with no one manning the anti-asteroid cannons when another dinosaur killer swings through the inner solar system, and that’s the end of *Homo sapiens sapiens*.
That assumes that the rate of scientific progress continues its upward curve. We have no particular reason to believe that will happen, other than optimism. There may be an upper limit on what is possible for humans to understand or create. Just because we haven’t hit that limit yet doesn’t mean the limit doesn’t exist.
100% recycling isn’t possible - you always lose something to entropy. And there’s ultimately a finite amount of resources on Earth. There’s also a hard limit to how efficient you can make a machine - you’re never going to get a device that produces more energy than it consumes. All of these options also require resources to implement. We might be able to invent a more reliable energy source than petrochemicals, but we’re going to need petrochemicals to find, refine, and/or manufacture those sources - if we use up all the gas before we invent a replacement for gas, we’re fairly well fucked.
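As one concrete instance of the kind of hard efficiency ceiling being described here - my example, not the poster’s - a heat engine working between two temperatures can never beat the Carnot limit, no matter how cleverly it’s engineered:

```python
# Illustrative only: the Carnot bound is one well-known hard ceiling on how
# efficient a heat engine can be. The temperatures below are arbitrary.
def carnot_efficiency(t_hot_kelvin: float, t_cold_kelvin: float) -> float:
    """Maximum possible efficiency of any heat engine between two reservoirs."""
    return 1.0 - t_cold_kelvin / t_hot_kelvin

# e.g. steam at ~800 K dumping waste heat at ~300 K:
print(f"{carnot_efficiency(800.0, 300.0):.1%}")   # 62.5% - no design can do better
```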
And if we make it past all of that - if we survive every thing this universe can throw at us… eventually the universe will simply run out of steam. Entropy always wins.