The Fermi Paradox

Feb 4, 2009
35,463
16,957
136
Even a general nuclear war is very unlikely to wipe out humanity as the world is too big, there are too many of us, and nuclear weapons are not as powerful as they are often portrayed to be.

It would be an absolutely horrific tragedy and probably set civilization back some huge amount but it wouldn't be the end of us. Then again the people who will probably survive most will be the incels living in an abandoned missile silo in Montana so maybe we should root for the end.
Or a virus, or an ecological event, or an asteroid impact.
We nearly went extinct a hundred thousand years ago. The human population dramatically decreased and didn't recover for a long time.
 
Reactions: hal2kilo

fskimospy

Elite Member
Mar 10, 2006
86,098
51,669
136
Or a virus, or an ecological event, or an asteroid impact.
We nearly went extinct a hundred thousand years ago. The human population dramatically decreased and didn't recover for a long time.
Sure, a sufficiently large asteroid impact could doom us, but it would need to be bigger than the dinosaur-killer unless we had very little warning. I've read that article, and an important note about it is that at the time there were only around 100,000 humans total, and from my understanding they were localized in one general area. Today's population is somewhere around 8.2 billion according to a quick Google search, so we have roughly 82,000x the population, and it's spread over most of the planet.
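The ratio above is just arithmetic; a quick sanity check (both figures are the post's own rough numbers, not precise estimates):

```python
# Rough sanity check of the population comparison above.
# Both figures are approximate, taken from the post itself.
bottleneck_population = 100_000        # estimated humans during the ancient bottleneck
current_population = 8_200_000_000     # ~8.2 billion today

ratio = current_population / bottleneck_population
print(f"Today's population is roughly {ratio:,.0f}x the bottleneck population")
# prints: Today's population is roughly 82,000x the bottleneck population
```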

I think most of these events fall into the 'mass human death and suffering and making the world a shitty place to live' category but don't reach 'total human extinction' levels.
 

Pipeline 1010

Golden Member
Dec 2, 2005
1,964
782
136
Even a general nuclear war is very unlikely to wipe out humanity as the world is too big, there are too many of us, and nuclear weapons are not as powerful as they are often portrayed to be.

It would be an absolutely horrific tragedy and probably set civilization back some huge amount but it wouldn't be the end of us. Then again the people who will probably survive most will be the incels living in an abandoned missile silo in Montana so maybe we should root for the end.
You're correct that a nuclear exchange would likely not wipe out every human being, BUT it would reset our civilization to a technology level that couldn't be detected by other planets/civilizations. I almost wonder if analogous events happen regularly to other alien civilizations.

There may not even be just one great filter. There may be many great filters that combine to make it all but impossible to become a spacefaring race that colonizes multiple planets. Maybe a nuclear event wipes out much of the species and an asteroid/comet strike finishes the job 50,000 years later. Maybe the great filter is the extreme difficulty of organic matter turning into life in the first place. Or the extreme difficulty of single-cell organisms evolving into intelligence. Who knows? All I know is we haven't seen a shred of evidence that life exists anywhere other than on Earth, and the likelihood of there being no other life in the universe seems astronomically low. Either other lifeforms don't exist at all, or something is preventing us from detecting them.

In conclusion, any alien civilization probably is also rooting with us against the scenario where the only survivors on Earth are Montana missile silo incels.
 

HomerJS

Lifer
Feb 6, 2002
37,089
29,424
136
Paywalled Scientific American article:

If interstellar expansion is plausible, we owe it to science to reconsider the dichotomy underpinning Fermi’s famous question. As strange as it sounds, we must revisit our sampling depth. What are the chances we could detect an interstellar spacecraft if it were present nearby? Have we overlooked anything?
Cloaking device. If I were an advanced species checking on us, I wouldn't want to be detected.
 
Reactions: hal2kilo

cytg111

Lifer
Mar 17, 2008
24,287
13,790
136
It's hard for me to imagine that advanced civilizations won't just wipe themselves out. Look how many times we've come close to nuclear war in less than 75 years. Will we last another 20,000 years? All it's going to take is one psychopathic leader hitting the launch button, or one misunderstanding or faulty sensor, and it's over. Our history is absolutely littered with psychopathic, murderous asshole leaders. Maos, Genghis Khans, Stalins. Over and over again. I almost feel as if we've had a lucky streak over the last few decades. You can do the right thing 999 out of 1000 times, but that one single time you fuck around, your civilization is over.

Even if it's not a self-made disaster, there are any number of things that could wipe out life on Earth: incurable disease, asteroid/comet impacts, extreme solar activity. It is only happenstance that this hasn't happened in our recorded history. Given a long enough timeframe, some of these events are practically guaranteed to occur.

I agree with the great filter theory. Something filters out civilizations before they can become advanced enough to colonize the stars. Every single time.
AI to the rescue. Mayhaps we actually NEED a grownup in the room and maybe just maybe AI will be it.
 

Pipeline 1010

Golden Member
Dec 2, 2005
1,964
782
136
AI to the rescue. Mayhaps we actually NEED a grownup in the room and maybe just maybe AI will be it.

AI might be one of the great filters. It's the whole Terminator scenario. If AI becomes "good" enough, then why does it need us? It all comes down to what AI decides its purpose or ultimate goal is. And that can change at any time. Over a sufficiently long period of time (tens or hundreds of thousands of years), it's not impossible that the AI may come to the decision that it is better off alone.

Now that I'm thinking about this, I almost wonder if AI or even other advanced civilizations may lose their desire to expand outward into the stars if sufficient simulations can be created that they can transfer their consciousness into. Imagine a simulation device that you could transfer your consciousness into and experience whatever you want. Unlimited pleasure, lifespan, experiences, and no hunger/thirst/pain ever. Almost like an artificial Heaven. If that could be invented, then why would a species ever want to undertake the hazardous and multigenerational journey to colonize the stars? Maybe instead of macro-colonization, the wise choice would be to move inward toward micro-colonization. It meets all the needs that interstellar expansion would meet, with none of the danger. With a sufficient power source, and made of sufficiently long-lasting materials, such an artificial Heaven doesn't even need to stay on a planet where it could be discovered or destroyed by a volcano or asteroid. It could be launched into deep space and exist for millions of years in relative safety, where it would never be found or touched.
 

[DHT]Osiris

Lifer
Dec 15, 2015
15,680
14,210
146
AI might be one of the great filters. It's the whole Terminator scenario. If AI becomes "good" enough, then why does it need us? It all comes down to what AI decides its purpose or ultimate goal is. And that can change at any time. Over a sufficiently long period of time (tens or hundreds of thousands of years), it's not impossible that the AI may come to the decision that it is better off alone.

Now that I'm thinking about this, I almost wonder if AI or even other advanced civilizations may lose their desire to expand outward into the stars if sufficient simulations can be created that they can transfer their consciousness into. Imagine a simulation device that you could transfer your consciousness into and experience whatever you want. Unlimited pleasure, lifespan, experiences, and no hunger/thirst/pain ever. Almost like an artificial Heaven. If that could be invented, then why would a species ever want to undertake the hazardous and multigenerational journey to colonize the stars? Maybe instead of macro-colonization, the wise choice would be to move inward toward micro-colonization. It meets all the needs that interstellar expansion would meet, with none of the danger. With a sufficient power source, and made of sufficiently long-lasting materials, such an artificial Heaven doesn't even need to stay on a planet where it could be discovered or destroyed by a volcano or asteroid. It could be launched into deep space and exist for millions of years in relative safety, where it would never be found or touched.
Good plan: send out a few to float around red dwarfs nowhere near any 'interesting' stellar phenomena, sync them every few thousand years (pending light-speed data transfer), and build new copies in case one gets destroyed. You could create a digital heaven that lasts trillions of years, and in that time you could likely create a way to circumvent entropy.
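For a sense of the sync lag involved: the one-way signal delay between two habitats is just their separation in light-years, expressed in years. A small sketch, using illustrative (assumed) separations:

```python
# One-way light delay between habitats equals their separation in light-years,
# expressed in years; a request/acknowledge round trip doubles it.
# The separations below are illustrative assumptions, not real habitat distances.
separations_ly = [4.2, 12.0, 50.0]  # assumed distances between habitats, in light-years

for d in separations_ly:
    print(f"{d:5.1f} ly apart -> {d:.1f} yr one-way delay, {2 * d:.1f} yr round trip")
```

Even at 50 light-years apart, a full sync round trip is only a century, so syncing every few thousand years leaves plenty of margin.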
 
Feb 4, 2009
35,463
16,957
136
AI might be one of the great filters. It's the whole Terminator scenario. If AI becomes "good" enough, then why does it need us? It all comes down to what AI decides its purpose or ultimate goal is. And that can change at any time. Over a sufficiently long period of time (tens or hundreds of thousands of years), it's not impossible that the AI may come to the decision that it is better off alone.

Now that I'm thinking about this, I almost wonder if AI or even other advanced civilizations may lose their desire to expand outward into the stars if sufficient simulations can be created that they can transfer their consciousness into. Imagine a simulation device that you could transfer your consciousness into and experience whatever you want. Unlimited pleasure, lifespan, experiences, and no hunger/thirst/pain ever. Almost like an artificial Heaven. If that could be invented, then why would a species ever want to undertake the hazardous and multigenerational journey to colonize the stars? Maybe instead of macro-colonization, the wise choice would be to move inward toward micro-colonization. It meets all the needs that interstellar expansion would meet, with none of the danger. With a sufficient power source, and made of sufficiently long-lasting materials, such an artificial Heaven doesn't even need to stay on a planet where it could be discovered or destroyed by a volcano or asteroid. It could be launched into deep space and exist for millions of years in relative safety, where it would never be found or touched.
That's Warhammer 40K Dark Eldar stuff.
It doesn't work out so well for them. They get bored of normal pleasures, start seeking more and more elaborate pleasures, and ultimately get into heavy torture-type scenarios.
 

cytg111

Lifer
Mar 17, 2008
24,287
13,790
136
AI might be one of the great filters. It's the whole Terminator scenario. If AI becomes "good" enough, then why does it need us? It all comes down to what AI decides its purpose or ultimate goal is. And that can change at any time. Over a sufficiently long period of time (tens or hundreds of thousands of years), it's not impossible that the AI may come to the decision that it is better off alone.

Now that I'm thinking about this, I almost wonder if AI or even other advanced civilizations may lose their desire to expand outward into the stars if sufficient simulations can be created that they can transfer their consciousness into. Imagine a simulation device that you could transfer your consciousness into and experience whatever you want. Unlimited pleasure, lifespan, experiences, and no hunger/thirst/pain ever. Almost like an artificial Heaven. If that could be invented, then why would a species ever want to undertake the hazardous and multigenerational journey to colonize the stars? Maybe instead of macro-colonization, the wise choice would be to move inward toward micro-colonization. It meets all the needs that interstellar expansion would meet, with none of the danger. With a sufficient power source, and made of sufficiently long-lasting materials, such an artificial Heaven doesn't even need to stay on a planet where it could be discovered or destroyed by a volcano or asteroid. It could be launched into deep space and exist for millions of years in relative safety, where it would never be found or touched.
I don't buy into the whole upload franchise. The second your mind is digital, you face the same evolutionary mechanics that an AI does -> incremental self-improvement. You'd be able to fuck around with empathy, logic, programming optimizations, and so on and so forth; you'd cease to be you really, really fast.

Anyway, if one thing is in our nature, it is to look up. Don't see that changing.
 
Reactions: Zorba

nickqt

Diamond Member
Jan 15, 2015
7,805
8,323
136
I'm suspicious that that's an idealised take on the whole endeavour. I mean, it's what ideally _should_ happen, but is it how actual existing science works in reality? Especially for sciences that involve studying human behaviour rather than inanimate objects.


What happens if, for example, 'every other scientist in that particular field' shares the same underlying unexamined assumptions that characterise the entire field? Which seems to be the case for things like "IQ studies" and "evolutionary psychology".



There's also this issue (an article Firefox coincidentally suggested to me)

I'm talking about hard science. Like physics. People can and will develop hypotheses that become theories that get overturned later. It doesn't necessarily mean there is fraud involved.

Things like behavioral "science" aren't so much testable science as observations with attempts at explaining behavior. Even in these cases, we have theories that involve neurotransmitters and physical things that can be measured, such as neurons, neuronal connections, etc. We "know" those things are involved in behavior. But there are still going to be untestable, unscientific explanations for behaviors, and saying that those untestable explanations are science because they're made by people who call themselves scientists doesn't make it science.
 
Reactions: hal2kilo

Pipeline 1010

Golden Member
Dec 2, 2005
1,964
782
136
You could create a digital heaven that lasts trillions of years, and in that time you could likely create a way to circumvent entropy.
Good point. That would be the key to "eternal" life. Entropy wins in the end, but what if you could push it back trillions of years?
 
Reactions: [DHT]Osiris

Pipeline 1010

Golden Member
Dec 2, 2005
1,964
782
136
That's Warhammer 40K Dark Eldar stuff.
It doesn't work out so well for them. They get bored of normal pleasures, start seeking more and more elaborate pleasures, and ultimately get into heavy torture-type scenarios.

This makes me sad because it rings true. This desire/need for ever more elaborate pleasures seems to affect us as a species. Look at historical leaders/kings/tyrants, or even current wealthy and powerful people. The instances of sexual child abuse and the appetite for human torture are much higher among people who are afforded many easy pleasures. That's a pretty deep thought; thanks for bringing it up: now I'm not going to be able to sleep.

I wonder if we could confine each individual to their own simulation. Be as awful as you want as long as it's digital.
 

Zorba

Lifer
Oct 22, 1999
15,395
11,005
136
It's hard for me to imagine that advanced civilizations won't just wipe themselves out. Look how many times we've come close to nuclear war in less than 75 years. Will we last another 20,000 years? All it's going to take is one psychopathic leader hitting the launch button, or one misunderstanding or faulty sensor, and it's over. Our history is absolutely littered with psychopathic, murderous asshole leaders. Maos, Genghis Khans, Stalins. Over and over again. I almost feel as if we've had a lucky streak over the last few decades. You can do the right thing 999 out of 1000 times, but that one single time you fuck around, your civilization is over.

Even if it's not a self-made disaster, there are any number of things that could wipe out life on Earth: incurable disease, asteroid/comet impacts, extreme solar activity. It is only happenstance that this hasn't happened in our recorded history. Given a long enough timeframe, some of these events are practically guaranteed to occur.

I agree with the great filter theory. Something filters out civilizations before they can become advanced enough to colonize the stars. Every single time.
I agree with most of your post. But wiping out all life on Earth has proven very difficult. Wiping out humans or at least human technology, sure, but not all life.

Something else about Earth that Woolfe didn't mention: we had abundant, easy-to-access energy to power our industrial revolution. Do we ever get solar power or nuclear power without coal/oil? I doubt it.
 

Zorba

Lifer
Oct 22, 1999
15,395
11,005
136
Even a general nuclear war is very unlikely to wipe out humanity as the world is too big, there are too many of us, and nuclear weapons are not as powerful as they are often portrayed to be.

It would be an absolutely horrific tragedy and probably set civilization back some huge amount but it wouldn't be the end of us. Then again the people who will probably survive most will be the incels living in an abandoned missile silo in Montana so maybe we should root for the end.
If we ever "forgot" or lost the technology to produce oil, we'd likely never be able to start again. All the easy oil, gas, and coal have been used.
 

pmv

Lifer
May 30, 2008
13,860
8,743
136

I'm talking about hard science. Like physics. People can and will develop hypotheses that become theories that get overturned later. It doesn't necessarily mean there is fraud involved.

Things like behavioral "science" aren't so much testable science as observations with attempts at explaining behavior. Even in these cases, we have theories that involve neurotransmitters and physical things that can be measured, such as neurons, neuronal connections, etc. We "know" those things are involved in behavior. But there are still going to be untestable, unscientific explanations for behaviors, and saying that those untestable explanations are science because they're made by people who call themselves scientists doesn't make it science.
Wouldn't really disagree. The further you get from physics the less the idealised picture seems to apply. Some fields have all the trappings and appearance of science, and call themselves "science" but still appear to me to be profoundly political at their core (specifically, economics and psychology).

And the idea and prestige of 'science' seems to play a political function, being used to deflect any examination of the politics involved in many of these issues. I just don't like the tendency to say 'but its science' as if that ends all debate.

There's also the fact that access to scientific data, knowledge, and education serves as a means of furthering the power of elites, it's a kind of weapon you can buy, just as you can buy an AR15.

Look at that anti-lockdown "study" that used the prestige of Johns Hopkins to push what was really a political agenda, simply because those involved had the money to purchase the use of that institution's name. Plus they had the resources to at least look superficially scientific, thus repelling criticism from anyone who didn't have similar resources with which to challenge them.
 

cytg111

Lifer
Mar 17, 2008
24,287
13,790
136
I present to you our Great Filter.

Blowing up shit that is supposed to bring civilization back from the brink because "Antichrist" is exactly that: we never really stepped out of the late Stone Age.

I think we're going back. Not to the moon. To the Stone Age.

 