18 Ways the World Could End

The biggest threats are often not what we think they are. The sheep spends its whole life fearing wolves, only to be killed by the shepherd. With that in mind, here are some of the many things that could end the human race as we know it...

1. Our universe collides with another universe

Quantum theorists think it's possible that our universe is just one of many bubbles floating in a greater cosmos. What would happen if our universe collided with another? The answer: no one knows. It could be that one or both universes would be destroyed. So are we just one day away from destruction? Probably not. After all, the universe has survived this long.

2. Strong Artificial Intelligence

Within the next 100 years, it's possible, even likely, that the human race will be made obsolete by artificial intelligence. This hypothesized event is known as The Singularity, so named because it is the point in time when the rate of change becomes so steep that it resembles a vertical line. Why would this happen? It's actually pretty simple. If humans are able to build a machine that is smarter than humans, then that machine would be able to build an even smarter machine, and so on. Within a few years of that first breakthrough, artificial intelligence could surpass the entirety of the human race.
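The compounding argument above can be sketched in a few lines of Python. This is a purely illustrative toy model, not a real forecast: the starting capability, the target, and the per-generation gain are all made-up numbers chosen to show how quickly even modest multiplicative improvement snowballs.

```python
# Toy model (illustrative only): each generation of AI designs a successor
# `gain` times more capable than itself.
def generations_to_surpass(start: float, target: float, gain: float = 1.5) -> int:
    """Count improvement steps until capability exceeds `target`."""
    capability, steps = start, 0
    while capability <= target:
        capability *= gain
        steps += 1
    return steps

# Suppose the first machine is barely human-level (1.0) and "the entirety of
# the human race" is, say, 10 billion times one human's capability.
print(generations_to_surpass(1.0, 1e10))  # a few dozen iterations suffice
```

Even with a modest 50% gain per generation, fewer than sixty iterations close a ten-billion-fold gap; if each iteration takes days rather than years, the whole transition fits inside one news cycle.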

Once this happens, there is no way to put the genie back in the bottle. The human race will simply not have the resources to defeat a superintelligent computer, just like a human can't beat a computer in chess today. Confused? Read more about it. In any case, it seems possible that the human race will be subordinated by computer intelligence within our lifetimes.

Perhaps our new masters will treat us kindly. Perhaps they won't. Perhaps they'll treat us like we treat ants: mostly ignoring them unless they become a nuisance. Or perhaps humans will merge with computers and become superintelligent ourselves. One thing's for certain: if we create superintelligent AI, the future of the human race will change forever.

3. An asteroid or comet

This is becoming less and less of a global threat. We have done a very good job of tracking near-Earth objects above 1 km in size. And while smaller asteroids might slip through our sensors, they don't have the potential to cause planetary climate change if they collide with Earth. If you are spending time worrying about asteroid impacts, don't.

4. A high mass, extra-solar object

Worry about this instead! In 2017, astronomers first observed something that had never been seen before: an object passing near Earth that came from outside our solar system. This particular object was not large enough to be an existential threat, but it's possible that there are planets, asteroids, or other large objects hurtling through space on a collision path with Earth. For example, some people think that our Solar System once had five gas giants instead of four, with the fifth ejected to who knows where. Perhaps an unseen rogue planet or dwarf star is hurtling towards Earth as we speak.

5. Nuclear weapons

Nuclear war is bad. Is it the end of the world? Probably not. We simply don't have the firepower to send the human race back to the Stone Age. At least that's the conclusion of some people who have pointed out flaws in the "nuclear winter" hypothesis popularized by Carl Sagan.

What percent of the population would die in a full nuclear exchange? I'm not sure. Probably at least 10% of the world's population. Possibly 90% or more. But it won't be 100%. Some countries would not be involved in the nuclear exchange at all and would survive relatively unscathed. As a species, we need more and bigger bombs to commit suicide. No doubt very smart people are working on it right now.

6. Grey goo

Nanobots are extremely small robots (trillions could fit in a cubic inch) that currently exist only as a hypothetical future technology. What if someone made self-replicating nanobots? And what if those nanobots were poorly, or maliciously, programmed? Very bad things could happen. They could, much like locusts, convert any raw material they encounter into more nanobots, thus turning the world, and everyone you love, into "grey goo".

7. Alien attack

Science fiction author Liu Cixin has conceived of the universe as a "Dark Forest" where alien civilizations keep themselves hidden to avoid being attacked. Why do they do that? Because, just like two wild animals meeting in the jungle, two civilizations meeting in space have no way to communicate with or trust each other. Rather than risk being annihilated, it is safer to annihilate the other. Meanwhile the human race, like complete morons, has loudly advertised our presence to the universe since the dawn of the radio age.

When the attack comes, it will not be like the movie Independence Day, with lumbering and easily hackable spaceships hovering over our major cities. Instead it will be in the form of a small object moving at relativistic speed. An object the size of a baseball, moving at 99.99999999% of the speed of light, would strike with the energy of hundreds of gigatons of TNT, thousands of times the yield of the largest nuclear weapon ever detonated. We would have no possible way to see it coming. One day, much of the Earth would simply be vaporized and that would be the end.
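As a sanity check, here is a hedged back-of-the-envelope calculation of the kinetic energy of a baseball-sized object at that speed. The mass and the TNT conversion factor are standard reference values; everything else follows from the relativistic kinetic energy formula KE = (γ − 1)mc².

```python
import math

# Back-of-the-envelope: kinetic energy of a baseball at 99.99999999% of c.
c = 299_792_458.0        # speed of light, m/s
m = 0.145                # mass of a baseball, kg
beta = 0.9999999999      # v/c

gamma = 1.0 / math.sqrt(1.0 - beta**2)   # Lorentz factor, roughly 7e4
ke_joules = (gamma - 1.0) * m * c**2     # relativistic kinetic energy
ke_gigatons = ke_joules / 4.184e18       # 1 gigaton TNT = 4.184e18 J

print(f"gamma ≈ {gamma:.0f}")
print(f"KE ≈ {ke_joules:.2e} J ≈ {ke_gigatons:.0f} gigatons of TNT")
```

That works out to roughly 10²¹ joules, comparable to a very large asteroid impact: enough to devastate the planet's surface, even if not enough to physically break the planet apart.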

8. Gamma-ray burst

Did you know that supernovas and stellar collisions can cause gamma-ray bursts? It's possible, though extremely unlikely, that a gamma-ray burst in the Milky Way, beamed directly at Earth, could cause a mass extinction event. Some people think that the Ordovician–Silurian extinction event of 450 million years ago was caused by a gamma-ray burst.

9. Supervolcanoes

Supervolcanoes are to regular volcanoes as Superman is to a regular man. They are ridiculously more powerful. A typical supervolcano eruption is 1,000 times as powerful as the eruption of Mount St. Helens. The eruption of a supervolcano can change the climate of the world for decades.

So could the eruption of a supervolcano kill the human race? It almost did once. Some people theorize that the Toba supereruption 70,000 years ago reduced humanity to a population of just 10,000 or so.

Fortunately for us, humanity has progressed a lot in the last 70,000 years. We can do a lot to cope with lower temperatures and sunlight. When the supervolcano under Yellowstone awakens from its slumber, Denver could be buried under a meter of ash. But most of the rest of us will bravely soldier on.

10. The simulation is turned off

Back when the New York Times was still the nation's paper of record, it used to offer thought-provoking articles like this one.

What if it were possible to create a computer program that could simulate human consciousness? If that were possible, it would also be possible to create a program to simulate a billion people, or a trillion, or a quadrillion, or more. With so many simulated humans out there, the odds are nearly 100% that you are just a simulation. And one day the owner of the simulation might get bored and turn it off.
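The arithmetic behind that "nearly 100%" is simple. Under the toy assumption that there is one base reality plus N equally likely simulated copies of a person like you, your odds of being the real one are 1 in N + 1:

```python
# Sketch of the simulation-argument arithmetic (toy assumption: one base
# reality plus `simulated_copies` equally likely simulated versions of you).
def odds_of_being_real(simulated_copies: int) -> float:
    return 1.0 / (simulated_copies + 1)

print(odds_of_being_real(10**15))  # with a quadrillion simulations: ~1e-15
```

With a quadrillion simulations running, your chance of being the original is about one in a quadrillion.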

11. A human-created virus

Conspiracy theorists think that Covid-19 was created in a Chinese lab. If so, it wasn't a very good lab. The fatality rate of Covid-19 is less than 1%. For the young and healthy, the fatality rate is possibly even lower than that of the flu. But other diseases are far worse.

What if someone were able to create a disease with the mortality of Ebola and the rate of spread of measles? It would be very, very bad. But even that wouldn't be enough to end the human race.

Viruses that kill their hosts don't tend to spread very well. They tend to mutate to less deadly forms to ensure their survival. It would probably require multiple simultaneous superviruses to finish us off. (Not to give anyone any ideas).

12. Accidentally creating a black hole

I'm not really qualified to comment on this, but I'm going to say that it's not out of the realm of possibility that a high-energy physics experiment could lead to an unexpected consequence which destroys the world.

13. Climate change

You personally are not going to die from climate change. People vastly overestimate its short-term effects. By the year 2100, sea levels are projected to rise by only a couple of feet. But the long-term effects of climate change could be devastating. It's possible that, over thousands of years, we trigger a runaway greenhouse effect which makes the planet completely uninhabitable.

But even this seems extremely unlikely. The Earth has been much hotter than it is today. 400 million years ago, CO2 levels in the atmosphere were five times what they are today. Climate change is going to cause the extinction of many species. Humans probably won't be one of them.

14. A sexless future

Did you know that fertility rates in the developed world are far below the rate of replacement? Among the lowest is Singapore, where the average woman has about one child. At that rate, each generation will be roughly half the size of the one before it. What if the human race ceases to exist because of lack of interest?
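The halving claim can be sketched with a toy model. The starting population, the fertility rate, and the assumption of no mortality or migration are all illustrative simplifications: if each woman averages `tfr` children and about half of births are girls, each generation is roughly `tfr / 2` the size of the previous one.

```python
# Toy model (illustrative assumptions): generation size shrinks by a factor
# of tfr / 2 per generation, ignoring mortality and migration.
def population_after(generations: int, start: float = 1_000_000, tfr: float = 1.0) -> float:
    ratio = tfr / 2.0  # fraction of replacement
    return start * ratio ** generations

for g in range(6):
    print(f"generation {g}: {population_after(g):,.0f}")
```

At a fertility rate of 1.0, a million people dwindle to about 31,000 in just five generations, while anything above roughly 2 (in reality about 2.1, to account for mortality) keeps the population growing.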

I think this is also fairly unlikely. People who don't want children will become extinct. Other people will thrive. The populations of the Amish, Mormons, Hasidic Jews, and many other traditional religious groups are growing quickly. There is a Hasidic community in New York state where the median age is just 13 years old, due to an insanely high birth rate. At current rates, religious extremists will form a majority of the world population within a couple of centuries.

15. The sun becomes a red giant

In about 5 billion years, the sun will grow into a red giant. It will become so large that it will completely engulf Mercury, Venus, and possibly even Earth.

But it won't take 5 billion years to kill off the human race. The sun is continuously getting brighter. It is estimated that within 1.1 billion years the sun will become so much brighter that the Earth will no longer be habitable.

16. Global... cooling?

The Snowball Earth theory suggests that, at least two times in Earth's history, the entire surface of the globe was covered with ice. It's possible that, in the extremely long term, the Earth once again reverts to a total ice age which makes every continent uninhabitable. Unless global warming kills us first...

17. A totalitarian future

For much of history, China had the largest economy and the highest level of technology in the entire world. So why, beginning around 1500 AD, did China stagnate while European civilization grew to eventually dominate the globe?

Some people think China was stymied by central control. In the crucial years between 1500 and 1900, Europe was a loose patchwork of warring civilizations while China was mostly united under a single emperor. Throughout Chinese history, the central bureaucracy imposed high levels of control, at times burning books, preventing technological innovation, and shutting down the exploration of the seas. Europeans did the same things too, of course. But since no one power exerted too much influence, there was always room to maneuver. Christopher Columbus, for example, was rejected by several royal courts before convincing Spain to fund his voyages.

Today, central governments exert more power than ever. China, once again, takes the lead. Its Social Credit System seeks to score every aspect of a person's life in terms of obedience to the central authority. Lest we feel too smug, freedom in Western countries is on the wane as well.

One day, China (or another country) may control the entire world. The entire globe might be stuck in a trap where few are happy, but no one can challenge the central authority. And while this wouldn't be the literal end of humanity, it would mean the death of joy, beauty, and self-determination. We might as well be extinct.

18. Not with a bang, but with a whimper

Thomas Malthus was an English philosopher who observed that population grows to meet any growth in the production of food, meaning humans will always be hungry.

Ironically, he made this observation at the cusp of the Industrial Revolution, the start of a period when human prosperity began to grow faster than ever before. Today, the world produces far more food than we can possibly consume. Does this prove Malthus wrong? I don't think so. Like many great thinkers, he wasn't wrong, just early.

The human race gained a temporary reprieve due to the extraordinary gains in productivity that happened after the Industrial Revolution. Today, productivity growth is slowing. Eventually, it will stop and reverse. We will return to the situation which governed humanity through most of its history: a perpetual shortage of food and resources. Over thousands of years, we will exhaust the world's supply of metals, fossil fuels, nitrates, and other substances necessary for human civilization. The population will fall. Ten thousand years from today, all that remains will be small bands of hunter gatherers, completely ignorant of the vast and grand societies which preceded them. And then, something else will happen, and we will be gone.

+18
Level ∞
Sep 17, 2020
Hey it's not all doom and gloom! I hope I convinced you that most of these are either extremely unlikely or a long way off. But if you're worried about the world ending, remember this: Life is short - go enjoy it!
+10
Level 69
Sep 17, 2020
As for #12, when the Large Hadron Collider was put into operation people worried that this might happen, but it didn't. Stephen Hawking calculated that we would need a particle accelerator at least the size of the universe to create a sufficiently large black hole to do the job. Perhaps the powers that be put in safeguards to keep us from destroying ourselves that way.
+14
Level 48
Sep 17, 2020
Very interesting, thanks for making me never sleep again!
+18
Level ∞
Sep 17, 2020
Hey, but at least you'll be safe from Freddy Krueger!
+12
Level 56
Sep 17, 2020
If China controlled every country on Earth, then we would have only 80k Covid cases as the government might not reveal the correct number =P
+7
Level 80
Sep 18, 2020
You missed perhaps the most likely one: human impact on the environment and climate leading to a breakdown somewhere in the food chain resulting in mass starvation, perhaps an irreversible change to the atmosphere, and mass extinction.

Also, naturally occurring viruses or bacteria evolving faster than our science's ability to keep up, that's a possibility.

And there's also the eventual perhaps inevitable entropic heat death of the Universe, though that's quite a ways off, and the chances that something else will kill us all off before then are fairly large.

#8 is silly. Everyone knows that would just turn us all into incredible hulks.
+1
Level 62
Sep 18, 2020
I was just going to mention drug-resistant organisms. I doubt it would lead to the end of the human race, but still not entirely impossible that a super-superbug develops naturally that kills people so rapidly that it's impossible to find a cure (AKA real-life Plague Inc). Also, not sure if this counts but maybe human modification from CRISPR and combination with technology. That wouldn't be the end of humanity per se but it would change the human race "as we know it," as QuizMaster said at the beginning of the article.
+3
Level ∞
Sep 18, 2020
Dang forgot heat death of the universe. That would have been a fun one.
+1
Level 76
Sep 18, 2020
I vote for geologists not to call it the anthropocene but rather an anthropogenic mass extinction event
+2
Level 57
Sep 18, 2020
Number 10 last line, it should be "...and turn it of"
Great blog article btw, really interesting to see so many possible theories which I have never imagined before. After reading the Artificial Intelligence taking over us part I wonder if we were also made by some civilization and we as humans outsmart them ... Possible I guess (^~^;)ゞ
+4
Level 65
Sep 18, 2020
Muphry's Law...
+2
Level 73
Sep 18, 2020
#2. There is a step missing from the creation of super AI to it becoming our master. Doesn't that require will to power? I don't think we can take it for granted that an AI, even if it develops consciousness, has a will of its own.
+3
Level ∞
Sep 18, 2020
It's impossible to make predictions about what a superintelligent A.I. will do. It's like asking grasshoppers to predict elements of human society. But even if a superintelligent A.I. didn't have the will to power, it could be wielded by a person who did. It would be permanently destabilizing.
+3
Level 80
Sep 18, 2020
Once a true general AI is developed (not one narrowly focused to a specific task but one that can "think" and "reason," and there's no reason to believe that this won't eventually be developed as there's nothing particularly special about the lump of cholesterol between our ears) it will immediately be superior to every human being on Earth, be able to think and evolve as much in a minute as we could in a century, and have access to basically the entire sum of human knowledge and experience assuming it has an Internet connection. Once that happens, how it develops further will be completely out of our hands.

Another fun possibility for humanity's extinction is if we invent a super capable AI that IS NOT a true general AI, and is committed to a single purpose, like... make the most paperclips as efficiently as possible. That AI goes on to learn and improve and gets better and better and better.
+3
Level 80
Sep 18, 2020
At some point who's to say that it doesn't decide that to make more paperclips it needs to eliminate humans who are getting in the way by using valuable metals for other unnecessary things like cans and lawn chairs. We try to stop it, but of course our attempts to do so are correctly identified as something that would impair the AI's ability to make more paper clips. So it kills us all. Not because it wanted us dead, or understood really what it was doing, but just because it was fulfilling its programming.

In many ways, though, you could say that the AI committed to making paper clips is not any different from human consciousness, which is just an accidental by-product of our genetic programming to reproduce. DNA is just a program that has evolved to self-replicate. Every single other human trait and endeavor is merely a behavior that evolved to aid in this, or as a side effect of something else evolved to aid in this. We're making babies instead of paper clips. But same thing
+1
Level 69
Sep 18, 2020
Paperclips is a great videogame too
+2
Level 73
Sep 18, 2020
Exactly, AI is frightening because it is so unpredictable. So many people seem to assume that along with "smartness", the AI will evolve emotions and will want to dominate us for reasons that are psychologically relatable to us. The article read a bit like that to me. But I'm on board with both Q's and K's comments here.
+1
Level ∞
Sep 18, 2020
Agreed. As humans, we tend to see things in our own image. But an AI will not have the same empathy as humans. It will be able to manipulate human emotions to get us to do what it wants.

Along those lines, I highly recommend the movie Ex Machina.

+1
Level 70
Oct 14, 2020
Watched Ex Machina earlier today. Great movie, thanks for the recommendation.
+2
Level 68
Sep 18, 2020
Happy Friday!
+5
Level 71
Sep 18, 2020
Excellent blog, although you missed the most obvious scenario.

Dr. Evil holds the world to ransom for...."One Million Dollars"...and we don't take him seriously

+2
Level ∞
Sep 18, 2020
A hitherto unimagined "Doomsday Device" would be a good addition!
+4
Level 54
Sep 18, 2020
2020: Hmm... so many options!
+2
Level 49
Sep 18, 2020
Earth could get destroyed if our orbit gets disturbed
+2
Level 68
Sep 19, 2020
"END Simulation". Number 10 kind of gets to me... I would like to think we were at least saved for memory. Perhaps the most vast Flash Drive!!!! You know... for some omniscient being to peruse at their own leisure!!!! 🌎💻 (Do you wish to save program?)
+1
Level 47
Sep 22, 2020
Or maybe they just simulate worlds in the way we look at stuff like reddit, for sources of wholesomeness, humor, and knowledge.
+4
Level 62
Sep 20, 2020
When #14 is your entire future, amarite fellas
+1
Level 47
Sep 22, 2020
This would be a LOOOONG way off, but there are *theories* that the big bang happened at the end of another universe. I think it's called the Big Bounce, and it says that eventually the expansion of the universe will reverse and it will start contracting until it collapses into a singularity, thus triggering another big bang. (I think the part where this universe ends is called the Big Crunch.)
+2
Level 23
Sep 26, 2020
Very creative. I believe humans will innovate past climate change and the like but AI, I don't know. All I know, is it's not going to happen during my life. Great job!
+1
Level 54
Oct 13, 2020
Love the sheep/wolves allusion, hadn't heard it before but I see it's widely used. Very interesting article.....now I have to eat a lot of chocolate to cheer myself up.
+1
Level 58
Oct 13, 2020
Points 2, 5 and 6 are making me so angry that I'm debating whether to buy a punching bag. Especially point 2. Why can't they stop inventing stuff like that? Nobody wants this crap. The only application for that horseshit is in analysing user behaviour in social media, in order to keep us pinned to the phone and thus see as many commercials and ads as possible. And that in itself makes me excessively angry. If I sparked your interest and/or you don't believe me, you might want to watch "The Social Dilemma" (Netflix documentary).