
How nigh is the end?

Top 10 Extinction Risks

I first wrote this blog in 2015 but I’m updating a lot of old material for my new book on sustainability, and I think potential extinction justifies a chapter in it. In 2015, the world seemed a lot safer than it does right now, so I have increased several of the risk estimates accordingly. This article wasn’t meant to be doom-mongering – that’s just the actual consequence of adding up my best current estimates, and as I say at the end, you’re welcome to do the very simple sums with your own figures.

“We’re doomed!” is a frequently recited observation. It is great fun predicting the end of the world and almost as much fun reading about it or watching documentaries telling us we’re doomed. So… just how doomed are we? Initial estimate: Maybe a bit doomed. Read on.

In 2015 I watched a ‘Top 10 list of threats to our existence’ on TV and it was very similar to most you’ve probably read even recently, with the same errors and omissions – nuclear war, global virus pandemic, terminator scenarios, solar storms, comet or asteroid strikes, alien invasions, zombie viruses, that sort of thing. I’d agree that nuclear war is still the biggest threat, so number 1, and a global pandemic of a highly infectious and lethal virus should still be number 2. My personal opinion on COVID is that it was almost certainly made in a lab, quite probably with the intention of developing a potential bioweapon, and that it probably escaped by accident and poor safety protocols before it was anywhere near ready for that purpose, so if anything we actually got off lightly. It could have been far worse, and the next one very probably will be. Many bad actors (terrorist groups, rogue governments and the occasional mad scientist) will have been impressed by the proof of principle of a cheap and easy means of destroying economies via poor government reactions, and will have been very busy since trying to engineer their own viruses, with the assistance of AI of course. There is no shortage of potential viruses to start with. These risks should still be in 1st and 2nd place.

1: Nuclear War

2: Viruses

The TV list also included a couple of entries that shouldn’t be there.

One inclusion was a mega-eruption of Yellowstone or another super-volcano. A full-sized Yellowstone mega-eruption would probably kill millions of people and destroy much of civilization across a large chunk of North America, but some of us don’t actually live in North America and quite a few people might survive pretty well, so although it would be quite annoying for Americans, it is hardly a TEOTWAWKI (the end of the world as we know it) threat. It would have big effects elsewhere, just not extinction-level ones. For most of the world it would only cause short-term disruptions such as economic turbulence; at worst it might start a few wars here and there as regions compete for control in a new world order.

Number 3 on their list was climate change, which is an annoyingly wrong, albeit very popular, inclusion. The only climate change mechanism proposed for catastrophe is global warming, and the reason it’s called climate change now is that global warming stalled in 1998 and didn’t resume for almost 18 years, so the term became too embarrassing for doom-mongers to use. Since then, warming has resumed, but it has still fallen very far short of the enormous catastrophes predicted 15-20 years ago. London is not under water, there is still Arctic ice populated by a very healthy number of polar bears, the glaciers are melting but have not all vanished, Greenland and the Antarctic still have most of the ice they had then, and sea level has risen only slightly faster than it has for the last few hundred years, not by the several metres predicted on our front pages. CO2 is a warming agent and emissions should be treated with caution, but the net warming contribution of all the various feedbacks adds up to far less than the headlines scream, and the climate models have mostly proven far too pessimistic. If anything, warming expected in the next few decades is likely to be partly offset by the effects of low solar activity, and by the time it resumes we will have migrated most of our energy production to non-carbon sources, so there really isn’t much of a long-term problem to worry about – I have never lost a wink of sleep worrying about extinction caused by climate change. With likely warming by 2100 pretty manageable, and around half a metre of sea level rise, I certainly don’t think climate change deserves a place on any top 20 list of threats to our existence in the next century, and certainly not on my top 10.

By including climate change and Yellowstone, the top 10 list missed two deserving entries, and my first replacement candidate is the grey goo scenario, or variants of it. The grey goo scenario is that self-replicating nanobots manage to convert everything, including us, into a grey goo. Take away the silly image of tiny metal robots cutting things up atom by atom and the scenario stops being laughable. Replace those little bots with bacteria that include electronics and are linked across their own cloud to their own hive AI, which redesigns their DNA to let them survive in any niche they find by treating whatever is there as food. Where existing bacteria find a niche they can’t exploit, the next generation adapts to it. That self-evolving smart bacteria scenario is rather more feasible, and still results in bacteria that can conquer any ecosystem they find. We would find ourselves unable to fight back and could be wiped out. This isn’t very likely, but it is feasible, could happen by accident or design on our way to transhumanism, and might deserve a place in the top ten threats. It is an amusing one to include, because I also suggest this kind of synthetic organism, and some close relatives, as an excellent mechanism for fixing our environment by breaking down pollution of various kinds. It could be the environment’s saviour, but also its destroyer if not used correctly.

However, grey goo is only one of the NBIC convergence risks we have already imagined (NBIC = Nano-Bio-Info-Cogno). NBIC is a rich seam for doom-seekers. In there you’ll find smart yogurt, smart bacteria, smart viruses, beacons, smart clouds, active skin, direct brain links, zombie viruses, even switching people off. Zombie viruses featured in the top ten TV show too, but they don’t really deserve their own category any more than many other NBIC derivatives. Anyway, that’s just a quick list of deliberate end-of-world mechanisms – there will be many more I forgot to include and many I haven’t even thought of yet. Then you have to multiply the list by three: any of these could also happen by accident, and any could happen via the unintended consequences of a lack of understanding, which is rather different from an accident but just as serious. So deliberate action, accidents and stupidity are the three primary routes to the end of the world via technology. Instead of just the grey goo scenario, then, the far bigger collective threat is NBIC generally, and I’d add NBIC collectively to my top ten list, quite high up, at 3rd after nuclear war and global virus. AI still deserves a separate category of its own, and I’d put it next, at 4th. In fact, the biggest risk of AI being discussed at the moment is its use by maniacs to design viruses and the like, essentially my No. 3 entry.

3: NBIC Weapons

So, AI at No. 4. Many AI ‘experts’ would call that doom-mongering, but it simply isn’t. Apart from being a primary mechanism in risk 3, there are several other ways in which AI could accidentally, incidentally or deliberately destroy humanity, and frankly, to say otherwise is to be either disingenuous or not actually very expert. AI doesn’t stop at digital neural nets or LLMs. Some of my other current projects involve designing AIs that could be extremely powerful, cheap and fast-evolving, very superhuman, and conscious, with emotions. All of that is achievable within a decade. If I can design such things, so can many others, and some of them will not be nice people.

4: AI

One I am very tempted to include is drones. Little tiny ones, not the Predators, and not even the ones everyone seems worried about at the moment that can carry 2kg of explosives or anthrax into the midst of football crowds. Current wars are demonstrating how effective smallish drones can be, but they could get a lot smaller and be even more useful. Tiny drones are far harder to shoot down, and soon we will have a lot of them around. Size-wise, think of midges or fruit flies. They could self-organize into swarms, managed by rogue regimes or terrorist groups, or set to auto, terminator style. They could recharge quickly by solar during short breaks, and restock their payloads from secret supplies that distribute with the swarm. They could be distributed globally using the winds and oceans, so they don’t need a plane or missile delivery system that is easily intercepted. Tiny drones can’t carry much, but with nerve gas or viruses they don’t have to. Defending against such a threat is easy if there is just one: you can swat it. If there is a small cloud of them, you could use a flamethrower. If the sky is full of them and much of the trees and ground infested, it would be extremely hard to wipe them out. So if they were well designed to pose an extinction-level threat, as MAD 2.0 perhaps, this would be way up in the top ten too, at 5th.

5: Micro-Drones

Another class of technology suitable for abuse is space tech. I once wrote about a solar wind deflector using high-atmosphere reflection, and calculated it could melt a city in a few minutes. Under malicious automated control it would be capable of wiping us all out, but it doesn’t justify inclusion in the top ten. One that might is the deliberate deflection of a large asteroid to impact on us; if it makes it in at all, it would be at tenth place, since it just isn’t very likely someone would do that. However, there are many other ways of using the enormous scale of space to make electromagnetic kinetic weapons. I designed quite a few variants and compared their potential power as weapons to our current generation of nuclear weapons. Considering timescales, it seems fair to say that by 2050-2060 the most powerful weapons will be kinetic, not nuclear. Asteroid diversion still presents the most powerful weapon, but an inverse rail gun, possibly built under the guise of an anti-asteroid system, could still deliver the equivalent of 1 gigaton of TNT, as the rough sums below illustrate. (The space anchor weapon was included in my comparison just for fun, and thankfully is only a fictional device from my sci-fi book Space Anchor.)

6: Electromagnetic Kinetic Space Weapons
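
For scale, here is a minimal back-of-envelope sketch of that kinetic yield claim in Python. The projectile masses and speeds are purely illustrative assumptions of mine, not figures from the comparison I mentioned; only the kinetic energy formula and the standard TNT conversion are fixed.

```python
# Back-of-envelope scale check: kinetic energy versus TNT equivalent.
# Masses and speeds below are illustrative assumptions, not design figures.

TNT_JOULES_PER_GIGATON = 4.184e18  # energy of 1 gigaton of TNT, in joules

def velocity_for_yield(mass_kg: float, yield_gigatons: float) -> float:
    """Speed (m/s) a projectile of the given mass needs so that its
    kinetic energy E = 0.5 * m * v**2 matches the requested yield."""
    energy_joules = yield_gigatons * TNT_JOULES_PER_GIGATON
    return (2.0 * energy_joules / mass_kg) ** 0.5

# A 10,000-tonne projectile (1e7 kg) needs ~915 km/s to deliver 1 Gt:
print(f"{velocity_for_yield(1e7, 1.0) / 1000:.0f} km/s")

# A small 100 m rocky asteroid (~1.4e9 kg) needs only ~77 km/s:
print(f"{velocity_for_yield(1.4e9, 1.0) / 1000:.0f} km/s")
```

The point of the sums is simply that, given enough distance in space to accelerate over, quite modest masses reach nuclear-scale energies.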

Solar storms could wipe out our modern way of life by killing our IT. That in itself would kill many people, via riots and fights for the last cans of beans and bottles of water. The most serious solar storms could be even worse. I’ll keep them in my list, at 7th place.

7: Solar Storms

Global civil war could become an extinction-level event, given human nature. We don’t have to go nuclear to kill a lot of people, and once society degrades to a certain level, well, we’ve all watched post-apocalypse movies or played the games: the few left would still fight with each other. I wrote about the Great Western War and how it might come about, and every year that passes it seems more plausible. Political polarisation is getting worse, not better, and such a thing could easily spread globally. I’ll give this 8th place.

8: Global Civil War

A large asteroid strike could happen too, or a comet. Ones capable of extinction-level events shouldn’t hit for a while, because we think we know all the objects big enough to do that. Also, entry 6 is an anti-asteroid weapon turned against Earthly targets, which suggests we may well be able to defend against most asteroids. So this goes well down the list, at 9th.

9: Comet or Asteroid Strike

Alien invasion is entirely possible and could happen at any time. We’ve been sending out radio signals for quite a while, so someone out there might have decided to come and see whether our place is nicer than theirs and take over. It hasn’t happened yet, so it probably won’t, but then it doesn’t have to be very probable to make the top ten. 10th will do.

10: Alien Invasion

High energy physics research has also been suggested as capable of wiping out our entire planet via exotic particle creation, but the smart people at CERN say it isn’t very likely. Actually, I wasn’t all that convinced or reassured, and we’ve only just started messing with real physics, so there is plenty of time left to increase the odds of problems. I’ll place it at number 11, in case you don’t like one of the others.

11: Physics Research

So here is my top ten list (plus the spare at 11) of things likely to cause human extinction, or pretty darn close:

  1. Nuclear war
  2. Highly infectious and lethal virus pandemic
  3. NBIC – deliberate, accidental or lack of foresight (includes smart bacteria, zombie viruses, mind control etc)
  4. Artificial Intelligence, including but not limited to the Terminator scenario
  5. Autonomous Micro-Drones
  6. Electromagnetic kinetic space weapons
  7. Solar storm
  8. Global civil war
  9. Comet or asteroid strike
  10. Alien Invasion
  11. Physics research

I’m not finished yet though. The title asked ‘how nigh is the end?’, not just what might cause it. It’s hard to assign probabilities to each one, but I’ll make my best guesses. Bear in mind that a few on the list don’t really become full-sized risks for a year or two yet, so interpret the figures from a 2030 viewpoint.

So, with my estimated probabilities of occurrence per year:

  1. Nuclear war:  2% (Russia is already threatening their use, Iran very likely to have them soon)
  2. Highly infectious and lethal virus pandemic: 1.75% (All the nutters know how effective COVID was)
  3. NBIC – deliberate, accidental or lack of foresight (includes smart bacteria, zombie viruses, EDNA, TNCOs, ATSOs etc): 1.5% (albeit this risk is really 2030+)
  4. Artificial Intelligence, including but not limited to the Terminator scenario: 1.25%
  5. Autonomous Micro-Drones: 1%
  6. Electromagnetic kinetic weapons: 0.75%
  7. Solar storm: 0.1%
  8. Global civil war: 0.1%
  9. Comet or asteroid strike: 0.05%
  10. Alien Invasion: 0.04%
  11. Physics research: 0.025%

Let’s add them up. The cumulative probability of all eleven is 8.565% per year. That’s a hard number to do sums with, so let’s add a totally arbitrary 1.435% to cover the dozens of risks that didn’t make my top ten (including climate change, often listed as number 1 by doomsayers), rounding the total up to a nice neat 10% per year chance of ‘human extinction, or pretty darn close’. Yikes! Even if we halve that, it’s still 5%. Per year. At those rates the expected wait is roughly one divided by the annual probability, so that only gives us 10-20 years if we don’t change the odds.
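
If you’d rather let a computer do the very simple sums, here they are as a minimal Python sketch using the figures above. Strictly, independent risks combine as 1 - ∏(1 - p) rather than a straight sum, but at probabilities this small the difference is negligible.

```python
# The very simple sums, using my per-year probability guesses listed above.

risks = {
    "Nuclear war": 0.02,
    "Virus pandemic": 0.0175,
    "NBIC": 0.015,
    "AI": 0.0125,
    "Micro-drones": 0.01,
    "Kinetic space weapons": 0.0075,
    "Solar storm": 0.001,
    "Global civil war": 0.001,
    "Comet or asteroid strike": 0.0005,
    "Alien invasion": 0.0004,
    "Physics research": 0.00025,
}

simple_sum = sum(risks.values())
print(f"Simple sum: {simple_sum:.3%}")  # 8.565%

# For independent risks, the chance that at least one strikes in a year
# is 1 - product(1 - p); at these small values it barely differs (~8.27%).
survive_year = 1.0
for p in risks.values():
    survive_year *= 1.0 - p
print(f"At least one per year: {1.0 - survive_year:.3%}")

# Expected time to the end is roughly 1/p for annual probability p,
# and the chance of surviving N years is (1 - p) ** N.
for annual in (0.10, 0.05):  # the rounded-up 10%, and half of it
    print(f"{annual:.0%}/yr: ~{1 / annual:.0f} years expected, "
          f"{(1 - annual) ** 20:.0%} chance of lasting 20 years")
```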

If you can think of good reasons why my figures are far too pessimistic, by all means make your own guesses, but make them honestly, with a fair and reasonable assessment of how the world looks socially, religiously, militarily, politically and environmentally, the quality of our leaders, human nature and so on, and then add them up. You might still be surprised how little time we can expect to have left. I’ll revise my original outlook upwards from ‘a bit doomed’. We’re quite doomed.