Nuclear weapons + ?

I was privileged and honoured in 2005 to be elected one of the Fellows of the World Academy of Art and Science. It is a mark of recognition and distinction that I wear with pride. The WAAS was set up by Einstein, Oppenheimer, Bertrand Russell and a few other great people as a forum to discuss the big issues that affect the whole of humanity, especially the potential misuse of scientific discoveries and, by extension, technological developments. Not surprisingly, therefore, one of its main programmes from the outset has been the pursuit of the abolition of nuclear weapons. It’s a subject I have never written about before, so maybe now is a good time to start. Most importantly, I think it’s now time to add others to the list.

There are good arguments on both sides of this issue.

In favour of nukes, it can be argued from a pragmatic stance that the existence of nuclear capability has contributed to a reduction in the ferocity of wars. If you know that the enemy could resort to nuclear weapons if pushed too far, that knowledge may create some pressure to restrict the devastation levied on them.

But this only works if both sides value the lives of their citizens sufficiently. If a leader thinks he may survive such a war, or doesn’t mind risking his life for the cause, then the deterrent ceases to work properly. An all-out global nuclear war could kill billions of people and leave the survivors in a rather unpleasant world. As Einstein observed, he wasn’t sure what weapons World War 3 would be fought with, but World War 4 would be fought with sticks and stones. Mutually assured destruction may work to some degree as a deterrent, but it relies on second-guessing a madman. It isn’t a moral argument, just a pragmatic one. Wear a big enough bomb, and people might give you a wide berth.

Against nukes, it can be argued on a moral basis that such weapons should never be used in any circumstances, because their capacity to cause devastation goes beyond the limits that any civilisation should tolerate. Furthermore, any resources spent on creating and maintaining them are wasted and could have been put to better, more constructive use.

This argument is appealing, but lacks pragmatism in a world where some people don’t abide by the rules.

Pragmatism and morality often align with the right and left of the political spectrum, but there is a solution that keeps both sides happy, albeit an imperfect one. If all nuclear weapons could be removed, and stay removed, so that no one has any or can build any, then pragmatically there could be even more wars, and they might be even more prolonged and nasty, but the damage would be kept short of mutual annihilation. Terrorists and mad rulers wouldn’t be able to destroy us all in a nuclear Armageddon. Morally, we may accept the increased casualties as the cost of keeping the moral high ground and protecting human civilisation. This total disarmament option is the goal of the WAAS. Pragmatic to some degree, and just about morally digestible.

Another argument that is occasionally aired is the ‘what if?’ WW2 scenario. What if nuclear weapons hadn’t been invented? More people would probably have died in a longer WW2. If they had been invented and used earlier by the other side, and the Germans had won, perhaps we would have ended up with a unified Europe with the Germans in the driving seat. Would that be hugely different from the Europe we actually have 65 years later anyway? Are even major wars just fights over the nature of our lives for a few decades? What if the Romans or the Normans or the Vikings had been defeated? Would Britain be so different today? ‘What if?’ debates get you little except interesting debate.

The arguments for and against nuclear weapons haven’t really moved on much over the years, but now the scope is changing a bit. They are as big a threat as ever, maybe more so given the increasing possibility of rogue regimes and terrorists getting their hands on them, but we are adding other technologies that are potentially just as destructive, in principle anyway, and they could be weaponised if required.

One path to destruction that has entered a new phase in the last few years is our messing around with the tools of biology. Biotechnology and genetic modification, synthetic biology, and the linking of external technology into our nervous systems are distinct strands of this threat, but each of them is developing quickly. What links them all is the increasing understanding, harnessing and ongoing development of processes similar to those that nature uses to make life. We start with what nature provides, reverse engineer some of the tools, improve on them, adapt and develop them for particular tasks, and then use them to do things that improve on or interact with natural systems.

Alongside nuclear weapons, we have already become used to the bio-weapons threat based on genetically modified viruses or bacteria, and to weapons using nerve gases that inhibit neural functioning to kill us. But not far away is biotech designed to change the way our brains work, potentially to control or enslave us. It is starting benignly, of course, helping people with disabilities or nerve or brain disorders. But some will pervert it.

Traditional war has been based on causing enough pain to the enemy until they surrender and do as you wish. Future warfare could be based on altering their thinking until it complies with what you want, making an enemy into a willing ally, servant or slave. We don’t want to lose the great potential for improving lives, but we shouldn’t be naive about the risks.

The broad convergence of neurotechnology and IT is a particularly dangerous area. Adding artificial intelligence into the mix opens the possibility of smart, adapting organisms as well as Terminator-style threats: organisms that can survive in multiple niches, or hybrid nature/cyberspace ones that use external AI to redesign their offspring to colonise others; organisms that penetrate your brain and take control.

Another dangerous offspring of our better understanding of biology is that we now have clubs where enthusiasts gather to make genetically modified organisms. At the moment this is benign novelty stuff, such as transferring a bioluminescence gene or a fluorescent marker to another organism, just another after-school science club for gifted school-kids and hobbyist adults. But it is, I think, a dangerous hobby to encourage. With better technology and skills developing all the time, some of those enthusiasts will move on to designing and creating synthetic genes, some won’t like being constrained by safety procedures, and some may have accidents and release modified organisms into the wild that were developed without observing the safety rules. Some will use these clubs to learn genetic design, modification and fabrication techniques and then work in secret or teach terrorist groups. Not all the members can be guaranteed to be fine upstanding members of the community, and it should be assumed that some will be people of ill intent trying to learn how to do the most possible harm.

At least a dozen new types of WMD are possible based on this family of technologies, even before we add in nanotechnology. We should not leave it too late to take this threat seriously. Whereas nuclear weapons are hard to build and require large facilities that are hard to hide, much of this new work can be done in garden sheds or ordinary office buildings. These weapons are embryonic and even theoretical today, but that won’t last. I am glad to say that in organisations such as the Lifeboat Foundation (lifeboat.com), in many universities and R&D labs, and doubtless in military ones, some thought has already gone into defence against them and how to police them, but not enough. It is time now to give these kinds of threats the same attention we give to the nuclear one.

With a global nuclear war, much of the life on earth could be destroyed, and that will become possible with the release of well-designed organisms. But I doubt if I am alone in thinking that the possibility of being left alive with my mind controlled by others may well be a fate worse than death.

