Is Terminator coming?

There has been a good amount of discussion about Terminator-style machines recently, thanks to a recent MOD report on smart weapons.

Predator and other remotely controlled drones can fire missiles at enemies while their controllers sit in safety thousands of miles away. They are being used to good effect in Libya. The ethical issues are now being discussed, and it is interesting to see the different points of view.

What some people are asking is: is it fair to kill someone from safety, thousands of miles away? It seems a good question, to which the superficial answer is ‘probably not, but war is rarely fair’. Going a little deeper, the question implies a belief that there ought to be a level playing field, where both sides face equal danger. In other words, it should be a fair fight. But this is not a new issue, and it boils down to a simpler one: should richer adversaries be allowed better weapons? Should bigger and stronger people be allowed to beat up smaller ones? Should a more skilled gladiator be allowed to compete against a less skilled one? In essence, if you have an advantage, should you be allowed to make the most of it in warfare?

I think that using wealth and high tech to gain an advantage over less advanced or less wealthy opponents is just a variant of the imbalance between opponents that has played out on school playgrounds, in amphitheatres and on battlefields, and indeed for billions of years in nature: a lion has a big advantage over a baby antelope. I don’t think these new weapons have fundamentally changed that. Yes, they allow killing at a distance, but so do spears and guns. Even with today’s technology, these weapons are still really just smart spears, making decisions according to predetermined programs written by humans. They have no consciousness or free will yet. They create an advantage, but so does being bigger, fitter or better trained.

Are there other ethical questions then? Well, yes. Should smart-but-not-very-smart machines be allowed to make life-and-death decisions and fire missiles or guns themselves, or should a human always push the fire button? This one is interesting, but again not completely without precedent: the Romans used lions and tigers to kill Christians. A smart killing machine isn’t really very different from a trained lion, a herd of animals stampeded towards your enemy, or a war elephant. The ethical act is done at the point where control is relinquished to something that doesn’t know any better.

Anyway, back to the question: should they? Yes, I think so, provided the terms under which they act are decided in advance by humans, in which case they are just smart machines. All they are doing is extending the physical reach and duration of a human decision, and smartness can go a long way before the machine becomes responsible. A commander sends autonomous troops out to carry out his plans; the fire button is pressed the moment he dispatches the orders, and he is responsible for the act that follows. The soldier carrying out the act is less (but partly) responsible. One day a smart drone will be held partly responsible too, when it is truly aware of its actions and able to decide whether to follow an order, but meanwhile it is still just a smart spear, and the human who sent it out to do its work holds the full responsibility. The fire button isn’t pushed when the drone fires the missile; it is pushed earlier, when the drone is launched and autonomy is handed over, or when the remote controller presses it. No amount of algorithm or program changes that. We can allow machines to make decisions themselves, provided we design the algorithms and equipment they use and accept the responsibility. The machine has no responsibility at all, yet.
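To make the ‘smart spear’ point concrete, here is a minimal, purely illustrative sketch in Python (hypothetical names, not based on any real weapon system): every criterion the machine applies is authored by a human before launch, so the morally significant decision sits at dispatch time, not at the moment the machine later acts on those criteria.

```python
# Illustrative sketch only: the machine applies rules fixed by humans at launch.
from dataclasses import dataclass

@dataclass(frozen=True)  # frozen: the rules cannot change after dispatch
class EngagementRules:
    max_range_km: float
    allowed_target_types: frozenset
    require_positive_id: bool

@dataclass
class Contact:
    target_type: str
    range_km: float
    positively_identified: bool

def may_engage(rules: EngagementRules, contact: Contact) -> bool:
    """Apply only the human-authored rules; the machine adds no judgement of its own."""
    if contact.target_type not in rules.allowed_target_types:
        return False
    if contact.range_km > rules.max_range_km:
        return False
    if rules.require_positive_id and not contact.positively_identified:
        return False
    return True

# The commander 'presses the fire button' here, by choosing the rules and launching:
rules = EngagementRules(max_range_km=8.0,
                        allowed_target_types=frozenset({"armoured vehicle"}),
                        require_positive_id=True)

contact = Contact(target_type="armoured vehicle", range_km=5.2, positively_identified=True)
print(may_engage(rules, contact))  # True: the outcome traces back to the launch-time rules
```

The machine here adds nothing of its own: to change the outcome you must change the human-authored rules, which is exactly where the responsibility lies.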

But how much autonomy should a future machine be allowed, once we go beyond mere algorithms? When it stops being just a smart spear and truly makes its own decisions, understanding their implications? That would require it to be conscious, which also implies a high degree of intelligence, and that will be quite different. Who is responsible then? And what if it is misused, or its software crashes? Of course, such very smart and conscious machines may well develop their own values and ethics too, and they may impose their own constraints on our actions. When that happens, we will have worse things to worry about than ethics. Perhaps this means we shouldn’t worry at all: if machines can’t do anything they are truly responsible for until they become conscious, and once conscious they become a bigger threat that makes ethical considerations irrelevant, then maybe there is simply no stage at which the ethics matters.

I believe this is the case. We can ethically use smart weapons because they are just better spears, and all we are doing is using our technological advantage. When we make self-aware machines that can genuinely make their own decisions, at first they will have safeguards that force them to do our bidding, so they are still just better spears. Once they have a degree of free will, the ethics simply becomes irrelevant: the damage is already done and they will be a threat to humankind. In that case, the ethical act is a one-off, made at the point of pushing the button on the system that creates those first self-aware machines.

To me that makes the whole issue much simpler. There is only one point to worry about: whether we create machines that can truly decide for themselves. Today they just follow algorithms and don’t know what they are doing. Some time soon we must decide whether to pass that critical point. The invitation to Terminator will go out then.

2 responses to “Is Terminator coming?”

  1. I’m worried less about machines taking over and more about people who want to use machines to depopulate the earth. It’s a perfect way to kill us all after the rich get all the machines working for them.


  2. Pingback: Too late for a pause. Minimal AI consciousness by Xmas. | Futurizon: the future before it comes over the horizon

