There is xenophobia, and homophobia, and photophobia (the fear of paparazzi…) The newest one seems to be cyberphobia, or the fear that The Machines will take over and exterminate humanity.
Nowhere is this as blatant as in the debate about ‘killer robots’ — automated/autonomous battlefield systems. At the recent meeting of the UN CCW, whimsically named the ‘Convention on Certain Conventional Weapons,’ some AI experts warned about battle robots becoming weapons of mass destruction (WMDs), shooting up everyone and everything in their path in an uncontrollable frenzy.
Controls and legislation were called for, petitions signed, letters written, with the intent of banning the manufacture and deployment of such weapons.
Well, go right ahead, folks. You will have even less success than you had with landmines and cluster munitions and gases and biologicals: all hidden away now, but ready to come out of the closet if serious war breaks out.
I can only wish you would dedicate your worthy energies to some better cause, such as educating all children in the world to the same standard, or promoting Universal Basic Income, or getting basic sanitation to the two billion folks who still don’t have it…
Let’s start with the claim that battle robots will turn into WMDs, by calmly considering the most recent ‘mass destruction’ we’ve had, the Rwandan genocide. Almost 800,000 people hacked to death with machetes, wielded by humans. Sentient and otherwise compassionate humans, driven by ideology.
Not a single robot in sight, unfortunately.
Maybe a timely intervention by well-armed troops — who stand to machete-wielding crowds as battle robots stand to well-armed soldiers — would have saved many thousands.
All weapons — from rocks to knives to nuclear warheads — are as lethal as those who wield them. Autonomous weapons are much less dangerous than any other; they do as they are told and only as they are told.
A tired, traumatised, battle-weary, indoctrinated soldier is far more indiscriminate than any autonomous weapon.
All armies consist of humans who have been conditioned to obey orders. Before you have a good soldier, first you must make a human into a sort of obedient robot. What if those robotised humans get out of human control?
No need to speculate, they do, all the time. Worse, you only know your soldiers misbehaved if someone tells on them (Mỹ Lai comes to mind…) — while autonomous weapons are monitored and recorded to the microsecond.
People mistake ‘autonomous’ for ‘conscious artificial intelligence’. Not the same at all. No plans exist for cAI weapons systems, as they would be less effective than a simple robot and, because they would not have our evolutionary quirks, far less predictable. They might get to the field and not fire at all, if they do not agree with the target assignment and have no motivation for violence.
The main reason I like the idea of autonomous weapons is that fanatical fighters hate them. There is no glory in destroying a robot, or in being killed by one. ISIS fighters feared drones even more after they learned that many were operated by women.
I don’t see how the destructive potential of battle robots would be higher than that of regular infantry, for example. Any increase in firepower costs money. Combat vehicles today carry extra firepower and armour — even life support — to protect their human crews. More than 98% of all small arms ammunition in battle is spent to protect friendly troops from enemy fire, to force the baddies to keep their heads down and spoil their aim. Such protection won’t be required for autonomous, expendable machines. So the accommodations and firepower can be reduced to favour mobility and speed, which happen to be far more efficient at avoiding enemy fire in the first place.
Remember, the usual purpose of war is not to kill the enemy, but to force him to do what you want, while preventing resistance and stopping attacks. Killing only becomes necessary when there is a consequent threat, like swamping your structure with prisoners, or leaving too many able troops behind your lines. The most efficient use of battle robots would be to disable enemy troops without killing, forcing the opposition to spend resources caring for them.
It is only when the purpose of war is extermination that the killing takes front stage. Countries that have the will and the resources for mass killing do so remotely, in any case, as Saddam did when sending warplanes to gas the Kurds. It is always the low tech violence that makes more victims.
There is a direct link between higher technology and a decrease in battlefield casualties and, more importantly, in harm to civilians. Whatever the news media may show on gory front pages, the recent conflicts in the Middle East would have killed a lot more civilians if both sides had used low tech — shown by the fact that the more technically challenged side routinely used civilians as human shields.
What makes humans so hostile to each other? Would or should we build machines with the same flaws? Why?
It would be very costly to build machines as biased, gullible and prone to cognitive dysfunction as humans. It is cheaper to condition existing people to do your bidding. Artificial intelligence is being developed to give us companions with fewer flaws. The idea that AI must be evil comes straight from religious concepts of ‘perfect creation in god’s image’ — the sort of bullshit behind racism and genocide.
As for the speed and accuracy of robot shooters — the main reason people equate them with WMDs — they are routinely underestimated. Most capital ships in the US Navy, as well as USAF and US Army advance bases, are protected by one or more Phalanx turrets (plenty of videos on YouTube, very entertaining). These are capable of shooting a missile or even a naval cannon shell out of the air.
Now, why not equip a large tank with something like a Phalanx turret? Expensive, but very effective in shooting up stuff.
But, except for very special circumstances, it would be useless as an offensive weapon on a battlefield. There is no need for all that speed and accuracy when your purpose is not to kill, but to interdict, to deny territory and resources to the enemy. Swatting flies with a sledgehammer smashes the furniture but is not likely to kill the flies.
It is never too much to reiterate that the objective of military action is not to kill the enemy, but to stop him. So your Phalanx robot advances, shooting up anything that moves. The enemy does not move. The Phalanx moves on, the enemy sneaks up behind and continues business as usual. Which puts you at the diplomatic disadvantage of having shot up a lot of collaterals without stopping the enemy. Well done.
The Soviets learned this kind of lesson when they first rolled their tanks into Budapest in 1956, only to have the vanguard attacked from the rear, out of deserted streets, with their own low tech invention: Molotov cocktails.
They later taught it to the North Vietnamese, who used it to cause the US troops no end of bother. And they forgot it again in Afghanistan…
An automated system that works on outdated algorithms and doctrines will make the same mistakes as the humans it replaces, only much faster.
Now, people rightly lament the suffering and damage caused by war. All wars are damaging, one way or another, as they are meant to be. The only way to reduce wars is through economic interdependence. You don’t want to kill your customers, you don’t want to attack the country whose currency backs yours and where you have your reserves, you don’t want to alienate your suppliers. That is why we are in the middle of the Long Peace.
War is only good for opportunists, on both sides. Let’s not confuse arms production and sales with war. Almost all the weapons made and sold are never used in anger. Not having weapons is still seen as providing bandit governments or groups with an opportunity, just like leaving your door unlocked, without an alarm system and out of reach of an effective police force.
A few countries, such as Costa Rica, have no armed forces, but they rely on having poorly equipped neighbours, on being polite to all, and on having the USA close by.
Now, it is the poor people, the common people, who cause wars. They vote for the nationalists, the ideologues, the great leaders, the revolutionaries. If you want to make history, the worst possible result is becoming part of it, by voting for an ideology, or following a Great Leader.
If you want to stop the production of weapons, you must convince every bandit, every jihadist group, every militant and survivalist, every banana dictatorship in the world to swear off aggression of any kind. And then some North Korea equivalent will always come up and gobble up your peaceful followers. And the tribal folks will still do their ethnic cleansing using rocks and bamboo spears, with no well armed soldiers to stop them.
We are living in the most peaceful era in the whole of our history. For the first time in some 30,000 years the proportion of people killed by violence is trending down, dramatically. That is closely linked to developments in weapons technology.
The centenary of Cambrai, the first battle won by the use of a sizeable tank formation, is being celebrated. The Brits broke through the Kaiser’s trench fortifications with the new lumbering war machines. The horror.
But it was a breakthrough in technology that broke the trench stalemate and did much to reduce the number of battlefield casualties. It was a significant contribution toward ending the conflict.
And so it goes. We have learned, to our credit, to keep technology advancing without the need for battlefield deployment, thereby reducing the need for battlefields.