“This is a new round in the international arms race, and it brings different dangers for international stability,” Dr Jürgen Altmann, a physicist at the Technical University of Dortmund and vice-chairman of the International Committee for Robot Arms Control (ICRAC), told The Local on Thursday.
“Seven countries worldwide have armed drones, and if Germany and the other Europeans jump in then it's suddenly a lot more,” he explained.
“This contributes to an uncontrolled escalation in unmanned aerial vehicles.”
His comments came the day after Germany's Defence Ministry announced plans to develop an armed drone together with Italy and France by 2025.
While ICRAC is not pushing for a global ban on armed drones – they are already too widespread for that – Altmann argued that Germany should not be pushing the technology forward.
For Altmann and his fellow scientists, unmanned aerial vehicles present many fresh dangers. By removing the risk to human soldiers and pilots, they lower the political cost of military action – making it less controversial at home for a politician to use violence abroad.
Not only that, but their potential unconventional uses could spark a fresh age of paranoia on the geopolitical stage.
“There are new possibilities for surprise attacks, most dangerously against nuclear weapons or command and control systems,” Altmann said. “The fear of such attacks will increase.”
But ICRAC sees the real fight to come as being against autonomous weapons systems.
“Researchers are already considering how they might let these weapons systems choose their own targets,” Altmann notes.
Robots deciding for themselves what to shoot at and when could lead to terrible unforeseen consequences.
“If you think of two fleets of autonomous weapons systems circling one another, which will attack if they think they are about to be attacked, you have two software systems encountering one another – through real-world military hardware – that have never been tested together.”
He points to cases in civilian life where interactions between such complex systems have caused chaos, such as the Wall Street “Flash Crash” of 2010, triggered in part by automated high-frequency stock trading programs.
The difference with an armed autonomous drone, Altmann said, is that a war could be sparked when a software system mistakes the sun's reflection for a rocket launch.
“Huge results could come from misinterpreted signals,” he points out. “They could start shooting even though the human politicians aren't prepared to start a war.”
Draw a red line
That's why Altmann and a 10-strong delegation of his ICRAC colleagues will attend a UN session in Geneva later this month to discuss such “Lethal Autonomous Weapons Systems” (LAWS).
Under the framework of the Convention on Certain Conventional Weapons, the scientists hope to see countries agree on restrictions for autonomous weapons.
It's a fight they have been waging since 2009, when the organization was set up to press for an “arms control regime” that would prevent the development of autonomous killing machines and limit the weapons carried by remotely-controlled drones, including bans on drone-borne nuclear weapons and robotic space weapons.
SEE ALSO: Germany, France, Italy want Euro drones