Killer Robots: Removing the Human From Inhumane Acts of War
02:03 20.02.2016 (updated 08:05 20.02.2016)
Yale professor calls for US to lead in non-proliferation of automated weapons.
If you’ve ever taken the Metro along the Washington DC Beltway, you’ve probably seen Northrop Grumman’s unnerving drone advertisement under the banner “Unmanned Power.”
An allegory of the military-industrial complex’s interest in rendering human dignity obsolete, the advertisement’s slogan undersells it only a bit. Some facetious rebranding suggestions: “Northrop Grumman: Hello, Gentle Commuters, We Could Kill You Right Now,” or “Northrop Grumman: Say Goodbye to International Laws of War.” But who can say whether the company’s target demographic sits on the train after a long workday thinking, “If only I knew where I could buy a Predator drone.”
This week, international scholars predicted that the United Nations’ campaign for a comprehensive treaty restricting the automation of weapons, including drones, will ultimately fail. The prediction comes at a time when academics warn that the Pentagon is rushing to develop and expand its arsenal of deadly autonomous weapons.
One Yale bioethicist, Wendell Wallach, believes that the United States, the nation with the highest number of defense lobbyists per capita, could ultimately play a leading role in preventing the proliferation of killer robots. Wallach calls for an executive order proclaiming that lethal autonomous weapons violate existing international humanitarian law. But can the most militarized nation in modern history really be trusted to act as the world’s common-sense peacekeeper?
As Wallach argues, “It’s going to be hard to put an arms control agreement in place for unmanned killer drones,” citing issues of attribution under the laws of war and the impossibility of criminally sanctioning a robot. In essence, Wallach’s position is that, without immediate action, the only delineation between a war criminal and a well-paid executive will be a line of code. And he is correct.
However, that also spells out precisely why his proposal will never come to fruition. Sadly, the remainder of Wallach’s optimistic proposals die the same death.
Wallach calls for a private-sector committee of experts, rather than a federal agency, to oversee autonomous weapons — a body similar to the private groups that monitored US financial markets until 2008.
Wallach also advocates that 10 percent of funding for artificial intelligence go directly toward helping military personnel cope with the fact that killer robots will take their jobs. The implication is that citizens could become a kind of feudal dissident class in opposition to our killer robotic overlords, including Northrop Grumman and the rest of the immensely profitable corporate military-industrial complex. Many question both the power of the president to regulate killer robots and the power of the American people to resist an eventual servitude.