
Peter Singer on the ethics of drones and computer viruses, and how each is becoming more like the other.

The push of a button was never considered a violent act until the 20th century. Cold War films like Dr. Strangelove turned the "Doomsday" button--the launch of nuclear missiles that assured the mutual destruction of East and West--into a cultural icon.

In recent years, with the boom in video games, war is no longer about the push of a single button; it now conjures up images of playing a computer game. In the resurrected series Arrested Development, the smothered son Buster enlists in the military to escape his overbearing mother; someone as helpless as he is finds success playing what he believes to be a video game. To his horror, he eventually learns that he is actually commanding drones.

As this satire illustrates, our drones--like all technology--are only as smart and as ethical as those controlling them.

Peter Singer, director of the Center for 21st Century Security and Intelligence and a senior fellow in the Foreign Policy program at the Brookings Institution, spoke to Big Think about shifting war technologies and the ethical issues they raise.

As Singer explains, how we wage war continues to advance in ways that may one day bring science fiction to life:

There’s been an enormous amount of changing forces on warfare in the twenty-first century.  And they range from new actors in war like private contractors, the Blackwaters of the world, to the growth of warlord and child soldier groups, to technological shifts: the introduction of robotics, to cyber.  And one of the interesting things that ties these together is how not only the who of war is being expanded but also the where and the when.  So one of the things that links, for example, drones and robotics with cyber weapons is that you’re seeing a shift in the geographic location of the human role.  Humans are still involved.  We’re not in the world of the Terminator.  Humans are still involved, but there’s been a geographic shift where the operation can be happening in Pakistan but the person flying the plane might be back in Nevada, 7,000 miles away.

But new tricks revive age-old debates. Singer touches on Stuxnet, a computer virus that attacked Iran's nuclear centrifuges while initially going undetected. A cyberweapon seems less violent, of course, than traditional forms of warfare. But even cyberwarfare is not free from ethical scrutiny, as Singer points out:

One of the next steps in this both with the physical side of robotics and the software side of cyber is a shift in that human role – not just geographically but chronologically where the humans are still making decisions but they’re sending the weapon out in the world to then make its own decisions as it plays out there.  In robotics we think about this as autonomy.  With Stuxnet it was a weapon.  It was a weapon like anything else in history, you know, a stone, a drone – it caused physical damage...

On one hand we can say this may have been the first ethical weapon ever developed.  Again, whether we’re talking about the robots or Stuxnet, they can be programmed to do things that we would describe as potentially ethical.  So Stuxnet could only cause harm to its intended target.  It popped up in 25,000 computers around the world, but it could only harm the ones with this particular setup, in this particular geographic location, doing nuclear research.  In fact, even if you had nuclear centrifuges in your basement, it still wouldn’t harm them.  It could only hit those Iranian ones.  Wow, that’s great, but as the person who discovered it, so to speak, put it, “It’s like opening Pandora’s box.”  And not everyone is going to program it that way with ethics in mind.

For Singer's complete interview, watch the video.
