
How Will Machines Face Moral Dilemmas?

As machines like drones and automobiles become more independent of their human operators, they will increasingly face moral dilemmas. Will they know right from wrong? 

What’s the Latest Development?

As machines become increasingly autonomous from their human operators, such as self-guided drones or self-driving vehicles (which have been licensed to operate in Nevada), they will face moral dilemmas. “Should a drone fire on a house where a target is known to be hiding, which may also be sheltering civilians? Should a driverless car swerve to avoid pedestrians if that means hitting other vehicles or endangering its occupants?” Groups such as the International Committee for Robot Arms Control have formed in opposition to military drones, but on the whole, autonomous machines stand to do more good than harm.

What’s the Big Idea?

The human legal system, which depends on concepts such as agency and responsibility, must determine who is at fault if an autonomous machine does harm. Is it the designer, the programmer, the manufacturer, or the operator? In cases where moral dilemmas arise, machines should be programmed to make decisions that accord with most people’s moral judgments. “The sooner the questions of moral agency [autonomous machines] raise are answered, the easier it will be for mankind to enjoy the benefits that they will undoubtedly bring.”

