Let's Take A Closer Look

Explaining complicated subject matter simply since 1986

U.S. President Ronald Reagan liked to define “mixed emotions” as the feelings a man has as he watches his mother-in-law drive over a cliff in his new Cadillac. Another source of mixed emotions involves how driverless vehicles will be programmed to react in emergencies: one approach is to act in the interests of the passengers; the other is to act for the greater good. Why is this important?

Because driverless vehicle technology will need to deal with moral and ethical dilemmas.

There are many scenarios involving undesirable alternatives. For example, when a pedestrian steps in front of a car that can’t stop in time, the car’s brain must make an unpleasant choice. Will it direct the car to hit the pedestrian, hit an oncoming car in the other lane, or swerve into a tree? Vehicles will react by using pre-programmed formulas. Who will be making the decisions about how the algorithms are written?
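To make the idea of a “pre-programmed formula” concrete, here is a minimal, purely hypothetical sketch of how such a rule might rank emergency maneuvers. The option names, harm scores, and the `occupant_weight` knob are illustrative assumptions, not any manufacturer’s actual logic; real systems are vastly more complex.

```python
# Hypothetical sketch of a pre-programmed emergency decision rule.
# All names, scores, and weights here are illustrative assumptions,
# not any manufacturer's actual algorithm.

def choose_maneuver(options, occupant_weight=1.0):
    """Pick the maneuver with the lowest weighted expected harm.

    options: list of dicts with hypothetical fields:
      name, occupant_harm, bystander_harm (estimated severities, 0.0-1.0)
    occupant_weight: >1 favors the passengers, <=1 favors the greater good.
    """
    def weighted_harm(opt):
        return occupant_weight * opt["occupant_harm"] + opt["bystander_harm"]
    return min(options, key=weighted_harm)["name"]


# The three unpleasant choices from the pedestrian scenario above:
scenarios = [
    {"name": "brake_and_hit_pedestrian", "occupant_harm": 0.1, "bystander_harm": 0.9},
    {"name": "swerve_into_oncoming_car", "occupant_harm": 0.7, "bystander_harm": 0.6},
    {"name": "swerve_into_tree",         "occupant_harm": 0.8, "bystander_harm": 0.0},
]

# With equal weighting, the rule sacrifices the occupants.
print(choose_maneuver(scenarios))                       # swerve_into_tree
# Weighting occupants heavily flips the choice to protect the passengers.
print(choose_maneuver(scenarios, occupant_weight=5.0))  # brake_and_hit_pedestrian
```

The point of the sketch is that the ethical dilemma reduces to a single parameter someone must choose: whoever sets `occupant_weight` decides whose safety the algorithm favors.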

The Trolley Problem.

There are many versions of this classic thought experiment in moral philosophy. Azim Shariff, director of the Culture and Morality Lab at UC Irvine, used one version in the study “The Social Dilemma of Autonomous Vehicles,” published in the journal Science. He found most of us believe the vehicle’s occupants should be protected at all costs – when we are the occupants. When others are the passengers, we think the vehicle should sacrifice itself and its occupants to safeguard pedestrians and other drivers.

Few of us are surprised to hear that self-preservation wins, but where does this leave us?

Who will be the ones to decide how driverless vehicles will be programmed to handle ethical dilemmas? Will it be left to the manufacturers, who are all competing with each other? Or will it be the government deciding what type of programming manufacturers will use?

Take a look at the MIT Media Lab’s website, Moral Machine, which they describe as offering “human perspectives on machine ethics.”

They have put together an interactive set of a dozen lesser-of-two-evils scenarios. You are the one who decides which action the vehicle takes.

Take a few minutes now and see how your philosophy compares with others.
