The Moral Dilemma Presented by Driverless Cars

Why you should care

Because tech is coming for you, and it’s making decisions for you. Are you OK with that? 

Someday, perhaps sooner than you think, robots are likely to make moral decisions, according to a small but growing group of robot ethicists. Canadian researcher Jason Millar is one of them. He blogs about robots and moral philosophy at Robohub.org, and earlier this summer he posted a hypothetical question on “death by autonomous car”:

“Pretend you’re alone in a driverless car on a single-lane road that’s heading into a tunnel. A child suddenly runs across the tunnel’s entrance, trips and falls. You can either hit the child and save yourself or swerve into the tunnel edifice, killing yourself but saving the child.”

Of the 110 people who responded to the online poll, some 64 percent said they’d save themselves, while 36 percent would sacrifice themselves for the child.

The questions are a modern take on old philosophical conundrums — the so-called “trolley questions.” They’re versions of: Do you let a runaway train continue on its deadly course, killing an innocent bystander but sparing a building full of people, or do you divert it away from the lone person, allowing it to crash into the building?

“They keep asking the question — how should engineers design these cars to react?” says Millar. “And I thought … there’s another question nobody’s asking: Who should decide how the car swerves? Who has the moral authority?”

Relatively few people are asking such questions, Millar says. His field, robot ethics, is so new that four years ago, when he started teaching, he struggled to find readings for his students. Now there are plenty.

Take Patrick Lin, an ethics expert who has war-gamed with the Pentagon and consulted Google about ethics and engineering. In a Wired piece last year on driverless cars, he wrote: “Say that the best we could do is make robot cars reduce traffic fatalities by 1,000 lives. That’s still pretty good. But if they did so by saving all 32,000 would-be victims while causing 31,000 entirely new victims, we wouldn’t be so quick to accept this trade — even if there’s a net savings of lives.”

As you’re automating more and more technologies … you’re sometimes going to be automating answers to what turn out to be deeply moral questions.
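
What that automation can look like in code is easy to sketch. The fragment below is purely hypothetical; the class, function and field names are invented, not drawn from any real vehicle’s software. It simply shows how a “moral” answer ends up as an ordinary branch that some engineer had to write:

```python
# Hypothetical sketch only: these names come from no real autonomous-vehicle
# codebase. The point is that the moral choice ends up as a branch someone
# had to write, and that some party had to decide how it should go.

from dataclasses import dataclass


@dataclass
class Obstacle:
    is_person: bool
    avoidable_without_harm: bool


def choose_maneuver(obstacle: Obstacle, occupant_count: int) -> str:
    """Return 'brake', 'swerve' or 'continue' for an imminent collision."""
    if obstacle.avoidable_without_harm:
        return "brake"  # the easy case: no dilemma at all
    if obstacle.is_person and occupant_count == 1:
        # Millar's tunnel problem lives in this line: whoever writes it has
        # decided in advance whether the car protects the child or its occupant.
        return "swerve"  # sacrifice the occupant
    return "continue"  # protect the occupant


if __name__ == "__main__":
    child_at_tunnel = Obstacle(is_person=True, avoidable_without_harm=False)
    print(choose_maneuver(child_at_tunnel, occupant_count=1))  # -> swerve
```

Whether that branch should read “swerve” or “continue” is exactly the question Millar raises: someone decides it in advance, and he asks who has the moral authority to do so.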

The ethics of electronics aren’t just about life-and-death decisions. A former R&D engineer himself, Lin says few designers worry about anything but functionality and marketability. But designs can affect both how we see ourselves and how we are broadcast to the world, something close to our core identity. This is no idle textbook wondering: in a way, Facebook’s algorithms already control what we present to the world.

Algorithms are the priceless code that drives sites to do things like make money for Mark Zuckerberg. But that also means that code makes decisions about us — often without our knowing it.
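
A toy example makes that concrete. The ranking function below is hypothetical: the field names and weights are invented, and no platform publishes its real ones, which is part of the problem. It shows how a few lines of scoring logic quietly decide what a user sees first:

```python
# Hypothetical sketch of "code making decisions about us": a toy feed-ranking
# function. The weights are invented; real platforms keep theirs private.

def rank_posts(posts):
    """Order posts by a score the user never sees and never chose."""
    def score(post):
        return (
            2.0 * post.get("ad_revenue_estimate", 0.0)     # what earns money
            + 1.0 * post.get("predicted_engagement", 0.0)  # what keeps you scrolling
            + 0.1 * post.get("declared_interest", 0.0)     # what you said you wanted
        )
    return sorted(posts, key=score, reverse=True)


if __name__ == "__main__":
    feed = [
        {"id": "friend_update", "declared_interest": 0.9, "predicted_engagement": 0.2},
        {"id": "sponsored_post", "ad_revenue_estimate": 0.8, "predicted_engagement": 0.6},
    ]
    print([p["id"] for p in rank_posts(feed)])  # the sponsored post wins
```

Change one weight and a different post wins; the user is never consulted and never told.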

Engineers must change their approach to these robot dilemmas, says Millar. “I look at those kinds of issues the way critics might have looked at health care issues 50 or 60 years ago,” he says, pointing out that medical ethics make health care practitioners think about their work — like end-of-life decisions — in a moral way, not just a technical one. “The same thing needs to happen in engineering,” he argues.

al·go·rithm /ˈalgəˌriT͟Həm/

noun

  1. a process or set of rules to be followed in calculations or other problem-solving operations, especially by a computer: “a basic algorithm for division”

Source: Oxford Dictionaries
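
The dictionary’s own example, “a basic algorithm for division,” fits in a few lines. The sketch below is purely illustrative: a set of rules, followed mechanically, that turns a dividend and divisor into a quotient and remainder by repeated subtraction.

```python
# Illustrative only: integer division by repeated subtraction, the dictionary's
# "basic algorithm for division" written out as explicit rules.

def divide(dividend: int, divisor: int) -> tuple[int, int]:
    """Return (quotient, remainder) for a non-negative dividend and positive divisor."""
    if divisor <= 0 or dividend < 0:
        raise ValueError("expects a non-negative dividend and a positive divisor")
    quotient = 0
    remainder = dividend
    while remainder >= divisor:   # apply the rule until it no longer applies
        remainder -= divisor
        quotient += 1
    return quotient, remainder


print(divide(17, 5))  # (3, 2): 17 = 3 * 5 + 2
```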

Meanwhile, users should know that designers and engineers are making moral choices, he says. And they should understand what algorithms are — and that those algorithms might do things with our thoughts and appearances that we don’t want, without our realizing it.

Then we need to ask questions, like when our right to present ourselves as we wish can supersede a company’s ability to profit. This one, too: Who gets to determine how a driverless car will react when faced with a kid, a tunnel and you?
