Self-driving Vehicles: An Ethical Quandary

by William Mattar | February 17th, 2017

In time, autonomous cars will rule our roads. This new landscape will change the way we get from Point A to Point B, relying on GPS coordinates and computer algorithms—instead of our senses of vision and hearing, which we have relied on to “see what is there to be seen” since the advent of the automobile—to make the trip safely.

As explored in our previous blog post, this is no longer the stuff of science fiction.

The technology exists; it is now a matter of refining it so that self-driving vehicles will make our roads safer before they hit the road en masse. Business Insider projects that 10 million self-driving cars will be on the road by 2020.

As with most major societal changes, opinions are divided.

Some argue that self-driving cars will reduce motor vehicle collisions, resulting in fewer bodily injuries. For instance, the National Highway Traffic Safety Administration found that 93% of accidents are caused by human error. While Google’s self-driving cars have been involved in several accidents since launch, according to its testers, all were caused by human error.

Because self-driving cars will reduce, or even eliminate, the need for drivers to focus on the road ahead, it is believed that they will drastically reduce the number of collisions.

For instance, one article documents a stunt performed in California’s Mojave Desert: five stunt drivers, traveling at 50 mph, steered self-driving cars into a convoy headed toward the rear of a truck.

After activating “smart” cruise control, which allowed the vehicles to navigate the track unaided, the self-driving cars’ “automatic emergency braking system immediately kicked in and the car came to a standstill before it could smash into the truck.”

Opponents of self-driving cars point to other statistics, namely a study from the University of Michigan Transportation Research Institute revealing that, although the collisions were the result of other motorists’ errors, “self-driving cars are involved in crashes at five times the rate of conventional cars.”

The same study revealed that the injury rate also increased fourfold.

In addition, others have cited the ethical quandary posed by self-driving cars. A recent article published in the MIT Technology Review, “Why Self-Driving Cars Must Be Programmed to Kill,” highlights this issue, presenting the following questions:

How should the car be programmed to act in the event of an unavoidable accident? Should it minimize the loss of life, even if it means sacrificing the occupants, or should it protect the occupants at all costs? Should it choose between these extremes at random?

The article frames the dilemma as follows:

Imagine that in the not-too-distant future, you own a self-driving car. One day, while you are driving along, an unfortunate set of events causes the car to head toward a crowd of 10 people crossing the road. It cannot stop in time but it can avoid killing 10 people by steering into a wall. However, this collision would kill you, the owner and occupant. What should it do?

Not all drivers, with a steering wheel in their hands, would react to this predicament in the same way. The split-second response would vary based on an array of factors that differ from person to person. Nevertheless, autonomous vehicle manufacturers are tasked with developing a uniform response to this and other scenarios at the intersection of ethics and technology.

The bottom line is this: Self-driving cars are going to drastically change society. Whether the change is for the better or worse has not yet been decided.

If you’ve been injured in a motor vehicle accident, the New York motor vehicle accident lawyers at William Mattar Law Offices want to help. Give us a call anytime at (844) 444-4444 to speak with a member of our legal staff.