A Valentine’s Day Heartbreaker for Google

by William Mattar | April 20th, 2017

As covered in an earlier blog entry, over the past five years Google has taken pride in the fact that, although its self-driving car fleet has been involved in several accidents since launch, all were caused by human error.

That all changed on February 14, 2016, Valentine’s Day, when Google’s vehicle was involved in a collision with a public transit bus.
The tech giant concedes it bears “some responsibility for the incident.” According to a Department of Motor Vehicle Report submitted by Google, the accident occurred as follows:

A Google Lexus-model autonomous vehicle (“Google AV”) was traveling in autonomous mode eastbound on El Camino Real in Mountain View in the far right-hand lane approaching the Castro Street intersection. As the Google AV approached the intersection, it signaled its intent to make a right turn on red onto Castro St. The Google AV then moved to the right-hand side of the lane to pass traffic in the same lane that was stopped at the intersection and proceeding straight. However, the Google AV had to come to a stop and go around sandbags positioned around a storm drain that were blocking its path. When the light turned green, traffic in the lane continued past the Google AV. After a few cars had passed, the Google AV began to proceed back into the center of the lane to pass the sandbags. A public transit bus was approaching from behind. The Google AV test driver saw the bus approaching in the left side mirror but believed the bus would stop or slow to allow the Google AV to continue. Approximately three seconds later, as the Google AV was reentering the center of the lane it made contact with the side of the bus. The Google AV was operating in autonomous mode and traveling at less than 2 mph, and the bus was traveling at about 15 mph at the time of contact.

In essence, the Google vehicle attempted to re-enter its lane, incorrectly assuming that the public transit bus would yield and let it in.

According to a Reuters article, Google has reviewed the incident “and thousands of variations on it in [a] simulator in detail,” making refinements to its software. The goal: to program the cars to “understand that buses (and other large vehicles) are less likely to yield to . . . than other types of vehicles.”

Google has since released a report putting the incident in perspective and accepting some of the responsibility:

This is a classic example of the negotiation that’s a normal part of driving — we’re all trying to predict each other’s movements. In this case, we clearly bear some responsibility, because if our car hadn’t moved there wouldn’t have been a collision. That said, our test driver believed the bus was going to slow or stop to allow us to merge into the traffic, and that there would be sufficient space to do that.

This poses an interesting question: Will a self-driving car ever offer the same instinctual response as a human driver? In other words, can sensors, monitors, and equations expressed in ones and zeros ever replicate the response of a sentient, self-determining person?

This question is loaded with equal parts ethics and technology.

If you’ve been injured in a motor vehicle collision, the New York motor vehicle accident lawyers at William Mattar Law Offices want to help. Give us a call anytime at (844) 444-4444 to speak with a member of our legal staff.