
Self-driving cars: What you don’t know could kill you

Jan 26, 2017 | Car Accidents

Self-driving cars (also known as fully autonomous vehicles) have been hyped as the future of car travel. By replacing error-prone human drivers with powerful sensors and computer software, the reasoning goes, crashes could be significantly reduced or even eliminated.

While the idea is tempting and early successes have been promising, fully autonomous vehicles are not yet a reality. In the meantime, as car companies introduce features that make vehicles semi-autonomous, the danger of fatal accidents is very real. Moreover, this transition period is full of questions about who is liable for these crashes.

Tesla has been among the first companies to make mostly autonomous driving technology available to consumers. But the chosen name for this feature, “Autopilot,” is more than a little misleading. Many drivers don’t realize that “Autopilot requires full driver engagement at all times,” according to a spokesperson for the National Highway Traffic Safety Administration.

There seems to be miscommunication surrounding both the responsibilities of drivers and the limits of this technology. Unfortunately, that gray area may be at least partially to blame for a fatal crash. In May 2016, a man driving his Tesla in Autopilot mode was killed on a Florida highway when his vehicle collided with a truck crossing the road.

As it turns out, Tesla’s Autopilot is effective at preventing rear-end crashes but has more difficulty with cross-traffic collisions. According to an accident investigation, neither the driver nor Autopilot applied the brakes prior to the crash, even though the driver likely would have had time to see the collision coming. This suggests that he didn’t know he was supposed to serve as a backup to Autopilot, or didn’t realize that Autopilot had such serious limitations.

Investigators determined that the Autopilot software functioned as intended, and therefore that no recall was necessary. But is Tesla to blame for failing to adequately educate and warn its customers about the limits of its driver-assistance feature? These are the types of questions that will likely need to be answered in order to prevent future fatal accidents and to compensate the families of victims killed in such crashes.