Tesla Autopilot

Fatal Tesla Crash In Autopilot Mode: Is It Man Or Machine’s Fault?

A contentious manslaughter case over a deadly crash involving a Tesla with its controversial Autopilot technology engaged is set to begin later this month in Los Angeles.

It is the first case of its kind, and it could set a precedent for future automobile crashes involving driver-assistance software.


We won’t know the exact defense until the case is heard, but the key point is this: the man driving the car faces manslaughter charges yet has pleaded not guilty, opening the door to potentially novel legal arguments about culpability in a fatal collision when, technically speaking, a human wasn’t driving the car.

The forthcoming trial centers on a deadly crash that occurred in 2019. Kevin George Aziz Riad was driving his Tesla Model S when he ran a red light and collided with a Honda Civic, killing a couple who were reportedly on their first date.

According to vehicle data, Riad did not apply the brakes but did have a hand on the steering wheel. Perhaps most importantly, the Tesla’s Autopilot system was active in the seconds preceding the crash. Riad’s attorneys have argued that he should not be punished, but have so far refrained from openly blaming Tesla’s Autopilot software.

Tesla is already under fire over its Autopilot and so-called Full Self-Driving software, even as the company admits that the features “do not make the car autonomous” and that drivers must pay constant attention to the road.

According to a recent Insurance Institute for Highway Safety poll, 42 percent of Tesla Autopilot users are “comfortable treating their vehicles as completely self-driving.”

This month’s trial has a good chance of setting a precedent. Was Riad entirely to blame, or was Tesla’s Autopilot at least partly responsible?

“Who’s at fault, man or machine?”

References: Reuters, Business Insider, Inside EVs, Futurism, Business Today, Insurance Institute for Highway Safety (IIHS), CNBC