A 42-year-old Tesla driver who initially denied responsibility for a hit-and-run that killed a woman now says he cannot recall whether he actually hit her. His claim: if he did, the car must have been running Tesla's Autopilot feature while he was distracted checking work emails.
The defense strategy highlights a gap in how the legal system handles driver assistance software: features that are limited by design are nonetheless being invoked to deflect blame.
The evidence against the driver, who initially denied involvement, has been mounting. Police found a windshield wiper and obtained surveillance footage linking his car to the incident. The vehicle also showed minor damage, and investigators collected hair samples from three places.
Charges have not yet been filed. The driver's attorney, however, is already advancing a bizarre line of argument.
“My client voluntarily spoke to investigators, and he explained it is probable his car would’ve been using Tesla’s full self-driving capability,” he told the media, referring to the EV maker’s infamous driver assistance add-on.
Despite what Tesla's marketing may suggest, its vehicles are not capable of fully autonomous driving; as the company's own website states, drivers must remain ready to intervene and take control, including applying the brakes.
The carmaker is currently under investigation by the National Highway Traffic Safety Administration over a string of incidents in which Tesla vehicles collided with stopped emergency response vehicles displaying flashing lights or flares.
The number of known deaths involving Tesla's Autopilot has also surged: a June analysis of NHTSA data found at least 736 US crashes involving the controversial driver assistance feature since 2019, at least 17 of them fatal.
References: The Washington Post, Futurism, Star Tribune, Inside EVs, NHTSA