The family of a man who was killed when his Tesla Model X plowed into a highway barrier is suing the electric vehicle maker for allegedly using drivers as guinea pigs to test the company's artificial intelligence technology.
Tesla is “beta testing its Autopilot software on live drivers,” Mark Fong, an attorney representing Walter Huang’s family, said in a statement. Huang, a Silicon Valley resident, died when his vehicle slammed into a barrier on U.S. 101 in California in 2018. His vehicle’s semi-automated feature misread the lines on the road and failed to account for the concrete median, according to the April 26 lawsuit.
Fong added: “The Huang family wants to help prevent this tragedy from happening to other drivers using Tesla vehicles or any semi-autonomous vehicles.” Fong said that while preparing the lawsuit, the family’s lawyers had access to Huang’s vehicle but not to the data collected by Tesla.
“We had access to the car but the data in the car is proprietary. Tesla possesses that and the ability to decrypt it,” he said during a press conference Wednesday. “We downloaded what we could that was in the public domain, shall we say, that’s able to be accessed by non-proprietary sources.” (RELATED: Report Shows Tesla In Autopilot Mode Accelerated Seconds Before Deadly Wreck)
Tesla has not responded to The Daily Caller News Foundation’s request for comment about the lawsuit’s allegations. CEO Elon Musk frequently touts Autopilot, calling it a feature that will revolutionize the auto industry, but he warns customers not to depend entirely on it.
Autopilot was being used at the time of Huang’s crash, according to a June 2018 National Transportation Safety Board (NTSB) report investigating the wreck. It also found that his hands were on the steering wheel “for a total of 34 seconds, on three separate occasions, in the 60 seconds before impact.” Huang’s hands were not detected on the steering wheel in the six seconds before the incident.
Tesla has wrestled with similar incidents in the past. A man was killed in 2016 after his Model S slammed into a truck on the highway. An NTSB report later found he was using Autopilot but had ignored several audio warnings to place his hands on the steering wheel. Analysts said at the time that Autopilot poses some unique problems.
“The expectation of Tesla is that the driver is alert and vigilant, ready to take over at a moment’s notice,” Ryan Eustice, a professor of engineering at the University of Michigan, told reporters in 2016. Drivers become complacent and place too much trust in auto-driving features, so customers must weigh that convenience against safety, he added.