A Tesla involved in a fatal crash on a Southern California highway last week was operating on Autopilot at the time, authorities said.
The May 5 crash in Fontana, a city 50 miles east of Los Angeles, is under investigation by the National Highway Traffic Safety Administration (NHTSA). It is the 29th Tesla crash to which the agency has responded.
A 35-year-old man was killed when his Tesla Model 3 struck an overturned semi-truck on a freeway at about 2:30 a.m. local time (9:30 a.m. GMT). The driver’s name has not yet been made public. Another man was seriously injured when the electric vehicle struck him as he was helping the semi’s driver out of the wreckage.
The California Highway Patrol, or CHP, announced Thursday that the car was operating on Tesla’s partially automated driving system, called Autopilot, which has been involved in several crashes. The Fontana crash marks at least the fourth U.S. fatality involving Autopilot.
“Although the CHP does not normally comment on ongoing investigations, the department recognizes the high level of interest focused on accidents involving Tesla vehicles,” the agency said in a statement. “We felt that this information provides an opportunity to remind the public that driving is a complex task that requires the full attention of the driver.”
The federal safety investigation comes just after the CHP arrested another man who authorities said was riding in the back seat of a Tesla as it drove on Interstate 80 near Oakland this week with no one at the wheel.
The CHP did not say whether officials had determined whether the Tesla in the I-80 incident was operating on Autopilot, which can keep a car centered in its lane and a safe distance behind vehicles ahead of it.
But it is likely that either Autopilot or “Full Self-Driving” was in operation for the driver to be riding in the back seat. Tesla allows a limited number of owners to test its self-driving software.
Tesla, which has disbanded its public relations department, did not respond to an email seeking comment Friday. The company states in owner’s manuals and on its website that Autopilot and “Full Self-Driving” are not fully autonomous and that drivers must pay attention and be ready to intervene at all times.
Autopilot has at times had trouble dealing with stationary objects and crossing traffic in front of Teslas.
In two Florida crashes, in 2016 and 2019, cars with Autopilot in use drove beneath crossing semi-trailers, killing the men behind the wheel of the Teslas. In a 2018 crash in Mountain View, California, an Apple engineer driving on Autopilot was killed when his Tesla struck a freeway barrier.
Tesla’s system, which uses cameras, radar and short-range sonar, also has trouble handling stopped emergency vehicles. Teslas have struck several fire trucks and police vehicles that were stopped on freeways with their flashing emergency lights on.
In March, for example, the NHTSA sent a team to investigate after a Tesla on Autopilot struck a Michigan State Police vehicle on Interstate 96 near Lansing. Neither the trooper nor the 22-year-old Tesla driver was injured, police said.
After fatal crashes in Florida and California, the National Transportation Safety Board (NTSB) recommended that Tesla develop a stronger system to ensure drivers are paying attention, and that it limit Autopilot use to highways where it can work effectively. Neither Tesla nor the safety agency acted on the recommendations.
In a February 1 letter to the U.S. Department of Transportation, NTSB Chairman Robert Sumwalt urged the department to enact regulations governing driver-assistance systems such as Autopilot, as well as the testing of autonomous vehicles. The NHTSA has relied mainly on voluntary vehicle guidelines, taking a hands-off approach so as not to hamper the development of new safety technologies.
Sumwalt said Tesla is using people who bought the cars to test its “Full Self-Driving” software on public roads, with limited oversight or reporting requirements.
“Because NHTSA has no requirements in place, manufacturers can operate and test vehicles virtually anywhere, even if the location exceeds the AV [autonomous vehicle] control system’s limits,” Sumwalt wrote.
He added: “Although Tesla includes a disclaimer that ‘currently enabled features require active driver supervision and do not make the vehicle autonomous,’ NHTSA’s hands-off approach to oversight of AV testing poses a potential risk to motorists and other road users.”
The NHTSA, which has the authority to regulate automated driving systems and to seek recalls if necessary, appears to have taken a renewed interest in the systems since U.S. President Joe Biden took office.