Thursday, April 18, 2024 | Courthouse News Service

Tesla Was on Autopilot in Fatal Crash

DETROIT (AP) — A Tesla Model 3 involved in a fatal crash with a semitrailer in Florida on March 1 was operating on the company's semi-autonomous Autopilot system, federal investigators have determined.

Tesla Motors unveils the lower-priced Model 3 sedan at the Tesla Motors design studio in Hawthorne, Calif., on March 31, 2016. Tesla, which considers producing a cheaper electric car imperative to its survival, told employees on Jan. 18, 2019, that it must cut 7 percent of its workforce. Tesla's cheapest model right now is the $44,000 Model 3, and it needs to broaden its customer base to survive. (AP Photo/Justin Pritchard)

The car drove beneath the trailer, killing the driver, in a crash strikingly similar to one on the other side of Florida in 2016 that also involved Autopilot.

In both cases, neither the driver nor the Autopilot system stopped for the trailers, and the roofs of the cars were sheared off.

The Delray Beach crash on March 1, which remains under investigation by the National Transportation Safety Board and the National Highway Traffic Safety Administration, raises questions about the effectiveness of Autopilot, which uses cameras, long-range radar and computers to detect objects in front of the cars to avoid collisions. The system also can keep a car in its lane, change lanes and navigate freeway interchanges.

Tesla has maintained that the system is designed only to assist drivers, who must pay attention at all times and be ready to intervene.

In a preliminary report on the March 1 crash, the NTSB said that initial data and video from the Tesla show that the driver turned on Autopilot about 10 seconds before the crash on a divided highway with turn lanes in the median. From less than eight seconds before the crash until the moment of impact, the driver's hands were not detected on the steering wheel, the NTSB report stated.

Neither the data nor the videos indicated the driver or the Autopilot system braked or tried to avoid the trailer, the report stated.

The Model 3 was going 68 mph when it hit the trailer on U.S. 441, where the speed limit was 55 mph, the report said. Jeremy Beren Banner, 50, was killed.

Tesla said in a statement Thursday that Banner did not use Autopilot at any other time during the drive before the crash. Vehicle logs show that he took his hands off the steering wheel immediately after activating Autopilot, the statement said.

Tesla also said it's saddened by the crash and that drivers have traveled more than 1 billion miles while using Autopilot.

"When used properly by an attentive driver who is prepared to take control at all times, drivers supported by Autopilot are safer than those operating without assistance," the company said.

The circumstances of the Delray Beach crash are much like one that occurred in May 2016 near Gainesville. Joshua Brown, 40, of Canton, Ohio, was traveling in a Tesla Model S on a divided highway and using the Autopilot system when he was killed.

Neither Brown nor the car braked for a tractor-trailer, which had turned left in front of the Tesla and was crossing its path. Brown's Tesla also went beneath the trailer and its roof was sheared off. After that crash Tesla CEO Elon Musk said the company made changes in its system so radar would play more of a role in detecting objects.

David Friedman, who was acting head of NHTSA in 2014 and is now vice president of advocacy for Consumer Reports, said he was surprised the agency didn't declare Autopilot defective after the Gainesville crash and seek a recall. The Delray Beach crash, he said, reinforces that Autopilot is being allowed to operate in situations it cannot handle safely.

"Their system cannot literally see the broad side of an 18-wheeler on the highway," Friedman said.

Tesla's system was too slow to warn the driver to pay attention, unlike systems that Consumer Reports has tested from General Motors and other companies, Friedman said. GM's Super Cruise driver assist system operates only on divided highways with no median turn lanes, he said.

Tesla needs a better system to more quickly detect whether drivers are paying attention and warn them if they are not, Friedman said, adding that some owners tend to rely on the system too much.

"Tesla has for too long been using human drivers as guinea pigs. This is tragically what happens," he said.

To force a recall, the NHTSA must conduct an investigation and show that a vehicle's design falls outside industry standards. "There are multiple systems out on the roads right now that take over some level of steering and speed control, but there's only one of them that we keep hearing about where people are dying or getting into crashes. That kind of stands out," Friedman said.

The NHTSA said Thursday that its investigation is continuing and its findings will be made public when completed.

The Delray Beach crash casts doubt on Musk's statement that Tesla will have fully self-driving vehicles on the roads sometime next year. Musk said last month that Tesla had developed a powerful computer that could use artificial intelligence to navigate the roads safely with the same camera and radar sensors that are now on Tesla cars.

Friedman doubts it. "Show me the data," he said. "Tesla is long on big claims and short on proof. They're literally showing how not to do it by rushing technology out."

In a 2017 report on the Gainesville crash, the NTSB wrote that design limitations of Autopilot played a major role. The agency said that Tesla told Model S owners that Autopilot should be used only on limited-access highways, primarily interstates. The report said that despite upgrades to the system, Tesla did not incorporate protections against use of the system on other types of roads.

The NTSB found that the Model S cameras and radar were not capable of detecting a vehicle turning into its path. Rather, the systems are designed to detect vehicles they are following to prevent rear-end collisions.
