From Tom Krisher, The Associated Press
A Tesla Model 3 involved in a crash with a semitrailer March 1 in Florida was operating on the company's semi-autonomous Autopilot system, investigators have determined.
The car drove beneath the trailer, killing the driver, in an accident strikingly similar to one that occurred on the other side of Florida in 2016 that also involved use of Autopilot.
In both cases, neither the driver nor the Autopilot system braked for the trailers, and the cars' roofs were sheared off.
The Delray Beach crash, which remains under investigation by the National Transportation Safety Board and the National Highway Traffic Safety Administration, raises concerns about the effectiveness of Autopilot, which uses cameras, long-range radar and computers to detect objects in front of the cars to avoid collisions. The system can keep a car in its lane, change lanes and navigate freeway interchanges.
Tesla has maintained that the system is designed to assist drivers, who must pay attention and be prepared to intervene.
In a report on the March 1 crash, the NTSB said that preliminary data and video from the Tesla show the driver turned on Autopilot about 10 seconds before the crash on a highway with turn lanes in the median. From less than eight seconds before the crash until the time of impact, the driver's hands weren't detected on the steering wheel, the NTSB report stated.
Neither the data nor the videos indicated that the driver or the Autopilot system braked or tried to avoid the trailer, the report stated.
The Model 3 was traveling about 68 miles per hour when it struck the trailer on U.S. 441, where the speed limit was 55 mph, the report said. Jeremy Beren Banner, 50, was killed.
Tesla said in a statement Thursday that Banner had not used Autopilot at any other time during the drive before the crash. Vehicle logs show that he took his hands off the steering wheel immediately after activating Autopilot, the statement said.
Tesla also said it is saddened by the crash and that drivers have traveled more than 1 billion miles while using Autopilot.
“When used by a driver who is prepared to take control at all times, drivers supported by Autopilot are safer than those operating without assistance,” the company said.
The circumstances of the Delray Beach crash are much like those of one that happened in May 2016 near Gainesville, Florida. Joshua Brown, 40, of Canton, Ohio, was traveling in a Tesla Model S on a highway and using the Autopilot system when he was killed.
Neither Brown nor the car braked for a tractor-trailer, which had turned in front of the Tesla and was crossing its path. Brown’s Tesla also went under the trailer and its roof was sheared off. After that crash Tesla CEO Elon Musk said the company made changes in detecting objects so radar would play more of a role.
David Friedman, who was acting head of NHTSA in 2014 and is now vice president of advocacy for Consumer Reports, said he was astounded that the agency didn't declare Autopilot defective after the Gainesville crash and seek a recall. The Delray Beach crash reinforces that Autopilot is being allowed to operate in situations it can't handle, he said.
“Their system cannot literally see the broad side of an 18-wheeler on the highway,” Friedman said.
Unlike systems from General Motors and other companies that Consumer Reports has tested, Tesla's system was slow to warn the driver, Friedman said. GM's Super Cruise driver-assist system works only on divided highways with no median turn lanes, he said.
Tesla needs a better system to quickly detect whether drivers are paying attention and to warn them if they're not, Friedman said, adding that some owners tend to over-rely on the system.
“Tesla for too long has been using human drivers as guinea pigs. This is tragically what happens,” he said.
To force a recall, NHTSA must conduct an investigation and show that the way a vehicle is designed is outside of industry standards. “There are systems out on the roads right now that take over some degree of steering and speed control, but there’s only one of them that we keep hearing about where people are dying or getting into crashes. That kind of stands out,” Friedman said.
NHTSA said Thursday that its investigation is ongoing and that its findings will be made public once it is complete.
The Delray Beach crash casts doubt on Musk's statement that Tesla will soon have fully self-driving vehicles on the roads. Musk said last month that Tesla had developed a computer that could use artificial intelligence to safely navigate the roads with the same camera and radar sensors that are on current Tesla cars.
“Show me the data,” Friedman said. “Tesla is long on claims and short on proof. They’re showing how not to do it by rushing out technology.”
In a 2017 report on the Gainesville crash, the NTSB wrote that design limits of Autopilot played a role. The agency said Tesla advised Model S owners that Autopilot should be used only on limited-access highways such as interstates, but the report said Tesla did not incorporate safeguards against its use on other types of roads.
The NTSB found that the Model S radar and cameras weren't capable of detecting a vehicle crossing the car's path. Instead, the systems are designed to detect vehicles they're following to prevent rear-end collisions.
This story has been corrected to show that the Tesla involved in the Delray Beach crash was a Model 3, not a Model S.