
The final eleven seconds of a deadly Tesla Autopilot crash


The sun had yet to rise in Delray Beach, Fla., when Jeremy Banner flicked on Autopilot. His red Tesla Model 3 sped down the highway at nearly 70 mph, his hands no longer detected on the wheel.

Seconds later, the Tesla plowed into a semi-truck, shearing off the car’s roof as it slid under the truck’s trailer. Banner was killed on impact.

Banner’s family sued after the gruesome 2019 collision, one of at least 10 active lawsuits involving Tesla’s Autopilot, several of which are expected to go to court over the next year. Together, the cases could determine whether the driver is solely responsible when things go wrong in a vehicle guided by Autopilot — or whether the software should also bear some of the blame.

The outcome could prove critical for Tesla, which has pushed increasingly capable driver-assistance technology onto the nation’s roadways far more rapidly than any other major carmaker. If Tesla prevails, the company could continue deploying the evolving technology with few legal consequences or regulatory guardrails. Multiple verdicts against the company, however, could threaten both Tesla’s reputation and its financial viability.

Jeremy Banner. (Family photo)

According to an investigation by the National Transportation Safety Board (NTSB), Banner, a 50-year-old father of four, should have been watching the road on that March morning. He agreed to Tesla’s terms and conditions of operating on Autopilot and was provided with an owner’s manual, which together warn of the technology’s limitations and state that the driver is ultimately responsible for the trajectory of the car.

But lawyers for Banner’s family say Tesla should shoulder some responsibility for the crash. Along with former transportation officials and other experts, they say the company’s marketing of Autopilot exaggerates its capabilities, creating complacency that can lead to deadly crashes. That argument is echoed in several Autopilot-related cases, where plaintiffs say they believed Tesla’s claims that Autopilot was “safer than a human-operated vehicle.”

A Washington Post analysis of federal data found that vehicles guided by Autopilot have been involved in more than 700 crashes, at least 19 of them fatal, since its introduction in 2014, including the Banner crash. In Banner’s case, his family’s lawyers argue, the technology failed repeatedly: it neither braked nor issued a warning about the semi-truck in the car’s path.

To reconstruct the crash, The Post relied on hundreds of court documents, dash cam photos and a video of the crash taken from a nearby farm, as well as satellite imagery, NTSB crash assessment documents and diagrams, and Tesla’s internal data log, which the NTSB included in its investigation report. The Post’s reconstruction found that braking just 1.6 seconds before impact could have avoided the collision.

After impact, the Tesla continued on for another 40 seconds, traveling about 1,680 feet — nearly a third of a mile — before finally coasting to a stop on a grassy median.

A surveillance video located on the farm where the truck driver had just made a routine delivery shows the crash in real time. This video, which was obtained exclusively by The Post, along with court documents, crash reports and witness statements, offers a rare look at the moments leading up to an Autopilot crash. Tesla typically does not provide access to its cars’ crash data and often prevents regulators from revealing crash information to the public.

CCTV captures the moment a Tesla crashes into a truck. (Security footage from Pero Family Farms obtained by The Post)

Braking even 1.6 seconds before the crash could have avoided the collision, The Post’s reconstruction found, based on braking-distance measurements of a 2019 Tesla Model 3 with similar specifications conducted by vehicle testers at Car and Driver. At that point, the truck was well within view, spanning both lanes of southbound traffic.

(Graphic: Tesla braking distance map)
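As a rough illustration of the arithmetic behind that finding, the sketch below compares the distance a car covers in 1.6 seconds at roughly 69 mph with an estimated 70-to-0 mph stopping distance. The speed and stopping-distance figures are assumptions chosen to be in the general range of the article’s numbers and of published Model 3 braking tests; they are not values from The Post’s reconstruction or from Car and Driver’s measurements.

```python
# Back-of-envelope check: could full braking begun 1.6 seconds before the
# projected impact point stop the car short of it?
# All figures below are assumptions for illustration, not The Post's data.

SPEED_MPH = 69.0              # "nearly 70 mph"
BRAKING_WINDOW_S = 1.6        # seconds before projected impact when braking begins
STOP_DIST_70_TO_0_FT = 160.0  # assumed 70-0 mph stopping distance for a Model 3-class sedan

FT_PER_S_PER_MPH = 5280 / 3600  # 1 mph is about 1.467 ft/s

speed_fps = SPEED_MPH * FT_PER_S_PER_MPH

# Distance to the impact point when braking begins, if speed stayed constant:
distance_to_impact_ft = speed_fps * BRAKING_WINDOW_S

# Scale the assumed 70-0 stopping distance to the assumed speed (distance ~ v^2):
stopping_distance_ft = STOP_DIST_70_TO_0_FT * (SPEED_MPH / 70.0) ** 2

print(f"Distance to impact point at braking onset: {distance_to_impact_ft:.0f} ft")
print(f"Estimated full stopping distance:          {stopping_distance_ft:.0f} ft")
print("Car stops short of impact point:", stopping_distance_ft < distance_to_impact_ft)
```

Under these placeholder figures, a full-effort stop begun 1.6 seconds out ends just short of the impact point; The Post’s actual reconstruction relied on Car and Driver’s measured braking distances rather than these assumed values.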

Due to the uncertainty of Banner’s movements in the car, The Post did not depict him in the reconstruction. The NTSB investigation determined that Banner’s inattention and the truck driver’s failure to fully yield to oncoming traffic were probable causes of the crash.

However, the NTSB also cited Banner’s “overreliance on automation,” saying Tesla’s design “permitted disengagement by the driver” and contributed to the crash. Four years later, despite pleas from safety investigators, regulators in Washington have outlined no clear plan to address those shortcomings, allowing the Autopilot experiment to continue to play out on American roads, with little federal intervention.

While the Federal Motor Vehicle Safety Standards administered by the National Highway Traffic Safety Administration (NHTSA) spell out everything from how a car’s brakes should operate to where its lights should be located, they offer little guidance about vehicle software.

‘Fancy cruise control’


Teslas guided by Autopilot have slammed on the brakes at high speeds without clear cause, accelerated or lurched from the road without warning and crashed into parked emergency vehicles displaying flashing lights, according to investigation and police reports obtained by The Post.

In February, a Tesla on Autopilot smashed into a firetruck in Walnut Creek, Calif., killing the driver. The Tesla driver was under the influence of alcohol during the crash, according to the police report.

In July, a Tesla rammed into a Subaru Impreza in South Lake Tahoe, Calif. “It was, like, head on,” according to a 911 call from the incident obtained by The Post. “Someone is definitely hurt.” The Subaru driver later died of his injuries, as did a baby in the back seat of the Tesla, according to the California Highway Patrol.

Tesla did not respond to multiple requests for comment. In its response to the Banner family’s complaint, Tesla said, “The record does not reveal anything that went awry with Mr. Banner’s vehicle, except that it, like all other automotive vehicles, was susceptible to crashing into another vehicle when that other vehicle suddenly drives directly across its path.”

Autopilot includes features to automatically control the car’s speed, following distance, steering and some other driving actions, such as taking exits off a freeway. But a user manual for the 2018 Tesla Model 3 reviewed by The Post is peppered with warnings about the software’s limitations, urging drivers to always pay attention, with hands on the wheel and eyes on the road. Before turning on Autosteer — an Autopilot feature — for the first time, drivers must click to agree to the terms.

In particular, Tesla noted in court documents for the Banner case that Autopilot was not designed to reliably detect cross-traffic, or traffic moving perpendicular to a vehicle, arguing that its user terms offer adequate warning of its limitations.

In a Riverside, Calif., courtroom last month, in a lawsuit over another fatal crash in which Autopilot was allegedly engaged, a Tesla attorney held a mock steering wheel before the jury and emphasized that the driver must always be in control.

Autopilot “is basically just fancy cruise control,” he said.

Tesla CEO Elon Musk has painted a different reality, arguing that his technology is making the roads safer: “It’s probably better than a person right now,” Musk said of Autopilot during a 2016 conference call with reporters.

Musk made a similar assertion about a more sophisticated form of Autopilot called Full Self-Driving on an earnings call in July. “Now, I know I’m the boy who cried FSD,” he said. “But man, I think we’ll be better than human by the end of this year.”

The NTSB said it has repeatedly issued recommendations aiming to prevent crashes associated with systems such as Autopilot. “NTSB’s investigations support the need for federal oversight of system safeguards, foreseeable misuse, and driver monitoring associated with partial automated driving systems,” NTSB spokesperson Sarah Sulick said in a statement.

NHTSA said it has an “active investigation” of Autopilot. “NHTSA generally does not comment on matters related to open investigations,” NHTSA spokeswoman Veronica Morales said in a statement. In 2021, the agency adopted a rule requiring carmakers such as Tesla to report crashes involving their driver-assistance systems.

Beyond the data collection, though, there are few clear legal limitations on how this type of advanced driver-assistance technology should operate and what capabilities it should have.

“Tesla has decided to take these much greater risks with the technology because they have this sense that it’s like, ‘Well, you can figure it out. You can determine for yourself what’s safe’ — without recognizing that other road users don’t have that same choice,” former NHTSA administrator Steven Cliff said in an interview.

“If you’re a pedestrian, [if] you’re another vehicle on the road,” he added, “do you know that you’re unwittingly an object of an experiment that’s happening?”

‘The car is driving itself’


Banner researched Tesla for years before buying a Model 3 in 2018, his wife, Kim, told federal investigators. Around the time of his purchase, Tesla’s website featured a video showing a Tesla navigating the curvy roads and intersections of California while a driver sits in the front seat, hands hovering beneath the wheel.

The video, recorded in 2016, is still on the site today.

“The person in the driver’s seat is only there for legal reasons,” the video says. “He is not doing anything. The car is driving itself.”

In a different case involving another fatal Autopilot crash, a Tesla engineer testified that a team specifically mapped the route the car would take in the video. At one point during testing for the video, a test car crashed into a fence, according to Reuters. The engineer said in a deposition that the video was meant to show what the technology could eventually be capable of — not what cars on the road could do at the time.

While the video concerned Full Self-Driving, which operates on surface streets, the plaintiffs in the Banner case argue Tesla’s “marketing does not always distinguish between these systems.”

Not only is the marketing misleading, plaintiffs in several cases argue, but the company also gives drivers a long leash in deciding when and how to use the technology. Though Autopilot is supposed to be enabled only in limited situations, it sometimes works on roads it’s not designed for. It also allows drivers to go short periods without touching the wheel and to set cruising speeds well above posted speed limits.

For example, Autopilot was not designed to operate on roads with cross-traffic, Tesla lawyers say in court documents for the Banner case. The system struggles to identify obstacles in its path, especially at high speeds. The stretch of U.S. 441 where Banner crashed was “clearly outside” the environment Autopilot was designed for, the NTSB said in its report. Still, Banner was able to activate it.

Identifying semi-trucks is a particular deficiency that engineers have struggled to solve since Banner’s death, according to a former Autopilot employee who spoke on the condition of anonymity for fear of retribution.

Tesla tasked image “labelers” with repeatedly identifying images of semi-trucks perpendicular to Teslas to better train its software “because even in 2021 that was a heavy problem they were trying to solve,” the former employee said.

Because of the orientation of Tesla’s cameras, the person said, it was sometimes hard to discern the location of the tractor-trailers. In one view, the truck could appear to be floating 20 feet above the road, like an overpass. In another view, it could appear 25 feet below the ground.

Tesla complicated the matter in 2021 when it eliminated radar sensors from its cars, The Post previously reported, making vehicles such as semi-trucks appear two-dimensional and harder to parse.

In 2021, the chair of the NTSB publicly criticized Tesla for allowing drivers to turn on Autopilot in inappropriate locations and conditions — citing Banner’s crash and a similar wreck that killed another man, Joshua Brown, in 2016.

A third similar crash occurred this past July, killing a 57-year-old bakery owner in Fauquier County, Va., after his Tesla collided with a semi-truck.

Philip Koopman, an associate professor at Carnegie Mellon who has studied self-driving-car safety for more than 25 years, said the onus is on the driver to understand the limitations of the technology. But, he said, drivers can get lulled into thinking the technology works better than it does.

“If a system turns on, then at least some users will conclude it must be intended to work there,” Koopman said. “Because they think if it wasn’t intended to work there, it wouldn’t turn on.”

Andrew Maynard, a professor of advanced technology transitions at Arizona State University, said customers probably just trust the technology.

“Most people just don’t have the time or ability to fully understand the intricacies of it, so at the end they trust the company to protect them,” he said.

The truck’s trailer was damaged in the collision with Banner’s Tesla. (NTSB)

It is impossible to know what Banner was doing in the final seconds of his life, after his hands were no longer detected on the wheel. Tesla has argued in court documents that if he had been paying attention to the road, it is “undisputed” that “he could have avoided the crash.”

The case, originally set for trial this week in Palm Beach County Circuit Court, has been delayed while the court considers the family’s request to seek punitive damages against Tesla.

A small jolt


Whatever the verdict, the crash that March morning had a shattering effect on the truck driver crossing U.S. 441. The 45-year-old driver — whom The Post is not naming because he was not charged — felt a small jolt against the back of his truck as Banner’s Tesla made impact. He pulled over and hopped out to see what had happened.

According to a transcript of his interview with the NTSB, it was still dark and difficult to see when the crash occurred. But the driver noticed pink-stained glass stuck on the side of his trailer.

“Are you the guy that drives this tractor?” he recalled a man in a pickup hollering.

“Yeah,” the driver said he responded.

“That dude didn’t make it,” the man told him.

The truck driver started to shake.

He said he should have been more careful at the stop sign that morning, according to an interview with federal investigators. Banner’s family also sued the driver, but they settled, according to the Banner family’s lawyer.

The truck driver told investigators that self-driving vehicles have always made him uneasy and that he doesn’t think they should be allowed on the road. He became emotional recounting the crash.

“I’ve done it a dozen times,” the driver said of his fateful left turn. “And I clearly thought I had plenty of time. I mean, it was dark, and the cars looked like they was back further than what they was.”

“Yeah,” the investigator said.

“And, I mean, it’s just something I’m —,” the driver said.

“It’s okay, it’s okay,” the investigator responded.

“Yeah, take your time,” another investigator said.

“Just,” the driver said, pausing again. “It’s something I’m going to have to live with.”

The frame rails of the truck were damaged when the Tesla hurtled underneath it. (NTSB)
Methodology

To reconstruct Banner’s crash, The Post relied on hundreds of court documents, dash cam photos and a video of the crash taken from a nearby farm, as well as satellite imagery, NTSB assessment documents and diagrams, and Tesla’s internal data log. Speeds included in the Tesla’s data log were used by The Post to plot and animate the movement of the Tesla vehicle within a 3D model of the highway produced from OpenStreetMap data and satellite imagery. The Post used other visual material, such as diagrams, dash cam stills and a surveillance video of the crash, to further clarify the changing positions of the Tesla and plot the movement of the truck. The Tesla’s data log also included information on when certain system and Autopilot features were activated or not activated, which The Post time-coded and added into the animation to present the sequence of system events before the crash.
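As a hypothetical sketch of the kind of computation this describes, the code below integrates time-stamped speed samples into cumulative distance traveled, which could then be mapped onto a road centerline to place a vehicle in a 3D scene. The function name and the sample data are illustrative only; they are not The Post’s tooling or the Tesla’s actual log values.

```python
# Hypothetical sketch: turn time-stamped speed samples from a vehicle data log
# into cumulative distance traveled, for positioning a vehicle along a mapped path.
# The sample values below are placeholders, not the Tesla's real data log.

MPH_TO_FPS = 5280 / 3600  # convert miles per hour to feet per second

# (time in seconds, speed in mph)
speed_log = [(0.0, 68.0), (1.0, 68.5), (2.0, 69.0), (3.0, 69.0)]

def distance_along_path(samples):
    """Integrate speed over time (trapezoidal rule) into distance in feet."""
    track = [(samples[0][0], 0.0)]
    dist_ft = 0.0
    for (t0, v0), (t1, v1) in zip(samples, samples[1:]):
        dt = t1 - t0
        dist_ft += 0.5 * (v0 + v1) * MPH_TO_FPS * dt
        track.append((t1, dist_ft))
    return track

for t, d in distance_along_path(speed_log):
    print(f"t = {t:4.1f} s   distance = {d:7.1f} ft")
```

Distances computed this way could be projected onto a road centerline, such as one derived from OpenStreetMap data, and combined with the time-coded system events to drive an animation frame by frame.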

The Tesla interface featured in the animation is based upon the default display in a Tesla Model 3.

About this story

Additional research by Alice Crites and Monika Mathur. Editing by Christina Passariello, Karly Domb Sadof, Laura Stevens, Nadine Ajaka and Lori Montgomery. Copy-editing by Carey L. Biron.
