
Tesla’s ‘Autopilot’ involved in nearly three times as many crashes as previously reported, despite Elon Musk’s claims

Tesla’s driver assistance system, known as Autopilot, has been involved in far more accidents than previously reported

June 10, 2023 at 7:00 p.m. EDT

(Illustration by Emily Sabens/The Washington Post; KTVU-TV/AP; iStock)

SAN FRANCISCO — The school bus was displaying its stop sign and flashing red warning lights when 17-year-old Tillman Mitchell got off one afternoon in March, according to a police report. Then a Tesla Model Y approached on North Carolina Highway 561.

The car – reportedly in Autopilot mode – never slowed down.

It hit Mitchell at 45 miles per hour. According to his great-aunt Dorothy Lynch, the teenager was thrown into the windshield, flew into the air and landed face down in the road. Mitchell’s father heard the noise and ran down from his porch to find his son lying in the middle of the street.

“If it had been a smaller child,” Lynch said, “the child would have been dead.”

The crash in Halifax County, North Carolina, in which a futuristic technology hurtled down a country road with devastating consequences, was one of 736 crashes in the United States since 2019 involving Teslas in Autopilot mode, far more than previously reported, according to a Washington Post analysis of National Highway Traffic Safety Administration data. The number of such crashes has surged over the past four years, the data shows, reflecting the hazards associated with the spread of Tesla’s futuristic driver-assistance technology as well as the growing presence of these cars on the nation’s roads.

Autopilot-related deaths and serious injuries have also grown significantly, the data shows. When authorities released a partial tally of Autopilot-related crashes in June 2022, they counted only three fatalities definitively linked to the technology. The latest data includes at least 17 fatal incidents, 11 of them since last May, and five serious injuries.

Mitchell survived the March crash but suffered a broken neck and a broken leg and had to be placed on a ventilator. He still suffers from memory problems and has trouble walking. His great-aunt said the incident should serve as a warning about the dangers of the technology.

“I pray this is a learning curve,” Lynch said. “People are too trusting when it comes to a machine.”

Tesla CEO Elon Musk has said that cars operating in Tesla’s Autopilot mode are safer than those driven solely by humans, citing crash rates when the two modes of driving are compared. He has pushed the automaker to develop and deploy features programmed to maneuver the roads – navigating around stopped school buses, fire engines, stop signs and pedestrians – arguing that the technology will usher in a safer, virtually accident-free future. While it is impossible to say how many crashes may have been averted, the data shows clear flaws in the technology being tested in real time on America’s highways.

Tesla’s 17 fatal accidents show a clear pattern, The Post found: Four of them involved a motorcycle. Another accident involved an emergency vehicle. Meanwhile, some of Musk’s decisions — such as widening the feature’s availability and removing radar sensors from vehicles — appear to have contributed to the reported spike in incidents, according to experts speaking to The Post.

Tesla and Elon Musk did not respond to a request for comment.

NHTSA said a report of a crash involving driver assistance does not by itself mean the technology was the cause. “NHTSA has an active investigation into Tesla Autopilot, including Full Self-Driving,” spokeswoman Veronica Morales said, noting that the agency does not comment on ongoing investigations. “NHTSA reminds the public that all advanced driver assistance systems require the human driver to remain in control and fully engaged in the driving task at all times. Accordingly, all state laws hold the human driver responsible for the operation of their vehicles.”

Musk has repeatedly defended his decision to make driver-assistance technologies available to Tesla owners, arguing that the benefits outweigh the harm.

“At the point where you think adding autonomy will reduce injuries and fatalities, I think you have a moral obligation to use it, even if you’re going to be sued and blamed by a lot of people,” Musk said last year. “Because the people whose lives you saved don’t know their lives were saved. And the people who occasionally die or get hurt certainly know it – or their estate does.”

Former NHTSA chief safety adviser Missy Cummings, a professor in George Mason University’s College of Engineering and Computing, said the rise in Tesla accidents is worrying.

“Tesla has more serious — and deadlier — crashes than humans in a normal dataset,” she said in response to the figures analyzed by The Post. A likely cause, she said, is the expanded rollout over the past year and a half of “Full Self-Driving,” which brings driver assistance to city and residential streets. “The fact that … anyone and everyone can have it. … Is it reasonable to expect that might be leading to increased accident rates? Sure, absolutely.”

Cummings said the death toll compared to the total number of accidents is also a concern.

It is unclear whether the data captures every crash involving Tesla’s driver-assistance systems. NHTSA’s data includes some incidents in which it is “unknown” whether Autopilot or Full Self-Driving was in use. Those include three fatalities, one of them in the past year.

NHTSA, the nation’s top auto safety regulator, began collecting data after a federal regulation in 2021 required automakers to disclose accidents involving driver-assistance technology. The total number of accidents involving the technology is negligible compared to all traffic accidents; NHTSA estimates that more than 40,000 people died in accidents of all kinds last year.

The data shows that the vast majority of the 807 automation-related accidents since the introduction of reporting requirements have involved Tesla. Tesla — which has experimented more aggressively with automation than other automakers — is also linked to almost all of the deaths.

Subaru is second with 23 reported accidents since 2019. The huge gap likely reflects the broader adoption and deployment of automation across Tesla’s vehicle fleet, as well as the broader range of circumstances in which Tesla drivers are encouraged to use Autopilot.

Autopilot, introduced by Tesla in 2014, is a suite of features that allow the car to maneuver itself from highway on-ramp to off-ramp, maintaining speed and distance from other vehicles and following lane lines. Tesla offers it as a standard feature on its vehicles, more than 800,000 of which are equipped with Autopilot on U.S. roads, though more advanced iterations come at a price.

“Full Self-Driving,” an experimental feature that customers must purchase, allows Teslas to maneuver from point A to point B by following turn-by-turn directions along a route, stopping at stop signs and traffic lights, making turns and lane changes, and responding to hazards along the way. With either system, Tesla says, drivers must monitor the road and intervene when necessary.

The Post asked experts to analyze videos of Tesla’s beta software, and reporters Faiz Siddiqui and Reed Albergotti tested the car’s performance firsthand. (Video: Jonathan Baran/The Washington Post)

The spike in crashes coincides with Tesla’s aggressive rollout of “full self-driving,” which has grown from around 12,000 users to nearly 400,000 in just over a year. Nearly two-thirds of all driver-assistance accidents Tesla reported to NHTSA occurred in the past year.

Philip Koopman, a professor at Carnegie Mellon University who has been researching the safety of autonomous vehicles for 25 years, said the proliferation of Teslas in the data raises critical questions.

“A significantly higher number is certainly a cause for concern,” he said. “We need to understand if it’s actually due to worse crashes or if there’s another factor, such as a dramatically larger number of miles being driven with Autopilot on.”

In February, Tesla recalled more than 360,000 vehicles equipped with Full Self-Driving over concerns the software was causing its vehicles to ignore traffic lights, stop signs and speed limits.

Disregarding traffic rules, safety agency documents say, “could increase the risk of a collision if the driver doesn’t intervene.” Tesla said it fixed the problems with an over-the-air software update, thereby eliminating the risk remotely.

While Tesla constantly tweaked its driver-assistance software, the company also took the unprecedented step of removing radar sensors from new cars and disabling them in vehicles already on the road – depriving the cars of a critical sensor as Musk pushed for a simpler hardware set amid a global shortage of computer chips. “Only very high resolution radar is relevant,” Musk said last year.

Tesla has recently taken steps to reintroduce radar sensors, according to government documents first reported by Electrek.

In a March presentation, Tesla claimed that Full Self-Driving crashes at a rate at least five times lower than vehicles in normal driving, as measured in miles driven per collision. That claim, and Musk’s characterization of Autopilot as “clearly safer,” cannot be verified without access to the detailed data that Tesla possesses.

Autopilot, largely a freeway system, operates in a less complex environment than the situations experienced by a typical road user.

It is unclear which of the systems was used in the fatal accidents: Tesla has asked NHTSA not to disclose this information. In the section of the NHTSA data that reports the software version, Tesla’s incidents state — in capital letters — “redacted, may contain confidential business information.”

Both Autopilot and Full Self-Driving have come under scrutiny in recent years. Transportation Secretary Pete Buttigieg told the Associated Press last month that “Autopilot” is not an appropriate name “when the fine print says you have to keep your hands on the wheel and your eyes on the road at all times.”

Six years after Tesla advertised flawless self-driving, a car using the latest Full Self-Driving beta software couldn’t drive the route flawlessly. (Video: Jonathan Baran/The Washington Post)

NHTSA has opened multiple investigations into Tesla’s accidents and other issues with its driver-assistance software. One of these focused on ‘phantom braking’, a phenomenon in which vehicles brake abruptly in the face of imaginary dangers.

In a case last year detailed by The Intercept, a Tesla reportedly braked abruptly in moving traffic for no apparent reason.

In other complaints filed with NHTSA, owners say the cars slammed on the brakes in response to semi-trucks in oncoming lanes.

Many of the crashes share similar settings and conditions. NHTSA has received more than a dozen reports of Teslas in Autopilot mode crashing into parked emergency vehicles, for example. Last year, NHTSA upgraded its investigation of those incidents to an “engineering analysis.”

Also last year, NHTSA launched two back-to-back special investigations into fatal accidents involving Tesla vehicles and motorcyclists. One occurred, according to Utah authorities, when a motorcyclist was riding a Harley-Davidson just after 1 a.m. in a busy lane on Interstate 15 outside of Salt Lake City. A Tesla in autopilot mode rammed the bike from behind.

“The driver of the Tesla did not see the motorcyclist and collided with the rear of the motorcycle, throwing the driver off the motorcycle,” the Utah Department of Public Safety said. The motorcyclist died at the scene of the accident, the Utah authorities said.

“It’s very dangerous for motorcycles to be around Teslas,” Cummings said.

Of the hundreds of Tesla driver-assistance crashes, NHTSA has focused on about 40 incidents for further analysis, hoping to gain deeper insight into how the technology operates. Among them was the North Carolina crash involving Mitchell, the student struck while getting off the school bus.

Afterward, Mitchell awoke in the hospital with no memory of what happened. He still does not grasp the seriousness of the incident, his great-aunt said. His memory problems hamper his attempts to catch up in school. Local news outlet WRAL reported that the Tesla’s windshield was shattered by the force of the impact.

Tesla driver Howard G. Yee was charged with multiple offenses in the crash, including reckless driving, passing a stopped school bus and striking a person, a Class I felony, according to North Carolina State Highway Patrol Sgt. Marcus Bethea.

Authorities said Yee had attached weights to the steering wheel to trick Autopilot into registering the presence of a driver’s hands: Autopilot disables its functions if steering pressure is not applied after an extended period of time. Yee did not respond to a request for comment.

NHTSA is still investigating the crash and a spokeswoman for the agency declined to provide further details, citing the ongoing investigation. Tesla asked the agency not to make the company’s summary of the incident public, saying it “could contain confidential business information.”

Lynch said her family has been thinking of Yee and regards his actions as a mistake prompted by overreliance on the technology, what experts call “automation complacency.”

“We don’t want his life to be ruined by this stupid accident,” she said.

But when asked about Musk, Lynch had sharper words.

“I think they need to ban automated driving,” she said. “I think it should be banned.”