SAN FRANCISCO – Two months before Cruise’s driverless cars were pulled off the road here after hitting a pedestrian and dragging her about 20 feet, California regulators said they were confident in the self-driving technology and gave the company permission to operate its robotaxi service around the clock in the city.
The approval was a pivotal moment for the self-driving car industry, expanding one of the world’s largest test cases for the technology. But now, after a horrific crash on Oct. 2 that left a pedestrian critically injured — and Cruise’s initial misrepresentation of what actually happened that night — authorities here are reconsidering whether self-driving cars are ready for public roads, and experts say regulators in other states should do the same.
On Thursday, just two days after the California Department of Motor Vehicles suspended Cruise’s driverless permits, the company announced that it would suspend all driverless operations nationwide to review its processes and regain the public’s trust.
“It was only a matter of time before an incident like this occurred,” San Francisco City Attorney David Chiu said of the Oct. 2 crash. “And it was incredibly unfortunate that it happened, but it’s not a complete surprise.”
Immediately after the California Public Utilities Commission (CPUC) voted in August to allow General Motors’ Cruise and Google’s Waymo to charge for rides around the clock in San Francisco, Chiu filed a motion to halt the commercial expansion, arguing that self-driving cars posed serious “public safety implications.”
Here in California, the whiplash from approval to suspension in just two months highlights the fragmented oversight of the self-driving car industry — a system that allowed Cruise to keep operating on San Francisco streets for more than three weeks after the October collision, even though its vehicle had dragged the pedestrian trapped underneath.
California Assemblymember Phil Ting (D), whose district includes San Francisco, said the DMV did “the right thing” by suspending permits when it learned the full extent of the crash. As state lawmakers grapple with how to control this rapidly evolving industry, he said the DMV already has a rigorous approval process for autonomous vehicles. Cruise, for example, said it has received seven different permits from the DMV to operate in California in recent years.
In California alone, there are more than 40 companies — from young startups to tech giants — that have permission to test their self-driving cars in San Francisco, according to the DMV. The companies collectively report millions of miles on public roads and hundreds of mostly minor accidents each year, according to a Washington Post analysis of the data.
“It’s hard to be first, that’s the problem,” Ting said. “We do the best we can with what we know, and we know [autonomous vehicles] are part of our future. But how do we regulate it instead of suppressing it?”
A distorted version of events
Just as the light turned green at a chaotic intersection in downtown San Francisco that October night, a pedestrian stepped into the street. A human-driven car struck the woman, sending her onto its windshield for a few moments before she was thrown into the path of the driverless Cruise car.
The human-driven car fled the scene, while the Cruise vehicle stayed until officers arrived.
The morning after the collision, Cruise showed the Post and other media outlets footage taken by the self-driving vehicle. In the video shared via Zoom, the self-driving vehicle appeared to brake as soon as it collided with the woman. Then the video ended.
When asked by The Post what happened next, Cruise spokeswoman Hannah Lindow said the company had no further footage to share and that the autonomous vehicle “braked aggressively to minimize the impact.” The DMV said its representatives were initially shown a similar video.
But that original video only captured part of the story.
Aaron Peskin, president of the San Francisco Board of Supervisors, said first responders at the scene noticed a trail of blood from where the vehicle hit the woman to where it eventually stopped, about 20 feet away.
The DMV said it met with Cruise the day after the crash but did not receive additional footage until 10 days later, after “another government agency” notified the DMV that it existed. While the Cruise vehicle initially braked, the longer video showed the car then pulling back toward the side of the road.
According to the DMV, the Cruise vehicle dragged the woman, trapped underneath, about 20 feet, aggravating her injuries.
Cruise disputes the DMV’s account, saying, “Shortly after the incident, our team proactively shared information with state and federal investigators.”
“We have been in close contact with regulators to answer their questions and assisted police in identifying the hit-and-run driver’s vehicle,” Lindow said in a statement. “Our teams are currently conducting an analysis to identify possible improvements to the [autonomous vehicle’s] response to this kind of extremely rare event.”
In its decision to suspend Cruise’s driverless permits on Tuesday, the DMV said Cruise vehicles are “not safe for the public’s operation” and noted that the company misrepresented “information related to safety of the autonomous technology.”
Meanwhile, the National Highway Traffic Safety Administration also opened an investigation into Cruise this month over reports that its vehicles “may not have exercised appropriate caution around pedestrians in the roadway.”
Ed Walters, who teaches autonomous vehicle law at Georgetown University, said driverless technology is critical to a future with fewer traffic deaths because robots do not drive drunk or distracted. But this accident shows, he said, that Cruise wasn’t “quite ready for testing” in such a dense urban area.
“In hindsight, you have to say it was too early to deploy these cars in this environment,” he said. “This is a cautionary tale that we should proceed gradually. That we should take this step by step and do as much testing as possible with people in the cars to see when they are safe, and whether they are safe.”
The DMV’s autonomous vehicle program requires companies to publicly report collisions involving self-driving cars only when the vehicles are in test mode. That means when an incident like the Oct. 2 crash occurs while a company is technically operating as a commercial service, the company is not required to report it publicly in an autonomous vehicle collision report.
As of mid-October, the DMV said it had received 666 such reports. The Oct. 2 crash is not one of them.
“In commercial use, filing accident reports with the state is essentially voluntary,” Julia Friedlander, senior manager of automated driving at the San Francisco Municipal Transportation Agency, told city officials at a recent meeting. “It is possible that some companies will make the decision to file reports at times and not necessarily file reports at other times.”
Cruise said it complied with “all required reports from our regulators” and that the company was in “regular discussions with regulators on a range of reportable and non-reportable incidents.” Lindow, the spokeswoman, said the company reported the Oct. 2 accident to the DMV under reporting requirements that are not publicly available.
This is just one example of how difficult it is to get an accurate picture of the performance of self-driving cars.
There are few clear federal regulations setting rules for how autonomous vehicles must operate and what standards they must meet before being tested on public roads. At the federal level, the National Highway Traffic Safety Administration mostly collects companies’ self-reported crash data. In California, the DMV issues permits for testing and deployment, while the CPUC regulates commercial passenger service programs.
In San Francisco, city officials have no authority over whether or how the cars operate on their streets.
This lack of control has unsettled city officials, especially as self-driving cars from Cruise and Waymo have become ubiquitous in San Francisco. The cars have caused widespread concern throughout the city, disrupting first responders on numerous occasions — from rolling into scenes cordoned off with caution tape to colliding with a fire truck on its way to an emergency. City leaders tried to stop the expansion by drawing attention to these incidents, but were ultimately unsuccessful.
In an interview with The Washington Post last month, Cruise CEO Kyle Vogt said criticism of self-driving cars and incidents involving his company were overblown.
“Everything we do differently than humans is sensationalized,” he said at the time.
Who is liable if there is no driver?
While it was a human driver who hit the pedestrian and a Cruise vehicle that dragged her 20 feet, Board President Peskin said the CPUC, which granted the company expanded permits despite a flood of reported problems with the technology, also shares responsibility for the crash.
“Yeah, I blame Cruise,” he said. “But there should be a check and balance system – and that check and balance system has completely failed, and failed spectacularly.”
Terrie Prosper, a spokeswoman for the CPUC, declined to make any of the commissioners available for an interview on the issue, saying, “This matter is currently under discussion.”
Looking forward, Chiu said officials are still pursuing their motion challenging Waymo’s permits to operate its robotaxi service in the city.
Although Waymo hasn’t recently caused as many high-profile incidents as Cruise, he said it is important for the state to “go back to the drawing board” until regulators can set clearer standards for the technology.
“The fact that we have multiple state agencies that seem to be working in different directions is a challenge,” he said. “Who is ultimately responsible for safety on our roads?”