Cruise’s crash highlights fragmented regulation for self-driving cars

SAN FRANCISCO — Two months before Cruise’s driverless cars were yanked off the streets here for rolling over a pedestrian and dragging her about 20 feet, California regulators said they were confident in self-driving technology and gave the company permission to operate its robotaxi service around the city.

That approval was a pivotal moment for the self-driving car industry, as it expanded one of the biggest test cases in the world for the technology. But now, following a horrendous Oct. 2 crash that critically injured a jaywalking pedestrian, and Cruise’s initial misrepresentation of what actually happened that night, officials here are rethinking whether self-driving cars are ready for the road, and experts are encouraging other states to do the same.

On Thursday, just two days after the California Department of Motor Vehicles suspended Cruise’s driverless permits, the company said it would suspend all driverless operations around the country to examine its processes and earn back public trust.

“It was just a matter of time before an incident like this occurred,” San Francisco City Attorney David Chiu said of the Oct. 2 crash. “And it was incredibly unfortunate that it happened, but it is not a complete surprise.”


Immediately after California’s Public Utilities Commission (CPUC) voted in August to allow General Motors’ Cruise and Alphabet’s Waymo to charge for rides 24/7 around San Francisco, Chiu filed a motion to halt the commercial expansion, arguing the driverless cars had serious “public safety ramifications.”

Here in California, the whiplash from approval to ban in just two months highlights the fragmented oversight governing the self-driving car industry, a system that allowed Cruise to keep operating on San Francisco’s roads for more than three weeks after one of its vehicles dragged a pedestrian pinned underneath it in the October collision.

California Assembly member Phil Ting (D), whose district includes San Francisco, said the DMV did “the right thing” by suspending the permits when it learned the full extent of the crash. While state legislators are grappling with how to control this rapidly developing industry, he said the DMV already has a rigorous permit approval process for autonomous vehicles. Cruise, for example, said it has received seven different permits over the past few years from the DMV to operate in California.

In California alone, there are more than 40 companies — ranging from young start-ups to tech giants — that have permits to test their self-driving cars in San Francisco, according to the DMV. According to a Washington Post analysis of the data, the companies collectively report millions of miles on public roads every year, along with hundreds of mostly minor accidents.

“It’s hard being first, that’s the problem,” Ting said. “We are doing the best we can with what we know, while knowing that [autonomous vehicles] are part of our future. But how do we regulate it, not squash it?”

A skewed version of events

Just as the light turned green at a chaotic intersection in downtown San Francisco that October night, a pedestrian stepped into the road. A human-driven car rammed into the woman, causing her to roll onto the windshield for a few moments before she was flung into the path of the Cruise driverless car.

The human-driven car fled the scene, while the Cruise vehicle remained until officials arrived.

The morning after the collision, Cruise showed The Post and other media outlets footage captured by the driverless vehicle. In the video shared via Zoom, the driverless vehicle appeared to brake as soon as it made impact with the woman. Then the video ended.

When asked by The Post what happened next, Cruise spokeswoman Hannah Lindow said the company had no additional footage to share and that the autonomous vehicle “braked aggressively to minimize the impact.” According to the DMV, its representatives were initially shown a similar video.

But that original video captured only part of the story.

President of the San Francisco Board of Supervisors Aaron Peskin said that first responders who tended to the crash noted a trail of blood from the point of impact with the woman to where the vehicle ultimately stopped about 20 feet away.

The DMV said it met with Cruise the day after the crash but only received additional footage 10 days later, after “another government agency” told the DMV it existed. While the Cruise vehicle did initially brake as the company reported, the longer video showed that the car began moving again toward the side of the road.

According to the DMV, the Cruise vehicle dragged the woman pinned underneath for about 20 feet, a move that may have worsened her injuries.

Cruise disputes the DMV’s account, saying that “shortly after the incident, our team proactively shared information” with state and federal investigators.

“We have stayed in close contact with regulators to answer their questions and assisted the police with identifying the vehicle of the hit and run driver,” Lindow said in a statement. “Our teams are currently doing an analysis to identify potential enhancements to the [autonomous vehicle’s] response to this kind of extremely rare event.”

In its decision to suspend Cruise’s driverless permits Tuesday, the DMV said that Cruise vehicles are “not safe for the public’s operation” and determined that the company misrepresented “information related to safety of the autonomous technology.”

Meanwhile, the National Highway Traffic Safety Administration also opened an investigation into Cruise this month over reports that its vehicles “may not have exercised appropriate caution around pedestrians in the roadway.”

Ed Walters, who teaches autonomous vehicle law at Georgetown University, said that driverless technology is critical for a future with fewer road fatalities because robots don’t drive drunk or get distracted. But, he said, this accident shows that Cruise was not “quite ready for testing” in such a dense urban area.

“In hindsight you would have to say it was too early to roll these cars out in that environment,” he said. “This is a cautionary tale that we should be incremental. That we should do this step by step and do as much testing as we can with people in the cars to see when they are safe and whether they are safe.”

Under the DMV’s autonomous vehicle program, companies are asked to publicly report collisions involving driverless cars only when they are in test mode. That means if an incident like the Oct. 2 crash occurs while the company is technically operating as a commercial service, the company does not have to publicly report it as an “Autonomous Vehicle Collision Report.”

As of mid-October, the DMV said it received 666 such reports. The Oct. 2 crash is not one of them.

“In commercial deployment, filing crash reports with the state is essentially voluntary,” Julia Friedlander, San Francisco Municipal Transportation Agency’s senior manager of automated driving policy, told city officials during a recent meeting. “It’s possible that some companies are making the decision to file reports sometimes and not necessarily file reports at other times.”

Cruise said it complies “with all required reporting from our regulators” and the company has “conversations with regulators about a number of reportable and non-reportable incidents on a regular basis.” Lindow, the spokeswoman, said the company reported the Oct. 2 crash to the DMV under reporting requirements that are not publicly available.

This is just one example of how difficult it is to get an accurate picture of the performance of driverless cars.

There are few clear federal regulations that set rules for how autonomous vehicles must function, and what standards they must meet before they are tested on public roads. At the federal level, the National Highway Traffic Safety Administration gathers mostly self-reported crash data from companies. In California, the DMV issues permits for testing and deployment, and the CPUC regulates commercial passenger service programs.

In San Francisco, city officials have no say over whether, or how, the cars are deployed on their streets.

That lack of control has unnerved city officials, especially as driverless cars created by Cruise and Waymo have become ubiquitous in San Francisco. The cars have caused major headaches around the city, as they have disrupted first responders on numerous occasions, from rolling into scenes cordoned off by caution tape to once colliding with a firetruck on its way to an emergency scene. City leaders attempted to halt the expansion by highlighting these incidents, but were ultimately unsuccessful.

In an interview with The Washington Post last month, Cruise CEO Kyle Vogt said the criticism of driverless cars and the incidents involving his company were overblown.

“Anything that we do differently than humans is being sensationalized,” he said at the time.

Who’s responsible when there’s no driver?

While a human driver hit the pedestrian and a Cruise vehicle dragged her about 20 feet, Peskin, the Board of Supervisors president, said the CPUC commissioners who granted the company expanded permits, despite a flurry of reported issues with the technology, also bear some responsibility for the crash.

“Yes I blame Cruise,” he said. “But there was supposed to be a check and balance — and that check and balance completely failed, and it failed in a spectacular way.”

Terrie Prosper, a spokesperson for the CPUC, declined to make any of the commissioners available for an interview about this issue, saying “this matter is under deliberation.”

Moving forward, Chiu, the San Francisco city attorney, said officials are still working on their request to appeal Waymo’s permits to operate its robotaxi service in the city.

While the company has not caused as many high-profile incidents as Cruise lately, he said it is important for the state to “go back to the drawing board” until regulators can figure out clearer standards for the technology.

“The fact that we have multiple state agencies that appear to be working in different directions is challenging,” he said. “Who is ultimately responsible for ensuring safety on our streets?”
