SAN FRANCISCO - For a few hours late last month, Tesla cars began behaving erratically after receiving an overnight software update. Cars suddenly started slamming on the brakes at highway speeds, owners reported, risking collisions. CEO Elon Musk took to Twitter to acknowledge a problem with the software and vow that the update was being rolled back.
Ordinarily, that would have been the end of it for Musk, whose company has often flouted standard regulatory practices. But late last month, Tesla unexpectedly reported the glitch to the National Highway Traffic Safety Administration and issued an official recall notice detailing the problem, which affected nearly 12 000 vehicles.
Tesla's decision comes as the Biden administration has stepped up enforcement of federal safety regulations regarding advanced driver-assistance systems - particularly Tesla's habit of issuing software fixes without reporting underlying problems. Last month, NHTSA publicly dinged Tesla for failing to issue a formal recall when it pushed a separate software update, meant to help its cars better detect parked emergency vehicles in low light, after around a dozen crashes into such vehicles while Autopilot was activated.
It wasn't an isolated instance of criticism for the electric carmaker.
The National Transportation Safety Board, which has investigated multiple crashes involving Tesla's Autopilot software, has publicly called out the carmaker over its failure to follow up on its safety recommendations. NHTSA, the top federal vehicle safety regulator, is investigating the Autopilot software itself - bringing potential regulatory authority to the equation. Musk has taken aim at federal regulators, but they haven't budged. Transportation Secretary Pete Buttigieg said recently the CEO was free to take up his concerns directly with him.
Longtime auto industry observers and safety experts say Tesla is getting the message, even if Musk continues to be combative. The repeated incidents have potential financial and legal ramifications for Tesla, they said.
"Inside Tesla, there has been a shift," said John Rosevear, senior auto analyst at the Motley Fool. "'We're exposing ourselves here and we need to maybe get more serious about this.'"
Tesla, which has disbanded its public relations department, did not respond to a request for comment. The company has argued in the past that using Autopilot is safer than normal driving, based on comparisons of crash data. Musk has called Autopilot "unequivocally safer." Upon confirming the Full Self-Driving rollback, he said occasional issues are "to be expected with beta software," which is intended to be tested in a variety of conditions to iron out problems.
Company officials pledged to work with NHTSA in a recent earnings call, saying Tesla embraced the scrutiny on its software.
The official recall notice marks a sharp departure from Tesla's more typical mode of operation, in which it acts like its Silicon Valley neighbors, sending update after update to the software that powers its products and making fixes in real time. The company has also required users to sign nondisclosure agreements.
But Tesla vehicles are also being beta tested in real time on the road - and the latest updates highlight the heightened dangers that come with putting software that is still a work in progress in the hands of drivers.
Full Self-Driving is the latest iteration of the company's software, now in the hands of roughly 12 000 drivers who paid as much as $10 000 (R150 000) for the upgrade and either received early access or passed a safety screening. It adds capabilities to navigate city and residential streets, with an attentive owner behind the wheel at all times. The features are not autonomous by regulatory and industry standards.
But already drivers have been reporting issues. Videos uploaded to social media show the software struggling to navigate roundabouts, veering toward pedestrians and even abruptly turning toward oncoming traffic.
Even Musk acknowledged in July that the software - which began its rollout to users a year ago - was a "debatable" proposition for potential subscribers.
The company's less advanced version of the software, dubbed Autopilot, is standard on Tesla vehicles. The software can navigate highways from on-ramps to off-ramps, and can also steer within marked lanes.
Things started escalating when NHTSA announced over the summer it would begin requiring Tesla and other manufacturers to report on incidents involving advanced driver-assistance systems, such as Autopilot. And in August, the agency launched a formal probe of Autopilot after nearly a dozen crashes involving parked emergency vehicles.
One of Tesla's latest run-ins with NHTSA came in October, after the company failed to notify officials of the September emergency-vehicle software update, which it had issued shortly after the government opened its probe into Tesla collisions with emergency vehicles.
NHTSA notified Tesla such an action would typically be initiated through the federally established recall process, intended to remedy urgent safety risks through a combination of manufacturer expertise and government oversight.
"Any manufacturer issuing an over-the-air update that mitigates a defect that poses an unreasonable risk to motor vehicle safety is required to timely file an accompanying recall notice to NHTSA," an agency official wrote in the Oct. 12 letter.
The agency also issued a letter in October raising concerns about another practice: requiring Full Self-Driving beta testers to sign a nondisclosure agreement prohibiting them from sharing certain information about the software beta. The agency noted it relies on the feedback from the public to learn of potential safety issues.
Tesla did away with the agreement, according to Musk, who compared it to toilet paper.
Kevin Smith, of Murfreesboro, Tenn., drives a Tesla Model Y and is part of the beta test. On October 24, he hoped to test out the latest update but instead was locked out of the system, he said. And as he tried to get it to work, he heard from a fellow beta tester.
"He was screaming 'Do not use it! Do not use it!' " Smith said. "'We are trying to wake up the folks at Tesla, trying to get the word to Tesla.'"
One of Smith's fellow testers had shared a video showing one of the emergency braking events - including "a pretty dramatic slamming of the brakes," he recalled. "For that to trigger undesirably at high speeds is an incredibly dangerous event."
Smith noticed later that day Tesla had also remotely disabled his auto emergency braking and forward collision warning functions, safety features that he would ordinarily keep activated. And the company hadn't let him know.
"Dear @elonmusk, are you in there crossing the streams? I didn't change this brah," he wrote in a tweet. "This isn't ok without any communication. Communication is not hard. I'm doing it now. Please advise."
By the following day, NHTSA had asked Tesla for more information about the incident, according to NHTSA spokeswoman Lucia Sanchez.
Auto safety experts say Tesla's tweaks to safety features - without any notice to owners - were an unprecedented violation of trust. And it was exactly the type of behavior that had triggered the attention of federal auto safety regulators in the past.
Carnegie Mellon University professor Philip Koopman, who focuses on autonomous vehicle safety, described it as "incredible - not in a good way."
"If you're testing, you need to know if they're changing your vehicle out from under you," he said. "Just taking that away and not making it super super obvious to drivers that that's happened is extremely concerning."
Tesla submitted its recall notice the following Friday. In the Oct. 29 bulletin, the company detailed why corrective action was necessary. It was the first time many drivers learned of what actually happened. That document, which confirmed what Tesla called the "false-positive braking" and accompanying warning chimes experienced by drivers, detailed the sequence of events that resulted from the buggy software update.
"Tesla began to receive reports of false [forward collision warning] and [automatic emergency braking] events from customers," it wrote. "In a matter of hours, we investigated the reports and took actions to mitigate any potential safety risk. This included cancelling [the update] on vehicles that had not installed it, disabling FCW and AEB on affected vehicles, and/or reverting software to the nearest available version."
Tesla also laid out the risk to owners. "If the AEB system unexpectedly activates while driving, the risk of a rear-end collision from a following vehicle may increase," it wrote.
Safety experts say it appears Tesla issued the recall because of the mounting regulatory pressure. NHTSA's Sanchez declined to say whether the recall came at federal safety regulators' urging. "NHTSA will continue its conversations with Tesla to ensure that any safety defect is promptly acknowledged and addressed," she said.
Publicly, Musk has reacted to the increased regulatory attention. He's taken aim on Twitter at the Biden administration and federal auto safety appointees from both the NTSB and NHTSA.
He lashed out last month after NHTSA appointed Duke University professor Missy Cummings, who has been critical of Tesla's Autopilot and autonomous ambitions, as senior safety adviser.
In a tweet, he called her track record "biased." Tesla-supporting Twitter users swarmed her account, attacking her record. Buttigieg defended the appointment and invited Musk to raise his concerns with him directly, according to news reports.
NHTSA declined to make Cummings available for an interview.
NTSB chair Jennifer Homendy recently spoke with The Post and other publications, publicly scolding Tesla for rolling out new features without addressing prior recommendations about Autopilot. Those included instituting better driver monitoring and implementing safeguards to make sure the software is used only in the conditions for which it is intended.
Musk tweeted a link to her Wikipedia page soon after. His followers also went after her.