Self-driving feature is ready for Tesla owners but regulators are not convinced

Published Sep 27, 2021

TESLA is expected to issue a wide release of software it deems self-driving, giving owners the option as soon as midnight to upgrade to its most advanced driver-assistance suite and soon putting thousands of drivers on the road with unregulated and largely untested features.

It’s the first time the company has let typical owners upgrade to the software it calls “Full Self-Driving,” although the name itself is an exaggeration by industry and regulatory standards.

Tesla chief executive Elon Musk said in a tweet that Tesla would release a button as soon as Friday, letting owners request the upgraded suite of advanced driver-assistance features, which Tesla says is a beta, although they won't receive the capabilities right away.

Owners will have to agree to let Tesla monitor their driving behaviour for seven days through a company insurance calculator. If their driving is deemed to be “good,” Musk said on Twitter, “beta access will be granted.”

It’s the latest twist in a saga that has regulators, safety advocates and family of Tesla crash victims up in arms because of the potential for chaos as the technology is unleashed on real-world roads. Until now, roughly 2 000 beta testers have had access to the technology.

Friday’s presumed general release would make it available to all who have purchased the now-$10 000 (about R150 000) software upgrade, and those who have purchased a subscription from Tesla for about $100 to $200 a month.

Still, as recently as July, Musk said the technology was a “debatable” proposition, arguing that “we need to make Full Self-Driving work in order for it to be a compelling value proposition.”

And already, investigators are looking at its predecessor, dubbed Autopilot, which navigates vehicles from highway on-ramp to off-ramp and can park and summon cars, with a driver monitoring the software. The National Highway Traffic Safety Administration opened an investigation last month into around a dozen crashes involving parked emergency vehicles while Autopilot was engaged.

“Full Self-Driving” expands Autopilot’s capabilities to city streets and offers the ability to navigate the vehicle turn-by-turn, from point A to point B.

Tesla and NHTSA did not immediately respond to requests for comment. Tesla has repeatedly argued that Autopilot is safer than manual driving when the modes are compared using Tesla data and information from NHTSA.

Musk has said “Autopilot is unequivocally safer” than typical cars. The data is not directly comparable, however, because Autopilot is supposed to be activated only on certain types of roads in conditions where it can function properly.

Tesla’s move to rapidly roll out the features to large numbers of users is drawing criticism from regulators and industry peers who say it is taking a hasty approach to an issue that requires careful study and an emphasis on safety.

Despite its name, the new software does not qualify as “self-driving” under criteria set by the auto industry or safety regulators, and drivers should always pay attention while it is activated.

“I do think that their product is misleading and overall leads to further misuse and abuse,” said National Transportation Safety Board chairperson Jennifer Homendy, before turning to Musk himself. “I’d just ask him to prioritise safety as much as he prioritises innovation and new technologies ... safety is just as important, if not more important, than the development of the technology itself.”

As for the evaluation period for drivers who want to sign up, the full criteria for qualification have not been laid out – but Musk has dropped hints, saying drivers who make frequent use of the company’s Autopilot software will be rated “Good.” Owners will be able to track their progress in real-time, he said, and will be guided on how they can satisfy the requirements.

Late last month, industry group Chamber of Progress took aim at Tesla’s marketing of the technology.

Tesla’s cars “aren't actually fully self-driving,” wrote the group, which is supported by Apple, Alphabet-owned Waymo and General Motors-backed Cruise. “The underlying issue here is that in case after case, Tesla’s drivers take their eyes off the road because they believe they are in a self-driving car. They aren’t.”

Homendy, the NTSB chair, said Tesla has not shown an active interest in improving the safety of its products. She said the board has made recommendations stemming from fatal crashes in Williston and Delray Beach, Florida, as well as Mountain View, California, but they have gone unanswered.

“Tesla has not responded to any of our requests,” she said. “From our standpoint they’ve ignored us – they haven’t responded to us. And if those are not addressed and you’re making additional upgrades, that’s a problem,” she added.

Following an investigation into a 2018 crash that killed a driver when his vehicle slammed into a highway barrier, the safety board called on NHTSA to evaluate whether Tesla’s systems posed an unreasonable safety risk.

Homendy said NHTSA needs to take a more active role in the matter. The agency recently began requiring reporting on all crashes involving driver-assistance systems.

“It is incumbent on a federal regulator to take action and ensure public safety,” Homendy said. “I am happy that they’ve asked for crash information from all manufacturers and they’re taking an initial step with Tesla on asking for crash information on emergency vehicles. But they need to do more.”

On Twitter, a steady stream of videos from early beta tests has depicted the still nascent “Full Self-Driving” system’s confusion at new obstacles. The system has been shown struggling with roundabouts and unprotected left turns, abruptly veering toward pedestrians, and crossing a double-yellow line into oncoming traffic.

In the latter case, the user wrote: “I want the best for Tesla, but going wide release is not the move, not right now at least.”

Others said they have suffered personally from Tesla’s rapid deployment of its software, and urged the company to reconsider.

Bernadette Saint Jean’s husband, Jean Louis, was killed in July on the Long Island Expressway when a Tesla believed to be using automated features struck him on the side of the road, a crash being investigated by NHTSA.

“Tesla should not be expanding its Autopilot or Traffic-Aware cruise control systems until they can tell me why my husband and all of those First Responders had to die and be injured,” said Saint Jean, of Queens, in a statement through her attorney, Joshua Brian Irwin.

The Washington Post