Washington DC - Despite promises that self-driving cars are in the fast lane, it could be a long haul before they merge with everyday lives.
Industry trackers and analysts caution that technical challenges, along with legal and liability issues, will be speed bumps on the road to self-driving cars becoming commonplace on the streets.
The US House of Representatives on Wednesday approved legislation aimed at clearing the path for introduction of self-driving vehicles by requiring consistent regulations across the 50 states.
A tweet from Representative Greg Walden, who chairs the House panel that drafted the bill, said: "The Self Drive Act will help pave the way for self-driving cars nationwide and ensures America stays a global leader in innovation."
The bill, which needs Senate approval before being sent to the White House, would prevent states from imposing regulations on autonomous vehicles that would make it more difficult for manufacturers to deploy self-driving cars nationwide.
At the end of June, the Group of Seven countries expressed a commitment to remove regulatory obstacles and smooth the way for self-driving vehicles. Big car companies are racing to get 'autonomous vehicles' in gear, and their competition includes Silicon Valley innovators such as Apple, Google, Tesla and Uber.
Major carmakers have promised to have self-driving models coming off assembly lines as early as the year 2020; even computer chip giant Intel announced plans for a fleet of self-driving cars, breaking the news after closing a $15 billion (R193 billion) deal to buy Israeli autonomous technology firm Mobileye.
Waymo, a self-driving car company owned by Google-parent Alphabet, is testing self-driving cars with volunteers in Arizona, and in California alone, about 40 companies have permits from the state to test cars without drivers on the roads. New York is open to similar testing with an eye toward reducing accidents.
Meanwhile, Tesla boasts that all of its models are equipped with sensors, cameras, and other technology to enable them to navigate routes without human involvement, and car makers have already put aspects of the technology to work with features such as self-parking and automatic braking to avoid collisions.
It is estimated that more than 90 percent of driving accidents result from human error, and advocates of autonomous vehicles argue they will save lives and avert injuries. Tesla founder Elon Musk has publicly contended that the foundation is laid for cars to navigate completely on their own twice as safely as vehicles controlled by humans.
Despite advances in sensors, software and machine smarts, some argue for companies and authorities to throttle back expectations.
Technalysis Research analyst Bob O'Donnell commented: "It's time to face some challenging realities when it comes to the world of autonomous cars. For those predicting radical changes in how consumer-purchased cars and trucks are built, bought, and used over the next few years, it's time to stop the charade."
He listed concerns including security, design complexity, legal expectations and a lack of infrastructure for electric cars, suggesting a long timeline until reliable self-driving vehicles merge into the mainstream.
Electric cars are seen as leading the charge in self-driving, yet those models make up only a scant portion of vehicle sales. Regulations in countries around the world would have to catch up to, understand and adapt to self-driving technology, while insurance companies, and probably the courts, will need to work out who gets the blame when accidents happen.
According to US press reports, more than a dozen engineers and Tesla executives working on autonomous capabilities in cars have internally expressed worries about whether the technology is safe enough to be out in the wild. Musk has held firm that the company's cars need only a green light from regulators to start driving themselves.
Vulnerability to hacking
Even if the technology left to its own devices proved trustworthy, some worry about the potential for hackers to remotely take control of vehicles in scenes seeming fit for futuristic action films. In 2016 Tesla deployed a security patch for the Model S after Chinese researchers claimed to have hacked into one through a wireless connection.
A Tesla model with autopilot was involved in a fatal accident in the United States in 2016 and, while the technology was cleared of culpability, several opinion polls indicate many people remain reluctant to take their hands from steering wheels.
The question also arises of what kind of ethics will be programmed into car-controlling software. For example, what should a self-driving car do if forced to choose between saving its occupant or a pedestrian?