Although cars like the Tesla S can do much of the driving themselves, drivers still need to pay attention to the road. Beck Diefenbach / Reuters.

Johannesburg - As we marvel at the advances made in semi-autonomous cars, it’s important to remember that they’re not quite ready to take over full driving duties just yet.

Earlier this year, the much-publicised death of a Tesla driver due to the failure of the car’s autopilot system underlined the disparity between reality and consumer expectations - at least the expectations of early adopters who are ready to wholeheartedly embrace new technology.

In May a Tesla Model S crashed into a truck on a Florida highway while in its semi-autonomous autopilot mode. The car didn’t brake after its sensors failed to detect the truck’s white exterior against the bright sky, and the driver didn’t take over and apply the brakes - either because he was distracted or because he put his complete trust in the autopilot.

Tesla has now updated the software so that the car can better sense the surrounding environment, including bouncing radar waves under a vehicle in front to see further ahead.

However, it’s important to remember that the Tesla and other cars with semi-autonomous systems (including the Mercedes E-Class and BMW 7 Series) are just that: semi-autonomous.

While they are able to accelerate, brake and steer themselves to improve convenience and safety, drivers are still required to keep their hands on the steering wheel and eyes on the road. Such cars have built-in safety systems that disengage the autopilot if the driver ignores warnings to grip the wheel.

In the E-Class, if the driver fails to place their hands on the steering wheel within a certain time, the car automatically comes to a stop and switches on its hazard flashers.

For now, today’s semi-autonomous cars are still a stepping stone to a future where vehicles will be able to drive themselves without any human intervention. Systems like adaptive cruise control and automated braking and steering are preparing us for the psychological change of one day operating a car that makes all the driving decisions by itself.

Legal and ethical questions

Before we set these artificial-intelligence vehicles loose on the world, there is a wide range of legal and ethical questions to be answered. Cars still need to be taught some ethics - for example, in the event of an unavoidable crash, will the car choose to swerve one way and hit a pedestrian, or another way and hit a motorcyclist?

There’s also the minefield of liability, and who will be responsible in a crash involving an autonomous car: the driver or the motor company?

These are some of the obstacles to overcome before we can simply sit back in a car and say "Home, James" - not the least of which is that a car's computer could potentially be hacked.

However, the day of robotised cars is getting ever closer and it’s estimated they will account for up to a quarter of car sales in 20 years’ time.

It’s indeed a future to look forward to. Unlike humans, autonomous cars will stick to the road rules and won’t succumb to road rage, and this good behaviour should greatly reduce the annual toll of 1.3 million deaths on the world’s roads.

Star Motoring