How driverless cars will be a hijack risk

Published Feb 15, 2017


It is the year 2030. Driverless cars are gaining in popularity, sold on the promise they will make travel safer and more enjoyable. But cunning criminals have worked out how to exploit the vehicles’ high-tech security systems - if they throw something at them, the machines will instantly come to a halt in order to avoid injury or damage.

Gangs of armed youths roam wealthy areas, forcing the autonomous cars to stop, smashing their windows then robbing their helpless occupants of jewellery and money. Or worse, hijacking them.

This nightmare scenario may sound like something out of a dystopian science-fiction novel, but it is actually a prediction made in a study commissioned by the UK Government about the imminent arrival of driverless vehicles on Britain's streets - and there are clearly even bigger implications if driverless cars ever become a common sight on South Africa's crime-ridden streets.

The study paints a disturbing picture of the future of transport in which cars driven by computers get in the way of ambulances, are hacked to give away celebrities’ whereabouts and even decide whether it is better to run over a child or a pensioner.

It is a very different vision from that of UK Ministers, who want to put Britain at the forefront of the coming revolution in motoring in the hope it will generate as much as £51 billion (R826bn) for the UK economy, save 2500 lives and create 320 000 jobs by 2030.

Last year’s Autumn Statement pledged £100 million for new ‘testing infrastructure’ and a Modern Transport Bill will be published soon to change insurance law so that users of driverless cars are covered as well as their vehicles.

Four trials of the vehicles are already under way, in Greenwich, Bristol, Coventry and Milton Keynes, while ‘platoons’ of driverless trucks connected by computers are earmarked for pilot studies on the M6 in Cumbria.

Roads will be safer, though

Transport Minister John Hayes told a House of Lords committee last year that driverless cars "could make driving safer" because "human error is the principal cause of most road traffic accidents". He also said the technology "might be beneficial in respect of accessibility" because it would allow the elderly and disabled to get around more easily. And it might lead to less congestion on the roads and fewer harmful emissions because "people will acquire a car when they need it, rather like they might book a taxi now when they need it" rather than everyone having to own one.

Engineers have devised a scale of six levels of vehicle automation, from zero where the driver is in complete control, to five where computers are in charge of everything and occupants never take the wheel.

Although some ‘level four’ cars have been tested at slow speeds on UK roads, experts say level five is many years off because manufacturers are still trying to perfect the sensors and cameras needed to make sure drivers are never required to intervene.

The Department for Transport is also aware that the public will have to be convinced that it is safe to let computers take control of the steering wheel and brakes of a speeding car, and so commissioned experts at University College London’s Transport Institute to "identify the key social and behavioural questions that should be addressed relating to automated vehicles".

This included drawing up a "number of plausible scenarios of future technologies and usage patterns… representing a possible future where the introduction and/or adoption of autonomous vehicles had led to unexpected and, in some cases, controversial events".

Automated mugging

One of the 12 scenarios was described in a mocked-up article from a transport magazine dated February 2026 and headlined ‘Automated mugging’. It imagined MPs had launched an inquiry into "the vulnerability of occupants of fully autonomous vehicles following a series of high-profile vehicle-jackings and personal muggings in wealthy, low-density areas at night - throwing into question the whole idea of 'hands-free' driving".

The future article stated: "AVs travelling down residential streets have been suddenly surrounded by groups of young men, wielding bars and bats. The vehicles halt, to avoid causing injury, and then remain immobile while windows are smashed and occupants are threatened. Having suffered the fear and humiliation of the attack, the occupants are further angered by the vehicle’s monitoring systems identifying damage and thereby refusing to restart so they can resume their journey."

The report painted a scenario: "Sue Brown was returning home from a night out with friends and while her vehicle was passing a local park something was thrown in front of her car, which made an emergency stop.

"Immediately she was surrounded by four youths; one smashed a side window and demanded her necklace, watch and purse. 'What could I do?' she recalls. 'If I’d had my old manual car I’d have driven at them and they would have soon scattered! We bought this car as we were told it was a lot safer… we traded it in the next day for a ‘proper’ car.'" In fairness, though, this is rather mild compared to some of the crimes that a motorist such as Sue would be vulnerable to.

Participants in the Whitehall event were asked if this would be a serious enough problem to halt the progress of driverless cars, and if it made sense to allow technology where "the occupant has no control over the actions – or inactions – of the vehicle".

The research also imagined that driverless cars could hinder the progress of emergency service vehicles, leading to unnecessary deaths, because some would pull over to the left of the road and others to the right.

Stranded passengers and ethical dilemmas

And it invented a social-media post by a woman whose car – called a Wu Ming X36 and boasting SitBack technology – had suffered a "complete systems failure" while she was in the Highlands and left her stranded as it had no manual override or even a steering wheel. 

One scenario, written in the style of a newspaper column, imagined how "the world’s most renowned moral philosophers" had developed a ‘behavioural algorithm’ to decide how autonomous vehicles would deal with ethical dilemmas.

But it led to a professor’s Oxford home being firebombed because of the decision the software made in one crash. The article imagined that the car had chosen to spare the life of "a four-year-old boy with a degenerative disease likely to mean he would die before the age of ten" at the cost of the life of "a 78-year-old with 13 grandchildren".

Those taking part in the research last year were asked if we can "ever get used to the notion of setting rules that place higher value on some lives than others", and whether it would make a difference if ‘we expect many fewer people would be dying than is currently the case without autonomous vehicles’.

Another scenario imagined that children would take to playing ‘chicken’ with the ‘Robodrivers’ by stepping in front of them to see if they stopped in time.

A more positive article highlighted the possibility that commuters would "gain an hour a day" as they could use their journeys to work or play computer games with their children instead of having to concentrate on the road ahead.

Participants in the Government-commissioned workshops were asked: "How far away in technological terms do you think we are from a scene like this occurring in practice?"

Having been confronted with these images of a dark future, they could be forgiven for hoping it is still many decades off.

Mail on Sunday
