Self-driving car users shouldn't be accountable for accidents, report suggests

An autonomous car on a winding road
(Image credit: Getty Images)

Drivers should be reclassified when operating a fully autonomous vehicle and not be held legally accountable for any road accidents, a new report has proposed.

Rather, the company behind the driving system should be responsible, according to the law commissions for England, Wales and Scotland.


The law commissions were asked to examine self-driving cars in 2018, with the ultimate goal of publishing a series of reports to form a regulatory framework for autonomous vehicles and their use on public roads.

While driverless car technology has been deployed in various settings, the technology is still not ready to be fully implemented on the UK's roads.

The report, which will be laid before the UK, Scottish and Welsh governments for consideration, could force wholesale changes for car makers and to road safety laws.

It recommends that a clear definition of 'autonomous cars' be legally stipulated to avoid future confusion. This, it states, should create a "clear bright line" between systems that require driver attention and those that do not. This would help tackle the "problem of passivity", according to the report, which cites research into human behaviour showing that people find it far harder to monitor a task passively than when they are fully engaged.

"Once their eyes and minds wander away from the road, they have limited ability to respond appropriately to events," the report said. "They should not be held accountable for failing to notice problems."


In 2018, an Uber test vehicle hit and killed Elaine Herzberg as she crossed a road with her bicycle. Reports have since suggested the car had software faults, specifically that the system couldn't distinguish Herzberg from her bike. However, there was also a safety driver in the car, who was reportedly watching a TV show on her phone at the time of the crash.

It's incidents such as this that the report looks to address. It states that any system that still requires "passive" attention is flawed, and recommends a new "authorisation" scheme to decide whether an autonomous vehicle is or is not 'self-driving' as a matter of law.

This would mean changing the law so that the person in the driving seat is no longer classified as a 'driver'. They would instead be the 'user-in-charge', with immunity from a range of offences related to the way the vehicle drives, including forms of dangerous driving and exceeding the speed limit.

Other car makers, such as Tesla, have already seen a number of fatalities involving vehicles with self-driving features. Most recently, two men died when their Tesla veered into a tree and caught fire. An investigation into the crash is ongoing, with law enforcement and Tesla boss Elon Musk disputing whether the car's autonomous features were engaged and therefore at fault.

This accident is also relevant to the law commissions' report because Tesla allegedly failed to hand over the car's data to the police. Under the proposed laws, such data would have to be made accessible, and there would also be sanctions for car makers that fail to reveal how their systems work.

Bobby Hellard

Bobby Hellard is ITPro's Reviews Editor and has worked on CloudPro and ChannelPro since 2018. In his time at ITPro, Bobby has covered stories for all the major technology companies, such as Apple, Microsoft, Amazon and Facebook, and regularly attends industry-leading events such as AWS Re:Invent and Google Cloud Next.

Bobby mainly covers hardware reviews, but you will also recognize him as the face of many of our video reviews of laptops and smartphones.