
The legal challenges of self-driving cars

Max Alexander-Jones

25/11/2021

Reading time: four minutes

The Department for Transport recently announced its intention to allow Automated Lane-Keeping Systems (ALKS) to be used on British roads by the end of 2021.

The technology is similar to the ‘Autopilot’ function in a Tesla — it’s responsible for controlling the position and speed of a car in a single lane, with a speed limited to 37 mph (60 km/h).

Cars operating with ALKS would be defined as self-driving, pending GB type approval, which would mean drivers would not be required to keep their hands on the wheel or monitor the road.

This would represent a significant change for car insurance and road safety. However, it’s crucial to consider why such a change is regarded as a necessary step forward in automation.

The benefits of introducing ALKS

According to the Society of Motor Manufacturers and Traders, automated driving systems could prevent 47,000 serious accidents and save 3,900 lives over the next decade.

The UK’s climate targets, combined with the ongoing COP26 summit, are another reason why automated electric vehicles may be necessary. With the UK committing to ambitious targets of carbon net zero by 2050 and a 78% cut in emissions by 2035, electric vehicles and their potential self-driving technology will be vital in reducing harmful emissions from petrol and diesel engines.

The risks of introducing ALKS

However, the reward of new electric self-driving vehicles is not without risk. Many are apprehensive about the marketing of automated vehicles. Matthew Avery, director of research at Thatcham Research, states: "ALKS as currently proposed by the government are not automated […]. They are assisted driving systems as they rely on the driver to take back control.”

Perhaps the most pressing safety issue will be driver overconfidence, as seen in 2018 when a Tesla-driving Nottingham resident climbed into the passenger seat while on the motorway. Notably, the safety mechanisms in even some of the most sophisticated self-driving cars have demonstrated worrying flaws.

The consumer advocacy group Consumer Reports found that Tesla’s Autopilot system (which requires drivers to have their hands on the wheel) could be gamed by sitting on the driver's seat belt and using a small, weighted chain on the steering wheel.

Although self-driving cars are touted as bulwarks against human error, their potentially misleading advertising may instead embolden individuals to act in irresponsible ways while their cars drive for them. This may lead to unsafe road behaviour and could cause an increase in serious road accidents.

If this were the case, who would be held responsible:

  • the Netflix-watching ‘driver’;
  • the manufacturer; or
  • the tech boffin in Silicon Valley?

The proposed law on liability is quite clear here. Drivers behind the wheel of a GB-approved automated vehicle in self-driving mode would not be deemed the active driver. Rather, they would be a “user-in-charge”, free to keep their hands off the wheel and catch up on Netflix.

The ‘user-in-charge’, in all their TV-watching, lunch-eating freedom, would not be prosecuted for various breaches of traffic rules. Instead, the issue would be resolved between the safety assurance regulator and the Authorised Self-Driving Entity (ASDE). This entity would be the vehicle manufacturer, the software developer or both.

Nevertheless, this does not leave the user-in-charge free to ignore warnings presented by the car. A ‘transition demand’ that cuts off any screen use unrelated to driving, provides clear signals and gives the user-in-charge enough warning would be expected to prompt the driver to respond and take back control.

How liable is the human driver?

However, this raises the question of how liable the human driver would be once these warnings have been received. A significant challenge in legislating on this issue is the severity of the action required of the driver and the length of warning provided.

Can a vehicle truly be deemed ‘self-driving’ if a human is required to respond to external events such as a tyre blow-out, changing weather or the presence of emergency services on roads? Does the requirement of potential evasive action while driving (eg, a hard stop) negate a car’s ability to be defined as autonomous or self-driving?

A further challenge in devising laws centres on the functionality of driverless systems outside of motorways and their responsiveness to road alerts. Currently, ALKS cannot operate outside of motorway conditions.

There’s also concern about the systems’ ability to adapt when HGVs obscure signs that change the road’s status or speed limits at the time. Manufacturers and software developers must also consider whether liability will be shared between them, or whether those responsible for the software alone should foot the bill.

Whatever the legal outcome, it may not be too long until you can binge-watch a Netflix blockbuster series behind the wheel of a car!

For more on the rise of autonomous vehicles: