Autonomous Vehicles and Liability – Social values behind the software’s automated decision

Your AV hits pedestrians in order to save your life. Is it acceptable to put other lives at risk instead of our own? And who is liable for the accident: the driver or the software's programmer?


An autonomous vehicle (AV), or self-driving car, is a vehicle that can move safely without human input. Researchers of this emerging technology estimate that it could reduce road fatalities by 90%.[1]

Smart mobility is one of the key elements of the cities of the future, and when it comes to smart mobility, one of the main themes is automated cars. It is, therefore, foreseeable that AVs will, at some point, replace conventional human-driven cars.

Self-driving cars will stimulate innovation in mobility, sustainability and efficiency: reduced pollution, energy savings, traffic optimized through intercommunication between AVs (eliminating bumper-to-bumper traffic) and less unproductive, stressful driving time are good examples.

Sustainability is an important element of AVs, as they have the potential to reduce pollution and save energy, while improved safety means fewer injuries and deaths. Furthermore, AVs also open opportunities to improve mobility for people with disabilities, the elderly and others.

The Society of Automotive Engineers (SAE) defined six levels of driving automation, which means there is a spectrum rather than a binary distinction between human-driven and autonomous vehicles.[2] Discussions of AVs usually refer to levels four and five, where human interaction is optional or absent, respectively. This paper focuses on these last two levels of automation.
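For readers who think in code, the six SAE levels can be sketched as a simple mapping. This is an illustrative paraphrase of the classification described above, not the official SAE J3016 wording:

```python
# Illustrative sketch of the SAE levels of driving automation.
# Descriptions are paraphrased summaries, not official SAE text.
SAE_LEVELS = {
    0: "No automation: the human performs all driving tasks",
    1: "Driver assistance: e.g. cruise control or steering assistance",
    2: "Partial automation: combined steering and speed control, driver supervises",
    3: "Conditional automation: system drives, but the human must take over on request",
    4: "High automation: no human input needed within a defined operating domain",
    5: "Full automation: no human input needed under any conditions",
}

def requires_human_fallback(level: int) -> bool:
    """Levels 0-3 still depend on a human driver being available."""
    return level <= 3

# The levels this article focuses on:
autonomous = [lvl for lvl in SAE_LEVELS if not requires_human_fallback(lvl)]
print(autonomous)  # [4, 5]
```

The helper makes the article's cut-off explicit: only levels four and five can dispense with a human fallback, which is why they raise the liability questions discussed below.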

Although the advantages are tremendous, other issues will arise when AVs hit the road, especially in terms of ethics and liability. For instance, would we prefer to buy a car that would save its passengers at any cost, or a utilitarian car that would minimize overall risk, even if that meant killing its passengers? And who is liable for a self-driving car crash? Some point to the vehicle owner, others to the manufacturer or the software's programmer. This article aims to consider and discuss the various alternatives expressed in the legal community and, ultimately, propose the fairest option.

The Trolley Problem and Autonomous Vehicles

The trolley problem is an ethical dilemma: a runaway trolley is about to collide with five people, but a bystander can intervene by diverting the trolley to another track, which will result in the death of only one person.

There are many variants of the trolley problem, but the ethical question is always the same: is it better to do nothing (which will result in the death of five people), or to interfere and thereby sacrifice one person who would otherwise be safe?

The trolley problem is often brought up in discussions of autonomous vehicles, since it is the programmer of the software who has to anticipate scenarios in which deaths are inevitable. The AV then needs to decide whether to protect the passengers, the pedestrians or the bystanders.

If we faced the same situation in a human-driven car, the driver would react in panic by either staying on course or swerving, but that would be a mere reaction, not a deliberate decision.

As for AVs, the decision will not be an instinctual one but an instruction by a programmer, which could be seen as discriminatory, given that the algorithm will systematically favour a certain object or type of person to crash into.
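The point can be made concrete with a minimal sketch. Everything below is hypothetical (the outcomes, the weights, the rule itself are invented for illustration and describe no real AV software); it simply shows that a pre-programmed crash policy is a fixed, deterministic rule, so the same kind of victim is selected every time:

```python
# Hypothetical sketch: a pre-programmed crash policy is not a panicked
# reflex but a deterministic rule, so it *systematically* favours one
# group over another. All values here are invented for illustration.
from dataclasses import dataclass

@dataclass
class Outcome:
    description: str
    expected_fatalities: int

def utilitarian_choice(options: list[Outcome]) -> Outcome:
    # A purely utilitarian rule: always minimise expected fatalities.
    # Given the same inputs, it always returns the same victim.
    return min(options, key=lambda o: o.expected_fatalities)

options = [
    Outcome("stay on course, hit five pedestrians", 5),
    Outcome("swerve, sacrifice the single passenger", 1),
]
print(utilitarian_choice(options).description)
# swerve, sacrifice the single passenger
```

Unlike a human driver's split-second reaction, this rule was chosen in advance, which is precisely why its systematic preferences can be characterised as discriminatory.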

Civil Liability Issues

Nowadays, with conventional cars, liability is generally attributed to the driver, so current laws may not apply to AVs. Because artificial intelligence is involved, researchers identify, besides the AV's owner, the manufacturer, the programmer, the occupants, the supplier or even a hacker as possible responsible actors.

Note that, when it comes to the design of the program and how the AV will decide in a situation with unavoidable fatalities (as mentioned in The Trolley Problem), policymakers have not yet agreed on the standards to use, for instance whether the number of victims or the likelihood of severe injuries is most relevant.

For AVs of levels three and four, where the driving experience is shared between AV and human driver, one solution is to equip the car with a "black box", similar to those in airplanes, to monitor the driving and make it easier for insurance companies to ascertain who is at fault.[3] This means that the driver can still be liable, but also that liability can shift to the design and manufacturing of the AV when a product defect or safety issue is involved.

It is important to assign liability to the proper party, which will, in most cases, be the party in control of the vehicle when an accident occurs. Products liability law therefore needs to accommodate the emergence of AVs in our everyday reality. In most cases that party will be the manufacturer: if the vehicle was in autonomous mode, it was the technology itself that was operating the vehicle, making it responsible for the accident.

This type of product liability also ensures that manufacturers provide a safe product to consumers: holding manufacturers liable for any mistake or defect creates an incentive for programmers to build in safety safeguards.[4]

Moreover, some researchers point in the direction of comparative fault between programmer and driver, depending on the facts of a specific car crash and who was in control of the car.[5]

In terms of enforcement of liability, one important difference is that between a product liability lawsuit and a car crash lawsuit: the former is more time-consuming and expensive than the latter, making it a less attractive option for some victims. For this reason, the 2016 RAND study suggested shifting liability from driver to manufacturer without the need to prove malfunction; in other words, a no-fault automobile insurance that quickly compensates victims without either party having to prove fault.

Nevertheless, a system that always makes the manufacturer liable will be reflected in the price of AVs, since their cost will have to incorporate the projected liability.


Conclusion

Autonomous vehicles have gained tremendous attention over the last few years, with Tesla and Google leading the market, followed by BMW and Volvo.

On the one hand, some researchers point towards comparative fault between programmer and driver, depending on the facts of a specific crash and who was in control of the car. On the other hand, it has also been proposed that manufacturers be liable in all instances, with no-fault insurance to automatically compensate victims; one reason for this proposal is to make the lawsuit accessible to everyone, since a product liability lawsuit is overall more expensive. On the whole, a system where liability is analysed according to the specific situation seems to be the fairest path for now.

Considering the shortcomings of each option, there is still a long way to go in terms of policies and legislation, although the clock is already ticking as AVs are expected to be deployed in the next couple of years.

[1] Viktória Ilková, Legal aspects of autonomous vehicles – an overview, 21st International Conference on Process Control, 2017, p. 1

[2] The first levels (zero, one and two) always require human intervention, though they include some automated features, such as cruise control or steering assistance. The third level has environmental detection capability but still requires human input, so the driver must be alert to take over if the system is unable to perform a certain task.

[3] Alawadhi, Mohamed, Almazrouie, Jumah, Kamil, Mohammed and Khalil, Khalil Abdelrazek, Review and analysis of the importance of autonomous vehicles liability: a systematic literature review, International Journal of System Assurance Engineering and Management, 2020

[4] Jeffrey Gurney, Sue my car not me: products liability and accidents involving autonomous vehicles, Journal of Law, Technology & Policy, vol. 2013, p. 25

[5] Lynne McChristian and Richard Corbett, Regulatory Issues Related to Autonomous Vehicles, Journal of Insurance Regulation, vol. 35, no. 7, 2016, pp. 9-10

The Insights published here reproduce the work developed for this purpose by the respective author, and therefore retain the original language in which they were written. Responsibility for the opinions expressed in the article lies solely with its author, and its publication does not constitute endorsement by WhatNext.Law or its affiliated entities. See our Terms of Use for more information.
