Commission’s Report on Ethics of Connected and Automated Vehicles

On September 18th, 2020, the European Commission published a report by an independent group of experts on the Ethics of Connected and Automated Vehicles (CAVs). The report provides twenty recommendations for a safe and ethical transition towards driverless mobility and CAVs.

“Who should be responsible in case of a collision when there is no human driver? How can ethical and responsible data sharing by CAVs be ensured? Are pedestrians and cyclists more at risk with CAVs in traffic? In this new report, the Expert Group outlines twenty recommendations on road safety, privacy, fairness, AI explainability and responsibility for the development and deployment of connected and automated vehicles”.


In 2019, the European Commission formed an independent Expert Group “to advise on ethical issues raised by driverless mobility”. The Expert Group was created to ensure “a safe and responsible transition to connected and automated vehicles (CAVs)” through ethical considerations and adequate regulation.


Through the report and its twenty recommendations, the Expert Group covers ethical and legal challenges ranging from “dilemma situations (involving crash-avoidance situations), the creation of a culture of responsibility, and the promotion of data, algorithm and AI literacy through public participation”.


The following are the 20 recommendations produced by the group of experts: 

Source: New recommendations for a safe and ethical transition towards driverless mobility, 18th September 2020, European Commission.

The report itself is divided into three chapters.

The first one – Road safety, risk and dilemmas – demonstrates why safety improvements should “be achieved in compliance with basic ethical and legal principles, such as a fair distribution of risk and the protection of basic rights, including those of vulnerable users”.

The second one – Data and algorithm ethics: privacy, fairness and explainability – details how the “acquisition and processing of static and dynamic data by the CAVs should safeguard basic privacy rights”, should not “create discrimination between users”, and should happen via “processes that are accessible and understandable to the subjects involved”.

The last one – Responsibility – goes beyond identifying who is liable for compensation. This part offers an interesting legal analysis of why “it is also important to make different stakeholders willing, able and motivated to take responsibility for preventing undesirable outcomes and promoting societally beneficial outcomes of CAVs, that is creating a culture of responsibility for CAVs”.


