Written by Rafał Mańko.
The European Parliament has recently called upon the Commission to table a legislative proposal laying down a set of civil law rules on robotics and artificial intelligence. The rules should address such issues as liability for damage caused by a robot, and should establish a European agency for robotics and artificial intelligence.
Growing numbers of robots
Between 2010 and 2014, the average increase in sales of robots stood at 17 % per year, and in 2014 sales rose by 29 %. The main drivers of this growth are automotive parts suppliers and the electrical and electronics industries.

In January 2015, the European Parliament’s Legal Affairs Committee established a working group on legal questions related to the development of robotics and artificial intelligence in the EU, with a focus on civil law aspects. The group held 10 meetings between May 2015 and September 2016, and heard advice from a number of stakeholders, scientists and lawyers.

In June 2016, the EPRS Scientific Foresight Unit published an expert study on the Ethical Aspects of Cyber-Physical Systems (CPS). CPS are intelligent robotics systems linked with the Internet of Things, that is, technical systems of networked computers, robots and artificial intelligence that interact with the physical world. Examples include automated cars and drones, as well as robots used in healthcare, as aids for disabled people and in agriculture. The study drew attention to possible risks arising from the development of robotics, in such areas as employment, privacy protection, safety and civil liability.
On 16 February 2017, Parliament adopted a resolution formally requesting the Commission to bring forward a legislative proposal establishing civil law rules on robotics and artificial intelligence, governing in particular questions of liability and the ethics of robotics.
Registering smart robots
The resolution proposes to introduce a system of registration for ‘smart robots’, that is, robots which acquire autonomy through sensors and/or interconnectivity with their environment, which have at least a minor physical support, which adapt their behaviour and actions to their environment, and which cannot be defined as having ‘life’ in the biological sense. The registration system for such advanced robots would be managed by an EU agency for robotics and artificial intelligence. This agency would also provide technical, ethical and regulatory expertise on robotics.
Liability for damages caused by robots
As regards liability for damage caused by robots, the resolution suggests that liability could be based either on strict liability (no fault required) or on a risk-management approach (liability of the person who was able to minimise the risks). Liability should be proportionate to the actual level of instructions given to the robot and to its degree of autonomy. The rules on liability could be complemented by a compulsory insurance scheme for robot users, and by a compensation fund to pay out in cases where no insurance policy covers the risk.
Codes of conduct
The resolution proposes, as an annex, two draft codes of conduct – a Code of Ethical Conduct for Robotics Engineers and a Code for Research Ethics Committees. The first code puts forward four ethical principles in robotics engineering: 1) beneficence (robots should act in the best interests of humans); 2) non-maleficence (robots should not harm humans); 3) autonomy (human interaction with robots should be voluntary); 4) justice (the benefits of robotics should be distributed fairly).
This note has been prepared by EPRS for the European Parliament’s Open Days in May 2017.