Written by Mihalis Kritikos.
The current EU liability framework is highly complex and often ill-adapted to modern entities. It is therefore difficult for the subjects involved to understand exactly when a given obligation applies to them, and what kind of behaviour is required. This was one of the main conclusions of the study on ‘Liability of online platforms’, carried out by Professor Andrea Bertolini of the Scuola Superiore Sant’Anna (Pisa) at the request of the STOA Panel, following a proposal from Christian Ehler (EPP, Germany), First Vice-Chair of the Panel for the Future of Science and Technology (STOA).
Online platforms (OPs) have gained significant economic and societal importance in the last decade and the public debate on their responsibilities and liability has reached an unprecedented level. They have penetrated all product and service markets and have changed the way in which goods are sold and purchased, and in which information is exchanged and obtained, allowing a shift from the off-line world to the online environment, where they provide numerous digital services including the mass diffusion of any type of content, both legal and illegal.
As a result of their growing importance, users and policy-makers are raising questions about the platforms’ responsibility in the digital domain. The contractual responsibility of online platform operators has been intensively debated in the recent past, notably in connection with the effective detection and removal of illegal material. While the operators of transaction platforms usually seek the role of mere intermediary, without considerable liability for the proper performance of the main contracts, there is increasing support for extending their responsibility. Moreover, the lack of international legal mechanisms to enforce the removal of abusive material complicates the tracing and prosecution of abusive behaviour online. Self-regulation efforts appear suboptimal, owing to externalities and asymmetric-information problems, warranting some form of liability rules.
What are the main legal/regulatory challenges associated with the operation of OPs and the efforts to detect and remove illegal/harmful material, content and/or products? Can we map the whole range of liabilities associated with the operation of online platforms and provide the conceptual clarifications necessary to address them systematically? Is the existing EU legal framework adequate to ensure protection for users and their fundamental rights and freedoms? Does the current liability regime reflect the position of users and platforms alone? Does it adequately address the interests of third parties that are potentially violated by user-generated content?
Against this background, the study identifies and assesses the relevant legal framework at EU level – comprised of both hard- and soft-law initiatives – and discusses the policy issues and concerns arising from its application that deserve consideration and, in some cases at least, regulatory intervention. The review of the main legal/regulatory challenges associated with the operation of OPs involves an analysis of the incentives for OPs, their users and third parties to detect and remove illegal/harmful material, content and/or products.
One of the most important aspects of the study is the detailed discussion of the notion of OPs and the comprehensive classification it provides on the basis of multiple criteria. In fact, it maps and critically assesses the whole range of OP liabilities, taking hard and soft law, self-regulation, as well as national legislation, into consideration.
In doing so, the study sets out a much-needed conceptual framework by analysing the difference between responsibility and liability, and the different types of liability, distinguishing between civil, criminal and administrative liability on the one hand, and between strict, semi-strict and fault-based liability on the other. It also makes an important distinction between liabilities connected with the activities performed or the content uploaded by OP users, and alternative sources of liability, such as OPs’ contractual liability towards users, both businesses and consumers, as well as liability deriving from infringements of privacy and data protection law. The proposed classifications demonstrate the plurality of platforms, which differ in the activities and functions they serve, the multiplicity of actors they involve and the various ways in which they interact with them, their different sources of revenue and associated business models, the way in which they use and exploit data, and the level of control they exercise over users’ activities.
Building on the analysis of OPs’ rights, duties and liabilities under the existing EU regulatory framework, the study suggests a set of policy options that could be used to shape this framework regarding the liability of OPs, especially liability relating to the illegal/harmful content or products distributed and/or made available through their infrastructures, such as content that infringes intellectual property rights (IPR), hate speech, terrorist content, content that harms children, and counterfeit and unsafe products.
One of the most innovative aspects of the study is that the policy options are assessed against a variety of criteria, including costs and benefits, feasibility and effectiveness, sustainability, coherence with EU objectives, ethical, social and regulatory impacts, and effects on EU citizens’ fundamental rights and freedoms, and are presented along a scale of increasing interventionism.
This new STOA study provides a timely, in-depth overview of the debate on the liabilities of OPs that will inform the discussions on the recent European Commission proposal for a Digital Services Act, which aims to establish a new, comprehensive transparency and accountability regime for OPs. While maintaining the liability exemptions for tech companies by not subjecting them to a general obligation to monitor user content, the legislative proposal envisages that, in certain circumstances, platforms could be held liable for third-party content, for instance when they fail to act after being alerted to illegal content. The study is expected to provide EU legislators with a wide range of pragmatic and well-balanced policy options during the discussion of this proposal.
Your opinion counts for us. To let us know what you think, get in touch via firstname.lastname@example.org.