The European Parliament regularly receives enquiries from citizens about how the EU regulates social media and protects its users.
The European Union (EU) has adopted a series of rules to protect the rights of social media users. These rules are intended to provide a safer online environment and to set clear standards for how tech companies operate, while promoting digitalisation.
Protection of personal data and privacy
Your right to the protection of your personal data is enshrined in the EU Charter of Fundamental Rights. In 2016, the EU adopted the General Data Protection Regulation – often referred to as the GDPR. The regulation applies to all companies that process the personal data of users in the EU. Under the GDPR, social media companies must obtain explicit consent from their users before accessing and processing their data. It establishes a series of rights for citizens, including the rights to:
- access, correct and erase personal data held by companies (the ‘right to be forgotten’)
- receive the personal data a company has collected about you and have it transmitted to another company (‘data portability’)
- be notified of a personal data breach.
Individual data protection authorities in the 27 EU countries enforce the GDPR. They have the power to investigate complaints and impose fines for breaches. They are independent of government and cooperate through the European Data Protection Board to ensure the rules are applied consistently across the EU. In 2023, the Irish Data Protection Commission imposed the largest fine so far – €1.2 billion – on Facebook’s parent company in Ireland, Meta Platforms. Meta’s transfer of EU Facebook users’ personal data to the United States, and its storage there on the basis of standard contractual clauses, was found to breach the GDPR. Meta has announced plans to appeal.
Under the EU’s 2002 e-privacy rules, social media platforms and messaging services such as WhatsApp are prohibited from enabling the surveillance of their users, unless the user has consented or the surveillance is carried out by a legally authorised person, such as the police. In 2017, the European Commission proposed new rules to strengthen the security and confidentiality of communications and to set clearer rules on tracking technologies such as cookies. Parliament has adopted its position, but the procedure is awaiting agreement among the governments of the EU countries.
The new EU digital rulebook
One of the priorities of the von der Leyen Commission was to make Europe ‘fit for the digital age’. In 2022, Parliament and EU governments brought in two major new laws to create a fairer and safer online world: the Digital Markets Act and the Digital Services Act. Broadly, the idea is that ‘what is illegal offline should be illegal online’.
The Digital Markets Act – limiting the power of big digital companies
The Digital Markets Act creates a level playing field for all digital companies, enabling smaller firms and start-ups to compete more easily with the industry giants.
The act sets clear rules for large platforms (‘gatekeepers’) to stop them imposing unfair conditions on businesses and consumers. The European Commission has so far designated six gatekeepers: Alphabet (Google, YouTube), Amazon, Apple, ByteDance (TikTok), Meta (Facebook, Instagram) and Microsoft.
These platforms will no longer be able to favour their own services and products over those offered by third parties on their platform. They will be required to give users the option to remove any pre-installed software or applications, making it easier for users to switch between platforms and apps.
The act will also improve interoperability between messaging platforms. Whether they use a small or a large platform, users will be able to send messages, share files and make video calls across different messaging applications.
Non-compliance risks significant fines: up to 10 % of the company’s total worldwide annual turnover, or up to 20 % in the event of repeated infringements. In March 2024, the Commission opened investigations into practices that may breach the act. These concern Alphabet (for giving preference to its own services in Google Search results), Apple (for restricting users’ ability to choose freely between services on iPhones) and Meta (for obliging users to consent to their data being used for targeted advertising unless they agree to pay a monthly fee – the ‘pay or OK’ model).
The Digital Services Act – ensuring a safe online environment
The Digital Services Act is a ground-breaking new law. Since 17 February 2024, it has applied to any digital platform, including social media, that acts as an intermediary connecting users with goods, services and content. It covers all digital organisations providing services in the EU, including those established outside the EU. It applies to both large and small operators, but very large online platforms and search engines are subject to additional rules. The European Commission has designated 19 such services, including social media and networking channels such as Facebook, Instagram, TikTok and X (previously Twitter), as well as the Google and Bing search engines.
The act holds these platforms legally liable for illegal content posted by their users if they are aware of it and fail to remove it. Illegal content includes child sexual abuse material, terrorist content, illegal hate speech, and illegal goods and services.
The new rules focus on:
- Countering illegal content and dangerous and counterfeit goods, by making it easier for users to report them and for authorities to take enforcement action against them;
- Tackling online harassment and cyberbullying, by making sure that non-consensual private images and other abusive content can be quickly flagged by users and removed;
- Protecting children, by requiring platforms to ensure a high level of privacy, safety and security for minors on their services;
- Banning targeted advertising based on the profiling of children or on sensitive data such as sexuality, religion or race;
- Banning ‘dark patterns’ and ‘nudging’ techniques that might manipulate users into making choices they do not intend to make.
National authorities and the European Commission can enforce the act through a set of investigative and sanctioning measures. Companies that do not comply face fines of up to 6 % of their global annual turnover and, as a last resort, an EU-wide ban. On 19 February 2024, the Commission opened formal proceedings to assess whether TikTok has breached the Digital Services Act in relation to the protection of minors, advertising transparency, data access for researchers, and the risk management of addictive design and harmful content.
Media Freedom Act
The European Media Freedom Act, adopted in April 2024, protects EU journalists and media from political or economic interference. It introduces a mechanism to prevent very large online platforms, such as X (formerly Twitter), Facebook or Instagram, from arbitrarily restricting or deleting independent media content.
Parliament calls for more protection for social media users
In December 2023, Parliament urged the Commission to propose new legislation against addictive design features such as automatic play and infinite scrolling, which affect children and young people in particular and can lead to patterns of behaviour and internet use that mirror addiction. It also asked the Commission to put forward a digital ‘right not to be disturbed’, allowing consumers to turn off attention-seeking features.
Following a petition about the impossibility of accessing basic banking services without a mobile phone, Parliament acknowledged that a divide exists between people who are able to use digital means of payment or to access public services digitally, and those who cannot or are reluctant to do so. Parliament stressed that companies providing everyday services should offer a non-digital solution. It called on the Commission to consider the risks of discrimination against older people and other vulnerable groups when assessing payment services, and to ensure that digitisation is ‘human-centric’.
Further information
- Digital Markets Act, overview, European Commission
- Digital Services Act, overview, European Commission
- Q & A on Digital Services Act, European Commission
- Implications of the Digital Transformation on Different Social Groups, study, European Parliament
Keep sending your questions to the Citizens’ Enquiries Unit (Ask EP)! We reply in the EU language that you use to write to us.