Members' Research Service / February 19, 2025

Fact-checking and content moderation


Written by Polona Car.

Fact-checking of content on online platforms has so far played an important role in protecting democracy, by verifying statements and making sure trustworthy sources are used. Many social media platforms use fact-checkers to help them enforce content moderation policies, with the aim of protecting their users from harm. The EU’s Digital Services Act – a binding legal instrument – strengthens content moderation obligations for online platforms, while the voluntary EU Code of Practice on Disinformation encourages signatories to use fact-checking services consistently. However, some major platforms have recently challenged this approach.

Background

The spread of false information can have devastating effects on our societies, undermine democratic values, polarise public opinion, and endanger health, security and the environment. In a July 2024 Eurobarometer survey, 45 % of respondents identified fake news and disinformation as having the biggest personal impact on their lives. In a September 2023 UNESCO survey, as many as 85 % of respondents worldwide expressed concern about the impact of disinformation in their country. To counter this phenomenon, the European Digital Media Observatory (EDMO) was set up in 2020. It aims to help European fact-checking organisations and researchers cooperate and share information, including through tools and resources.

Who are the fact-checkers?

Fact-checking is ‘the process of checking that all the facts in a piece of writing, a news article, a speech, etc. are correct’. It enables readers to make informed decisions based on accurate facts; in this way, fact-checkers expose false information and promote accountability. Fact-checkers are researchers, civil society associations or journalists, who often work together in fact-checking organisations. Fact-checking can be done in two ways: in-house, for instance when journalists and editors check their sources thoroughly before publishing a story; or by third parties, whether organisations providing services to social platforms or independent fact-checkers acting as society’s watchdogs. Additionally, fact-checking can be automated in order to debunk large amounts of false information. Algorithms are much faster at keeping pace with the rapid spread of false information; however, they face challenges relating to the quality of training datasets and to bias. Fact-checking generally appears to have a positive effect on countering misperceptions, although less so in polarising situations such as elections, where, according to research, beliefs prevail over facts. Alleged subjectivity and bias on the part of fact-checkers are a frequent criticism, with opponents claiming that fact-checking generates censorship and undermines freedom of expression.
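To illustrate the automated approach mentioned above, here is a minimal sketch in Python of one common building block, claim matching: incoming posts are compared against claims that human fact-checkers have already verified. All claims, verdicts and the similarity threshold are hypothetical examples, and real systems are far more sophisticated than this.

```python
# Minimal sketch of automated claim matching: new posts are compared against
# a small database of claims that human fact-checkers have already verified.
# All claims, verdicts and thresholds here are illustrative assumptions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical database of previously fact-checked claims and their verdicts.
checked_claims = [
    ("Drinking bleach cures viral infections", "false"),
    ("The EU Digital Services Act applies to very large online platforms", "true"),
]

incoming_posts = [
    "Apparently bleach can cure a viral infection?",
    "Cats are better than dogs",
]

vectorizer = TfidfVectorizer().fit([claim for claim, _ in checked_claims])
claim_vectors = vectorizer.transform([claim for claim, _ in checked_claims])

for post in incoming_posts:
    similarities = cosine_similarity(vectorizer.transform([post]), claim_vectors)[0]
    best = similarities.argmax()
    if similarities[best] >= 0.5:  # illustrative threshold
        print(f"{post!r} matches a checked claim (verdict: {checked_claims[best][1]})")
    else:
        print(f"{post!r}: no match - route to human fact-checkers")
```

The quality of such a system depends entirely on its reference data, which is one reason why, as noted above, training datasets and bias are the main weak points of automated fact-checking.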

What is content moderation?

While fact-checking is about verifying the accuracy of specific content, content moderation is about ensuring that content is in line with a platform’s guidelines and does not harm its users. Content moderation can go beyond factual accuracy to deal also with topics such as harassment, copyright infringement and hate speech. It can result in offensive content being removed or fake accounts being closed. Meta, for example, uses a ‘remove, reduce, inform’ strategy, and defines in its Community Standards what is and what is not allowed. Likewise, Google takes a range of actions to enforce its policies.
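The ‘remove, reduce, inform’ logic can be pictured as a simple decision cascade. The sketch below is an illustrative simplification of that three-tier idea, not Meta’s actual implementation; all field names and rules are assumptions.

```python
# Illustrative sketch of a three-tier 'remove, reduce, inform' moderation
# cascade. This is a simplification for exposition, not any platform's
# actual system; the fields and ordering are assumptions.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    violates_community_standards: bool  # e.g. hate speech, harassment
    flagged_by_fact_checkers: bool      # rated false or misleading
    borderline: bool                    # sensationalist but not violating

def moderate(post: Post) -> str:
    if post.violates_community_standards:
        return "remove"   # take the content down entirely
    if post.borderline:
        return "reduce"   # demote in feeds and recommendations
    if post.flagged_by_fact_checkers:
        return "inform"   # keep up, but attach a warning label
    return "allow"

# A post rated false by fact-checkers is labelled rather than removed.
print(moderate(Post("example", False, True, False)))  # -> inform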

In the EU, the Digital Services Act (DSA) has introduced stricter rules for content moderation. Online platforms must implement measures that prevent the spread of illegal content, goods and services, and protect their users from harm. However, platforms are not required to proactively monitor the content they display. Users can challenge content-moderation decisions either through out-of-court mediation or in court. Very large online platforms and search engines (VLOPs and VLOSEs) are subject to stricter rules under the DSA: they have to address systemic risks such as the dissemination of illegal content, disinformation, and harm to fundamental rights, elections and public health. Under Articles 34 and 35 of the DSA, VLOPs and VLOSEs are required to conduct annual risk assessments, adjust their services and algorithms to minimise harm, and submit to independent audits. Additionally, they must be prepared to address public security during crises, and give relevant authorities and vetted researchers access to their data and algorithms. The European Commission has already opened several proceedings against major platforms for not fulfilling these obligations. If a breach is established, it can result in fines of up to 6 % of the platform’s worldwide annual turnover.
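To put the 6 % ceiling into perspective, here is a quick worked example; the turnover figure is made up, not any platform’s actual accounts.

```python
# Illustrative ceiling on DSA fines: up to 6 % of worldwide annual turnover.
# The turnover figure below is a hypothetical example.
worldwide_annual_turnover_eur = 100_000_000_000  # assumed EUR 100 billion
max_fine = 0.06 * worldwide_annual_turnover_eur
print(f"Maximum fine: EUR {max_fine:,.0f}")  # Maximum fine: EUR 6,000,000,000
```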

EU Code of Practice on Disinformation: Voluntary commitment to fact-checking

Fact-checking is one of the commitments under the strengthened EU Code of Practice on Disinformation (the Code), which aims to create a safer, more transparent and trustworthy online environment. So far, the Code has 48 signatories: online platforms, including major ones such as Google, Meta, Microsoft and TikTok (but not Telegram, for example), as well as fact-checking, advertising, research and civil society organisations. This voluntary instrument is highly relevant for VLOPs and VLOSEs, as it is set to be converted into a code of conduct – a co-regulatory instrument under the DSA that will serve as a framework for evaluating compliance with the DSA. During its November 2024 meeting, the European Board for Digital Services working group on integrity of the information space reported that signatories had submitted a request to convert the Code under Article 45 DSA. This formally triggered the obligation for the Commission and the Board to assess the Code against the requirements set out in that article. A positive assessment will enable the integration of the Code into the DSA framework, taking effect from 1 July 2025.

Moreover, a Transparency Centre ensures the visibility of the signatories’ commitments to implement the Code and fight disinformation, through regular reports to the Commission. Monitoring is part of the Code and is aimed at measuring the implementation of the listed commitments. The Code allows signatories to explain why they have not signed up to a specific commitment or measure, if they establish that it is not relevant or pertinent to their services.

Changing commitments

Several VLOPs and VLOSEs have updated their subscriptions to the Code, particularly regarding fact-checking commitments. X pulled out of the Code completely in May 2023. Meta announced in January 2025 that it would stop third-party fact-checking, but only for its United States (US) users. The same month, Google announced that it would not comply with these commitments, as it deems its current content moderation sufficient and fact-checking not effective for its services. As a result, Google will not incorporate fact-checked results into its search results or YouTube videos, or build fact-checking into its ranking systems or algorithms; instead, it allows YouTube users to add contextual notes to videos. Also in January 2025, LinkedIn stated that it was not subscribing to the commitment to empower the fact-checking community, as it considered that commitment not proportionate to its risk profile. Conversely, TikTok said it was prepared to commit to such measures – if other signatories providing similar services did too. The European Fact-Checking Standards Network expressed concerns over ‘some platforms’ retraction’ from their earlier commitments.

Towards crowdsourced fact-checking?

In 2021, Twitter (now X) introduced community notes, a feature that lets selected users add context to tweets and thereby fact-check posts on the platform. X’s approach of replacing independent third-party fact-checkers with crowdsourced fact-checking seems to serve as a model for other major platforms. As noted above, Google is enabling some users to add contextual notes to YouTube videos in order to counter misinformation.
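The key idea behind such crowdsourced systems is ‘bridging’: a note is only published if raters who normally disagree both find it helpful. X’s actual system relies on matrix factorisation over rating data; the sketch below boils the idea down to a per-cluster minimum, with hypothetical raters, clusters and a hypothetical publication threshold.

```python
# Simplified sketch of the 'bridging' idea behind crowdsourced community
# notes: a note is published only if raters from otherwise-disagreeing
# groups all tend to find it helpful. The clusters, ratings and threshold
# are illustrative assumptions, not X's real (matrix-factorisation) system.
ratings = {
    # note_id -> {rater_id: 1 (helpful) or 0 (not helpful)}
    "note_a": {"u1": 1, "u2": 1, "u3": 1, "u4": 1},
    "note_b": {"u1": 1, "u2": 1, "u3": 0, "u4": 0},
}
# Hypothetical viewpoint clusters inferred from each rater's past behaviour.
cluster = {"u1": "left", "u2": "left", "u3": "right", "u4": "right"}

def bridged_helpfulness(note_ratings: dict) -> float:
    """Return the lowest per-cluster helpfulness, so a note only scores
    highly when every cluster tends to rate it helpful."""
    by_cluster: dict[str, list[int]] = {}
    for rater, score in note_ratings.items():
        by_cluster.setdefault(cluster[rater], []).append(score)
    return min(sum(scores) / len(scores) for scores in by_cluster.values())

for note, note_ratings in ratings.items():
    status = "publish" if bridged_helpfulness(note_ratings) >= 0.7 else "needs more ratings"
    print(note, status)  # note_a publish, note_b needs more ratings
```

Note ‘note_b’ is rated helpful only by one cluster, so it is held back: this is what distinguishes bridging from a simple majority vote, which would publish anything one large group likes.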

Meta outlined that Facebook and Instagram remained committed (‘subscribed’) to the fact-checking community under the Code, albeit subject to evolutions in their practices. These will be reviewed continuously to assess whether changes, such as the deployment of community notes, would be appropriate. Such notes were announced in January 2025 for Meta’s US users, to replace its third-party fact-checking programme (see above). Meta claims it introduced the content moderation changes in the US to ‘restore free speech’ and lessen alleged censorship, implementing them by updating its Hateful Conduct policy. This now allows its US users, for example, to make ‘allegations of mental illness or abnormality when based on gender or sexual orientation’. As announced by Meta CEO Mark Zuckerberg, Meta has removed restrictions on topics such as racism, immigration and gender identity. Meta also intends to reintroduce civic content (posts about elections, politics or social issues), with a personalised approach based on likes and views that recommends more political content. Meta’s announcements have prompted reactions from, inter alia, Reporters Without Borders and the European Federation of Journalists.


Read this ‘at a glance’ note on ‘Fact-checking and content moderation’ in the Think Tank pages of the European Parliament.

