Members' Research Service / February 25, 2020

How should democracies respond to the disinformation dilemma?

Often sponsored by authoritarian state actors, disinformation undermines our democracies by eroding trust in institutions and media, increasing rifts and tensions in society and weakening our ability to take informed decisions.


Written by Naja Bentzen.


Democracies all over the world – including the United States and the European Union – are currently seeking adequate responses to the ongoing challenge of online disinformation (content deliberately designed to deceive people). On 17 February 2020, the first joint event held by EPRS and Stanford University discussed how democratic countries can respond effectively to disinformation without compromising core values such as freedom of expression.

Following an introduction by EPRS Director-General Anthony Teasdale, Director of Stanford Internet Observatory and former Facebook Chief Security Officer Alex Stamos presented the Observatory’s research on these topics. Stamos explained that the 2016 US elections saw five ‘lanes’ of interference: two offline – overt propaganda by Russian state broadcaster RT and similar media outlets, and people-to-people interactions – and three online lanes:

1) Memetic warfare: this can take the form of an outright false narrative; amplification of a divisive interpretation of a true fact (‘the Democratic Party rigged the primary against Bernie’); amplification of division or extreme political views; or undermining the very idea of truth – the last two being the key goals of memetic information operations.

2) Hack and leak: a key actor here is the Main Directorate of the General Staff of the Armed Forces of the Russian Federation (GRU), exemplified by the John Podesta emails. However, people in Brussels (including EU and NATO staff) are also targeted. Following a hack, such content can be leaked strategically (in the Podesta case, via the Russian-sponsored ‘DC Leaks’ platform) to the press.

3) Hacking election infrastructure: in general, this is more of a problem for the USA, where decentralised election administration poses severe security challenges.


Following Alex Stamos’ presentation, Etienne Bassot, Director of the Members’ Research Service, EPRS, moderated the panel discussion. Discussing the challenges and opportunities for democracies facing the ‘disinformation dilemma’ were Paul Nemitz (Principal Adviser in the Commission’s Directorate-General for Justice and Consumers), Wojtek Talko (Communications Adviser to Vice-President of the European Commission for Values and Transparency, Věra Jourová), Erika Widegren (Chief Executive of Re-Imagine Europa) and Naja Bentzen (Policy Analyst, Members’ Research Service, EPRS).

Stamos explained that the real problem is not the existence of disinformation, but the amplification of the message. Social media platforms use different amplification models (advertisements, private groups, recommendation engines, etc.). The more a platform amplifies a message, the more responsibility it carries. He highlighted the need to focus on curbing the amplification of mis- or disinformation (which is legal, but unwanted), rather than on the content itself: people can be allowed to access misinformation if they seek it out, while the amplification of the message is hindered. In addition, we need to enable the collaboration necessary to find organised ill-intentioned actors. Academia, governments, social media platforms, non-governmental organisations and others all need each other at various stages, and social media platforms should be incentivised to allow these relationships to exist.


Paul Nemitz emphasised that Russian disinformation is not the only problem; our media ecosystem is under tremendous pressure. Last year in America, thousands of journalists lost their jobs. The free press is ‘going down the drain’, because Google and Facebook are siphoning off the advertising revenue that used to be journalism’s main source of income. We have to boost our free press, a vital component of functioning democracies. There is, however, no ‘silver bullet’ in terms of social media regulation: we need to tackle both disinformation and the disappearance of traditional media. In the EU, the Commission, policy-makers and academics are united in their calls for a holistic approach to protecting democracy. In the 2019 EU elections, intersectoral, cross-silo cooperation produced good results. For Nemitz, all the steps Stamos advocates must be taken, but we also need an active policy to strengthen our institutions, public broadcasting and the free press. We also need the ‘technical intelligentsia’ – experts like Stamos – to engage with parliaments and political parties. Lawmakers may know little about the technical side, but the technical intelligentsia also needs to learn what democracy is.


Explaining that Vice-President Věra Jourová is working on a European democracy action plan, intended to provide precisely such a holistic approach, Wojtek Talko fully agreed with the focus on amplification of legal but harmful content; Jourová intends to focus on behaviour rather than content. The challenge is to connect and coordinate various policy instruments and cyber-security, election, technology, and foreign affairs expertise. The European election cooperation network is an example of a complex, cross-sector network that exemplifies the important role of coordination. However, as we are learning, the actors who peddle disinformation are learning too. We therefore have to constantly adjust our responses and develop our technological capacity. Our main goal is to increase social media platforms’ transparency, make the platforms share information, at least with researchers, and ensure society is fully informed of their activities and intentions. In the long term, Talko concluded, we need to vaccinate society against disinformation.

Erika Widegren, referring to Re-Imagine Europa’s Taskforce on Democracy in a Digital Society, noted that one of its key conclusions was that a deeper change is under way. Digital technology is the third communication revolution, after the inventions of writing and printing, and, as she noted, quoting Churchill, ‘we shape the buildings we live in, and afterwards our buildings shape us’. One example was the arrival of television in the USA, which revolutionised advertising. Before TV, the Teachers’ Union was almost as powerful in society as the bankers; all this changed, however, when money was allowed ‘to talk’. Similarly, a more aggressive media environment is taking hold in the digital age. The incentive online is to go viral, and the easiest way to do so is to pick a fight. Political extremes profit from this tendency, and Greta Thunberg is a perfect example: some 85 % of the debate about her concerns whether people love or hate her, eclipsing her actual message. Democracy is based on compromise and finding solutions, which obviously clashes with this polarisation. For Widegren, ‘we will not be able to recreate what we once had, we need to clean up our “information river” to make it clean, transparent and safe again’.


Naja Bentzen noted that, according to recent research published by the Oxford Internet Institute, the number of countries where a political party or government agency uses social media manipulation campaigns at home has increased by 150 % over the past two years. In 2017, there was evidence of social media manipulation targeting domestic audiences in 28 countries; in 2018, the figure was 48 countries; and by 2019, it had jumped to 70. In these countries, such campaigns were used to suppress fundamental human rights, discredit political opposition and drown out political dissent. At the same time, in 2019, there was evidence of agents in seven countries using these techniques to influence audiences abroad: Russia, China, Iran, Pakistan, Saudi Arabia, Venezuela and India. Meanwhile, research from the StratCom Centre of Excellence in Riga has shown that it is still far easier to buy inauthentic amplification than to combat the phenomenon. Authoritarian state actors can generally react swiftly: they can firewall their internet from the rest of the world, criminalise the creation and spreading of rumours that ‘undermine economic and social order’, as China has done, or ban disrespect of the authorities and the spreading of what the government deems ‘fake news’, as has happened in Russia. Such action is far more difficult for democracies, which strive to uphold human rights and free speech.

A key lesson from the debate was that – precisely because there is no silver bullet – we need coordinated, authentic democratic behaviour and responses to combat the ‘coordinated inauthentic behaviour’ sponsored by autocracies. Advanced democracies carry a significant responsibility for pushing for a rules-based world order in the digital sphere. The EU has put unprecedented pressure on online platforms to counter online disinformation. The ‘Brussels effect’ of the Commission’s response to the online platforms’ behaviour ahead of the European elections cannot be overstated: Silicon Valley’s recent charm offensive in Brussels, including Mark Zuckerberg’s visit on the day of the event, speaks volumes about the significance of the EU’s role, and the global responsibility that comes with it.
