A Europe that protects: EU reports on progress in fighting disinformation ahead of European Council

European Commission - Press release

Today the Commission and the High Representative report on the progress achieved in the fight against disinformation and the main lessons drawn from the European elections, as a contribution to the discussions by EU leaders next week.

Protecting our democratic processes and institutions from disinformation is a major challenge for societies across the globe. To tackle this, the EU has demonstrated leadership and put in place a robust framework for coordinated action, with full respect for European values and fundamental rights. Today’s joint Communication sets out how the Action Plan against Disinformation and the Elections Package have helped to fight disinformation and preserve the integrity of the European Parliament elections. 

High Representative/Vice-President Federica Mogherini, Vice-President for the Digital Single Market Andrus Ansip, Commissioner for Justice, Consumers and Gender Equality Věra Jourová, Commissioner for the Security Union Julian King, and Commissioner for the Digital Economy and Society Mariya Gabriel said in a joint statement:

“The record high turnout in the European Parliament elections has underlined the increased interest of citizens in European democracy. Our actions, including the setting-up of election networks at national and European level, helped in protecting our democracy from attempts at manipulation.

We are confident that our efforts have contributed to limiting the impact of disinformation operations, including from foreign actors, through closer coordination between the EU and Member States. However, much remains to be done. The European elections were not, after all, free from disinformation, and we should not accept this as the new normal. Malign actors constantly change their strategies. We must strive to stay ahead of them. Fighting disinformation is a common, long-term challenge for EU institutions and Member States.

Ahead of the elections, we saw evidence of coordinated inauthentic behaviour aimed at spreading divisive material on online platforms, including through the use of bots and fake accounts. Online platforms therefore have a particular responsibility to tackle disinformation. With our active support, Facebook, Google and Twitter have made some progress under the Code of Practice on disinformation. The latest monthly reports, which we are publishing today, confirm this trend. We now expect online platforms to maintain momentum, step up their efforts and implement all commitments under the Code.”

While it is still too early to draw final conclusions about the level and impact of disinformation in the recent European Parliament elections, it is clear that the actions taken by the EU – together with numerous journalists, fact-checkers, platforms, national authorities, researchers and civil society – have helped to deter attacks and expose attempts at interfering in our democratic processes. Increased public awareness made it harder for malicious actors to manipulate the public debate.

In particular, EU action focused on four complementary strands: 

  1. The EU has strengthened its capabilities to identify and counter disinformation, via the Strategic Communication Task Forces and the EU Hybrid Fusion Cell in the European External Action Service. It has also improved the coordinated response by setting up a Rapid Alert System to facilitate the exchange of information between Member States and the EU institutions.
  2. The EU worked with online platforms and industry through a voluntary Code of Practice on disinformation to increase the transparency of political communications and prevent the manipulative use of their services, so that users know why they see specific political content and ads, where they come from and who is behind them.
  3. The Commission and the High Representative, in cooperation with the European Parliament, helped increase awareness of and resilience to disinformation within society, notably through wider dissemination of fact-based messaging and renewed efforts to promote media literacy.
  4. The Commission has supported Member States’ efforts to secure the integrity of elections and strengthen the resilience of the Union’s democratic systems. The establishment of election networks at EU and national level, with links to the Rapid Alert System, improved cooperation on potential threats.

However, more remains to be done to protect the EU’s democratic processes and institutions. Disinformation is a rapidly changing threat. The tactics used by internal and external actors, in particular those linked to Russian sources, are evolving as quickly as the measures adopted by states and online platforms. Continuous research and adequate human resources are required to counter new trends and practices, to better detect and expose disinformation campaigns, and to raise preparedness at EU and national level.

Update by online platforms under the Code of Practice

Online platforms have a particular responsibility in tackling disinformation. Today the Commission also publishes the latest monthly reports by Google, Twitter and Facebook under the self-regulatory Code of Practice on Disinformation. The May reports confirm the trend of previous Commission assessments. Since January, all platforms have made progress with regard to the transparency of political advertising and public disclosure of such ads in libraries that provide useful tools for the analysis of ad spending by political actors across the EU. Facebook has taken steps to ensure the transparency of issue-based advertising, while Google and Twitter need to catch up in this regard.

Efforts to ensure the integrity of services have helped close down the scope for attempted manipulation targeting the EU elections, but platforms need to explain better how the removal of bots and fake accounts has limited the spread of disinformation in the EU. Google, Facebook and Twitter reported improvements to the scrutiny of ad placements to limit malicious click-baiting practices and reduce advertising revenues for those spreading disinformation. However, insufficient progress was made in developing tools to increase the transparency and trustworthiness of websites hosting ads.

Despite the achievements, more remains to be done: all online platforms need to provide more detailed information allowing the identification of malign actors and targeted Member States. They should also intensify their cooperation with fact-checkers and empower users to better detect disinformation. Finally, platforms should give the research community meaningful access to data, in line with personal data protection rules. In this regard, the recent initiative by Twitter to release relevant datasets for research purposes opens an avenue for independent research on disinformation operations by malicious actors. Furthermore, the Commission calls on the platforms to apply their political ad transparency policies to upcoming national elections.

Next Steps

As set out in its March conclusions, the European Council will return to the issue of protecting elections and fighting disinformation at its June Summit. Today’s report will feed into this debate among EU leaders, who will set the course for further policy action.

The Commission and the High Representative remain committed to continuing their efforts to protect the EU’s democracy from disinformation and manipulation. Before the end of this year, the Commission will report on the implementation of the elections package and assess the effectiveness of the Code of Practice. On this basis, the Commission may consider further action to strengthen and improve the EU’s response to the threat.

Background

The European Union has been actively tackling disinformation since 2015. Following a decision of the European Council in March 2015 to challenge Russia’s ongoing disinformation campaigns, the East StratCom Task Force was set up in the European External Action Service (EEAS). In 2016, the Joint Framework on countering hybrid threats was adopted, followed in 2018 by the Joint Communication on increasing resilience and bolstering capabilities to address hybrid threats.

In April 2018, the Commission outlined a European approach and self-regulatory tools to tackle disinformation online. In October 2018, the Code of Practice was signed by Facebook, Google, Twitter and Mozilla, as well as by trade associations representing online platforms, the advertising industry and advertisers. In addition, Facebook, Google and Twitter committed to reporting monthly on measures taken ahead of the European Parliament elections. The Commission, with the support of the European Regulators Group for Audiovisual Media Services (ERGA), closely monitored progress and published monthly assessments together with the submitted reports. On 22 May, Microsoft also joined the Code of Practice and subscribed to all its commitments.

The Code of Practice goes hand-in-hand with the Recommendation included in the election package announced by President Juncker in the 2018 State of the Union Address to ensure free, fair and secure European Parliament elections. The measures include greater transparency in online political advertisements and the possibility to impose sanctions for the illegal use of personal data to influence the outcome of the European elections. Member States were also advised to set up a national election cooperation network and to participate in a European election network.

Source: https://europa.eu/rapid/press-release_IP-19-2914_en.htm
