The EU Code of Practice on Disinformation was an important experiment that has now come to an end. But what should follow? Without a renewed focus on stakeholder engagement, efforts could stall, putting everyone at risk of disinformation attacks.
The goal of the Partnership for Countering Influence Operations (PCIO) is to foster evidence-based policymaking to counter threats in the information environment. Key roadblocks identified in our work include the lack of: transparency reporting to inform what data are available for research purposes; rules guiding how data can be shared with researchers and for what purposes; and an international mechanism for fostering research collaboration at scale.
The EU Code of Practice on Disinformation (COP) produced mixed results. Self-regulation was a logical and necessary first step, but one year on, few stakeholders seem fully satisfied with the process or its outcome. Strong trust has not been built among industry, governments, academia, and civil society. Most importantly, more must be done to protect the public from the potential harms caused by disinformation. As with most new EU instruments, the first year of COP implementation has been difficult, and all indications are that the next year will be every bit as challenging.
This working paper offers a nonpartisan briefing on key issues for developing EU policy on disinformation. It is aimed at the incoming European Commission (EC), representatives of member states, stakeholders in the COP, and the broader community that works on identifying and countering disinformation. PCIO is an initiative of the Carnegie Endowment for International Peace and does not speak on behalf of industry or any government.
The relationships among some COP stakeholders are fraught with tension, and there are indications that regulation derived from the Digital Services Act (DSA) and the European Democracy Action Plan could complement or replace the COP.3 At the outset of the COP, the EC had limited evidence, legal basis, political will, and settled terminology on which to base regulation. In preparation for the next steps, whatever they may be, a more inclusive process is necessary to ensure that regulation hits the mark. This working paper suggests concrete steps and considerations for the road ahead.
The data used in this assessment are drawn from published self-reports of compliance with the COP through October 2019; the analysis is therefore limited by the inconsistent data currently available. It is based on the first annual report, fifteen progress reports, five roadmaps, three EC evaluation reports, and the initial EC communication. Background interviews were conducted with key stakeholders and observers, and PCIO partners were given the opportunity to comment on a draft of this paper. However, the final publication is solely the responsibility of the named author.
As evaluations of the COP are finalized in early 2020, the EC has several options to take the process forward. One is to continue with a COP 2.0, with additional mechanisms and reporting requirements based on the lessons learned from the first year. A second is to continue with the self-assessment approach of the COP but to back it up with some form of regulatory intervention derived from the DSA. Such regulations would be designed to improve oversight, set minimum standards that apply to actors beyond the signatories, and add penalties for noncompliance; they would also be tied to the forthcoming European Democracy Action Plan. A third option is to take a harder regulatory line, likewise developed from the DSA. All three options raise the question of whether EU efforts should focus on the amount of verifiably false content removed or on assessing the processes and procedures stakeholders use. Similarly, they highlight the need to address how signatory performance can be reported in ways that usefully augment further private and public sector efforts to counter influence operations.
In October 2018, leading tech platforms voluntarily signed the COP, submitting themselves to transparent self-regulation as laid out by the EC. In January 2019, each signatory submitted a baseline report to serve as a roadmap for future efforts to combat disinformation.10 In addition to the original requirement of a baseline report and an annual report, the commission asked in December 2018 for some signatories to submit progress reports monthly until the conclusion of the European Parliament elections in May 2019.11 Signatory companies complied with that new request. Following the May 2019 elections, the code required signatories to publish an annual report in October 2019 on the progress of their efforts. All reports indicate progress against the following five categories of commitments outlined in the COP:
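- scrutiny of ad placements;
- transparency of political and issue-based advertising;
- integrity of services;
- empowering consumers; and
- empowering the research community.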
Many of the stakeholders involved in mitigating or countering influence operations use their own terminologies to encapsulate their view of the problem. Frequently used terms include information operations,13 computational propaganda,14 information manipulation,15 information warfare,16 information disorder,17 hybrid warfare,18 strategic deception,19 and manipulative interference.20 PCIO uses the term influence operations (IO) as an umbrella term for adversary-led interference in a society; it involves techniques such as the spread of disinformation, targeted information operations, and coordinated inauthentic behavior.21
Terminology in this area remains challenging. Early efforts in the COP process aimed to tackle fake news.22 Then in 2018, the COP switched terminology to refer to online disinformation and later just disinformation.23 Companies also use varying terminology. For example, in their quarterly returns for the COP, Facebook referred to coordinated inauthentic behavior,24 Google used terms including influence operations and misrepresentation,25 and Twitter referred to malicious automation, inauthentic activities, and information operations.26 Member states also use their own preferred terminologies, some of which provide the basis for national regulation and legislation.
Inconsistent terminology indicates a lack of consensus among key stakeholders regarding the scope of the issue and therefore its potential solutions. Clarity over objectives and terminology is required. Furthermore, the process of achieving consensus could itself inform private- and public-sector policymaking by forcing stakeholders to agree on what, exactly, must be addressed.
Tech platforms have responded to data requests by collecting data on the products, policies, teams, and processes they use to mitigate IO, and by reporting these as metrics. There are questions about the quality of these data and how verifiable self-reported data really are. The EC and stakeholders should aim to reach clarity on what data are needed to support methodologically sound research on impact that can be used to develop policy. This is not always a question of raw data but of companies having proportionate goals, processes, products, and tools in place; these, too, can be evaluated. What is lacking is an appropriate forum to support a transparent, consultative, and iterative process.
The EC should support a dialogue between research and industry to maximize opportunities for productive research aimed at better understanding the impact of IO and of countermeasures. Further, the commission should proactively examine the nature of the current regulatory environment and how it could be adjusted to enable such research, particularly in light of the challenges existing regulation has presented to Social Science One. To this end, PCIO will commission state-of-the-art reports on industry-research standards and collaboration frameworks.
The idea that the tech platforms are somehow the problem, and that this problem can be solved by regulation, has hampered trust-building between stakeholders. The real problem is the threat actors who set out to abuse the platforms; the platforms themselves are also victims of that abuse. A regulatory response from the EU is likely to push threat actors from the mainstream platforms to other platforms outside the current COP community; and while regulation can reach those platforms, they are less likely to join a genuinely collaborative effort. The tone and direction of future collaborations should therefore reflect that current signatories remain the most viable partners for developing sound EU policy. Threat actors exploiting technology are a significant problem that cannot be tackled by regulation alone; countering them will require significant efforts from the international community (including the EU) to advance strategies, policies, and law enforcement approaches. All stakeholders share a common interest in protecting end users and must learn to express this in common terms and common goals.
Internationally, the debate on regulating tech platforms is intensifying. Much of that debate risks harming freedom of speech, and its framing could be rearticulated to suit the needs of authoritarian states. The EU should aim to become the global leader in setting reasonable, collaborative, workable, and measurable solutions to disinformation. Tech platforms and other stakeholders should make EU collaboration their priority, with a view to setting convincing global standards.
During the first year of the COP, it has become clear that different actors have different stakes in the game and are better prepared to report on some requests than others based on those interests. Furthermore, the ability to produce relevant data quickly and efficiently varies from actor to actor. Each actor, and especially each company, needs to be able to address its own problems in a way that reflects how its platforms and services are structured, which makes blanket regulation unlikely to succeed. Every step forward should be applauded, but ad hoc, uncoordinated measures will achieve limited results. The next phase of the COP should set a clear strategic direction that guides actors in their long-term planning so that structures capable of meeting EC requirements can develop over time. Yet a rapidly evolving field also demands agility: any long-term vision should be backed up by recurring check-ins and opportunities for iterative adaptation where necessary.