Convening Series on Countering Disinformation
Strategic Insights and Questions
Produced by: Reos Partners
Convened by: CIFF and Oak Foundation
Contents
Introduction
A Caveat on Naming and Framing
1. Levels
2. Lenses
3. Leverage Areas and Game-changing Strategies
4. Lifecycle
5. Roles and Responsibilities
6. Collaboration
Participating Organisations
Introduction
About the Convening Series on Countering Disinformation
The Convening Series on Countering Disinformation was conducted between December 2021 and February 2022, convened by CIFF and Oak Foundation and designed and facilitated by Reos Partners.
The series convened participants from funding agencies, implementing organizations, and think tanks/research organizations to explore collective strategies for addressing challenges and obstacles presented by disinformation. To do this, the series employed a systemic approach to disinformation, exploring both the impact of disinformation across various thematic areas and the underlying causes and drivers that enable disinformation to thrive and spread. Through a series of online workshops and meetings, participants explored the patterns, structures, and mental models that underpin disinformation dynamics, and identified potential leverage areas and game-changing strategies in relation to the broader disinformation ecosystem. With these elements in view, they then explored different opportunities for influencing this system in meaningful and effective ways by reflecting on the kinds of roles that participants might play, both individually and collectively.
The results represent an initial, rather than a definitive, sketch of some of the key elements of a complex and rapidly evolving system. But they offer a frame for thinking about what may be needed to more effectively counter disinformation, and point to new opportunities for collaborative action.
About this Resource
This document supports stakeholders affected by or working on disinformation challenges in using the initial insights created during the series as a resource for developing programmatic, organizational, and collective strategies for countering disinformation as a systemic challenge.
To that end, the elements used to structure exploration during the convening series are re-presented below along with a set of key prompt questions to:
In this sense, this resource is intended as a compass - a tool for plotting courses of action in what may be unfamiliar terrain - rather than a map of specific roads to follow.
A Caveat on Naming and Framing
Introduction
This series was entitled “Countering Disinformation”; however, this framing has been a topic of discussion throughout.
During our outreach to potential participants, some declined on this basis, explaining that from their perspective this framing had thus far not proven adequate to the complex nature of the problems being faced and was therefore no longer productive.
During the convening itself, expert presentations also problematized this framing, questioning the utility of the terms mis- and disinformation and observing that a widespread tendency to focus on “the atoms of information” is causing us to miss the larger narratives and complex flows of information of which they are a part.
Participant discussions also questioned the effectiveness of focusing on countering measures as cornerstone strategies, and surfaced diverse perspectives for framing both problems and solutions.
We think this is useful. By approaching disinformation as a systemic challenge, this series has endeavoured to create a space to reflect on this framing and widen the aperture.
From this perspective, we have noticed two issues:
We hope that this resource encourages stakeholders in this space to continue reflecting on this language and the mindsets that underpin it and to choose consciously what words to use.
1. Levels
Fighting Fires + Addressing Drivers
By considering disinformation from a systemic perspective, we look not only at the event level or at isolated incidents but also at the more complex interplay of factors that is continuously reproducing the phenomena we find problematic.
Taking a systems view helps us consider (at least) two levels of engagement with disinformation:
Both are needed:
Fighting fires alone leaves the roots of the problem to grow and evolve, while addressing drivers alone does little to protect individuals and societies exposed to harmful impacts with potentially devastating consequences.
Reflection
In the convening series, we worked with the “iceberg” model from systems thinking, which distinguishes four levels - events, patterns, structures, and mental models - and asserts that your ability to influence increases with your ability to understand and act on the deeper levels of the iceberg (mental models and structures).
Reflection Questions
2. Lenses
When people - from journalists, activists, and policy-makers, to citizens, tech developers, educators, and others - talk about disinformation problems and solutions, they often do so from particular perspectives.
It can be helpful to turn to a specific perspective, or lens, on complex systems like disinformation in order to learn more deeply about some dimension, or to frame complex problems in more actionable ways. But since the way we frame a challenge or problem influences the kinds of solutions we seek or see as relevant, it’s important to consider what different lenses bring into view.
The lenses that are commonly applied in the disinformation space (and which were explored in the Disinformation Convenings) include:

Lens 1: We have a technology problem!
Lens 2: We have an information problem!
Lens 3: We have an education and awareness problem!
Lens 4: We have an accountability problem!
Lens 5: We have a relationship problem!
These five lenses are described in more detail below.
No single dimension or perspective is adequate for fully capturing, explaining, understanding, or addressing disinformation.
Combining, sharing and acknowledging multiple perspectives helps bring the larger system into view and can illuminate blind-spots.
Lens 1: We have a technology problem!
Viewed through this lens, the disinformation problem is seen to stem primarily from how the technology is engineered - its algorithms, infrastructure, and platforms - the way it uses data, its capabilities (its “godlike” nature), and who controls it.
Solutions commonly prescribed from this perspective include: the use of artificial intelligence, new algorithms, public service technologies, redirect methods, and visibility filtering.
Lens 2: We have an information problem!

Viewed through this lens, the disinformation problem is a problem of information - “fake news”, rumours, viral slogans, conspiracy theories, and the people creating and spreading them - as well as freedom of expression and information overload.

Solutions commonly prescribed from this perspective include: fact-checking, counter-messaging, pre- and debunking, labelling, and de-platforming.
Lens 3: We have an education and awareness problem!
Viewed through this lens, the disinformation problem is a problem of lack of media literacy, awareness, resilience, digital hygiene practices, and skills to detect and respond among those who are affected by it and inadvertently participating in it.

Solutions commonly prescribed from this perspective include: media literacy, digital hygiene, awareness-raising campaigns, referral services, and support to local journalism.
Lens 4: We have an accountability problem!

Viewed through this lens, the disinformation problem is a problem of roles and responsibilities across the system: clarity of roles, duty of care, risk mitigation, safety protocols, and a lack of regulation and accountability to regulators, as well as to and among citizens.

Solutions commonly prescribed from this perspective include: regulation, litigation, internet governance, public interest infrastructure, industry reform, and consumer mobilisation.
Lens 5: We have a relationship problem!

Viewed through this lens, the disinformation problem is a problem of national and geo-politics and conflicts, behaviour of “bad actors”, social polarisation, the need for belonging, and a general fraying of trust, social cohesion, and the social contract.

Solutions commonly prescribed from this perspective include: depolarisation, countering hate speech, dialogue and engagement, hope-based messaging, narrative-building, and conflict transformation.
In the convenings, participants mapped the patterns, structures and mental models from each of these perspectives in order to bring the system more fully into view.
Reflection
Reflection Questions
What would another lens bring into view?
3. Leverage Areas
Drawing from systems thinking, leverage areas can be defined as cross-cutting and consequential issues, themes, or concerns where influence or change can help shift problematic dynamics in a complex system.
Leverage areas are places where you can strategically take action to address a given situation. They are considered low leverage if a small amount of effort will lead to a small change, and high leverage if a relatively small amount of force can lead to a large change. In dealing with complex social problems, high-leverage areas are those that address “root causes” by getting to the level of structures and mental models.
One way of influencing leverage areas is through game-changing strategies. These are interventions that change the rules of the game by interceding at the structural level (as opposed to addressing instances at the event level, i.e. “firefighting”).
In this Convening Series, participants identified five leverage areas (though there are certainly more to be discovered). A description of each of these areas and their accompanying game-changing strategies follows.
Leverage Area 1: Financial and Commercial Incentives

Core Question: How can we shift or influence the financial and commercial incentives of the disinformation/information economy?

This leverage area captures a strong theme that emerged around the business models animating social media companies, which incentivize technology, products, services, and practices that privilege profit over people and foster the spread of disinformation throughout information ecosystems. This leverage area pertains not only to the attention economy, but also encompasses the role of advertisements in legacy media, and the incentives offered to platform users to create opportunities to monetize information in this poorly regulated space.

Game-changing Strategies

Game-changing strategies identified in the Convening Series for this leverage area include:
Leverage Area 2: Effective Communication
Core Question: How can we produce fact-based information and depolarising narratives that are more resonant and gain better traction with communities?
This leverage area is concerned with the power of disinformation to outcompete fact-based information and so-called “pro-social” narratives in seizing attention, tapping into deeply held beliefs, capitalizing on fears, and resisting counter-messaging. It focuses on finding ways to ensure that messages and narratives that serve society, foster cohesion, support democracy, protect well-being, and respect human rights can gain as much traction with communities as disinformation does.
Game-changing Strategies

Game-changing strategies identified in the Convening Series for this leverage area include:
Leverage Area 3: The Role and Voice of Local Communities
Core Question: How can we ensure local communities are involved in the prioritisation of risk and harm and other key decisions related to disinformation, and empowered to address disinformation in their contexts?
The constrained role and voice of affected and marginalized communities in framing problems and developing solutions to disinformation arose in different ways across comments and discussions related to Convening Session 1. This leverage area encompasses challenges related to top-down approaches to disinformation when there is much to learn from affected communities; the dominance of Northern voices; the need for rights-based approaches; and the marginalization of activists and communities in dialogue and decision-making around approaches to disinformation that affect them.
Game-changing Strategies

Game-changing strategies identified in the Convening Series for this leverage area include:
Leverage Area 4: Research, Systemic Learning and the Application Gap
Core Question: How can we develop spaces, infrastructures, and capacities for systemic learning about disinformation while still enabling and connecting to needed short-term tactical actions?
As Dr. Claire Wardle observed in Convening Session 1, the field of disinformation studies is both very new and rapidly evolving, as are the technologies, tactics, and broader dynamics implicated in disinformation. This leverage area takes up the importance of improving sustained opportunities for deep, rigorous, and empirical research; elevating the level of discussion about disinformation across different stakeholder groups through such research; improving awareness, understanding, and skills among those involved in response; and addressing the existing “application gap” (i.e. the gap in processes for employing knowledge resources to inform and improve approaches and responses to emergent events and challenges).
Game-changing Strategies

Game-changing strategies identified in the Convening Series for this leverage area include:

Leverage Area 5: Frameworks and Approaches to Information Ecosystems
Core Question: How can we better conceptualise and convey information ecosystems so we can better assess and prioritise threats, and guide effective interventions?
The ability to design and implement effective responses to disinformation dynamics relies at least in part on the capacity of our frameworks and processes to conceptualize and communicate information ecosystems, as well as the broader systems at play in disinformation, in useful and usable ways for a range of actors. Likewise, there is an increasing need for frameworks that are relevant for different kinds of actors and organizations to detect how disinformation plays out in and across the information ecosystems they are involved in, and to assess its implications.

Promising and useful approaches to information ecosystems are being used by several organizations in the context of their work on (for example) elections, health communication, conflict, and humanitarian contexts, among others. But challenges still exist when it comes to moving from the detection of disinformation to the design of interventions, and in mobilizing the findings of such information ecosystem assessments for diverse programming purposes, contexts, and mandates. This leverage area pertains to how we can improve, and make wider use of, frameworks and approaches for detecting, assessing, and responding to disinformation and related challenges.
Game-changing Strategies

Game-changing strategies identified in the Convening Series for this leverage area include:
Reflection
Reflection Questions
4. Lifecycle
[Figure: phases of the disinformation lifecycle. From C. Wardle & H. Derakhshan, 2017, p. 6]
Countering disinformation effectively requires efforts to intervene or exert influence along the different stages of the disinformation lifecycle. In the same way that much attention is often directed toward the event level of disinformation, there is also an inclination to focus on the distribution phase of the disinformation lifecycle.
These phases of message creation and distribution also sit within, and interact with, larger sociocultural systems of human communication, both online and offline, whereby disinformation can contribute to the erosion of social cohesion. In turn, weakened social cohesion can create greater susceptibility to, and participation in, disinformation. Connecting (or extending) the disinformation lifecycle (e.g. creation - production - distribution - reproduction - etc.) to the social processes and worlds of which it is part is an important element of taking a systemic approach to addressing disinformation.
For example, as messages draw their content and meaning from social worlds, paying attention to the values, tensions, and relationships they may seek to exploit can be a valuable contribution to addressing the disinformation lifecycle.
Reflection
Reflection Questions
5. Roles and Responsibilities
Roles
Reflecting on the kinds of roles we play in a system can help generate important insights that can be useful for identifying strategic opportunities.
With the understanding that a single organisation may well fit more than one category, we used three general “actor types” to help us design and facilitate discussions, and include different perspectives in the Convening Series:
Participants were also asked what roles they would like to see each participating actor group play towards shifting the disinformation system.
Responsibilities
No matter what roles we may play, we are all inherently information actors - both as individuals and as organisations. This means we all have the potential to become both targets and unwitting vectors of disinformation, a reality that is borne out daily. The threats, risks, and harms presented by disinformation dynamics therefore activate a new set of responsibilities to take into account, especially as we consider roles to play and actions to take to address them.
Implementers / Practitioners
For the purposes of this convening, we’ve used the term “implementers” to describe organisations that provide support through services or through carrying out programming and interventions. The implementers participating in this series represent a wide range of “third sector” organisations working across an array of thematic concerns using different approaches and techniques to address issues, challenges, and threats related to disinformation dynamics. A smaller group of participating implementers focus on thematic areas including domestic violence, public health, peacebuilding, polarization, conflict, development, and climate change, but are grappling with the challenges of disinformation in the contexts in which they work.
Role propositions
Implementers / Practitioners
Reflection Questions
- Vis-à-vis the communities you serve?
- Vis-à-vis your staff and partners?
- Vis-à-vis others in your sector?
- Vis-à-vis other sectors?
Consider:
What is your potential for becoming a target of disinformation?
What is your potential for becoming an unwitting vector of disinformation?
What are the unintended harms that can emerge from such conditions, and for whom?
Given the roles you play, what protocols should you have in place to mitigate and/or respond to such harms?
Knowledge Generators / Thought Leaders
The term “thought leader” is commonly used to refer to influential individuals who help shape public discourse and understanding of a particular topic. For the purposes of this convening, we have also used this term to broadly include technical experts and organisations typically situated within universities, governments, or multilateral organisations, dedicated to conducting research as their primary purpose, and recognized as authorities on disinformation. As individuals or organisations, thought leaders (thus defined) influence and guide fields of study, methods, and approaches taken up across sectors and stakeholders, and shape both public and policy discourse.
Role propositions
Knowledge Generators / Thought Leaders
Reflection Questions
- Vis-à-vis the communities you serve?
- Vis-à-vis your staff and partners?
- Vis-à-vis others in your sector?
- Vis-à-vis other sectors?
Consider:
What is your potential for becoming a target of disinformation?
What is your potential for becoming an unwitting vector of disinformation?
What are the unintended harms that can emerge from such conditions, and for whom?
Given the roles you play, what protocols should you have in place to mitigate and/or respond to such harms?
Funders
The philanthropies participating in this series include organizations with significant expertise and experience supporting work on disinformation and the digital space from a number of vantage points (e.g. responding to incidents, mitigating harms, addressing drivers), as well as those who are newer to the space, and those who are working in other thematic areas but who are facing challenges related to disinformation in carrying out their work.
Role propositions
Funders
Reflection Questions
- Vis-à-vis the communities you serve?
- Vis-à-vis your staff and partners?
- Vis-à-vis others in your sector?
- Vis-à-vis other sectors?
Consider:
What is your potential for becoming a target of disinformation?
What is your potential for becoming an unwitting vector of disinformation?
What are the unintended harms that can emerge from such conditions, and for whom?
Given the roles you play, what protocols should you have in place to mitigate and/or respond to such harms?
6. Collaboration
Collaboration + Enabling Conditions
Systemic problems can only be addressed through collaboration across organisations, sectors, and disciplines. While autonomous action may be efficient when fighting fires, shifting leverage areas cannot be done by single actors on their own: it requires bringing together complementary approaches, skills, and resources to operationalize game-changing strategies.
Collaboration is not always the most accessible, agile, straightforward, or cost-effective option, and the intention should not be to collaborate for collaboration’s sake.
It is important to consciously decide when, on what, and with whom to collaborate and to support necessary collaborations through incentives, investment, and enabling vehicles, processes, and mechanisms.
In this series, participants developed an initial set of collaborative opportunities to shift key leverage areas of the disinformation ecosystem.
To help develop further ideas, we present questions to spur reflection on where collaboration is needed, and to identify the specific enabling conditions needed for success.
Collaborative Opportunities
The following propositions indicate areas where participants suggest that collaboration - if sufficiently scoped, resourced, organized, coordinated, and facilitated - can make a difference both in tackling emergent disinformation events and in shifting key leverage areas.
In addition, a number of ideas for collaborative initiatives were shared:
to: set shared strategy and goals at the systems level; identify and make visible gaps in knowledge, information, and practice; set agendas and standards for research and investigations needed in response; and assess collective progress.
Reflection
Reflection Questions
Identifying Need
Identifying Enabling Conditions
Incentivising and Supporting Collaboration
This Convening Series emerged out of the recognition that addressing the threats, challenges, and consequences involved in disinformation necessitates collaboration across many of the silos that presently shape how we work.
Participants have been clear, however, that they face many obstacles to advancing the kind of collaboration needed to cope with the challenges disinformation presents and creates.
Although the opportunity to collaborate in relevant areas and ways is desirable, in reality collaboration is a skill- and resource-heavy endeavour, requiring know-how, time, management and coordination, and the funds to support these roles and activities so that it can bear tangible fruit.
Many promising avenues have been proposed in this convening, as well as in many others that have been taking place in the disinformation space. But moving these forward from ideas to action requires funders to enable, incentivise, and support collaboration.
Participants have indicated some key areas of attention to address:
For full outputs, recordings, and mural board links from the Convening Series, click here.