
Convening Series on Countering Disinformation

Strategic Insights and Questions

Produced by: Reos Partners
Convened by: CIFF and Oak Foundation


Contents

Introduction

A Caveat on Naming and Framing

1. Levels

  • Fighting Fires + Addressing Drivers
  • Reflection Questions

2. Lenses

  • 5 Lenses on the Disinformation Challenge
  • Reflection Questions

3. Leverage Areas and Game-changing Strategies

  • Financial and Commercial Incentives
  • Effective Communication
  • The Role and Voice of Local Communities
  • Research, Systemic Learning, and the Application Gap
  • Frameworks and Approaches to Information Ecosystems
  • Reflection Questions

4. Lifecycle

  • The Disinformation Lifecycle
  • Reflection Questions

5. Roles and Responsibilities

  • The Role of Implementers
  • Reflection Questions for Implementers
  • The Role of Thought Leaders
  • Reflection Questions for Thought Leaders
  • The Role of Funders
  • Reflection Questions for Funders

6. Collaboration

  • Collaboration + Enabling Conditions
  • Collaborative Opportunities
  • Reflection Questions
  • Incentivising Collaboration

Participating Organisations


Introduction

About the Convening Series on Countering Disinformation

The Convening Series on Countering Disinformation was conducted between December 2021 and February 2022, convened by CIFF and Oak Foundation and designed and facilitated by Reos Partners.

The series convened participants from funding agencies, implementing organizations, and think tanks/research organizations to explore collective strategies for addressing the challenges and obstacles presented by disinformation. To do this, the series employed a systemic approach to disinformation, exploring both the impact of disinformation across various thematic areas and the underlying causes and drivers that enable disinformation to thrive and spread. Through a series of online workshops and meetings, participants explored the patterns, structures, and mental models that underpin disinformation dynamics, and identified potential leverage areas and game-changing strategies in relation to the broader disinformation ecosystem. With these elements in view, they then explored different opportunities for influencing this system in meaningful and effective ways by reflecting on the kinds of roles that participants might play, both individually and collectively.

The results represent an initial, rather than a definitive, sketch of some of the key elements of a complex and rapidly evolving system. But they offer a frame for thinking about what may be needed to more effectively counter disinformation, and point to new opportunities for collaborative action.

About this Resource

This document supports stakeholders affected by or working on disinformation challenges in using the initial insights created during the series as a resource for developing programmatic, organizational, and collective strategies for countering disinformation as a systemic challenge.

To that end, the elements used to structure exploration during the convening series are re-presented below along with a set of key prompt questions to:

  • Catalyse open and reflective strategic thinking and conversation among actors working to address the cross-cutting challenges, risks and harms presented by disinformation at any level; and

  • Stimulate individual and collective strategic actions to influence the disinformation ecosystem.

In this sense, this resource is intended as a compass - a tool for plotting courses of action in what may be unfamiliar terrain - rather than a map of specific roads to follow.


A Caveat on Naming and Framing


This series was entitled “Countering Disinformation”; however, this framing has been a topic of discussion throughout.

During our outreach to potential participants, some declined on this basis, explaining that from their perspective this framing had thus far not proven adequate to the complex nature of the problems being faced and was therefore no longer productive.

During the convening itself, expert presentations also problematized this framing, questioning the utility of the terms mis- and disinformation, and observing that a widespread tendency to focus on “the atoms of information” causes us to miss the larger narratives and complex flows of information of which they are a part.

Participant discussions likewise questioned the effectiveness of focusing on countering measures as cornerstone strategies, and shared diverse perspectives for framing both problems and solutions.

We think this is useful. By approaching disinformation as a systemic challenge, this series has endeavoured to create a space to reflect on this framing and widen the aperture.

From this perspective, we have noticed two issues:

  • How the framing of “countering disinformation” trains attention on the event level - the tip of the iceberg - rather than on the underlying drivers and societal dynamics or the wider ecosystem in which disinformation proliferates.

  • How focusing on “countering” or “combatting” can cause us to take up a belligerent approach, rather than a nuanced, systemic approach. It also implies a focus on attacking a problem rather than enabling a vision or aspiration. 

We hope that this resource encourages stakeholders in this space to continue reflecting on this language and the mindsets that underpin it, and to choose consciously what words to use.


1. Levels


Fighting Fires + Addressing Drivers

By considering disinformation from a systemic perspective, we are looking not only at the event level or at isolated incidents, but at the more complex interplay of factors that continuously reproduces the phenomena we find problematic.

Taking a systems view helps us consider (at least) two levels for engagement with regards to disinformation:

  • Fighting fires focuses on the event level and is concerned with responding to disinformation campaigns and events as they emerge (e.g. in relation to elections, social conflict, public health emergencies, natural disasters), often through a range of countering measures and techniques (including counter-messaging, debunking, fact-checking, and rumor tracking).

  • Addressing systemic drivers focuses on patterns, structures, and mental models, and the complex interplay of factors continuously reproducing the phenomena we find problematic. This level is concerned with working to shift the larger system that gives rise to and perpetuates disinformation dynamics, such as financial incentives, regulatory environments, and technological innovations, as well as the sociotechnological conditions in particular contexts.

 

 

Both are needed:

Fighting fires alone leaves the roots of the problem to grow and evolve, while addressing drivers alone does little to protect individuals and societies exposed to harmful impacts with potentially devastating consequences.

 


In the convening series, we worked with the “iceberg” model from systems thinking, which distinguishes four levels - events, patterns, structures, and mental models - and asserts that your ability to influence increases with your ability to understand and act on the deeper levels of the iceberg (mental models and structures).   

 


Reflection Questions 

  • Where does your organization focus its attention and resources? To what extent do you focus your attention and dedicate resources to firefighting vs. addressing systemic drivers?

  • Why?

  • Given your organization’s expertise, capacities, and networks, what other opportunities do you see?

  • How might connecting your work with that of others working on a different level improve impact?


2. Lenses


When people - from journalists, activists, and policy-makers to citizens, tech developers, educators, and others - talk about disinformation problems and solutions, they often do so from particular perspectives.

It can be helpful to turn to a specific perspective, or lens, on complex systems like disinformation in order to learn more deeply about some dimension, or to frame complex problems in more actionable ways. But since the way we frame a challenge or problem influences the kinds of solutions we seek or see as relevant, it’s important to consider what different lenses bring into view.

�The lenses that are commonly applied in the disinformation space (and which were explored in the Disinformation Convenings) include: 

Lens 1: We have a technology problem!
Lens 2: We have an information problem!
Lens 3: We have an education and awareness problem!
Lens 4: We have an accountability problem!
Lens 5: We have a relationship problem!

These 5 lenses are described in more detail below.

No single dimension or perspective is adequate for fully capturing, explaining, understanding, or addressing disinformation.

Combining, sharing and acknowledging multiple perspectives helps bring the larger system into view and can illuminate blind-spots.


Lens 1: We have a technology problem!  

Viewed through this lens, the disinformation problem is seen to stem primarily from how the technology is engineered - its algorithms, infrastructure, and platforms - the way it uses data, its capabilities (its “godlike” nature), and who controls the technology.

Solutions commonly prescribed from this perspective include: the use of artificial intelligence, new algorithms, public service technologies, redirect methods and visibility filtering.

Lens 2: We have an information problem!

Viewed through this lens, the disinformation problem is a problem of information - “fake news”, rumours, viral slogans, conspiracy theories, and the people creating and spreading them, as well as freedom of expression and information overload.

Solutions commonly prescribed from this perspective include: fact-checking, counter-messaging, pre- and debunking, labelling, and de-platforming.

Lens 3: We have an education and awareness problem!

Viewed through this lens, the disinformation problem is a problem of lack of media literacy, awareness, resilience, digital hygiene practices, and skills to detect and respond among those who are affected by it and inadvertently participating in it.

Solutions commonly prescribed from this perspective include: media literacy, digital hygiene, awareness-raising campaigns, referral services, and support to local journalism.

Lens 4: We have an accountability problem!

Viewed through this lens, the disinformation problem is a problem of roles and responsibilities across the system: clarity of roles, duty of care, risk mitigation, safety protocols, and lack of regulation and accountability to regulators, as well as to and among citizens.

Solutions commonly prescribed from this perspective include: regulation, litigation, internet governance, public interest infrastructure, industry reform, and consumer mobilisation.

Lens 5: We have a relationship problem!

Viewed through this lens, the disinformation problem is a problem of national and geo-politics and conflicts, behaviour of “bad actors”, social polarisation, need for belonging, and a general fraying of trust, social cohesion, and the social contract.

Solutions commonly prescribed from this perspective include: depolarisation, countering hate speech, dialogue and engagement, hope-based messaging, narrative-building, and conflict transformation.


In the convenings, participants mapped the patterns, structures and mental models from each of these perspectives in order to bring the system more fully into view.


Reflection Questions

  • What lens do you primarily use to explain or understand disinformation in your work? 

    • What does this lens help us see and understand about disinformation that other lenses may miss? (What features, dynamics, impacts, harms, etc?)

  • What are some of the limitations of the lens you primarily use? (What does it miss that may be important to know about?)

  • What kinds of solutions does this lens focus your attention towards?

  • What would another lens bring into view?

  • How might combining lenses help create a fuller view of the challenges you are working on, and of the kinds of solutions that might be effective?


3. Leverage Areas


Drawing from systems thinking, leverage areas can be defined as cross-cutting and consequential issues, themes, or concerns where influence or change can help shift problematic dynamics in a complex system.

Leverage areas are places where you can strategically take action to address a given situation. They are considered low leverage if a small amount of effort will lead to a small change, and high leverage if a relatively small amount of force can lead to a large change. In dealing with complex social problems, high-leverage areas are those that address “root causes”, getting to the level of structures and mental models.

One way of influencing leverage areas is through game-changing strategies. These are interventions that change the rules of the game by interceding at the structural level (as opposed to addressing instances at the event level, i.e. “firefighting”).

In this Convening Series, participants identified 5 leverage areas (though there are certainly more to be discovered). A description of each of these areas and their accompanying game-changing strategies follows.


Leverage Area 1: Financial and Commercial Incentives

Core Question: How can we shift or influence the financial and commercial incentives of the disinformation/information economy?

This leverage area captures a strong theme that emerged around the business models animating social media companies, which incentivize technology, products, services, and practices that privilege profit over people and foster the spread of disinformation throughout information ecosystems. It pertains not only to the attention economy, but also encompasses the role of advertisements in legacy media and the incentives for platform users to monetize information in this poorly regulated space.

Game-changing Strategies

Game-changing strategies identified in the Convening Series for this leverage area include:

  • Develop carefully crafted policy and legislation to regulate around incentives, transparency, safety, liability, algorithms, and anti-trust.
  • Mobilize pressure from advertisers and shareholders to shift incentives.
  • Mobilize users and stakeholders in boycotts, strategic lobbying, and advocacy.
  • Build / support public interest social platforms and media that aren't financially incentivized or profiting from causing harm.
  • Change business models and governance structures of social media companies from within.


Leverage Area 2: Effective Communication

Core Question: How can we produce fact-based information and depolarising narratives that are more resonant and gain better traction with communities?

 

This leverage area is concerned with the power of disinformation to dominate fact-based information, as well as so-called “pro-social” narratives, in seizing attention, tapping into deeply held beliefs, capitalizing on fears, and resisting counter-messaging. It focuses on finding ways to ensure that messages and narratives that serve society, foster cohesion, support democracy, protect well-being, and respect human rights can gain as much traction with communities as disinformation does.

Game-changing Strategies

Game-changing strategies identified in the Convening Series for this leverage area include:

  • Develop holistic and coordinated approaches addressing the whole information lifecycle including production and distribution.
  • Create spaces and opportunities for cross-sectoral, cross-organizational exchange to accelerate learning and improve approaches/capacities for effective communication.
  • Create funding models that support an adaptive communication response to triggering events, disinformation patterns, surges, threats and risks.
  • Conduct systematic research into local cultural discourses, audience perspectives and beliefs, and systems of cultural communication and use this information explicitly in building messages and communication strategies.
  • Understand, activate, equip, and support a wider variety of messengers who are in touch with the community.
  • Build consumer awareness about disinformation and its sources/dynamics to create a more receptive environment for pro-social/depolarizing/fact-based messages.


Leverage Area 3: The Role and Voice of Local Communities

Core Question: How can we ensure local communities are involved in the prioritisation of risk and harm and other key decisions related to disinformation, and empowered to address disinformation in their contexts? 

The constrained role and voice of affected and marginalized communities in framing problems and developing solutions to disinformation arose in different ways across comments and discussions related to Convening Session 1. This leverage area encompasses challenges related to top-down approaches to disinformation when there is much to learn from affected communities; the dominance of Northern voices; the need for rights-based approaches; and the marginalization of activists and communities in dialogue and decision-making around approaches to disinformation that affect them.

Game-changing Strategies

Game-changing strategies identified in the Convening Series for this leverage area include:

  • Develop funding approaches that meet the needs of community actors (e.g., smaller grants, simpler reporting, local languages...)
  • Support capacity-development (including South-South) at scale across and within communities.
  • Create multi-local practice-based coalitions across geographies.
  • Create/require community-based processes for standard-setting and for identifying and prioritizing risks and harms.
  • Employ local journalism as a key tool in giving voice to community priorities.
  • Improve community access to data and info about targeting to support their informed participation.
  • Ensure processes, discussions, tools and resources are conducted and available in local languages.


Leverage Area 4: Research, Systemic Learning, and the Application Gap

Core Question: How can we develop spaces, infrastructures, and capacities for systemic learning about disinformation while still enabling and connecting to needed short-term tactical actions?

   

As Dr. Claire Wardle observed in Convening Session 1, the field of disinformation studies is both very new and rapidly evolving. So are the technologies, tactics, and broader dynamics implicated in disinformation. This leverage area takes up the importance of improving sustained opportunities for deep, rigorous, and empirical research; elevating the level of discussion about disinformation across different stakeholder groups through such research; improving awareness, understanding, and skills among those involved in response; and addressing the existing “application gap” (i.e. the gap around processes for employing knowledge resources to inform and improve approaches and responses to emergent events and challenges).

Game-changing Strategies

Game-changing strategies identified in the Convening Series for this leverage area include:

  • Support research infrastructure and mechanisms for collaboration in the Global South.
  • Build a global infrastructure including repositories of research and practice; observatories for disinformation monitoring, testing, and citizen science; directories of researchers and practitioners; and platforms for testing potential interventions.
  • Incentivize research cooperation and collaboration through pooled investment, cooperation mandates, and the establishment of joint strategies.
  • Create mechanisms for collaboration across the research, application, and policy cycle to support and incentivize applied research.
  • Build consistency in methods, metrics, and tools, and standardize research for tracking (using systematic methods and consistent parameters) to provide useful information to regulators.

Leverage Area 5: Frameworks and Approaches to Information Ecosystems

Core Question: How can we better conceptualise and convey information ecosystems so we can better assess and prioritise threats, and guide effective interventions?

 

The ability to design and implement effective responses to disinformation dynamics relies at least in part on the capacity of our frameworks and processes to conceptualize and communicate information ecosystems, as well as the broader systems at play in disinformation, in ways that are useful and usable for a range of actors. Likewise, there is an increasing need for frameworks that help different kinds of actors and organizations detect how disinformation plays out in and across the information ecosystems they are involved in, and assess its implications. Promising and useful approaches to information ecosystems are being used by several organizations in their work on, for example, elections, health communication, conflict, and humanitarian contexts, among others. But challenges still exist when it comes to moving from the detection of disinformation to the design of interventions, and in mobilizing the findings of such information ecosystem assessments for diverse programming purposes, contexts, and mandates. This leverage area pertains to how we can improve, and make wider use of, frameworks and approaches for detecting, assessing, and responding to disinformation and related challenges.

Game-changing Strategies

Game-changing strategies identified in the Convening Series for this leverage area include:

  • Use emerging technologies and futures work to help visualise evolving and future disinformation dynamics.
  • Diversify the perspectives involved in development, critique, and testing of frameworks for conceptualizing information ecosystems (particularly including non-Western non-dominant perspectives).
  • Apply Human Rights as a lens for describing and investigating information ecosystems (detect, assess, respond).
  • Build models of information ecosystems including revenue supply chains, stakeholder incentives, social vulnerabilities, drivers, and precipitating conditions.
  • Establish a globally applicable, shared vocabulary/heuristic that both research and policy can organise around.


Reflection Questions 


  • What leverage area(s) are you focusing on?

  

  • Are you aware of other actors working on the same leverage areas? How might your work complement each other’s in influencing these leverage areas?

  • To what extent is your strategy for influencing different leverage areas game-changing? Does it influence the situation at the level of the rules of the game: structures, practices, mental models?

  • Are you aware of other actors working on other leverage areas, and the importance of their work in complementing yours? 


4. Lifecycle


[Figure: the disinformation lifecycle, from C. Wardle & H. Derakhshan, 2017, p. 6]

Countering disinformation effectively requires efforts to intervene or influence along the different stages of the disinformation lifecycle. In the same way that much attention is often directed toward the event level of disinformation, there is also an inclination to focus on the distribution phase of the disinformation lifecycle.

These phases of message creation and distribution also sit within, and interact with, larger sociocultural systems of human communication, both on- and offline, whereby disinformation can contribute to the erosion of social cohesion. In turn, weakened social cohesion can create greater susceptibility to, and participation in, disinformation. Connecting (or extending) the disinformation lifecycle (e.g. creation - production - distribution - reproduction - etc…) to the social processes and worlds of which it is part is an important part of taking a systemic approach to addressing disinformation.

For example, as messages draw their content and meaning from social worlds, paying attention to the values, tensions, and relationships they may seek to exploit can be a valuable contribution in addressing the disinformation lifecycle.


Reflection Questions

 

  • Is your work focused on a specific phase of the disinformation lifecycle?

  • Are you aware of actors working on the other parts of the cycle and the importance of their work in complementing yours? 

  • Are there ways of aligning strategies, initiatives, or interventions working at different points of the disinformation life cycle? Might this improve impact?


5. Roles and Responsibilities


Roles

Reflecting on the kinds of roles we play in a system can help generate important insights that can be useful for identifying strategic opportunities.

With the understanding that a single organisation may well fit more than one category, we used three general “actor types” to help us design and facilitate discussions, and include different perspectives in the Convening Series:

  • Implementers / Practitioners
  • Knowledge Generators / Thought Leaders
  • Funders

Participants were also asked what roles they would like to see each participating actor group play towards shifting the disinformation system. 

Responsibilities

No matter what roles we may play, we are all inherently information actors - both as individuals and as organisations. This means we all have the potential to become both targets and unwitting vectors of disinformation, a reality that is borne out daily. The threats, risks, and harms presented by disinformation dynamics therefore activate a new set of responsibilities to take into account, especially as we consider roles to play and actions to take to address them.


Implementers / Practitioners


For the purposes of this convening, we’ve used the term “implementers” to describe organisations that provide support through services or by carrying out programming and interventions. The implementers participating in this series represent a wide range of “third sector” organisations working across an array of thematic concerns, using different approaches and techniques to address issues, challenges, and threats related to disinformation dynamics. A smaller group of participating implementers focuses on thematic areas including domestic violence, public health, peacebuilding, polarization, conflict, development, and climate change, but grapples with the challenges of disinformation in the contexts in which they work.

Role propositions

  • Problem Solver
  • Enabler
  • Cultivator
  • Explorer
  • Innovator
  • Educator
  • Informer
  • Evaluator
  • Critic
  • Collaborator
  • Convener


Reflection Questions

    • What roles do you currently play? 

    • Do you notice other roles where you could be making more of a contribution? 

    • Are there some roles that are poorly aligned with your expertise, mandate, or resources?

    • Are you aware of actors playing other roles and the importance of their work in complementing yours? 

    • What roles are you playing in supporting the “applied move” (mobilizing information and evidence as you design action) on matters of disinformation?

    • Where might you be more effective collaborating and sharing resources with others, and where is autonomous action more effective for you and your organisation?

    • How might you enhance your role in sharing your learning with others to strengthen the work of your allies?

    • Given the roles you play, what are your responsibilities in the context of disinformation:

        • Vis-à-vis the communities you serve?
        • Vis-à-vis your staff and partners?
        • Vis-à-vis others in your sector?
        • Vis-à-vis other sectors?

Consider:

    • What is your potential for becoming a target of disinformation?
    • What is your potential for becoming an unwitting vector of disinformation?
    • What are the unintended harms that can emerge from such conditions, and for whom?
    • Given the roles you play, what protocols should you have in place to mitigate and/or respond to such harms?


Knowledge Generators / Thought Leaders

The term thought leader is commonly used to refer to influential individuals who help shape public discourse and understanding of a particular topic. For the purposes of this convening, we have also used this term to broadly include technical experts and organisations typically situated within universities, governments, or multilateral organisations, dedicated to conducting research as their primary purpose, and who are recognized as authorities on disinformation. As individuals or organisations, thought leaders (thus defined) influence and guide fields of study, methods and approaches taken up across sectors and stakeholders, and shape both public and policy discourse.


Role propositions

  • Strategiser
  • Educator
  • Informer
  • Threat detector
  • Pressure Builder
  • Collaborator
  • Convener
  • Evaluator
  • Problem solver
  • Safeguard


Reflection Questions

    • What roles do you currently play? 

    • Do you notice other roles where you could be making more of a contribution? 

    • Are there some roles that are poorly aligned with your expertise, mandate, or resources?

    • Are you aware of actors playing other roles and the importance of their work in complementing yours? 

    • What roles are you playing in supporting the “applied move” on matters of disinformation, and what others might you consider stepping into (e.g. developing and sharing information and evidence that is applicable in the design of action; guiding the use of such information in design itself; creating methods, frameworks, processes, or mechanisms for making the applied move, etc.)?

    • Given the roles you play, what are your responsibilities in the context of disinformation:

        • Vis-à-vis the communities you serve?
        • Vis-à-vis your staff and partners?
        • Vis-à-vis others in your sector?
        • Vis-à-vis other sectors?

Consider:

    • What is your potential for becoming a target of disinformation?
    • What is your potential for becoming an unwitting vector of disinformation?
    • What are the unintended harms that can emerge from such conditions, and for whom?
    • Given the roles you play, what protocols should you have in place to mitigate and/or respond to such harms?


Funders

The philanthropies participating in this series include organizations with significant expertise and experience supporting work on disinformation and the digital space from a number of vantage points (e.g. responding to incidents, mitigating harms, addressing drivers), as well as those that are newer to the space and those working in other thematic areas who face challenges related to disinformation in carrying out their work.

 


Role propositions

  • Long term investor
  • Enhancer/capacity builder
  • Force multiplier
  • Collaboration incentiviser
  • Supporter
  • Connector
  • Community champion
  • Awareness raiser
  • Coordinator
  • Information sharer
  • Risk taker
  • Agenda setter
  • Rapid responder


Reflection Questions

    • What roles do you currently play? 

    • Do you notice other roles where you could be making more of a contribution?
    • Are there some roles that are poorly aligned with your expertise, mandate, or resources?

    • Are you aware of actors playing other roles and the importance of their work in complementing yours? (e.g. are you complementing the work of other funders and working in alliances?) 

    • Is your funding approach incentivising or disincentivising collaboration, agility, innovation, diversification and inclusion among your grantees?

    • How are you supporting the “applied move” (mobilizing information and evidence in the design of action) on matters of disinformation?

    • Given the roles you play, what are your responsibilities in the context of disinformation:

        • Vis-à-vis the communities you serve?
        • Vis-à-vis your staff and partners?
        • Vis-à-vis others in your sector?
        • Vis-à-vis other sectors?

Consider:

    • What is your potential for becoming a target of disinformation?
    • What is your potential for becoming an unwitting vector of disinformation?
    • What are the unintended harms that can emerge from such conditions, and for whom?
    • Given the roles you play, what protocols should you have in place to mitigate and/or respond to such harms?


6. Collaboration


Collaboration + Enabling Conditions


Systemic problems can only be addressed through collaboration across organisations, sectors, and disciplines. While autonomous action may be efficient when fighting fires, shifting leverage areas cannot be done by single actors on their own. It requires bringing together complementary approaches, skills, and resources to operationalize game-changing strategies.

Collaboration is not always the most accessible, agile, straightforward, or cost-effective option, and the intention should not be to collaborate for collaboration’s sake.

It is important to consciously decide when, on what, and with whom to collaborate and to support necessary collaborations through incentives, investment, and enabling vehicles, processes, and mechanisms.

In this series, participants developed an initial set of collaborative opportunities to shift key leverage areas of the disinformation ecosystem.

To help develop further ideas, we present questions to spur reflection on where collaboration is needed and to identify the specific enabling conditions required for success.


Collaborative Opportunities


The following propositions indicate areas where participants suggest that collaboration - if sufficiently scoped, resourced, organized, coordinated, and facilitated - can make a difference both in tackling emergent disinformation events and in shifting key leverage areas.

  • Forming coalitions / platforms to learn, collaborate, and share common approaches on particular aspects of the problem

  • Aligning policy advice/ taking a multi-stakeholder approach to policy

  • Coordinating messaging / sharing communication / amplifying each other’s messages

  • Attending to the “applied move” - collaboration among researchers and practitioners to translate scientific findings for use in the practical work of NGOs and governments.

  • Collectively generating and sharing mission-critical knowledge through monitoring data/trends/actors, creating shared playbooks, assembling lessons learnt, and sharing effective strategies.

  • Sharing resources and infrastructure: economies of scale from shared technical/legal/accounting support, back-office functions, and comms expertise

  • Creating deliberate North-South and South-South collaboration

In addition, a number of ideas for collaborative initiatives were shared:

  • A Convening Body / Linking Organisation bringing together a network of diverse experts to develop complementary strategies, build minimum standards, and provide both pre-emptive and reactive stand-up support in response to incidents. Such a body would be supported by a directory clearly mapping the specialities and complementarities within this network, and would play a central role in translating intelligence and evidence to inform policy, regulation, and action.

  • A regular “State of the Field” report for the mis- and disinformation space to: set shared strategy and goals at the systems level; identify and make visible gaps in knowledge, information, and practice; set the agenda and standards for research and investigations needed in response; and assess collective progress.

  • A living “crisis playbook” that assembles the best strategies, approaches, protocols, and methods for responding to disinformation incidents (e.g. disinformation campaigns, information operations), ready to put into action when the time comes.

  • Shared standards, education, and training for shareholders, regulators, funders, and policy makers to build capabilities for working on future crises and to help them understand the knowledge gaps in their own organisations.


Reflection Questions

Identifying Need

  • What do we need to collaborate on within our own organization? Who needs to participate in this collaboration?

  • What do we need to collaborate on with other organizations/actors in our sector? With whom?

  • What do we need to collaborate on with other organizations/actors in other sectors? With whom?

  • What might we be able to accomplish through this collaboration that we would not be able to accomplish through autonomous action? What might change as a result?

Identifying Enabling Conditions

  • What will enable us to collaborate successfully? How might we put these enablers in place? For example:
        • Strategic alignment
        • Complementarity and role clarity
        • Investment of resources in the collaboration
        • A coordinating mechanism or vehicle for collaboration
        • Integrated communication and knowledge-sharing
        • Convening and facilitation
        • Relationships and regular contact
        • Collaborative capabilities
        • Dedicated staff

  • What obstacles stand in the way of collaborating successfully? How might we overcome or remove them? For example:
        • Competition for funding
        • Power imbalances
        • Stress and burnout


Incentivising and Supporting Collaboration


This Convening Series emerged out of the recognition that addressing the threats, challenges, and consequences involved in disinformation necessitates collaboration across many of the silos that presently shape how we work.

Participants have been clear, however, that they face many obstacles to advancing the kind of collaboration needed to cope with the challenges disinformation presents and creates.

Although the opportunity to collaborate in relevant areas and ways is desirable, in reality this is a skill- and resource-heavy endeavour, requiring know-how, time, management and coordination, and the funds to support these roles and activities so that collaboration can bear tangible fruit.

Many promising avenues have been proposed in this convening, as well as in many others taking place in the disinformation space. But moving these forward from ideas to action requires funders to enable, incentivise, and support collaboration.

Participants have indicated some key areas of attention to address:

  • Funding approaches that: incentivize collaboration rather than competition; support and facilitate collaboration with other organizations; provide resources for the management and coordination of collaboration; and offer sustainable, long-term core organisational support, job stability, and infrastructure.

  • Building and sharing expertise through capacity-building of staff and organizations around disinformation, and dedicated staff to apply expertise.

  • Convening opportunities for continued conversation and learning, gathering across sectoral perspectives, catalysing collaboration (for example, around each of the five leverage areas), and developing shared strategies.

  • Coordinating around strategic orientation to help “define the game we are trying to win”, clarify goals, and develop complementary strategies across groups, themes, etc.


Convened by: CIFF and Oak Foundation

For full outputs, recordings, and mural board links from the Convening Series, click here.

Produced by: Reos Partners
