1 of 48

Knowledge Justice in Digital Spaces: Navigating Algorithmic Bias

Lea Sansom, MLIS & Dani Dilkes, MEd, MSc, BCmp

2 of 48

Agenda

Part 1:

Introductions and context

Epistemic injustice and algorithmic bias

Activity 1 & 2: Critical reflection

Activity 3: Mapping algorithmic bias to epistemic injustice

Part 2:

Knowledge justice

Activity 4: Discussion - Practicing knowledge justice in digital spaces

Activity 5: Reflection to action

3 of 48

Access Statement

  1. There will be small group activities and discussions. You may choose to work in a group or work individually.
  2. We will describe key images in the slides.
  3. Captions can be turned on in Zoom. 
  4. You need only share what you’re comfortable sharing.
  5. If your access needs are not being met, please speak to one of the facilitators.

4 of 48

Learning Outcomes

  • After this session, you will be able to:
    • Define epistemic injustice, knowledge justice, and algorithmic bias 
    • Understand the link between epistemic injustice and algorithmic bias 
    • Discuss how to apply a knowledge justice lens to online searching 

5 of 48

Who are we in relation to this work?

6 of 48

What brought us here

7 of 48

Knowledge Justice in the Helping Professions: From Theory to Practice 

Register to receive the OER

Content from today’s workshop comes from a forthcoming Open Educational Resource, launching August 2025 via Pressbooks:

Campbell, H., McKeown, A., Holmes, K., Sansom, L., Lengyell, M., Dilkes, D., & Glasgow-Osment, B. (Eds.). (2025). Knowledge Justice in the Helping Professions: From Theory to Practice. Pressbooks.

8 of 48

Epistemic Injustice

When people "undermine, undercut, disvalue, curtail, exclude, outright dismiss, or, in some cases, gaslight a person/or persons in their capacity as potential knowers" (Dunne 2020)

9 of 48

Epistemic Injustice is When Others…

  1. Deny that our opinions are valid or that we are reliable because of our social identities
  2. Erase, discount, or destroy our language, cultural practices, and ways of knowing
  3. Purposefully keep us isolated from others who are like us or from folks with similar experiences
  4. Do not understand our needs (and we may not even understand ourselves!) because people like us have been kept out of mainstream society

10 of 48

A Phenomenon by Many Names

Philosophy

    • Epistemic injustice (Fricker)
      • Hermeneutical & Testimonial injustice

First Nations Perspectives

    • Cognitive imperialism (Battiste)
    • “Empire of epistemologies” (Brunette-Debassige)

Critical Theory

    • Coloniality of knowledge (Quijano)
    • Epistemicide (de Sousa Santos)

Health Sciences

    • Epistemic exclusion (Dotson)

Decolonial Theorists

    • Epistemic violence (Mignolo)

Library & Info Science

    • Curricular injustice (Patin)
    • Knowledge justice (Leung & López-McKnight)

Social Anthropology

    • Cognitive justice (Visvanathan)

Epistemic injustice is when people "undermine, undercut, disvalue, curtail, exclude, outright dismiss, or, in some cases, gaslight a person/or persons in their capacity as potential knowers" (Dunne, 2020).

11 of 48

Epistemic Injustice in Nursing Education

12 of 48

Menti Instructions

Go to

www.menti.com

Enter the code

9455 7655

13 of 48

Activity 1: Critical Reflection

Where have you encountered or witnessed epistemic injustices in your discipline, profession, or education?

Discuss with your peers and/or add to Menti.com with code 9455 7655.

14 of 48

Algorithmic Bias

15 of 48

What are algorithms?

  • Instructions or rules that machines follow to perform tasks
  • Can be hard-coded by humans or created through machine learning (“black box” models); see the sketch below
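To make the contrast concrete, here is a minimal, hypothetical Python sketch (not part of the workshop materials; the scenario and numbers are invented) comparing a rule a human writes by hand with a rule a model learns from historical data:

```python
# Hypothetical illustration: two ways a machine can arrive at a decision rule.
from sklearn.linear_model import LogisticRegression

# 1. Hard-coded rule: a human wrote the decision logic explicitly.
def approve_by_rule(income, debt):
    """Approve if debt is under 40% of income (a threshold a person chose)."""
    return debt < 0.4 * income

# 2. Learned rule: the logic is inferred from past decisions, inheriting
#    whatever patterns (and biases) those historical decisions contained.
past_applicants = [[50, 10], [30, 25], [80, 20]]  # [income, debt] in thousands
past_decisions = [1, 0, 1]                        # 1 = approved, 0 = denied
model = LogisticRegression().fit(past_applicants, past_decisions)

print(approve_by_rule(45, 15))         # transparent: the rule is visible
print(model.predict([[45, 15]])[0])    # learned: the weights came from data
```

Even this simple learned model reflects its training data rather than an explicit human rule; more complex models (deep neural networks, for example) take this further, which is why they are often described as black boxes.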

16 of 48

Coded Bias

17 of 48

Activity 2.1: Reflect

What examples of algorithmic bias come to mind?

Discuss with your peers and/or add to Menti.com with code 9455 7655.

18 of 48

Where does Algorithmic Bias come from?

  • Social biases can be encoded in computer programs at any step of the algorithmic development or deployment process (see the sketch after this list):
    • Underrepresentation of social groups or perspectives in data sets
    • Limited frames of reference of programmers
    • Biased labelling or categorization of data
    • Lack of diversity in user testing
    • Self-sustaining feedback loops that reinforce and amplify biases
    • Misalignment of user needs and algorithmic objectives
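As a hypothetical sketch of the first point (underrepresentation in data sets), the following toy Python example, with invented groups and numbers, shows how a model trained mostly on one group can work well for that group and systematically fail the underrepresented one, even though the code never mentions group membership:

```python
# Hypothetical illustration of underrepresentation in training data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Group A: 95 training examples; Group B: only 5 (underrepresented).
# The "true" relationship between feature and label differs between groups.
x_a = rng.normal(0, 1, 95); y_a = (x_a > 0).astype(int)
x_b = rng.normal(0, 1, 5);  y_b = (x_b < 0).astype(int)  # opposite pattern

X = np.concatenate([x_a, x_b]).reshape(-1, 1)
y = np.concatenate([y_a, y_b])
model = LogisticRegression().fit(X, y)

# Evaluate on fresh data: the model fits the majority group well and
# systematically misclassifies the underrepresented group.
x_test = rng.normal(0, 1, 1000).reshape(-1, 1)
print("Group A accuracy:", model.score(x_test, (x_test[:, 0] > 0).astype(int)))
print("Group B accuracy:", model.score(x_test, (x_test[:, 0] < 0).astype(int)))
```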

19 of 48

Bias in Generative AI

20 of 48

Activity 2.2: Reflect

What harm can Algorithmic Bias cause?

Discuss with your peers and/or add to Menti.com with code 9721 5938.

21 of 48

What harm can Algorithmic Bias cause?

    • Harms can range from mild inconvenience to significant social, health, or financial impacts
    • Harms tend to fall most heavily on groups that are already marginalized
    • Many harms are forms of Epistemic Injustice

22 of 48

Algorithmic Bias as Epistemic Injustice

    • Algorithms can devalue the experience and knowledge of individuals based on skewed metrics
    • Systems can be designed to allow only specific possibilities or ways of being; when these do not align with individuals’ own experiences and needs, people are prevented from participating fully in society
    • Algorithms can limit the reach of certain voices or exclude them entirely

23 of 48

Activity 3: Cases of Algorithmic Bias

Using the random generator provided, discuss one or more cases of algorithmic bias.

Discussion Questions

  • What is the root cause of the algorithmic bias in this example?
  • What harm is caused by this example of algorithmic bias?
  • How does the algorithmic bias reinforce or mask systemic inequities?
  • Who is being silenced, discredited, or excluded from knowledge production, validation, or access?

24 of 48

Case 1: Shadow banning

  • TikTok content creators reported noticeable declines in viewership and engagement when posting in support of the Black Lives Matter movement. TikTok attributed this to “technical glitches,” but many view it as a form of censorship known as “shadow banning,” in which posts are muted, hidden from followers, or removed from the platform entirely.

From These TikTok Creators Say They’re Still Being Suppressed for Posting Black Lives Matter Content

25 of 48

Case 2: Recruitment by Algorithm

  • In 2014, Amazon developed a recruiting algorithm to rate the resumes of job applicants and predict who would succeed at Amazon. The algorithm learned to favour male applicants and penalize female applicants, particularly for technical roles such as software engineer. Even when gender was not stated on an application, attendance at a women-only college or participation on a women’s sports team correlated with a lower overall rating.

From Why Amazon’s Automated Hiring Tool Discriminated Against Women

26 of 48

Case 3: Programmatic Gender Norms

  • Body scanners at airports are configured based on binary, cis-normative understandings of gender. Physical “anomalies” that do not conform to statistical models of male or female bodies are flagged, and those passengers are required to go through secondary screening (a pat-down).

From Design Justice

27 of 48

Case 4: Grade Prediction

In 2020, England’s Office of Qualifications and Examinations Regulation (Ofqual) used an algorithm to predict grades when exams had to be cancelled due to COVID-19.

The algorithm used three factors for prediction:

    • Instructor prediction of the final grade
    • Students’ academic performance to date
    • School’s historical performance data

Analysis of the results revealed that the scores of students from working-class and disadvantaged communities were more likely to be downgraded and the scores of students from private schools were more likely to be inflated.

From The UK exam debacle reminds us that algorithms can’t fix broken systems
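In simplified terms (a hypothetical Python sketch, not Ofqual’s actual formula), downgrading happens when individual predictions are forced to fit a school’s historical grade distribution:

```python
# Hypothetical, simplified illustration of rank-based grade standardization:
# individual predictions are constrained by the school's historical results.
def standardize(teacher_predictions, historical_distribution):
    """teacher_predictions: list of (student, predicted_grade), strongest first.
    historical_distribution: list of (grade, fraction) the school awarded before."""
    n = len(teacher_predictions)
    assigned, awarded = [], 0
    for grade, fraction in historical_distribution:
        quota = round(fraction * n)
        for student, _ in teacher_predictions[awarded:awarded + quota]:
            assigned.append((student, grade))
        awarded += quota
    return assigned

# A school that historically awarded 10% A, 40% B, and 50% C grades:
history = [("A", 0.1), ("B", 0.4), ("C", 0.5)]
predictions = [("S1", "A"), ("S2", "A"), ("S3", "B"), ("S4", "B"), ("S5", "B"),
               ("S6", "C"), ("S7", "C"), ("S8", "C"), ("S9", "C"), ("S10", "C")]
print(standardize(predictions, history))
# S2 is downgraded from A to B purely because of the school's history.
```

Under a scheme like this, a strong student at a school that has historically awarded few top grades is capped by that history, while students at schools with strong historical results benefit from it.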

28 of 48

Case 5: Virtual Proctoring Tools

  • Automated proctoring is on the rise: students are recorded while completing online exams, and the recordings are run through algorithms that flag suspicious behaviour. These algorithms are trained to identify whether a student looks away from the screen more or less often than average, types on the keyboard more or less often than average, moves their lips, stands up, or shows other behaviours the algorithm deems potential signs of cheating (a toy sketch of this kind of flagging follows below).
  • Students with dark skin tones have also reported issues with the facial recognition built into proctoring tools, which is often unable to recognize and read their faces.

From Our Bodies Encoded: Algorithmic Test Proctoring in Higher Education and Students Are Rebelling Against Eye-Tracking Exam Surveillance Tools
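A hypothetical sketch of this kind of “deviation from the average” flagging (invented for illustration; not any proctoring vendor’s actual code):

```python
# Hypothetical illustration of flagging behaviour that deviates from the mean.
import statistics

def flag_unusual(class_values, student_value, z_threshold=2.0):
    """Flag a student whose metric (e.g. count of glances away from the screen)
    deviates from the class average by more than z_threshold standard deviations."""
    mean = statistics.mean(class_values)
    stdev = statistics.stdev(class_values)
    z = (student_value - mean) / stdev
    return abs(z) > z_threshold

# A student who looks away often (to think, or because of a disability)
# gets flagged as "suspicious" even though nothing improper occurred.
class_gaze_away_counts = [4, 5, 6, 5, 4, 6, 5, 4]
print(flag_unusual(class_gaze_away_counts, student_value=15))  # True
```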

29 of 48

Case 6: AI Detection Tools

  • There is an increase in the development of tools that claim to detect AI-generated text. However, these tools tend to have a very high false-positive rate for assignments written by non-native speakers of English. In a study conducted by researchers at Stanford University, more than half of the essays written by English-language learners were flagged as AI-generated. These detectors judge the perceived sophistication of the language, including varied vocabulary, complex grammar, and complex sentence structure (a toy sketch of such a heuristic follows below).

From AI-Detectors Biased Against Non-Native English Writers
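As a hypothetical illustration of this kind of heuristic (this is not any actual detector’s code), a crude “sophistication” score based on vocabulary variety and sentence length will flag plainer writing as machine-generated, which is exactly where writing by English-language learners is penalized:

```python
# Hypothetical "sophistication" heuristic, for illustration only.
def sophistication_score(text: str) -> float:
    """Crude proxy: vocabulary variety multiplied by average sentence length."""
    words = text.lower().split()
    type_token_ratio = len(set(words)) / max(len(words), 1)
    sentences = [s for s in text.split(".") if s.strip()]
    avg_sentence_len = len(words) / max(len(sentences), 1)
    return type_token_ratio * avg_sentence_len

def flag_as_ai(text: str, threshold: float = 5.0) -> bool:
    # Low "sophistication" gets flagged as machine-generated -- a heuristic
    # that disproportionately flags plainer, simpler human writing.
    return sophistication_score(text) < threshold

print(flag_as_ai("The results are good. The method is good. We are happy."))  # True
```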

30 of 48

Algorithmic Bias Summary

  • Algorithms replicate and perpetuate human biases; however, the biases embedded in algorithms are often invisible
  • The harms of algorithmic bias vary widely, from minor inconvenience to significant social exclusion
  • Many forms of algorithmic bias equate to epistemic injustice

31 of 48

Knowledge (epistemic) justice

Visvanathan (1997); Leibowitz (2017); Leung & López-McKnight (2021)

Knowledge justice is based on the principle that every person has an equal capacity to be knowledgeable, yet this right is often denied to individuals based on the social identities they hold. It also involves recognizing that some knowledge systems, particularly those of Indigenous peoples, have been purposefully ignored, silenced, or actively eliminated.

To practice knowledge justice, we must challenge the dominance of Eurowestern systems and act on our responsibility to engage in meaningful dialogue across multiple and diverse perspectives. This requires approaching our own ways of knowing with humility, acknowledging the edges or limits of what we understand. By so doing, we can learn to see the world through multiple ways of knowing and approach diverse epistemologies with an open mind. 

32 of 48

Our Practice of Knowledge Justice

  • Lea (Librarian)
    • Teaching students about knowledge/information seeking
    • Relied on as an “expert” by faculty colleagues
    • At the same time, I rely on being invited in

  • Dani (Educational Developer)
    • Exploring emergence and codesign as pedagogical practices
    • Draw on lived-experience expertise and narratives for educational design conversations
    • Bring interdisciplinary perspectives to my work


33 of 48

Seeking Voices, Not Sources

Campbell, H., McKeown, A., Sansom, L., & Holmes, K. (2025, forthcoming). Image 2: Voices Flower. In Knowledge Justice in the Helping Professions: From Theory to Practice. Pressbooks.

The outer parts of the petals in this graphic represent different voices or identities that may be speaking to a topic.

The petals are overlapping, representing that individuals may hold more than one intersecting identity, and that no one person can speak to an entire group.

34 of 48

Seeking Voices Framework 1/3

Powerholders

Who: Governments, lawmakers, academic and health care institutions, or regulatory bodies.

Arms-Length Observers

Who: Journalists, academics, or research institutions that observe and report on phenomena without necessarily being directly embedded in the experience.

35 of 48

Seeking Voices Framework 2/3

Representative Groups

Who: Advocacy groups, non-profit agencies, Indigenous governments, and community organizations.

Care Providers

Who: Frontline healthcare workers, alternative and traditional practitioners, midwives, doulas, and others who provide care outside dominant models.

36 of 48

Seeking Voices Framework 3/3

Individuals with Lived Experience and their Loved Ones

  • The epistemic centre of our framework.
  • Critical in countering the silences and distortions in dominant knowledge structures.

37 of 48

Knowledge Justice is…

Messy

Complex

Nuanced

Necessary

38 of 48

Activity 4: Discussion

We will put you into breakout rooms. Discuss as a group or reflect on your own.

Discussion Questions

  • Whose voices might we find in digital spaces or using digital tools?
  • How can we balance the inherent bias in the tools we use with the necessity of using them?
  • In general, how can these tools help us practice knowledge justice?

39 of 48

Mis-, dis-, and malinformation

What do we do about it?

40 of 48

A Framework of Harms

Profit over people

Misusing or falsifying authority

Theft of knowledge

Exclusion, deletion, and censorship

Bigotry and harmful language

41 of 48

Special Consideration: AI Generated Content

Consider

  • Who owns the company? What is their mandate? Is that information available?
  • How was the model trained? Is that information available?
  • What are the boundaries of the model (i.e. what is it “allowed” to talk about)? Is that information available?

Red flags

  • Do images have any missing or misplaced elements? Do lighting, textures, and skin look realistic?
  • Does audio sound realistic? Is narration natural, or does it sound clipped, robotic, or emotionless?
  • Is the text uniform and factual? Are the same words or phrases repeated over and over? Is there an unusually high number of em dashes?
  • Is an author/creator listed? Can you find any information about the author/creator’s positionality or intersections to confirm they’re real?

42 of 48

Bringing It All Together

43 of 48

Using a knowledge justice lens, we can…

  • Hold multiple voices in balance and understand where those voices may be silenced
  • Find lived experience and diverse voices online
  • Practice mindful exclusion by recognizing how knowledge can cause harm
  • Challenge epistemic injustice and minimize the harm we cause to others

44 of 48

Final Reflections

45 of 48

Activity 5: Reflection to Action

1) One Minute Paper

  • What is one thing you learned today or are taking away?

46 of 48

Activity 5: Reflection to Action

2) Create a Plan

  • How will you implement your learning into your practice moving forward?

47 of 48

Questions?


48 of 48

Thank you!