1 of 29

AI, DIGITAL LITERACY, & ETHICS

EDUCATORS' SUMMIT

2 of 29

Agenda

  • 10-10:15 am Welcome and table introductions
  • 10:15-10:45 am Introduction to Generative AI/LLMs and Education
  • 10:45 am-12 pm Demonstrations (interactive)
  • 12-1:15 pm Small group sessions and lunch
  • 1:15-2 pm Reporting and next steps

3 of 29

Objectives

  • Discuss the educational, ethical, and societal issues surrounding generative AI large language models
  • See several current AI tools demonstrated
  • Share concerns and insights related to teaching, learning, and assessment with AI
  • Gain some practical takeaways for classroom teaching
  • Begin building a network and community of educators in the Kansas City MO and NE Kansas area interested in AI issues and their relationship to Humanities teaching more generally

4 of 29

Framing Questions

PRESENTATION BY KATHRYN CONRAD AND SEAN KAMPERMAN, LICENSED CC BY NC 2.0

What do students need to know about AI, and how can we teach them?

What research and critical thinking skills will students need to assess the quality of AI-generated content?

Can we teach students to use these tools ethically, in a way that supports rather than undermines their learning?

How can we ensure no student is left behind? How can we reduce harm and ensure the benefits of AI are shared equitably?

5 of 29

Defining digital literacy

  • UNESCO: "the ability to access, manage, understand, integrate, communicate, evaluate, and create information safely and appropriately through digital technologies for employment, decent jobs and entrepreneurship."

6 of 29

Critical AI literacy

  • Maha Bali on critical AI literacy: "the basic skill of how to use" AI + "the capacity to know when, where and why to use it for a purpose and, importantly, when NOT to use it."

7 of 29

What do the Humanities have to offer discussions about AI?

  • Critical thinking
  • Historical perspective
  • Cross-cultural understanding
  • Broad, diverse perspectives on human flourishing
  • Careful attention to language and its impacts
  • Understanding of the human condition

8 of 29

What do the Humanities have to offer discussions about AI?

Framework for understanding AI ethics 

9 of 29

What do students (and teachers) need to know about generative AI?

10 of 29

How do Large Language Models work? 

  • AI text generation comes from Large Language Models (LLMs) + chat interface
  • LLMs are built on neural networks – a type of mathematical model (watch this great explainer video from 3Blue1Brown for more on how neural nets work)
  • Statistical inference – predicts the next likeliest/best word in a sequence (the peak of the bell curve)
  • Think sophisticated "autocomplete" + "Mad Libs," or crossword-puzzle solver

11 of 29

What can (and can’t) it do?

  • Most LLMs can:
    • Produce usable prose content
    • Engage in productive iterative design processes for text and images
    • Provide useful suggestions/advice
    • Edit/revise text input or make suggestions for revision
    • Write passable essays, summaries, poems, emails, memos, instructions, code, reports—you name it
    • Provide explanations of concepts

  • LLMs cannot:
    • Reliably verify the veracity of their outputs
    • Consistently produce factually reliable outputs
    • Understand what they're talking about or what they're doing
    • Recognize hateful, offensive, or biased speech without significant training/guardrails

12 of 29

Workplaces are already using it

  • The 2023 World Economic Forum Future of Jobs Report found that nearly 75% of companies surveyed plan to adopt AI over the next five years.
  • The Brookings Institution estimated that AI will impact a range of jobs, including those held by highly paid, highly educated workers.

13 of 29

Workplace applications

  • Marketing – tools like copy.ai and jasper.ai used to create web copy, email copy, and social media content (widespread adoption)
  • Customer service – chat assistants provide agents with advice on how to respond to customer chats
  • Legal analysis – generative AI used to draft contracts, identify areas of risk/legal dispute, help lawyers craft arguments
  • Editing
  • Procurement
  • Sales
  • Transcription
  • Translation
  • Consulting
  • Human resources
  • Project management
  • Coding

14 of 29

Workplace applications

  • Common uses across industries: researching a topic or generating ideas, drafting emails and texts, drafting reports, editing text, summarizing text (Cardon et al)

15 of 29

AI will replace some jobs, but learning how to write and read critically and analytically will still be vitally important.  

Many workplace AI applications will require intensive on-site training – not our job to train students! 

Critical and analytical thinking, empathy, and communication skills continue to be highly valued. 

Highly-developed critical thinking skills are necessary to make the most of AI tools.

16 of 29

Can we teach students to use these tools ethically, in a way that supports rather than undermines their learning?

What research and critical thinking skills will students need to assess the quality of AI-generated content?

17 of 29

AI & Ethics

  • Labor exploitation
  • Copyright infringement 
  • Biased outputs
  • Environmental harms
  • Privacy/surveillance

18 of 29

AI & Ethics

19 of 29

AI & Ethics: bias (examples)

Example: ChatGPT 3.5 exhibiting bias from its dataset

Midjourney, prompt: "3 secretaries"

Midjourney, prompt: "3 surgeons"

20 of 29

Plagiarism/"AI-gerism"

  • Students can currently use AI to brainstorm, compose a thesis, outline, summarize, draft, revise, make "peer review" suggestions, and/or provide "personal" reflections. 

  • Generative AI tools have been or are in the process of being integrated into Google Docs and the Microsoft Office 365 suite. Grammarly AI and Wordtune can be used in all word processing applications as plugins.

21 of 29

Plagiarism detection

  • Current evidence suggests that AI detection software frequently fails to detect AI-generated work and can produce false positives
  • Detection can be thwarted by chaining tools (e.g., ChatGPT + Undetectable.ai or Quillbot)

  • Early evidence suggests AI detection tools are biased against non-native speakers
  • Detection software should be the start, rather than the end, of a conversation

22 of 29

Citation & attribution

  • Students should be guided in how to cite and attribute their use of AI tools
  • LLMs are not "authors"
  • "An AI tool cannot be listed as a co-author in a publication as it cannot take responsibility for the content and findings reported. The person (human being or legal entity) is always accountable for the content, whether or not it was generated by AI." ENAI

23 of 29

Will collaborating with AI hurt students' critical thinking skills?

  • Yes, if:
    • Students use AI instead of, rather than in addition to, their own thinking
    • Students use AI to complete major chunks of an assignment without refining the content
    • Students don't question/verify the output 
  • Maybe not, if:
    • Students are guided to think critically about AI outputs prior to use
    • Students evaluate the AI output as they would any source 
    • Students use AI tools to extend their thinking (e.g., through asking for revision suggestions, asking for counterarguments)

24 of 29

Sample GPT prompts courtesy of Anna Mills (CC BY NC 4.0) (note: prompts intended for educators; can they be adapted for students?)

  • Ask GPT for feedback, not necessarily revisions
    • "How can my writing be clearer?"
  • When trying out ideas, ask it for counter-arguments or possible skeptical reactions
  • Ask it to ask you questions to think something through
  • "How can you help me write 'X'?"

25 of 29

How can we ensure no student is left behind? How can we reduce harm and ensure the benefits of AI are shared equitably?

26 of 29

Digital divide

  • Access issues: 9% of households in Kansas City, MO lack internet access; 7% of households lack a computer (US Census ACS 5-year estimates)
    • This includes 24,515 students
  • Generative AI tools are likely to become more expensive as tech companies compete for customers; currently, several tools offer free and paid tiers
  • Accessibility of tools frequently becomes an afterthought when competition for customers is fierce; speech-to-text applications still have a long way to go (Addlesee)

27 of 29

Will AI democratize knowledge work?

  • Early experiments suggest GPT might level the playing field somewhat: "Inequality between workers decreases, as ChatGPT compresses the productivity distribution by benefitting low-ability workers more." (Noy & Zhang)
  • Benefits for people with disabilities including communication disabilities, ADHD, and dyslexia (NY Times)
  • Benefits for non-native speakers (Ray)

28 of 29

DEMONSTRATIONS

29 of 29

Breakout Session Questions

  1. Have you integrated any generative AI tools or programs into your teaching practice? If so, how have you used them? 
  2. Do you envision generative AI fitting into your content area? How do you hope that AI might enhance your teaching practice or support your students' learning?
  3. How are your students using AI? What are they saying about it?
  4. What policies does your school/district have in place regarding AI use?
  5. What are your current concerns regarding the intersection of AI and education? What are your hopes?