1 of 34

AI & Accessibility

Designing an Equitable Future

Angela Martin

2 of 34

Because while artificial intelligence is reshaping how we work, communicate, & hire

— accessibility determines who gets to participate in that future.

3 of 34

Hi, I’m

Angela Martin

Lenovo

Design Storming

Advisory UX Designer, Lenovo | Chair, ABLE ERG

Certified Humane Technologist

Co-Founder of Design Storming, LLC

Author of This Is Not a Dream

4 of 34

Accessibility isn’t niche.

It’s the foundation of inclusive innovation.

1.3 billion people globally live with a disability.

That’s 1 in 6 people — and counting.

5 of 34

Disability is not rare — it’s the world’s largest minority group.

TLDR:

Accessibility isn’t just about ramps or captions. It’s about access to opportunity, to employment, to independence.

AI now touches every part of work — from how résumés are screened, to how people communicate on teams, to how we navigate systems.

If AI is inaccessible, it doesn’t just fail technically — it excludes people from participating in the digital economy.

And exclusion, at AI scale, happens faster than ever.

6 of 34

The Promise & Peril of AI

AI can:

  • Break barriers — automate captions, describe images, bridge communication gaps.
  • Or reinforce them — misinterpret speech, mislabel disability, or deny opportunities.

Technology is neutral only until we decide what — and who — it’s for.

7 of 34

So the question isn’t “Can AI help?”

It’s “Who is it helping — and who is it leaving behind?”

Generative AI can write alt text automatically — huge win. But the same system can also misdescribe an image or erase nuance.

Voice recognition can empower people with mobility limitations — but fail spectacularly when faced with speech differences or AAC devices.

8 of 34

The 5 Lenses of AI & Accessibility

  1. Social Impact
  2. Development
  3. Design
  4. Future
  5. Accessibility (the thread through them all)

9 of 34

To build equitable AI, we have to look through five lenses:

Social Impact: How AI reshapes access and equity.

Development: How data and models encode bias.

Design: How users experience and recover from failure.

Future: How we govern emerging tools.

Accessibility: Not a section, but a thread through all of it.

These five lenses build on one another.

Think of them as a feedback loop — from intention to implementation to impact.

Accessibility isn’t just the last step; it’s the connective tissue.

10 of 34

“Technology that isn’t built for everyone isn’t innovation — it’s just automation for some.”

AI doesn’t create bias — it scales it.

If we build systems on top of inequitable structures, they replicate those structures faster, cheaper, and with a friendlier interface.

11 of 34

Social Impact: Equity or Exclusion

When AI systems ignore marginalized users, they:

  • Exclude people from essential services.
  • Reinforce social bias.
  • Scale inequity faster than ever before.

But when designed inclusively, they:

  • Expand independence, communication, and opportunity.

12 of 34

It’s not about bad intent — it’s about neglect multiplied by technology.

13 of 34

The Data Problem

AI learns from what we feed it — and the internet is full of bias.

If disability is underrepresented or misrepresented in datasets, the system “learns” that those experiences are anomalies.

That’s how bias turns into “fact.”

14 of 34

Data Shapes Reality

Ask:

Who is represented in the dataset?

Who isn’t represented?

Who benefits if the AI is right — and who is harmed if it’s wrong?

15 of 34

The data we train AI on is often scraped from online text and images — sources that already underrepresent or stigmatize disability.

That means AI models don’t just miss disabled users — they misinterpret them.

For example: image classifiers that flag wheelchairs as “objects,” or captioning tools that fail to understand nonstandard speech patterns.

When people are invisible to data, they become invisible to design.
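One way to make that visibility problem concrete: before training or fine-tuning, run a plain representation report on the dataset. The Python sketch below is a minimal illustration under assumed data — the `group` field, the labels, and the counts are all hypothetical, not drawn from any real dataset.

```python
from collections import Counter

def representation_report(samples, group_key="group"):
    """Count how often each group appears in a labeled dataset.

    `samples` is assumed to be a list of dicts with a `group` field;
    the field name and labels here are purely illustrative.
    """
    counts = Counter(s.get(group_key, "unlabeled") for s in samples)
    total = sum(counts.values())
    for group, n in counts.most_common():
        print(f"{group:>22}: {n:5d} samples ({n / total:.1%})")

# A tiny made-up dataset, just to show the report format.
dataset = (
    [{"group": "no visible disability"}] * 970
    + [{"group": "wheelchair user"}] * 20
    + [{"group": "white cane user"}] * 10
)
representation_report(dataset)
```

A report like this doesn't fix bias on its own, but it turns "Who isn't represented?" into a question with a number attached.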

16 of 34

Development — Build With, Not For

Inclusive AI development means:

  1. Co-designing with people with disabilities from the start.
  2. Stress-testing for edge cases, not ignoring them.
  3. Prioritizing transparency and explainability.

17 of 34

Empathy helps us understand pain points.

But empathy alone doesn’t change systems.

Accessibility does.

Accountability does.

Action does.

18 of 34

When Empathy Isn’t Enough

Empathy starts conversations.

Design changes outcomes.

Accessible AI means:

  • Multiple input modes (voice, text, touch, eye-tracking).
  • Clear, explainable feedback.
  • Compatibility with assistive tech.
  • Inclusive language and representation.

19 of 34

Accessibility = Usability for All

Captions → noisy rooms, language learning

Voice commands → hands-free convenience & multitasking

Text summaries → cognitive clarity

Predictive text → efficiency

High contrast → mobile glare

Features born from accessibility often help everyone.

Accessibility isn’t about disability.

It’s about good design.

20 of 34

UX Design Best Practices

  • Multiple input modes (voice, text, touch, eye-tracking, switch)
  • Compatibility with assistive technologies (screen readers, keyboard navigation, captioning, etc.)
  • Clear and consistent feedback loops
  • Predictable recovery paths when users make errors
  • Inclusive language and representation in visuals and copy

21 of 34

22 of 34

23 of 34

It starts with design.

Designing for Stress and Reality: Real life isn’t calm or consistent.

When users are under stress — in crisis, multitasking, or fatigued — accessibility becomes the difference between success and failure.

Design for that reality, not an idealized user.

And remember: designing for edge cases doesn’t just help “them” — it helps everyone.

24 of 34

Accessible AI anticipates stress, error, and chaos:

  • Clear, explainable feedback: Users should understand why the AI responded a certain way
  • Transparent system behavior: Show confidence levels, sources, or reasoning when possible
  • User control and override options: Let users correct or undo AI decisions
  • Training for trust: Set realistic expectations about AI capabilities and limits
  • Continuous learning and accessibility evaluation: Ensure AI adaptation improves usability for all, not just the “average” user
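As a small illustration of the first three bullets, here is a hypothetical Python sketch of an AI suggestion that surfaces its confidence, explains itself, and keeps an undo path in reach. The `AIDecision` shape, the 0.7 threshold, and the button labels are assumptions made for the example, not a reference implementation.

```python
from dataclasses import dataclass

@dataclass
class AIDecision:
    label: str          # what the model suggested
    confidence: float   # 0.0-1.0, shown to the user instead of hidden
    rationale: str      # short, plain-language explanation

def present(decision: AIDecision, low_confidence: float = 0.7) -> str:
    """Render a suggestion with its confidence, reasoning, and an override path."""
    msg = (f"Suggested: {decision.label} "
           f"(confidence {decision.confidence:.0%}). Why: {decision.rationale}.")
    if decision.confidence < low_confidence:
        # Low confidence: say so plainly and keep the user in control.
        msg += "\nNot sure about this one. [Accept] [Edit] [Undo]"
    return msg

print(present(AIDecision("Add captions to this video", 0.62,
                         "audio detected without a caption track")))
```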

25 of 34

Emerging Trends in AI

Generative Accessibility: AI creating real-time captions, alt text, and summaries.

Edge & On-Device AI: Faster, private, offline assistive experiences.

Predictive Personalization: Systems that anticipate accessibility needs.

Automated Accessibility Checks: AI flagging and fixing barriers in real time.

Wearable Assistance: Smart glasses and AR tools that describe the world.

Inclusive Language Models: Recognizing diverse accents, dialects, and speech patterns.

Neuroadaptive Design: Interfaces that respond to focus, fatigue, or stress.
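A note on the automated-checks trend: much of this tooling starts from simple rules-based scans before any model is involved. The sketch below, in standard-library Python with made-up markup, shows the kind of check that flags images published with no alt text at all.

```python
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Flag <img> tags that ship with no alt attribute at all."""

    def __init__(self):
        super().__init__()
        self.flagged = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            # alt="" is a legitimate signal for decorative images,
            # so only a completely missing alt attribute is flagged.
            if "alt" not in attrs:
                self.flagged.append(attrs.get("src", "<unknown src>"))

checker = MissingAltChecker()
checker.feed('<img src="team.jpg"><img src="logo.png" alt="Lenovo logo">')
print("Images missing alt text:", checker.flagged)
```

Generative tools can then draft the missing descriptions; the check just makes the gap visible so a human can review it.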

26 of 34

The Future of AI & Inclusion

We’re entering a fascinating stage where AI can personalize experiences — automatically adjusting reading level, contrast, or layout.

But personalization isn’t inclusion unless the user controls it. Autonomy matters.

Governments are starting to catch up — we’re seeing real policy movement around AI transparency and accessibility mandates.

The future isn’t “AI or accessibility.” It’s “AI because of accessibility.”

27 of 34

Risks to Watch

1. Overtrust and Opacity

  • Overconfidence bias: Users and teams may trust AI outputs blindly.
  • Lack of explainability: When users can’t tell why AI made a decision, accountability disappears.

2. Embedded Bias

  • Bias in employment screening: Algorithms can unintentionally filter out disabled candidates.
  • Amplified inequities: AI used in healthcare, credit scoring, or education can perpetuate existing discrimination.

3. Inaccessibility by Design

  • Inaccessible interfaces: AI tools often conflict with assistive technologies or rely solely on visual or voice input.
  • “Assistive” exclusion: AI products marketed as inclusive may not actually meet diverse accessibility needs.

4. Privacy and Protection

  • Data privacy risks: Disability-related data is often sensitive, yet poorly protected.
  • Surveillance through assistance: “Smart” monitoring tools can cross ethical lines, turning support into surveillance.

28 of 34

From Compliance to Culture

Shift the mindset:

Accessibility shouldn’t live in a checklist.

It should live in your culture.

  • From empathy → equity
  • From compliance → creativity
  • From kindness → competence

29 of 34

Compliance gets you to “minimum viable inclusion.”

Culture is what sustains it.

When accessibility is part of how teams define excellence, you don’t need to enforce it — people expect it.

And that’s the future of inclusive AI: not waiting for laws to tell us what’s right, but designing like human dignity is a feature requirement.

30 of 34

Let’s get to work.

If you build AI:

  • Include people with disabilities from day one.
  • Include accessibility in your definition of done.
  • Test with real assistive tools and technologies.
  • Budget for inclusion early and document accessibility decisions.

If you hire people:

  • Audit AI recruiting tools for accessibility and bias (see the sketch after this list).
  • Provide adaptive technology, accommodations, and training.
  • Offer flexible processes that value diverse communication and work styles.
  • Listen to employee feedback — then act on it.

If you lead:

  • Make accessibility a business KPI, not a side project.
  • Reward accessible innovation and inclusive design practices.
  • Fund accessibility as strategy — not charity.
  • Model inclusion in every decision and metric that defines success.
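For the recruiting-audit bullet above, one common starting point is comparing selection rates across groups with the four-fifths rule. The Python sketch below is a minimal illustration: the groups, counts, and outcomes are invented, and the 0.8 threshold is a screening heuristic, not a compliance determination.

```python
def selection_rates(outcomes):
    """outcomes: iterable of (group, advanced) pairs, e.g. from screening logs."""
    totals, advanced = {}, {}
    for group, passed in outcomes:
        totals[group] = totals.get(group, 0) + 1
        advanced[group] = advanced.get(group, 0) + int(passed)
    return {g: advanced[g] / totals[g] for g in totals}

def four_fifths_check(outcomes, reference_group):
    """Compare each group's selection rate to a reference group.

    A ratio below 0.8 is the classic 'four-fifths rule' warning sign for
    adverse impact -- a screening heuristic, not a legal determination.
    """
    rates = selection_rates(outcomes)
    baseline = rates[reference_group]
    return {g: round(r / baseline, 2) for g, r in rates.items()}

# Hypothetical screening log: (self-identified group, advanced to interview?)
log = ([("disclosed disability", True)] * 12
       + [("disclosed disability", False)] * 38
       + [("no disclosure", True)] * 30
       + [("no disclosure", False)] * 50)

print(four_fifths_check(log, reference_group="no disclosure"))
# {'disclosed disability': 0.64, 'no disclosure': 1.0}  -> 0.64 < 0.8: investigate
```

A ratio like 0.64 doesn't prove discrimination, but it's exactly the kind of signal an audit should surface before a screening tool quietly filters people out.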

31 of 34

Accessibility touches every role.

  • Leaders can invest.
  • Designers can advocate.
  • Developers can build better defaults.
  • Recruiters can challenge biased algorithms.

32 of 34

The next generation of AI could be the most inclusive technology we’ve ever built — if we choose to build it that way.

And remember: the tools we design today might be the very ones we depend on tomorrow.

33 of 34

AI gives us a choice:

To automate exclusion.

Or to design equity into the future of work.

34 of 34

Angela Martin

Thank you!

🌐 AngelaMartin.design

✍️ medium.com/@AngelaLMartin98

💬 @AngelaLMartin98 on LinkedIn

🙂 Book me on ADPList.org