AI & Accessibility
Designing an Equitable Future
Angela Martin
Because while artificial intelligence is reshaping how we work, communicate, & hire
— accessibility determines who gets to participate in that future.
Hi, I’m
Angela Martin
Lenovo
Design Storming
Advisory UX Designer, Lenovo | Chair, ABLE ERG
Certified Humane Technologist
Co-Founder of Design Storming, LLC
Author of This Is Not a Dream
Accessibility isn’t niche.
It’s the foundation of inclusive innovation.
1.3 billion people globally live with a disability.
That’s 1 in 6 people — and counting.
Disability is not rare — it’s the world’s largest minority group.
TLDR:
Accessibility isn’t just about ramps or captions. It’s about access to opportunity, to employment, to independence.
AI now touches every part of work — from how résumés are screened, to how people communicate on teams, to how we navigate systems.
If AI is inaccessible, it doesn’t just fail technically — it excludes people from participating in the digital economy.
And exclusion, at AI scale, happens faster than ever.
The Promise & Peril of AI
AI can:
Generative AI can write alt text automatically — huge win. But the same system can also misdescribe an image or erase nuance.
Voice recognition can empower people with mobility limitations — but fail spectacularly when faced with speech differences or AAC devices.
Technology is neutral only until we decide what — and who — it’s for.
So the question isn’t “Can AI help?”
It’s “Who is it helping — and who is it leaving behind?”
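The alt-text tension above can be made concrete. A minimal sketch, assuming a hypothetical `Caption` result from any captioning model: publish a machine caption only when the model is confident, and route low-confidence guesses to a person instead of shipping a misdescription.

```python
from dataclasses import dataclass

@dataclass
class Caption:
    text: str
    confidence: float  # model's self-reported score, 0.0-1.0 (hypothetical)

def alt_text_or_review(caption: Caption, threshold: float = 0.8) -> tuple[str, bool]:
    """Return (alt_text, needs_human_review).

    Low-confidence machine captions are withheld and queued for a human,
    so an AI guess never silently replaces real access.
    """
    if caption.confidence >= threshold:
        return caption.text, False
    return "", True  # don't publish the guess; ask a person

# A confident caption ships; an uncertain one is flagged for review.
good = alt_text_or_review(Caption("A person using a wheelchair boards a bus", 0.93))
weak = alt_text_or_review(Caption("an object on a ramp", 0.41))
```

The threshold and the review queue are design choices, not fixed answers — the point is that the fallback path exists at all.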
The 5 Lenses of AI & Accessibility
To build equitable AI, we have to look through five lenses:
Development: How data and models encode bias.
Design: How users experience and recover from failure.
Social Impact: How AI reshapes access and equity.
Future: How we govern emerging tools.
Accessibility: Not a section, but a thread through all of it.
These five lenses build on one another.
Think of them as a feedback loop — from intention to implementation to impact.
Accessibility isn’t just the last step; it’s the connective tissue.
“Technology that isn’t built for everyone isn’t innovation — it’s just automation for some.”
AI doesn’t create bias — it scales it.
If we build systems on top of inequitable structures, they replicate those structures faster, cheaper, and with a friendlier interface.
Social Impact: Equity or Exclusion
When AI systems ignore marginalized users, they:
But when designed inclusively, they:
It’s not about bad intent — it’s about neglect multiplied by technology.
The Data Problem
AI learns from what we feed it — and the internet is full of bias.
If disability is underrepresented or misrepresented in datasets, the system “learns” that those experiences are anomalies.
That’s how bias turns into “fact.”
Data Shapes Reality
Ask:
Who is represented in the dataset?
Who isn’t represented?
Who benefits if the AI is right — and who is harmed if it’s wrong?
The data we train AI on is often scraped from online text and images — sources that already underrepresent or stigmatize disability.
That means AI models don’t just miss disabled users — they misinterpret them.
For example: image classifiers that flag wheelchairs as “objects,” or captioning tools that fail to understand nonstandard speech patterns.
When people are invisible to data, they become invisible to design.
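One way to catch this invisibility early is a simple representation audit before training. A minimal sketch, assuming each example already carries a group tag (a real audit would use richer metadata and a defensible threshold):

```python
from collections import Counter

def representation_report(labels, groups_of_interest, floor=0.05):
    """Flag groups whose share of a labeled dataset falls below a floor.

    `labels` is one group tag per example; `floor` is the minimum share
    (here 5%, an illustrative choice) below which a group counts as
    underrepresented.
    """
    counts = Counter(labels)
    total = sum(counts.values())
    report = {}
    for group in groups_of_interest:
        share = counts.get(group, 0) / total
        report[group] = (share, share < floor)
    return report

# Toy corpus: disability-related examples are only 2% of the data.
labels = ["general"] * 98 + ["disability"] * 2
report = representation_report(labels, ["general", "disability"])
```

A report like this doesn't fix bias, but it makes the gap visible before the model "learns" that the gap is normal.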
Development — Build With, Not For
Inclusive AI development means:
#1 Co-designing with people with disabilities from the start.
#2 Stress-testing for edge cases, not ignoring them.
#3 Prioritizing transparency and explainability.
When Empathy Isn’t Enough
Empathy helps us understand pain points.
But empathy alone doesn’t change systems.
Accessibility does.
Accountability does.
Action does.
Accessible AI means:
Read more: Making the Business Case for Accessibility
Empathy starts conversations.
Design changes outcomes.
Accessibility = Usability for All
Captions → noisy rooms, language learning
Voice commands → hands-free convenience & multitasking
Text summaries → cognitive clarity
Predictive text → efficiency
High contrast → mobile glare
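The high-contrast item is even measurable: WCAG defines contrast ratio as (L1 + 0.05) / (L2 + 0.05) over relative luminance, and AA body text requires at least 4.5:1. A minimal sketch of that published formula:

```python
def relative_luminance(rgb):
    """WCAG 2.x relative luminance for an 8-bit sRGB color."""
    def channel(c):
        c /= 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(v) for v in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """(L1 + 0.05) / (L2 + 0.05), with the lighter color as L1."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black on white is the maximum possible ratio, 21:1.
ratio = contrast_ratio((0, 0, 0), (255, 255, 255))
```

A check like this is exactly the kind of accessibility rule a tool — or an AI — can enforce automatically.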
Features born from accessibility often help everyone.
Accessibility isn’t about disability.
It’s about good design.
UX Design Best Practices
It starts with design.
Designing for Stress and Reality: Real life isn’t calm or consistent.
When users are under stress — in crisis, multitasking, or fatigued — accessibility becomes the difference between success and failure.
Design for that reality, not an idealized user.
And remember: designing for edge cases doesn’t just help “them” — it helps everyone.
Accessible AI anticipates stress, error, and chaos.
Emerging trends in AI
Generative Accessibility: AI creating real-time captions, alt text, and summaries.
Edge & On-Device AI: Faster, private, offline assistive experiences.
Predictive Personalization: Systems that anticipate accessibility needs.
Automated Accessibility Checks: AI flagging and fixing barriers in real time.
Wearable Assistance: Smart glasses and AR tools that describe the world.
Inclusive Language Models: Recognizing diverse accents, dialects, and speech patterns.
Neuroadaptive Design: Interfaces that respond to focus, fatigue, or stress.
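The "automated accessibility checks" trend starts simpler than it sounds. A minimal sketch using only Python's standard-library HTML parser, flagging one of the most common barriers — images with no alt attribute:

```python
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Collect positions of <img> tags that lack an alt attribute —
    a common, highly automatable accessibility barrier."""

    def __init__(self):
        super().__init__()
        self.violations = []

    def handle_starttag(self, tag, attrs):
        if tag == "img" and "alt" not in dict(attrs):
            self.violations.append(self.getpos())  # (line, offset)

checker = MissingAltChecker()
checker.feed('<p><img src="chart.png"><img src="logo.png" alt="Lenovo logo"></p>')
```

Real tools (and the AI-assisted ones emerging now) go much further — contrast, focus order, ARIA semantics — but the shape is the same: detect the barrier before a user hits it.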
The Future of AI & Inclusion
We’re entering a fascinating stage where AI can personalize experiences — automatically adjusting reading level, contrast, or layout.
But personalization isn’t inclusion unless the user controls it. Autonomy matters.
Governments are starting to catch up — we’re seeing real policy movement around AI transparency and accessibility mandates.
The future isn’t “AI or accessibility.” It’s “AI because of accessibility.”
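The autonomy principle — personalization only counts as inclusion when the user controls it — can be sketched in one line of merge logic (setting names here are illustrative): AI suggestions are defaults, and explicit user choices always win.

```python
def effective_settings(ai_suggested: dict, user_chosen: dict) -> dict:
    """Merge adaptive AI suggestions with user preferences.

    User-chosen values are applied last, so an explicit choice always
    overrides whatever the model inferred.
    """
    return {**ai_suggested, **user_chosen}

settings = effective_settings(
    {"contrast": "high", "reading_level": "simplified"},  # the model's guess
    {"reading_level": "standard"},                        # the user's override
)
```

The design choice matters more than the code: the system may propose, but it never silently decides.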
Risks to Watch
1. Overtrust and Opacity
2. Embedded Bias
3. Inaccessibility by Design
4. Privacy and Protection
From Compliance to Culture
Shift the mindset:
Accessibility shouldn’t live in a checklist.
It should live in your culture.
Compliance gets you to “minimum viable inclusion.”
Culture is what sustains it.
When accessibility is part of how teams define excellence, you don’t need to enforce it — people expect it.
And that’s the future of inclusive AI: not waiting for laws to tell us what’s right, but designing like human dignity is a feature requirement.
Let’s get to work.
If you build AI:
If you hire people:
If you lead:
Accessibility touches every role.
The next generation of AI could be the most inclusive technology we’ve ever built — if we choose to build it that way.
And remember: the tools we design today might be the very ones we depend on tomorrow.
AI gives us a choice:
To automate exclusion.
Or to design equity into the future of work.
Angela Martin
Thank you!
🌐 AngelaMartin.design
✍️ medium.com/@AngelaLMartin98
💬 @AngelaLMartin98 on LinkedIn
🙂 Book me on ADPList.org