Thing 18

AI for accessibility and inclusion

Last reviewed: March 2026 · 45–60 minutes

After three Things focused on what can go wrong with AI (hallucinations, bias, and privacy), we're changing the tone. Thing 18 is about something AI does remarkably well: removing barriers.

AI-powered accessibility features are changing how people interact with technology and the world around them. Live captioning means a deaf colleague can follow a video call in real time. Text-to-speech with natural-sounding AI voices means a visually impaired team member can have any document read aloud, not in a robotic monotone but in a voice that's genuinely pleasant to listen to. Real-time translation means a conversation between people who don't share a common language can happen without a human interpreter. Visual description tools mean someone who is blind can point their phone at a scene and get a detailed, contextual description of what's in front of them.

These aren't futuristic concepts. They're features that exist right now, most of them built into the devices and software you already use. And they're not just relevant to people with disabilities. They're useful for anyone working in noisy environments, consuming content in a second language, or simply preferring to listen rather than read. The concept of the "curb cut effect" applies here: features designed to help people with specific access needs often end up benefiting everyone.

What makes this moment different from earlier generations of assistive technology is the quality. AI has closed the gap between "functional but awkward" and "actually good." Live captions are accurate enough to follow a fast-paced conversation. AI voices sound natural enough that you stop noticing they're synthetic. Translation is fluent enough to be useful in real professional contexts. This shift in quality changes accessibility from something that merely exists to something that genuinely works, and that distinction makes a real difference to the people who depend on it.

This Thing asks you to explore these tools firsthand, not because you necessarily need them yourself, but because understanding what's available helps you make better decisions about how you work, how you communicate, and how you design experiences that include rather than exclude the people around you.


Where AI is making the biggest difference

From live captioning to visual description, AI-powered accessibility tools are making technology genuinely usable for everyone.

AI accessibility tools broadly fall into five categories, and you've already encountered elements of several of them during this programme.


Why this matters beyond disability

There's a risk, when talking about accessibility, of framing it as something that only concerns a specific group of people. But the numbers tell a different story. The World Health Organisation estimates that 1.3 billion people globally (roughly 16% of the population) experience significant disability. In the UK, there are approximately 16 million people with a disability, millions of whom are of working age and in employment. Almost certainly, someone you work with, provide services to, or communicate with regularly has access needs that these tools could help address.

Beyond diagnosed disabilities, there are many situational access needs. A parent with a sleeping baby who needs to follow a video call on mute. Someone recovering from eye surgery who can't read a screen for a fortnight. A colleague in a noisy open-plan office who can't hear their headphones clearly. A team member whose first language isn't English trying to follow a fast-moving group discussion. AI accessibility tools serve all of these situations.

There's a professional angle too. If your work involves creating content, running meetings, or communicating with the public, knowing what accessibility tools exist helps you design processes that work for more people from the start, rather than retrofitting accommodations afterwards. That's the shift from "accessible on request" to "inclusive by design", and AI is making it substantially easier to achieve.


Resources to explore

Be My Eyes

A free app connecting blind and low-vision users with volunteers and AI-powered visual assistance. Available on iOS, Android, and Windows. The Be My AI feature provides detailed image descriptions using GPT-4.

Visit site
Microsoft Accessibility features in Windows

Microsoft's guide to live captions in Windows 11, including translation features on Copilot+ PCs.

Read guide
Apple Accessibility

Apple's overview of accessibility features across iPhone, iPad, and Mac, including Live Captions, VoiceOver, and Visual Intelligence.

Visit site
Google Accessibility

Google's accessibility hub covering Android features, Lookout, Live Caption, and Chrome accessibility tools.

Visit site
Inclusive design at Microsoft

Microsoft's inclusive design toolkit and methodology. Useful background reading on why accessibility matters in professional contexts.

Visit site
How professionals with disabilities use AI tools at work (InclusionHub)

Practical examples of how AI accessibility tools are being used in real workplace settings.

Read article

Activity: exploring AI accessibility tools

45–60 minutes · Computer + smartphone · Built-in features + optional free apps

This activity asks you to try at least three AI accessibility features firsthand and think about how they could make a practical difference in your working context. You won't need any specialist equipment; everything here uses tools that are either built into your existing devices or available as free apps.

The point isn't to become an expert in assistive technology. It's to experience these tools well enough that when you're thinking about how to make your work more inclusive (whether that's running a meeting, sharing a document, or designing a process), you know what's available and how well it works.

  1. Try live captioning. Use whichever route your setup offers: the captions built into Windows 11 or Apple devices, or the live captioning feature in your video-calling tool.
  2. Try text-to-speech. Find a piece of written content (a blog post, a news article, or a document you've been meaning to read) and have it read aloud using AI-powered text-to-speech.
  3. Try a visual or translation tool. For this step, choose one of the following based on what interests you and what devices you have available: a visual description app (such as Be My Eyes or Google Lookout), a real-time translation feature, or a document accessibility checker.
  4. Write your accessibility report. Compile your findings into a short report covering the three tools you tested.
Privacy reminder: use personal documents and examples for this activity, not work materials. If you run the document accessibility checker (Option C), use a personal document rather than anything connected to your employer.

Why this matters

Ticking a box for digital inclusion isn't really the point here. This activity builds the kind of awareness that changes how you think about the things you create and the processes you run.

Most professionals have never tried live captions, never listened to their own documents being read aloud, never experienced what it's like to rely on an AI description of something they can't see. That gap between "knowing accessibility exists" and "having actually experienced it" matters. Once you've heard live captions stumble over a colleague's name or mispronounce a technical term, you understand why clear audio quality matters. Once you've heard your own document read aloud and noticed how confusing it sounds without proper heading structure, you understand why document formatting is an accessibility issue.

The tools you've tested are impressive, but they're not perfect, and that's important too. Understanding both what these tools can do and where they struggle puts you in a much better position to make thoughtful decisions about accessibility in your professional life. Not every situation calls for every tool, and knowing the limitations helps you choose the right approach.

The broader point connects to something that's been a thread throughout this programme: AI is most powerful when it augments human capability rather than replacing it. Accessibility is perhaps the clearest example of this. These tools don't eliminate disability; they remove specific barriers that technology can address, freeing people to contribute more fully. That benefits everyone.


Claim your Open Badge

Submit your accessibility report with your screenshots and notes from testing the three accessibility tools, your assessment of each tool's accuracy and usability, your reflection on who could benefit and what you'd change about your working practices, and your specific scenario for improving inclusion.

Thing 18: AI for accessibility and inclusion

Submit your accessibility report and reflection to claim this badge via cred.scot.

Claim now

What's next

Thing 18 wraps up Phase 4 of the programme, the critical thinking and responsibility phase that's taken you through hallucinations, bias, privacy, and accessibility. You now have a solid framework for using AI thoughtfully: appreciating its capabilities, understanding its limitations, making informed choices about your data, and thinking about who gets included and who gets left out. In Thing 19, we move into the final phase of the programme with advanced applications. Thing 19 explores using AI for learning and personal development, from using a chatbot as a personalised tutor to AI-powered language learning and study tools.