A path to independence: Navigating accessibility through AI
Artificial intelligence can be a life-enhancing tool for people with disabilities, but only when equity is at the forefront
Editor’s note: This article has been updated to give one of the sources a pseudonym in order to protect their anonymity.
Bailey Anderson, whose real name The Runner has agreed to keep confidential, has had severe hearing loss in one ear and profound loss in the other since she was a month old due to complications with a premature delivery.
“I was born along with a twin sister. So, with twins, there are always complications.”
In 2020, they joined Simon Fraser University’s Disability and Neurodiversity Alliance, a student-run group that advocates for and supports neurodivergent people through events and support group sessions.
Anderson relies on visual cues such as automated captions and lipreading during classes. Lipreading can be difficult when their instructor walks around while lecturing, so they often have to fall back on captions.
She has been using ChatGPT in her coding classes when she misses something her instructor says during lectures. The chatbot helps her understand some of the material visually.
“It’s really hard for me to grasp in class the technical aspect, because I have to follow along every minute, and I cannot do that because of my disability,” they say.
While Anderson says using artificial intelligence (AI) has been effective for studying, it is not something she enjoys. Unless there are ways to make the technology ethical, so that it doesn’t deplete resources, violate people’s privacy, or harm people from marginalized communities, Anderson wouldn’t be keen on using AI.
AI has come to define the zeitgeist of the 21st century and continues to reshape the world around us, from cybersecurity and cars to online shopping, homework, smart homes, infrastructure, and much more.
With advancing trends in the AI industry, the market is projected to reach an estimated $407 billion by 2027.
A research study by McKinsey and Company suggests that the technology could displace between 400 million and 800 million workers and affect around 15 per cent of the hours worked globally by 2030.
Today, AI is used to make life easier in a myriad of ways, such as by responding to texts or emails, answering questions about finances, planning a trip, prepping for a job interview, and summarizing complex concepts. The technology can do it all, from being a therapist to helping with homework.
But experts say there is uncertainty in this whirlwind of all-encompassing technology. Questions about whether it’s equally helpful to everyone, whether it considers people with disabilities and those who use mobility devices, and whether it’s inherently biased in the way it’s trained all remain on the table.
Christo El Morr, a professor of health informatics at York University’s School of Health Policy and Management, defines AI as the field of research that tries to imitate human intelligence. It helps computers think in a certain manner, he says.
“Of course, it’s far away from being intelligent, and even the word human intelligence needs to be discussed, really.”
El Morr says the first phase of AI involves training it. He explains how features on our smartphones, such as Siri, learn from our speech to understand our accent and the way we speak.
Speech recognition is a form of assistive technology, often used to help individuals with visual or mobility impairments control and command the devices around them. Other assistive technologies, such as text-to-speech and speech-to-text, are also widely used by people with learning disorders such as dyslexia.
The Mayo Clinic describes learning disorders as occurring when the brain processes and “[works] with information in a way that is not typical. It keeps a person from learning a skill and using it well.”
These disorders affect skills such as reading, writing, math, understanding language, socializing, and more.
Students with learning disorders are often allowed to use gadgets to help with daily tasks and assignments. El Morr says this often helps facilitate communication and interactions with digital content.
AI can also help people who need mobility assistance navigate spaces. In his work, El Morr uses the medical model as a starting point for understanding and speaking about disability.
“We are used to speaking about disability using words like ‘someone who lacks vision,’ ‘someone who cannot walk,’ or ‘someone who cannot hear or cannot speak,’” he says. “Basically, it’s about an individual who lacks a physical functionality, and that’s the medical model.”
El Morr says this way of speaking is often reductive of the problems and realities of people with disabilities, and that society has since evolved in its understanding of them.
“If a person cannot enter a building, it’s not because that person cannot walk, even if they have a wheelchair, [but] the building is not adjusted in a certain manner to allow the person with the wheelchair to step into the building.”
He says that to understand the reality of disability, we first need to delve into its social aspects.
“If you don’t have a scope for the person with a wheelchair to enter the building, then it’s not the person who is disabled. It’s the society who is disabling that person.”
The United Nations Convention on the Rights of Persons with Disabilities states that nations should take measures to ensure equitable access to the physical environment.
Without the use of assistive technologies, society is unaccommodating and not adjusted to support people with disabilities, El Morr says.
In his research, El Morr studies AI bias and the absence of a disability justice approach as technology becomes more prevalent. The approach goes beyond just traditional disability rights, he says.
While disability rights focus on legal protections and equal opportunities, disability justice is more holistic and explores how different identities shape the experience of disability.
It also addresses the intersecting social, economic, and political systems that lead to the marginalization and oppression of people with disabilities. Disability justice emphasizes the importance of creating an equitable world and “gives a leadership role to the most impacted.”
“[It’s] more than just the rights. I might have a right, but I might not be able to practice it because I have a disadvantage economically, and the political system is marginalizing me,” El Morr says.
It’s important to look at the intersection of identities and the roles they play for people with disabilities, as people experience disability in different ways, he adds. Factors such as race, class, gender, and immigration affect those experiences, he says.
While AI technology isn’t consciously biased, it relies on data, which often contains biased information, to perform its tasks.
“Whatever is in the data will be reflected in the AI,” El Morr says.
The technology hasn’t necessarily been trained to make predictions that account for the different characteristics of a population, and it’s important to make sure AI isn’t biased against certain groups, he says.
One example of such bias involved a company using the technology to recruit people for management positions. The AI scanned several resumes and, based on the company’s existing data, consistently recommended a man for the position instead of a woman, since the company had a history of hiring men for management roles.
“So, the AI learned that being a man is a characteristic that is linked to being a manager. It’s not the AI that thought that way, but it learned from the data,” El Morr says.
To enhance accessibility for people with disabilities, El Morr says AI should be trained on data that represents them in order to avoid bias. When conducting research on people with disabilities, he says, it is also important to include them in the design and development of the tools meant to enhance accessibility.
“There is a slogan in the disability world that states ‘Nothing about us without us.’ And this should apply also to AI and any AI-based product that is dedicated for people with disabilities.”
El Morr says inclusive teams should be created to design, implement, and test AI tools for people with disabilities. People trained in the social impact of these tools should also be part of the development process, he says, adding there is a need to ensure unbiased AI training and to create ethical guidelines that make the technology more transparent.
“The AI system should be explainable, because if there is decision making to be done based on an AI prediction, for example, if an AI tool [leads to] people [taking] a bad decision, based on the AI feedback, then this feedback should be clear and explainable,” he says.
Seanna Takacs, learning specialist and practice lead at Kwantlen Polytechnic University’s accessibility services, says AI plays two main roles in accessibility: supporting communication and supporting executive functions, which refer to the way we plan and organize ourselves.
AI tools are being used to create independence for people with disabilities in various ways. Transcription supports deaf students’ access to online lectures and videos, image-description technology converts pictures into words for people with visual impairments, and there are features to help students overcome writer’s block, among other uses.
Takacs says the effectiveness of AI tools should be improved by including more disabled voices in their development, which would create more diverse datasets and help prevent AI bias.
“Preventing that bias is really crucial for continuing to create and expand tools that are useful to disabled users,” she says. “We always have to remember to check in on accessibility with the people who are using the tools who need those accessibility measures.”
The users of AI tools should be sought out to gather information on how these tools have helped them, what they use them for, and how their use and application can be broadened, Takacs says.
She also says assistive technology can incorporate various functions, which can be customized and adapted to different users.
At KPU, Takacs says she has many students who use AI in different ways, whether it’s for planning and organization or for supporting communication.
“We are really trying to create pathways to meaningful inclusion when we’re looking at AI tools,” she says.
Takacs is excited to see the expansion of AI tools and how they support accessibility in the future.
Keegan Newberry is the assistant director of assistive technology at the Developmental Disabilities Association, an organization that provides community-based programs and services to people with developmental disabilities.
The association has been making use of goblin.tools, a platform that includes a series of single-task AI tools created to help neurodivergent people with tasks throughout the day.
The developer of the platform, Bram De Buyser, is an AI, software, and data engineer. The platform features different tabs that break down tasks to help people go through each step, Newberry says.
They say it is nice to have accessible and free AI tools since cost is a big barrier when working with technology in their field. The platform also helps individuals at the association with executive functioning, which involves planning, organizing, memory, motivation, and time awareness.
Newberry says the purpose of using AI is to support skills and bridge gaps where deficits exist, rather than to replace those skills. While some clients need support staff to act as intermediaries to help them access the platform, Newberry says many people can use it independently.
“We’re trying to give as little support [to individuals] as they need to be successful to encourage as much independence as possible.”
The association also offers assistive technology classes where they introduce new applications for individuals with developmental disabilities to try out every month.
“I think that in the long term, AI has the capacity to help a lot of our clients be able to be more independent in the community and that’s the goal,” Newberry says.
Stan Leyenhorst was just a teenager when he got into an accident that left him quadriplegic, meaning paralyzed in all four limbs. He has been using a wheelchair for 47 years and is the founder of Universal Access Design (UAD), a Surrey-based organization that aims to enhance accessibility through universal design.
Leyenhorst founded UAD in 2017 after working at the Rick Hansen Foundation, a charity that aims to remove barriers for people with disabilities.
He says there has always been a need for people with lived experience of disability to be included in building environments that are accessible to everyone.
The 2019 Accessible Canada Act, which aims “to ensure a barrier-free Canada,” and the 2021 Accessible British Columbia Act, which outlines a framework for the government to work with people with disabilities to remove barriers, both require a certain level of accessibility in public spaces, Leyenhorst says, “but experience tells us that it’s never good enough.”
He also says there is a difference between universal and accessible design.
“[With accessible design] you might remove the barrier for somebody who’s in a wheelchair, but you might introduce a barrier at the same time for somebody who uses a white cane and who’s blind.”
While accessible design is implemented to remove barriers for a certain demographic of the population, universal design strives to provide access for as many people as possible and create an environment that anyone can use.
Universal design, however, is a fairly new approach, Leyenhorst says. Emerging in the United States in 1979, the approach gradually expanded to include people who are visually impaired.
“[With universal design] when I go into a public space, I can access it however I want.”
While Leyenhorst has started using AI to compile reports, he doesn’t use it for navigation. He says he is reluctant to dive further into the technology just yet since he’s uncertain about the consequences.
“Let’s build an environment where it actually doesn’t matter whether you have any type of disability, anybody can use it,” he says.
That is exactly what Leyenhorst helps various public and private groups with. Through consultations, he draws on his lived experience to offer perspective on how universal design can be implemented.
“A world in which [AI] did not exploit the artist, creative labour, and environmental resources, I would absolutely say it could be used as an accessibility tool,” Anderson adds.