AI Unchecked: Are We Ignoring the Rights of the Disabled in the Digital Rush?

Artificial intelligence (AI) has the potential to improve our lives substantially. Right now, however, AI is being developed in a way that makes "AI and disability rights" sound like an oxymoron. Developers are creating AI technology for and around people with disabilities instead of with them, just as they have developed AI around, rather than with, people of color.

The definition of disability depends on who is defining it. In "Automating Ableism," Smith (2024, no relation) defines a disability as something that causes a physical, emotional, or intellectual impairment.

Some people with disabilities are visibly disabled, such as someone who uses a wheelchair, while many others look “normal,” such as someone with asthma who cannot walk very far at a time.

Traditionally, disabilities have been viewed as a deficit, and the lives of people with disabilities as worth less than those of people without. After all, someone in a wheelchair cannot plow a field or sow seeds to feed a family.

Add in the view of many that God made people disabled to punish them or their families for sins, and the disabled were hidden away in shame. The Nazis took that to the logical extreme and “euthanized” the disabled. The United States went through a period of enforced sterilization of those deemed undesirable, too. 

Fortunately, we don’t do that anymore. However, according to an article in the Harvard Gazette (O’Grady, 2024), the treatment of people with disabilities still frequently reeks of paternalism and of viewing people with disabilities as less than.

Technologies developed for people with disabilities “often rely on two assumptions: that many people are faking or exaggerating their disabilities, making fraud detection critical, and that a life with a disability is not a life worth living” (Smith, 2024). I believe both assumptions are false.

In 2002, I went with a friend to Hungary, where she competed in the World Agility Paralympics with two of her dogs. I was the luggage and dog handler because she used a power wheelchair and could not manage them herself.

We flew from the United States to Munich and on to Budapest, Hungary. In Germany, we ran into a problem. Two men carried my friend onto the plane and put her in her seat. As I came up the stairs, the pilot came out of the cockpit and said he wanted my friend, her dogs, and that wheelchair off his plane.

I discussed the situation with the purser while holding the two dogs, one of whom was my friend’s service dog. When he saw how well behaved the dogs were, the purser let us sit down. My friend was totally excluded from the conversation, and no one would tell her what was happening.

In Budapest, a wheelchair-accessible van and a driver were waiting for us. But we got a nasty shock: the pilot had made the baggage handlers take my friend’s power wheelchair off his plane. The airline promised to deliver it the next day, but the competition was the next day.

They finally delivered it at about 9:30 that night. Every connector in the electrical system feeding the wheelchair’s control computer had been disconnected, and there were no wheelchair mechanics in Hungary at the time. I managed to plug everything back in, and it worked, but that was sheer luck.

We noticed everyone frowning at us as we went about the town. We finally asked if people there had something against Americans. Our hosts explained that the people there were poor. We obviously had more money than they did, and they felt the expensive wheelchair and expensive clothes my friend wore were wasted on someone who was disabled. Her wheelchair cost as much as they made in a year.

Returning to the States, we had a four-hour layover in Munich and planned to walk the dogs. Instead, we were taken to a room full of people with disabilities, staffed by two police officers, and were not allowed to leave until we boarded our plane.

Such attitudes may no longer be stated explicitly in the United States, but you could not tell from the way AI is being developed. People with disabilities feel excluded from the conversation.

Undoubtedly, AI technology can help people with disabilities if it is properly designed. For example, Midjourney’s bot can generate alt text from an image, or an image from a text description, letting people who are blind or have low vision read a description of an image.

Sadly, in most cases, AI technology hurts people with disabilities. The problems range from biased AI-assisted job screening that filters out applicants with disabilities to difficulties with AI-assisted health care.

Smith and Smith (2021) discussed their own problems with speech technology. The elder Smith could place a phone call with speech technology but could not hang up with it. Writing a paragraph with speech recognition software took him as long as writing two pages had taken when he was able-bodied.

His daughter was blind and tried to use image recognition software to find which cereal box she wanted but gave up in frustration and had to ask her partner for help. Speech recognition does not work for people who stutter, so sometimes Smith could not get the program to work when he needed to call the pharmacy to get his medication. 

Lawrence Weru, an associate in biomedical informatics at Harvard Medical School and a person who stutters, has this to say about AI technology and disability: “If we’re creating tools that we know are fed with information that can bias against certain groups, and we integrate those into very crucial aspects of our lives, what’s going to be the impact of that?”

“That’s a concern that I hope people would be having enough foresight to try to address in advance, but historically accessibility is usually something that’s treated as an afterthought.” 

According to Smith (2024), one problem during the COVID pandemic was that people with disabilities were less likely to be allocated scarce resources because their lives were judged to be worth less. Healthcare companies increasingly use quality-adjusted life years (QALYs) to decide whether a given treatment is cost-effective.

The potential for AI to violate the rights of people with disabilities is obvious. Someone who cannot work or cannot leave their home could be marked down for their disabilities, and if the resulting QALY score is too low, an insurer will deny the treatment as not cost-effective.
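The arithmetic behind a QALY calculation is simple enough to sketch, and the sketch shows exactly where the bias enters: the “quality weight” assigned to life with a disability. A minimal illustration follows; the costs, weights, and payer threshold here are invented for illustration, not drawn from any real insurer.

```python
def cost_per_qaly(treatment_cost, life_years_gained, quality_weight):
    """Cost per quality-adjusted life year.

    quality_weight: 1.0 means a year in "perfect" health; lower values
    discount years lived with a disability or chronic condition.
    """
    qalys_gained = life_years_gained * quality_weight
    return treatment_cost / qalys_gained

# Hypothetical payer threshold: dollars per QALY above which care is denied.
THRESHOLD = 100_000

# Same treatment, same cost, same five added years of life:
nondisabled = cost_per_qaly(150_000, 5, quality_weight=1.0)   # 30,000 per QALY
disabled = cost_per_qaly(150_000, 5, quality_weight=0.25)     # 120,000 per QALY

print(nondisabled <= THRESHOLD)  # True  -> treatment approved
print(disabled <= THRESHOLD)     # False -> denied as "not cost-effective"
```

The treatment, its cost, and the years of life gained are identical in both cases; only the quality weight differs. A low enough weight for a disabled patient pushes the same treatment over the threshold.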

AI and disability rights do not have to be mutually exclusive. Yet O’Grady (2024) reports that AI technologies are currently trained in ways that do not account for people with disabilities and their experiences in the world.

Data from people with disabilities is often discarded as an “outlier” instead of being used to train AI systems. If these problems are to be rectified, disabled voices in tech must be included from the conception of AI technologies.

The voices of people with disabilities concerning AI accessibility are beginning to be heard. The Biden Administration has published a Blueprint for an AI Bill of Rights, a document that acknowledges documented problems with bias in AI. While it does not contain everything it should, it is nevertheless an attempt to make inclusive technology available to all.

At the same time, the U.S. Census Bureau proposed changing the definition of disability in the American Community Survey beginning in 2025, a change that would have cut nearly 10 million disabled women and girls from the count. No disability organizations were consulted.

Only after major pushback from disabled people and the organizations that serve them was the plan dropped. The Census Bureau also agreed to consult the disabled community before making further changes (Ditkowsky & Robbins, 2023).

People with disabilities must be involved in developing AI technologies from conception. Anything less means AI and disability rights are a farce. Some AI technologies go unused because they simply don’t work for people with disabilities.

Part of involving people with disabilities in developing AI technologies has to be asking what problems they have experienced and how they want these problems solved.  

Help isn’t help if it is unwanted. Right now, AI is still developed for people with disabilities instead of with people with disabilities.

In fact, the conversation needs to shift from seeing disabilities as the problem to seeing the socio-economic, educational, political, environmental, and cultural barriers that keep people with disabilities from living their version of their best lives as the problem. AI holds great promise for dismantling those barriers, but its development has to include all voices to accomplish that.

References

Bertelson, R. (2024, February 5). The Future is Inclusive: Leveraging Generative AI for PoC Empowerment and Growth. The Inclusive AI. https://theinclusiveai.com/the-future-is-inclusive-generative-ai/#more

Biden Administration. (2023). Blueprint for an AI Bill of Rights: Making Automated Systems Work for the American People. Accessed April 18, 2024. https://www.whitehouse.gov/ostp/ai-bill-of-rights/

Brown, S. (2015). Disability culture and the ADA. Disability Studies Quarterly, 35(3). Accessed April 18, 2024. https://dsq-sds.org/index.php/dsq/article/view/4936/4062

Ditkowsky, M., & Robbins, K. (2023, December 5). New census proposal would reduce disabled women, girls counted by nearly 10 million. National Partnership for Women & Families blog. Accessed April 18, 2024. https://nationalpartnership.org/new-census-proposal-would-reduce-disabled-women-girls-counted-nearly-10-million/

Midjourney. Documentation. Accessed April 18, 2024. https://docs.midjourney.com/

O’Grady, E. (2024, April 3). Why AI fairness conversations must include disabled people. The Harvard Gazette. Accessed April 18, 2024. https://news.harvard.edu/gazette/story/2024/04/why-ai-fairness-conversations-must-include-disabled-people/

Smith, P., & Smith, L. (2021). Artificial intelligence and disability: Too much promise, yet too little substance. AI and Ethics, 1, 81–86. Accessed April 18, 2024. https://link.springer.com/article/10.1007/s43681-020-00004-5

Smith, S. E. (2024, February 14). Automating Ableism. The Verge. Accessed April 17, 2024. https://www.theverge.com/24066641/disability-ableism-ai-census-qalys

The Leadership Conference on Civil and Human Rights. (2023). Open Letter. Accessed April 18, 2024. https://civilrights.org/resource/next-steps-to-advance-equity-and-civil-rights-in-artificial-intelligence-and-technology-policy/
