How ableist algorithms dominate digital spaces

The first interaction I remember having with an algorithm as a fairly online disabled person came after logging into Facebook in my teens. I was a parasport athlete at the time – specifically a wheelchair basketball player – and so my little sidebar offered a number of ads associated with both playing basketball and being a wheelchair user, ads that made sense in the abstract but didn’t work in practice. Two I remember seeing were for catheters, a product I do not need, and for basketball shoes that promised to make you jump higher.

While that’s a benevolent example of when back-end systems get it wrong, sometimes the consequences can be much worse. “Algorithmic ableism names how the sorting, ranking and filtering that algorithms do privileges and promotes the ideology of able-bodiedness, as well as the medical model of disability and a culture of healthism,” says Dr Olivia Banner, one of the leading researchers in a field called crip technoscience. This is an area of study that focuses on how disabled people adapt, and adapt to, technology to fit their needs and desires. Part of her work as an associate professor of critical media studies at the University of Texas at Dallas is exploring a concept known as algorithmic ableism.

“This doesn’t happen because an algorithm has been crafted ‘to be ableist’,” she continues. “It happen[s] because the contexts in which algorithms are developed and instituted already favour able-bodiedness and cultures of ability. These include fitness culture, health optimisation culture, and social Darwinism (eugenics).”

Algorithmic bias is hardly ever by design

Banner points to systems like those used by HireVue, a video interviewing company that claims its algorithms can strip away human bias but routinely comes under fire on public forums and online blogs.

Though such claims are disputed, many argue these tools can perpetuate already rampant biases against disabled people. For those studying algorithmic ableism, it isn’t that these systems are being used actively as discriminatory tools. There is, as far as we know, no evil cabal of IT workers pulling the strings to deny disabled people jobs or ‘shadow ban’ content about assistive devices on TikTok. Rather, these issues are a passive consequence of tools that aren’t seeking to be anti-ableist.

That distinction is one that Dr T Makana Chock, a professor of communications at Syracuse University in New York, is keen to home in on. For her, the issue starts at the core mission of any algorithmic endeavour: the drive for efficiency. This is a concern that is increasingly evident in a world built around social media.

“Particularly for social media, a lot of what you’re looking at are the basics of what was most liked, what was most viewed. The issue for people who are using social media, then, is that those images that show up the most often perpetuate perceptions of what is normal, what is desirable.”
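
In code, the mechanism Chock describes is almost trivially simple. The Python below is a toy sketch – every post, weight, and conversion rate is invented for the example, standing in for no platform’s actual system – but it shows how ranking purely on likes and views creates a feedback loop: whatever starts out ahead gets shown more, earns more engagement, and pulls further ahead.

```python
# A toy sketch of engagement-based feed ranking, illustrating the
# feedback loop Chock describes. Every post, weight, and conversion
# rate here is invented for the example.
from dataclasses import dataclass


@dataclass
class Post:
    title: str
    likes: int
    views: int


def engagement_score(post: Post) -> float:
    # Rank purely on raw popularity signals; nothing in the score asks
    # whose experiences end up at the top of the feed.
    return post.likes + 0.1 * post.views


def simulate_feed(posts: list[Post], rounds: int = 5) -> list[Post]:
    for _ in range(rounds):
        posts.sort(key=engagement_score, reverse=True)
        for rank, post in enumerate(posts):
            # Higher-ranked posts get far more impressions, and a fixed
            # fraction of impressions convert to likes, so early leads
            # compound: a rich-get-richer loop.
            impressions = 1000 // (rank + 1)
            post.views += impressions
            post.likes += impressions // 20
    return posts


feed = simulate_feed([
    Post("glossy fitness montage", likes=120, views=4_000),
    Post("day in the life with a power chair", likes=100, views=3_500),
])
for post in feed:
    print(post.title, round(engagement_score(post)))
```

Neither post is ever judged on its content; the gap between them simply widens every round, which is all it takes for experiences that start with a smaller audience to stay buried.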

Cracking open the algorithmic attention economy

Chock says that when social media’s ethos is so heavily tilted towards the “finest and glossy versions” of ourselves, algorithms can easily perpetuate the idea that a disabled experience – one that is less likely to come across as inherently glamorous – is not worth focusing on. Banner says part of the concern is not that disability is fully erased when these algorithms become ableist, but that they prioritise problematic representations of disability that tend to get more engagement.

“Since algorithms – in some contexts, for example social media – are designed to maximise people’s attention, they favour whatever ‘rises to the top.’ When it comes to disability, inspiration porn, sentimentality towards disabled people, rhetoric of pity, and other ways of talking about disability as undesired and undesirable – these are often how disabled people are framed in memes, Facebook pages, social media campaigns, and other elements of the algorithmic attention economy.”

Inspiration porn, a term brought into common parlance by disabled activist Stella Young in her groundbreaking TED Talk, is the phenomenon in which disabled people are treated as admirable or inspirational simply for existing.

The proliferation of disabled people being seen only as objects for the non-disabled – fuel to get out of bed in the morning or, in some cases, a tragic backstory to avoid at all costs – puts disabled people at a distinct disadvantage, according to researchers. Disabled people are hardly the only marginalised group at the whims of an algorithm that isn’t actively monitored or corrected for its biases, and Banner says that intersecting identities can often compound these negative impacts.

“Algorithmic ableism plays out alongside, or intersects with, algorithmic racism and sexism, too. There’s been a lot of work on how algorithmic thinking and practices shape, for example, medically racist practices – racially biased algorithms in kidney testing – or the sexism of credit ranking practices. Algorithms that draw on multiple demographic characteristics will tend to harm disabled women of colour the most.”
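
The compounding Banner describes is easy to see with a back-of-the-envelope sketch. The numbers below are invented – hypothetical penalties standing in for what a model might absorb from biased historical data, not figures from any real system – but they show why intersecting identities fare worst under a score that discounts each attribute independently.

```python
# A back-of-the-envelope illustration of compounding bias across
# demographic attributes. The discount factors are invented for the
# example and stand in for no real system.
BIAS_FACTOR = {
    "disabled": 0.85,          # hypothetical penalties a model might
    "woman": 0.90,             # absorb from biased historical data
    "person_of_colour": 0.88,
}


def biased_score(baseline: float, attributes: list[str]) -> float:
    # Each attribute applies its own small discount; the discounts
    # multiply, so harm concentrates at the intersections.
    score = baseline
    for attribute in attributes:
        score *= BIAS_FACTOR.get(attribute, 1.0)
    return score


print(round(biased_score(100, ["disabled"]), 2))                               # 85.0
print(round(biased_score(100, ["disabled", "woman", "person_of_colour"]), 2))  # 67.32
```

Each discount looks modest on its own, which is exactly why it can slip past a casual audit; it is the multiplication across identities that produces the steepest harm.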

Ripping out ableism from next-gen technologies

But what happens when you take the typical user outside of their Instagram feed or Twitter timeline? Part of Chock’s work is studying immersive environments like virtual reality (VR) and the metaverse, and she says the sources of bias tend to shift there. Because the choices a user makes in these environments are more akin to typical social interaction – highlighted by the use of tools like eye tracking technology in development – it’s less about the clicks made or the search terms executed. Chock says there’s an advantage in being on a new frontier when it comes to certain aspects of inclusion and accessibility.

“This is not a case where you have to go in afterwards and put in curb cuts and things like that. We have the ability to try and build into the system things that reduce the amount of bias – it won’t eliminate it entirely, but it will reduce the amount of bias and reduce the amount of discrimination.”

So, what can IT workers do about algorithmic ableism? From an ideological or mission perspective, Banner says that those working in the sector need to understand who has been historically privileged in the technology space, and that these algorithmic biases are another version of that same phenomenon.

“If IT professionals want to consider how to design in a different way, they should ask themselves: who is the norm for this algorithm? Who does this algorithm privilege, and could it be designed otherwise? How will this algorithm affect impoverished disabled women of colour?” she says.

“I would recommend including, consulting with, or otherwise designing technologies where disabled people have agency and participate in the design process. ‘Nothing about us without us!’ is an old-school disability activist slogan. It applies in the 21st century as much as it did in the 20th.”

John Loeppky is a British-Canadian disabled freelance writer based in Regina, Saskatchewan. His work has appeared in the CBC, FiveThirtyEight, Defector, and a multitude of other outlets. John most often writes about disability, sport, media, technology, and art. His goal in life is to have an entertaining obituary to read.