Feb 18, 2025

Decolonising AI: A UX approach to cultural biases and power dynamics

An AI system is only as good as the stories it learns, but what happens when those stories are incomplete or one-sided?

‘Cunhatain — Antropofagia musical’, 2018 — Denilson Baniwa — Picture by: DuHarte

AI systems, trained on data reflecting historical power imbalances, often encode and reinforce inequalities, leading to discriminatory outcomes for marginalised communities.

There is an urgent need to explore how to decolonise AI and to examine how UX can play an essential role in addressing cultural biases and power dynamics. This is not merely a technical challenge but a deeply ethical and political one that requires interdisciplinary collaboration and a fundamental shift in perspective.

The colonial echo in algorithmic design

AI systems are not neutral arbiters; they are reflections of the data they are trained on and the values of their creators. Too often, this data is sourced from Western-centric datasets, embedding cultural assumptions and biases that disadvantage non-Western populations. This phenomenon echoes Boaventura de Sousa Santos’s concept of epistemicide, where Western knowledge systems systematically suppress alternative forms of knowledge.

Consider, for example, image recognition algorithms that struggle to accurately identify faces from diverse ethnic backgrounds, or natural language processing models that perform poorly on non-English languages or dialects. These failures are not simply technical glitches; they are manifestations of a biased system that prioritises certain cultural perspectives while marginalising others.
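
To make such disparities visible, a team can start with a disaggregated evaluation: reporting a model’s accuracy per demographic or language subgroup rather than a single aggregate score. The sketch below is a minimal illustration only; the file and column names are hypothetical placeholders, not part of the original piece.

```python
# Minimal sketch of a disaggregated evaluation: accuracy per subgroup,
# not just one aggregate number. File and column names are hypothetical.
import pandas as pd

# Assumed columns: subgroup (e.g. language or self-identified group),
# label (ground truth) and prediction (model output).
df = pd.read_csv("predictions.csv")

overall = (df["label"] == df["prediction"]).mean()
print(f"Overall accuracy: {overall:.2%}")

# Breaking accuracy down by subgroup exposes gaps the average hides.
by_group = (
    df.assign(correct=df["label"] == df["prediction"])
      .groupby("subgroup")["correct"]
      .agg(["mean", "count"])
      .rename(columns={"mean": "accuracy", "count": "samples"})
      .sort_values("accuracy")
)
print(by_group)

# The gap between the best- and worst-served groups is a simple signal
# to surface in a bias review with the communities affected.
gap = by_group["accuracy"].max() - by_group["accuracy"].min()
print(f"Accuracy gap across subgroups: {gap:.2%}")
```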

Source: ‘Humans Are Biased. Generative AI Is Even Worse’, by Leonardo Nicoletti and Dina Bass, Technology + Equality, June 2023.

The subaltern and the algorithm: Who gets to speak?

The issue of algorithmic bias is further complicated by the fact that marginalised communities often lack the power to shape the development and deployment of AI systems that affect their lives. Gayatri Chakravorty Spivak’s seminal essay, “Can the Subaltern Speak?” (1988), highlights the challenges faced by marginalised groups in having their voices heard within dominant power structures. This resonates strongly with the issue of algorithmic bias, where certain voices are systematically excluded from the data used to train AI systems.

The consequences of this exclusion can be shocking. AI systems used in law enforcement, for example, have been shown to disproportionately target communities of colour. Similarly, AI-powered hiring tools can perpetuate existing inequalities by penalising candidates from non-Western backgrounds. Achille Mbembe’s Necropolitics (2003) examines how power operates through the control of life and death. AI systems, used in areas like law enforcement and border control, can become instruments of necropolitics, disproportionately impacting marginalised communities.

Frantz Fanon’s work, particularly Black Skin, White Masks (1952), explores the psychological effects of colonialism and the ways in which it leads the colonised to internalise feelings of inferiority and alienation. Similarly, Walter Mignolo’s The Darker Side of Western Modernity: Global Futures, Decolonial Options (2011) exposes how coloniality is inherent to modernity yet remains largely unseen.

These insights can inform our understanding of how AI systems can perpetuate colonial patterns, even when they are not explicitly biased. To truly decolonise AI, we must challenge the concentration of power in the hands of a few tech companies and governments.

We must promote the development of AI systems that are democratically controlled and accountable to the communities they serve.

Initiatives are already pointing in this direction: the ‘AI Decolonial Manyfesto’, which pushes for a re-evaluation of AI through diverse cultural viewpoints, and ‘Tierra Común’, which campaigns for interventions against data colonialism.

UX as a tool for decolonisation

UX, with its focus on people and empathy, can be a powerful tool for decolonising AI. By centring the experiences and perspectives of marginalised communities, UX teams can help identify and mitigate biases in AI systems before they cause real harm.

Without questioning who AI truly serves and whose voices are missing from the design process, AI risks being shaped by a narrow set of experiences, reinforcing existing power structures rather than challenging them.

The UX for Decolonial AI framework

Some key strategies that UX practitioners can add to their frameworks to identify and prevent harmful experiences include:

Diversifying data sets: Actively seek out and incorporate data and feedback from diverse cultural backgrounds to ensure that AI systems are trained on a representative sample of the population (a simple audit sketch follows this list).

Engaging in participatory design: Involve members of marginalised communities in the design process, giving them a voice in shaping the development of AI systems that affect their lives.

Promoting transparency and explainability: Make AI systems more transparent and explainable, so that users can understand how they work and challenge potentially biased outcomes.

Embracing cultural sensitivity: Design interfaces that are culturally sensitive and inclusive, taking into account the diverse needs and preferences of users from different backgrounds. Silvia Rivera Cusicanqui, a Bolivian sociologist, emphasises the importance of ch’ixi thinking, embracing contradiction and ambiguity, in challenging colonial power structures. In AI, this could mean designing systems that acknowledge and celebrate cultural differences rather than striving for a single, universal standard.
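
As a minimal sketch of the first strategy (not a prescription from the article), the snippet below audits how well a training set represents the groups it is meant to serve. The field names, language codes and target shares are hypothetical assumptions.

```python
# Minimal sketch of a dataset representation audit. All field names,
# language codes and target shares below are hypothetical assumptions.
from collections import Counter

def representation_report(records, field, target_shares):
    """Compare each group's share of the data against a target share."""
    counts = Counter(r[field] for r in records)
    total = sum(counts.values())
    for group, target in target_shares.items():
        actual = counts.get(group, 0) / total if total else 0.0
        yield group, actual, target, actual - target

# Hypothetical training records and the population shares we want to reflect.
records = [{"language": "en"}] * 70 + [{"language": "pt"}] * 20 + [{"language": "gn"}] * 10
target_shares = {"en": 0.4, "pt": 0.3, "gn": 0.3}  # "gn" (Guarani) as an example

for group, actual, target, gap in representation_report(records, "language", target_shares):
    flag = "UNDER-REPRESENTED" if gap < -0.05 else "ok"
    print(f"{group}: {actual:.0%} of data vs {target:.0%} target ({flag})")
```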

The path forward

Decolonising human-AI interactions is a complex and ongoing process that demands a collective effort. By embracing a decolonial lens, we can design AI systems that are not only accurate and reliable but also truly inclusive.

This requires us to recognise the colonial legacy embedded in AI by acknowledging the historical power imbalances that have shaped its development and deployment. It also means centring the voices of marginalised communities, prioritising their experiences and perspectives as those most affected by algorithmic bias.

But recognition alone is not enough.

How can we all commit to building an environment where AI empowers rather than marginalises?

