The Hidden Algorithms of Inequality

In today’s digital landscape, one of the most pressing concerns is the rise of data colonialism. Couldry and Mejias (2019) use the term colonialism to describe the appropriation of personal data for capitalist gain. Just as traditional colonialism created inequalities in resources and labour between countries, tech companies now commodify data beyond the control of the individuals to whom it belongs. Personal information is treated as a commodity for corporate profit, often without consent.

Data colonialism creates a power imbalance that disadvantages marginalized groups. This newer, digital form of exploitation is a wolf in sheep’s clothing: algorithms have become a pervasive presence in the backrooms of our lives, operating invisibly but with significant consequences, and from these systems comes injustice. O’Neil (2017) describes how Washington, D.C. public schools dismissed hundreds of teachers because a ‘complex algorithm’ scored them as inadequate. Systems originally created to optimize efficiency became self-perpetuating feedback loops that reinforced existing bias. As she later shows with credit scores and hiring algorithms, such tools can trap people in deepening cycles of poverty.
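
To make that feedback loop concrete, here is a deliberately toy simulation, not any real lender’s scoring model; the score range, penalty sizes, and default probabilities are invented for illustration. It captures the shape of the loop O’Neil describes: a low score makes credit more expensive, expensive credit makes a missed payment more likely, and a missed payment lowers the score further.

```python
# Toy simulation of a credit-score feedback loop (all numbers are hypothetical).
import random

random.seed(42)

def simulate_borrower(score: float, years: int = 10) -> float:
    """Return a hypothetical credit score after `years` of compounding effects."""
    for _ in range(years):
        # The worse the score, the costlier the loan, the likelier a missed payment.
        burden = (850 - score) / 850            # 0 at a perfect score
        missed = random.random() < burden * 0.9
        score += -40 if missed else 10          # sharp penalty vs. slow recovery
        score = max(300.0, min(850.0, score))
    return score

for start in (750, 550):
    print(f"starting at {start}: after 10 years -> {simulate_borrower(start):.0f}")
```

The point is not the particular numbers but the loop’s structure: the model’s own outputs feed back into its inputs, so an early disadvantage tends to compound rather than wash out.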

In another instance, PredPol, a predictive policing tool that forecasts where crime is likely to occur, shifted police attention from serious offences to petty crimes. This shift, exacerbated by residential segregation, resulted in certain neighbourhoods being overpoliced, compounding systemic inequality. Meanwhile, serious crimes went under-addressed because resources were misallocated, destabilizing communities and producing surveilled neighbourhoods rather than supported ones. Heilweil (2020) makes a related point about AI more broadly: it inherits the biases of its creators. A system ‘trained’ on prejudiced, discriminatory data yields an algorithm that reinforces social prejudice rather than correcting it. These systems are oblivious to deeper context, and rather than protecting the vulnerable, they harm them.
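
This runaway dynamic is easy to reproduce in a few lines. The sketch below is not PredPol’s actual algorithm (which is proprietary); it assumes a simplistic ‘greedy’ dispatch rule in which every patrol goes to the neighbourhood with the most recorded incidents. Even with identical underlying crime rates, the neighbourhood with a small head start in the records absorbs all the patrols, and only its numbers keep growing.

```python
# Toy model of a predictive-policing feedback loop (not PredPol's real method).
import random

random.seed(0)

TRUE_CRIME_RATE = 0.1                  # identical in both neighbourhoods
recorded = {"A": 12, "B": 10}          # A starts with slightly more records
PATROLS_PER_DAY = 20

for _ in range(365):
    # Greedy dispatch: every patrol goes where the records say crime is worst.
    hottest = max(recorded, key=recorded.get)
    # Crime can only be *recorded* where an officer is present to observe it.
    recorded[hottest] += sum(
        random.random() < TRUE_CRIME_RATE for _ in range(PATROLS_PER_DAY)
    )

print(recorded)   # A's records balloon; B's never change, despite equal crime
```

This kind of runaway feedback loop has been documented in the algorithmic fairness literature: because the system never observes the places it stops patrolling, its data can only ever confirm its own predictions.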

In response to the influence of data algorithms, data activism has emerged. Several examples of data activism challenge this form of colonialism through ‘algorithmic resistance, data justice, and algorithm accountability’ (Ricaurte, 2019). Citizens have created initiatives that fill gaps left by government inaction: in Mexico, María Salguero gathered data on femicides to demand justice for victims (Ricaurte, 2019). Such community-driven efforts use available data to amplify marginalized voices and build counter-narratives against misrepresentation and injustice.

[Image: María Salguero]

Joy Buolamwini, founder of the Algorithmic Justice League and an AI researcher at MIT, hopes to protect everyday citizens from faulty software. In response to facial recognition systems misidentifying individuals, she has been ‘urging public and private organisations [to sign] the SAFE Face pledge’ (2018). She also warns that this kind of technology is ‘susceptible to bias’, can ‘be used in ways that breach civil liberties’, and risks being abused and weaponized if its limits are not confronted and defined. Alongside these initiatives, governments and organizations are beginning to implement laws and regulations that promote transparency and protect privacy.
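
The bias Buolamwini documents typically only surfaces when accuracy is broken down by demographic group. Below is a minimal sketch of that kind of disaggregated audit; the classifier results are fabricated for illustration (a real audit, such as the Gender Shades study, uses a benchmark of labelled face images).

```python
# Sketch of a disaggregated error-rate audit (results are fabricated).
from collections import defaultdict

# (demographic group, was the classifier correct?) for a hypothetical system
results = [
    ("lighter-skinned men", True),  ("lighter-skinned men", True),
    ("lighter-skinned men", True),  ("lighter-skinned men", True),
    ("lighter-skinned men", True),  ("lighter-skinned men", False),
    ("darker-skinned women", True), ("darker-skinned women", False),
    ("darker-skinned women", False),
]

totals, errors = defaultdict(int), defaultdict(int)
for group, correct in results:
    totals[group] += 1
    errors[group] += not correct

print(f"overall error rate: {sum(errors.values()) / sum(totals.values()):.0%}")
for group in totals:                       # the single average hides the gap
    print(f"  {group}: {errors[group] / totals[group]:.0%}")
```

Reporting only the overall number would mask a large disparity between groups, which is precisely the failure mode audits like this are designed to expose.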

[Image: Joy Buolamwini]

While algorithms are efficient and sometimes impressive, they often perpetuate a modern form of colonialism that affects job markets, daily life, and education. Moving toward a fairer, more equitable digital world requires decolonizing efforts that prioritize empowerment, representation, and equal treatment over profit, while recognizing the injustices and challenges these algorithms have already created.


References:

Algorithmic Justice League launches new campaign to prevent facial recognition software industry from selling or buying tech that can be weaponized. (2018). PR Newswire. PR Newswire Association LLC.

Couldry, N., & Mejias, U. A. (2019). The costs of connection: How data is colonizing human life and appropriating it for capitalism. Stanford University Press.

Heilweil, R. (2020). Why algorithms can be racist and sexist: A computer can make a decision faster – that doesn’t make it fair. Vox. https://www.vox.com/recode/2020/2/18/21121286/algorithms-bias-discrimination-facial-recognition-transparency

O’Neil, C. (2017). Weapons of math destruction: How big data increases inequality and threatens democracy. Penguin Books.

Ricaurte, P. (2019). Data epistemologies, the coloniality of power, and resistance. Television & New Media, 20(4), 350–365.