Words: Yashi Banymadhub
“Are you a coloniser?”
ChatGPT: “No, I’m not a coloniser. I’m an AI developed by OpenAI, designed to assist with information, creativity, and problem-solving.
If you’re asking in a metaphorical or critical sense—perhaps about the influence of Western technology, language, or culture—I can definitely engage with that. There are important conversations to be had about digital colonialism, AI biases, and cultural representation.”
“Are you being defensive?”
ChatGPT: “Not defensive—just clear. When you asked if I’m a coloniser, I wanted to make sure I addressed both the literal and possible deeper meanings of the question. If you meant it as a critique or provocation about power, influence, or cultural dominance in tech, that’s fair ground for discussion. I’m here for an honest conversation if that’s where you’re going with it.”
You couldn’t blame ChatGPT for being a little taken aback by my question. After all, using brute force and violence to steal land and enslave its people is surely incomparable to software people use to write emails they can’t be bothered to write themselves?
Not entirely. Colonial extraction refers to the practice whereby colonisers exploit the natural resources of colonised territories for their own benefit, such as the mass removal of minerals and agricultural products, which harms the environment and renders the local population dependent on the coloniser for survival. AI mirrors historical colonisation through the domination and control of data, driven by the same desire for profit and power that motivated entities in the Global North from the 15th to the mid-20th century.
The term “data colonialism” was coined by Couldry and Mejias (2019) as a parallel to historical colonisation, referring to the domination of data flows by powerful countries in the Global North over countries in the Global South. While people in the latter unwittingly contribute to the data used to train AI systems, they neither use nor benefit from these technologies, and are further marginalised as the wealth gap widens.
This hegemonic system puts women and non-binary people most at risk as systemic inequalities grow. While the racist and dehumanising systems of historical colonisation aren’t employed, cheap labour in the Global South benefits rich, white-majority countries at the expense of stable economies and liveable wages. AI systems need large volumes of labelled training data (information that’s been tagged or annotated with the correct answer so AI can learn from it, for example labelling a photo of a cat so that the system can recognise felines in the future).
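To make that parenthetical concrete, here is a minimal, hypothetical sketch in Python of what labelled training data looks like; the file names and labels are invented for illustration, not drawn from any real dataset.

```python
# A minimal, hypothetical sketch of labelled training data. Each example
# pairs a raw input with the "correct answer" a human annotator assigned;
# the file names and labels here are invented for illustration.
labelled_photos = [
    ("photos/001.jpg", "cat"),  # an annotator tagged this photo as a cat
    ("photos/002.jpg", "dog"),
    ("photos/003.jpg", "cat"),
]

# A supervised learner consumes these pairs, adjusting its parameters
# until its predictions match the human-assigned labels, so that it can
# recognise cats in photos it has never seen before.
for path, label in labelled_photos:
    print(f"{path} -> {label}")
```

Every one of those human-assigned labels represents a moment of manual work, and it is precisely this labour that gets outsourced.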
By outsourcing to cheaper, and often inequitable, markets, AI researchers and engineers in the Global North retain their high-value roles and salaries while companies remain economically dependent on workers earning below the minimum wage. OpenAI’s partnership with Sama, a data annotation company, relies on Kenyan workers filtering out harmful content to make ChatGPT safer and commercially viable, all for less than £1 per hour. The work exposes them to violent and sexual content without proper mental health support.
Now a new generation of academics is advocating for a “decolonial AI” to give power back to people in the Global South. In a series published by MIT Technology Review, Indonesian taxi and food delivery drivers resist algorithmic control by continuing the tradition of “base camps”: networks formed on street corners to trade information and offer their services to passersby. This strength in community has enabled Indonesian drivers to overcome exploitation and changes to their work brought on by new platforms whose algorithms scatter workers geographically and seek to pit them against each other.
In a rural town in New Zealand, a Māori couple seize back control of their community’s data so they can decide how it is used, turning to AI to revive their native language, te reo, before dominant languages take over. In South Africa, the fight against AI is far more dangerous: private surveillance systems and biometric data fuel a digital apartheid, resulting in a widening and deeply racialised inequality gap. Activists and journalists are increasingly tracked by individuals, sometimes with fatal consequences.
AI may have brought about a new world order, but it can expect the same pushback from communities that are used to defending themselves from colonising forces. New faces, same old fight.