
This piece is part of Reframing Impact, a collaboration between AI Now Institute, Aapti Institute, and The Maybe. In this series we bring together a wide network of advocates, builders, and thinkers from around the world to draw attention to the limitations of the current discourse around AI, and to forge the conversations we want to have.
In the run-up to the 2026 India AI Impact Summit, each piece addresses a field-defining topic in AI and governance. Composed of interview excerpts, the pieces are organized around a frame (analysis and critique of dominant narratives) and a reframe (provocations toward alternative, people-centered futures).

*Joan Kinyua is the founding president of the Data Labelers Association of Kenya, which advocates for the recognition and fair treatment of data workers.*
In this interview, Kinyua argues that the current framing of “human capital” invisibilizes data workers—primarily in the Global South—whose labor has quietly powered AI development for more than a decade. Big Tech treats this work as entry-level or temporary, a framing it uses to justify the exploitation of labor in countries like Kenya, often with the blessing of politicians chasing foreign investment at the expense of worker protections. Kinyua calls for more transparency, basic rights, and fair pay for data workers. She highlights examples of workers already pushing back through “name and shame” campaigns and building solidarity networks across borders—facilitating accountability where laws have failed them.
Following is a lightly edited transcript of the conversation.

Frame
The current “human capital” conversation focuses on skilling and workforce transitions. This leaves out and distracts from important discussions about data work.
“Future-of-work” framings ignore the real challenges data workers face today.
When they speak about reskilling and upskilling, they’re not speaking about data labeling. They’re speaking about the engineers and all those big titles. I know why they do this: it’s because this work has always been termed “the future of work.”
But how is it the future when it has been in existence for the last ten years? Our work is being treated like an entry-level job or a transition job. It’s not being treated as a “real” job. We have been working in this space for the longest time, and people are continuing to work in this space.
A majority of the people have been fed this narrative that AI is doing it on its own. Most people think AI is magic. And I do not know why we are being invisibilized.
As the Data Labelers Association, we represent the people who are powering artificial intelligence. So it’s the data labelers, it’s the content moderators, it’s the people doing transcription tasks. It’s basically the whole landscape of AI.
We came into existence because we faced a lot of challenges. And we had no recourse, we had nobody to report the challenges to. It’s the content of the work that you’re doing that is really harming you and it’s the nature of the work. You’re always working, and you’re always walking on eggshells. It’s the pressure. It’s the monthly salary that does not make sense—and you have a lot of things that the salary is supposed to sustain. And we had no recourse. You’ve worked for a full month, you’re not paid, you do not have anybody to report that to. You do not have any social protection.
So we came about because of these systemic challenges. We have come to realize it’s intentionally designed that way. So we took it upon ourselves to fill the knowledge gap that is there of who the data labelers are. Because there’s this blanket definition of platform workers, and it usually speaks about the delivery, the Ubers—but it does not include us.
Global South governments are giving up their “cake”—abundant and educated labor—without fighting for worker rights and protections.
In Kenya, when you’re speaking about good things, you refer to them as cake. So for me, I am the cake. For the president, he is the owner of the cake. So instead of selling this cake, he’s giving it away.
We have a lot of young people, and almost everybody is well educated. We all have university degrees. We can all speak English very well. There’s a good penetration of the internet. We are all very tech-savvy people. But there are no jobs.
So that’s a very ripe market for Big Tech, right? You find that it’s cheap, quality labor.
For now in Kenya, I think AI is the biggest employer of the young—through the organizations we know and those that we don’t know. So instead of banking on our people, and selling them at the highest rate, the president is shortchanging them for his own benefit to be in the good books of Mark Zuckerberg and in the good books of Elon Musk. It’s that they want to be identified with the big people. They want to be part of the conversation because Big Tech is in bed with the politicians.
For example, a bill passed recently called the Business Laws (Amendment) Act, 2024. And there’s this part […] where they say that Big Tech cannot be sued in Kenya. We can only sue the business process outsourcing (BPO) firm. So when you see that, you really feel very discouraged.
I feel very discouraged because people are working eighteen hours a day with no social life, with nothing to show for it. You don’t even have medical coverage for your kids. You’re living in struggle. Someone I spoke to two years ago mentioned that the job was dictating where he was going to live. So today he has a job, he moves because he does not want to raise his kids in the specific area that he is in. And then when the job is not there again, he moves back to the ghetto because that’s where he can afford to live.
So you can imagine these people, and then you also imagine the presidents and the decision-makers: they have the power to have a conversation and speak for the workers—but they’re not.

Reframe
Kinyua argues that it is important to fill the policy void by demanding accountability and better treatment for data workers via “naming and shaming,” as well as through domestic and global worker solidarities.
“Naming and shaming” works where laws do not, and deliberate distancing has helped companies evade accountability.
Naming and shaming—that has worked. Something I’ve learned is do not sugarcoat. If it’s Meta you’re speaking about, name them. If it’s ChatGPT, name them. They really do not want bad publicity for their products. So I think the naming and shaming is the most important part of this.
Because laws are not holding them accountable. For example, I remember in the Meta case that is still ongoing, they said they cannot be sued in Kenya because [the company isn’t] registered under the Kenyan laws. So when it comes to enforcement in [Global South] countries, there’s no accountability. There’s no body to make sure the laws are enforced and followed. And I feel that is why they’re taking advantage of the situation and at the same time creating rules that really favor them. Laws may be there, but they usually protect the Global North.
Solidarities between worker groups—domestically and globally—are emerging and important.
In Kenya, we have the African Tech Workers Movement. It’s the data labelers, it’s the content moderators, it’s the Uber drivers, it’s the deliveries, it’s all those people. And it’s because we have some common challenges, especially working for platforms. So we merge together.
Apart from that, we are also part of another global gig workers alliance that includes other countries. There’s the Philippines, there’s the US, there’s a lot of people. So I think even though it’s a bad thing that is happening, it’s also a good thing because we are forming international solidarities that are really strengthening us.
We started our organization because of a German workers’ council that came to Kenya and we saw what they had accomplished for their workers. That really encouraged us. So I feel like it is there and it is growing. The momentum of international solidarity is growing as time goes by. Once we realize they are not listening, we form bigger bodies. And the bigger the solidarity, the more the impact you’re seeing.
Workers need to be better informed and to have better rights and protections in place.
First of all, I’ll just cut this invisibility. They do not know me. I do not know them. I would remove that. I want to know who I’m working for. I want to have a conversation with them. I want to know what technology I am coming up with. Is it something that’s supposed to help people? Let’s discuss things—I might have better feedback when it comes to whatever it is you’re doing.
A colleague of mine works for an organization called Tech for Animals, and that work made him realize that we were being treated like trash. There, he is part of the decision-making, he is told the process of everything, and he is taught what the AI is for. When you look at an organization like the one he’s working for, and you see they want him to be part of the conversation, you feel like this is something that can be done. But when they are bringing on masses of people, I feel like they do not really care about the people that they’re bringing on board. It’s that they want their work to be done and that’s it.
If I were queen for one day, I’d make sure people in this place worked normal hours and were accorded their rights. We should not have to struggle so much to get something that we have earned. I’ll make sure I am accorded the rights and benefits and the pay. And I can go for holidays. And accounts can’t be closed arbitrarily. And I can pay for hospitals—basics. Things that look very basic are very far-fetched for people in this sector.