5 Questions for…Joan Kinyua

You will be speaking at the re:publica conference this May on the panel “Am I Not Human? Data workers behind our AI systems and social media platforms speak out”, supported by the Gig Economy Initiative. What does “Am I Not Human?” mean to you personally, especially in the context of your work in data labelling?

I started data labelling in 2017. To outsiders, it seemed simple, but behind the scenes, we faced constant pressure to maintain 95%+ accuracy daily. One mistake—even a 94% score—meant punishment: no incentives, no recognition, and mandatory Saturday shifts. I gave my time and energy, hoping to be seen as a pioneer in artificial intelligence (AI). But the reality was a system demanding perfection while treating us like nothing.

When the pressure became unbearable, I planned my exit. I thought I was moving to stability, but it was worse. At the next company, I was labelled an “Independent Contractor”. I lost protections—no benefits, no rights—just precarious monthly contracts. We had to limit our income to survive, not by choice, but necessity. While the company staff enjoyed perks and social gatherings, we were shadows—excluded, dehumanised. Leave days were dictated. Our voices silenced by threats and discrimination. We are the labour behind AI but remain excluded from its profits.

So I ask: Am I not human? Do I not deserve rest, fair pay, dignity, and recognition? Contracts that protect me, not strip me of my rights? Tech companies thrive off our labour yet discard us when we speak up. Big Tech treats us as replaceable parts—but I am not a part, a tool, or a ghost. I am human—and I demand to be treated like one.

AI and social media rely heavily on the invisible labour of data workers—yet most people never hear about it. Can you walk us through what your work actually involves? What does a day of a data worker look like?

I have worked on AI projects like self-driving cars, vacuum robots, medical diagnostics, and content moderation. We drew boxes around people and signs for cars, taught robots to detect dirt and avoid toys. At first, it felt manageable, but expectations rose. Tasks got more complex and demands tighter. I ended up working with medical data—X-rays and scans—teaching AI to read bodies, without knowing who would use it or its impact on real lives. All while earning so little I sometimes worked 20 hours a day just to make ends meet.

Then, there came projects crossing moral and ethical lines. Companies promised annotation jobs but asked for personal videos and photos of our families with no transparency on use. It was manipulative and invasive. I also labelled violent, pornographic, and drug-related content—blood, sexual violence, disturbing images. No psychological support, no warning—just a cold interface and deadlines. This is the reality of AI development all over the world: labour by unprotected, overexposed, invisible workers. We do not just teach machines—we bear the internet’s darkest corners. Yet, no one knows about us, let alone asks if we are okay.

A day in the life of a data labeler is survival. My day started at noon. I prepped meals, then worked from 2 p.m. to 3 a.m. Once I sat down, I could not afford to break my rhythm. Sometimes no tasks came, but I could not leave the desk. I had to stay glued to the screen, watching the page auto-refresh again and again, hoping that a task would land in my queue. Hope was the only constant. Everything else was uncertain.

You are the President of the recently founded Data Labellers Association. How is the association advocating for better working conditions, and what impact have you seen so far?

We engage key stakeholders locally and globally to secure recognition and rights for data workers. At a meeting in Naivasha, Kenya in December 2024, held in preparation for this year’s International Labour Conference on platform jobs beginning June 3rd, 2025, we shared stories of exploitation and called for worker-focused policies. We also signed an open letter to President Biden demanding accountability for abuses in the AI supply chain.

This momentum continues. I spoke directly with the US Undersecretary for Labor, stressing how unionising is nearly impossible in environments designed to silence us. In Kenya, we submitted proposals to the Business Law Amendment Act 2024 and are actively engaged in shaping it. The impact is clear: we have moved from explaining who we are to being heard. People now begin to see that AI is built on the hidden labour of real people with real stories. We are no longer silent—we shape policy, demand change, and make the world see us.

The strongest shift happens on the ground. More data workers attend workshops, speak out, and organize. Fear is breaking. People realise they are not alone—and collective voices have power. Silence lifts, replaced by solidarity and demands for dignity. This is change—in policy and in people reclaiming their voices.

Kenya and Germany started a cooperation to improve labour standards on digital labour platforms and the AI supply chain. In your view, what real opportunities does this kind of international cooperation open for data workers?

When Kenya and Germany unite to improve data workers’ treatment, it is more than policy—it is a lifeline for thousands. Picture a single mother in Nairobi working 16-hour days labelling AI data, struggling to pay rent. Or Ladi Anzaki, a young Nigerian TikTok moderator who died unable to return home because of her contract. These invisible workers have been unseen and unheard for too long.

This partnership can change that, offering fair contracts, decent wages, and healthcare. It recognises data workers as professionals with dignity. Most importantly, it creates a platform for worker-led organising, letting them demand better conditions and be truly heard. If Kenya and Germany get this right, it will show that tech progress can uplift lives, setting global standards for justice and accountability. Behind every AI advance are human hands and hearts—it is time they receive the respect and protection they deserve.

Finally, if you could shape the future of digital labour, what would a fair, inclusive, and human digital workplace look like to you?

Fair treatment goes beyond payment. It means a standard eight-hour day with enough income to live with dignity. Weekends are overtime, not just another shift. It means real benefits, healthcare, and enforceable rights respected in practice, not just on paper. Data workers, like all workers, must be seen as human. Leave is a right, not a favour. Mental health support is essential. No one should have to erase their identity for profit.

Above all, fairness means recognition. Every AI breakthrough relies on the invisible labour of data workers, who deserve more than exploitation. Recognition means reward: fair treatment and a share of the revenue they help create, like engineers and executives. Without them, AI would not progress.

We need binding policies protecting all workers globally, not tailored to tech companies. We ask for justice, decency, and a fair share of the future we build. That is not too much to ask.

Links

Gig Economy Initiative
Panel at the re:publica 2025