In May, I attended the W20 Inception Meeting in Cape Town, where delegates from government, academia, civil society, and the private sector gathered to discuss gender equality in the G20. Two major themes surfaced: digital transformation and the care economy. While the former urged greater women’s representation in science, technology, engineering, and mathematics (STEM), the latter called for care work to be recognised as vital economic infrastructure, even an entrepreneurial opportunity.

Both are undoubtedly urgent. Especially now, as artificial intelligence (AI) transforms how we work, with the potential to disrupt livelihoods and deepen existing exclusions.

Most mainstream policy responses focus on inclusion, supporting women into tech roles or equipping them to access digital work. But these efforts risk overlooking something more foundational: how labour, innovation, and value are defined in the first place. It’s a question that stayed with me as I left the meeting. We need to interrogate the gendered and racialised assumptions embedded in our economic and technological systems, especially since AI doesn’t only reflect these biases but reproduces and amplifies them.

Few areas make the stakes clearer than care work in the age of AI. Here, there are two dominant narratives. First, automation: caregiving robots could free up women’s time, enabling them to enter the workforce. Second, flexibility: digital platforms that allow women to earn while caregiving. But both approaches erase something essential. They strip care of its relational, emotional, and contextual meaning. These feminised dimensions of care have long been rendered invisible and undervalued, both economically and socially. We should be doing the opposite: revaluing care not only as labour, but as a guiding principle. This would inform a slower, more contextually sensitive and collaborative way of thinking about the future of AI and work, one that centres dignity, wellbeing, and interdependence.

Building on Uneven Ground

AI is changing how work is imagined, structured, and distributed, but it's also reproducing historical exclusions. According to the International Labour Organization (ILO), the jobs easiest to automate are those widely – whether correctly or not – considered easy, repetitive, unskilled: secretarial work, cleaning, call centres, basic care work. Historically, these jobs have been feminised, racialised, and underpaid.

While AI may also generate new forms of work, the benefits won’t be evenly distributed. Women, particularly those who are low-income, caregivers, or based in the Global South, face structural barriers: time poverty, limited digital infrastructure, high costs, and digital skills gaps. Globally, women perform two to ten times more unpaid care work than men, which correlates with lower labour force participation and higher precarity. As things stand, women are most likely to be automated out, and least likely to benefit from AI-driven opportunities.

Care Work in the Platform Economy

In response, global institutions have turned to the “care economy” as a site of digital opportunity. A UN Women and UNESCO report notes that technology can support women in their “multiple roles in production, community management, domestic and care responsibilities.” The implication is that digitisation can lighten care burdens and expand income streams.

One path has been platform work—from on-demand childcare to domestic services—marketed as flexible and empowering, particularly for women balancing caregiving and earning. But research shows that platform economies often reproduce, even intensify, exclusion. Algorithms reward uninterrupted availability, penalising those with caregiving responsibilities. Gig-based models push wages below legal minimums, workers operate with little to no protection, and employers offload responsibility.

These issues go beyond regulation. They reveal deep structural logics of devaluation. For instance, in South Africa, platform dynamics echo histories of racialised servitude, where Black women have long been relegated to underpaid, invisible labour in white households. Algorithms inherit and reproduce these dynamics through rating systems, surveillance, and opacity. Research demonstrates how platform models exploit regulatory gaps and migrant labour regimes to divide workers into groups with unequal protections and pay, maximising profit through selective exploitation.

Gender, race, and class inequalities are refracted through code, as algorithms become active agents in the devaluation of certain kinds of labour.

Commodifying Care

AI-driven platform work is part of a longer trend in the marketisation of care—cycles of commodification, decommodification, and recommodification that have unfolded through history as societies undergo economic changes.

However, as critical feminist and postcolonial scholars argue, recognition through commodification is not the same as revaluation. Shokooh Valle critiques the figure of the “Third World Technological Woman”, celebrated in policy circles yet burdened with care as an individual responsibility, without structural support. Rather than addressing the political roots of care inequality, technological “solutions” often serve as short-term fixes.

Others warn that commodifying care not only exacerbates inequalities but also devalues the significance of care relationships. Platforms reduce care to metrics like efficiency, availability, and compliance, erasing its affective and relational dimensions. The same logic applies to caregiving robots. Even if we succeed in building service robots with social competencies, it remains questionable whether their work would truly approximate care. As care is translated into code, something vital is lost.

What Can’t Be Automated

Tronto and Fisher define care as “everything that we do to maintain, continue, and repair our ‘world’ so that we can live in it as well as possible”. This definition posits care as a practice, but it also involves dispositions: responsibility, compassion, vulnerability, dependence, and deep contextual understanding.

While factual recall or formal reasoning can be automated, these invisible, intangible, unquantifiable dimensions cannot. Crucially, they are the same qualities that have long been dismissed as “soft,” feminised, and devalued. Perhaps their resistance to automation is not a flaw in AI, but a sign of what we should protect.

As AI reshapes labour practices, frameworks must go beyond redesigning tasks and begin to question how we define the value of work. This isn’t merely speculation. Some jurisdictions are already experimenting. Singapore’s “Guide to Job Redesign in the Age of AI” suggests practical steps for preserving the human dimensions of work—‘creativity, empathy, emotional intelligence’—rather than replacing them. South Africa’s Presidential Commission on the Fourth Industrial Revolution (PC4IR) adopts a similar human-centred agenda, positioning the country’s competitive strength in its people—particularly women and youth—and calling for shifts in skills ecosystems to protect qualities machines cannot replace.

These developments hint at the potential for change that goes beyond surface adjustments to reimagine our priorities. The real opportunity lies at the intersection of technological change and enduring human dispositions. Caring is one of our strongest resources, and we can use it to guide how we imagine the future of work and AI in ways that support a world where all can thrive.

Valuing Other Knowledges

It is essential to include more women in the AI industry, especially in technical roles. And yes, economic inclusion, even through commodified care work, may be “what we cannot not want,” as Spivak puts it. But inclusion without structural change risks reinforcing the very systems it seeks to challenge.

We need a shift in mindset, away from tech-fixes and market logics, toward relational ethics, slow thinking, and deep engagement with lived realities.

This means elevating fields like anthropology, which continue to be marginalised in AI development but offer some of the most robust tools for grappling with ethics, power, and social context. It means making space for multiple ways of knowing, including the knowledge that comes from people as living, feeling bodies, with histories. And it means building strategies informed by African philosophies, feminist ethics, and communitarian worldviews, knowledge traditions grounded in relationality, reciprocity, and care.

Initiatives like FemLab and Data Feminism show what this could look like in practice—centring power analysis, contextualising data, and promoting participatory approaches to designing technologies and ethics.

Participatory Futures

To avoid extractive processes, it is essential to put women, marginalised groups, and context-specific actors at the centre of decision-making. But participation is not without limits. Vulnerable communities, already overburdened with work, shouldn’t shoulder the responsibility for fixing systems that marginalise them.

If AI is to work for everyone, including women and mothers, it must be built from below, with care, but also upheld from above, through protections, redistribution, and shared accountability. There must also be a willingness to go further than inclusion and to ask difficult questions about what values are centred in the process.

I keep returning to feminist scholar Donna Haraway’s call to “stay with the trouble”—to remain present with the messy, uneven realities of the here and now, rather than reaching for the comfort of technology-led quick fixes. This means recognising that our needs cannot be met by efficiency or automation alone; that meaningful change is slow, relational, and often uncomfortable; and that no technical solution can replace the political and structural work still required.

Care, then, is neither a problem to solve nor a service to optimise, but a practice to protect. If AI and labour systems are to support human flourishing, they must be grounded in connection, reciprocity, and shared responsibility.

Author: Jess-Capstick Dale

Articles in the “Ideas from the Palaver Tree” collection were co-edited by Selamawit Engida Abdella and Dr. Fola Adeleke
