Most discussions about artificial intelligence start in the same place: rules or guidelines. Governments are expected to decide what AI systems can do, how risky they are, and how to control them. The European Union’s AI Act is often treated as the clearest example of this approach. This article argues that this focus is misplaced: it overlooks a more basic question of who gets to build AI in the first place. The most important decisions about AI governance are often made earlier, through the organisation of access to compute, data, and digital systems.

This question of who gets to build AI matters because access to the core infrastructure needed for AI development - compute, data, and digital systems - is uneven and increasingly strategic. As Farrell and Newman show in their work on “weaponized interdependence”, control over key points in global networks can create power asymmetries without formal regulation. India provides one of the clearest examples of this dynamic in practice.

Where governance actually begins

Over the past decade, India has developed an extensive digital public infrastructure, including Aadhaar and UPI, which has changed how identity, payments, and state services work at scale. These systems are often discussed in terms of inclusion, but, as a Center for Global Development study shows, they also reorganise how people and firms can participate in the economy.

The current phase of India’s AI strategy extends this infrastructural logic. The IndiaAI Mission, approved in 2024, places access to compute at the centre of its intervention. Public resources are being mobilised to create shared high-performance computing capacity that can be accessed by startups, researchers, and public institutions. Alongside this, the program emphasises curated datasets, platform development, and ecosystem support. The stated objective is to reduce entry barriers and enable a broader set of actors to participate in AI development.

At one level, this looks like a standard effort to build domestic capacity. At another, something more structural is happening. If building AI depends on access to compute, data, and digital systems, then the way these resources are distributed shapes who can participate. By the time regulation enters the picture, many of the key advantages are already in place.

The organisation of access

India’s approach is not restrictive. In fact, it aims to expand access. The IndiaAI Mission is explicitly framed around lowering barriers and enabling a wider ecosystem of developers. In a world where advanced computing resources are concentrated in a few firms and countries, this is a significant move.

However, expansion does not mean equal access. Compute is not open to just anyone; it is distributed through programs, partnerships, and institutional channels. In much the same way, participation in digital systems requires alignment with technical standards and protocols. These systems do not block participation; they shape it. Actors already embedded in formal innovation ecosystems - venture-backed startups, research institutions, firms with technical capacity - are better positioned to navigate these pathways. Others may find entry more difficult, not because they are excluded, but because the available routes are narrower. This is not a side effect. It is how large systems work.

Openness and its limits

India’s model is often described as open: open infrastructure, open networks, open participation. Although this is partly true, openness in India’s digital public infrastructure and AI ecosystem, including systems such as UPI, ONDC, and emerging AI platforms under the IndiaAI Mission, operates within a structured environment. Multiple actors can connect to these systems, but the terms of connection are defined in advance. Standards, interfaces, and governance mechanisms set the conditions for participation.

This combination of open access and predefined terms of participation creates a tension. On the one hand, expanding compute access and digital platforms reduces dependence on external actors and opens new opportunities for domestic innovation. On the other hand, structuring access through programs and standards can concentrate participation among actors already able to operate within these systems.

Power through enablement

This structuring of access changes how power works in AI governance. In regulation, power works by limiting what actors can do. In infrastructure, it works differently, by shaping what actors can do in the first place.

When access to compute, data, and platforms is organised in specific ways, it affects both who participates and what kinds of AI development are possible. Some forms of innovation become easier; others remain out of reach. A simple example is the IndiaAI Compute Portal, which is designed to give startups, researchers, and public institutions access to shared high-end computing resources through a common application system. Shared access through the portal widens participation in principle. In practice, it also means that participation depends on meeting the requirements of that system - from institutional affiliation to the ability to formulate viable projects and work within the portal’s technical and programmatic terms.

A similar dynamic can be seen in India’s digital public infrastructure more broadly. Systems such as UPI or ONDC (Open Network for Digital Commerce) are often described as open networks. Yet to build on top of them, firms need to comply with specific technical standards, integrate into existing protocols, and operate within a defined architecture. Such structuring is not a flaw in the model. Large digital systems need rules, standards, and protocols to remain reliable, interoperable, and scalable. The political question is whether structured access is mistaken for equal access. These design choices influence which actors can grow within the system and which remain at the margins. These decisions are often framed as technical or developmental. But they have long-term effects on who builds AI and who does not. Governance, in this sense, is not absent; it is built into the system.

Sovereignty and dependence

India’s strategy points to a push for digital sovereignty. By expanding domestic capacity in compute and data, the state is trying to reduce reliance on external technology providers and gain more control over its digital ecosystem, as reflected in the IndiaAI Mission. This approach makes sense in a world where access to semiconductors, cloud infrastructure, and advanced models is highly concentrated. Yet the effort runs into limits, because building domestic capacity does not remove global dependencies; it changes how they work. Compute infrastructure still depends on global supply chains. The most advanced models remain concentrated in a small number of firms. International integration is still necessary for scale. This creates a second tension: sovereignty is pursued within interdependence. Similar dynamics can be observed in other contexts where governments are investing in compute infrastructure, data ecosystems, and platform architectures as a way of shaping AI development.

Inclusion before production

India’s digital transformation is often described in terms of inclusion. Systems like Aadhaar and UPI have expanded access to services at scale. Building AI, however, requires more. It requires technical expertise, organisational capacity, access to compute, and the ability to work within complex digital systems. These capabilities are unevenly distributed. If access to these resources is uneven, then the ability to build AI will also be uneven. Some actors will develop and scale systems. Others will remain users of technologies developed elsewhere. Inclusion in services does not automatically mean inclusion in production.

Rethinking AI governance

India’s trajectory suggests that governance does not start where we usually look. If we focus only on regulation, we miss where key decisions are actually made - in how compute is allocated, how data is organised, and how systems are designed. These are not just technical choices. They are decisions about who gets to participate.

For policymakers, this means that governance cannot be reduced to rules alone. It must also address how access is structured. For researchers, it means shifting attention from downstream regulation to upstream conditions. For the public, the question becomes simpler: who is being enabled to build AI and who is not?

What is at stake

India’s approach does not offer a simple template. It combines efforts to expand access with mechanisms that inevitably structure it. It creates new opportunities while also defining their contours.

What it does offer is a clearer view of a dimension of AI governance that is often overlooked. By the time AI systems are regulated, the landscape of who can build them has already been shaped. If this is the case, then focusing only on regulation is not enough. Governments need to think more directly about how access to AI infrastructure is organised and who it enables.

The question, then, is not only how AI should be governed. It is how the capacity to build it is distributed and by whom.

About the author:

Olga Ustyuzhantseva is a researcher working on AI governance, digital infrastructure, and innovation policy. Her work focuses on how access to data, compute, and digital systems shapes technological development and participation, particularly in the Global South. She has conducted comparative research on India, South Africa, and Russia, and contributes to international research on digital public infrastructure and AI governance. She is the author of a monograph on science, technology, and innovation policy in India, and her recent work develops an infrastructural approach to AI governance.
