Artificial intelligence is shaping everything from education and healthcare to justice and governance. But without the right checks and balances, AI systems can also reinforce inequality, invade privacy, and deepen existing power gaps. That's where responsible AI comes in: AI that is ethical, inclusive, and respects human rights.

But how do we measure whether countries are actually putting responsible AI into practice?

That's the mission of the Global Index on Responsible AI.


What is the Global Index on Responsible AI (GIRAI)?

The Global Index on Responsible AI is the world's first tool designed to track how over 140 countries and jurisdictions are developing and governing AI in ways that are ethical, accountable, and human-centric. It looks at real-world actions, not just promises, on everything from AI legislation to digital inclusion and algorithmic transparency.

GIRAI uses primary data collection to understand how AI is being implemented, regulated, and experienced around the world. That means working with people who understand the local context: researchers like you.

With support from Canada's International Development Research Centre (IDRC) and the UK's Foreign, Commonwealth and Development Office (FCDO) through the Artificial Intelligence for Development (AI4D) program, we're now preparing the second edition of the Index, set to be released in 2026. And we're growing our global research network to make it even stronger.

Why the Research Network Matters

GIRAI isn’t just about publishing a report. It’s about building a global community that puts ethical AI and human rights at the heart of development and policymaking. In our first edition, we worked with researchers from 138 countries. This network is one of GIRAI’s biggest strengths.

Researchers bring local expertise, lived experience, and critical insight into how AI is showing up in their countries. By becoming a Country Researcher, you become part of this global effort to spotlight what’s working, what’s not, and where support is needed.

What You'll Do as a GIRAI Country Researcher

If selected, you'll help gather and assess evidence on how AI policy, ethics, and human rights are being addressed in your country.

Your main tasks will include:

  • Completing a detailed questionnaire and survey tools based on GIRAI’s framework
  • Participating in training on responsible AI and human rights
  • Reviewing and documenting government and non-state actor activities across key thematic areas
  • Following strict data quality standards and timelines

You’ll receive a full Researcher's Handbook, attend capacity-building sessions, and work with a dedicated regional hub to guide your process. Most importantly, your work will directly inform the world’s understanding of AI development on the ground.

Why You Should Apply

This is more than a research job; it's a chance to help shape benchmarks for how the world addresses the use of AI in society. You'll:

  • Be part of a global, respected community of AI researchers
  • Gain visibility for your work in a high-impact, international project
  • Receive compensation between USD 1,000 and 3,000, depending on the scope of work and location
  • Build skills through our methodology and training sessions
  • Help make AI development more inclusive, ethical, and transparent

Key Dates

Applications open: July 1st - 31st, 2025

Training: September 2025

Data Collection: October - December 2025

How to Apply

Ready to get involved? Sign up to our newsletter to stay updated on the application process and the latest news from GCG.