AI and access to justice: reflections on MPP summer placement with the Clooney Foundation for Justice 

Kanksshi Agarwal, 2024 Master of Public Policy student and Cyril Shroff Scholar, reflects on her summer placement with the Clooney Foundation for Justice and how it shaped her understanding of the intersections between law, technology and gender justice. 

Estimated reading time: 6 Minutes
Emma Lindsay, Co-CEO Clooney Foundation, and Kanksshi

This year, as part of my policy placement, I was offered the opportunity to work with the Clooney Foundation for Justice (CFJ), an organisation co-founded by Professor of Practice Amal Clooney, which focuses on advancing justice through strategic litigation, monitoring the fundamental rights of journalists and providing legal support to women and girls worldwide.

I joined as a Global Fellowship Strategist on their flagship fellowship, Waging Justice for Women (WJW) – an initiative supporting young women lawyers across Africa to challenge systemic barriers to justice, build their legal skills, develop strategic litigation and strengthen the region's understaffed legal ecosystem. This was a chance to bring together my background in technology and gender politics with the pursuit of structural reform through law.

Before coming to Oxford, I founded NETRI Foundation, India's first incubator for women in politics, and Access Polity, which built a community of over 5,000 women and trained them for different paths into politics and policy in India. With a background in engineering and urban policy, I've long worked at the crossroads of technology, gender, society and politics, guided by my motto of "building a safe and just world". But this time, the intersections I was navigating were different – they revolved around what politics ultimately yields: structural reform through strategic litigation.

At the Clooney Foundation, my task was to review the existing fellowship programme, design an expansion framework and explore the role of technology in community building. This led to the idea of an agentic AI tool for building a community of practice among feminist lawyers, the alumni of the WJW programme. The CFJ team who guided me through this process left a lasting imprint on how I think about justice and institution-building. The experience revealed, in the most tangible way, how technology, legal frameworks and policy interventions can converge to create systemic change for women and girls.

Conversations that shaped my thinking  

I still remember meeting with Emma Lindsay, the Co-CEO of the Clooney Foundation for Justice, alongside Professor Philippa Webb. Emma asked me to reflect on the gaps that systematically exclude women from accessing justice, and on how AI-driven solutions could be integrated into models such as the WJW fellowship to create large-scale impact. In work this hyper-contextual, the question of scale and replicability is a gripping one.

Our discussion turned to the ethical implications of AI – how artificial intelligence, if designed responsibly, can support policy reform and access to justice for marginalised communities, eradicating rather than replicating bias. We spoke about how tech capacity-building and women's participation must go hand in hand when designing solutions that affect their lives.

When I later spoke with Professor Clooney, she posed a more personal question: “If you could change just one law in your country, what would it be?” I spoke about my ongoing work to reform the application of India’s Prevention of Sexual Harassment Act (PoSH), 2013, so that it applies to political parties – because politics is a workplace, and women in political life deserve the same protections as women in every other profession. Extending this law would be a step towards recognising politics as a form of work, ensuring women can express themselves freely, exercise their creative agency and pursue leadership without fear of harassment or retribution. 

I was struck by the blend of pragmatism and radical optimism in their leadership.
In my conversations, I observed a rare combination of uncompromising clarity around impact and a profound openness to learning. Their example reminded me that being willing to be challenged is at the heart of building clarity and companionship in the work of justice. It is the kind of leadership I aspire to emulate in every role. These conversations shaped my work and my understanding of the emerging intersections between AI and justice – insights that, I believe, will resonate globally.

How the work informed my Policy Report 

I began with a review of the African fellowship model: who had been selected, the training they received, how they collaborated, and how the alumni network functioned. This was not just a technical exercise; it was an immersion into the lived realities of women lawyers pushing back against entrenched systems. 

Community-building emerged as a central tool. Alumni networks must be intentionally designed, and my research drew on global examples of how technology can serve not just as a platform but as connective tissue for building movements. This led me to ask: can technology do more than connect people – can it change institutional behaviour? To explore this, I designed an agentic AI embedded with a case tracker that builds community among WJW fellows and alumni. As I mapped potential countries for expanding the fellowship model, especially to Asia, I kept coming back to one question: would a tech-driven solution be universally applicable for improving justice outcomes?

Rethinking AI as Augmenting Institutions using Game Theory 

This question evolved into the thesis of my policy report, in which I applied my learning from the MPP to the problem of expanding access to justice using AI and technology. Titled "Augmenting Institutions with AI: Application of Game Theory to the Justice System in Malawi", the report brought together behavioural economics, AI and access to justice through a case study of interventions in Malawi's legal system.

In low-resource justice systems, the constraint isn’t only money or manpower, but incentives. Victims face high costs and stigma, frontline actors operate with limited capacity, and courts are congested – the rational outcome becomes silence and delay.  

One of my recommendations was a low-tech, AI-enabled approach operationalised through a simple reporting, referral and resolution (RRR) model. In other words, technology serves as a coordination and visibility device that makes reporting, referral and resolution the rational choices.
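
To make the incentive logic concrete, here is a minimal, hypothetical sketch in Python (not drawn from the report itself): a toy expected-payoff calculation for a victim deciding whether to report, using assumed numbers. The point is simply that raising the probability of follow-through – which is what an RRR-style tracker is meant to do – is what flips the rational choice from silence to reporting.

```python
# Illustrative only: toy expected-payoff model of a victim's choice to report.
# All numbers below are hypothetical assumptions, not figures from the report.

def expected_payoff_of_reporting(p_follow_through: float,
                                 benefit_of_resolution: float,
                                 cost_of_reporting: float) -> float:
    """Expected value of reporting = chance of resolution * benefit - upfront cost."""
    return p_follow_through * benefit_of_resolution - cost_of_reporting

BENEFIT = 10.0           # assumed value of a resolved case to the victim
COST = 3.0               # assumed upfront cost: stigma, travel, time, fees
PAYOFF_OF_SILENCE = 0.0  # staying silent costs nothing and yields nothing

# Without visibility: cases stall, so follow-through is unlikely.
without_tracking = expected_payoff_of_reporting(0.2, BENEFIT, COST)  # -1.0

# With an RRR-style tracker: referrals are logged and delays are traceable,
# so follow-through becomes much more likely.
with_tracking = expected_payoff_of_reporting(0.6, BENEFIT, COST)     # 3.0

print(f"Report without tracking: {without_tracking:+.1f} vs silence {PAYOFF_OF_SILENCE:+.1f}")
print(f"Report with tracking:    {with_tracking:+.1f} vs silence {PAYOFF_OF_SILENCE:+.1f}")
# Silence is the rational choice in the first case; reporting is rational in the second.
```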

If we export incentive design, not just applications, we can develop a policy practice across the Global South. The Augmenting Institutions (AuI) framework is portable precisely because it is context-specific in its tools (calling services rather than apps where internet access is scarce; local-language voice calls where literacy is low) but universal in its logic: make performance observable, reward timeliness and protect users by design. That lens helped me assess potential Asian partners based on the opportunities to plug in tech solutions that reduce backlogs, penalise sluggish performance and raise the cost of inaction.

In fact, Amal and Emma's initial questions served as anchors for my approach and shaped my methodology. I treated technology not as a shiny object for end-users, but as a way to alter payoffs for every actor in the justice chain – victims, paralegals, police, prosecutors and courts. That is what I call Augmenting Institutions: using technology to engineer visibility, coordination and credible discipline so that even overburdened systems can reach a higher level of performance without assuming an abundance of lawyers or bandwidth.

The key insight I took from this work is simple: in low-resource settings, technology that doesn't change incentives doesn't change outcomes. My research demonstrated how simple tools – such as voice calls, SMS reminders and e-hearing nodes – can transform the justice system when tied to observable performance and credible follow-through. We don't need more pilots; we need systems that make every actor visible, every delay traceable and every commitment accountable.

I came to Oxford to read public policy; I left convinced that institutional design is a moral act. My MPP placement at the Clooney Foundation for Justice showed me how law, AI and policy can be engineered to protect agency. It gave me mentors who challenged my thinking and the conviction that building a just world requires both strategic imagination and structural design.