Clinical education is experiential education, after all. It integrates theory and practice through structured, supervised engagement with real-world legal issues. This pedagogy fosters a kind of reflective, values-driven lawyering that distinguishes professionals committed to justice from others.

Nurturing students' desire and ability to do justice is a process that American law professor Jane Aiken calls training for “justice-readiness”. One way experiential learning in legal clinics contributes to training for justice-readiness is by supporting students as they go through disorienting moments. Typically, this happens when clinic professors involve students in cases in which they will be confronted with injustice, and encourage them to find creative legal solutions to achieve better outcomes for their clients.

Can artificial intelligence be of any help in that process?

It depends on how one chooses to use it. In a previous blog post, I argued that law students and lawyers who use artificial intelligence to speed up their professional activities are flirting with recklessness. I still stand by that stance and would probably defend it even more firmly now. However, speed and productivity are not the only skills a good lawyer needs. Being imaginative, empathetic and principled are also hallmarks of good lawyering.

Maybe AI can help with that?

After running some tests, I believe that using AI to develop these qualities among students is a path worth exploring. In fact, I’ve been genuinely impressed by the potential ChatGPT holds for guiding those interested in thinking outside the legal box to promote justice. Here are two experiments I ran to explore whether AI could help clinic professors to teach justice-readiness.

My first experiment consisted of submitting an excerpt from Jean-Philippe Baril-Guérard’s Royal—a novel describing the ups and downs of a law student’s life—to ChatGPT. The extract I chose portrays a haughty student who works in a legal clinic and struggles to help a tenant facing eviction. I wanted to see if an AI system could come up with creative legal solutions to help the tenant stay in her apartment despite the landlord’s claim that the place was unsafe.

ChatGPT did come up with a solution that a 1L with a legalistic worldview wouldn’t think of: it gave me information about a municipal service that helps tenants clean and reorganize their apartments to comply with the city’s regulations. It also noticed that the protagonist lacked empathy when interacting with his client and offered sound advice on how to improve.

This is pretty interesting, but it’s nothing a good clinical professor can’t do. For my next experiment, I wanted to see whether ChatGPT could prove useful to a seasoned clinic professor. To do that, I took inspiration from an academic paper on clinical legal education that contained testimonies from real legal clinicians. I focused on a case in which ChatGPT could have benefited both the students and their supervisor. It goes as follows.

A group of students and their supervisor are supposed to meet with a mother who lost custody of her child because she neglected him. The purpose of the meeting is to help her regain custody of her kid after she successfully completed her probation. However, the mother calls to inform the team that she has had a “bad night” and is stuck in a motel room without money. Despite their efforts, neither the students nor their supervisor can imagine any explanation that is not related to alcohol or drug abuse.
They decide to drive to the motel to confront the woman. Upon arrival, their biases and prejudices are exposed. Contrary to their expectations, the woman hadn’t been drinking or consuming drugs. She had worked late as a grocery store cashier and found herself stuck there because her father’s car had broken down. She took a cab to get back home, but the driver dropped her on the side of the road when he realized that, save for the physical paycheck she had just received, she didn’t have anything to pay for the ride. She thus had to walk to a motel where she could spend the night.

Had the legal clinic team used ChatGPT in their brainstorming session, they would have had to consider a broader range of issues, including transportation troubles, overnight work, and work-related crises. Indeed, these are all potential explanations the chatbot provided when I asked it to help me think through what a “bad night” could mean in the context at hand. Realistically, the team of students probably wouldn’t have considered these options to be the most probable ones, but a good clinical professor could have made a teaching moment out of it, framing it as an opportunity for experiential reflection and encouraging them to prepare for all eventualities, as legal practice can be full of surprises. The client would probably have received better legal services as a result.

What should we make of these examples? They certainly show that ChatGPT has something interesting to offer, but the precise nature of its contribution is not entirely clear. Justice-readiness is an ethos: it is inherently human, and outsourcing its teaching to machines feels dehumanizing. However, LLMs have been trained on an immense corpus of texts, which means they have been exposed to a broader spectrum of ideas than any human has ever been. Refusing to take advantage of that probably does a disservice to justice.

When students experience disorienting moments and fail to find solutions that lead to more just outcomes for their clients, they can feel apathetic and powerless. Some may fall back on a posture of legal relativism and give up on their commitment to justice. However, research shows that exposure to hopeful alternatives is a way to prevent this kind of disillusionment. Using LLMs as brainstorming tools to generate hopeful alternatives can open students' minds and help them see that things don’t have to be the way they are. In that sense, AI can contribute to training justice-ready law graduates. When used within the structured, reflective environment of experiential education, it can be another tool to deepen student learning. In troubled times like these, we should probably seize every opportunity to nurture students’ desire and ability to ‘do justice’.

The opinion is the author's, and does not necessarily reflect CIPPIC's policy position.