LCN Says

Reimagining legal assessments in the age of generative AI

updated on 23 February 2024

Reading time: four minutes

The rise of generative AI, including tools like ChatGPT, presents a fascinating challenge for legal education, particularly at undergraduate level and for the LLB degree. While traditional assessments focused on knowledge recall are under threat, generative AI opens the door to assessment approaches that continue to promote critical thinking, legal reasoning and real-world skills.

Universities' attitudes towards generative AI continue to develop: while many still view it as a threat, some see it as an opportunity. Generative AI can speed up routine tasks such as legal research and the drafting of legal documents, allowing assessments to shift towards evaluating other skills, like critical analysis, creative problem solving and ethical decision making. Rather than being something to fear, generative AI simply means we must change the way we approach assessments within the legal education space.

What are the possibilities when it comes to legal assessment using generative AI?

Imagine an LLB assessment where students use generative AI tools to represent clients in a mock trial, responding to real-time feedback from the ‘AI judge’. This fosters the practical application of legal knowledge and communication skills.

The same can be done outside a mock trial in a debate setting, with students advocating for one side and generative AI representing the other. This promotes a reimagined assessment of skills in which generative AI is embedded as a technology.

Shoosmiths takes a look at the pros and cons of using AI in the legal industry in this Commercial Question.

Generative AI can generate diverse legal arguments and perspectives. By analysing these AI-produced outputs, students can develop sharper critical evaluation skills and a nuanced understanding of complex legal issues. Seminar debates or group projects in which students discuss the strengths and weaknesses of AI-generated outputs can stimulate deep engagement with legal theory and its practical implications. This also promotes students' understanding that generative AI has its limitations. After all, a student's grasp of AI's capabilities and limitations is crucial to preventing overreliance on, or misuse of, the technology in the legal field. It also presents an ethical issue that students can explore in the safe space of a university, rather than in the profession, where the consequences could be harmful.

Lecturers should seek training on generative AI tools and pedagogical strategies if they wish to effectively integrate this technology into seminars and assessments. While the LLB curriculum might need to adapt, generative AI isn’t a replacement for legal education. Instead, it's a powerful tool to enhance learning and prepare future lawyers for the rapidly evolving legal landscape. This requires thoughtful integration, ethical considerations and ongoing skill development.

For insights into the impact of AI on paralegal work, read this LCN Says from Iain Brown.

At the University of Salford, this opportunity is part of the LLB degree, with an AI law module offered to students in their final year. This optional module explores the use of generative AI in the legal profession, in society and in the students' assessment for the module itself. In a recent article, titled ‘Reimagining Assessment in the Era of Generative Artificial Intelligence: A reflection on legal education’, an example of a traditional academic essay is reimagined to incorporate generative AI in a way that invites students to engage with the technology but doesn’t compel them. This is the key to successful generative AI integration: the choice to use it.

This example offers inspiration for educators across disciplines. It demonstrates how assessments can evolve to match changing professional demands, developing cognitive skills and practical digital experience while respecting student values. Amid talk of a generative AI ‘assessment panic’ in higher education, this case study shows an approach that goes beyond the traditional essay while upholding academic rigour and prompting students to contemplate their future careers in light of generative AI technologies.

By cultivating digital skills, critical thinking, ethical reflection and meaningful dialogue, assessments can be transformed into powerful tools for shaping tomorrow's professionals. Embracing generative AI not as a threat but as an opportunity to enrich education allows students to prepare for a future of collaborative human and AI partnerships.

Craig Smith is a lecturer at Salford Business School.