Updated on 6 June 2023
Reading time: six minutes
The acceleration of AI technology has been one of the many surprises of 2023. Bill Gates described it as “revolutionary” and the beginning of “the age of AI”, while others, such as Elon Musk, have warned for the past decade of the dangers of such technology without proper regulation. Both have a point. Just because we can do a thing, should we?
When I first looked at ChatGPT, my overriding initial reaction was much simpler: it was fun! You can ask it many things and the answers it provides are plausible. ChatGPT has a persuasive writing style; the problem is that it can be wrong or misleading. The answers it gives depend entirely on the questions the user asks. In addition, the current version is only up to date to September 2021, which brings its own limitations in a fast-paced profession such as law. You don’t get the same indicators when it’s unsure that you would from, say, a junior member of your team; it’s breezily confident in the first instance of each discussion.
At Nottingham Law School Legal, we used ChatGPT to enhance our cross-examination training exercises based on landmark court cases. We cast ChatGPT as a witness, enabling law students to practise their cross-examination techniques by questioning it and analysing its responses. After the students finished questioning, ChatGPT offered constructive feedback, up to a point. It then presented a new cross-examination scenario tailored to each student’s performance – the better they did, the harder the next scenario – which suggests it can adapt training to different levels.
Of course, it’s not akin to full and proper training – even though it could play a part in it, perhaps more in relation to structure and confidence building. For someone who’s never tried cross-examination, it’d provide a bit of practice, but it comes with some red flags. ChatGPT isn’t regulated by the Solicitors Regulation Authority or the Bar Standards Board; it may not be (and likely isn’t) an ethical lawyer; and it’s certainly not (at this point) capable of completely replacing lawyers.
The risk and potential of ChatGPT
There’s certainly a place for ChatGPT to be embedded into how a law firm operates. During our testing, we asked it to create precedent documents. On simpler matters, it tends to do this better. If you use the paid version, it improves noticeably and you can see the role it could play in improving efficiency in law firms. Although it may not get everything right, there’s a logic that you can follow, even at this early stage. For example, you can ask it about a key case on procedural unfair dismissal in England and Wales and ask it to highlight the key paragraphs of note. You can ask it to find the link to a case you want to refer to in your written submissions, or to summarise an area of law in a way that’s accessible to a non-expert (a skill in itself, and one where it might fare better than some lawyers!).
During our testing, these are all things it handled relatively well. In your area of specialism, you’ll quickly spot where the AI has got it wrong. In less experienced hands, such as those of trainees or paralegals, errors may not be as obvious and greater caution may need to be exercised. Nevertheless, it has the potential to be used to assess prospects of success. In the future, you can imagine AI helping with tasks such as filtering out enquiries that don’t have a realistic prospect of success before they’re placed in front of a lawyer. This example comes with obvious caveats around confidentiality and privilege: ChatGPT is, after all, a publicly accessible, cloud-based service. Risk and potential – ChatGPT offers both.
It’s likely that someone in your firm is at least thinking of using ChatGPT, if not already using it, even if only in a specific and limited way. What training do you need to provide to make sure members of staff aren’t breaching GDPR, client confidentiality and privilege? Law firms are going to need to plan how they intend to deal with AI as a firm. A policy setting out the extent to which staff may use AI seems a sensible step, as does training that warns about the risks. As a not-for-profit teaching law firm that works with a large volume of law students, we’re already implementing guidance and training for our students. And, while this article focuses on ChatGPT from the perspective of a legal practitioner, due to our integration into the law school at Nottingham Trent University, we’re well aware of the much wider debate about its use in an education setting.
What other implications might AI have on the legal profession going forward?
Access to justice
As a not-for-profit law firm that provides pro bono legal services to those unable to afford or otherwise access them, I recognise that ChatGPT could play a significant role in public legal education and has the potential to support a litigant in person (an individual, company or organisation that has to go to court without legal representation).
They could use ChatGPT to help them generate documents and prepare for proceedings in the future – if it isn’t happening already. The litigant in person won’t have any recourse if ChatGPT is wrong, of course, and while access to a lawyer would always be preferable, it’s probably going to improve the overall quality of a litigant in person’s preparation. One problem a litigant in person may have is that ChatGPT can sound knowledgeable (sometimes like an expert) even when it’s incorrect and, as litigants in person aren’t experts, they may not ask the right questions. It’s like your very clever, confident friend who always has an answer: it might not be right, but it sounds good. It’s also worth noting that, while it could be a useful tool for litigants in person, it risks widening the gap even further for those who are digitally excluded.
Recruitment
ChatGPT might be able to draft a decent cover letter based on your website. It could also produce a decent CV, and plausible answers to competency and scenario-based questions. As the technology progresses, firms may need to guard against relying on written applications alone to shortlist candidates, and recruitment approaches may need to adapt. At our firm, as we rely on a large number of student volunteers from Nottingham Law School, our current recruitment process uses multiple choice-based screening. In these tests, ChatGPT scored 79%, so we already know we need to adapt them!
On the flip side, ChatGPT can be used in practical tests of how a potential candidate would conduct a client meeting or cross-examine a witness. It presents both a challenge and an opportunity.
The only real certainty is that AI tools such as ChatGPT are here to stay, so ignore them at your own risk. The question is really how lawyers and law firms (and aspiring lawyers) can safely manage them to their advantage while guarding against the risks they pose. And, looking more widely, how they could be used to improve access to justice.
Mathew Game is an employment solicitor at NLS Legal, Nottingham Law School’s teaching law firm.