Updated on 11 June 2025
Reading time: two minutes
The High Court has issued a warning to UK legal professionals following the alleged misuse of AI in two recent legal proceedings, in which fake case-law citations were submitted in court. The ruling, delivered by Dame Victoria Sharp, president of the King’s Bench Division, urged immediate action, warning that the misuse of AI has “serious implications for the administration of justice and public confidence in the justice system”.
The warning follows two high-profile cases this year involving made-up legal citations. The claimants in an £89 million damages case against the Qatar National Bank submitted 45 case citations, 18 of which were entirely fabricated. A claimant admitted to using publicly available AI tools, and their solicitor acknowledged citing the made-up authorities.
In a separate case, Haringey Law Centre’s legal challenge against the London Borough of Haringey allegedly included a number of fictional citations. The Law Centre and the junior barrister it had instructed were criticised for their reliance on the fake cases. However, in light of evidence that wasn’t put before the previous judge, the court recently found that the Law Centre’s lawyers weren’t at fault. The court determined that the paralegal had “acted appropriately throughout” and said there was no reason to suspect that her supervisor had deliberately caused false material to be put before the court. That said, the barrister’s explanation, in which she denied that the fake cases had been created by AI, was not accepted by the court. 11KBW reported that the “threshold for initiating contempt proceedings against her had been met”, but the court decided not to initiate contempt proceedings and instead referred the matter to the Bar Standards Board.
Speaking on the incident, Sharp warned that lawyers misusing AI could face sanctions: “Such tools can produce apparently coherent and plausible responses to prompts, but those coherent and plausible responses may turn out to be entirely incorrect.
“The responses may make confident assertions that are simply untrue. They may cite sources that do not exist. They may purport to quote passages from a genuine source that do not appear in that source.”
Sharp called on the Bar Council and the Law Society to urgently address the issue and instructed heads of barristers’ chambers and managing partners of solicitors to ensure all practitioners understand their ethical obligations when using AI.
In response, Ian Jeffery, CEO of the Law Society, echoed the concerns, stating that the High Court’s ruling “lays bare the dangers of using AI in legal work”. Jeffery explained: “Artificial intelligence tools are increasingly used to support legal service delivery. However, the real risk of incorrect outputs produced by generative AI requires lawyers to check, review and ensure the accuracy of their work.”
Jeffery added: “Whether generative AI, online search or other tools are used, lawyers are ultimately responsible for the legal advice they provide.”
Find out whether you should be using AI in law firm applications – and how not to do it.
This article was amended on 10 June 2025. An earlier version reported that the High Court had found the Law Centre employees negligent. However, in the latest regulatory ruling, and in light of evidence that the previous judge had not seen, the court found that the Law Centre paralegal had acted appropriately and held that there was no reason to suspect that the paralegal’s supervisor had deliberately caused false material to be put before the court.