For years, victims of crime and those trying to challenge convictions have said that the UK’s system for accessing court records is prohibitively expensive and unnecessarily bureaucratic.
The government has taken some steps to improve transparency. In a recent investigation, I revealed that it has abandoned its policy of destroying court records. But significant barriers remain for those trying to gain access to those records.
Now, the government has unveiled a plan to use artificial intelligence (AI) to make court transcriptions cheaper and more accessible.
The Ministry of Justice (MoJ) will run a trial using its in-house AI tool, called Justice Transcribe, to produce court transcriptions. The pilot study, overseen by HM Courts and Tribunals Service, will assess how accurate the AI transcriptions are before the system is potentially rolled out nationwide.
A campaign led by survivors of rape and sexual assault has highlighted the difficulty of getting transcriptions of their trials. London’s Victims’ Commissioner Claire Waxman called it a “real block to recovery” for victims, and said one woman had been quoted £30,000 for a transcript of her full trial.
In 2012, the MoJ dispensed with stenographers and began recording court cases instead. But these audio recordings must be transcribed by private companies, whose fees have been described as “exploitative”. As Julie Price of Cardiff Law School put it:
The expense appears to be out of proportion to the work involved to produce these, and the private companies that hold these service contracts are businesses that exist to make a profit from what should surely be a public service.
The MoJ now recognises there is a problem, admitting that victims have had to pay thousands of pounds for transcriptions. Under the new Sentencing Act, victims in the Crown Court can receive judges’ sentencing remarks for free. But the AI pilot aims to make full court transcriptions more accessible.
Waxman, the victims’ commissioner, says that transcriptions are central to the healing process: “Access to transcripts is vital for victims and families, helping them understand what happened in court, process proceedings in their own time and support their recovery, while also strengthening transparency and accountability across the justice system.”
And miscarriage of justice campaigners have argued that lack of access to court records makes it difficult to challenge convictions.
Sarah Sackman KC, Minister for Courts and Legal Services, said: “Victims show immense courage in coming to court, delivering their testimonies and looking their perpetrators in the eye … that’s why it is only right they process what happened in their case in their own time and on their own terms.” She added that AI use in the courtroom could improve transparency and access to justice.
The introduction of audio recordings in 2012 spelled the end for court stenographers. This time, it could be the transcription companies that face extinction.
Changing technology
While faster and cheaper transcriptions will be welcomed, there are reasons to be sceptical that AI will be able to deliver in the way the MoJ hopes.
Accuracy will be an issue, from getting names and places right, to properly representing technical language used by experts. But the greater issue may be AI hallucination. This is when AI tools “generate information that seems plausible but is actually inaccurate or misleading”.
These hallucinations are something that regular users of AI have become used to. But they can pose a significant risk, particularly if the information is being relied upon for medical treatments or legal decisions.
A study by the Thomson Reuters Institute, the research arm of information company Thomson Reuters, concluded that judicial scepticism about using AI for court documents “is not simple technophobia – it’s professional responsibility”.
“Relied-upon hallucinated information isn’t merely bad output, it can lead to a potential distortion of justice.”
Research by the Law Commission, an independent statutory body charged with reviewing the law of England and Wales, has also highlighted the issue. It found “examples from many jurisdictions” of lawyers citing hallucinated legal cases.
The commission argues there needs to be human oversight of AI legal systems and that an overreliance on AI by lawyers would “almost certainly be a breach of their regulatory obligations”. In some cases, it “may even risk the lawyer being liable for contempt of court”.
The scepticism from those working in the criminal justice system is understandable: snags in newly introduced technologies can have significant consequences for real people. DNA technology, for example, was presented as flawless when it was introduced, but we now know it is not. And Shaun Thompson is bringing a legal challenge after the Met police’s live facial recognition technology wrongly identified him as a suspect.
Given these concerns, it is unsurprising that the MoJ hasn’t put a time frame on how long the pilot study will take – much less when the new approach will be used across the criminal justice system.
Brian Thornton does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.