Local Government Lawyer

SharpeEdge

Fred Groves and Christopher Watkins provide insight into growing judicial concern about accuracy, professional responsibility and the efficient administration of justice in the face of Artificial Intelligence.

The increasing use of generative artificial intelligence in the preparation of court documents has prompted growing judicial concern about accuracy, professional responsibility and the efficient administration of justice. Recent cases have highlighted the risks associated with AI-generated “hallucinations”, including the citation of non-existent authorities, with significant consequences for parties and court resources. Against that background, the Civil Justice Council (“CJC”) has launched a consultation on whether new procedural rules are required to govern the use of AI in litigation.

In this article, Fred Groves and Christopher Watkins of our Dispute Resolution team examine the consultation and consider the implications of the CJC’s proposals for practitioners, experts and litigants in person.

A recent illustration of the risks of relying on AI-assisted legal research tools is provided by R (on the application of Ayinde) v London Borough of Haringey [2025] EWHC 1040 (Admin), in which the Divisional Court considered submissions that relied on non-existent or materially misstated authorities generated using AI tools. The Court stressed that lawyers who use AI remain professionally responsible for verifying the accuracy of their work, whether in advice or before the court, and warned that the misuse of AI risks undermining public confidence in the administration of justice.

The problem is not confined to legally represented parties. In Olsen v Finansiel Stabilitet A/S [2025] EWHC 42 (KB), a litigant in person relied on an AI-assisted “case summary” which referred to a case that did not in fact exist, and which the court rejected as having no evidential or persuasive value. This demonstrates the particular difficulty posed where unrepresented parties may lack both the expertise and the resources to verify the accuracy of AI-generated legal information.

AI hallucinations are beginning to place additional strain on already stretched court resources, with judicial time expended identifying, querying and responding to applications supported by false authorities. These resource and case-management consequences were emphasised in Taiwo v Homelets of Bath Ltd [2025] EWHC 3173 (KB), where the High Court found that reliance on a “bogus” authority “no doubt falsely created by AI”, together with repeated and undisciplined applications, had “undoubtedly added to the burden on the Court and the Respondent”. 

The Civil Justice Council Consultation

In February 2026, the CJC announced that it had established a Working Group to consult on ‘…the question of whether [procedural] rules are needed to govern the use of AI by legal representatives for the preparation of court documents, including pleadings, witness statements, and expert reports.’ The consultation is open for responses until 14 April 2026.

The CJC’s interim report discusses the potential regulation of the use of ‘intelligent’ and generative AI systems (as opposed to AI that does not generate substantive content). The interim report notes that regulated legal representatives are already subject to professional regulation: for example, under the SRA Code of Conduct, solicitors must not mislead the court or attempt to do so, and must draw the court’s attention to procedural irregularities which are likely to have a material effect on the proceedings. The misuse of AI in the preparation of court documents would therefore constitute a breach of a legal representative’s professional obligations. Accordingly, the interim report proposes, amongst other things, that (in summary):

  • For statements of case (including particulars of claim, a defence etc.): ‘Provided the statement of case bears the name of the legal representative who is taking professional responsibility for the statement of case, there is no need for any (further) rules relating to statements of case produced with the assistance of AI.’ The CJC comments that ‘The [Working] group would be concerned that a statement which goes beyond an acknowledgement of professional responsibility for the content of the document might lead to more questions being asked of the court and add to delays.’
  • For trial witness statements: The relevant Rule/Practice Direction of the Civil Procedure Rules (“CPR”) should be amended to require the legal representative to declare within the witness statement ‘…that AI has not been used for the purposes of generating the content of such a statement (including by way of altering, embellishing, strengthening, diluting or rephrasing the witness’s evidence)’.
  • For expert reports: Practice Direction 35 of the CPR should be amended to provide that, if AI has been used (other than for transcription) in the preparation of an expert report, the expert must declare within the report (i) the AI tools used and (ii) for what purpose those AI tools have been used.

The CJC’s interim report raises some interesting further points:

  • Several overseas jurisdictions (including some US states and New South Wales) require that, where AI has been used in respect of court filings, the legal representatives must submit certifications or notices declaring (i) how AI has been used and (ii) that any such use complies with the applicable rules and regulations in that jurisdiction.
  • The use of AI by litigants in person (“LiPs”) in preparing court documents is outside the scope of the interim report, which states that this question ‘would benefit from further consideration’. The Working Group commented that: ‘Any regulation of the use of AI by LiPs presents a particularly difficult challenge, owing to its potential to assist with access to justice and thus the undesirability as a matter of policy of discouraging its use as well as the lack of regulatory framework to govern the conduct of LiPs… It may be that requiring a declaration on such documents as to the use of AI would at least alert the court to the possibility that the material being presented may be inaccurate or fictitious (albeit the requirement for a declaration might of course be ignored).’

What Next?

The CJC will produce a final report in due course. In the meantime, our observations are as follows:

  • The risk of AI being misused in the production of court documents is of international concern. Recent cases, such as those referred to in this article, illustrate the detriment to the courts, the parties and the justice system when AI is misused in the preparation of court documents. It seems likely that this problem will continue unless or until regulatory change is implemented.
  • It is possible that courts in the UK will follow the lead of other jurisdictions and introduce new regulations requiring parties to litigation and/or their legal representatives to declare whether AI has been used in preparing court documents and, if so, that such use complied with the applicable rules.
  • Under CPR Practice Direction 57AD (Disclosure in the Business and Property Courts), a disclosing party or its legal representative is required to serve a Disclosure Certificate. The Disclosure Certificate must certify that the party has complied with its obligations under the Practice Direction and must contain a signed statement of truth, including the words: ‘I am aware that proceedings for contempt of court can be brought against me if I sign a false Disclosure Certificate without an honest belief in its truth.’ Whilst it is a matter for the courts to decide, we consider that a similar requirement as regards the use of AI in the preparation of court documents may be introduced in due course.
  • Whilst the Working Group considers the meaning of ‘AI’ to be sufficiently clear within the context of the interim report, that document notes that ‘the scope of what is and is not “AI” is open to significant debate’. In our view, that debate is likely to increase significantly over the coming years, as the AI technology sector continues to rapidly advance and diversify. Disagreements over the definition of ‘AI’ may challenge the effectiveness of any efforts to encourage and enforce the compliant use of AI in the course of legal proceedings.
  • As noted by the CJC’s interim report, the Artificial Intelligence (AI) Guidance for Judicial Office Holders dated 31 October 2025 states: ‘AI chatbots are now being used by unrepresented litigants. They may be the only source of advice or assistance some litigants receive. Litigants rarely have the skills independently to verify legal information provided by AI chatbots and may not be aware that they are prone to error.’ It seems to us that, given the widespread availability of large language models, the use of AI by LiPs is likely to increase over the coming years. New regulation and/or guidance may be deemed necessary to enable the courts to manage this development.

Fred Groves and Christopher Watkins are Associates at Sharpe Pritchard LLP.


For further insight and resources on local government legal issues from Sharpe Pritchard, please visit the SharpeEdge page.

This article is for general awareness only and does not constitute legal or professional advice. The law may have changed since this page was first published. If you would like further advice and assistance in relation to any issue raised in this article, please contact us by telephone or email.
