
Litigant in Person uses ChatGPT in UK Court Case


We seem to hear reports on the increasing use of AI on an almost daily basis, and the legal profession is not immune to this trend. A recent case, however, is a timely warning that this emerging technology must be used carefully.

It has been reported in the Law Society Gazette and elsewhere that a litigant in person tried to present false submissions in court based on answers provided by the ChatGPT AI tool. The case concerned a civil claim being dealt with in Manchester. It is said that proceedings ended for the day with a barrister for one party submitting that there was no precedent to support the case being advanced by the other party. The following day, the litigant in person returned to court citing four cases in support of their position. On closer analysis by the barrister, it transpired that one case name had been made up entirely, and while the other three were real case names, the paragraphs cited from them were completely fabricated. When asked about this by the judge, the litigant in person admitted to having used ChatGPT to locate cases to support their position.

In this case it seems the judge accepted that the misleading citations from the litigant in person were inadvertent, and they were not penalised. However, in the Supreme Court decision in Barton v Wright Hassall LLP (2018), the court stated that litigants in person should be expected to make themselves aware of, and use, the Civil Procedure Rules, that they are under the same duty as lawyers, and that they should receive no special treatment.

Comment: This case serves as a reminder of the need to check all documents carefully, whether you are involved in litigation as part of your job or as a party to a case. Although the litigant in person was not penalised on this occasion, as we all become increasingly familiar with AI it is likely that in future litigants in person, as well as lawyers, will be expected to use it very cautiously in the courtroom and will be held responsible for misleading submissions. Getting it wrong could have serious consequences, including but not limited to cost penalties and an adverse impact on your case. This is also something for the courts to keep a close eye on, bearing in mind the increasing number of litigants in person appearing in our courts as a result of reforms to legal aid, small claims limits and fixed cost extensions.

Kelvin Farmaner is a Partner with Trethowans LLP and heads the Insurance and Regulatory team, which specialises in defending claims. The team is one of only 12 teams in the whole of the UK to be ranked for Defendant Personal Injury work in Chambers and Partners. Kelvin Farmaner and Bethany Blamire are also ranked as leading individuals in Chambers, in a list of only 36 lawyers nationally. The team and individuals are also ranked by the Legal 500 guide to law firms.
