Expert Witness Journal, Issue 65, February 2026
the court for misusing AI. For example, Choksi v IPS
Law LLP [2025] EWHC 2804 (Ch) where a witness
statement from the defendant’s managing partner
contained references to a number of cases that had
“wrong citations, wrong names or which simply did
not exist”, and MS v Secretary of State for the Home
Department (Professional Conduct: AI Generated
Documents) Bangladesh [2025] UKUT 305 (IAC)
where it was found that the barrister “had misused
artificial intelligence and attempted to mislead the
Tribunal”.
Experts
The use of AI by expert witnesses was addressed
in the latest Bond Solon Expert Witness Survey
published on 7 November 2025. 20% of respondents
stated that they had used artificial intelligence in
their role as an expert witness, an increase from
9.31% last year but still considerably lower than the
national average of 65% across UK workers.
Of the 20% of experts who had used AI, most had
done so to assist with research and others said
that they “used AI to rephrase writing, and check
grammar and spelling, or to calculate results from
data.”
The vast majority of respondents (89%) felt that specific guidance was required on the use of AI by expert witnesses in the UK. The survey states that "it is clear that the relatively low uptake of the technology is likely down to fear of inviting unintended criticism".

The survey also asked whether experts would "accept an instruction where the solicitor insisted on providing the expert witness with a draft expert report for the case, that was generated by AI". 14% said that they would accept such an instruction. This was raised as a matter of concern by Mr Justice Waksman, Judge in Charge of the Technology and Construction Court, at the Bond Solon Expert Witness Conference on 6 November 2025.

Dame Victoria Sharp, President of the King's Bench Division, issued a sharp warning on 6 June 2025 (R (on the application of Frederick Ayinde) v Haringey London Borough Council; Al-Haroun v Qatar National Bank QPSC and QNB Capital LLC [2025] EWHC 1383 (Admin)). In that decision, the Court addressed serious professional misconduct involving the misuse of AI by legal practitioners in two separate cases. Dame Victoria Sharp warned that:

"Freely available generative artificial intelligence tools, trained on a large language model such as ChatGPT are not capable of conducting reliable research. Such tools can produce apparently coherent and plausible responses to prompts, but those coherent and plausible responses may turn out to be entirely incorrect." [para 6]

"Those who use artificial intelligence to conduct legal research notwithstanding these risks have a professional duty therefore to check the accuracy of such research by reference to authoritative sources, before using it in the course of their professional work (to advise clients or before a court, for example)." [para 7]

"There are serious implications for the administration of justice and public confidence in the justice system if artificial intelligence is misused." [para 9]

Dame Victoria Sharp's judgment refers to an American case (Kohls v Ellison No 24-cv-3754 (D Minn 10 January 2025)) in which the parties relied on expert evidence about AI. She noted that "one of the experts had used generative AI to draft his report and it included citations of non-existent academic articles" and referred to observations made by United States District Judge Laura Provinzino in that case:

"The irony … a credentialed expert on the dangers of AI and misinformation, has fallen victim to the siren call of relying too heavily on AI – in a case that revolves around the dangers of AI, no less.
…
The Court thus adds its voice to a growing chorus around the country declaring the same message: verify AI-generated content in legal submissions!"

Comment
Whilst there is currently no specific guidance for expert witnesses on the use of AI, they should always comply with their duties under Part 35.3 of the Civil Procedure Rules (CPR): expert evidence presented to the court should be the independent product of the expert; an expert witness should provide independent assistance to the court by way of objective, unbiased opinion in relation to matters within their expertise; and an expert witness should state the facts or assumptions on which their opinion is based.

The CPR Part 35 duties make it incumbent upon the expert to check that the information obtained from AI, or from any other research source, is accurate. It would be helpful for CPR Part 35 to be updated to include specific guidance for experts on the use of AI. In the meantime, we would suggest that the warnings of Dame Victoria Sharp apply to experts as much as to lawyers, and that experts should bear them in mind, along with the recent judicial guidance. The Law Society guidance on generative AI states that a solicitor bears the professional responsibility for the factual accuracy of expert reports, and so it is important that solicitors seek to regulate the use of AI by their experts.