News & Info
Why AI legal advice can cause more problems than it solves
Concerns have been raised recently about the use of AI in Court cases. We explain the risks of using AI for legal advice and the importance of choosing ‘actual intelligence’ over ‘artificial intelligence’.
In recent times, the use of AI in law regularly comes up for discussion with clients and colleagues. AI can sometimes be treated like an oracle that provides answers to all problems. However, while AI can be a powerful tool in many situations, it is often problematic when used for legal purposes and personal injury claims.
Human intelligence, experience and judgement are vital elements in legal advice and Court cases that cannot be substituted with a machine solution.
At Tracey Solicitors LLP, we like to support our clients with the most advanced proven technology to help secure positive results for their cases. However, we believe AI is best used as a tool for administrative support (such as summarising documents, provided the original is also read) rather than as a substitute for a qualified solicitor.
We say this because serious concerns have recently arisen following the use of AI in Court cases and legal offices.
According to the Law Society of Ireland:
“As use of AI in legal offices increases, the number of cases is also increasing where inaccurate, false or fictitious information in the form of fake citations has been used in Court submissions or filings.”
Some of the dangers of using AI for legal advice include:
AI hallucinations and fabricated case law:
AI often generates fake citations, non-existent precedents, and misleading legal arguments that appear authoritative. Courts have already reprimanded people for submitting AI-generated documents that contained fictitious information.
Confidentiality and data privacy risks:
Inputting sensitive information into public AI tools (like free versions of ChatGPT) can breach client confidentiality. Once confidential data is entered, it may become part of the public domain or be used to train future models, potentially waiving legal professional privilege and disclosing your private information to others.
Outdated or non-applicable information:
AI models may rely on outdated laws or fail to account for recent judicial rulings. Furthermore, AI often cannot distinguish between different state or national jurisdictions, providing irrelevant legal advice that is not applicable to the user’s location.
Lack of nuance and empathy:
AI does not understand legal context, human emotions, or the nuances required to assess witness credibility. It cannot identify when a client or witness is plausible, vulnerable or exaggerating, which is crucial in litigation.
Automation bias and overconfidence:
Users often trust AI outputs simply because they appear confident and well-organised, even if the information is wrong, and Judges give little weight to large volumes of meaningless submissions.
No discernment or understanding of a situation:
AI has no ability to discern good or bad advice, nor does it have knowledge of the tactics or attitudes of specific courts, legal teams, insurance companies, or professional experts.
No professional accountability:
Unlike human lawyers, AI tools have no professional indemnity insurance, no regulatory body, and no liability for damage caused by bad advice. In other words, if you rely on AI, there is no person to hold accountable for that advice if it proves to be incorrect.
Risk of case being dismissed on the basis of false evidence:
It’s important to remember that most cases require claimants to swear affidavits as to the truthfulness of the documentation used in their case. Any documents generated by clients with AI that cannot be verified run the risk of a Court dismissing the case on the basis of false evidence. AI cannot be used in place of a claimant giving sworn evidence where necessary.
If you have any questions regarding AI and your case, please feel free to contact our office for a confidential discussion.
Disclaimer: This article has been prepared by Tracey Solicitors LLP for general guidance only and should not be regarded as a substitute for professional advice.