Langella & Langella | Personal Injury Attorney - Hauppauge, NY

AI and Legal Practice: A Real-Life Nightmare for Lazy Lawyers

August 30, 2023

Thanks for joining us for part two in our series on AI and the law: a rapidly emerging force in legal proceedings that we all could stand to learn more about. If you have a passing interest in AI, or are a legal practitioner yourself, you may have already heard about the case we plan to discuss today. But what many of those articles fail to mention is just how bad a situation the lawyers in question put themselves in. It almost beggars belief how much confidence these attorneys placed in an AI tool.


A law firm right here in New York (of all places) had a fairly weak airline liability/injury case that required some rather routine legal research. Simple enough, right? Yes, it may be difficult to find some of the necessary precedent in the case law, but that's nothing that some good old-fashioned time and effort can't solve. It's what we lawyers do, day in and day out. But apparently there are some practitioners out there who find the tedium of legal research to be too much effort. And that's where the firm in question made the first of many horrible mistakes.


After the airline managed to remove the case to federal court, the lawyers overseeing it simply prompted ChatGPT to do their research for them. And what did the AI come up with? Well, if you read our first article on generative AI, you will recall that a tool like this draws on an unimaginably large body of harvested information to generate its answers. But when it is incapable of coming up with the correct answer, that does not typically stop it from confidently providing AN answer. And in this case, when it could not find proper relevant case law, it simply made some up.


That’s right. The AI invented case law that, in reality, did not exist. It fabricated entire fanciful cases from whole cloth. And due to either a lack of acumen or a lack of fact-checking, the lawyers working the case did NOT realize that the cases they were about to cite in federal court were non-existent.


This would be bad enough if it had merely delayed the casework. Perhaps the lawyers would have noticed the problem while verifying their work before submitting early motions in the case, and they would have been under the gun to effectively start over. But in a move that will likely shock legal professionals and laypersons alike, they simply ran with everything the AI gave them, all the way through submission of legal documents to the court.


In fact, it has been speculated that the submitted motions containing the erroneous information were drafted by ChatGPT as well. Imagine standing before a judge at a hearing and realizing, in real time, that the case you effectively did no real work on had no real information in it.


In all, the plaintiff's motion in opposition cited five different cases that ChatGPT had made up. And the consequences of this incredible faux pas are exactly what they should be: massive sanctions, and a very high probability of disbarment.
