AI Hallucinations and False Information in Legal AI at Biglaw

18th December 2024
3 min
Note: This article is just one of 60+ sections from our full report titled: The 2024 Legal AI Retrospective - Key Lessons from the Past Year. Please download the full report to check any citations.

Challenge: AI Hallucinations and False Information

AI models can sometimes generate false or nonsensical information, known as "hallucinations."

In 2024, the National Center for State Courts recommended that judicial officers and lawyers be trained to critically evaluate AI-generated content rather than blindly trust its outputs.[129]

In a 2024 survey conducted by LexisNexis, concerns over AI hallucinations were cited as the biggest hurdle to the adoption of generative AI in law firms, named by 55% of respondents.[130]

Written by

Alex Denne
Head of Growth


