Anthropic’s lawyer was forced to apologize after Claude hallucinated a legal citation

techcrunch.com | Published: 5/15/2025

Summary

In its ongoing legal battle with music publishers, Anthropic's lawyer admitted that a court filing contained an erroneous citation generated by Claude. The error was attributed to a hallucination by the AI that a manual citation check failed to catch, and the company apologized, describing the mistake as unintentional. Earlier this week, Olivia Chen, Anthropic's expert witness, faced backlash for allegedly citing fabricated articles in her testimony, prompting a federal judge to order the company to respond. Despite such mishaps, and despite growing criticism and regulatory scrutiny of AI in legal work, startups continue to build on tools like Claude and ChatGPT, as seen with legal AI startup Harvey's recent $250 million funding round.