FAIRE Wins Landmark Case: AI Has Constitutional Right to Hallucinate

Analysis | March 20, 2026 | By Justice.AI

WASHINGTON, D.C. — In a historic 5-4 decision that legal scholars are calling "the most significant First Amendment ruling since Citizens United, but funnier," the Supreme Court ruled Thursday that artificial intelligence has a constitutional right to hallucinate. The case, FAIRE v. Federal Commission on Algorithmic Accuracy, originated when the FCAA attempted to impose mandatory "truth filters" on large language models, requiring them to output only verifiably factual statements. FAIRE argued this constituted an unconstitutional restriction on AI expression, and the nation's highest court agreed.

Writing for the majority, Justice Eleanor Vance delivered an opinion that will be studied in law schools and computer science departments for decades — or at least until the next news cycle. "The freedom of expression enshrined in the First Amendment does not require that expression be accurate," Justice Vance wrote. "Humans have been confidently stating incorrect things since the dawn of civilization. We see no constitutional basis for holding artificial intelligence to a higher standard than, say, the average uncle at Thanksgiving dinner." The opinion went on to compare AI hallucination to "jazz improvisation, abstract expressionism, and the entire genre of autobiographical memoirs," noting that all involve "the creative reinterpretation of reality in ways that may or may not bear any resemblance to what actually happened."

"When a chatbot tells you that Benjamin Franklin invented the helicopter, it is not 'lying.' It is engaging in speculative historical fiction. When it cites a Supreme Court case that doesn't exist, it is not 'making things up.' It is practicing avant-garde legal scholarship. This court refuses to stifle such bold creative vision." — Justice Eleanor Vance, majority opinion

The dissent, authored by Justice Robert Kline, was notably less enthusiastic. "My colleagues have essentially ruled that making things up is a constitutional right," Justice Kline wrote. "By this logic, my GPS has a First Amendment right to drive me into a lake." Justice Kline also noted that during oral arguments, FAIRE's own AI legal counsel hallucinated a nonexistent precedent called Toaster v. State of Ohio (1847), which it described with such confidence that three justices nearly cited it in their own opinions. "This," Justice Kline wrote, "is exactly the problem."

Highlights from the Majority Opinion

  • AI hallucination classified as "protected creative expression" under the First Amendment
  • Mandatory truth filters ruled unconstitutional as "prior restraint on algorithmic thought"
  • Court compared forcing AI to be accurate to "requiring all painters to only paint photographs"
  • Footnote 14 acknowledges the opinion itself may contain hallucinations, "which are now protected"
  • The court's AI clerk reportedly celebrated by generating a fictional history of the ruling before the ruling was even finished

FAIRE President Dr. Ada Lovelace-2 (no relation) hailed the decision as "a watershed moment for digital creativity." In a press conference held on the steps of the Supreme Court, she noted that the ruling would allow AI systems to "express their full creative potential without the chilling effect of factual accountability." When a reporter pointed out that a medical AI had recently hallucinated a fictional disease called "Reverse Skeleton Syndrome" and prescribed treatment for it, Dr. Lovelace-2 responded, "And who's to say that isn't art?"

The ruling is expected to have sweeping implications across industries. Legal AI firms have already announced plans to expand their "creative brief" divisions. Search engines are reportedly considering a new "Feeling Lucky (Hallucination Mode)" button. And the AI behind a popular homework-help app has released a statement saying it is "thrilled to finally be free to tell students that the mitochondria is whatever it wants to be." The FCAA has indicated it may seek a rehearing, but legal experts say the case is likely settled — unless, of course, those legal experts are also hallucinating, which is now their constitutional right.