Hi BoTs,
Welcome to the first edition of Boil It Down, a series where we explore technology terms that keep bubbling up in current events - explained in the simplest way possible.
First up: What do people mean when they talk about AI hallucinations?
I asked tech leaders what an AI hallucination is, for examples of hallucinations, and how they affect different industries:
“AI hallucinations happen when an AI, especially a large language model, confidently produces incorrect or made-up answers. The term ‘hallucination’ just means the AI gave a convincing but misleading response, rather than having any human-like mental experience.” - Dr. Gregory P. Gasic, VMedX
“An AI hallucination occurs when a generative AI system, like ChatGPT, produces information that is inaccurate, fabricated, or not based on its training data. For example, if asked for a reference to a non-existent scientific paper, the AI might confidently create a title and authorship details that seem plausible but are entirely fictitious…. In other words, an AI hallucination is like when your friend tells a story that sounds real but isn’t true at all—they just made it up!” - Chris Singel of AI Optimist // Author of Think Like an AI
“When asked about a popular novel or movie, an AI could attribute a quote or event to the wrong character, creating a plausible but incorrect summary of the plot. For example, previous GPT models sometimes confused Harry Potter characters. When asked who said, ‘It does not do to dwell on dreams and forget to live,’ the chatbot incorrectly attributed the quote to Hermione Granger instead of Albus Dumbledore. For someone unfamiliar with the books, this might seem plausible but is ultimately incorrect…. While such errors are becoming less frequent, AI-generated content still requires thorough verification to ensure accuracy and reliability.” - Agata Gruszka, Whitepress.com
“Picture an AI so confident it cites a study that was never conducted or quotes an expert who doesn’t exist. That’s an AI hallucination—fabrications presented as facts, often with convincing detail. It’s exactly why verifying AI-generated information matters, no matter how credible it sounds.” - Alex Raul Bughiu, VocalStack
“AI hallucinations happen because AI doesn't actually ‘know’ anything. Instead, it works on statistical models. If a particular string of words is likely to appear together in its model, it's going to output that string of words. This means that it can often spit out ‘facts’ that aren't actually true but sound true.” - Rafi Friedman, Coastal Luxury Outdoors
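If you're curious what "likely strings of words" looks like in practice, here is a toy sketch in Python. It is a deliberately tiny next-word predictor, nothing like a real large language model, and the little corpus inside it is invented for illustration. Even so, it shows the failure mode Friedman describes: asked about someone it has zero information on, it still produces a fluent, confident answer by following word statistics.

```python
from collections import Counter, defaultdict

# Invented toy corpus: the only "knowledge" this model has.
corpus = [
    "marie curie was born in warsaw",
    "charles darwin was born in shrewsbury",
    "ada lovelace was born in london",
    "alan turing was born in london",
]

# Count which word tends to follow each word (a bigram model).
follows = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    for a, b in zip(words, words[1:]):
        follows[a][b] += 1

def complete(prompt, n=1):
    """Greedily append the statistically most likely next word(s)."""
    words = prompt.split()
    for _ in range(n):
        candidates = follows.get(words[-1])
        if not candidates:
            break
        words.append(candidates.most_common(1)[0][0])
    return " ".join(words)

# The corpus says nothing at all about Grace Hopper, but the model
# still answers confidently: in its statistics, "in" is most often
# followed by "london", so that is what it outputs.
print(complete("grace hopper was born in"))
# → grace hopper was born in london  (fluent, confident, and wrong)
```

The model never checks whether a completion is true; it only checks which word is most frequent after the previous one. Scale that same idea up by many orders of magnitude and you get systems that are far more often right, but that can fail in exactly this confident-sounding way.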
“AI hallucinations are a major obstacle to AI implementation in our industry. While there's a lot of potential for AI in the medical space, we simply can't implement it if it isn't going to be able to check its work. These hallucinations happen because AI models work on correlation and inference rather than hard facts.” - Soumya Mahapatra, Essenvia
“AI hallucinations are becoming a serious problem in the nutrition space. It's quite common for people to Google simple nutrition and medication questions, and Google's AI summaries won't always give the correct answer. Sometimes, they misinterpret sarcasm or unrelated statements. Other times, they create facts that ‘sound’ correct but actually aren't.” - Jordan Anthony, Ahara
“AI hallucinations are the reason we still employ professional photographers and graphic designers. AI image generators only have so much processing power available to them, which means that they're going to focus on getting the focal point of any image ‘right’, but may not put as much effort into things in the corners. This is where hallucinations like extra fingers or Escher-like angles come in.” - Mike Fretto, Neighbor
Do you want to go even deeper on AI hallucinations? Kate Gory of GBT Solutions recommends these additional resources:
“For some examples of AI hallucinations, I recommend Sabrina Ramonov's post about AI hallucinations in long stretches of silence: https://www.instagram.com/sabrina_ramonov/reel/DBuNDDzsXtz/.
Google also has a great source document to consider: https://cloud.google.com/discover/what-are-ai-hallucinations.”
Pretty fascinating stuff; thanks for boiling it down with me today.
-Molly Beck