AI hallucination, that is, the generation of factually incorrect or nonsensical outputs, remains a critical limiting factor in deploying language models reliably.