There is no such thing as a "hallucination" that could be isolated from "not a hallucination" in a provable systematic way because all they do is hallucinate.
I'm extremely comfortable calling this paper complete and utter bullshit (or, I suppose if I'm being charitable, extremely poorly titled) from the title alone.
Arguably, all we do is something similar to hallucination; it's just that hundreds of millions of years have selected against brains that generate internal states that lead to counter-survival behavior.
I recently almost fell on a tram as it accelerated suddenly; my arm reached out for a stanchion that was out of my vision, so rapidly I wasn't aware of what I was doing before it had happened. All of this occurred using subconscious processes, based on a non-physical internal mental model of something I literally couldn't see at the moment it happened. Consciousness is over-rated; I believe Thomas Metzinger's work on consciousness (specifically, the illusion of consciousness) captures something really important about the nature of how our minds really work.
The input of an LLM is real data, and the n-dimensional space an LLM works in is a reflection of this. Statistically speaking, there should be a way of knowing when an LLM is confident vs. when it is not.
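To make that concrete, here's a minimal sketch of one common proxy for this: the Shannon entropy of the model's next-token distribution. Low entropy means the probability mass is concentrated on a few tokens ("confident"); high entropy means it's spread out. This is an illustration of the statistical intuition, not a claim that entropy alone detects hallucinations (calibration is its own hard problem):

```python
import numpy as np

def token_entropy(logits):
    """Shannon entropy (in nats) of the next-token distribution.

    logits: raw scores over the vocabulary, before softmax.
    """
    # Numerically stable softmax.
    z = logits - np.max(logits)
    p = np.exp(z) / np.sum(np.exp(z))
    # Small epsilon avoids log(0) for near-zero probabilities.
    return float(-np.sum(p * np.log(p + 1e-12)))

# A peaked distribution vs. a flat one over a toy 5-token vocabulary:
confident = token_entropy(np.array([10.0, 0.0, 0.0, 0.0, 0.0]))
uncertain = token_entropy(np.array([1.0, 1.0, 1.0, 1.0, 1.0]))
```

The flat distribution gives the maximum possible entropy, ln(5), while the peaked one is close to zero; ranking generations by this kind of score is the basis of several uncertainty-estimation approaches.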