
It’s been more than 3 years and people still can’t understand this.


Yup, three years of fighting over the "correct" use of words. Maybe someday people will accept that LLMs do "hallucinate", even though it's not the same "hallucinate" as it is for humans.


Three top-level comments so far, and as far as I can tell each is entirely pointless yapping about semantics around 'hallucination'.

Who cares? I wonder whether any of the commenters are qualified enough to understand the research at all. I am not.


People don't want to understand this.


People have a strong financial incentive not to understand this. It's subprime mortgages all over again.



