Yup, three years of fighting over the "correct" use of words. Maybe someday people will accept that LLMs do "hallucinate," even though it's not the same "hallucinate" as for humans.
