
Eh… your complaint describes every single piece of information available on the internet.

Let’s try it with other stuff:

“Looking at solutions on stack overflow outsources your brain”

“Searching arxiv for literature on a subject outsources your brain”

“Reading a tutorial on something outsources your brain”

There’s nothing that makes ChatGPT et al appreciably different from the above, other than the tendency to hallucinate.

ChatGPT is a better search engine than search engines for me: it gives links to cite what it’s talking about so I can check them, and it pays attention to precisely what I asked about and generally doesn’t include unrelated crap.

The only complaint I have is the hallucinations, but they just mean I have to check its sources, which is already the case for something as mundane as Wikipedia.

Ho hum. Maybe take some time to reevaluate your conclusions here.



That tendency to hallucinate that you so conveniently downplay is a major problem. I'll take reading the reference manual myself all day rather than sifting through the output of a bullshit generator.


I'm not trying to downplay it, it absolutely is a major problem.

But it's a very different problem from "outsourcing your brain". In fact, the tendency to hallucinate is the very reason you still need to use your brain when using an LLM.

As I said, it's no different from a web search or a Wikipedia read: it's one input, and you still have to think for yourself.



