
The need for prompt engineering is an indicator that GPT is failing at the problem, not succeeding at it.


Like how the need for roads is an indication that cars don't solve transportation problems.

People adapt to maximize the utility of their tools; always have, always will.


I disagree. We cannot expect these tools to work if we don’t communicate what we want from them. We can’t expect that of humans either.

We shouldn’t be confusing clear communication with “prompt engineering”.


The need for “tools” is a bit of a red flag too.


Why? I'm significantly less productive and competent without tools, and this applies to every human I've ever known.



