I don't understand? It's actually a pretty good idea - ChatGPT will download whatever the link contains in its own sandboxed environment, without endangering your own machine. Or do you mean something else by saying we're cooked?


I doubt it downloaded or executed anything; it probably just did a base64 decode using some tool and then analysed the decoded bash command, which would be very easy. Seems like a good use of an LLM to me.


Out of curiosity I asked ChatGPT what the malware does, but erased some parts of the base64-encoded string. It still gave the same answer as the blog. I take that as a strong indication that this script is in its training set.


It can easily read base64 directly.


It did have the temp file name wrong.


ChatGPT didn’t download anything, hopefully.

The "we're cooked" refers to using ChatGPT to decode the base64 command.

That's like using ChatGPT to do simple arithmetic like 4*12, especially for a developer. There are tons of base64 decoders if you don't want to write that one-liner yourself.


Unless you're on Windows, there's one in /bin or /usr/bin; you don't even need to go find one.
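For example, assuming the GNU coreutils or BSD base64 tool (the encoded string here is a harmless placeholder, not the payload from the article):

    $ echo 'aGVsbG8gd29ybGQK' | base64 -d
    hello world

(On some macOS versions the decode flag is -D; openssl base64 -d also works.)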


So what? Why not use the everything machine for everything? You have it open anyway; it's a fast copy-paste.


Let's take a sledgehammer to crack a nut. I guess the next step is: ChatGPT, how much is 2+2?

No wonder we need a lot more power plants. Who cares how much CO2 is released just to build them. No wonder we don't make real progress in stopping climate change.

What about the everything machine called the brain?


> You have it open anyway

Imagine being this way. Hence "we're cooked".


Perhaps he means, "We have this massive AI problem," and the default answer is, "Let's add more AI into the mix."


True, but we also have an intelligibility problem, and “footrace” was already taken.



