esafak's comments — Hacker News

Yes, serving requires infra, too. But you can use infra optimized for serving; Nvidia GPUs are not the only game in town.

I don't know, man. At this point I'm liable to ask "Why are you using C++?" if you start a new project. Let them defend their language!

Free hosting, CI minutes, and an ecosystem.

Committing and pushing just to test small incremental changes, self-hosted runners' time still counting toward CI minutes, and an ecosystem hell-bent on presenting security holes as new features. I'm a bit unimpressed :)

Dagger. Workflows that run anywhere, including locally.

I've seen Dagger pipelines; they're horrendous. Just have GitHub Actions call out to a task runner like Make or Taskfile, and use an environment manager like Mise or Nix to install all the tools.
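A minimal sketch of that pattern (workflow name, action versions, and the `make test` target are illustrative, not prescriptive):

```yaml
# .github/workflows/ci.yml — CI stays a thin shim over the task runner
name: ci
on: [push]
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: jdx/mise-action@v2   # installs the tools pinned in mise.toml
      - run: make test             # same entry point you run locally
```

The point is that the workflow file contains no logic of its own, so `make test` behaves identically on a laptop and in CI.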

I think that is a good pattern too, though I would replace the make/taskfile step with something bazel-like.

Dagger used to be more declarative with CUE, but demand was not strong enough.


I do not follow. How does that change anything? Don't things still go wrong? Do you not need to debug?

Sorry for not being clear enough.

The point is that it is very difficult to replicate the environment of a hosted GitHub Actions runner, and having to do so defeats the ease of use the platform provides.


I don't see how an AGI coder will end scarcity; it will simply debase knowledge work. Physical things we need, like housing, are still scarce.

The AGI can build robots that build houses. It has a virtually unlimited amount of working time to dedicate to the robotics engineering problems.

We'd still be limited to some extent by raw materials and land, but it would be much less significant.


That can trim costs but not drive them to zero. If you assume that the computer is going to do all the work, won't your salary erode, making it harder for you to afford scarce things?

Why can't you use LLMs with formal methods? Mathematicians are using LLMs to develop complex proofs. How is that any different?

I don't know why you're being downvoted, I think you're right.

I think LLMs need different coding languages, ones that emphasise correctness and formal methods. I think we'll develop languages specifically for use with LLMs that work better for this task.

Of course, training an LLM to use it then becomes a chicken/egg problem, but I don't think that's insurmountable.


maybe. I think we're really just starting this, and I suspect that trying to fuse neural networks with symbolic logic is a really interesting direction to explore.

that's kind of not what we're talking about. a pretty large fraction of the community thinks programming is stone cold over because we can talk to an LLM and have it spit out some code that eventually compiles.

personally I think there will be a huge shift in the way things are done. it just won't look like Claude.


That's what exploration looks like: mutation plus selection. I think you know this, but consider exploration willful, perhaps?

Yes, that's it. I could have worded it better. My point was that it's random; evolution isn't a directed, willful phenomenon but a consequence of the physical world/physics.
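The mutation-plus-selection point can be caricatured in a few lines of Python (a toy hill climber under assumed parameters, not a claim about biology): each mutation is undirected, and only selection makes the result look goal-directed.

```python
import random

def evolve(target, alphabet="abcdefghijklmnopqrstuvwxyz ", seed=0):
    """Toy mutation-plus-selection loop: random single-character
    mutations, kept only when fitness does not decrease."""
    rng = random.Random(seed)
    current = [rng.choice(alphabet) for _ in target]
    fitness = lambda s: sum(a == b for a, b in zip(s, target))
    while fitness(current) < len(target):
        candidate = current[:]
        i = rng.randrange(len(target))
        candidate[i] = rng.choice(alphabet)      # undirected mutation
        if fitness(candidate) >= fitness(current):  # selection
            current = candidate
    return "".join(current)
```

No step in the loop "knows" the destination; the apparent direction comes entirely from which random changes survive.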


How are academics going to assess AI-coauthored research for appointment and promotion?

Don't worry; within the next three years, AI itself will be a better coauthor.
