I think it's a mix of people being genuinely hyped and wishing this were the future. For me, the productivity gains are mostly in areas where I don't have expertise (the downside, of course, being that I don't learn much if I let the AI do the work) or when I know it's a throwaway thing and I absolutely don't care about the quality. For example, I'm reading a series of books to my daughter at bedtime, and one of them doesn't have a Polish translation, and the Polish publisher stopped working with the author. I vibe coded an app that extracts an epub, translates each of the chapters, and packages it back into an epub, with a few features: saving the translations in SQLite so the translation can be stopped and resumed, the ability to edit translations, adding custom instructions, etc. It's only ~1000 lines of Rust code, but Claude generated it while I was making dinner (I just checked progress and prompted the next steps every few minutes). I can guarantee it would have taken me at least an evening of coding, probably with debugging along the way, to make it work. So while I know it still falls short in certain scenarios (novel code in a niche technology, very big projects, etc.), it's kind of a game changer in others. It lets me build small tools I just wouldn't have time for otherwise.
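If you're curious, the resume-on-restart part is roughly this shape (a simplified sketch, not the actual app: the epub handling and the translation call are stubbed out, and rusqlite is just my assumption for the SQLite layer):

```rust
// Sketch: cache each chapter's translation in SQLite keyed by index,
// so a re-run only translates chapters that aren't already done.
use rusqlite::{params, Connection, OptionalExtension};

fn translate(chapter: &str) -> String {
    // placeholder for the actual LLM translation call
    format!("[PL] {chapter}")
}

fn main() -> rusqlite::Result<()> {
    let db = Connection::open("translations.sqlite")?;
    db.execute(
        "CREATE TABLE IF NOT EXISTS chapters (
             idx INTEGER PRIMARY KEY,
             translated TEXT NOT NULL
         )",
        [],
    )?;

    // pretend these came out of the unpacked epub
    let chapters = vec!["Chapter one...", "Chapter two..."];

    for (idx, chapter) in chapters.iter().enumerate() {
        // skip chapters already translated in a previous run
        let done: Option<String> = db
            .query_row(
                "SELECT translated FROM chapters WHERE idx = ?1",
                params![idx as i64],
                |row| row.get(0),
            )
            .optional()?;
        if done.is_some() {
            continue;
        }
        let translated = translate(chapter);
        db.execute(
            "INSERT INTO chapters (idx, translated) VALUES (?1, ?2)",
            params![idx as i64, translated],
        )?;
    }
    // ...repackage the translated chapters into a new epub here
    Ok(())
}
```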
So I guess what I'm saying is, even with all the limitations, I kinda understand the hype. That said, I think some people may indeed exaggerate LLMs' capabilities, unless they actually know some secret recipe to make them do all those awesome hyped things (but then I would love to see it).
> you have to understand almost all of the language very intimately to be a productive programmer,
I've seen absolute Rust noobs write production code in Rust, so I have no idea where you got that notion from. Most of the apps I've written or worked with don't need explicit lifetimes at all. If you don't need absolute performance with almost no memory allocations, it's honestly not rocket science. Even more so if you're writing web backends; then the code doesn't really differ that much from Go.
Worker threads can't handle I/O, so a single-process Node.js app will still have a much lower connection limit than languages where you can handle I/O on multiple threads. Obviously, the second thing you mention, i.e. multiple processes, "solves" this problem, but at the cost of running more than one process. In the case of web apps it probably doesn't matter too much (although it can hurt performance, especially if you cache things in memory), but there are cases where it just isn't a good trade-off.
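To show what I mean by handling I/O on multiple threads in one process, here's a minimal Rust/tokio sketch as one example (my own illustration, address and buffer size made up): every accepted connection becomes a task that tokio's default multi-threaded runtime can drive on any worker thread.

```rust
use tokio::io::{AsyncReadExt, AsyncWriteExt};
use tokio::net::TcpListener;

#[tokio::main] // defaults to a multi-threaded runtime sized to the CPU count
async fn main() -> std::io::Result<()> {
    let listener = TcpListener::bind("127.0.0.1:8080").await?;
    loop {
        let (mut socket, _) = listener.accept().await?;
        // each connection is an independent task; the runtime schedules it
        // on whichever worker thread is free
        tokio::spawn(async move {
            let mut buf = [0u8; 1024];
            if let Ok(n) = socket.read(&mut buf).await {
                let _ = socket.write_all(&buf[..n]).await;
            }
        });
    }
}
```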
What kind of reality check would that be, when the original sudo had two even more serious security issues this year despite being "tried and tested"?
I love WASM and WASI, but it's not nearly the same, unfortunately. Performance takes a hit, you can't use async in a straightforward way, launching hundreds of thousands of tasks is problematic, etc. WASM is great for making your app extensible, but I don't see it replacing an ABI anytime soon.
In my experience it really depends on the situation. For stable APIs that have been around for years, sure, it doesn't really matter that much. But if you try to use a library that had significant changes after the cutoff, the models tend to do things the old way, even if you provide a link to examples with new code.
While I agree with the sentiment here, i.e. that Ruby doesn't necessarily need namespaces, I also don't think it's a good idea to base Ruby usage on what Shopify is doing. They have so many expert Ruby devs, and whole teams writing extra tooling, that I'd argue they shouldn't be compared to most Ruby/Rails usage out there.
Shopify would benefit a ton from "some" namespaces. In a way, packwerk[0] was an attempt at bringing some of the benefits of namespaces.
But I don't personally think Shopify would benefit from this specific implementation of namespaces (a couple of colleagues do). I'm not even sure "namespace" is the right term for the feature; to me it's more a sort of lightweight sandboxing.
Also:
> They have so many expert Ruby devs
If anything, the average Ruby expertise at Shopify is likely noticeably lower than in most Ruby/Rails shops.
Have you ever written a web app in Rust? Most of the code takes the form of handlers that receive data, process it, and return data. There's rarely a need to think about lifetimes or borrowing in those scenarios.
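To make that concrete, here's roughly what such a handler looks like (a minimal sketch using axum and serde, which are my picks for the example, not anything prescribed): owned structs in, owned structs out, no explicit lifetimes anywhere.

```rust
use axum::{routing::post, Json, Router};
use serde::{Deserialize, Serialize};

#[derive(Deserialize)]
struct CreateUser {
    name: String,
}

#[derive(Serialize)]
struct User {
    id: u64,
    name: String,
}

// a typical handler: deserialize the request body, do some work, serialize a response
async fn create_user(Json(payload): Json<CreateUser>) -> Json<User> {
    Json(User { id: 1, name: payload.name })
}

#[tokio::main]
async fn main() {
    let app = Router::new().route("/users", post(create_user));
    let listener = tokio::net::TcpListener::bind("127.0.0.1:3000").await.unwrap();
    axum::serve(listener, app).await.unwrap();
}
```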