
Go is a GC language that has eaten a chunk of the industry (Docker, TypeScript, Kubernetes... Minio... and many more I'm sure) and only some people cry about it, but you know who else owns sizable chunks of the industry? Java and C#, which are both GC languages. While some people waste hours crying about GCs, the rest of us have built the future around them. Hell, all of AI is eaten by Python, another GC language.


And in D, there's nothing stopping you from either using or not using the GC. One of the key features of D is that it's possible to mix and match different memory management strategies. Maybe I have a low-level computational kernel written C-style with manual memory management, and then for scripting I have a quick and dirty implementation of Scheme also written in D but using the GC. Perfectly fine for those two things to coexist in the same codebase, and in fact having them coexist like that is useful.
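
Roughly, a sketch of what that mixing looks like (illustrative only, not from any real codebase): a @nogc function working on a manually managed buffer, called from ordinary GC-using code in the same module.

    // Sketch only: mixing @nogc and GC code in one module.
    import core.stdc.stdlib : free, malloc;
    import std.format : format;
    import std.stdio : writeln;

    // Low-level kernel: the compiler rejects any GC allocation in here.
    double sumSquares(const(double)[] xs) @nogc nothrow
    {
        double total = 0;
        foreach (x; xs)
            total += x * x;
        return total;
    }

    // High-level "scripting" side: uses the GC freely (format allocates).
    void report(const(double)[] xs)
    {
        writeln(format("sum of squares: %s", sumSquares(xs)));
    }

    void main()
    {
        // Manually managed buffer feeding the @nogc kernel.
        enum n = 4;
        auto buf = (cast(double*) malloc(n * double.sizeof))[0 .. n];
        scope (exit) free(buf.ptr);
        foreach (i, ref x; buf)
            x = i + 1.0;
        report(buf);
    }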


> And in D, there's nothing stopping you from either using or not using the GC.

Wait, so are you, or are you not, saying that a GC-less D program can use libraries written with the assumption that there's a GC? The statement "there's nothing stopping you from not using the GC" implies that all libraries work with D-without-GC; otherwise the lack of libraries written for D-without-GC would be stopping you from not using the GC.


Sorry, but it's more complicated than this. I understand the point you're making, but if your criterion for using a language is "if any feature prevents me from using 100% of the libraries that have been written for the language, then the language is of no use to me", well... I'm not sure what to tell you.

It's not all or nothing with the GC, as I explained in another reply. There are many libraries that use the GC, many that don't. If you're writing code with the assumption that you'll just be plugging together libraries to do all the heavy lifting, D may not be the right language for you. There is a definite DIY hacker mentality in the community. Flexibility in attitude and approach are rewarded.

Something else to consider is that the GC is often more useful for high-level code, while manual memory management is more useful for low-level code. This is natural because you can always use non-GC (e.g. C) libraries from GC code, but (as you point out) not necessarily the other way around. That's how the language is supposed to be used.
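
To make that direction concrete, here's a trivial sketch (my own example) of GC-using D code calling a plain C function, strlen from libc, through the extern(C) declarations in core.stdc; the C side never has to know the GC exists.

    // Sketch: high-level GC-using D code calling a non-GC C library function.
    import core.stdc.string : strlen; // plain C function from libc
    import std.stdio : writeln;
    import std.string : toStringz;    // may GC-allocate a NUL-terminated copy

    void main()
    {
        string s = "hello from GC land"; // ordinary D string
        writeln("C says the length is ", strlen(s.toStringz));
    }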

You can use the GC for a GUI or some other loose UI thing and drop down to tighter C-style code for other things. The benefit is that you can do this in one language as opposed to using e.g. Python + C++. Debugging software written in a mixture of languages like that can be a nightmare. Maybe this feature is useful for you, maybe not. It all depends on what you're trying to do.


You're the guy who said that "nothing is preventing you" from using D without a GC. A lack of libraries which work without a GC is something preventing you from using D without a GC. Just be honest.


I just said there are loads of libraries which have been written explicitly to work with the GC disabled. Did you read what I wrote?

It looks like you work in a ton of different domains. I think, based on what I've written in response to you so far, it should be easy to see that D is likely a good fit for some things you work on and a bad fit for others. I don't see what the problem is.

The D community is full of really nice and interesting people who are fun to interact with. It's also got a smaller number of people who complain loudly about the GC. This latter contingent is perceived as being maybe a bit frustrating and unreasonable.

I don't care whether you check D out or not. But your initial foray into this thread was to cast shade on D by mentioning issues with proprietary compilers (which hasn't been a thing in years) and insinuating that the community was fractured because of the GC. Since you clearly don't know that much about the language and have no vested interest in it, why not leave well enough alone instead of muddying the waters with misleading and biased commentary?


My conclusion remains that the language is fractured by the optional GC and that its adoption was severely hampered by the proprietary toolchain. Nothing you have said meaningfully challenges that, in my opinion, and many things you've said support it. I don't think anything I've said is misleading.


Rust is fractured by the optional async-ness of libraries. And all languages are fractured by GPL vs non-GPL libraries.

You have a point, but it is not worth the drama. D's biggest problem comes from the strong opinions of people who have not tried using it.


> language is fractured by the optional GC

That's an interesting fantasy you have constructed. You should try listening to the people who actually use it.


I did; I listened to sfpotter. From their description, it's the source of a great deal of animosity within the D community.


Well, no, it isn't. There's some frustration. But mostly what I've seen is a lot of lengthy conversations with a lot of give and take. There's a small number of loud people who complain intensely, but there are also people who are against the GC who write lots of libraries that avoid it and even push the language in a healthy direction. If these people hadn't argued so strenuously against the GC, I doubt Phobos would have started moving in the direction of using the GC less and less. This is actually the sign of a healthy community.


Still, nothing prevents you from rewriting those libraries to not use a GC.

Stop making excuses or expecting others to do your work for you.

Nothing is preventing you.


There are third party libraries for the language that don't use the GC, but as far as I know there isn't a standardized one that people pick.


I'm not strongly for or against (non-deterministic) GC. Deterministic memory management in Rust, or in (no true Scotsman) correctly written C++, has benefits, but often I don't care and Go / Java / C# / Python are all fine.

I think you're really overstepping with "AI is eaten by Python." I can imagine an AI stack without Python: llama.cpp (for inference, not training... it isn't completely Python-free, but most of the core functionality is not Python, and not GC'd at all). I cannot imagine an AI stack without CUDA + C++. Even the premier Python tools (PyTorch, vLLM) would be non-functional without these.

While some very common interfaces to AI require a GC'd language, I think that if you deleted the non-GC parts you'd be completely stuck, with years of digging yourself out, but if you deleted the GC'd parts you could end up with a usable thing in very short order.


NVIDIA has decided that the market demand to do everything in Python justifies the development cost of making Python fast in CUDA.

So now you can use PTX directly from Python, and with the new cuTile approach you can write CUDA kernels in a Python subset.

Many of these tools get combined because that is what is already there, and the large majority of us don't want, or don't have the resources, to spend bootstrapping a whole new world.

Until there is some monetary advantage in doing so.


For 99% of industry use cases, some kind of GC is good enough, and even when that isn't the case, there is no need to throw the baby out with the bathwater; a two-language approach also works.

Unfortunately, those who cry about GCs are still quite vocal; at least we can now throw Rust their way.


There's a place for GC languages, and there's a place for non-GC languages. I don't understand why you seem so angry towards people who write in non-GC languages.


And there's a place for languages that smoothly support both GC and non-GC. D is the best language at that.


"AI" is built on C, C++, and Fortran, not Python.



