
As is the norm for HN and Rust commentary - any slight criticism is met with fury and downvotes.




I'm not "furious", but I do think your comment was bad and deserved to be downvoted. You're posting a random opinion with nothing to back it up, which is, to boot, factually wrong.

What breaking changes has Rust made "on a whim"?


> What breaking changes has Rust made "on a whim"?

I don't know about "on a whim", but this isn't far off in regards to breaking compatibility. And it caused some projects, like Nix, a lot of pain.

https://github.com/rust-lang/rust/issues/127343

https://devclass.com/2024/08/19/rust-1-80-0-breaks-existing-...


> I don't know about "on a whim"

Probably not the best way to lead, considering that that phrase is the entire root of the disagreement you're chiming in on!

> but this isn't far off in regards to breaking compatibility.

I think it might be worth elaborating on why you think that change "isn't far off" being made "on a whim". At least to me, "on a whim" implies something about intent (or more specifically, the lack thereof) that the existence of negative downstream impacts says nothing about.

If anything, from what I can tell the evidence suggests precisely the opposite - that the breakage wasn't made "on a whim". The change itself [0] doesn't exactly scream "capricious" to me, and the issue was noticed before Rust 1.80.0 released [1]. The libs team discussed said issue before 1.80.0's release [2] and decided (however (un)wisely one may think) that that breakage was acceptable. That there was at least some consideration of the issue basically disqualifies it from being made "on a whim", in my view.

[0]: https://github.com/rust-lang/rust/pull/99969

[1]: https://github.com/rust-lang/rust/issues/127343

[2]: https://github.com/rust-lang/rust/issues/127343#issuecomment...


Your post strongly reinforces Rust's reputation as a language whose language designers are willing to break compatibility on a whim. If Rust proponents argue like this, what breakage will not be forced upon Rust users in the future?

Your post itself reinforces the OP's claim.

Edit: Seriously. At this point, it seems clear that the culture around Rust, especially as driven by proponents like you, indirectly has a negative effect on both Rust software and software security & quality overall, as seen in the bug discussed in the OP. Without your kind of post, would Ubuntu have felt less pressure to make the technical management decisions that allowed for the above bug?


> Your post strongly reinforces Rust's reputation as a language whose language designers are willing to break compatibility on a whim.

> Your post itself reinforces the OP's claim.

Again, I think it might be worth elaborating precisely what you think "on a whim" means. To me (and I would hope anyone else with a reasonable command of English), making a bad decision is not the same thing as making a decision on a whim, and you have provided no reason to believe the described change falls under the latter category instead of the former.


This new post you have made again reinforces the general notion that, yes, closer to "on a whim" than many like, the Rust community is willing to break backwards compatibility. It reflects extremely poorly on the Rust community in some people's eyes that you and other proponents appear to not only be unwilling to admit the issues, like the above issue that caused some people a lot of pain, but even directly talk around the issues.

In C and C++ land, if gcc (as a thought experiment) tried breaking backwards compatibility by changing the language, people would be flabbergasted, complain that gcc made a dialect, and switch to Clang or MSVC or fork gcc. But for Rust, Rust developers just have to suck it up if rustc breaks backwards compatibility, as Dtolnay's comment in the GitHub issue I linked indicates. If and once gccrs gets up and running, that might change.

Though I am beginning to worry, for the Rust specification derived from Ferrocene's might be both incomplete and basically fake, and that might make it easier for rustc and gccrs to drift into separate dialects of Rust, which would be horrible for Rust and, since in my opinion there should preferably be more viable options among systems languages, arguably horrible for the software ecosystem as well. I hope that there are plans for robust ways of preventing dialects of Rust.


> yes, closer to "on a whim" than many like

You're moving the goalposts. Neither the original claim nor your previous comment in this subthread used such vague and weakening qualifiers to "on a whim".

And even those still don't say anything about what exactly you mean by "on a whim" or how precisely that particular change can be described as such, though at this rate I suppose there's not much hope in actually getting an on-point answer.

> the Rust community is willing to break backwards compatibility

Again, the fact that Rust can and will break backwards compatibility is not in dispute. It's specifically the claim that it's done "on a whim" that was the seed of this subthread.

> appear to not only be unwilling to admit the issues

I suggest you read my comment more carefully.

I also challenge you to find anyone who claims that the changes in Rust 1.80.0 did not cause problems.

> but even directly talk around the issues.

Because once again, the existence of breaking changes and/or their negative downstream impact is not what the original comment you replied to was disputing! I'm not sure why this is so hard to understand.

> In C and C++ land, if gcc (as a thought experiment) tried breaking backwards compatibility by changing the language, people would be flabbergasted, complain that gcc made a dialect, and switch to Clang or MSVC or fork gcc.

No need for a thought experiment. Straight from the GCC docs [0]:

> By default, GCC provides some extensions to the C language that, on rare occasions conflict with the C standard.

> The default, if no C language dialect options are given, is -std=gnu23.

> By default, GCC also provides some additional extensions to the C++ language that on rare occasions conflict with the C++ standard.

> The default, if no C++ language dialect options are given, is -std=gnu++17.

Also from the GCC docs [1]:

> The compiler can accept several base standards, such as ‘c90’ or ‘c++98’, and GNU dialects of those standards, such as ‘gnu90’ or ‘gnu++98’.

So not only has GCC "chang[ed] the language" by implementing extensions that can conflict with the C/C++ standards, GCC has its own dialect and uses it by default. And yet there's no major GCC fork and no mass migration to Clang or MSVC specifically because of those extensions.

And it's not like those extensions go unused either; perhaps the most well-known example is Linux, which only officially supported compilation via GCC for a long time precisely because Linux made (and makes!) extensive use of GCC extensions. It was only after a concerted effort to remove some of those GNU-isms and add support for others into Clang that mainline Clang could compile mainline Linux [2].

> I hope that there are plans for robust ways of preventing dialects of Rust.

This is not a realistic option for any language that anyone is free to implement for what I hope are obvious reasons.

[0]: https://gcc.gnu.org/onlinedocs/gcc/Standards.html

[1]: https://gcc.gnu.org/onlinedocs/gcc/C-Dialect-Options.html

[2]: https://www.phoronix.com/review/clang-linux-53


> You're moving the goalposts.

Nope, I am not moving the goalposts, as is perfectly clear to you already. You are well aware that I am completely correct and that you are wrong.

> Again, the fact that Rust can and will break backwards compatibility is not in dispute. It's specifically the claim that it's done "on a whim" that was the seed of this subthread.

And the way you and the other Rust proponents directly talk around it, as you are again doing here, only worsens the situation.

> No need for a thought experiment. Straight from the GCC docs [0]:

Technically correct, but outside of extensions that have to be explicitly enabled, more or less none of that breaks backwards compatibility. A program written in pure C or C++ ought to compile and behave exactly the same under those default dialects. The default dialects amount to more or less a strict superset with the same behavior, like adding support for C++ "//" comments, or backporting newer C standard features to previous versions. The only extensions that significantly change behavior, rather than being strict supersets with the same behavior, require flags to be enabled.
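
As a minimal sketch of the "strict superset" point (my own example, not from the docs; compile with plain gcc, which defaults to a gnu* dialect, versus gcc -std=c89 -pedantic-errors):

    /* Accepted by plain `gcc` (default gnu* dialect) but rejected by
       `gcc -std=c89 -pedantic-errors`, because "//" comments are an
       extension relative to C89 (standardized later, in C99). */
    #include <stdio.h>

    int main(void)
    {
        // A C++-style comment: not part of C89, harmless under the GNU dialect
        printf("hello\n");
        return 0;
    }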

Thus, yet again, radically different from what the rustc developers did just last year.

Overall, your posts and the posts of your fellow Rust proponents in this submission worsen the situation both for Rust and for software overall regarding compatibility, security and safety, as the bug in the submission indicates. Imagine being so brazen as to double down on a path that arguably led to a very public bug. I do not believe any responsible software company would want you anywhere near its code if it cared about safety and security.


> Nope, I am not moving the goalposts, as is perfectly clear to you already. You are well aware that I am completely correct and that you are wrong.

Turns out you have, more than once. I wish I didn't have to spell this out for you, but here goes one last attempt...

The original part of awesome_dude's comment that started this subthread:

> Rust still has a very "Ready to make breaking changes on a whim" reputation

Note the existence and wording of the qualifier here. The claim here is not "Ready to make breaking changes", but "Ready to make breaking changes on a whim".

The relevant response from umanwizard:

> What breaking changes has Rust made "on a whim"?

Again, note the existence and wording of the qualifier. The question here is not "What breaking changes has Rust made", but "What breaking changes has Rust made 'on a whim'?"

Your first response:

> I don't know about "on a whim", but this isn't far off in regards to breaking compatibility.

This is the first goalpost move. You're not claiming to have an example of a breaking change "on a whim" (in fact, you explicitly distance yourself from such a claim), but instead you say you have an example of a breaking change that "isn't far off" of being "on a whim". Note that this is not the same unadorned "on a whim" qualifier, as it uses the (slightly) weakening and more vague "isn't far off". How far off and in what way is it not far off? You fail to elaborate on both counts.

Your next response:

> Your post strongly reinforces Rust's reputation as a language whose language designers are willing to break compatibility on a whim.

A second goalpost move. You're not using the "isn't far off" qualifier any more, and are instead using the unadorned "on a whim". Again, you fail to elaborate further on this.

And finally:

> closer to "on a whim" than many like

A third goalpost move, with "on a whim" having grown two qualifiers, neither of which have previously appeared in this subthread! Now it's neither "on a whim" nor "isn't far off" "on a whim", but it's now "closer to" "on a whim" "than many like".

How close is "closer to"? Who falls under "many"? How do these describe the example you provide? Who knows!

> And the way you and the other Rust proponents directly talk around it, as you are again doing here, only worsens the situation.

It's not clear to me why it's so hard to understand what this subthread was originally about, nor why you seem so insistent on refusing to actually discuss the original topic.

> Technically correct, but outside of extensions that have to be explicitly enabled, more or less none of that breaks backwards compatibility.

This is moving the goalposts yet again. We go from what is in effect "C/C++ compilers would never break backwards compatibility by adding language extensions!" to "You're correct in that they have done it, but it's mostly not a problem".

> The default dialects amount to more or less a strict superset with the same behavior, like adding support for C++ "//" comments, or backporting newer C standard features to previous versions. The only extensions that significantly change behavior, rather than being strict supersets with the same behavior, require flags to be enabled.

Not only does this claim contradict the snippets I quoted earlier, it also contradicts this other snippet from the docs (emphasis added) [0]:

> On the other hand, when a GNU dialect of a standard is specified, all features supported by the compiler are enabled, even when those features change the meaning of the base standard.

And given that GCC defaults to said GNU dialects, that means that non-strict-superset features are enabled by default.

[0]: https://gcc.gnu.org/onlinedocs/gcc/C-Dialect-Options.html


> This is moving the goalposts yet again. We go from what is in effect "C/C++ compilers would never break backwards compatibility by adding language extensions!" to "You're correct in that they have done it, but it's mostly not a problem".

Do you consider "an old program that previously didn't compile now compiles" to be a serious break in backwards compatibility? I think they do not, and neither do I.


If I'm understanding you correctly, I don't, but I interpreted the GCC docs to indicate the opposite. To me, the docs indicate that it's possible (albeit unlikely) for a program that compiles under -std=c* to fail to compile under -std=gnu* due to one of those extensions that "conflict with the [C/C++] standard" and/or "change the meaning of the base standard".
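
As a hedged sketch of the kind of conflict I have in mind (a hypothetical example of my own; the GCC docs do mention that -std=c90 turns off the asm and typeof keywords):

    /* Accepted by `gcc -std=c90` but rejected by `gcc -std=gnu90`,
       because `typeof` is a GNU keyword in the gnu* dialects while
       strict ISO modes treat it as an ordinary identifier. */
    int main(void)
    {
        int typeof = 1;   /* legal identifier in ISO C90, keyword in GNU C */
        return typeof - 1;
    }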

> To me, the docs indicate that it's possible (albeit unlikely) for a program that compiles under -std=c* to fail to compile under -std=gnu* due to one of those extensions that "conflict with the [C/C++] standard" and/or "change the meaning of the base standard".

That's correct. There are not many of those, but they do exist. These generally mean that the GNU dialect gave some syntax a meaning back when it was still forbidden in standard C. Then standard C adopted that feature because it was implemented in a compiler (that's how language evolution should work), but gave it slightly different semantics. Now GCC has the choice between breaking old existing programs or not exposing standard semantics by default. They solve that by letting the user choose the language version.

An example of that is arrays of size 0 at the end of a structure. These used to be used for specifying arrays whose size can be arbitrarily large, but became obsolete with the introduction of flexible array members in C99. If GCC implemented only standard C, the correct semantics for any array with a declared size that is accessed with an index larger than that size would be undefined behaviour. But since GCC gave such trailing arrays the semantics that flexible array members now have, before flexible array members existed, it chooses to implement those semantics instead, unless you tell it which C standard you want to use.

In fact, due to their use in popular codebases such as the Linux kernel, these semantics are even assigned (based on a heuristic) to trailing arrays with sizes larger than zero.
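
A rough sketch of the two spellings, in case it helps (my own example, not from the GCC docs; the allocation follows the usual "header plus n trailing bytes" pattern):

    #include <stdlib.h>

    /* GNU extension: the pre-C99 idiom, accepted in the gnu* dialects. */
    struct gnu_packet {
        size_t len;
        unsigned char data[0];   /* zero-length array: adds no size, only alignment */
    };

    /* Standard C99 and later: flexible array member. */
    struct c99_packet {
        size_t len;
        unsigned char data[];    /* flexible array member */
    };

    int main(void)
    {
        size_t n = 16;
        struct gnu_packet *g = malloc(sizeof *g + n);
        struct c99_packet *c = malloc(sizeof *c + n);
        if (g && c) {
            g->len = n;
            c->len = n;
            g->data[0] = 0;      /* both are accessed the same way */
            c->data[0] = 0;
        }
        free(g);
        free(c);
        return 0;
    }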

From https://gcc.gnu.org/onlinedocs/gcc/Zero-Length.html :

> In the absence of the zero-length array extension, in ISO C90 the contents array in the example above would typically be declared to have a single element. Unlike a zero-length array which only contributes to the size of the enclosing structure for the purposes of alignment, a one-element array always occupies at least as much space as a single object of the type. Although using one-element arrays this way is discouraged, GCC handles accesses to trailing one-element array members analogously to zero-length arrays.

(With GNU projects, when you have questions, the best source is the official docs themselves. They are stellar and are even available completely offline on your computer in the interactive documentation system, Info.)

The other extensions are in the sibling chapters, e.g. https://gcc.gnu.org/onlinedocs/gcc/Syntax-Extensions.html or https://gcc.gnu.org/onlinedocs/gcc/Semantic-Extensions.html . One that I quite like is https://gcc.gnu.org/onlinedocs/gcc/Unnamed-Fields.html .


I did take a quick glance through some of the extension docs, but it seems the particular subsection you discuss was not among the pages I happened to look at. I had never had occasion to use the GNU dialect in the past and searches weren't helping me find specific examples, so I appreciate you taking the time to elaborate!

Out of curiosity, which pages did you discover? Not as in "you're holding it wrong", but as in "which pages does someone unfamiliar end up discovering".

Are you on an OS where using the GNU Info system is an option? I quite like it. Unfamiliar people are often deterred from using it, either because it is in the terminal, or because they think they are looking at a pager. If it's only the latter preventing you from using it, keep in mind that this is not in fact a simple paged document, but an interactive hypertext system that predates the WWW. Documents are generally structured as a tree. Use the normal cursor movement, Enter to follow links, p for previous node, n for next node, u / backspace for the parent node; / works for text search, and i searches in the index. Use "info info" when you want to know more. Pressing h for help also works. (I just discovered that the behaviour of h depends on your terminal size :-) .) When you look at the GNU onlinedocs, you are looking at an HTML version of that Info document. Using Info directly is nicer, since it has native support for jumping around the doc tree, and instead of relying on an external entity (like Google) to point you to the node that contains your information (often bringing you to another version or document entirely, which can lead to confusion), you can use the built-in index, which is maintained by the document authors, so it will be accurate.

GNU Info is, in my opinion, the best and fastest way to access documentation that is more than a simple reference sheet, if you don't object to leaving the web browser. It even has C tutorials and all, completely offline.


IIRC I was basically doing a top-down search starting from the "Extensions to the C Language Family" [0] and the "Extensions to the C++ Language" [1] pages. I did come across the syntax/semantic extensions subpages you list, but I didn't comprehensively go through all the extensions.

> Are you on an OS where using the GNU Info system is an option?

Technically yes, but I admittedly have basically zero experience with using it.

> Using Info directly is nicer, since it has native support for jumping around the doc tree, and instead of relying on an external entity (like Google) to point you to the node that contains your information (often bringing you to another version or document entirely, which can lead to confusion), you can use the built-in index, which is maintained by the document authors, so it will be accurate.

I'm not entirely confident about how helpful it'd be for someone like me who is less familiar with the subject material, as opposed to someone who has a general idea of what they're looking for, but I suppose I won't know until I try it.

[0]: https://gcc.gnu.org/onlinedocs/gcc/C-Extensions.html

[1]: https://gcc.gnu.org/onlinedocs/gcc/C_002b_002b-Extensions.ht...


But that would imply a change of flags. And the extensions enabled by the default gnu* dialects look very mild, more or less strict supersets.

All day every day I see "random opinion with nothing to back it up" posts on Hacker News, but they are not voted down - discuss.

Rust users read more xkcd than the average hn poster. Or less. Take your pick.

No crying in the casino, to quote a classic.

Yeah, sweeping hot takes with very little to back them up do tend to get downvoted.

More than anything, the Rust community is hyper-fixated on stability and correctness. It is very much the antithesis to “move fast and break things”.


> More than anything, the Rust community is hyper-fixated on stability and correctness. It is very much the antithesis to “move fast and break things”.

This is incorrect.

https://devclass.com/2024/08/19/rust-1-80-0-breaks-existing-...


The high level of visibility that this incident received is a great example of the point I'm making. Can you name one more?

I'm going to throw in another one.

Cargo always picks the newest version of a dependency, even if that version is incompatible with the version of Rust you have installed.

You're like "build this please", and it's like "hey I helpfully upgraded this module! oh and you can't build this at all, your compiler is too old granddad"

They finally addressed this bug -- optionally (the default is still to break the build at the slightest provocation) -- in January this year (which, of course, requires you to upgrade your compiler again, to at least that version)

https://blog.rust-lang.org/2025/01/09/Rust-1.84.0/#cargo-con...

What a bunch of shiny-shiny chasing idiots with a brittle build system. It's designed to ratchet forward your dependencies and throw new bugs and less-well-tested code at you. That's absolutely exhausting. I'm not your guinea pig, I want to build reliable, working systems.

gcc -std=c89 for me please.


Let the one without sin throw the first stone. Please describe to us how you do dependency management with C.

Also picking C89 over any later iteration is bananas.


> Please describe to us how you do dependency management with C.

     dnl configure.ac: fail configuration early if libfoo (>= 1.2.3), its header, or its symbols are missing.
     PKG_CHECK_MODULES([libfoo], [libfoo >= 1.2.3])
     AC_CHECK_HEADER([foo.h], ,[AC_MSG_ERROR([Cannot find foo header])])
     AC_CHECK_LIB([foo],[foo_open], ,[AC_MSG_ERROR([Cannot find foo library])])
There are additionally versioning standards for shared objects, so you can have two incompatible versions of a library live side-by-side on a system, and binaries can link to the one they're compatible with.

> Cargo always picks the newest version of a dependency, even if that version is incompatible with the version of Rust you have installed.

> PKG_CHECK_MODULES([libfoo], [libfoo >= 1.2.3])

This also picks the newest version that might be incompatible with your compiler, if the newer version uses a newer language standard.

> You're like "build this please", and it's like "hey I helpfully upgraded this module! oh and you can't build this at all, your compiler is too old granddad"

Also possible in the case of your example.

> What a bunch of shiny-shiny chasing idiots with a brittle build system.

Autoconf as an example of a non-brittle build system? Laughable at best.


This is whataboutism to deflect from Rust's basic ethos being to pull in the latest shiny-shiny.

> This also picks the newest version that might be incompatible with your compiler, if the newer version uses a newer language standard.

It doesn't; it just verifies that what the user has already installed (with apt/yum/dnf) is suitable. It certainly doesn't connect to the network and go looking for trouble.

The onus is on library authors to write standard-agnostic, compiler-agnostic headers, and that's what they do:

    #if __STDC_VERSION__ >= 199901L
        /* C99 definitions */
    #else
        /* pre-C99 definitions */
    #endif
For linking, shared objects have their own versioning to allow backwards-incompatible versions to exist simultaneously (libfoo.so.1, libfoo.so.2).

> This is whataboutism to deflect from Rust's basic ethos being to pull in the latest shiny-shiny.

No. You set a bar for Cargo that the solution you picked does not reach either.

> It doesn't; it just verifies that what the user has already installed (with apt/yum/dnf) is suitable.

There's no guarantee that that is compatible with your project though. You might be extra unlucky and have to bring in your own copy of an older version. Plus their dependencies.

Perfect example of the pile of flaming garbage that is C dependency "management". We haven't even mentioned cross-compiling! It multiplies all this C pain a hundredfold.

> The onus is on library authors to write standard-agnostic, compiler-agnostic headers, and that's what they do:

You're assuming that the feature in use can be represented in older language standards. If it can't, you're forced to at least have that newer compiler on your system.

> [...] standard-agnostic, compiler-agnostic headers [...]

> For linking, shared objects have their [...]

Compiler-agnostic headers that get compiled down to compiler-specific calling conventions. If I recall correctly, GCC basically dictates those on Linux. Anyway, I digress.

> shared objects have their own versioning to allow backwards-incompatible versions to exist simultaneously (libfoo.so.1, libfoo.so.2).

Oooh, that one is fun. Now you have to hope that nothing was altered when that old version got built for that new distro. No feature flag changed, no glibc-introduced functional change.

> hey I helpfully upgraded this module! oh and you can't build this at all, your compiler is too old granddad

If we look at your initial example again, Cargo followed your project's build instructions exactly and unfortunately pulled in a package that is for some reason incompatible with your current compiler version. To fix this you have the ability to just specify an older version of the crate and carry on.

Looking at your C example, well, I described what you might have to do and how much manual effort that can be. Being forced to use a newer compiler can be very tedious. Be it due to bugs, stricter standards adherence or just the fact that you have to do it.

In the end, it's not a fair fight comparing dependency management between Rust and C. C loses by all reasonable metrics.


You're just in attack mode now.

I listed a specific thing -- that Rust's ecosystem grinds people towards newness, even if it goes so far as to actually break things. It's baked into the design.

I don't care that it's hypothetically possible for that to happen with C, I care that practically, I've never seen it happen.

Whereas, for the single piece of software I build that uses Rust, _without changing anything_ (already built before, no source changes, no compiler changes, no system changes) -- cargo install goes off to the fucking internet, finds newer packages, downloads them, and tells me the software it could build last week can't be built any more. What. The. Fuck. Cargo, I didn't ask you to fuck up my shit - but you did it anyway. Make has never done that to me, nor has autoconf.

Show me a C environment that does that, and I'll advise you to throw it out the window and get something better.

There have been about 100 language versions of Rust in the past 10 years. There have been 7 language versions of C in the past 40. They are a world apart, and I far prefer the C world. C programmers see very little reason to adopt "newer" C language editions.

It's like a Python programmer, on a permanent rewrite treadmill because the Python team regularly abandon Python 3.<early version> and introduce Python 3.<new version> with new features that you can't use on earlier Python versions, asking how a Perl programmer copes. The Perl programmer reminds them that the one Perl binary supports and runs every version of Perl from 5.8 onwards, simultaneously, and the idea of making all the developers churn their code over and over again to keep up with latest versions is madness, the most important thing is to make sure old code keeps running without a single change, forever. The two people are simply on different planets.


> I don't care that it's hypothetically possible for that to happen with C, I care that practically, I've never seen it happen.

I don't think your anecdotal experience is enough to redeem the disarray that is C dependency management. It's nice to pretend though.

> and tells me the software it could build last week can't be built any more. What. The. Fuck. Cargo, I didn't ask you to fuck up my shit - but you did it anyway. Make has never done that to me, nor has autoconf.

If you didn't get my point in the previous comment, let me put it more frankly: it is a skill issue on your part if you aren't pinning your crates to specific versions but depend on them remaining constant. This is not Cargo's fault.

> Make has never done that to me, nor has autoconf.

Yeah, because they basically guarantee nothing, nor do they allow working around any of the potential issues I've already described.

But you do get to wait for the thousandth time for it to check the size of some types. All those checks are literal proof of how horrible the ecosystem is.

> There have been about 100 language versions of Rust in the past 10 years

There are actually four editions, and they're all backwards-compatible.

> C programmers see very little reason to adopt "newer" C language editions.

Should've stopped at the word "reason".


Most of your post completely falls apart when considering https://github.com/rust-lang/rust/issues/127343

It's not relevant to this thread.

This doesn't "pick" anything, it only locates and validates the version specified and installed by the user, it does not start to fetch newer versions.

I would use a newer version of C, and consider picking C++, if the choice was between C, C++, Ada, and Rust. (If pattern matching would be a large help, I might consider Rust).

For C++, there is vcpkg and Conan. While, according to many, they are overall significantly or much worse options than what Rust offers, in large part due to C++'s cruft and backwards compatibility, they do exist.


> For C++, there is vcpkg and Conan

But I asked about C.


I looked it up, both vcpkg and Conan support C as well as C++, at least according to their own descriptions.

The way you've described both of those solutions demonstrates perfectly how C package management is an utter mess. You claim to be very familiar with the C ecosystem, yet you describe them based on their own descriptions. Not once have you seen them in use? Both of those are also (only) slightly younger than Rust, by the way.

So after all these decades there's maybe something vaguely tolerable that's also certainly less mature than what even Rust has. Congrats.


You might be mixing up who you are replying to. I never claimed that C and C++ package management are better than what Rust offers overall. In some regards, Cargo is much better than what is offered in C and C++. I wouldn't ascribe that to a mess, more to the difficulty of maintaining backwards compatibility and handling cruft. However, I know of people who have had significant trouble handling Rust. For instance, there was that whole debacle around the inclusion of Rust in bcachefs and the handling of it in Debian.

https://lwn.net/Articles/1035890/

They did not have an easy time including Rust software, as I read it. Maybe just initial woes, but I have also read other accounts of distribution maintainers having trouble with integrating Rust software. Complaints about dynamic linking? I have not looked into it.


> Please describe to us how you do dependency management with C

dnf or apt, depending on if Fedora/EL or Debian...


You're always building for the same distribution and release?

A small number of slowly moving variants, but for a given deployment it's roughly stable.

I suppose I missed the important case of Yocto though


It received a lot of attention and "visibility" because it caused a lot of pain to some people. I am befuddled why you would wrongly attempt to dismiss this undeniable counter-example.

Sorry, but your argument is incorrect.


I suspect you miss the point.

Somebody is attempting to characterize the Rust community in general as being similar to other programming communities that value velocity over stability, such as the JS ecosystem and others.

I’m pointing out that incidents such as this are incredibly rare, and extremely controversial within the community, precisely because people care much more about stability than velocity.

Indeed, the design of the Rust language itself is in so many ways hyper-fixated on correctness and stability - it’s the entire raison d’etre of the language - and this is reflected in the culture.


Comparing with the JS ecosystem is very telling. Some early Rust developers came from the JS ecosystem (especially at Firefox), and Cargo takes inspiration from the JS ecosystem, like with lock files. But the JS ecosystem is a terrible baseline to compare against regarding stability. Comparing a language's stability with the JS ecosystem says very little. You should have picked a systems language to compare with.

And your post is itself part of the Rust community, and it is itself an argument against what you claim in it. If you cannot or will not own up to the 1.80 time-crate debacle, or proactively mention it as a black mark that weighs on Rust's conscience and acknowledge that it will take time to rebuild trust and confidence in Rust's stability because of it, well, your priorities, understood as the Rust community's priorities, are clear, and in practice they do not lie with stability, safety and security, nor with being forthcoming.


Ok, I'm going to call it here. I don't know what this comment (or account) is, and I'm not particularly interested in a bad faith flamewar.

It is not "bad faith", or insincere in any way. If you actually considered it or cared, you could use it as constructive criticism.

Which would be borne out with discourse, not hate - but you do you


