There is also a variety of new, parallelized implementations of compression algorithms which would be worth a close look. Bugs causing undefined behaviour in parallel code are notoriously hard to spot, and the parallel versions (which are actually much faster) could take the place of well-established programs that have earned a lot of trust.
The time stamp of a git commit depends on the system clock of the computer on which the commit was made. This cannot be verified by GitHub & co. (except that they could reject commits whose time stamps lie in the future).
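To illustrate (a minimal sketch; the repository name, identity, and dates here are arbitrary placeholders): git takes both of a commit's timestamps from the committer's environment and clock, so they can be set to any value, past or future:

```shell
# Git records two timestamps per commit (author date and committer date),
# both taken from the committing machine -- and both can be overridden.
git init demo && cd demo
git config user.name "Example" && git config user.email "example@example.com"
echo hello > file.txt && git add file.txt

# Backdate the commit to an arbitrary point in the past:
GIT_AUTHOR_DATE="2000-01-01T00:00:00" \
GIT_COMMITTER_DATE="2000-01-01T00:00:00" \
git commit -m "backdated commit"

git log -1 --format="%ad %cd"   # both dates show the faked timestamp
```

Nothing on the receiving side can distinguish this from a commit genuinely made in 2000, which is why hosting services can at best reject obviously implausible (future) dates.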
> They want to attack this company or this group of individuals. But someone who backdoors such a core piece of open source infrastructure wants to cast a wide net to attack as many as possible.
The Stuxnet malware, which compromised Siemens industrial controllers to attack specific centrifuges in uranium enrichment plants in Iran, is a counterexample to that.
Stuxnet wasn't similar to this xz backdoor. The Stuxnet creators researched (or acquired) four Windows zero-days, a relatively short-term endeavor, whereas the xz backdoor was a long-term operation, spanning 2.5 years of slowly gaining Lasse Collin's trust.
But, anyway, I'm sure we can find other counter-examples.
If a government wants to cast a wide net and catch what they can, they'll just put a tap in some IXP.
If a government went to this much effort to plant this vulnerability, they absolutely have targets in mind - just like they did when they went to the effort of researching (or acquiring) four separate Windows zero-days, combining them, and delivering them...
> And in general, the build system of a large project is doing a lot of work and is considered pretty uninteresting and obscure. Random CMake macros or shell scripts would be just as likely to host bad code.
Build systems can even exhibit undefined behaviour in the C++ sense. Conan 2, for example, has a whole documentation page on that.
Clojure could be a different case, but I am not sure: its performance characteristics make it well-suited for server-style concurrency, but not so much for high-performance parallelism.
Is there something we can learn from these examples? Are there good reasons for these languages being adopted? And are the Racket designers, with their approach of "a language to define interoperable DSLs", onto something?
I think programming languages are more like spoken languages than we give them credit for. Their design is more intentional, but the processes by which they spread, compete, and evolve are similarly difficult to pin down.
Isn't Rust really well suited as the main extension language of functional-preferring, strongly typed, interactive languages, such as Racket, Guile, OCaml, or F#?
I am thinking more and more it would be a pretty good idea to keep the place habitable where we are now. Besides other things, it is quite beautiful here.