C++ creator rebuts White House warning (infoworld.com)
74 points by joaogui1 on March 18, 2024 | 137 comments


There's the language as idealized, and the language as used. Stroustrup is clearly brilliant, but he's talking about the former while everyone else means the latter. If you started a brand new C++ project today, using only the modern, safe ways of doing things and including only dependencies that do the same, OK, fine. That's, what, 0.1% of C++ projects? The rest of them use a soup of features and misfeatures that've been released in the spirit of trying to make everyone happy simultaneously.

I am solidly in the camp that believes C++ is unsafe. With enough discipline and tooling, it is possible to write safe C++. Are there more than a sliver of shops jumping through those hoops?

Of course, there may be a survivorship bias involved that proves me wrong. If it turned out that every remaining C++ shop is great at writing C++ code, because all the shops that weren't gave up and migrated to something else, I wouldn't be shocked.


> With enough discipline and tooling, it is possible to write safe C++

I think it's possible to write "safe enough" C++ for real-world use. Then after some time you get a weird crash because the MSVC stdlib implementation does something weird under a new spec. Or a dependency does something unsafe and hoses you. Or the new guy uses `std::string_view` but forgets it doesn't guarantee null-termination. Or you casually forget about iterator invalidation because you're tired and the compiler doesn't help you. Or or or. I like C++, it's a nice language, but after befriending the Rust compiler, or seeing how awesome hot reload in C# is, or live-coding a front-end in javascript, or any other advantages of other languages, I find the use-case for C++ shrinking daily. Really, if the C++ ecosystem weren't so damn rich with libraries, tools and more, I doubt it would have as much backing today.
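
For instance, a minimal sketch of the null-termination trap (the function and names here are made up for illustration):

    #include <cstdio>
    #include <string>
    #include <string_view>

    void print_name(std::string_view name) {
        // BUG: a string_view is not guaranteed to be null-terminated at its
        // own end, so handing .data() to a C-style API can read past the
        // view into the rest of the backing buffer.
        std::printf("%s\n", name.data());
    }

    int main() {
        std::string s = "Alice Smith";
        std::string_view first = std::string_view(s).substr(0, 5); // "Alice"
        print_name(first); // prints "Alice Smith", not "Alice"
    }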


string_view doesn't guarantee the lifetime of the backing buffer either. It's a fundamentally memory-unsafe type.
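
A minimal sketch of that one, assuming nothing beyond the standard library:

    #include <string>
    #include <string_view>

    std::string_view get_greeting() {
        std::string s = "hello world";
        return std::string_view(s); // dangling: s is destroyed on return
    }

    int main() {
        std::string_view v = get_greeting();
        char c = v[0]; // use-after-free; typically compiles without complaint
        return c;
    }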


C++ codebases are substantially about vintage. C++03 with this and that vendor-specific thing is still in the wild and those codebases are just tough to fix: it’s clearly possible, Google has a bunch of C++ from that era deep in the heart of things and they’ve managed to keep it not only running but to all evidence performant and secure. There are some financial shops that seem to have walked a similar line. But the typical C++ codebase from the early/mid 2000s isn’t something you’d sign up to own. It’s a pretty elite C++ shop that has managed to walk stuff from 98 or 03 to the present day without a real pile of grief.

And it’s eminently understandable that Mozilla was looking at a codebase from that era and said: “it’s time to consider drastic measures”.

But C++11 eventually got codified and adopted, clang-tidy and the sanitizers came on the scene, the screws got tightened yet further with 14, and 17, and 20 which is now substantially supported.

It’s just not black and white anymore. C++ has real interop/FFI and a largely “opt-in” mentality around the extreme-low-defect approach. Rust has a much more opinionated posture that’s trivially better at getting extreme-low-defect outcomes at shops more diverse in experience and rigor than old-school FAANG or Microsoft, but botches some key stuff. There is not a one-size-fits-all answer here.

Or rather there is if you don’t over constrain the solution space: if extreme low defect is the call you want a real type system. At the less mainstream end that’s probably Haskell or OCaml or something of that lineage.

But ultimately the Quality Inequality (good, fast, cheap: pick two) remains binding: and TypeScript is just trivially where you find a lot of people who grew up with advanced dependent typing and don’t cost a million bucks a year each. It’s faster than most would guess, and the type system is up there with Idris or something like that in terms of the scope for machine-checked rigor: these are MSR people, would anyone expect less?

The conversation is over-rotated on Rust vs. C++ which is a complicated question to which few need an answer.


> If you started a brand new C++ project today, using only the modern, safe ways of doing things and including only dependencies that do the same, OK, fine.

My impression is this is not effectively true. It's not easy to follow all the rules of safe C++. I could be wrong, but it's easy to slip on things like forgetting to use std::move in the right place, or forgetting to use unique_ptr where you should. Putting member initializers in the wrong order. Forgetting a copy constructor, or forgetting to hide the default one, etc...
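
The initializer-order one in a minimal sketch (the class is hypothetical):

    #include <cstddef>

    struct Buffer {
        char*       data;
        std::size_t size;
        // Members are initialized in DECLARATION order (data, then size),
        // not in the order written below, so data is allocated from an
        // uninitialized size. GCC/Clang only flag this via -Wreorder
        // (part of -Wall), which not every build enables.
        Buffer(std::size_t n) : size(n), data(new char[size]) {}
    };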

I'm also not sure about mapping structures to binary data (maybe this isn't enforced in any language but ...). Let's say you want to read a BMP file. You declare a struct that matches the header. If you declare every byte it might work, but if you declare it as 16-bit values and 32-bit values then IIUC, padding rules et cetera make your code non-portable. What rule enforces that you don't write non-portable code? Also, what rule enforces no undefined behavior, like say int overflow?
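
A minimal sketch of the padding problem, using the BMP file header's on-disk field layout:

    #include <cstdint>

    struct BmpFileHeader {
        std::uint16_t type;      // on disk: offset 0
        std::uint32_t size;      // on disk: offset 2...
        std::uint16_t reserved1;
        std::uint16_t reserved2;
        std::uint32_t offBits;
    };
    // ...but the compiler may insert 2 bytes of padding after `type` so that
    // `size` is 4-byte aligned, making sizeof() 16 rather than the on-disk 14.
    // Whether this assert fires is implementation-defined, and the usual fixes
    // (#pragma pack, __attribute__((packed))) are non-standard extensions:
    static_assert(sizeof(BmpFileHeader) == 14,
                  "struct does not match on-disk layout");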

The point isn't that it's possible. The point is it has to be enforced by the compiler. Does that option exist, `--safe-cpp-only` or something like that, that enforces 100% safe usage only? (and obviously, just like rust, you could break out when you need to?)


> Let's say you want to read a BMP file.

You unquestionably shouldn't use C++ or Rust; you should use WUFFS to write this portion of the software. This way you get the absolute safety you presumably expect, and also much better performance than you'd get from hand-rolling, say, C++. You will need to write extensive test suites, since the WUFFS tooling doesn't know what a BMP is, so your testing is the only way to know you're decoding it correctly. But the safety property drops out of the language design, and while performance isn't magically guaranteed, it's much easier to write small, fast code in a language designed to help you do that, and which also catches every safety mistake you make in the attempt.


Perhaps I should've prefaced that with "for the sake of argument, let's assume...". I think you're right. But even if we're wrong, and modern C++ is bulletproof, that doesn't matter because very little code is written with those idioms.


I'd say it's actually the other way around. C++ in the abstract is full of half-baked, ill-conceived ideas that later had to be deprecated or walked back entirely: std::auto_ptr and memory leaks; std::shared_ptr::unique() not being threadsafe; taking the time to implement bounds checking in std::vector, but the default access operator isn't checked; corner cases with std::initializer_list leading to unexpected behavior.
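
The bounds-checking inconsistency in one short sketch:

    #include <cstdio>
    #include <stdexcept>
    #include <vector>

    int main() {
        std::vector<int> v = {1, 2, 3};
        try {
            (void)v.at(10); // opt-in checked access: throws std::out_of_range
        } catch (const std::out_of_range&) {
            std::puts("caught out-of-range");
        }
        return v[10]; // the default operator: unchecked, undefined behavior
    }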

In practice, most C++ shops seem to limit themselves to a small subdialect of C++ that's reasonably safe. The C++ Core Guidelines are essentially an effort at canonicalizing some of these best practices and building them into the tooling.

I think it's fair to say that Rust or managed code should at least be considered for any greenfield project, maybe even a hard requirement for certain domains like cryptography, networking, financial applications, etc. But I don't believe that every application needs to be security hardened to that level, and the C++ library ecosystem is still much more robust than Rust's is. Crates.io feels like a graveyard of experimental libraries that reached 40% the functionality of their C++ cousins before the commit activity dropped to zero.


> Crates.io feels like a graveyard of experimental libraries that reached 40% the functionality of their C++ cousins before the commit activity dropped to zero.

Not only that, but you see a huge increase in external dependencies because of the ease of importing crates. I'll admit I have absolutely no evidence to back this up, but the crates system feels to me like the Achilles' heel of Rust's security model in two respects:

1) There's the obvious supply-chain risk in that the provenance of most of these crates is...uncertain at best. You can see some stats that x number of projects use the crate, and who the owner is, or at least purports to be.

2) Having eliminated most memory-related vulnerabilities (or at least constrained them to unsafe blocks), the remaining vulns are going to tend to be logic flaws, and Rust, of course, makes no guarantees in that realm. If you import 1 crate, ok, you can probably audit that crate and maybe reason accurately about what your code is really doing. But when you import a half dozen crates, and those crates have dependencies, and so on, and you end up with 100+ external dependencies, I would argue that reasoning accurately about the behavior of your code is going to be quite difficult.

To me, this is a cultural problem/blindspot with Rust that will be difficult to fix.


How are these two problems unique to Rust though?


Where did I say they are unique or exclusive to Rust? I said they are problems with Rust.


C++ could benefit from a grassroots project to provide a from-the-ground up alternative library, which shares not a single thing with the ISO C++ one. It would provide everything: I/O streams, strings, containers, smart pointers, ...

This project's number one priority would be safety, followed by ergonomics. Performance would be somewhere down the list, below portability.


> There's the language as idealized, and the language as used.

As an aside from safety, I'd like to whine about how the "modern" solution for organizing C++ code into modules was standardized ~4 years ago after however many years of development and still barely works in practice on any of the three major toolchains. What good is solving problems on paper if the solution is apparently near impossible to implement properly?


To write safe C++ you need moral discipline, rather than intellectual/cognitive. The language has the tooling.

Moral discipline means rejecting the use of unsafe constructs for the sake of speed or convenience, and doing the safer, slower, sometimes more clumsy thing.

For instance, to skip thinking like "oh, we can just retain a direct pointer into here, and then we don't need to make a copy ...".

And, of course, while that is all well and good, this advice only applies to small-team or solo greenfield projects.


> Stroustrup is clearly brilliant, but he's talking about the former while everyone else means the latter.

Where is he doing this? His main statement doesn't bear out this accusation: “[t]here are two problems related to safety. Of the billions of lines of C++, few completely follow modern guidelines, and peoples’ notions of which aspects of safety are important differ. I and the C++ standard committee are trying to deal with that.”


> If it turned out that every remaining C++ shop is great at writing C++ code, because all the shops that weren't gave up and migrated to something else, I wouldn't be shocked.

I would. Almost all conversations about C/C++ security are, to this day, alive with people that are in my opinion delusional. The survivorship bias is more that the community is left with an over-representation of people that take “it’s often impractical for a team to write secure C/C++” as a personal attack against their intelligence and choose to dig their heels in as a result.


Boy do I have some codebases I could show you. I mean, the biggest problem isn’t that they’re written in C++, but it really didn’t help either.


On the other side, Rust is a safe language that in the real world uses unsafe blocks and unsafe libraries underneath (OpenSSL and other C libraries, in practical terms).

That is not safe either in practical terms.

So there is always this discussion about painting C++ as an unsafe thing, and it depends a lot, as you said, on how you use it.

I use the max warning level, warnings as errors, smart pointers, almost everything returned by value, and sanitizers.
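
For GCC or Clang that setup looks something like this (the exact flag set is illustrative, not exhaustive):

    g++ -std=c++20 -Wall -Wextra -Wpedantic -Werror \
        -fsanitize=address,undefined -g -O1 main.cpp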

In Rust I have the advantage that libraries can be audited for unsafe blocks, but it still has unsafe and it will still use unsafe libraries in practice from C.

So I always challenge these people that say Rust is safe. It is when it is. In real life it is not perfect.

I am pretty sure that the gap in safety between well-written C++ and Rust is, well, very small.


Certainly it is a question of degrees.

But to say the gap in safety is very small is essentially to say Rust's lifetime system is close to zero-value.

Unsafe code still exists; the goal is to localize it to small, auditable blocks. However, safety often depends on non-local properties. For example, unchecked iteration over a vector requires that we know the vector's lifetime exceeds the iteration lifetime, and that the backing buffer won't change. Rust's lifetime system allows this to be expressed, so the "critical zone" can be localized within a single block in the library implementation. If the library is correct, all consumers are safe. In C++, these cannot be expressed, so the critical zone for this safety guarantee is smeared across all code executed during the iteration. No consumer is guaranteed safe.
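
Here's the C++ side of that smearing as a minimal sketch: nothing in the callee's signature says the callback must not mutate the vector, so the hazard lives in every caller.

    #include <functional>
    #include <vector>

    void for_each_item(std::vector<int>& v, const std::function<void(int)>& f) {
        for (auto it = v.begin(); it != v.end(); ++it)
            f(*it); // if f grows v, the buffer may reallocate: dangling iterator
    }

    int main() {
        std::vector<int> v = {1, 2, 3};
        for_each_item(v, [&](int x) {
            if (x == 1) v.push_back(4); // compiles cleanly; undefined behavior
        });
    }

The equivalent Rust would be rejected at compile time, because the closure needs a mutable borrow of the vector while the iteration holds a shared one.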


It is not that the borrow checker is zero value.

But how much the borrow checker enters the game also depends on your code style. And when it does, it gets quite viral with annotations and the like.

That is why I prefer a model like the one found in the Hylo language. It sidesteps the rigidity of the borrow checker and can do very well.

Also, the code style I try to keep in C++ is quite "value-oriented" to avoid the typical lifetime pitfalls.

Nowadays I try to use the STL in debug mode and to not let iterators escape. I am actually considering using a library like Flux because I think that, as fast as the iterator model is, it is basically playing with fire. Flux seems to be very competent performance-wise and safer, besides being compatible with ranges.

I am quite happy with C++ at the moment. That does not mean I won't use Rust in the future. Who knows.

I hope more work towards safety will be done in C++. I would not want a borrow checker, but avoiding as many pitfalls as possible is a good thing. I always use compilers with the maximum warning level and warnings as errors. It is actually a good thing to do.


> smart pointers, almost everything return by value and sanitizers

All of these have runtime costs. What you describe is C++ forcing you to practice defensive programming in order to avoid costly debugging sessions.

Rust’s borrow checker lets you avoid some of that runtime cost. You can pass around references much of the time without wrapping them in smart pointers. You don’t have to reference count and don’t have to malloc and free as much. And while bounds checking will be on, you can avoid some of those sanitizers too.

You would get this benefit even if all code was unsafe in the libraries that you use, which of course it isn’t.


> Improving safety has been an aim of C++ from day one and throughout its evolution. Just compare the K&R C language with the earliest C++, and the early C++ with contemporary C++. My CppCon 2023 keynote outlines that evolution,

C++ safety may have improved a lot, but it’s still far, far behind most other languages we use. It’s not even a close comparison.

You throw a bunch of programmers at a problem and you will get some number of bugs in the code. In C++, some percentage of those bugs will be memory errors. You can eliminate raw pointers but that doesn’t solve the problem—there are all sorts of places that dangling pointers crop up anyways, like in references that get captured in lambdas which get stored somewhere and then the programmer doesn’t realize that the lambda is called after the object is destroyed.
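
The lambda scenario in miniature (names are made up; the point is that nothing diagnoses the escape):

    #include <functional>
    #include <iostream>
    #include <string>

    std::function<void()> make_logger() {
        std::string prefix = "worker: ";
        // [&] captures prefix by reference, and the lambda outlives it.
        return [&]() { std::cout << prefix << "tick\n"; };
    }

    int main() {
        auto log = make_logger();
        log(); // use-after-free: may "work", crash, or print garbage
    }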

I’ve seen various solutions proposed to these problems. The worst solution is to “just hire better programmers and be careful”. The easiest solution for greenfield projects, most of the time, is to pick a different language.


I admit I mostly don’t understand the “modern C++ is safe” argument. I maintain a decent size C++ code base, I try to use modern features in good taste, and I do think that C++ has added a lot of nice things. But nice != safe.

basic_string_view is new in C++17. It sure beats pointers, but it’s a far cry from the kind of safety that you get in essentially any other language (except C): it is a reference with unknown lifetime, and the toolchain does not help track that lifetime.

I think that Stroustrup would say that the C++ Core Guidelines fix this, and I think he’s referring to this:

https://github.com/isocpp/CppCoreGuidelines/blob/master/docs...

which seems like it’s maybe partly implemented in some version of Visual Studio and was maybe prototyped in clang. But it does not seem to be a well-specified language or a fully-implemented language, and I can’t use it now.

So, as far as I’m concerned, I can use Rust or Python or Perl or Go or Swift or bash or Java or JavaScript or Haskell or Lisp or Scheme or OCaml or Tcl and I can manipulate strings without worrying about undefined behavior. Or I can use C++ and worry. Or I can dream about using “Modern C++?”


> basic_string_view is new in C++17. It sure beats pointers, but it’s a far cry from the kind of safety that you get in essentially any other language (except C): it is a reference with unknown lifetime, and the toolchain does not help track that lifetime.

In a decade of writing C++ I never experienced dangling pointer bugs _until_ I started using string_view. Using it safely in multi-threaded environments is damn near impossible. I now don't let them escape the enclosing scope, and if I have to, then I make a std::string from it.


If you’ve only been using C++ for a decade, you may have missed out on auto_ptr being modern. But have you never used an iterator? :)


C++ is safe if you ignore most of its libraries and write everything from scratch, making safety your #1 priority, ahead of performance and everything else, and then doggedly stick to using nothing but the safe primitives you have created.


I guess you had better never use a class template, then, because that typename T parameter and associated member of type T is polymorphic over whether it is a reference or pointer to a value owned elsewhere.


How you instantiate a template is under your control (except when it's a decision in a legacy codebase you've inherited).


This sounds like the standard under which C is safe because I, personally, don’t make mistakes.

If foo<string> is fine, but foo<string_view> compiles successfully, generates no warnings, and is nonetheless UB, accessing a dangling pointer that merely usually still contains the expected value, then the overall programming environment is not safe.
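
A minimal sketch of how that plays out (the class is hypothetical):

    #include <string>
    #include <string_view>

    template <typename T>
    struct Foo {
        T value; // owns a copy when T = std::string,
                 // silently borrows when T = std::string_view
        explicit Foo(const std::string& s) : value(s) {}
    };

    Foo<std::string_view> make_foo() {
        std::string s = "temporary";
        return Foo<std::string_view>(s); // value dangles once s is destroyed
    }

Foo<std::string> here is perfectly fine; the string_view instantiation compiles just as quietly and is UB.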

C++, as it exists today, makes it very very difficult to build a safe system while still using most of the features that make C++ worth using. Every other language I have as an example has an ecosystem in which one can pass values around such that the semantics are actually defined for any remotely reasonable program that can be executed. The GC and automated-reference-counted languages keep values alive (although Python does struggle to make it clear whether one has captured an immutable value or a remotely mutable reference). The languages that don’t really have references (e.g. bash) copy values as needed. Rust says “you have full control as to how you instantiate things, but you must prove to the compiler that you did it right, and I will steer you toward patterns that make this relatively painless.” C++ has none of this.


I have absolutely no idea what point you're trying to make.

I clearly wrote "C++ is safe if you ignore most of its libraries and write everything from scratch [...] and [...] stick to using nothing but the safe primitives you have created".

If you're using foo<string_view>, where string_view is the std::string_view one, then you're not ignoring an unsafe library feature.

Regarding templates, you could at least bring up argument deduction. It's possible to end up with a string_view template parameter implicitly, without using foo<string_view> syntax.


Safe primitives such as garbage collection?


> Of the billions of lines of C++, few completely follow modern guidelines, and peoples’ notions of which aspects of safety are important differ. I and the C++ standard committee are trying to deal with that

If only people were perfect, then things would be perfect.

There’s, what, 40 years of evidence to suggest that most people, most of the time, simply cannot write memory-safe C++ code (50 years if you count C.)

Maybe we should continue the experiment for another 50 years, just to be really sure the language is the problem.


Have we tried genetically engineering perfect C++ programmers? I feel like that might be possible in another 50 years.


If they don't have to be human, probably 10-15.


> If only people were perfect, then things would be perfect.

That argument doesn't hold up too well, considering that people are involved either way.


It’s been holding up pretty well for the specific memory safety guarantees offered by modern languages, and for a simple reason: if your language is unsafe and has a bunch of foot guns in it, all the users of that language have to be perfect all the time to avoid trouble. If your language provides a certain safety guarantee, the implementors of that guarantee have to be perfect, but at least you have a relatively constrained point to test, perhaps even something small enough to provide a formal correctness guarantee. You can’t do that writ-large with end-user code.

The flipside, of course, is that language or tool problems have a much, much larger blast radius.

Still, you can twist yourself into knots to try to claim otherwise, but the track records of these tools speak for themselves.


> It’s been holding up pretty well for the specific memory safety guarantees offered by modern languages

That seems unrelated to whether people need to be perfect to avoid security exposures. In fact, it would seem to be in contradiction.

> if your language is unsafe and has a bunch of foot guns in it, all the users of that language have to be perfect all the time to avoid trouble

That's an entirely different argument, and depends on the semantics of "unsafe" and "foot guns". I think it's more than fair to say that having more protections in place allows for mistakes to avoid becoming vulnerabilities more often, but that's a different argument.


> That seems unrelated to whether people need to be perfect to avoid security exposures. In fact, it would seem to be in contradiction.

It seems to me to be directly related when modern languages make certain types of mistake impossible, imperfect though the creators of those languages are.

> That's an entirely different argument, and depends on the semantics of "unsafe" and "foot guns".

No. It is literally the same argument. People are imperfect, and will make mistakes (that are possible to make.) In the case of C++, we have decades of evidence that people will make memory-safety mistakes over and over and over again.


Some modern languages have protections from some types of mistakes. C++ has some protections for some types of mistakes as well. I agree that there is decades of evidence that people can create security vulnerabilities in C++. I could quibble that C++20 is not the same language that most of those vulnerabilities occurred with, and that if we called Rust "C++24" it'd be no more prone to security vulnerabilities, but that seems like a mostly silly argument.

However, I would point out that some very non-modern languages also offer protection from the same class of memory-safety mistakes, but over decades have proven to be fully capable of having serious security vulnerabilities.

It is also true that we have decades of evidence that C++ programmers aren't perfect, often make mistakes, and often make mistakes that don't lead to security vulnerabilities, both due to protections in the toolset, and because lack of perfection doesn't mean you've got a security vulnerability.

It'd be great to eliminate a whole class of vulnerabilities by always using tools that prevent them, but you have to consider the price that comes with it. In particular, switching everything over to a new set of tools involves rewriting all that code, and that creates the possibility that entirely different classes of security vulnerabilities (some that C++ might offer protections & mitigations for) could be introduced in to programs that have proven to be secure, sometimes for decades.

It's a far more nuanced problem. Rather than taking an absolutist position, I think it makes sense to do what we've always done (or at least should always do), which is on a case-by-case basis, weigh the risks and make the appropriate choices. I'm sure that will lead to C++ not being used in a lot of cases.


Programmer education, tools, language standards and best practices are all vastly different than 50 years ago. That's like pointing at a Ford Edsel and then claiming that modern humans can't make good cars.


I’m not pointing at a contemporary of the Edsel. I’m pointing at people today, using the latest versions of these tools, making the same class of mistake someone might have made 4 or 5 decades ago. The tool is the problem.


Ironically, Margaret Hamilton _was_ able to produce a bug-free program back in the 60s. However, no one is seriously considering replicating her techniques because they (some of them, at least) are way too expensive for modern software.


1) The Apollo Guidance Computer software was not bug-free[0]

2) The AGC software was not written in C++

The NASA approach to software development undoubtedly results in high quality assurance, but at a huge productivity cost that most commercial shops could not shoulder.

What do you do about the huge amount of software that needs to be developed that is too complex and cannot justify things like having 5 independent concurrent executions running two completely separate but functionally identical code bases tested by a completely independent adversarial test team?

I’d argue you could start by using tools that don’t allow, let alone encourage, your programmers to make common mistakes with catastrophic consequences.

[0]: see, for instance, https://ibiblio.org/apollo/Documents/COM-1.pdf


In 1958 35,000 people died on American roads. Last year, 43,000. So yeah, cars have gotten safer at about the rate that C++ has.


Just as what's damning to C++ is how its security track-record compares to other programming languages available today, it's how the US's current number of roadway deaths and serious injuries compares to other developed nations that is so embarrassing and depressing:

> [I]n the last 30 years, the US has not kept pace with tumbling traffic death rates in Europe, east Asia and Canada. In 2021, as the US hit a 16-year high for fatalities, Japan and Norway posted the lowest number of road deaths since the 1940s.

> The contrast is especially striking among so-called vulnerable road users, a category that includes walkers as well as those using bikes, scooters and wheelchairs. According to the OECD, pedestrian deaths in the US rose over 40% from 2010-18, more than twice the pace of any other member country (most of which saw a decline).

https://www.bloomberg.com/news/features/2022-11-03/why-us-tr...


The rate of deaths has dramatically declined. A lot more people are driving, and they're driving a lot more.


Hehe, but if you count "broken dependencies" as accidents/fatalities akin to bugs, it's way safer than all new languages by virtue of being too old/disorganized for a built-in packaging system.

The left-pad incident was a 4,000,000-car pile up!


An Edsel, 70 years later, still doesn't have seat belts, air bags, a crumple zone, and a crash-test rating.

Almost all modern cars are better.


He seems to be in denial. I watched Stroustrup's CppCon 2023 talk about safety[0] a few months back. He spends about an hour talking about how important safety is and all the new things C++ offers to write safe code. Why did he suddenly start caring about safety now?

C++ is like a mad hatter's bad acid trip and somehow people are convinced it's still a great language to use in 2024.

[0]: https://www.youtube.com/watch?v=I8UvQKvOSSw


Is that the talk where Stroustrup avoids even saying the word Rust for the entire talk even though the talk is 100% a reaction to Rust?

On the other hand Herb Sutter recently wrote an interesting and grounded article on his views of Safety. While I don't agree 100% with him, at least he acknowledges the position C++ is in.

Edit: The article:

https://herbsutter.com/2024/03/11/safety-in-context/


I’m going off old memory here, but I happened to have his C++ book in the late 90s. I read it before I learned C. In the intro he claimed strongly there was no need to learn C, just learn C++. I was young and just followed his wisdom. Obviously there’s zero replacement for learning C before C++. I can at best say I lost a lot of time confused about basic stuff, until I gave up, ignored him and learned C like I should have. I was in school; I did not expect to come across faith-based software development. That’s a different course.


The obvious answer would seem to be he's afraid of the (slow) death of C++ which is his legacy/child. No one wants to see their work thrown on the scrap pile of history.


He isn’t talking about memory safety and keeps avoiding it.


Stroustrup needs to realize that "just wait a few more years" isn't an acceptable answer when you're already a decade late to the party. The white house is not some radical pioneer at the frontier of programming language design. By the time it says anything on the subject, it's been obvious to everyone else for years.

There might be a reasonable discussion here if we were discussing profiles when the earliest incarnations of it appeared around 2015, but we're not. It's 2024 and they're still not in the standard. There isn't even a clear proposal for compilers to begin implementing. Once there is, profiles will still be an optional, partial, and incremental solution at best. They won't even fulfill Stroustrup's stated desires to address all kinds of safety (for which there hasn't even been discussion yet).


Bjarne is simply pointing out that if we wait for static analysis tools to become sentient thanks to upcoming breakthroughs in AI and quantum computing, we can finally have fewer CVEs in C++ software.

> The white house is not some radical pioneer at the frontier of programming language design. By the time it says anything on the subject, it's been obvious to everyone else for years.

That we should return to Ada? ;p Oh, wait, sorry; I have no idea what that is. We need Rust.


Stroustrup, as always, fails to recognize the vast surface area of C++ features, foot cannons, and the heavy weight of C compatibility around C++'s neck.

C++ barely made sense in 1995. It makes absolutely no sense today.


I think the funny thing is you no longer even get peak performance from C++. In many ways Java is running rings around C++ performance. Partly, it's because you get state-of-the-art peak optimizations for free from Java, and you'll need a team of 20 full-time build engineers to get a peak C++ artifact with PGO, LTO, and post-link optimizations. Partly it's because the speed of C++ is illusory, with the superficial success coming at the start of a project, followed by years of monotonic performance degradation as the flaws in the original are iteratively discovered and remediated. We see this very clearly with gRPC, where the performance of the C++ implementation has been cut in half over the last 3 years while the Java implementation, which is faster today than ever before, is the performance champion.


I acknowledge C++'s safety concerns, but no, Java is definitely NOT running rings around C++ perf. Not a single AI/ML model is implemented in Java. The core of AI/ML runs on C++ only. You may see a lot of Python, but the core engine that Python is wrapping is written in C++. Sorry to break it to you but Java cannot even come close here.


"Multiply an insane amount of stuff" is not an interesting point in complexity-safety-performance space.


> "Multiply an insane amount of stuff" is not an interesting point in complexity-safety-performance space.

So in other words, Java is bad at scaling a basic math operation like multiplication?

> "In many ways Java is running rings around C++ performance"

The AI space is proof positive of that being false. C++ is kicking Java's butt, squeezing literally trillions of ops per second of performance, all of this happening before the Java runtime can even start up.

The first transformer-based language model was built with PyTorch; imagine if PyTorch was a python wrapper for Java code instead of C++ code? It'd be garbage. I mean how freaking long did it take for Java to even get a basic Vector API? C++ devs have been writing SIMD-optimized code for years now. And besides that, try getting a Java program to scale across tens or even hundreds or thousands of GPUs for a large language model. You'd need GPUs with literally 20-30% more RAM just to accommodate the GC.


The reason for that is mostly that CUDA only supports C and C++ natively.


Try writing high performance math in Java even for CPU.


> you'll need a team of 20 full-time build engineers to get a peak C++ artifact with PGO, LTO, and post-link optimizations.

As someone working in a team of 3 with a c++ code base this statement is hilarious hyperbole.


How do you explain the almost complete absence of peak-optimized software in the real world? Percona, an entire company dedicated to supporting one stupid program, does not offer a PGO MySQL, much less a BOLTed one. "Try to enable PGO" is still an open issue for Envoy, an 8-year-old project with multiple gigantic companies contributing.


> How do you explain the almost complete absence of peak-optimized software in the real world?

How do you explain the almost complete absence of consumer software written in java where performance matters? Web browsers, databases, video codecs, high end games, all written in C++. According to you they should all be written in java for maximum performance.


> followed by years of monotonic performance degradation as the flaws in the original are iteratively discovered and remediated.

This doesn't make any sense and nothing in this comment is something an experienced optimizer would say.

I'm not sure where the fantasy comes from that java is going to beat C++, but anyone experienced in optimization is going to control their memory allocations, then control data access being linear so the prefetcher works well, then control memory alignment, then worry about SIMD and multi-threading.

When people talk about optimizing in java, it's usually about turning off garbage collection, fighting with the garbage collector, etc. Memory allocation is trivial in C++ because so much ends up on the stack and preallocating memory is trivial. Java optimization gets stuck on step 1, trying to control memory allocations and pointer chasing.

Show me a program in java and I will show you how it can run faster in C++ (and possibly even faster in ISPC).


C++ does indeed give you this level of control.

Most C++ code I have seen does not even approach this level of care and customization.

When we are talking about speed, most people mean basic out-of-the-box speed of the code without heroic efforts. C++ and Java are a wash at this point performance wise. Most analytics code I have seen on Wall Street was either Java or C++, with the choice driven by what the quants wanted in their resume, not performance.


Even the heroes eventually capitulate to complexity in C++, compromising performance. It is virtually impossible to write a realistically complex zero-copy RPC server in C++ because the lifetime issues are too daunting. You don't get it "for free" in Java but because the language and the JVM have formalized the lifetime problem you can write a zero-copy RPC service very, very cheaply. Even Go can beat C++ in this use case, for the same reasons.

All the other stuff in this thread amounts to toy problems, from a software engineering complexity standpoint. Yes, you can write a fairly good CFD kernel in C, C++, or fortran. There are no lifetime or boundary issues in these use cases. The safety of the language in such cases is of no interest.


> Even the heroes eventually capitulate to complexity in C++, compromising performance.

It is usually dead simple. Reserve memory in a vector, put data in, loop through it linearly. Modern C++ is very simple most of the time. Where are you getting this idea?

> It is virtually impossible to write a realistically complex zero-copy RPC server in C++ because the lifetime issues are too daunting.

This is not only untrue, it doesn't even make sense. What is it that you think can be done in java and what are these lifetime issues you think are in the way? You didn't give any actual technical examples, so feel free to show something real and back up this claim.

> Even Go can beat C++ in this use case, for the same reasons

Prove it, let's see where these ideas are coming from, because I don't think they are coming from experience with C++.

> The safety of the language in such cases is of no interest.

This is also overblown in modern C++. It is easy to boil things down to value semantics and let simple lifetimes manage resources with scope. When it isn't something you can do with scope and you have to manage resources yourself, the language won't help you anyway since you are writing it yourself.


> Reserve memory in a vector, put data in

You've already proven that you don't know what I am talking about and now you are doubling down. There is no way to put data into a std::vector without copying it. Vector can't adopt memory that already exists. So your plan doesn't suit the use case I am discussing: zero-copy RPC servers.


Focus up, these are two different things. You said complexity, I said looping through a vector.

You said "zero-copy RPC servers" (for some reason) but never gave any evidence of what you are saying or explained it technically in any way.

I'll ask again, show me what is being done in java that can't be done in C++. Show me technically, prove what you are saying.

I know how these conversations usually go. It's mostly the other person trying anything they can to not give evidence of what they're saying.


> Most C++ code I have seen does not even approach this level of care and customization.

True, but all of this isn't necessary to be faster than java.

> When we are talking about speed, most people mean basic out-of-the-box speed of the code without heroic efforts.

Fair enough in general. All it takes in C++ is allocating memory out of hot loops and looping through memory linearly. This basically eliminates both memory allocation and pointer chasing as bottlenecks. These two things alone can make a program 100x faster or more.

In C++ this is often also the simplest possible way to do something, although unfortunately that doesn't mean programs are always written like this.
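
The pattern being described, spelled out (sizes and names are illustrative):

    #include <vector>

    long long sum_squares(int n) {
        std::vector<int> data;
        data.reserve(n);            // one allocation, outside the hot loop
        for (int i = 0; i < n; ++i)
            data.push_back(i);
        long long sum = 0;
        for (int x : data)          // linear walk over contiguous memory,
            sum += 1LL * x * x;     // which the prefetcher handles well
        return sum;
    }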


I'm not sure which gang is the more delusional, the java one or the c++ one. That said, being familiar with google's c++ (especially protobuf) codebases, I don't doubt your claims. Let's just say there's a reason people use alternative runtimes like nanopb.


> Of the billions of lines of C++, few completely follow modern guidelines, and peoples’ notions of which aspects of safety are important differ.

The fundamental problem is they’re just guidelines and they’ll always be just guidelines. You can still do all the wild old stuff without so much as a warning and you’ll have to figure out how it even interacts with the new stuff which exposes yet another vector for failure.


Linters nudge you about not following guidelines, and project policies turn the nudges into enforcement.


Which is fine until you have third party dependencies.


Even such basics as initialization are full of traps in C++; it's truly a language that's awful to produce anything safe with. C, on the other hand, isn't a bad language, but C's problem comes from the horrible standard library and bad standard. You can create a much better C by passing -fno-strict-aliasing -ftrapv -fsigned-char to the compiler even now. Heck, rust's unsafe is more unsafe than C. "C" could be improved with a better compiler that ignores the standard, but perhaps it wouldn't be C anymore even if the syntax remained the same.
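
Two of those initialization traps in a minimal sketch:

    #include <vector>

    int main() {
        int x;                    // default-init of a scalar: indeterminate value
        int y = x + 1;            // UB, and often silent without -Wuninitialized

        std::vector<int> a(3, 7); // three elements, each equal to 7
        std::vector<int> b{3, 7}; // two elements, 3 and 7: initializer_list wins
        return y + int(a.size() + b.size());
    }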

C++'s problem is not helped by the compilers either; shoutouts to MSVC++ for accepting absolutely wrong and horrible code by default.


>> "C" could be improved with a better compiler that ignores the standard, but perhaps it wouldnt be C anymore

Then it wouldn't be C.

C is 52 years old. For most use cases, there are better options.

Instead of trying to apply bandages to C or C++, why not move to something better?

Legacy code written in C or C++ can be maintained or rewritten as needed. New code should not be written in C or C++ because it is not safe--thus the White House warning.


I've already moved to zig since it's the only real C alternative out there right now. I'm just making the point that C itself, as a written language, is not bad. You could make something that is very similar but avoids most of the common issues with C (which are much fewer than with C++). C won't die, and interop will be the C ABI for a long time still.


IMO the problem with C isn’t just the horrible standard library. It’s that you can’t make a better standard library — the language constructs needed to abstract almost anything don’t really exist.


The standard library contains many functions that simply should be dropped. All the globals like errno and locale should be removed. Instead of a global allocator, an allocator struct should be passed around (as in zig). Headers like stddef and stdint in the standard library should be in the language instead.

https://github.com/Cloudef/zig-budoux/blob/master/src/c.zig#...


All true, but that still doesn’t mean that there will ever be a genuinely nice string type or map from string to int or a nice HTTP or any similar thing where some of the data types are dynamically sized.


A struct and a few functions and you get there. Look at zig's std lib. A "string" type is also IMO an anti-pattern: https://mortoray.com/the-string-type-is-broken/

People often complain about null-terminated "strings" in C, but I don't really see them as a problem. It is rarely a performance bottleneck, and when it is, you can opt for your own "string type". There is also bstring, which is compatible with these strings but prepends a header to the data which contains the length. That said, I wish C had language-level slicing.


Really sad to see such an influential computer scientist lose interest in advancing computing for the perceived slight against his legacy.

Ironically, I think Stroustrup is actually doing more harm than good to his reputation by "evolving" C++ rather than simply putting it in maintenance mode and contributing to a modern language.


I agree. I think he could have done wonders starting fresh with something new.


This link is a mobile (AMP) version of the article that has been submitted already:

https://news.ycombinator.com/item?id=39748953


So what it sounds like:

Stroustrup: The future is yet-to-be-defined profiles.

Sutter: The future is a yet-to-be-defined C++v2 that's backwards compatible but also solves the problems.

Chandler: C++ compile times are too slow. Going to build a brand new front-end that really fixes compile performance, maybe fixes memory safety, and lets you call C++ code.

My take:

Re Stroustrup: From what I've seen, Stroustrup's ideas don't seem particularly likely to make it to the final standard, and that's when he already has a formal proposal written up. Prognosis: Modules were much simpler (conceptually at least), took ~6 years, and even 4 years after coming out have seen minimal adoption. This definitely isn't on track for C++26 and likely will encounter real-world roadblocks by C++30. If everything goes well, the industry would be ready to start adopting profiles in ~10 years. Or they could start using Rust today. Still quite vague in terms of whether or not this idea can be implemented by compiler authors.

Re C++v2: Vague hand-waving without a real plan on how to be meaningfully competitive with Rust. Since it's an experiment and experiments can fail, unlikely to lead anywhere and unlikely to see industry adoption.

Re Carbon: interesting experiment and the most likely real-world candidate to displace C++. Being done by a company with a massive C++ codebase that would benefit from migration to Carbon. From the looks of it, they're spending most of their time on compile performance. Neat technical experiment but not really showing that a language that's bidirectionally compatible with C++ can meaningfully improve on safety. Also, Rust compile times have improved significantly in the past few years and are better than C++ in my experience (but of course, hard to have a fair comparison of the two and I'm compiling these days on a much faster machine than I've used for C++ in the past). The recent Cranelift work might reasonably speed things up another ~2-3x.

EDIT: TLDR: C++ today is not safe enough, and there's no telling when it will be, regardless of the work going on to try to make it safer.


hooray! a down-to-earth, stay-with-the-facts response! thank you for that. if I'm not mistaken, Chandler and Sutter lead efforts to untangle what we have, such that we can then improve upon it, with better templates/metaprogramming or borrow annotations or bounds-checking containers etc.

for C++ especially it's really two things that make it hard: its idiosyncrasies (that's what they work on) and then in addition the lack of these new abilities.


Bjarne Stroustrup: Remember the Vasa! (2018) https://open-std.org/JTC1/SC22/WG21/docs/papers/2018/p0977r0...

Bjarne Stroustrup 2018: We are on a path to disaster through enthusiasm and design-by-committee (or rather “design-by-committees”). During the early days of WG21 the story of the Vasa was popular as a warning against overelaboration (from 1992):

“Please also understand that there are dozens of reasonable extensions and changes being proposed. If every extension that is reasonably well-defined, clean and general, and would make life easier for a couple of hundred or couple of thousand C++ programmers were accepted, the language would more than double in size. We do not think this would be an advantage to the C++ community.”

“We often remind ourselves of the good ship Vasa. It was to be the pride of the Swedish navy and was built to be the biggest and most beautiful battleship ever. Unfortunately, to accommodate enough statues and guns it underwent major redesigns and extension during construction. The result was that it only made it half way across Stockholm harbor before a gust of wind blew it over and it sank killing about 50 people.”

“It has been raised and you can now see it in a museum in Stockholm. It is a beauty to behold - far more beautiful at the time than its unextended first design and far more beautiful today than if it had suffered the usual fate of a 17th century battle ship -- but that is no consolation to its designer, builders, and intended users.”


I wish that rather than making excuses, C++ apologists switched to trying to just make the whole language memory safe.

It’s possible. There just aren’t good incentives in place to do it.


I've long held that belief. But doing so is very technical and political, and not fun.

Turns out what is more fun is to create a new language and rewrite the whole ecosystem. So a lot of the energy is going there instead.

Programming is part fashion...


Yeah. Building a new language from scratch is definitely easier than retrofitting new properties into the semantics of an existing one.

Hasn’t stopped me from trying though. :-)

And the CHERI folks are trying, too.

So it’s really more of a political problem than a technical one. The excuse extravaganza that we’re seeing from Stroustrup and Sutter doesn’t help at all.


What makes me wary of jumping to Rust is the async stuff. From what I read, colored functions were introduced and most of the libraries that do useful stuff adopted async and kind of force you to be aware of it in your own code.

For me, the perfect C++ replacement would be something like Go without the runtime burden. I'm not sure that this exists, so I use Go whenever I can and C++ in the ultra rare cases where I can't.


This "color" thing is a terrible miscommunication, because the original article is about two things that don't even make sense in Rust's context:

• an architectural limitation of JavaScript which Rust doesn't have (in Rust you can wait for an async result in a sync function, or run CPU-heavy code from an async function).

• a wish that languages had implicit magic that made sync and async calls look the same, which Rust intentionally doesn't want, because implicit magic is a terrible footgun in low-level code. It can be hidden in a high-level VM language that is in charge of all I/O, syscalls, and locks. But that is counter-productive for a low-level systems language for implementing I/O drivers, kernel syscalls, and custom locking primitives.

So Rust is "purple".

In reality Rust's async is awesome for what it is: a syntax sugar for state machines, which is able to flatten an entire call tree into a single fixed-size struct.

Most other async architectures need at least one allocation per async call or per await, but Rust needs one allocation per the entire call graph with any number of async calls and awaits.


If you want to call an async function from a sync context, use `futures::executor::block_on`. Nobody is forcing you to use async.


I just want a safe-ish language that doesn't nerd-snipe 95% of software engineers into creating glorious and incomprehensible syntactic monstrosities that go nuts with allocations and hidden function calls. Really hoping Zig getting some more traction.


Table saw manufacturers wonder why X-Acto knife users are so worried about their fingers getting cut off. The simple and obvious explanation is we now live in a world where many people are too lazy and/or dumb to spend time learning the tools of their trade.

3..2..1


With all due respect Stroustrup seems completely delusional about real software out there in the wild. He needs a dose of reality.


Can C++ be safe? Absolutely! Does a jr. developer know how to avoid all the pitfalls and craft this magical, safe code? No examples can be shown.


The amount of iconoclastic knee jerking in this thread is kinda nuts. Equating this rebuttal to an old man yelling at clouds? Saying Cpp never made sense?

The first step to solving a problem is accepting reality. Cpp has been foundational, like C, to our computing world. If the rich legacy of libraries that underpin our "better" language choices is offensive to us, or if we really believe we are powerless to improve the situation around the most popular programming language(s) then we're not being realistic.

This is a huge debate of national importance and it'll shape programming language design for decades. It's important to get this right, and that will take more than just one choice and more than one approach.


> The first step to solving a problem is accepting reality.

Ok, the first thing I’d like to accept is that C++ is just not safe enough for most applications. And yes—you also can’t throw C++ in the garbage.

Both of those statement are part of our reality—C++ is unsafe, and we will use it anyway. That’s why we solve this problem on two fronts. First, we advise programmers to ditch C++ for safer languages, when reasonable, because C++ isn’t safe enough for most needs. Second, we invest a lot of energy into making C++ safer, with better tools, safer libraries, changes to compilers, static analysis, run-time instrumentation, etc. It won’t close the gap—C++ is still unsafe and will be for the foreseeable future—but it will make a big difference to the people who, for whatever reasons, still use C++ despite its safety problems.

The C++ FAQ has a better picture here:

https://isocpp.org/wiki/faq/big-picture

> In 99% of the cases, programming language selection is dominated by business considerations, not by technical considerations. Things that really end up mattering are things like availability of a programming environment for the development machine, availability of runtime environment(s) for the deployment machine(s), licensing/legal issues of the runtime and/or development environments, availability of trained developers, availability of consulting services, and corporate culture/politics. These business considerations generally play a much greater role than compile time performance, runtime performance, static vs. dynamic typing, static vs. dynamic binding, etc.

There are a lot of good reasons to use C++. C++ is also unsafe. Both are true.

We don’t need to write long apologia explaining why C++ is actually safe, and you shouldn’t be looking at legacy code, or you need to hire better programmers, or you’re using the wrong tools or wrong practices or something like that. Ultimately, those arguments don’t withstand scrutiny.


I've yet to see a valid scenario where C++ is superior to Rust, Python and Go.

Use Python. If you need concurrency, then use Go. If you need even more performance, use Rust (using unsafe Rust only for the parts that need it). For the highest performance stuff, maybe consider C for critical parts only.

C++ is not safe. It's a minefield of things that compile but are memory management mistakes. And then you're like "Look, I have a map of the minefield. If we just make sure we don't step on any mines, we are completely fine."


I can think of a few scenarios for C++ off-hand. Most of them involve integrating with libraries or frameworks that are written in C++, running on platforms that have good C++ toolchains, or working with verification systems that can process certain C++ or C subsets.


Yes, but all of these are essentially all because the existing code is already based on C(++). We should be able to move to Rust sooner than people think.


Should ≠ will.

We still have Fortran and Cobol. It’s easy to imagine that Rust will replace C++ Real Soon Now, because you just have to imagine that all of the legacy code disappears, and Rust gets all the tooling support that C++ has.


> "Look, I have a map of the minefield".

This doesn't exist. WG21 (the C++ language committee) has considered the idea of writing an appendix with such a map - a list of the UB (Undefined Behaviour) and IFNDR (Ill-formed, No Diagnostic Required) clauses in the language but this work has yet to be undertaken and I see no reason to expect it in C++ 26.

I have been in plenty of disagreements with C++ proponents here and on r/cpp where it's clear that there isn't even agreement on what the standard means today in respect of these problems, if you guess one way and the people who implemented your compiler judged differently then your program may have defects you didn't even realise were possible.


> I've yet to see a valid scenario where C++ is superior to Rust, Python and Go.

What? Python and Go are used in entirely different domains. Not everything is a web backend!

Today C++ is used mainly for performance critical applications: HPC, realtime audio, video editors, game engines, web browsers, operating systems, etc. In these fields, Rust would be pretty much the only practical alternative, but it still needs to catch up with the massive and mature(!) C++ ecosystem. Things like Eigen cannot even be implemented efficiently in Rust because its metaprogramming features are still too limited.


It’s not just the lack of safety. It is the whole insane gaggle of features, from multipage template error messages to “diamond inheritance” to experts yelling at each other over temporary lifetimes.

The language has always been a dumpster fire. The best thing I can say about it is that is spurred development of many other languages to get away from it.


Isolated experiments and a ranked-choice ordering of languages are not a realistic picture of the industry.

In robotics, everything is C++. There's plenty of python being used to train networks, but that's not because of performance, safety, or anything; it's mostly borne out of key libraries existing in Py. But those libs are just wrappers on C (cuda, essentially).

Essentially the whole of every robot is C and C++. Essentially the whole of every airplane is C, C++, and a scattering of memory-safe languages in isolated corners. The ATC system, the rail system, and most industrial processes are C or PLC, or maybe C/PLC generated by MATLAB/LabVIEW. Automobiles, basically everything with a microcontroller. It's all C.

Our scientific computing? Fortran. Node.js? A bunch of C++.

What's my point? It's that for any new project or extension of above projects, the existing language is a superior choice (as viewed by managers, business leaders, CTOs, or rushed grad students trying to get quick results, etc), simply because the legacy provides a quicker startup. This is the "reality" - we have had better options essentially forever, and I feel we are effectively stuck with C/C++ forever. It's just that we'll see less and less of it if the new communities are diligent about extending the existing ones. Otherwise, it will never make sense to start clean, not on a mass production level, or at least not this decade or likely next.

C++ is a bad choice, and it is the choice. It can be both a prevalent "obvious" choice and also a bad one. The existence of a better language does not shift reality on its own. You need targeted investment to develop a replacement ecosystem built around that better option. Whether that is Rust, or safe C++, or C+borrow checking, or Dada, or the language of the minute. We as a community cannot keep screaming about how nice a new idea is without building out the ecosystem to make it the obvious idea.


Don't use Python. You will produce unmaintainable code that will have to be thrown away in 5 years because it is impossible to refactor.


Interesting. Are you sure it's the language and not the programming style? There doesn't seem to be anything inherent in the language preventing you from writing maintainable Python code.


Reasonably sure. As soon as one of your dependencies gets a major version bump with incompatible API, the clock starts ticking.


I have no experience with refactoring a Python project. Wouldn’t type hints make it only somewhat more painful than refactoring Rust?


Only if they are consistently applied.


yes!

and a core part of getting this right is the maturity of other features, like generics/templates, duck typing instead of or in addition to inheritance, portable and well-defined data memory layout, and ease of reuse of existing mature C++ libraries (think numerical Fortran). I consider a language-defined module system, straightforward parsing, and higher-level concurrency (like channels) almost icing on the cake...

nobody is served by a forced rush from one idiosyncratic complexity to another.

and imnsho the core issue with C++ is _not_ "safety" but the complexity of the language as such. it's just way too easy to write code that leaks or writes out of bounds, in C and still in C++.

and in that vein I really love Carbon and Herb Sutter's work, and don't get me wrong, Rust and even good old Ada also got a lot right.

but please, can we discuss the better next-gen systems languages as if they were not religions, but technology choices with ups and downs?


C++ is an unsafe language in all of its current implementations except CHERI.

Stroustrup is being tone deaf or maybe he just doesn’t get it.

The issue is that there are so many ways in which you could write a C++ expression that violates memory safety and gives users control of your heap.

In Java or other truly safe languages, there are zero ways to do that short of pwning the JVM with a bug. In Rust and other safe systems languages, to do something unsafe you have to call it out using the unsafe keyword.

So - the places in your C++ code where you might have a memory safety violation are everywhere while in the alternatives they are either nowhere or they are carefully demarcated.
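
A hypothetical one-liner of the sort being described (made-up function; it compiles without complaint):

    #include <cstring>

    void remember_name(const char* name) {
        char* buf = new char[16];
        std::strcpy(buf, name);  // any 'name' over 15 bytes writes past the
                                 // allocation: heap corruption, and with
                                 // attacker-controlled input, heap control
    }

The equivalent copy in a safe language either goes through a checked API or sits inside an unsafe block you can grep for.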


Tangentially, nothing in my career made me as angry as working through Bjarne's book, because none of the examples worked (that was an early edition of the book; maybe later editions added the necessary includes etc.)


I mean, you can call it a rebuttal, but this:

“There are two problems related to safety. Of the billions of lines of C++, few completely follow modern guidelines, and peoples’ notions of which aspects of safety are important differ. I and the C++ standard committee are trying to deal with that,”

sounds like an admission that, following decades of improvements and modernisations to C++, safety and quality remain a practical concern in most actual C++ codebases. In many ways it’s surprising that it has taken this long to call time on it. I can understand Stroustrup’s frustration; the work they’ve been doing has been excellent, but there’s nothing stopping industry or the government switching to other options that are making better headway against problems like this.


> Improving safety has been an aim of C++ from day one and throughout its evolution. Just compare the K&R C language with the earliest C++, and the early C++ with contemporary C++

This rhetoric from Stroustrup comes off as disingenuous; what saves it from being outright dishonest is the wording "an aim". As in one of many. Not "the aim".

Firstly, of course we get a jump in safety from K&R C to C++.

But most of the development of C++ has been driven by a balancing act between multiple factors, only one of them being safety, and not always the one with top priority: ease of implementation, performance, safety, and backward compatibility.

We can point to recently introduced library features that are not safe, and easily identify backward compatibility issues that prevent improvements in safety.

In the 1990s, C++ introduced a standard library of containers. If safety had been the top priority, iterators would never have had undefined behavior when the target object changes, and std::vector would have been impervious to out-of-bounds accesses. These things could easily have been achieved in what is just library code. A C++ developer can easily ignore the library and develop their own containers with safe iterators, vectors that can't be misused, and other elements. The standard didn't do that because of those other factors: ease of implementation and performance.
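
Both of those decisions are still visible in one screenful of ordinary-looking code (a sketch; the checked alternatives, which have existed all along, are shown in comments):

    #include <vector>

    int main() {
        std::vector<int> v{1, 2, 3};

        int a = v[10];          // out of bounds: UB, no diagnostic required
        // int a = v.at(10);    // the opt-in checked form throws instead

        auto it = v.begin();
        v.push_back(4);         // may reallocate: 'it' silently invalidated
        int b = *it;            // UB: the iterator contract the committee chose

        return a + b;
    }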

(Can someone point to three situations in the development of C++ in which safety ran up against efficiency or implementation ease, and did not lose?)



TL;DR: Bjarne Stroustrup is frustrated that the authors of the government's proposal don't realize that, theoretically, there may exist teams of talented developers who are able to consistently write safe C++ code.


Yeah. The language you pick doesn't magically make you Fort Knox regardless of "memory safety". Rewind in time and the White House would have been berating all of us to write in Java... you know... a "memory safe" language, only for one of the worst security fails ever, Log4Shell, to come along.


But language and tooling can protect you from error classes, and when the error class (“memory safety”) factors in up to 70% of security vulnerabilities (by some estimates), it makes sense to pay attention to tools that can protect you from those errors, rather than lament the lack of protection for the other 30% of errors, which, by the way, could just as easily have happened in a language without strong memory safety protections.


70% sounds a bit high but I can accept it.

You can do a lot with regard to memory safety in C/C++ too: https://llvm.org/pubs/2006-05-24-SAFECode-BoundsCheck.pdf

Anyway. Writing safe code in C is HARD but possible. Especially with good tooling. That is not to say everyone should use C. C is an exceptionally hard language to use safely and correctly and is not for everyone.
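
On the tooling point, the sanitizers shipped with GCC and Clang are the usual starting point. A minimal demo (standard flags; file name is made up, and the same flags work for plain C):

    // build: c++ -g -fsanitize=address,undefined demo.cpp && ./a.out
    #include <vector>

    int main() {
        std::vector<int> v(4);
        int* p = v.data();
        p[4] = 1;  // one past the end of the allocation: AddressSanitizer
                   // reports heap-buffer-overflow with a full stack trace
    }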


> 70% sounds a bit high but I can accept it

One meta-source for this is Prossimo[0]. They link to multiple vendor reports that range from 60-90%.

> Writing safe code in C is HARD but possible. Especially with good tooling.

I don’t disagree in theory, but I think it is so hard as to be impractical in almost every case. So, other than maintaining a legacy code base, why try at this point when other options are available?

0 - https://www.memorysafety.org/docs/memory-safety/#how-common-...


> So, other than maintaining a legacy code base, why try at this point when other options are available?

What options would you recommend in 2024? I write C (not C++) and work on projects that are inherently memory unsafe (the last one required hand-written assembly code). I've explored potential C successors in the past, but have yet to discover one that matches the freedoms, simplicity, ergonomics, and performance of the C language.


Zig. It's not as safe as Rust, but it's a hell of a lot better than C while remaining largely interoperable.


I don't like Java. Nothing about it appeals to me. It pains my eyes to look at it.

And in spite of my distaste for it, I respect that it completely eliminates giant classes of vulnerabilities. You can write bad logic in any language. At least in Java you can't write bad logic that also suffers from memory issues.


> At least in Java you can't write bad logic that also suffers from memory issues.

Oh you bet you can. There are a number of ways to screw that up.

- memory leaks/loitering are quite common; GC won’t help if your code hangs on to stale refs

- using Unsafe memory can blow up in glorious C++ fashion, directly or indirectly

- using native calls incorrectly can be just as nasty


I'd wager Unsafe in Java is far more rare per million lines of code written than unsafe blocks in Rust. Been coding in Java since ~1997. Unsafe just doesn't come up in 99.999% of Java projects.

Now JNI, on the other hand, made up maybe 0.5% back in the day. Not anymore, now that FFI is the predominant native linkage (though it's still comparatively quite rare).

(Cue the one weird guy coming out of the woodwork who claims that all of his Java projects in the last 10 years have used Unsafe to great effect.)


> Been coding in Java since ~1997. Unsafe just doesn't come up in 99.999% of Java projects.

Your project may not use it directly, but there’s a pretty good chance one of your libraries does.


And when I code in C, there's typically at least one library with inline assembly even though my C doesn't use any inline assembly.

There is indeed a difference here between the vast majority taking part in risky behavior and library authors making focused performance improvements. There are a lot more eyes on library code, because of its downstream effects, than on everyday top-level code.


That was a library bug, not a language bug. Allowing external logging configuration by default is not like having a buffer overflow issue.


Yeah the logging bug was bad. That’s one bad bug.

C++ has bugs that bad that are found and weaponized daily.

So, Java is much safer probably by 2-3 orders of magnitude.

Also - the log4j thing shows just how dangerous class loading is. It’s an eval like mechanism. Probably future languages designed with safety in mind should avoid eval-like mechanisms as well as avoiding type system escape hatches.


FWIW, class loading isn't necessary for log4j to be bad - unchecked Java deserialization is sufficient for pretty bad pwnage. So we really should add "no unchecked deserialization" to the list of things that a safe language should have.

Java's biggest mistake here, IMHO, was to have a serialization setup that was based solely on a generic interface rather than having the caller declare up front what class it expects to see. In the latter case, so long as you aren't storing naked Object/Serializable fields, type limits strongly restrict the set of naughty objects that can show up in your deserialized object graph. Rust Serde gets this right.


What I mean by class loading being bad isn’t that it’s bad that you can load a new class, but that it’s possible to pass a string into an API and have that API vend you a class. That’s bad even if it’s a class that exists already and doesn’t need to be newly loaded.

I think unchecked serialization is bad because it leads to that “given a string you get an instance of the class named by it” problem.


Ha! Like C & C++ coders haven't been sending and receiving structs as byte streams over networks. Less common today to be sure, but that was a VERY common practice that obviously led to a lot of CVEs. C coders before the internet went public, working on Novell Netware networks were slinging structs willy-nilly over their token ring and thin net, let me tell you!
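
The pattern in question, for anyone who missed that era (a sketch; the struct layout and 'sock' are hypothetical):

    #include <sys/socket.h>
    #include <cstdint>

    struct Packet {                 // layout is compiler/ABI-specific:
        std::uint16_t type;         // padding, alignment, and endianness
        std::uint32_t length;       // all leak into the wire format
        char payload[64];
    };

    void send_packet(int sock, const Packet& p) {
        send(sock, &p, sizeof p, 0);  // raw memory on the wire; the receiver
                                      // copies untrusted bytes straight into
                                      // a struct of its own
    }

Every field, including 'length', arrives attacker-controlled on the other end, which is how a lot of those CVEs happened.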



