Power, perhaps. But I'm a bit skeptical about usability. Lean doesn't even use one of the most obvious things that make interactive proof systems far more usable - a declarative mode instead of the usual tactics-based scripts. (Yes, you can kinda sorta fake the former with "structuring" tactics, except not really - declarative proofs are really their own kind of thing.) There even used to be systems that automatically rendered inputted definitions and declarative proofs in natural language (given that the basic terms and symbols had been defined beforehand, of course), which lets even an average mathematician easily figure out what the system is up to. You just can't do this properly if all you have is a list of "tactics" fiddling with the prover state.
"Lean doesn't even use one of the most obvious things that make interactive proof systems far more usable - a declarative mode instead of the usual tactics-based scripts."
Citation needed. I can make a perfectly reasonable Isar-style declarative proof in Lean. Just because most users of Lean choose not to do this doesn't mean it can't be done. I should mention that the reason Lean users are more willing to write imperative proof scripts than declarative ones is that 1) the interactive debugger is responsive and easy to use, and 2) writing a nicely structured declarative proof is more work than an imperative one.
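To make the contrast concrete, here's a toy theorem in both styles (a minimal sketch of my own; the declarative version leans on calc, the closest built-in analogue of an Isar proof):

    -- Tactic style: a recipe of state manipulations; the intermediate
    -- goals are invisible unless you replay the proof in an editor.
    example (a b c : Nat) (h1 : a = b) (h2 : b = c) : a = c := by
      rw [h1, h2]

    -- Declarative (Isar-ish) style: each step states the claim being
    -- established, so the argument can be read straight off the page.
    example (a b c : Nat) (h1 : a = b) (h2 : b = c) : a = c :=
      calc a = b := h1
           _ = c := h2

Strip the proof terms out of the calc block and what remains is still a legible mathematical argument; strip the tactics out of the first proof and you're left with nothing.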
How so? Rust is much, much less complicated than C++, a language that's routinely taught to beginners. Python is far more complex than either of those, especially once you account for the incidental complexity in whatever "packages" you're interfacing with - Rust simplifies a lot of that stuff without giving up on performance.
Indeed, the "at a mere fraction of a UBI’s cost" claim outright ignores the economic cost associated with very high marginal tax rates. But Acemoglu also mentions the NIT, which would work a lot better than either. And, to be fair, the best-known economic models of non-linear income taxation imply that marginal tax rates on an NIT should actually be fairly substantial (60% or so would not be out of the question!), so as to keep the break-even point from getting too high; the rates just shouldn't be as high as 100% or more!
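The back-of-the-envelope arithmetic makes the tradeoff obvious (round numbers of my own choosing, not from the article):

    benefit(y) = max(0, G - t*y)     (G = guarantee, t = phase-out / marginal rate)
    break-even income: y* = G / t

    G = $12,000, t = 0.60  ->  y* = $20,000
    G = $12,000, t = 0.30  ->  y* = $40,000

Halving the phase-out rate doubles the break-even point, and with it the share of the population receiving net transfers - hence the pressure toward marginal rates that would look confiscatory anywhere else in the schedule.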
A bit of a clickbait title. Acemoglu does say that a 'UBI' is a bad idea, but then advocates for the NIT (Negative Income Tax), which works in much the same way. What he actually seems to be saying is that way too much UBI advocacy makes unrealistic claims about what a UBI-ish system might look like - and it's hard to disagree with that!
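For what it's worth, the equivalence between the two is a standard textbook observation (G and t here are my placeholder symbols for the guarantee and a flat tax rate):

    UBI:  net income = y(1 - t) + G    (everyone gets G; all income taxed at t)
    NIT:  net income = y(1 - t) + G    (transfer of G - t*y below break-even, tax at t above it)

The schedules are identical; the differences are administrative - who files what, and when the money actually moves - not economic.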
Debian's current, supported version is the stable version. The reason it's only released every two years, and why it feels so 'old', is that it takes Debian Developers many months to "further harden" it before release. It wouldn't make sense to release it on a quicker schedule. Debian does offer "rolling" channels with prompt updates (testing, unstable), but those are officially not meant for real, production use.
I'm not talking about Debian; I'm talking about the old versions of third-party software shipped with Debian that have to keep being supported with security updates.
I wouldn't be so sure. Linux still makes very good use of spinning-rust media, and on a modern system with 2GB+ of RAM (and a reasonably light distro like Debian) much of that RAM ends up being used as disk cache, so drive speeds aren't even that relevant. SSDs do speed up the boot process, though, I'll give you that.
The thing is, they went from Windows on spinning rust to Linux on an SSD - two variables changed at once:
* Modern Windows has a really hard time on spinning rust; they'd have seen extreme performance improvements just going to Windows on an SSD
* Linux deals better with spinning rust; they'd have seen performance improvements going to Linux on spinning rust
The SSD is still going to account for most of the performance improvement. The gap between a 5400rpm HDD and an SSD (even more so NVMe) is just ridiculous on every single metric: there are orders-of-magnitude differences in throughput (especially on random reads or writes), and even more so in latency and concurrency.
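Rough ballpark figures from memory (exact numbers vary a lot by drive; treat these as order-of-magnitude only):

    5400rpm HDD:  ~100 MB/s sequential    ~100 random IOPS       ~10 ms access time
    SATA SSD:     ~550 MB/s sequential    ~50-90k random IOPS    ~0.1 ms access time
    NVMe SSD:     ~3-7 GB/s sequential    ~500k+ random IOPS     ~0.02 ms access time

Sequential throughput is a 5-50x gap, but random IOPS - the thing that actually dominates boot and application startup - is a factor of a thousand or more.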
This, so much. I for one find the vast majority of so-called "high art" and "high culture" to be frankly depressing and off-putting, rather than "grand" or "transcendent" in any way. In the best of cases this is somewhat offset by the HNish intellectual interest and curiosity one can take in the stuff (and some variant of this is an often-cited point that's supposed to demonstrate the superiority of "high" culture), but all things considered, I'd rather explore the hidden intellectual interest of things that are usually dismissed as mundane, "middlebrow" or even "lowbrow".
IME, touch screens are simply way too error-prone for any sort of serious work. You can make them work, but only by adding a "swipe to confirm this input" step for any potentially-destructive activity. (AOSP "recovery" environments do have this, for a reason!) Current command-line environments are not well set up for this, but you could make it work in combination with a "web"-like interface (especially one designed around REST principles) - something like the sketch below.
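A minimal sketch of that two-phase confirm flow, with entirely hypothetical names (this is the shape of the idea, not any real API):

    import java.util.HashMap;
    import java.util.Map;
    import java.util.UUID;

    // Phase 1 registers intent and returns a token; phase 2 is the
    // "swipe": nothing destructive happens without a valid token.
    class PendingActions {
        private final Map<String, String> pending = new HashMap<>();

        String request(String description) {
            String token = UUID.randomUUID().toString();
            pending.put(token, description);
            return token;
        }

        String confirm(String token) {
            String desc = pending.remove(token);
            if (desc == null) throw new IllegalStateException("unknown or expired token");
            return desc; // the caller only now performs the destructive action
        }
    }

    public class ConfirmDemo {
        public static void main(String[] args) {
            PendingActions actions = new PendingActions();
            String token = actions.request("wipe /data partition");
            // ...UI shows the description and asks for an explicit confirm...
            System.out.println("executing: " + actions.confirm(token));
        }
    }

Mapped onto REST, the request step is a POST that creates a "pending action" resource, and the confirm step is a second request against that resource - exactly the sort of thing a swipe gesture can be bound to.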
There's nothing wrong with Go, in its proper domain! Heck, maybe the typical program written in a GC'd "scripting"-like language should be rewritten in Go. Rust is nice, but sometimes you really can't do without a GC.
Implementation inheritance does more than break encapsulation. It creates inherent fragility, because it makes code defined in a base class depend on behavior that may be overridden willy-nilly in a derived class: the possibility of overriding means the whole bundle of public and protected methods/fields is effectively part of an object's external interface, and the base class's own method implementations typically call back through that same interface. This is far worse than what you would get with a plain old non-OOP program, even one that doesn't use encapsulation at all! In most cases, implementation inheritance can be treated as simply a bad idea that should be avoided altogether - composition is clearly the better approach.
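The classic demonstration (essentially the CountingSet example popularized by Effective Java; the class names here are mine):

    import java.util.Collection;
    import java.util.HashSet;
    import java.util.List;

    // A set that tries to count how many elements were ever added.
    class CountingSet<E> extends HashSet<E> {
        private int addCount = 0;

        @Override public boolean add(E e) {
            addCount++;
            return super.add(e);
        }

        @Override public boolean addAll(Collection<? extends E> c) {
            addCount += c.size();   // counts the batch here...
            return super.addAll(c); // ...and again, because HashSet's inherited
                                    // addAll calls the overridden add() per element
        }

        public int getAddCount() { return addCount; }
    }

    public class FragileBase {
        public static void main(String[] args) {
            CountingSet<String> s = new CountingSet<>();
            s.addAll(List.of("a", "b", "c"));
            System.out.println(s.getAddCount()); // prints 6, not 3
        }
    }

Whether addAll delegates to add() is an undocumented implementation detail of the base class, and the subclass silently breaks if it ever changes. A forwarding wrapper around a plain HashSet (composition) has no such coupling.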