
Fortunately the options aren't merely some form of Marxist communism or capitalism! There are whole branches of (anarchist/federalist) socialism that actually predicted what would happen in the Soviet Union while Marx was still alive and writing, and that had very different plans themselves (Bakunin and Proudhon). There were also distinct forms of socialism that predated Marxism, and forms that came after it. So even if we restrict ourselves to socialist schools of thought that have existed for roughly 200 years (a very narrow slice of economic systems), there are a lot more options to try than "bureaucratic oligarchy, with a top-down centrally planned economy and a fundamentally authoritarian ideology" on the one hand, and on the other "a system designed to give some people power over others via absentee property ownership, and to facilitate the concentration of that power through the concentration of property due to the right of increase and profit."


The problem with using large language models for brainstorming or writing is that the fundamental mechanism by which they work is to choose the most common thing to say at any given point — that is, the most average, middle of the bell curve thing to say. That's how they give the appearance of having any form of coherence, by rarely if ever deviating from the happy path. So any ideas you get from it are going to be pretty unoriginal, and even if you give it original prompts for ideas, it's eventually just going to regress back to the mean again as it traverses the probability space back in the direction of the average at every step. And its writing is always going to be essentially the average human writing style.


The public foundational models maybe.

But this isn't true in general: you can easily fine-tune a local model to write in very localized styles, and sample at temperatures that allow for wild swings outside the average.
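To make the temperature point concrete, here's a minimal Rust sketch of how sampling temperature reshapes a next-token distribution. The logits are made-up toy values, not from any real model: dividing them by a temperature above 1.0 flattens the distribution toward wilder, less-average choices, while a temperature below 1.0 sharpens it toward the most likely token.

```rust
// Softmax with temperature: temperature > 1.0 flattens the distribution
// (more probability mass on unlikely tokens), < 1.0 sharpens it.
fn softmax_with_temperature(logits: &[f64], temperature: f64) -> Vec<f64> {
    let scaled: Vec<f64> = logits.iter().map(|l| l / temperature).collect();
    // Subtract the max before exponentiating for numerical stability.
    let max = scaled.iter().cloned().fold(f64::NEG_INFINITY, f64::max);
    let exps: Vec<f64> = scaled.iter().map(|l| (l - max).exp()).collect();
    let sum: f64 = exps.iter().sum();
    exps.iter().map(|e| e / sum).collect()
}

fn main() {
    let logits = [2.0, 1.0, 0.1]; // hypothetical next-token scores
    let cold = softmax_with_temperature(&logits, 0.5);
    let hot = softmax_with_temperature(&logits, 2.0);
    // Lower temperature concentrates probability on the most likely token.
    assert!(cold[0] > hot[0]);
    println!("cold = {:?}", cold);
    println!("hot  = {:?}", hot);
}
```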

If you want a rambling, occasionally brilliant Kerouac or de Montaigne you can make one.


Well that's just the problem isn't it? The "outside" you're probably thinking of doesn't really exist anymore in a lot of places either. Suburbia especially is a hellscape of empty flat crab grass lawns and sidewalks and tarmac. There isn't any fun nature to explore. If there had been where I had grown up I would have loved it.


This is actually similar to the lore in Raised By Wolves — there's a war between the Mithraic (dominant religion on their planet) and the atheist faction, and the Mithraic are actually far more technologically advanced because their scriptures mysteriously contain precise instructions for wildly advanced technology. Of course that only makes the Mithraic more advanced in the few areas their scriptures happen to talk about, and far less advanced in other areas, because they don't understand the principles and so can't apply them elsewhere, and they don't even know how their own tech works really, so it's an interesting conflict. The whole show is extremely interesting — I really should finish it sometime.


> The whole show is extremely interesting — I really should finish it sometime.

Unfortunately, even the show won't be finishing the show. Every episode kept adding more "mysteries" and ending on increasingly ridiculous cliffhangers until it got cancelled without explaining anything.

It definitely had promise, but I'm disappointed in Ridley Scott for not putting together a coherent, self-contained narrative, and I'm a fan of all his previous works. I guess he (wrongly) gambled that if he left enough unanswered, they wouldn't dare cancel it.


> Every episode kept adding more "mysteries" and ending on increasingly ridiculous cliffhangers until it got cancelled without explaining anything.

While also getting slower and more labored. That's why I stopped, yeah, even though I'm normally a completionist.


Something similar happens in Larry Niven's "Footfall", though I won't say what to avoid spoilers.


Yeah, if only there were decades of programming language design research and field testing since C for Go to draw on...


Having to invent ad hoc domain-specific languages embedded in your comments, which are then processed and expanded by entirely separate compilers, essentially reinventing the C macro preprocessor, is in my opinion the mark of a terminally underpowered language, one that even the people regularly using it realize is underpowered, even if they don't admit it to themselves.
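A sketch of the contrast, using Rust as an example of the alternative: in a language with built-in metaprogramming, repetitive code is generated by the compiler itself rather than by a separate tool parsing magic comments. The macro and function names here are purely illustrative.

```rust
// A toy declarative macro that stamps out a family of similar functions,
// the kind of repetitive code that comment-driven generators are usually
// invoked to produce. The expansion is checked by the same compiler pass
// as the rest of the program, with no external preprocessing step.
macro_rules! make_getters {
    ($($name:ident => $value:expr),* $(,)?) => {
        $(fn $name() -> i32 { $value })*
    };
}

make_getters! {
    answer => 42,
    zero => 0,
}

fn main() {
    assert_eq!(answer(), 42);
    assert_eq!(zero(), 0);
}
```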


This is my position as well.

Call me a crank, but while a lot of people say you don't need to go without a garbage collector for most problems, I believe the converse: there aren't many problems where you actually need a garbage collector, provided you have some other way of freeing yourself from manually managing memory and manually avoiding use-after-free and so on. I think once your mind adjusts to working without one and within the constraints of the borrow checker, it's really not that much of a drag, and if you really need to do something outside those bounds (or just want to ignore it and hack something together quickly), that's what Rc, Arc, and RefCell are for.

And honestly, in general I like the unique intersection of features that Rust offers as a language, totally independent from its performance or the presence or absence of a garbage collector at all. Even if it was garbage collected and about 2x slower, I think I would still prefer it to most of the 40 or so languages that I've tried over the years. It feels like a version of OCaml with a much better standard library and ecosystem, a much better build system, a vastly better and cleaner way of doing parametric polymorphism, and much better metaprogramming facilities in comparison.

Not only that, but I genuinely find the constraints of the borrow checker (most especially allowing either one mutable reference or multiple read-only references at a time, but never both) to be a great help in making my code clearer and more straightforward. It gives me a lot of the same benefits as pure functional programming, where the data flow of my application is carefully threaded and there is no spooky action at a distance, while still allowing imperative programming and mutation where I need them. Only one place in my code can mutate a given thing at a time; I have to explicitly mark it whenever I hand any piece of my code mutable access to anything; and no other data structure or portion of code can hold on to persistent mutable access and keep mutating things without my explicit permission and knowledge. So why give up the uniquely productive constraints of the borrow checker, which give me 90% of the benefits of pure functional programming with only 50% of the annoyance, and add in all the unnecessary bloat and non-determinism of a garbage collector I have never really found myself to need?
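To make that aliasing rule concrete, a tiny sketch with toy data: any number of shared references may coexist, but a mutable reference demands exclusivity.

```rust
// At any moment you may hold either many shared (&T) references or exactly
// one mutable (&mut T) reference, never both at once.
fn demo_aliasing() -> usize {
    let mut scores = vec![10, 20, 30];

    let a = &scores;
    let b = &scores;        // many read-only views are fine together
    let _ = (a[0], b[1]);

    let m = &mut scores;    // allowed only because a and b are no longer used
    m.push(40);             // exclusive access: nothing else can observe
                            // the Vec mid-mutation
    // let _ = a[0];        // would not compile: the shared borrow would
                            // overlap the mutable one
    scores.len()
}

fn main() {
    assert_eq!(demo_aliasing(), 4);
}
```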

Furthermore, Rust is significantly more performant than even very fast, systems-oriented garbage-collected languages with their own runtime, such as Go, OCaml, or Java. That means I rarely have to worry about performance beyond picking reasonable algorithms, and sometimes not even then, since the language is often fast enough to make brute force feasible. Paradoxically, programming in Rust, a high-performance systems language, often frees me from the burden of worrying about performance the way I would in something like Python, where I've actually implemented programs that ended up so hilariously, frustratingly slow that I eventually gave up on working on them.

Moreover, if I do have to concern myself with performance in Rust, working in a language that uses RAII and doesn't wrap everything in pointers for GC by default gives me a significantly clearer mental model of what my program is actually doing with its memory and how my data is actually laid out, and typically a clearer mental model of what CPU operations my program is performing as well. Just having that clear mental model helps me implement more efficient algorithms and write cleaner, less wasteful programs in general, which I quite like.
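A small sketch of what I mean, with a purely illustrative Noisy type: RAII reclaims memory at a predictable scope exit rather than at some later GC pause, and a Vec<i32> is one contiguous heap allocation rather than a graph of individually tracked boxes.

```rust
// A type that announces when it is dropped, to show that destruction
// happens deterministically at scope exit.
struct Noisy(&'static str);
impl Drop for Noisy {
    fn drop(&mut self) {
        println!("dropping {}", self.0); // runs at a predictable point
    }
}

// n i32s packed back-to-back: n * 4 bytes, no pointer chasing per element.
fn contiguous_bytes(n: usize) -> usize {
    let nums: Vec<i32> = (0..n as i32).collect();
    std::mem::size_of_val(&nums[..])
}

fn main() {
    assert_eq!(contiguous_bytes(4), 16);
    {
        let _inner = Noisy("inner");
    } // "dropping inner" prints here, at the closing brace
    let _outer = Noisy("outer");
}   // "dropping outer" prints when main ends
```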

So yes, in essence I would turn the question back on the questioner and ask why they think they need a garbage collector and a language runtime and all of this extra stuff. Yes, it does make the language marginally easier and quicker to write things in, but if I am sitting down to write a large-scale program or a long-running project, then being slowed down slightly up front in exchange for increased productivity later seems like a good trade.


Honestly I feel like the Zeitgeist is that anything that smells of “optimization” is supposed to be preemptively justified. Like do you need a borrow checker or a static language or an AOT language implementation or...? Like anything that looks like they might by-default give you better performance (perish the thought) is supposed to be justified by some line about, oh well it turns out in our case that we need this thing actually, please forgive us for not using a dynamic language.


That's a really good elaboration of what I'm getting at! Now that you mention it, it does seem like that's the case, as if people have taken the mantra about "premature optimization" way too far, ruling out not just micro-optimizations but also using good algorithms, and even starting with good foundations. As if by default we should be using the slowest and least efficient language and runtime possible, and any little concession to by-default performance (which in my opinion doesn't really count as optimization at all) has to be justified by some particularity of your field or task. And honestly, I have to wonder if that's the mentality that has led us to things like Electron. It makes sense to worry about optimizing for performance if it makes your developer velocity clearly worse, but in my opinion, and according to what little research I've seen, the difference in development velocity between an ahead-of-time-compiled, statically typed language and a dynamic language quickly disappears beyond a certain program size, which makes sense to me, since it would presumably be a constant factor.


I use Aegis as my main app for 2FA so I can answer these questions:

> Can you do an encrypted backup on demand (protected with a password you supply)?

Yes!

> Is there any desktop app such backup can be opened/read with (or even eg. read with something like sqlite db browser)?

It's just plain JSON once decrypted, so it's always readable; I do know the GNOME Circle app "Authenticator" can natively import Aegis backups as well, since it's what I use on my desktop machine, but I don't know what other apps exist.

> Can the app be configured to save an encrypted copy to eg. Dropbox whenever changes are made?

It does have some facilities for automatic and cloud backups, judging from the settings page, but I've never tried them.

> Is it recommended to install from Play store, or the APK off GitHub?

If you do the latter you'd lose automatic updates. I used F-Droid.


>If you do the latter you'd lose automatic updates.

Obtainium[1] will give you automatic updates from most sources, including GitHub/GitLab/Codeberg and F-Droid repos. Especially relevant to this discussion, since Aegis 3.0 hasn't hit F-Droid yet as of this writing.

[1] https://github.com/ImranR98/Obtainium


Well, concerning memory safety, that might be a more important feature than you think. Allow me to quote a comment I just got an email notification for on a GNOME thread about NVIDIA:

> I wish I could be more optimistic about mutter!3304 coming soon, though... it's been 5 months already, and it doesn't seem anywhere closer to being merged into main. To make matters worse, patched mutter 45.5 seems to be causing use-after-free on NVIDIA's kernel driver.


> Most custom ROMs(including official LineageOS) are as secure, usually even more, than OEM ROM

I definitely don't think that's the case. I'd check out the sections on microG and custom ROMs here: https://madaidans-insecurities.github.io/android.html (they're very knowledgeable, so usually factually accurate, but often frame or evaluate things in a very biased way IMHO, so be sure to evaluate the verifiable facts they state for yourself, against your own threat model). Custom ROMs are typically far more open to exploitation and less secure than OEM ROMs, because of the way they have to pry open security mechanisms to get their various hacks to work. They are typically much better on privacy, but having your phone, from which you do all your communication and banking, and which has a dense cluster of sensors and transmitters that follow you wherever you go, be more open to exploitation by any random hacker or piece of malware seems like a bad idea to me.


OEM ROMs that passed Google certifications and everything have been found in the past to:

- Send the contents of your clipboard to the internet (OnePlus)

- Disable SELinux on boot (Asus)

- Allow any app to run as uid 1000, an exploit known for years (Samsung)

- Ignore Linux policy and cherry-pick security-looking patches, getting pwned repeatedly by people simply looking at LTS patches (Google)

As for madaidans-insecurities, they are extremely biased.

Just regarding microG:

> which allows apps to request to bypass signature verification.

There is EXACTLY *one* app (OK, two, counting the fake Play Store) that bypasses signature verification, and you can VERIFY which app does it and WHICH signature it fakes. LineageOS integrated their own microG/fake-signature mechanism thanks to Google's anti-freedom policy, and you can review their integration, which is even more restricted than what I did (which already is infinitely more secure than what madaidans-insecurities suggests): https://review.lineageos.org/c/LineageOS/android_frameworks_...

The final comment in the microG section basically sounds like "oh yeah, that argument could be completely wrong, meh".


> OEM ROMs that passed Google certifications and everything have been found in the past to

That's really good to know to keep things in perspective! Thanks for taking the time to share it. Although I would say it seems like LineageOS kind of does all of those kinds of things at once, whereas each OEM ROM might only do one or another of them? Or am I wrong?

Edit: also, I can't find any info on your ASUS claim, and the OnePlus one seems misleading (it wasn't some vulnerability or passive background thing that just broadcast your clipboard; it was an app you could electively use to send clipboard contents to other computers).

> There is EXACTLY one app (OK, two, counting the fake Play Store) that bypasses signature verification, and you can VERIFY which app does it and WHICH signature it fakes.

I'm not familiar with LineageOS — does this mean you know only one app will ever do this and can verify that, so it's just one specific exception to the rule? Or was the spoofing merely made possible for that one app's signature, with only one app known to do it, but any app able to do it in theory without your knowledge?

> As for madaidans-insecurities, they are extremely biased.

Like I said, their factual knowledge is generally useful, but their framing (including context, so you can get some perspective) and analysis is usually wildly biased IMHO. I wonder what their damage is.


> I wonder what their damage is.

Torvalds' low opinion of security researchers in general comes to mind.

