I remember visiting a computer exhibition (CeBIT) in the very early 90s. In one booth they had some of the big Amiga systems (2000, I think), and at some point one of the booth's staff did the three-finger salute (pressing three specific keys on the keyboard to force a reboot) on one of the machines. The machine was back up in what felt like an instant. I was amazed by that. They had probably set up the whole boot process via RAM (see the "RAD" disk on the Amiga), but I had no idea about that back in the day.
Still to this day I think this is how it should be. You want to switch ON your computer and it should be ready for use.
But what do we get? What feels like minutes of random waiting time. My Raspberry Pi with Linux, which probably eats 10 of those Amiga 2Ks for breakfast, sifts through a few thousand lines of initialising output… my Mac, which probably eats like 50 of those Amiga 2Ks for lunch, shows a slowly growing bar doing whatever… Why hasn't this improved at all in the last 30 years?
It's the OS. About 10 years ago, I had an Asus EeePC, which was an underpowered piece of trash with a 32 bit Intel Atom CPU, but it cold-booted in less than 3 seconds. And by "booted", I mean completely booted, i.e. not like Android, where you have to wait a few more minutes until all the background services settle and the UI stops lagging.
Mine broke after a decade, or I think I would still be using it. Just so neat to bring everywhere; I even got used to coding on that thing. Though mine was the later model with more normal keys; the original 701 was not great to type on, not because of the size (I got used to that) but because something about the layout was weird.
> They probably had set up the whole boot process via RAM (see "RAD" disk on the Amiga), but I hadn't any idea about that back in the day.
> Still to this day I think this is how it should be. You want to switch ON your computer and it should be ready for use.
Don't we already kind of have this? It's set up to be dynamic, and we ended up calling it "sleep", but it basically does what you're talking about, dynamically and optionally: it chucks the entire state into RAM (or onto disk for "hibernate") and resumes from that when you want to continue.
Personally I've avoided it for the longest of times because something always breaks or ends up wonky when you resume, at least on my desktop. The PS5 and the Steam Deck handle this seemingly even with games running, so it seems possible, and I know others who are using it; maybe the Linux desktop is just lagging behind there a bit, so I continue to properly shut down my computer every night.
Macs on the other hand are extremely stable. In my 4 years of using my MacBook Pro M1 Max, I’ve only restarted during OS updates. There were maybe a handful of instances where it froze and I forced a restart. Other than that, I only ever put it to sleep, and it works like a charm. I use it for heavy-duty software development and experimentation with local models, so it’s even more surprising!
I'm using an M4 MacBook right now and I constantly have issues with USB devices (especially hubs) failing to work properly after sleep. It's very unpredictable too; I can't reproduce it on demand.
It's actually kind of funny: while people talk about how unreliable Bluetooth is, moving a few of those devices from USB to Bluetooth (like my trackball mouse) made the situation far more reliable. Sleep has been that bad.
The hardware Apple makes is incredible, bar none, which is why it's such a shame that the OS and application UX is absolutely horrible and continues to get worse with each iteration. If Apple would publicly support Linux efforts on Apple hardware I'd probably switch back in an instant. But until then, I guess I'll continue turning off my desktop at night and waiting a whole 15 seconds for startup in the morning, oh the horrors.
I have used Windows hibernate since Windows XP and never had an issue with devices after resuming Windows.
Within recent years on Windows 10 I have gone months without a restart, only hibernating my pc.
In the early days I used a custom-built PC. In later years (post 2005) I have only used laptops, mostly Dells with a sprinkling of Lenovos, if that matters.
I don't know why Windows now hides it from the power menu by default.
I think it's mostly those of us with lots of external gear (mostly audio related) for whom things get a bit wonky, and when you're running graphics-heavy applications that you're trying to resume at the same time. For example, Ableton for the longest of times couldn't handle resuming from hibernation for me; it seems to work today (Windows 11), but I still have the same issue with a running Houdini window: resuming from hibernation does something to the communication with the GPU (my hunch) and the window freezes on resume.
Windows prioritizes phoning home and data collection over UX. If you have a corporate install you’ll also have negligent EDR software killing your boot times.
You can get fast boot times on Linux if you care to tweak things.
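On a systemd-based distro, a reasonable first step is measuring where boot time actually goes before tweaking anything. The commands below are standard systemd tooling; the service named in the last line is just a common example of a slow unit, not something guaranteed to exist (or be safe to disable) on your machine:

```shell
# Total boot time, split into firmware / loader / kernel / userspace
systemd-analyze

# Units sorted by how long each took to start
systemd-analyze blame

# The chain of units that actually gated reaching the default target
systemd-analyze critical-chain

# Then disable the worst offenders you don't need, e.g.:
# systemctl disable --now NetworkManager-wait-online.service
```

The output is machine-specific, but it usually makes the two or three units worth cutting obvious.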
AFAIR I only had to put a bit of duct tape over that silly extra hole to make a proper (Amiga) 880KiB DD (Double Density) disk out of those weird "PC" HD (High Density) disks… — oh wait… I think I went the other way around from you! :-)
I guess that's the acceptable software bloat the site is probably talking about. :) (I have not clicked. I first read comments to find out if it's worth clicking.)
Lol. It's the behavior I see when there's a malicious Chrome plugin installed: a link on a page randomly loads a spam site in a new tab, but the link works normally after that. I'm pretty sure it's none of my plugins, though.
I bought V2 a while ago too, when it was offered extra cheap. The problem is that it doesn't run on my rusty machine. I bought it to have in reserve for when I upgrade my machine someday (who knows if my V1 stuff will still run then?). I learned about this weird activation-server stuff afterwards, so ultimately I had to ask for my money back. There was no way to "activate" the software and store the key/keyfile in a backup. In my view this is in no way future-proof.
I want to use my software without depending on the availability of some random 3rd-party server. I guess it just got worse with this new app here. I'm not enthusiastic about it at all. This has nothing to do with the price point (I was happy to pay for all 3 of my V1 apps separately).
I simply add "0.0.0.0 youtube.com" to the (network-wide) hosts file on my router. Problem solved. I simply don't understand why the problem is just with "shorts"; most of the "longs" are not worth wasting time on either. :-)
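One caveat worth noting (assuming the router's hosts file behaves like a standard hosts(5) file): matching is by exact hostname, with no wildcard support, so the common subdomains each need their own entry:

```
# hosts entries on the router; 0.0.0.0 is unroutable, so lookups just dead-end
0.0.0.0 youtube.com
0.0.0.0 www.youtube.com
0.0.0.0 m.youtube.com
```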
1. an activity done regularly in one's leisure time for pleasure.
If you enjoy the activity then it is neither pointless nor a waste of time since the primary purpose is enjoyment. If you don't enjoy it, it's not a hobby.
No doubt these posts could be your hobby; you do it for pleasure but it's otherwise just a pointless waste of time.
Hedonism — the doctrine that pleasure or happiness is the sole or chief good in life.
"It's not a waste of time since the primary purpose is enjoyment" — in that sentence you place enjoyment above any other benefits the activity might have (but most often does not). Hence, it's a hedonistic approach.
I was not making a global claim about ethics; I was defining what makes an activity a hobby. Saying "for hobbies the primary purpose is enjoyment" doesn't assert that pleasure is the sole good in life -- it simply notes the internal aim of leisure activities done for their own sake. That aim can coexist with other goods (skill, friendship, health, meaning) and the fact an activity fulfills its own goal means the time isn't "wasted" even if no external output results.
Enjoyment here is a sufficient justifier within that domain, not in a life-wide philosophy. Labelling it "hedonism" is a significant overreach.
Isn't D supported by the GNU compiler collection? I personally would prefer this type of tooling over what Rust and Go do (I can't even get their compilers to run on my old platform anymore; not to mention all these dependencies on remote resources that typical Rust/Go projects seem to have, which seems to be enforced by the ecosystem?)
It is supported. However, GDC does not work on Windows. LDC, based on LLVM, needs Visual Studio, but I may be wrong since there are clang/llvm distributions based on mingw64. Other than that DMD works fine, a bit slower than the other compilers, but a charm to use.
But is that a problem of the language itself, or just a problem of available toolchains? E.g. if the GNU compiler collection came with BASIC support and you could just type something like "gbasic -O3 foobar.bas -o foobar" to get a properly optimised executable out of your BASIC source file, then some people might still use BASIC today, I guess?
I started with BASIC too. Also enjoyed BlitzBasic2 for a long time on the Amiga. That's where I learned programming… back then when programming was still fun.
Which BASIC? I remember Atari BASIC, which was a terrible language since the organization was by line numbers. Visual BASIC looked a lot nicer the few times I saw it, but I have spent a total of 10 minutes of my life looking at Visual BASIC and so I have no clue how it is in the real world. I do know Visual BASIC was used by a lot of untrained programmers who never learned why we have best practices, and so there is a lot of bad BASIC out there; this isn't the fault of the language, just the users.
Realistically? No, not at all. The reason there are no toolchains for BASIC is because nobody uses BASIC (because it's not functional in our modern world), not the other way round.
Why do you think BASIC is not functional? Our modern world does not differ from the world of 1980 at all. Variables are variables, subroutines are subroutines.
It fell out of fashion, along with Pascal, Perl, and Ruby, but that's just fashion.
I have even seen pretty darn impressive things done with Visual Basic back in the day. And those were not hobby things. I've seen it used in very important, mission-critical telecommunication equipment. The compiler was custom, not the one from Microsoft. After all, the language had pretty much anything other languages had.
How can a language be "inefficient"? You could say it lacked expressiveness. Maybe it was too verbose? But I would not place BASIC in the "verbose" category.
Because BASIC simply doesn't have first-class functions, and they would be quite hard to represent in a BASIC-like syntax while keeping the language idiomatic. Even the unreasonably clunky C pattern of having a pointer to a function taking void* as its first argument (to account for closure captures) gets you a whole lot closer to functional programming than even the fanciest BASICs.
For example, there's also something called "ghost leeching" (a side channel that entirely bypasses tracker reporting), which can lead to other peers reporting upload for which there's no matching download recorded on the tracker, making it look like those peers over-reported upload and cheated when they are in fact entirely innocent. There's no way for a private tracker to be really sure about stats. The most the moderators can do is check for repeating suspicious usage patterns across many torrents for a particular peer under scrutiny.