Hacker News | avhception's comments

Huh, I'm over 10 years in and didn't know about the right-click resize either. I really like it! Thanks!

When KDE 4 came out, I switched to Gnome 2. When Gnome 3 came out in (checks notes) 2011, I switched to XFCE. And that was that. I have a minimal taskbar at the bottom of my screen, with a little tray and a little button for the whisker menu. But I usually launch that using hyper + space. It gets out of my way, it gets shit done, I love it. Let's hope that it will survive the Wayland transition.

I don't have as much hatred towards Gnome 3 as everyone else seems to.

Don't get me wrong, I am certainly not defending it. I was a little heartbroken, as I really liked Gnome 2. However, I tried to be optimistic about their plans overall.

(I think the early days of Gnome 3 featured something called Gnome Legacy to keep that Gnome 2-ish feel. I likely stayed on that for a while.)

I still use Gnome 3 today... but Xfce would certainly be my second choice.


I don't have "hatred" towards Gnome3. I use it for friends and families desktops, they seem to like it. I have also rolled out about ~20 Gnome3-based desktops for my employer.

That said, there are definitely areas where Gnome could be improved. Some of them are understandable and probably stem from a lack of funding / devs. Others less so, like removing the options to scale / stretch / center the wallpaper without installing "Gnome Tweaks".


Yeah - I find it a little frustrating that the first thing to do after installing Gnome3 is to install Tweaks.

It's not even the chore of installing it; Ansible will mostly do that for me. It's that I can't comprehend why something as basic as wallpaper configuration is not part of the normal GUI. The reasoning for that one is beyond me.

I did exactly the same series of switches.

I'm on 600 Mbps fiber with low latency. Sometimes I can't be arsed to load the websites linked on HN and simply head straight into the comments. For example, when it's a link to Twitter, I get an endless onslaught of in-site popups: cookie banners, "sign in with Google", "sign in to X", "X is better on the app", and so on and so forth. Meh. I'll sometimes just stick to HN, especially when I'm on my phone on the sofa or something.

Give me a minimal / plain-text website any day; it's not just the link speed.


I absolutely love the things that IPv6 delivers and employ it on purpose.

The world very clearly doesn’t revolve around what HN users “love”.

I think the western world very much revolves around:

* The internet

* Linux servers

* Automation

I get your point, but it doesn't really land for me, since most people don't feel the benefits until some passionate nerd makes something that scratches an itch.

For a practical example: peer-to-peer sharing like AirDrop is much easier to implement in a world with IPv6.


> For a practical example: peer-to-peer sharing like AirDrop is much easier to implement in a world with IPv6.

And without firewalls. Unfortunately this world does not exist.


According to my last job interview, Linux servers are only for websites and worthless otherwise.

The world at large doesn't care what I love, correct. But my users care about whether they have to remember that they're supposed to use port bla instead of the standard port foo, which is a common scenario with v4. Not enough addresses, and / or you can't get them to the VM or container or VPN client or whatever that needs them. IPv6 can often fix these kinds of issues.

Does the world at large care? No.

Do I care? Yes.

Do my users care? Yes, albeit indirectly.

Does my organization care? Yes, in the sense that it removes friction from what it needs the employees to do.

And that's all the justification that's needed, I'd say. The world very clearly doesn't need to revolve around what I love for IPv6 to be a good thing.


Interesting, that's a point of view I hadn't considered so far. Growing up in Europe, even the local church often dates back quite a few centuries. My small hometown has residential buildings that are multiple centuries old and still inhabited today. The town itself dates back to 1072. The attitude towards the buildings and history is very different here.


But there are also hometowns of the mind that disappear, e.g. someone who grew up in East Germany would lament that the cartoons and foods they grew up with no longer exist...


As a "West-German", I'd argue that's also true over here. The 80s and 90s are gone. I even sometimes use the construct "Bonner Republik" to refer to the time before unification.


Hello from Germany, and thanks for the blog post. Fascinating read. I liked how you intertwined the personal point of view with the bigger picture.

"Facing a complex past" is a big theme in Germany, too, of course, and I think it's the only proper way to deal with it. Direct witness accounts and retelling are important and add something that a dry history book can't provide. Keep up the good work!


Thank you so much!


Maybe it's just that you're mostly viewing this through the LLM lens?

I remember having to fight with fglrx, AMD's proprietary Linux driver, for hours on end. Just to get hardware acceleration for my desktop going! That driver was so unbearable that I bought Nvidia just because I wanted their proprietary driver. That cut the fiddling time from many hours to maybe 1 or 2!

Nowadays, I run AMD because their open-source amdgpu driver means I just plonk the card into the system, and that's it. I've had to fiddle with the driver exactly zero times. The last time I used Nvidia is in the distant past for me. So - for me, their drivers are indeed "so much better". But my use case is sysadmin work and occasional gaming through Steam / Proton. I ran LM Studio through ROCm, too, a few times. Worked fine, but I guess that's very much not representative of whatever people do with MI300 / H100.


> and occasional gaming through Steam / Proton

And how does that work on AMD? I know the Steam Deck is AMD, but Valve could have tweaked the driver or Proton for that particular GPU.


I've been playing lots of games on an AMD GPU (RX 7600) for about a year, and I can't remember a game that had graphical issues (e.g. driver bugs).

Probably something hasn't run at some point, but I can't remember what; it was more likely a Proton "issue" anyway. Your main problem will be configuring anti-cheat for some games.

My experience has been basically fantastic and no stress. Just check that games aren't installing some Linux build, which is inevitably extremely out of date and probably won't run. Ex: Human Fall Flat (very old, won't run), Deus Ex: Mankind Divided (can't recall why, but I elected to install the Proton version; I think performance was poor or mouse control was funky).

I guess I don't play super-new games, so YMMV there. Quick stuff I can recall: NMS, Dark Souls 1 & 2 & 3, Sekiro, Deep Rock Galactic, Halo MCC, SnowRunner & Expeditions, Euro Truck, RDR1 (afaik 2 runs fine, just haven't got it yet), Hardspace: Shipbreaker, V Rising, the Tomb Raider remaster (the first one and the new one), Pacific Drive, Factorio, Blue Prince, Ball x Pit, Dishonored, uhhh - basically any kind of "small game" you could think of: Exapunks, Balatro, Slay the Spire, Gwent: Rogue Mage, whatever. I know there were a bunch more I have forgotten that I played this year.

I actually can't think of a game that didn't work... Oh, this is on Arch Linux; I imagine Debian etc. would have issues with an older Mesa, etc.


Works very well for me! YMMV depending on the titles you play, but that would probably be more of a Proton issue than an AMD issue, I'd guess. I'm not a huge gamer, so take my experience with a grain of salt. But I've racked up almost 300 hours of Witcher 3 with the HQ patch on a 4K TV display using my self-compiled Gentoo kernel, and it worked totally fine. A few other games, too. So there's that!


Don't know what the LLM lens is. I had an ATI card. Miserable. fglrx awful. I've tried various AMDs over the last 15 years. All total garbage compared to Nvidia. Throughout this period I was consistently informed of new OSS drivers, blah blah. Linus says "fuck nvidia". AMD still rubbish.

Finally, now I have 6x 4090 on one machine. Just works. 1x 5090 on the other. Just works. And everyone I know prefers N to A. Drivers proprietary. Result great. GPU responds well.


Well, I don't know why it didn't work out for you. But my AMD experience has improved fundamentally since the fglrx days, to the point where I prefer AMD over Nvidia. You said you don't know why people say that AMD has improved so much, but it definitely rings true for me.

I said "LLM lens" because you were talking about hardware typically used for number crunching, not graphics displays, like the MI300. So I was suggesting that the difference between what you hear online about the driver and your own experience might result from people like me mostly talking about the 2d / 3d acceleration side of things while the experience for ROCm and stuff is probably another story altogether.


I see. I see. I got tripped up by 'LLM' since I got the GPUs for diffusion models. Anyway, the whole thing sounds like the old days when I had Ubuntu Dapper Drake running flawlessly on my laptop and everyone was telling me Linux wasn't ready: it's an artifact of the hardware and some people have great support and others don't. Glad you do.


Sometimes it's also possible to simply disconnect the hotel's SIP phone from the Ethernet jack and use that :)


> Because what this AI-generated SEO slop formed from an extremely vulnerable and honest place shows is that women’s pain is still not taken seriously.

Companies putting words in people's mouths on social media using "AI" is horrible and shouldn't be allowed.

But I completely fail to see what this has to do with misogyny. Did Instagram have their LLM analyze the post and then only post generated slop when it concluded the post came from a woman? Certainly not.


Obviously I am putting words in the author's mouth here, so take with a grain of salt, but I think the reasoning is something like: such LLM-generated content disproportionately negatively affects women, and the fact that this got pushed through shows that they didn't take those consequences into account, e.g. by not testing what it would look like in situations like these.


> such LLM-generated content disproportionately negatively affects women,

Major citation needed


> Ahead of the International Women's Day, a UNESCO study revealed worrying tendencies in Large Language models (LLM) to produce gender bias, as well as homophobia and racial stereotyping. Women were described as working in domestic roles far more often than men – four times as often by one model – and were frequently associated with words like “home”, “family” and “children”, while male names were linked to “business”, “executive”, “salary”, and “career”.

https://www.unesco.org/en/articles/generative-ai-unesco-stud...

> Our analysis proves that bias in LLMs is not an unintended flaw but a systematic result of their rational processing, which tends to preserve and amplify existing societal biases encoded in training data. Drawing on existentialist theory, we argue that LLM-generated bias reflects entrenched societal structures and highlights the limitations of purely technical debiasing methods.

https://arxiv.org/html/2410.19775v1

> We find that the portrayals generated by GPT-3.5 and GPT-4 contain higher rates of racial stereotypes than human-written portrayals using the same prompts. The words distinguishing personas of marked (non-white, non-male) groups reflect patterns of othering and exoticizing these demographics. An intersectional lens further reveals tropes that dominate portrayals of marginalized groups, such as tropicalism and the hypersexualization of minoritized women. These representational harms have concerning implications for downstream applications like story generation.

https://aclanthology.org/2023.acl-long.84.pdf


The question is whether these LLM summaries disproportionately "impact" women, not whether LLMs describe women as more often working in domestic roles.


Then you have to do your research on whether domestic roles have an equal status to non-domestic roles, and not rest on your preconceptions.


Unfortunately I can't provide that, since I'm merely trying to come up with the reasoning of the author. If they have sources, though, that could lead to this reasoning.


> Did Instagram have their LLM analyze the post and then only post generated slop when it concluded the post came from a woman? Certainly not.

I actually am sympathetic to your confusion. Perhaps this is semantics, but I agree with the "trivialization of the human experience" assessment from the author and your post; I just don't read it as an attack on women's pain as such. I think the algorithm sensed that the essay would touch people and engender a response.

--

However, I am certain that Instagram knows the author is a woman, and that the LLM they deployed can do sentiment analysis (or just call the Instagram API and ask whether the post is by a woman). So I don't think we can somehow absolve them of cultural awareness. I wonder how this sort of thing influences its output (and wish we didn't have to puzzle over such things).


When all one has is a hammer, everything looks like a nail.


I tried local models for general-purpose LLM tasks on my Radeon 7800 XT (20GB VRAM), and was disappointed.

But I keep thinking: It should be possible to run some kind of supercharged tab completion on there, no? I'm spending most of my time writing Ansible or in the shell, and I have a feeling that even a small local model should give me vastly more useful completion options...
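
To make that concrete: here's a rough sketch of the kind of thing I'm imagining, assuming an Ollama-style local server on its default port and some small coder model. The model tag, prompt, and endpoint are placeholder assumptions on my part, not something I've actually wired up or benchmarked:

    # Rough sketch: ask a small local model to complete a partial shell command.
    # Assumes an Ollama-compatible endpoint on localhost:11434 and a model tag
    # like "qwen2.5-coder:1.5b" -- both are placeholder assumptions.
    import json
    import urllib.request

    def complete(partial: str, max_tokens: int = 32) -> str:
        payload = {
            "model": "qwen2.5-coder:1.5b",
            "prompt": "Complete this shell command. Output only the completion:\n" + partial,
            "stream": False,
            "options": {"num_predict": max_tokens, "temperature": 0.2},
        }
        req = urllib.request.Request(
            "http://localhost:11434/api/generate",
            data=json.dumps(payload).encode(),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            # The /api/generate response carries the completion in "response"
            return json.loads(resp.read())["response"].strip()

    if __name__ == "__main__":
        # e.g. hand it whatever is currently sitting in the shell buffer
        print(complete("rsync -avz --exclude "))

The actually interesting part would be feeding it the current line buffer via a zsh / readline widget and getting the latency low enough; whether a 1-2B model is good enough for Ansible YAML and shell one-liners is exactly the bit I haven't verified.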

