I think many people have tried making AMD GPUs go brrr for the mass of developers, but no one has succeeded.
I don't get why AMD doesn't solve their own software issues. Now they have a lot of money so not having money to pay for developers is not an excuse.
And data center GPUs are not the worst of it. Using GPU compute for things like running inference at home is a much, much better experience with Nvidia. My five-year-old RTX 3090 is better than any consumer GPU AMD has released to date, at least for experimenting with ML and AI.
I recently switched from an NVidia card (5090) to a couple of AMD cards (R9700 32GB) for my inference server.
I must say it's been a completely positive experience. The mainline Fedora kernel just worked, with no need to mess with DKMS. I just forwarded the /dev/dri/* devices to my containers, and everything worked fine with ROCm.
I only needed to grab a different image (-rocm instead of -cuda) for Ollama and change the whisper build type for Storyteller. And that was it! On the host, nvtop works fine for visualizing GPU state, and VAAPI provides accelerated encoding for ffmpeg.
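For anyone curious what that setup looks like, here is a minimal sketch of forwarding the GPU devices into a container, assuming Docker/Podman and the official Ollama ROCm image; note that ROCm typically needs /dev/kfd passed through in addition to /dev/dri, and the flags here are illustrative rather than a verified recipe:

```shell
# Sketch: run the ROCm build of Ollama with the AMD GPU devices forwarded.
# /dev/kfd  - ROCm compute interface (usually required alongside /dev/dri)
# /dev/dri  - DRM render nodes, as described above
docker run -d \
  --device /dev/kfd \
  --device /dev/dri \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  --name ollama \
  ollama/ollama:rocm
```

The same idea applies to any other container that needs the GPU: no proprietary kernel module or DKMS step, just device passthrough to a ROCm-enabled image.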
Honestly, it's been an absolutely pleasant experience compared to getting NVidia CUDA to work.
> Now they have a lot of money so not having money to pay for developers is not an excuse.
NVidia is the exception to the rule when it comes to hardware companies paying competitive salaries for software engineers. I imagine AMD is still permeated by the attitude that software "isn't real work" and doesn't deserve more compensation, and that kind of inertia is very hard to overcome.
My man, your world view is twisted by dogma. You may not personally like how she runs AMD, but Lisa Su is eminently qualified for the job. Her gender has nothing to do with this. You need to check yourself.
At the CEO level there is no "qualified for the job". It's not like you can get a PhD in being a successful CEO. There is only actual success.
And it's not me who's twisted by dogma. I'm just predicting what would happen. Do you seriously argue Su could be chucked out (very likely replaced by a man) without a giant screaming fest from the usual suspects? No way. It's the NYT journos who'd go on the warpath and be twisted by dogma.
You’re bringing a lot of emotion to this but not much information or a compelling argument. Perhaps you shouldn’t be leveling accusations of “screaming fest”.
Also, the NYT recently ran a piece asking whether women ruined the workplace. It’s unclear why you think they would “go on the warpath” over a CEO being pushed out for business reasons.
Yeah and look at the response to that article ... NYT readers couldn't believe it had been published at all. But you know what I mean.
You are perceiving emotion where there isn't any, just analysis. Maybe it makes it easier for you to dismiss the point. I don't have a position in AMD and don't care what happens to them. It's just obvious what would happen and why they'd be reluctant to swap out their CEO.
That "bad leadership" dug AMD out of a hole and transformed the company into a behemoth. From under $2 a share to around $250 in eight years. I'll invest in that kind of bad leadership all day, every day.
You should compare AMD vs its peers, not its even worse prior state.
AMD should by all rights be a strong competitor to NVIDIA with a big chunk of the AI market. They have nearly nothing. The buck should stop at the top, but with AMD it doesn't.
That used to be impressive; then you look at the gains Bitcoin investors have made and it's quite paltry, especially when you consider that inflation is 8-10% per year.
nVidia has been deeply involved on the software side, starting with gaming, since forever. It's written into their DNA. Even when ATI/AMD could outperform them in raw hardware, nVidia worked well with every last game and worked with individual developers, even writing some of their code for them.