Hacker News

"GPUs per user" would be an interesting metric.

Quick, inaccurate googling says there will be "well over 1 million GPUs" by the end of the year. With ~800 million users, that's 1 NVIDIA GPU per 800 people. If you estimate people actively use ChatGPT 5% of the day (1.2 hours a day), that works out to 1 GPU per 40 people in active use, assuming consistent and even usage patterns.
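The arithmetic above can be sketched in a few lines (the figures are the rough ones quoted in this comment, not verified numbers):

```python
# Back-of-envelope GPUs-per-user estimate, using the rough figures above:
# ~1 million GPUs, ~800 million users, and an assumed 5% daily active time.
gpus = 1_000_000
users = 800_000_000
active_fraction = 0.05  # 5% of the day = 1.2 hours

users_per_gpu = users / gpus                      # overall users per GPU
active_users_per_gpu = users_per_gpu * active_fraction  # concurrently active users per GPU

print(users_per_gpu)         # 800.0
print(active_users_per_gpu)  # 40.0
```

This assumes perfectly even usage around the clock, which real traffic certainly isn't.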

That back-of-the-envelope math isn't accurate, but it's interesting in the context of understanding just how much compute ChatGPT requires to operate.

Edit: I asked ChatGPT how many GPUs per user, and it spat out a bunch of calculations estimating 1 GPU per ~3 concurrent users. Would love to see a more thorough/accurate breakdown.



A lot of GPUs are allocated for training and research, so dividing the total number by the number of users isn’t particularly useful. Doubly so if you’re trying to account for concurrency.


I'm kinda scared of "1.2 hours a day of ai use"...


Sorry, those figures are skewed by Timelord Georg, who has been using AI for 100 million hours a day, is an outlier, and should have been removed.


Roger, but I still think with that much energy at its disposal, if AI performs as desired it will work its way up to using each person more than 1.2 hours per day, without them even knowing about it :\


When GPUs share people concurrently, they collectively get much more than 24 hours of person per day.


You're right!

With that kind of singularity the man-month will no longer be mythical ;)


It will be epic!



