Submitted by 10MinsForUsername t3_11b0na9 in technology
nicuramar t1_j9xvqbs wrote
Reply to comment by KarmaStrikesThrice in ChatGPT on your PC? Meta unveils new AI model that can run on a single GPU by 10MinsForUsername
> AI is not computationally demanding to run
ChatGPT kinda is, due to the size of the neural network. But it’s all relative, of course.
KarmaStrikesThrice t1_j9y13vs wrote
But is it the size that's the limit, or the performance? ChatGPT is definitely too huge for one GPU (even an A100 server GPU with 80GB of memory), but once you connect enough GPUs to have the space available, I bet the performance is quite fast. It's similar to the human brain: it takes us days, weeks, or years to learn something, but we can then recall it in a split second. The fastest supercomputers today have tens of thousands of GPUs, so if ChatGPT can serve millions of users at the same time, a single GPU can serve hundreds or thousands of users.
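A rough sketch of the memory side of that argument, assuming a GPT-3-scale model of 175B parameters stored in fp16 (both are assumptions; OpenAI hasn't published ChatGPT's exact size or precision), counting only the weights and ignoring activations and runtime overhead:

```python
import math

A100_MEMORY_GB = 80  # single NVIDIA A100, 80 GB variant

def weights_memory_gb(params_billions: float, bytes_per_param: int = 2) -> float:
    """GB needed just to hold the weights (fp16 = 2 bytes per parameter)."""
    return params_billions * bytes_per_param

def min_gpus_for_weights(params_billions: float,
                         gpu_memory_gb: float = A100_MEMORY_GB) -> int:
    """Lower bound on GPUs needed to fit the weights alone
    (ignores activations, KV cache, and other runtime overhead)."""
    return math.ceil(weights_memory_gb(params_billions) / gpu_memory_gb)

# Assumed GPT-3-scale model: 175B parameters in fp16 -> ~350 GB of weights,
# i.e. at least 5 of the 80 GB A100s before any overhead is counted.
print(weights_memory_gb(175))      # 350.0
print(min_gpus_for_weights(175))   # 5
```

So under those assumptions it's the memory footprint, not raw compute per query, that forces the model onto multiple GPUs.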