
nicuramar t1_j9xvqbs wrote

> AI is not computationally demanding to run

ChatGPT kinda is, due to the size of the neural network. But it’s all relative, of course.

6

KarmaStrikesThrice t1_j9y13vs wrote

But is it the size that is limiting, or the performance? ChatGPT is definitely too huge for one GPU (even the A100 server GPUs with 80GB of memory), but once you connect enough GPUs to make the space available, I bet the performance is quite fast. It's similar to the human brain: it takes us days, weeks, or years to learn something, but we can then access it in a split second. The fastest supercomputers today have tens of thousands of GPUs, so if ChatGPT can have millions of users running it at the same time, one GPU can serve hundreds or thousands of users.
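A rough back-of-envelope sketch of the "too huge for one GPU" point, assuming a GPT-3-scale model of 175B parameters stored in fp16 (the actual size of ChatGPT's model is not public, and this ignores activation memory and KV cache):

```python
# Estimate how many 80 GB A100s are needed just to hold the weights.
# Assumption: 175B parameters (GPT-3 scale) at 2 bytes each (fp16);
# real deployments also need memory for activations and the KV cache.
PARAMS = 175e9
BYTES_PER_PARAM = 2      # fp16
GPU_MEMORY_GB = 80       # A100 80GB

weights_gb = PARAMS * BYTES_PER_PARAM / 1e9
gpus_needed = -(-weights_gb // GPU_MEMORY_GB)  # ceiling division

print(f"Weights: {weights_gb:.0f} GB -> at least {gpus_needed:.0f} GPUs")
```

So even under these simplified assumptions, the weights alone span several A100s, which is why inference is sharded across GPUs; batching then lets each GPU serve many users concurrently.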

1