A-Delonix-Regia t1_jacethn wrote

Nearly no one will do that. There are millions of gamers, and what? A few tens of thousands of people who are interested enough in AI content generation to buy a new GPU?

RuairiSpain t1_jacukww wrote

ChatGPT's underlying model has on the order of 175 billion parameters; to hold that in memory you'd need at least one 80GB Nvidia card, at around $30,000 each. As AI models grow they'll need more RAM, and cloud is the cheapest way for companies to time-share those costs.

It's not just training the models, it's also querying them that needs those in-memory calculations. I'm not expecting gamers to buy these cards. But scale up the number of users going to query OpenAI, Bing x ChatGPT or Google x Bard, and all the other AI competitors, and there will be big demand for large-RAM GPUs.
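To put rough numbers on the memory argument above, here is a back-of-the-envelope sketch. The parameter count (175 billion, GPT-3-class) and the 80 GB card size are illustrative assumptions, not official figures for ChatGPT:

```python
# Rough sketch: memory needed just to hold model weights for inference.
# Parameter count, precision, and GPU size are assumptions for illustration.

def weights_memory_gb(num_params, bytes_per_param=2):
    """Weight memory in GB, assuming fp16 (2 bytes per parameter)."""
    return num_params * bytes_per_param / 1e9

def gpus_needed(total_gb, gpu_ram_gb=80):
    """Minimum count of 80 GB cards to hold the weights (ceiling division)."""
    return -(-total_gb // gpu_ram_gb)

params = 175e9  # GPT-3-class model, assumed for illustration
mem = weights_memory_gb(params)
print(f"{mem:.0f} GB of weights -> {gpus_needed(mem):.0f} x 80 GB GPUs")
# Weights alone don't fit on one card; activations and KV caches add more.
```

Under these assumptions the weights alone come to 350 GB, so serving a single replica would already span several datacenter GPUs, which is the economics the comment is pointing at.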
