Submitted by Stiven_Crysis t3_11e2wui in technology
A-Delonix-Regia t1_jacethn wrote
Reply to comment by RuairiSpain in PC GPU Shipments Drop 35% Year-over-Year in Q4 2022: Report by Stiven_Crysis
Nearly no one will do that. There are millions of gamers, and what? A few tens of thousands of people who are interested enough in AI content generation to buy a new GPU?
RuairiSpain t1_jacukww wrote
ChatGPT's underlying model has on the order of 175 billion parameters; to hold that in memory you'd need multiple 80GB NVIDIA cards, at around $30,000 each. As AI models grow they'll need more RAM, and the cloud is the cheapest way for companies to timeshare that hardware.
It's not just training the models; querying them also needs those in-memory calculations. I'm not expecting gamers to buy these cards. But scale up the number of users querying OpenAI, Bing x ChatGPT or Google x Bard, and all the other AI competitors, and there will be big demand for large-RAM GPUs.
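As a rough sanity check on the memory argument above, here's a back-of-the-envelope sketch. It assumes a 175-billion-parameter GPT-3-class model stored as fp16 weights on hypothetical 80GB cards; the exact size of ChatGPT's model isn't public, so these figures are illustrative only:

```python
import math

# Assumed figures for illustration -- not published specs.
PARAMS = 175e9          # GPT-3-class model, 175 billion parameters
BYTES_PER_PARAM = 2     # fp16 weights, 2 bytes each
CARD_MEMORY_GB = 80     # e.g. an 80GB datacenter GPU

weights_gb = PARAMS * BYTES_PER_PARAM / 1e9          # total weight memory in GB
cards_needed = math.ceil(weights_gb / CARD_MEMORY_GB)  # minimum cards just for weights

print(f"{weights_gb:.0f} GB of weights -> at least {cards_needed} x {CARD_MEMORY_GB}GB cards")
# -> 350 GB of weights -> at least 5 x 80GB cards
```

And that's only the weights; serving real queries adds memory for activations and key/value caches on top, so the practical card count is higher still.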