Adiwik t1_j6ipl7n wrote
Reply to comment by steviaplath153 in OpenAI executives say releasing ChatGPT for public use was a last resort after running into multiple hurdles — and they're shocked by its popularity by steviaplath153
Until somebody releases the public version for free somewhere else.
NegotiationFew6680 t1_j6iuiut wrote
Yup, just any day now some rando will train an LLM on their 30k+ spare GPUs and TPUs, then offer it for free on hardware that likely costs on the order of several cents to a dollar per query to run.
EmbarrassedHelp t1_j6ixyl6 wrote
LAION and other groups are working on open-source chatbots as we speak, and they're making great progress.
NoUtimesinfinite t1_j6j00k1 wrote
The problem isn't training. If the upfront cost were the only barrier, then yes, a free version would eventually pop up. The problem is that each query costs a lot to serve, which can't be made up with ad revenue, so anyone running the servers will need ongoing money to keep them up.
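A rough back-of-envelope of that claim, where every number below is an illustrative assumption rather than a measured figure:

```python
# Back-of-envelope: can ad revenue cover LLM inference costs?
# Every figure here is an assumption for illustration, not a measured value.

COST_PER_QUERY_USD = 0.02        # assumed inference cost (GPU time, power, amortized hardware)
AD_REVENUE_PER_VIEW_USD = 0.002  # assumed revenue from a single ad impression
QUERIES_PER_DAY = 10_000_000     # assumed daily query volume

daily_cost = COST_PER_QUERY_USD * QUERIES_PER_DAY
daily_ad_revenue = AD_REVENUE_PER_VIEW_USD * QUERIES_PER_DAY

print(f"Daily inference cost: ${daily_cost:,.0f}")        # $200,000
print(f"Daily ad revenue:     ${daily_ad_revenue:,.0f}")  # $20,000
print(f"Daily shortfall:      ${daily_cost - daily_ad_revenue:,.0f}")
```

Under these assumptions, ads cover about a tenth of the serving cost; change the numbers and the gap moves, but it stays a gap unless per-query cost drops by an order of magnitude.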
GreatBigJerk t1_j6j01xh wrote
It's extremely unlikely that regular people will be able to run anything close to as good as ChatGPT for several years. Language models are far more resource-hungry than things like Stable Diffusion.
neoplastic_pleonasm t1_j6j72be wrote
The ChatGPT model is in the neighborhood of 750GB, so sadly we won't be seeing anything remotely as capable that can run on consumer hardware any time soon.
EmbarrassedHelp t1_j6lhtho wrote
Where are you getting the file size from?
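One plausible reconstruction of that figure: GPT-3's publicly documented 175B parameters, stored at full (fp32) precision, land near that size. The precision choice is an assumption; OpenAI hasn't published ChatGPT's exact footprint.

```python
# One plausible source for the ~750GB figure: parameter count times bytes per weight.
# GPT-3's 175B parameter count is from the GPT-3 paper; the precisions are assumptions.

PARAMS = 175_000_000_000  # GPT-3 parameter count

for label, bytes_per_param in [("fp32", 4), ("fp16", 2), ("int8", 1)]:
    size_gb = PARAMS * bytes_per_param / 1e9
    print(f"{label}: {size_gb:,.0f} GB")
# fp32: 700 GB  -- close to the 750GB quoted above
# fp16: 350 GB
# int8: 175 GB
```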
slashd t1_j6jhrth wrote
> 750GB
That easily fits on a $50 1TB SSD 😁
neoplastic_pleonasm t1_j6jk8gt wrote
Yep, now you only need a hundred thousand dollars more for a GPU cluster with enough VRAM to run inference with it.
NegotiationFew6680 t1_j6jmsiq wrote
Hahahaha
Now imagine how slow that would be.
There's a reason these models run on distributed clusters: a single ChatGPT query is likely processed by multiple GPUs across dozens of machines.
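A toy sketch of what that sharding looks like: pipeline parallelism splits a model's layers across devices and hands activations from one shard to the next. Plain Python with lambdas standing in for transformer blocks; the layer and device counts are arbitrary illustrative choices.

```python
# Toy sketch of pipeline parallelism: layers partitioned into contiguous shards,
# one shard per "device", activations passed shard to shard. No real GPUs here.

LAYERS = [lambda x, i=i: x * 1.01 + i for i in range(96)]  # stand-ins for 96 transformer blocks
NUM_DEVICES = 8

shard_size = len(LAYERS) // NUM_DEVICES
shards = [LAYERS[d * shard_size:(d + 1) * shard_size] for d in range(NUM_DEVICES)]

def forward(x):
    # Each shard lives on its own device; in a real cluster the hand-off
    # between shards is a network transfer, which is where latency creeps in.
    for shard in shards:
        for layer in shard:
            x = layer(x)
    return x

print(forward(1.0))
```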
gmes78 t1_j6k6myq wrote
You need to fit it in GPU VRAM. So go ahead and show me a consumer GPU with 750GB of VRAM.
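For scale, a quick comparison against real card capacities (the VRAM figures are spec-sheet numbers; the 700GB model size carries over from the fp32 estimate above):

```python
import math

# How many GPUs would it take just to hold ~700GB of weights in VRAM?
# Card capacities are spec-sheet numbers; everything else is illustrative.

MODEL_GB = 700  # fp32 weights for a 175B-parameter model (see the estimate above)

cards = {
    "RTX 4090 (consumer)": 24,
    "RTX 3090 (consumer)": 24,
    "A100 (datacenter)": 80,
}

for name, vram_gb in cards.items():
    n = math.ceil(MODEL_GB / vram_gb)
    print(f"{name}: {vram_gb} GB VRAM -> need at least {n} cards")
# RTX 4090/3090: ~30 cards; A100: ~9 cards -- and that's before
# activations, KV cache, or any batching overhead.
```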
Avorius t1_j6istpi wrote
Chatbots are pretty heavy, so I'd imagine it'll be quite a while until an offline version becomes available.
shinra528 t1_j6j6h40 wrote
It’s already available to the public for free. You just need $20M in hardware to power it.