
Adiwik t1_j6ipl7n wrote

−9

NegotiationFew6680 t1_j6iuiut wrote

Yup, any day now some rando is going to train an LLM with their 30k+ spare GPUs and TPUs, then offer it for free while running it on hardware that likely costs anywhere from several cents to a dollar per query.

16

EmbarrassedHelp t1_j6ixyl6 wrote

LAION and other groups are working on open-source chatbots as we speak, and they're making great progress.

−1

NoUtimesinfinite t1_j6j00k1 wrote

The problem isn't training. If the initial upfront cost were the only barrier, then yes, a free version would eventually pop up. The problem is that each query costs a lot, and that can't be made up by ad revenue, so anyone running the servers will need money to keep them running.
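A rough back-of-envelope comparison (every number here is an assumption for illustration; the real per-query cost isn't public):

```python
# Back-of-envelope: per-query inference cost vs. ad revenue.
# All figures below are assumptions; OpenAI's actual costs aren't public.
cost_per_query = 0.01               # assumed a few cents per query
queries_per_user_per_day = 10       # assumed usage
ad_revenue_per_user_per_day = 0.01  # display ads pay fractions of a cent per view

daily_cost = cost_per_query * queries_per_user_per_day
print(f"cost ${daily_cost:.2f}/user/day vs ads ${ad_revenue_per_user_per_day:.2f}/user/day")
# -> cost $0.10/user/day vs ads $0.01/user/day: roughly 10x short of breaking even
```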

7

GreatBigJerk t1_j6j01xh wrote

It's extremely unlikely that regular people will be able to run anything close to as good as ChatGPT for several years. Language models are far more resource-hungry than things like Stable Diffusion.

6

neoplastic_pleonasm t1_j6j72be wrote

The ChatGPT model is in the neighborhood of 750GB, so sadly we won't be seeing anything remotely as capable running on consumer hardware any time soon.
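That figure is about what you'd expect from a GPT-3-scale model stored at full precision. A quick sanity check, assuming the published 175B parameter count:

```python
# Sanity check on the ~750GB figure, assuming a GPT-3-scale model (175B parameters).
params = 175e9
bytes_per_param = 4  # fp32; half precision (fp16) would halve this
size_gb = params * bytes_per_param / 1e9
print(f"~{size_gb:.0f} GB")  # -> ~700 GB at fp32, ~350 GB at fp16
```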

5

slashd t1_j6jhrth wrote

> 750GB

That easily fits on a $50 1TB SSD 😁

−1

neoplastic_pleonasm t1_j6jk8gt wrote

Yep, now you only need a hundred thousand dollars more for a GPU cluster with enough VRAM to run inference with it.
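A rough sizing sketch, assuming 80GB datacenter cards at around $15k apiece (prices vary widely):

```python
import math

# Rough cluster sizing; card choice, overhead, and price are all assumptions.
model_size_gb = 750
vram_per_gpu_gb = 80     # e.g. an NVIDIA A100 80GB
overhead = 1.2           # assumed ~20% extra for activations and the KV cache
price_per_gpu = 15_000   # rough street price, varies widely

gpus_needed = math.ceil(model_size_gb * overhead / vram_per_gpu_gb)
print(f"{gpus_needed} GPUs, ~${gpus_needed * price_per_gpu:,} in cards alone")
# -> 12 GPUs, ~$180,000 in cards alone (before servers, networking, power)
```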

7

NegotiationFew6680 t1_j6jmsiq wrote

Hahahaha

Now imagine how slow that would be.

There's a reason these models run on distributed clusters. A single query to ChatGPT is likely processed by multiple GPUs across dozens of machines.
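A toy latency sketch of why that sharding matters (layer counts and timings are all assumptions, purely for illustration):

```python
# Toy latency model for pipeline parallelism: a big model is split into "stages",
# one per GPU, and each token's activations hop from machine to machine.
n_layers = 96                # assumed GPT-3-scale layer count
n_gpus = 12
compute_ms_per_layer = 0.5   # assumed per-layer compute time
hop_ms = 2.0                 # assumed network transfer per stage boundary

latency_ms = n_layers * compute_ms_per_layer + (n_gpus - 1) * hop_ms
print(f"~{latency_ms:.0f} ms per generated token")
# -> ~70 ms/token, and tokens generate sequentially, so a 500-token answer
#    takes ~35 seconds even before queueing and batching effects
```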

6

gmes78 t1_j6k6myq wrote

You need to fit it in GPU VRAM. So go ahead and show me a consumer GPU with 750GB of VRAM.
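For scale, assuming today's consumer high end, an RTX 4090 with 24GB of VRAM:

```python
import math

# How many consumer cards would it take just to hold the weights?
# Assumes 24 GB per card (e.g. an RTX 4090).
print(math.ceil(750 / 24))  # -> 32 cards, ignoring activations and interconnect
```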

2

Avorius t1_j6istpi wrote

Chatbots are pretty heavy, so I'd imagine it'll be quite a while until an offline version becomes available.

3

shinra528 t1_j6j6h40 wrote

It’s already available to the public for free. You just need $20M in hardware to power it.

3