
timtulloch11 t1_jduxgh6 wrote

How does Alpaca 7B perform compared with ChatGPT? I don't have any experience with it. I do have a Raspberry Pi with 8GB of RAM and 1TB of storage running a Bitcoin node, and it would be awesome if I could run it on there locally. It sounds like the larger-parameter models require way too much processing power?

1

Anjz OP t1_jdvkutn wrote

It's not as good as ChatGPT, but it's much lighter. Granted, it's just a small model fine-tuned on outputs from the GPT-3 API; given more fine-tuning data from GPT-4, it would probably be a whole different beast. It was trained on something like 10x less data, if not more. We're fully capable of creating something much better, it's just a matter of open source figuring out how and catching up to the companies keeping the Krabby Patty secret formula. Turns out for-profit companies don't like divulging world-changing information, who woulda thought?

If you take a look at YouTube, there are a couple of demos from people running it on a Raspberry Pi, granted at the moment it's at a snail's pace, but this could be a different story a year or so from now. It works decently well on a laptop.
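
If you want a feel for what running it locally involves, here's a rough sketch using the llama-cpp-python bindings, which the thread doesn't mention but are a common way to load a quantized Alpaca 7B on CPU. The model path and parameter values are placeholders for whatever quantized weights you actually have:

```python
# Rough sketch: running a 4-bit quantized Alpaca 7B locally via llama-cpp-python.
# Assumes the bindings are installed (pip install llama-cpp-python) and a
# quantized weight file has been downloaded; the path below is a placeholder.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/ggml-alpaca-7b-q4.bin",  # placeholder path to quantized weights
    n_ctx=512,     # small context window to keep memory use down
    n_threads=4,   # match the number of CPU cores (e.g. 4 on a Raspberry Pi)
)

out = llm(
    "Explain in one sentence what a Bitcoin full node does.",
    max_tokens=64,
    temperature=0.7,
)
print(out["choices"][0]["text"])
```

The 4-bit 7B weights are roughly 4GB, so they fit in 8GB of RAM, but CPU-only generation on a Pi is what gives you the snail's pace in those demos.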

1