GregoryfromtheHood t1_jdusrmn wrote

Absolutely. I've got Wikipedia and Stack Overflow downloaded locally, just in case. But as soon as I got LLaMA and Alpaca up and running locally and started using them to complete real tasks for work during the day, I realised I could rely on this even in a situation where the internet is gone. This little 16GB file is all I need, and I could run a smaller model on a small computer for travelling. We've basically got Jarvis already: OpenAI Whisper to talk to it, Alpaca to do the thinking, and Tortoise to respond back with a realistic voice. All locally.
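
For anyone curious what that pipeline looks like in practice, here's a rough Python sketch of the three stages: Whisper for speech-to-text, an Alpaca-style model via llama.cpp for the thinking, and Tortoise for the voice. The package names, model path, prompt format, and the Tortoise call are assumptions about a typical local setup, not necessarily the exact stack described above.

```python
import whisper                   # openai-whisper; runs offline once the model is downloaded
from llama_cpp import Llama      # llama-cpp-python bindings for a local quantised model
from tortoise.api import TextToSpeech  # tortoise-tts (API per that repo; treat as an assumption)
import torchaudio

# 1. Transcribe the spoken question with Whisper.
stt = whisper.load_model("base")
question = stt.transcribe("question.wav")["text"]

# 2. Generate an answer with a local Alpaca-style model (hypothetical ~16GB quantised weights).
llm = Llama(model_path="models/alpaca-13b-q4.bin", n_ctx=2048)
prompt = f"### Instruction:\n{question}\n\n### Response:\n"
answer = llm(prompt, max_tokens=256, stop=["###"])["choices"][0]["text"].strip()

# 3. Speak the answer with Tortoise TTS and save it as a wav.
tts = TextToSpeech()
speech = tts.tts_with_preset(answer, preset="fast")            # returns an audio tensor
torchaudio.save("answer.wav", speech.squeeze(0).cpu(), 24000)  # Tortoise outputs 24 kHz audio

print(answer)
```

Each stage works offline once the model weights are on disk, which is the whole point: swap the 13B model for a smaller quantised one and the same loop runs on a travel laptop.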