Submitted by pixiegirl417 t3_11wt2fl in MachineLearning
timedacorn369 t1_jczscaf wrote
It's mentioned as open source, so that means I can get the model weights and run it locally if I want to, right?
pixiegirl417 OP t1_jd04sxc wrote
That's right!! Model is here: https://huggingface.co/OpenAssistant/oasst-sft-1-pythia-12b
BayesMind t1_jd8ps8g wrote
Is there an example script somewhere for how to run this? All I've seen is the heavy inference server example in the repo.
pixiegirl417 OP t1_jd8s4nc wrote
I haven't tried to run it locally since I don't have the hardware for it, and I haven't looked into how to do it.
However, you can check my GitHub if you want to try the server-attached inference API (I know that may not be what you're looking for).
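For anyone who does have the hardware, a minimal local-inference sketch using the standard Hugging Face `transformers` API is below. This is an assumption-laden sketch, not the project's official script: it assumes roughly 24 GB of GPU/CPU memory for the 12B model in fp16, and uses the `<|prompter|>…<|endoftext|><|assistant|>` prompt format described on the model card. The `generate_reply` helper name is mine, not from the repo.

```python
# Hedged sketch of local inference for OpenAssistant/oasst-sft-1-pythia-12b.
# Assumes ~24 GB of memory in fp16; generate_reply is a hypothetical helper,
# not part of the OpenAssistant codebase.

MODEL_ID = "OpenAssistant/oasst-sft-1-pythia-12b"

def build_prompt(user_message: str) -> str:
    # Prompt format from the model card: a prompter turn, the end-of-text
    # token, then the assistant tag that the model completes.
    return f"<|prompter|>{user_message}<|endoftext|><|assistant|>"

def generate_reply(user_message: str, max_new_tokens: int = 128) -> str:
    # Imports kept inside the function so build_prompt works without
    # torch/transformers installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.float16,  # halves memory vs. fp32
        device_map="auto",          # spread across available GPUs/CPU
    )
    inputs = tokenizer(build_prompt(user_message), return_tensors="pt")
    inputs = inputs.to(model.device)
    output = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        do_sample=True,
        top_p=0.9,
        temperature=0.8,
    )
    return tokenizer.decode(output[0], skip_special_tokens=True)
```

Calling `generate_reply("What is a language model?")` would download the weights (tens of GB) on first use; without enough memory, quantized variants or the hosted inference API mentioned above are the realistic options.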
Thiago_Von_Duck t1_jd00vqq wrote
That would mean self-hosted.