utopiah t1_je8vryb wrote
It's pretty cool, and thanks for providing the playground; I wouldn't have bothered without it. I think it's very valuable, but it's also quite costly, both economically and computationally, while creating privacy risks (all your data goes through OpenAI). So again, in some situations I can imagine it being quite powerful, but in others it's an absolute no. That being said, there are other models (I just posted on /r/selfhosted minutes ago about the HuggingFace/Docker announcement enabling us to run Spaces locally), e.g. Alpaca or SantaCoder or BLOOM, that might enable us to follow the same principle, arguably with different quality, without the privacy risks. Have you considered relying on another "runtime"?
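For what it's worth, here's a minimal sketch of what pointing the backend at a locally run model could look like; the endpoint URL, payload shape, and `generated_text` field are assumptions modelled on typical self-hosted text-generation servers, not anything from the project or the HuggingFace announcement itself:

```typescript
// Hypothetical sketch: send the prompt to a locally hosted model
// (e.g. a Space running in Docker on localhost) instead of OpenAI,
// so the data never leaves your machine. The URL and payload/response
// shapes are assumptions, not a documented API.
async function completeLocally(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:7860/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ inputs: prompt, parameters: { max_new_tokens: 128 } }),
  });
  if (!res.ok) throw new Error(`Local model returned ${res.status}`);
  const data = await res.json();
  return data.generated_text; // assumed response field
}
```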
xander76 OP t1_je9emb1 wrote
We have tried out other runtimes, and a lot of them seem to work decently well; if memory serves, Claude was quite good. I'm definitely interested in supporting other hosting options and other models as a way to balance quality, cost, and privacy, and we are currently building IDE tools that will let you test your imaginary functions in ways that hopefully surface those tradeoffs.
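A rough sketch of what a pluggable runtime layer could look like (the interface and names here are purely illustrative, not our actual design):

```typescript
// Purely illustrative: one possible shape for a pluggable runtime layer,
// letting each imaginary function pick a backend by quality/cost/privacy.
interface CompletionRuntime {
  name: string;
  complete(prompt: string): Promise<string>;
}

// Stub backends; real ones would call OpenAI, Anthropic, or a self-hosted model.
const runtimes: Record<string, CompletionRuntime> = {
  openai: { name: "openai", complete: async (p) => `openai completion for: ${p}` },
  claude: { name: "claude", complete: async (p) => `claude completion for: ${p}` },
  local:  { name: "local",  complete: async (p) => `local completion for: ${p}` },
};

// An imaginary function would then dispatch through the chosen runtime,
// selected per project or per call.
async function runImaginaryFunction(runtimeName: string, prompt: string): Promise<string> {
  return runtimes[runtimeName].complete(prompt);
}
```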