
Akimbo333 t1_ja24wyr wrote

Very interesting. Is this Open Sourced?

19

blueSGL t1_ja27gk8 wrote

You need to request access.

16

Akimbo333 t1_ja29201 wrote

Ok. How is it?

6

duffmanhb t1_ja2o0mz wrote

No idea... They only allow in published researchers.

4

visarga t1_ja2u514 wrote

But they documented how to make it by sharing the paper, code, dataset, and hyperparameters. So when Stability wants to replicate it, it will be 10x cheaper. And they showed that a small model can be surprisingly good, which makes it tempting for many to replicate.

The cost of running inference on GPT-3 was a huge moat, and that moat is going away. I expect that this year we will be able to run a ChatGPT-level model on a single GPU, so we get cheap-to-run, private, open, and commercial AI soon. We can use it for ourselves, and we can build projects with it.

12

duffmanhb t1_ja2ugba wrote

I hope so. I'm still waiting for them to accept my request. But as soon as I get it, the first thing I'll do is create some LLaMA bots for Reddit and see how effective they are compared to GPT-3 at posting believable comments. If it's nearly as good, but can be run locally, it'll completely change the bot game on social media.

3

FC4945 t1_ja2qzxt wrote

It's Meta, but it's a proof of concept, and it's being done in February 2023.

1

NoidoDev t1_ja5t1pw wrote

This. Same question as mine. And of course the answer is no. The introduction, the hope, was fake and futile. For now.

1

Akimbo333 t1_ja5tx72 wrote

Well, from what I understand, this model was multimodal, so it is much, much stronger.

1