
redditgollum t1_j8xekdd wrote

It will return even greater and better than you could ever imagine, in the form of open-source stuff. Just be patient.

121

cerspense t1_j8yi06s wrote

The only open-source GPT alternative is BLOOM, and it's not very good. These models take hundreds of GB of VRAM to run, so you need your own personal server farm or a P2P setup like BLOOM uses. The more advanced these models get, the less likely it will be for us to run them at home.
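To see where "hundreds of GB" comes from: a rough sketch, using BLOOM's ~176B parameter count and assuming fp16 weights (2 bytes each) — numbers mine, not from the comment:

```python
def model_memory_gb(n_params_billion: float, bytes_per_param: float) -> float:
    """Approximate memory (GB) needed just to hold the model weights.

    Ignores activations, KV cache, and framework overhead, so real
    requirements are even higher.
    """
    return n_params_billion * 1e9 * bytes_per_param / 1e9

# BLOOM: ~176B parameters at fp16 -> ~352 GB of weights alone,
# far beyond any single consumer GPU.
print(model_memory_gb(176, 2))  # 352.0
```

That's why a single-node setup is out of reach and BLOOM's P2P approach spreads the layers across many machines.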

17

warpaslym t1_j8z5bs8 wrote

major leaps in optimization and efficiency are normal for every other type of software; i don't see why AI will be any different
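Quantization is one concrete form that optimization takes: storing weights at lower precision shrinks memory roughly in proportion to the bit width. A minimal sketch, again assuming BLOOM's ~176B parameters (the exact quality trade-offs are a separate question):

```python
def weights_memory_gb(n_params_billion: float, bits_per_param: int) -> float:
    """Weights-only memory (GB) at a given numeric precision."""
    return n_params_billion * 1e9 * bits_per_param / 8 / 1e9

# fp16 vs. 4-bit quantization for a 176B-parameter model:
print(weights_memory_gb(176, 16))  # 352.0 GB
print(weights_memory_gb(176, 4))   # 88.0 GB -- a 4x reduction
```

An 88 GB model is still multi-GPU territory, but the gap between "server farm" and "workstation" closes fast when each generation of tooling halves the footprint.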

18

epSos-DE t1_j8yo6fj wrote

He is more correct than most assume.

Strongest indicator = AMD is building AI ASICs into their latest CPUs.

The CPU makers are preparing to serve AI on the laptop, NOT on the server.

We can expect AI to come to computers and some phones. The Google phones have Tensor AI chips integrated, I think.

11

drekmonger t1_j8zqpw4 wrote

AI already comes with your phone. It's just not the kind of AI you're interested in.

4

sachos345 t1_j93r9xe wrote

People can help with the OpenAssistant RLHF dataset on their page; the more the merrier.

1