Submitted by Neurogence t3_114pynd in singularity
redditgollum t1_j8xekdd wrote
It will return even greater and better than you could ever imagine, in the form of open-source stuff. Just be patient.
cerspense t1_j8yi06s wrote
The only open-source GPT alternative is BLOOM, and it's not very good. These models take hundreds of GB of VRAM to run, so you need your own personal server farm or a P2P setup like the one BLOOM uses. The more advanced these models get, the less likely it is that we'll be able to run them at home.
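The "hundreds of GB" claim checks out with back-of-envelope arithmetic. A rough sketch, assuming fp16 weights (2 bytes per parameter) and BLOOM's published size of ~176B parameters; activation and KV-cache overhead is ignored here, so real requirements are higher:

```python
# Back-of-envelope memory estimate for hosting a large language model.
# Assumes fp16 weights, i.e. 2 bytes per parameter; overhead ignored.

def weight_memory_gb(n_params: float, bytes_per_param: int = 2) -> float:
    """Approximate memory needed just to hold the model weights, in GB."""
    return n_params * bytes_per_param / 1e9

bloom_gb = weight_memory_gb(176e9)  # BLOOM-176B in fp16
print(f"BLOOM fp16 weights: ~{bloom_gb:.0f} GB")  # ~352 GB

# An 80 GB datacenter GPU holds roughly 40B fp16 parameters, so a full
# deployment needs several such GPUs -- hence the "server farm" remark.
print(f"80 GB GPUs needed: ~{bloom_gb / 80:.1f}")
```

At ~352 GB for weights alone, even a single top-end consumer GPU (24 GB) is an order of magnitude short, which is why P2P systems split the model across many machines.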
warpaslym t1_j8z5bs8 wrote
Major leaps in optimization and efficiency are the norm for every other type of software; I don't see why AI would be any different.
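Quantization is one concrete example of the kind of optimization being argued for here: storing weights in fewer bits per parameter directly shrinks the memory footprint. A minimal sketch of the arithmetic, using a hypothetical 176B-parameter model for illustration:

```python
# Sketch: how quantization shrinks the memory needed for model weights.
# Hypothetical 176B-parameter model; GB computed as 1e9 bytes.
N_PARAMS = 176e9

def quantized_size_gb(n_params: float, bits_per_param: int) -> float:
    """Weight storage in GB at a given precision."""
    return n_params * bits_per_param / 8 / 1e9

for name, bits in [("fp16", 16), ("int8", 8), ("int4", 4)]:
    print(f"{name}: ~{quantized_size_gb(N_PARAMS, bits):.0f} GB")
# fp16: ~352 GB, int8: ~176 GB, int4: ~88 GB
```

Each halving of precision halves the storage, moving large models meaningfully closer to what consumer hardware can hold.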
ChromeGhost t1_j8z4p9c wrote
StabilityAI is making one
TeamPupNSudz t1_j8z6l5d wrote
If you're thinking of Open Assistant, that's LAION, not StabilityAI.
epSos-DE t1_j8yo6fj wrote
He is more correct than most assume.

Strongest indicator = AMD is building AI ASICs into their latest CPUs.

The CPU makers are preparing to serve AI on the laptop, NOT on the server.

We can expect that AI will come to computers and some phones. The Google phones have Tensor AI chips integrated, I think.
drekmonger t1_j8zqpw4 wrote
AI already comes with your phone. It's just not the kind of AI you're interested in.
hydraofwar t1_j8y34jn wrote
How?
helpskinissues t1_j8y51l0 wrote
OpenAssistant, Bard, Sparrow, LaMDA, ChatGPT, Claude... Please, there are too many options to believe in!
Baturinsky t1_j8yefiq wrote
sachos345 t1_j93r9xe wrote
People can help with OpenAssistant RLHF dataset in their page, the more the merrier.