Submitted by [deleted] t3_11az5r9 in singularity
Tiamatium t1_j9uzmge wrote
Weeks, maybe months.
The larger problem might be long-term memory, but once we figure that out... Actually no, it is easy to figure it out.
So weeks, maybe months, but you will need wifi. And it will be a bit laggy, as in there will be a noticeable delay before it responds. Not long, just noticeable, so that will take a lot of the emotion out of it.
Honestly, this depends on when OpenAI releases the ChatGPT API, because once that's out, it's out. It really is just a quick chain of a voice-to-text API, ChatGPT, and text-to-voice, and that's it.
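The chain described above can be sketched in a few lines. This is a minimal sketch with stubbed components: the three helper functions are stand-ins I've invented for illustration, not real API calls; in practice each would wrap a network request to a speech-to-text service, the ChatGPT API, and a text-to-speech service.

```python
# Sketch of the voice assistant loop: speech -> text -> chat model -> text -> speech.
# All three helpers are hypothetical stand-ins, not real library calls.

def speech_to_text(audio: bytes) -> str:
    # Stand-in for a hosted STT API call.
    return audio.decode("utf-8")  # pretend the audio is already text

def chat_model(history: list[dict]) -> str:
    # Stand-in for a ChatGPT-style chat completion endpoint.
    last = history[-1]["content"]
    return f"You said: {last}"

def text_to_speech(text: str) -> bytes:
    # Stand-in for a TTS API returning synthesized audio.
    return text.encode("utf-8")

def voice_turn(audio_in: bytes, history: list[dict]) -> bytes:
    """One round trip of the assistant. Each of the three hops is a
    network call in practice, which is where the lag comes from."""
    user_text = speech_to_text(audio_in)
    history.append({"role": "user", "content": user_text})
    reply = chat_model(history)
    history.append({"role": "assistant", "content": reply})
    return text_to_speech(reply)

history: list[dict] = []
out = voice_turn(b"hello there", history)
```

The `history` list is the short-term memory: it grows with each turn, which is exactly what runs into the context-window limit.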
[deleted] OP t1_j9v0iuq wrote
[deleted]
ChronoPsyche t1_j9v75hx wrote
Wait till you find out about GPT3. Lol.
BarockMoebelSecond t1_j9v9nu7 wrote
There's already a GPT3 API.
Tiamatium t1_j9wxp1i wrote
Yeah, but it's not as good as ChatGPT. Plus, the ChatGPT API will have 8x the context window, and thus more memory.
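The "context window = memory" point can be illustrated with a sliding window over the conversation history: keep appending turns, then drop the oldest ones until the total fits the model's token budget. A minimal sketch, where the 4-characters-per-token estimate is a crude assumption, not a real tokenizer:

```python
# Sliding-window chat memory: trim the oldest turns to fit a token budget.
# An 8x larger budget keeps roughly 8x more conversation in the prompt.

def estimate_tokens(text: str) -> int:
    # Rough heuristic (assumption): ~4 characters per token.
    return max(1, len(text) // 4)

def trim_history(history: list[dict], budget: int) -> list[dict]:
    """Keep the newest messages whose combined estimated token
    count fits within the budget, preserving original order."""
    kept: list[dict] = []
    total = 0
    for msg in reversed(history):  # walk newest-first
        cost = estimate_tokens(msg["content"])
        if total + cost > budget:
            break
        kept.append(msg)
        total += cost
    return list(reversed(kept))

# Ten turns of ~100 estimated tokens each (400 chars / 4).
history = [{"role": "user", "content": "x" * 400} for _ in range(10)]
trimmed = trim_history(history, budget=250)
```

With a budget of 250 only the two newest turns survive; multiply the budget by 8 and the whole ten-turn history fits, which is the practical sense in which a bigger context window means more memory.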