
Tiamatium t1_j9uzmge wrote

Weeks, maybe months.

The larger problem might be long-term memory, but once we figure that out... actually no, that's easy to figure out.

So weeks, maybe months, but you will need wifi. And it will be a bit laggy, as in there will be a noticeable delay before it responds. Not long, just noticeable, so that will take a lot of the emotion out of it.

Honestly, this depends on when OpenAI releases the ChatGPT API, because once that's out, it's out. It really is just a quick connection of a voice-to-text API, ChatGPT, and text-to-voice, and that's it.
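The loop described above could be sketched as a single round trip with the three stages as pluggable callables. This is a minimal illustration, not a real integration: the function names are made up, and real services (e.g. a speech-to-text API, a chat model, a TTS API) would slot in where the dummy lambdas are.

```python
from typing import Callable

def voice_assistant_turn(
    audio_in: bytes,
    speech_to_text: Callable[[bytes], str],
    chat: Callable[[str], str],
    text_to_speech: Callable[[str], bytes],
) -> bytes:
    """One round trip: audio in -> transcript -> model reply -> audio out.

    Each stage is typically a network call, and those hops stacked
    together are the source of the noticeable delay mentioned above.
    """
    transcript = speech_to_text(audio_in)
    reply = chat(transcript)
    return text_to_speech(reply)

# Dummy stages standing in for real APIs, just to show the wiring:
if __name__ == "__main__":
    out = voice_assistant_turn(
        b"fake-audio-bytes",
        speech_to_text=lambda audio: "hello there",
        chat=lambda text: f"You said: {text}",
        text_to_speech=lambda text: text.encode(),
    )
    print(out.decode())  # You said: hello there
```

Swapping a dummy for a real API client changes nothing about the structure, which is the point of the comment: the glue is trivial, the latency is not.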

4

BarockMoebelSecond t1_j9v9nu7 wrote

There's already a GPT3 API.

1

Tiamatium t1_j9wxp1i wrote

Yeah, but it's not as good as ChatGPT. Plus, the ChatGPT API will have 8x the context window, and thus more memory.

1