Submitted by BrownSimpKid t3_1112zxw in singularity
helpskinissues t1_j8dhsz4 wrote
Reply to comment by Frumpagumpus in Bing Chat sending love messages and acting weird out of nowhere by BrownSimpKid
Nonsense, sorry. Ants don't need context prepended to them to remember things.
"mostly right" no, it's actually mostly wrong. The heck are you saying? Try to play chess with ChatGPT, most of the times it'll make things up.
Anyway, I suggest you to read some experts rather than acting like gpt3, being mostly wrong. Cheers.
Frumpagumpus t1_j8diap9 wrote
lol ants can't speak, and I'd be curious to read any literature on whether they possess short-term memory at all XD
challengethegods t1_j8dn5cy wrote
These "it's dumber than an ant" type of people aren't worth the effort in my experience, because in order to think that you have to be dumber than an ant, of course. Also yea, it's trivial to give memory to LLMs, there's like 100 ways to do it.
helpskinissues t1_j8dwubv wrote
Waiting for your customized ChatGPT model that maintains consistency after 5 messages. Make sure to ping me; I'd gladly invest in your genius startup.
challengethegods t1_j8dylol wrote
That alone sounds like a pretty weak startup idea, because at least 50 of the 100 methods for adding memory to an LLM are so painfully obvious that anyone could figure them out and compete, so a business built around it would probably be short-lived. Anyway, I've already made a memory catalyst that can attach to any LLM, and it only took about 100 lines of spaghetti code. Yes, it made my bot 100x smarter in a way, but I don't think it would scale unless the bot had an isolated memory unique to each person, since most people will inevitably teach it garbage.
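To illustrate the shape of a per-user memory layer (this is a toy sketch, not the actual catalyst: every name here is made up, and the crude keyword-overlap retrieval is a placeholder where a real version might use embeddings):

```python
from collections import defaultdict

class UserMemory:
    def __init__(self, top_k: int = 3):
        self.notes = defaultdict(list)  # user_id -> list of stored notes
        self.top_k = top_k

    def store(self, user_id: str, note: str) -> None:
        self.notes[user_id].append(note)

    def recall(self, user_id: str, query: str) -> list[str]:
        # Crude relevance: rank this user's notes by words shared with the query.
        words = set(query.lower().split())
        ranked = sorted(
            self.notes[user_id],
            key=lambda note: len(words & set(note.lower().split())),
            reverse=True,
        )
        return ranked[: self.top_k]

def answer(memory: UserMemory, llm, user_id: str, message: str) -> str:
    context = "\n".join(memory.recall(user_id, message))
    prompt = f"Notes about this user:\n{context}\n\nUser: {message}\nAssistant:"
    reply = llm(prompt)  # llm is any callable that takes a prompt string
    memory.store(user_id, f"User said: {message}")
    return reply
```

Keying everything by `user_id` is what gives you the isolation: one person teaching the bot nonsense only pollutes their own store.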
helpskinissues t1_j8dzdfi wrote
Enjoy your secret, private, highly in-demand chatbot version, then.
This subreddit... Lol.