Submitted by Fit-Meet1359 t3_110vwbz in singularity
Fit-Meet1359 OP t1_j8bndg7 wrote
Reply to comment by Economy_Variation365 in Bing Chat blew ChatGPT out of the water on my bespoke "theory of mind" puzzle by Fit-Meet1359
No, I made it up. Initially I wanted to see if ChatGPT could understand someone's feelings/opinions if I kept drowning it in irrelevant/tangential information (e.g., what other people think their feelings/opinions are).
amplex1337 t1_j8dkb3j wrote
Plot twist. Bob is autistic and does love dogs, but doesn't necessarily show his love in ways that others do. His wife understood that and bought the shirt for him knowing it would make him happy. Bob probably wouldn't have bought a dog on his own because of his condition, and was very happy, but isn't super verbal about it. Sandra probably wouldn't have married Bob if he didn't love dogs at least a little bit.
duffmanhb t1_j8d3p0h wrote
What's interesting is that someone in your other thread got the exact same response, to the letter, from ChatGPT. This suggests two things: Bing is likely running the same build as 3.5 on the backend, and there's a formula it's using to produce the exact same response.
monsieurpooh t1_j8e68t1 wrote
I think the simplest explanation is just caching, not a formula.
TILTNSTACK t1_j8cjpbk wrote
Great experiment that shows yet more exciting potential of this product.