CertainMiddle2382 t1_j6wwyvp wrote

“If it looks like a duck, swims like a duck, and quacks like a duck, then it probably is a duck”

In all honesty, I don’t really know if I’m truly thinking/aware, or just a biological neural network interpreting itself :-)

purepersistence OP t1_j6x005a wrote

>“If it looks like a duck, swims like a duck, and quacks like a duck, then it probably is a duck”

The problem is that people believe that. With chatGPT it just ain't so. I've given it lots of coding problems, and it frequently generates bugs. I point out the bugs and sometimes it corrects them. The reason they were there in the first place is that it didn't have enough clues to grab the right text. Just as often, or more, it agrees with me about the bug, but its next change fucks up the code even more. It has no idea what it's doing. But it can still give you a very satisfying answer to lots and lots of queries.
