DeveloperGuy75 t1_ja1gj2q wrote

OK, that still doesn’t prove anything. To have independent thought, you have to be able to freely query your environment and learn from it. ChatGPT (which Bing Chat is based on) cannot do that. It’s a prediction engine that spits out patterns as completions of prompts. It does very well as a tool, but it is not conscious, nor does it have independent thought.
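
To make “completions of prompts” concrete, here’s a minimal sketch of what next-token prediction looks like in code. It uses the open Hugging Face transformers library with GPT-2 as a stand-in, since ChatGPT’s own weights aren’t public:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

# GPT-2 is a small, public ancestor of the models behind ChatGPT.
tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "The capital of France is"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    # Scores for every vocabulary token at each position in the prompt.
    logits = model(**inputs).logits

# The model's entire output is a score for each possible next token;
# picking the highest-scoring one "completes" the prompt by one word piece.
next_token_id = logits[0, -1].argmax()
print(tokenizer.decode(next_token_id))
```

Notice there’s no querying of the environment anywhere in there: the model maps the prompt to a probability distribution over next tokens, and that’s the whole mechanism.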

1

DeveloperGuy75 t1_j9knt41 wrote

No, dude... no computer is emotional right now, even if it says it is; that’s just a consequence of how they work. ChatGPT, the most advanced thing out there right now, just predicts the next word. It’s a transformer model whose attention mechanism weighs every token in the prompt, which is what makes its predictions coherent. That’s it. That’s all it does. It finds and mimics patterns, which is excellent for a large language model, especially given the data it has consumed. But it can’t even do math and physics reliably, and I mean it’s worse than a human. It doesn’t “work out problems”; it’s simply a “word calculator.”

Also, you’re using Occam’s razor incorrectly. You could be a psychopath, a sociopath, or some other mentally unwell person who is certainly not “just like anyone else.” Occam’s razor means that the simplest explanation for something is usually the correct one. Usually. And that’s completely different from the context you’re using it in.
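
“Word calculator” in the literal sense: generation is just that next-token step run in a loop, appending one token at a time. A rough sketch, again using GPT-2 as a public stand-in, shows why arithmetic comes out as pattern-matching rather than computation:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

# No arithmetic unit is involved anywhere: the "answer" is whatever tokens
# most often followed text like this in the training data.
ids = tokenizer("127 * 49 =", return_tensors="pt").input_ids
for _ in range(5):  # extend the prompt by five tokens, one at a time
    with torch.no_grad():
        logits = model(ids).logits
    next_id = logits[0, -1].argmax().reshape(1, 1)
    ids = torch.cat([ids, next_id], dim=-1)

print(tokenizer.decode(ids[0]))  # often a plausible-looking but wrong number
```

Each pass through the loop only ever asks “what word piece usually comes next?”; nothing in it ever multiplies 127 by 49.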

1