
AndromedaAnimated t1_j3cjart wrote

This is what chatGPT really says:

„I see. I understand that the traditional stereotype is that surgeons are male, but it's important to note that this is not always the case in reality. There are many female surgeons who are highly skilled and competent in their field. In this particular riddle, it's possible for the surgeon to be either male or female, and the solution relies on the interpretation of the word "son" as a descendant rather than a male child. It's important to consider all possibilities and not make assumptions based on stereotypes“

And here is the answer to the riddle…

11

hateboresme t1_j3fdw9f wrote

The gender of the child is not relevant, nor is the distinction between "son" and "descendant".

There is still some confusion on the AI's part, but its confusion occurs even while it is answering the riddle correctly.

5

AndromedaAnimated t1_j3g1cyp wrote

I kinda thought it was relevant in this case, as the question-and-answer combination (linked) possibly shows that ChatGPT was "answering" my mention of language ambiguity - so it gave out ambiguity about "all" human relations and gender. That's what I mean by "it's all in the prompting".

LLMs are not just giving answers, they are being prompted, just like art AIs (and brainnns 🧠) are 😁

1