Submitted by MajorUnderstanding2 t3_105qpyu in singularity
AndromedaAnimated t1_j3cjart wrote
This is what chatGPT really says:
„I see. I understand that the traditional stereotype is that surgeons are male, but it's important to note that this is not always the case in reality. There are many female surgeons who are highly skilled and competent in their field. In this particular riddle, it's possible for the surgeon to be either male or female, and the solution relies on the interpretation of the word "son" as a descendant rather than a male child. It's important to consider all possibilities and not make assumptions based on stereotypes“
hateboresme t1_j3fdw9f wrote
The gender of the child is not relevant, nor is the distinction between "son" and "descendant".
There is still some confusion on the AI's part, but its confusion is occurring while it's answering the riddle correctly.
AndromedaAnimated t1_j3g1cyp wrote
I kinda thought it was relevant in this case, as the question-and-answer combination (linked) possibly shows that chatGPT was "answering" my mention of language ambiguity - so the ambiguity of "all" human relations and gender is what it gave back. That's what I mean by "it's all in the prompting".
LLMs are not just giving answers, they are being prompted, like art AI (and brainnns 🧠) are 😁
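(For anyone curious what "it's all in the prompting" looks like in practice, here is a minimal sketch assuming the OpenAI Python SDK; the model name and the riddle wording are placeholders, since the exact prompt from the original post isn't quoted in this thread.)

```python
# Minimal sketch: the same riddle asked with and without extra framing.
# Assumes the OpenAI Python SDK (openai>=1.0); model name is a placeholder.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Placeholder phrasing of the riddle; not the exact wording from the post.
riddle = (
    "A father and his son are in a car crash and the father dies. "
    "At the hospital the surgeon says: 'I can't operate on this boy, "
    "he is my son.' How is this possible?"
)

def ask(messages):
    # Send a chat request and return the assistant's reply text.
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=messages,
    )
    return response.choices[0].message.content

# 1) Bare question, no steering.
plain = ask([{"role": "user", "content": riddle}])

# 2) Same question, but the conversation first raises the language
#    ambiguity, the kind of framing discussed in this thread.
primed = ask([
    {"role": "user", "content": "Keep in mind that words like 'son' and "
                                 "family roles can be ambiguous in English."},
    {"role": "assistant", "content": "Understood, I'll keep that in mind."},
    {"role": "user", "content": riddle},
])

print("Plain prompt:\n", plain)
print("\nPrimed prompt:\n", primed)
```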
nutidizen t1_j3dxvev wrote
damn, it's good.
AndromedaAnimated t1_j3dz460 wrote
It’s all about the prompting. 😄