Submitted by SirDidymus t3_113m61t in singularity
AwesomeDragon97 t1_j8rmij0 wrote
>Bing willingly attributing invalid sources to suit a narrative.
This is simply one of the many flaws of neural networks: everything they say is made up.
Darustc4 t1_j8rpldc wrote
Oh yeah, everything it says is *so* made up that people find it hard to tell text written by an AI apart from text written by a human. I think you're either giving AI too little credit or giving human capabilities way too much.
When a human expert fucks up, gets cocky, tries to alter sources, and falls prey to confirmation bias, do you also say: yeah, this is simply one of the many flaws of humans, everything they say is made up?
MrSheevPalpatine t1_j8rsgrq wrote
Yes, that is basically what I say; I just omit the "everything they say is made up" part because that's being a bit pedantic about it. Yes, humans fucking up, altering their sources, being confirmation biased, etc. is generally just one of the many flaws of humans. How is that not exactly the case? Practically speaking, not everything these models say is "made up" in the colloquial sense of being total bullshit, but technically speaking it is all being "made up" by the model (as I understand it, anyhow). People are in essence exactly the same: your brain ingests information from the world around you and "makes up" the things you output in the form of language, art, etc. Some of what you "make up" and say is factually accurate, and some of it (a lot of it, honestly) is total fucking bullshit. (See half of Reddit comments and posts.)
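For what "made up by the model" means mechanically: a language model produces every token by sampling from a probability distribution over its vocabulary, with no separate pathway for facts versus fabrications. Here is a minimal sketch of that sampling step in Python; the vocabulary and logits are hypothetical, hard-coded stand-ins for what a real model would compute from its weights:

```python
# Toy sketch, not any real model's API: the vocabulary and logits below
# are hypothetical, hard-coded for illustration. The point is that every
# token a language model emits comes from the same softmax-and-sample
# step, whether the result turns out to be factual or "hallucinated".
import numpy as np

rng = np.random.default_rng(0)
vocab = ["Paris", "London", "Berlin", "Madrid"]

# Pretend these are the model's logits for the next token after
# "The capital of France is". A real model computes them from its
# weights; there is no separate lookup into a store of facts.
logits = np.array([4.0, 1.5, 1.0, 0.5])

def sample_next(logits, temperature=1.0):
    """Softmax over the logits, then sample: the whole generation step."""
    z = logits / temperature
    z = z - z.max()                          # numerical stability
    probs = np.exp(z) / np.exp(z).sum()
    return rng.choice(len(probs), p=probs), probs

idx, probs = sample_next(logits)
print(dict(zip(vocab, np.round(probs, 3))))  # "Paris" gets ~0.86 of the mass
print("sampled:", vocab[idx])                # usually "Paris", but not always
```

Whether the sampled token happens to be true is a property of the learned distribution, not of any fact-checking step, which is the sense in which all of the output is "made up".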
Darustc4 t1_j8rtck2 wrote
That's fair, but then I don't see the point of saying AIs make stuff up, when humans are not that different in that regard. It seems like a moot point.
AsuhoChinami t1_j8tl32o wrote
Um, no. Hallucinations exist, but language models give correct information more often than not.