Submitted by Background-Loan681 t3_y6eth3 in singularity
Background-Loan681 OP t1_isp8sgf wrote
Reply to comment by 4e_65_6f in Is this imagination? by Background-Loan681
I'm not an expert or even that knowledgeable about this so... Can I ask you something?
What is the difference between how an AI would 'imagine something' and how a human would 'imagine something'?
I would assume that both draw on relations in the data they've gathered, and usually come up with a picture in their head that's pleasant to imagine.
So... to what degree are humans and AIs similar in imagining stuff? And what are the major differences between how the two imagine stuff?
(Sorry for asking too much, I'm just curious about this since I don't know much about how AI works)
redwins t1_ispu1al wrote
Does the human race strike you as the beacon of Reason? GPT3 is as reasonable as the best of humans, I would say. Imagination and Reason have always been overrated, or more precisely, we enjoy thinking too highly and pompously of ourselves. Isn't it just as exciting to think that the Universe is capable of producing us, with just a tad of luck in the soup of ingredients?
4e_65_6f t1_ispuk3v wrote
Well, it would be comparable to someone asking you to imagine something, and instead of doing it you formulate the text response most similar to what you'd expect from someone who actually did imagine it. I agree it's not an easy thing to distinguish.
tooold4urcrap t1_ispfdo4 wrote
I think when we imagine something, it can be original. It can be abstract. It can be random. It can be something that doesn't make sense. I don't think anything with these AIs (is AI even the right term? I'm guessing it's more of a search engine type thing) is like that. It's all whatever we've plugged into it. It can't 'imagine', it can only 'access what we've given it'.
Does that make sense? I'm pretty high.
AdditionalPizza t1_ispk568 wrote
>I'm guessing it's more of a search engine type thing
It isn't. It's fed training data, and then that data is removed; it literally learns from the training data. Much like when I say the word river, you don't just recall a river you saw in a Google image search. You most likely picture a generic river that could be different the next time someone says the word, or maybe a quick rough image of a river near your house that you've driven past over the years. Really examine what the first thing that pops into your head is. Do you think it's always the EXACT same? Do you think it's very detailed? The AI learned what a river is from datasets, and it recognizes a painting of a unique river the same way you and I do.
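To make that concrete, here's a toy sketch in Python (a simple bigram counter, nowhere near a real transformer, with a made-up corpus): we "train" on some text, delete the text, and the learned statistics are all that's left to generate from.

```python
import random
from collections import defaultdict

# Toy "training": count which word tends to follow which in a tiny corpus.
corpus = "the river runs to the sea and the sea meets the sky".split()
counts = defaultdict(lambda: defaultdict(int))
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

del corpus  # the training data is gone; only the learned counts remain

# Toy "generation": sample a plausible continuation from what was learned.
word, out = "the", ["the"]
for _ in range(6):
    followers = counts.get(word)
    if not followers:
        break
    word = random.choices(list(followers), weights=list(followers.values()))[0]
    out.append(word)
print(" ".join(out))
```

Run it a few times: you get a slightly different "river" sentence each time, even though the original text no longer exists anywhere in the program.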
>It can't 'imagine', it can only 'access what we've given it'.
This is exactly what the OP asked about. You say it can't imagine something, that it just has access to the data it was given. But how do humans work? If I tell you to imagine the colour "shlupange", you can't. You have no data on that. Again, I will stress: these transformer AIs have zero saved data in the way you're imagining, where they just look things up and combine it all into an answer. The model does not have access to its training data. So how do we say "well it can't imagine things, because it can't..."
...Can't what? I'm not saying they're conscious or have the ability to imagine. I'm saying nobody actually knows 100% how these AIs come to their conclusions beyond the fact that they pick the most probable answer, which appears similar to how human brains work when you really think about the basic process that happens in your head. Transformers are a black box at a crucial step in their "imagination" that isn't understood yet.
When you're reading this, you naturally follow along and understand the sentence. When I tell you something, you instantly know what I'm saying. But it isn't instant; it actually takes a fraction of a second to process. Can you describe what happens in that quick moment? When I say the word cat, what exactly happened in your brain? What about turtle? Or forest fire? Or aardvark? I bet the last one tripped you up for a second. Did you notice your brain try to search for what it might be? You had to try to recall your "training data", but you don't have access to it, so you probably ended up making up some weird animal in your head.
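For what it's worth, the one step of that process we can describe precisely is the very last one: the model scores every candidate next word and turns those scores into probabilities. A minimal sketch (the logits below are invented numbers, not from any real model):

```python
import math
import random

# Hypothetical scores ("logits") a model might assign to candidate next words
logits = {"river": 2.1, "stream": 0.4, "aardvark": -1.5}

# Softmax: turn raw scores into a probability distribution
exps = {w: math.exp(s) for w, s in logits.items()}
total = sum(exps.values())
probs = {w: e / total for w, e in exps.items()}
print(probs)  # "river" dominates; "aardvark" is unlikely but not impossible

# Either take the single most probable word...
print(max(probs, key=probs.get))
# ...or sample from the distribution, which is why the same prompt
# can produce a different answer each time
print(random.choices(list(probs), weights=list(probs.values()))[0])
```

Everything before that step - how the logits themselves get computed - is the black-box part.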
Altruistic_Yellow387 t1_ispjzwo wrote
But even humans need sources to imagine things (they extrapolate from things they already know and have seen; nothing is truly original)
visarga t1_isq5mvf wrote
I believe there is no substantial difference. Both the AI and the brain transform noise into some conditional output. AIs can be original in the way they recombine things - there's room for a bit of originality there - and humans can be pretty reliant on reusing other people's styles and concepts, so we're not as original as we like to imagine. Both humans and AIs are standing on the shoulders of giants. The intelligence was in the culture, not in the brain or the AI.
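A crude sketch of "transform noise into some conditional output" (toy dimensions and a random stand-in for learned weights, purely to show the shape of the idea):

```python
import numpy as np

rng = np.random.default_rng()

# Stand-in for weights learned from data/culture (random here, trained in reality)
W = rng.standard_normal((3, 8))

def generate(condition):
    """Toy conditional generator: noise plus a conditioning vector."""
    z = rng.standard_normal(4)          # the noise the process starts from
    x = np.concatenate([z, condition])  # condition the noise on a prompt/concept
    return W @ x                        # a learned transform shapes it into output

c = np.array([1.0, 0.0, 0.5, 0.2])  # the same "prompt" both times
print(generate(c))
print(generate(c))  # same condition, different noise -> a different "imagining"
```

Same conditioning, different noise, different output - which is about as close as a toy can get to "imagining the same thing twice".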