Submitted by RamaSchneider t3_10u9wyn in Futurology
StackOwOFlow t1_j7e87a1 wrote
Reply to comment by georgiedawn in What happens when the AI machine decides what you should know? by RamaSchneider
our brains are also giant probability calculators
gundam1945 t1_j7f1ais wrote
Yes and no. For instance, ML will map an event it doesn't know about onto an event it does know. In contrast, we can recognize that we don't know something. From there, you could try to draw an analogy and solve it, or invent some new theory to fit it. The machine lacks the intelligence to solve something genuinely new.
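To make that point concrete, here's a minimal sketch of the closed-world behaviour described above, using a toy linear classifier; the class names, weights, and input vector are made up for illustration, not taken from any real model.

```python
# A classifier trained on a fixed set of labels has no way to say "I don't know":
# every input, however novel, gets mapped onto one of the classes it already knows.
import numpy as np

KNOWN_CLASSES = ["cat", "dog", "car"]         # everything this toy model has ever seen
rng = np.random.default_rng(0)
W = rng.normal(size=(len(KNOWN_CLASSES), 4))  # made-up linear classifier weights
b = rng.normal(size=len(KNOWN_CLASSES))

def classify(features: np.ndarray) -> tuple[str, float]:
    """Return the predicted known class and its softmax 'confidence'."""
    logits = W @ features + b
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    idx = int(np.argmax(probs))
    return KNOWN_CLASSES[idx], float(probs[idx])

# Pretend these features describe something the model has never seen (a bicycle, say).
novel_input = np.array([0.9, -1.3, 2.1, 0.4])
label, confidence = classify(novel_input)
print(f"Predicted '{label}' with confidence {confidence:.2f}")
# The answer is always one of cat/dog/car; there is no built-in "none of the above"
# unless you explicitly bolt on open-set or out-of-distribution detection.
```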
orincoro t1_j7fs7fw wrote
I’ve seen the argument that truly creative cognition requires the biological executive function. Something has to instantiate the desire to create, and in our minds, this is driven ultimately by the need for survival and reproduction (and of course, the shadow function of a need for death).
gundam1945 t1_j7jjsie wrote
Yes, the ability to adapt. We still don't understand the exact mechanism of our creative thinking. Machine learning is modeled on how children learn. So without a model of creative thinking, I think computer scientists will find it difficult to come up with a truly creative AI.
orincoro t1_j7fs06k wrote
Not really. We don’t actually know exactly how cognition works, so it would be a little overzealous to analogize it with a machine. Whenever we do this, we tend to over-rely on such analogies. 20 years ago technologists were talking about how our brain has “apps.” 20 years before that, our brains had “RAM.” And so forth. We analogize to machines because we can understand machines, but this does not our brain a machine make.
scrubbless t1_j7uz7rh wrote
My brain is smaller than a computer, so joke's on you!
... Wait.. Hmm...