Submitted by timscarfe t3_yq06d5 in MachineLearning
Philience t1_ivoywj3 wrote
Reply to comment by timscarfe in [D] What does it mean for an AI to understand? (Chinese Room Argument) - MLST Video by timscarfe
What part do you think is false?
What do you mean by "simple"? Trying to predict each state of a neural network is almost impossible. Neural networks are usually seen as prime examples of complex cognitive models (and are not themselves computational models). Of course, the algorithm that emulates a neural network on a computer does perform computation. But the neural network does not multiply matrices; your computer does.
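To make the emulation point concrete, here is a minimal sketch (plain Python, with hypothetical weights and inputs) of what the computer actually does when it "runs" one network layer — the matrix multiplication is the computer's operation, performed to emulate the network:

```python
# Minimal sketch of emulating one neural-network layer in code.
# The weight matrix and input vector below are hypothetical values
# chosen purely for illustration.

def matvec(W, x):
    """The computer's operation: explicit matrix-vector multiplication."""
    return [sum(w_ij * x_j for w_ij, x_j in zip(row, x)) for row in W]

def relu(v):
    """Elementwise rectification, again carried out by the computer."""
    return [max(0.0, a) for a in v]

W = [[0.5, -1.0],
     [2.0, 0.25]]      # hypothetical 2x2 weight matrix
x = [1.0, 2.0]         # hypothetical input vector

# The emulation computes step by step; whether the emulated system
# itself "computes" is exactly the point under dispute in the thread.
y = relu(matvec(W, x))
print(y)  # [0.0, 2.5]
```

The point of the sketch is that every arithmetic step here belongs to the emulation, not to the system being emulated.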
I admit that whether they are chaotic or dynamical systems depends on the model and its scale. And most models might be neither.