Submitted by timscarfe t3_yq06d5 in MachineLearning
trutheality t1_ivqhe68 wrote
Reply to comment by waffles2go2 in [D] What does it mean for an AI to understand? (Chinese Room Argument) - MLST Video by timscarfe
Yes, the premise of the thought experiment is that the man doesn't understand Chinese while the system "appears" to understand Chinese. This is used to argue that because the man doesn't understand Chinese, the apparent "understanding" is not "real." What I'm saying is that the reason the man doesn't understand Chinese is that the level of abstraction he's conscious of is not the level at which understanding happens, and I'm asserting that this is not a convincing argument against the system as a whole "truly" understanding Chinese.