Submitted by timscarfe t3_yq06d5 in MachineLearning
waffles2go2 t1_ivq1w4s wrote
Reply to comment by trutheality in [D] What does it mean for an AI to understand? (Chinese Room Argument) - MLST Video by timscarfe
LOL, I think you've got it totally backwards.
The Chinese Room assumes that the person inside explicitly DOES NOT understand Chinese, but the "system" (the wall that hides this person, plus the rulebook he follows) behaves as if it does to the person feeding it input.
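As a minimal sketch of that setup (the rulebook entries here are hypothetical, just to illustrate the mechanics): the function below mechanically maps input symbols to output symbols with no model of what any of them mean, yet from the outside it "behaves as if" it understands.

```python
# Hypothetical rulebook: the 'person' executing this lookup has no
# idea what either side of each entry means.
RULEBOOK = {
    "你好吗?": "我很好, 谢谢。",   # "How are you?" -> "I'm fine, thanks."
    "你懂中文吗?": "当然懂。",     # "Do you understand Chinese?" -> "Of course."
}

def chinese_room(message: str) -> str:
    """Follow the rulebook mechanically, symbol in -> symbol out."""
    return RULEBOOK.get(message, "请再说一遍。")  # "Please say that again."

# To the questioner outside the wall, the room appears to understand:
print(chinese_room("你好吗?"))
```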
You further support the argument in your final statement...
Was that really what you intended to do?
trutheality t1_ivqhe68 wrote
Yes, the premise of the thought experiment is that the man doesn't understand Chinese but the system "appears" to understand Chinese. This is used to argue that because the man doesn't understand Chinese, the apparent "understanding" is not "real." What I'm saying is that the reason the man doesn't understand Chinese is that the level of abstraction he's conscious of is not the level at which understanding happens, so his lack of understanding is not a convincing argument against the system as a whole "truly" understanding Chinese.
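A rough analogy for that levels-of-abstraction point (my own sketch, not from the thread): the interpreter loop below plays the role of the man in the room. It only ever sees one opcode at a time and has no access to the fact that the program as a whole performs addition.

```python
def run(program):
    """Execute opcodes one at a time, with no view of their purpose."""
    stack = []
    for op, arg in program:
        if op == "PUSH":
            stack.append(arg)
        elif op == "ADD":
            stack.append(stack.pop() + stack.pop())
    return stack.pop()

# Any 'understanding' that this is addition lives at the program
# level, not inside the loop that executes each instruction.
print(run([("PUSH", 2), ("PUSH", 3), ("ADD", None)]))  # 5
```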