waebal t1_iye0yb0 wrote

I agree. Chalmers points out that consciousness doesn't require human-level intelligence and may be a much lower bar, especially if consciousness exists as a spectrum or along multiple dimensions. If you're willing to admit the possibility that there's something it's like to be a bat, or a dog, or a fish, then it seems plausible that there could be something it's like to be a large language model that genuinely understands language beyond a surface level. Chalmers seems to think we're getting close to that point, even if, e.g., LaMDA isn't quite there yet.