Submitted by SirDidymus t3_114ibv2 in singularity
jdawgeleven11 t1_j8zcvq8 wrote
If one agrees with Kant, time is just a construct of our perception, a necessary condition for experience rather than something one experiences directly.
Further, as another commenter has mentioned, we do not know what substrate and dynamics give rise to the internal representation of ourselves and the outside world that we call consciousness. We therefore cannot know whether any synthetically intelligent system would ever have a first-person subjective experience it could call time in the first place.
Does an AGI need a visual system? Does that system have to be sufficiently integrated with auditory and other sensory inputs, as well as with its intelligent manipulation of symbols, in order to experience anything at all? To be determined.
Also, as another commenter has said, you are anthropomorphizing these systems. We are bound to the drives that evolution has saddled us with; I doubt these systems will be burdened with emotions or suffering unless they are deliberately given those capacities. Why would we give these systems a sense of suffering? If all they know is language, knowing what the word "suffering" means within a web of meaning does nothing to make an AI actually experience suffering.