Submitted by flowday t3_10gxy2t in singularity
EVJoe t1_j572xpw wrote
Reply to comment by croto8 in AGI by 2024, the hard part is now done ? by flowday
Consider synesthesia, the phenomenon wherein a stimulus in one sensory channel (let's say hearing) activates a perception in another sensory channel (let's say vision).
Imagine you have synesthesia, and you're a pre-linguistic human surviving in the wild. You hear a tiger roar, and via synesthesia you also "see" bright red. Then a member of your tribe gets eaten by the tiger while the others flee.
For such a person, "seeing" red now has personal symbolic meaning associated with tiger danger.
Symbolism does not need to derive from culture or a larger system. All you need is the capacity to recognize patterns across stimuli. What that looks like for a "mind" that isn't limited to human sensation is another question entirely.
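As a minimal sketch of what that pattern-recognition claim could look like (purely illustrative Python; the CrossModalAssociator name and the threshold are my own inventions, not any real system):

    # Hebbian-style co-occurrence counting across sensory channels.
    from collections import defaultdict

    class CrossModalAssociator:
        def __init__(self):
            # (stimulus_a, stimulus_b) -> how often they co-occurred
            self.counts = defaultdict(int)

        def observe(self, stimuli):
            # Strengthen the link between every pair of stimuli that
            # arrive together, regardless of sensory channel.
            for a in stimuli:
                for b in stimuli:
                    if a != b:
                        self.counts[(a, b)] += 1

        def associated(self, stimulus, threshold=2):
            # Stimuli that co-occur often enough now carry symbolic
            # meaning for each other.
            return [b for (a, b), n in self.counts.items()
                    if a == stimulus and n >= threshold]

    assoc = CrossModalAssociator()
    # The synesthete's episode, repeated: roar + "seeing" red + danger.
    assoc.observe(["tiger_roar", "see_red", "danger"])
    assoc.observe(["tiger_roar", "see_red", "danger"])
    print(assoc.associated("see_red"))  # ['tiger_roar', 'danger']

No culture or larger system involved; the "symbol" falls out of raw co-occurrence.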
croto8 t1_j573xp6 wrote
Your example uses personal experience to create the symbolic representation and subsequent association. Kind of my point.
Edit: to elaborate: pattern recognition could produce a similar outcome by using training data in which this symbolic pattern is inherent, but without the personal experience, sense of risk, and context, it's just gradient descent on an objective function that was configured to emulate the process.
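A toy sketch of that contrast (illustrative Python; the data and the squared-error objective are assumptions of mine, not anyone's actual setup):

    import random

    # Training data in which the red->danger pattern is already
    # inherent: (saw_red, danger_followed). No experience, no risk.
    data = [(1.0, 1.0), (1.0, 1.0), (1.0, 1.0), (0.0, 0.0), (0.0, 0.0)]

    w, b = 0.0, 0.0  # model: predicted danger = w * saw_red + b
    lr = 0.1         # learning rate

    for _ in range(200):
        x, y = random.choice(data)
        pred = w * x + b
        err = pred - y        # gradient of 0.5 * (pred - y)**2
        w -= lr * err * x     # descend the configured objective
        b -= lr * err

    print(w, b)  # w ends up near 1.0: "red" now predicts "danger"

The learner lands on the same red/danger mapping, but only because the pattern was in the data and the objective was configured to reward reproducing it.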