Surur t1_j70yxaq wrote

I think it's very ironic that you talk about grounded, visceral experiences when much of what you are talking about is just concepts. Things like cells. Things like photons. Things like neural networks. Things like molecules and neurotransmitters.

You need to face the fact that much of the modern world, and your understanding of it, does not depend on anything you learnt as a baby when you learnt to walk; a lot of what you know lives in an abstract space, just like it does for a neural network.

I asked ChatGPT to summarise your argument:

> The author argues that artificial intelligence is unlikely to be achieved as intelligence is complex and inseparable from the physical processes that give rise to it. They believe that the current paradigm of AI, including large language models and neural networks, is flawed as it is based on a misunderstanding of the symbol grounding problem and lacks a base case for meaning. They argue that our minds are grounded in experience and understanding reality and that our brains are real physical objects undergoing complex biochemistry and interactions with the environment. The author suggests that perception and cognition are inseparable and there is no need for models in the brain.

As mentioned above, you have never experienced wavelengths or fusion - these are just word clouds in your head that you were taught through words, pictures and videos, a process which is well emulated by LLMs - so your argument that intelligence needs grounded perception is obviously flawed.

Structured symbolic thinking is something AI still lacks, much like many, many humans, but people are working on it.

1

ReExperienceUrSenses OP t1_j71wkhg wrote

I know we haven't experienced wavelengths. That's the word we came up with to describe the material phenomenon known as light, and to measure one aspect of that phenomenon, which we directly experience.

Those words decompose to actual physical phenomena. We use those words as a shortcut description to invoke an analogous experience. Molecules aren't balls and sticks, but that's the easiest way we can conceptualize the reality we have uncovered beyond our senses, to make it in any way understandable.

1

Surur t1_j71yd0f wrote

> Those words decompose to actual physical phenomena

In some cases, but in many cases not at all. And certainly not into ones you have experienced. Your argument is on very shaky ground.

1

ReExperienceUrSenses OP t1_j7203bu wrote

If a word doesn't decompose into physical phenomena, it is still analogized to, or understood in relation to, physical phenomena we have experienced.

If not, please expand, because I'd love to see counterexamples. It would give me more to think about. I'm not here to win; I WANT my argument deconstructed further so I know where to expand and continue researching, and what I missed or forgot to account for.

1

Surur t1_j720v0s wrote

I already made a long list.

Let's take cells. Cells are invisible to the naked eye, and humans only learnt about them in the 1600s.

Yet you have a very wide range of ideas about cells, none of which are connected to anything you can observe with your senses. Cells are a purely intellectual idea.

You may be able to draw up some metaphor, but it will be weak and non-explanatory.

You need to admit you can think of things without any connection to the physical world and physical experiences. Just like an AI.

2

ReExperienceUrSenses OP t1_j723uew wrote

But we CAN see cells. We made microscopes to see them, and electron microscopes to see some of the machinery they are made of, and we devised various other chemistry experiments to indirectly determine what they are made of. In the process, we expanded our corpus of experiences and the analogies we can make from those experiences. Why do you think we have labs in school where we recreate these experiments? Giving students direct experience with the subject helps them LEARN better.

Metaphors ARE weak and sometimes non-explanatory when we don't have an analogous experience to draw from. This is the difficulty we face in science right now: the world of the very small and the very large is out of our reach, and we have to make a lot of indirect assumptions that we back with other forms of evidence.

1

Surur t1_j72cx9j wrote

> But we CAN see cells. We made microscopes to see them.

That is far from the same. You have no visceral experience of cells. Your experience of cells is about the same as an LLM's.

> This is the difficulty we face in science right now, the world of the very small and the very large is out of our reach and we have to make a lot of indirect assumptions that we back with other forms of evidence.

Yes, exactly, which is where your theory breaks down.

The truth is we are actually pretty good at conceptualizing things we can not see or hear or touch. A visceral experience is not a prerequisite for intelligence.

> What I am trying to argue here is that “intelligence” is complex enough to be inseparable from the physical processes that give rise to it.

But I see you have also made another argument - that cells are very complex machines which are needed for real intelligence.

Can I ask what you consider intelligence to be? Because computers are super-human when it comes to describing a scene, reading handwriting, understanding the spoken word, playing strategy games, and a wide variety of other things which are considered intelligent. The only issue so far is bringing them all together, but that seems to be only a question of time.

1

ReExperienceUrSenses OP t1_j72ghoc wrote

Seeing them IS the visceral experience I'm talking about. We can even touch them, poke and prod them with things, and see what they do. We feed them and grow them. You brush their waste products off your teeth and spew their gases out of either end of your GI tract. All of this interaction, including the abstract thought about it (because thinking itself is cellular activity: neurons signaling each other to trigger the broader associations formed from the total chain of cellular activity those thoughts engaged), together forms the "visceral experience."

When I say visceral I don't mean the human gut, I mean the inside of the cells themselves. Nothing is purely abstract; there is molecular activity going on for every single thing. It is the dynamics of that activity that determine the intelligence, because those dynamics are what "ground" everything. How would you approach the symbol grounding problem? Every time we note these systems failing to reason properly, it comes back to that issue.

None of these systems are superhuman; read the actual papers that put out those claims and you will see it's a stretch. "Superhuman performance" is on specific BENCHMARKS only. For instance, none of the medical systems got anywhere (remember Watson?), and self-driving cars are proving to be way harder than we thought. They might as well be trains, given all the work needed to get them to function in actual dynamic driving situations. Games are not a good benchmark, because we created machine-readable representations of the state space and the rules for transitions between states, and they have a formal structure that can be broken down into algorithmic steps. The machines don't play the games like we do; we have to carefully engineer the game into a form the machine can act on.
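
To make that concrete, here is a rough sketch of my own of what "engineering the game into a form the machine can act on" looks like, using tic-tac-toe as a hypothetical example (the names `legal_moves`, `apply_move`, and `winner` are just illustrative, not from any particular system):

```python
# Minimal sketch: a game reduced to a machine-readable state space.
# The board is a tuple of 9 cells, moves are indices, and the transition
# rule is a pure function from (state, move) to the next state.
# Hypothetical illustration only; not taken from any real game-playing AI.

from typing import Tuple, List

State = Tuple[str, ...]          # 9 cells, each "X", "O", or " "
EMPTY_STATE: State = (" ",) * 9

def legal_moves(state: State) -> List[int]:
    """The machine 'perceives' the game only as indices it may write to."""
    return [i for i, cell in enumerate(state) if cell == " "]

def apply_move(state: State, move: int, player: str) -> State:
    """Transition rule: a formal step from one state to the next."""
    assert state[move] == " ", "illegal move"
    cells = list(state)
    cells[move] = player
    return tuple(cells)

def winner(state: State) -> str:
    """Terminal test over hand-coded winning lines; "" means no winner."""
    lines = [(0, 1, 2), (3, 4, 5), (6, 7, 8),
             (0, 3, 6), (1, 4, 7), (2, 5, 8),
             (0, 4, 8), (2, 4, 6)]
    for a, b, c in lines:
        if state[a] != " " and state[a] == state[b] == state[c]:
            return state[a]
    return ""

# Everything the "player" ever sees is this pre-digested structure:
s = apply_move(EMPTY_STATE, 4, "X")
print(legal_moves(s))   # [0, 1, 2, 3, 5, 6, 7, 8]
print(winner(s))        # "" (no winner yet)
```

All the grounding work (what a board is, what a move means, when the game ends) is done by the human encoding it, before the machine ever "plays."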

LLMs passing tests? Actually look at what "passing" means.

And please try to give me an abstract concept you think doesn't have any experiences tied to your understanding of it. I bet I can link many of the different experiences you use to create an analogy in order to understand it.

1

Surur t1_j72ink8 wrote

> Seeing them IS the visceral experience I'm talking about.

I thought you said adding vision wouldn't make a difference? Now seeing is a visceral experience?

> All of this interaction, including the abstract thoughts of it (because thinking itself is cellular activity, neurons are signaling each other to trigger broader associations formed from the total chain of cellular activity those thoughts engaged), together form the "visceral experience."

You are stretching very far now. So thinking is a visceral experience? So AI can now also have visceral experiences?

> "Superhuman performance" is on specific BENCHMARKS only.

The point of a benchmark is to measure things. I am not sure what you are implying. Are you saying it is not super-human in the real world? Who do you think reads the scrawled addresses on your envelopes?

> And please try to give me an abstract concept you think doesn't have any experiences tied to your understanding of it.

Everything you think you know about cells is just something you have been taught. Every single thing. DNA, cell division, the cytoskeleton, neurotransmitters, rods and cones, etc.

1