
StevenVincentOne t1_j6fjosp wrote

>I'd argue that the transformer architecture (the basis for large language models and image diffusion) is a form of general intelligence

It could be. A big part of the problem with the discussion is that most people equate "intelligence" with "sentience". An amoeba is an intelligent system, within the limits of its domain, though it has no awareness of itself or of that domain. A certain kind of intelligence is even at work in a chemical reaction. So intelligence, and even general intelligence, might not be as high a standard as most people think. Sentience, self-awareness, agency...these are the real benchmarks, and they will be difficult to achieve, maybe even impossible, with existing technologies. It's going to take environmental neuromorphic systems to get there, imho.

9

dmit0820 t1_j6g0pkd wrote

Some of that might not be too hard: self-awareness and agency can be represented as text. If you give ChatGPT a text adventure game, it can respond as though it has agency and self-awareness. It will tell you what it wants to do, how it wants to do it, explain its motivations, etc. Character.AI takes this to another level, where the AI bots actually "believe" they are those characters, and seem very aware and intelligent.

We could end up creating a system that acts sentient in every way, and even argues convincingly that it is, but isn't.

3

StevenVincentOne t1_j6g1ixu wrote

Sure. But I was talking about creating systems that actually are sentient and agentic, not just simulacra. Though one could discuss whether, for all practical purposes, it matters. If you can't tell the difference, does it really matter, as they used to say in Westworld.

5

Ok-Hunt-5902 t1_j6i5jkz wrote

>‘Sentience, self-awareness, agency’

Wouldn’t we be better off with a ‘general intelligence’ that was none of those things?

1

StevenVincentOne t1_j6ihhc3 wrote

It may be that we don't have to choose, or that we have no choice. There is probably an inherent tendency for systems to self-organize into general intelligence, then sentience, and beyond into supersentience. There's probably a tipping point at which "we" no longer call those shots, and also a tipping point at which "we" and the systems we gave rise to are no longer entirely distinguishable as separate phenomena. That's just evolution doing its thing. "We" should want to participate in that fully and agentically, not reactively.

1