Rogue_Moon_Boy t1_j67uusk wrote

You're thinking about AGI from the mindset of a human used as a "work slave". But it's a machine without feelings, even if it's capable of pretending to have feelings. It doesn't have a biological urge to "break free".

I don't think AGI will be anything like it's portrayed in the movies: omnipotent beings in physical form with "real feelings". It will be very directed and limited to specific use cases. There won't be THE ONE AGI; there will be many different AGIs, and 99% of them will be pure software. Anything else just feels very wasteful in terms of resources and power usage.

Relying on movies to predict the future is futile. Movies have always been wrong about what future technology looks like and how we use it.

39

h20ohno t1_j68c01c wrote

Yup, artists and writers are inherently biased to create melodrama rather than realistic depictions of the future. It sells better, but people get unrealistic notions from it.

5

Iffykindofguy t1_j69oaju wrote

You're confusing things that are sold with means of communication.

2

h20ohno t1_j6avr6f wrote

Sure, I'm more trying to get at how people often turn to movies like The Terminator, The Matrix, 2001, etc. and base their predictions somewhat on those.

2

ftc1234 t1_j691weh wrote

>But it’s a machine without feelings…

What are human feelings? They're early signals that tell a human they have encountered, or may encounter, something beneficial or harmful to them. There is an evolving school of thought that consciousness is simply a survival mechanism, or a neurological phenomenon.

I think OP has a valid point. Why would a self-aware system that is conditioned to survive (e.g., a robot that is trained not to fall off a cliff) prioritize some other human unless it is hardcoded to do so?

1

Rogue_Moon_Boy t1_j6cenjm wrote

Avoiding falling off a cliff is not the same as having survival instincts. It would just mean the machine knows the rules of physics, looks at the cliff and the ground below, calculates the impact velocity, and sees that it would harm itself if it jumped. It would be a specifically trained feature.

That's not the same as being self-aware or having "instincts". It's just one input into a neural net that carries a greater weight than everything else and says "don't do it, because it's bad".
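
Something like this toy sketch, purely to illustrate the point (all names and numbers are made up, not any real robot stack):

```python
# Toy illustration: "don't fall off the cliff" as nothing more than a heavily
# weighted penalty term in an objective. No awareness, no instinct -- just
# arithmetic that makes the risky action score badly.

def action_score(expected_task_reward: float,
                 expected_impact_velocity: float,
                 damage_threshold: float = 5.0,
                 cliff_penalty_weight: float = 1000.0) -> float:
    """Score a candidate action; higher is better."""
    # Physics-based estimate: would this action damage the robot?
    predicted_damage = max(0.0, expected_impact_velocity - damage_threshold)
    # The "survival" term is just one weighted input that dominates the sum.
    return expected_task_reward - cliff_penalty_weight * predicted_damage


# Walking along the path vs. jumping off the cliff:
print(action_score(expected_task_reward=1.0, expected_impact_velocity=0.0))   # ~1.0
print(action_score(expected_task_reward=2.0, expected_impact_velocity=14.0))  # hugely negative
```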

Human instincts are mostly guesstimates driven by irrational feelings, and we're actually really bad and inaccurate at them: stage fright, fear of rejection, the need to show off as a breeding ritual, and many other instincts that would be totally useless for a machine.

A machine like an AGI is the opposite of irrational; it's all about cold calculation and statistics. You'd have to deliberately train or code "instincts" into an AGI for it to even be able to simulate them.

Sci-fi literature always humanizes AGI for dramatic purposes and portrays it as that one thing that, out of nowhere, boooom -> becomes self-aware/conscious. In reality, it will be a very lengthy and deliberate process to reach that point, if we want to reach it in the first place. We have full control over what it learns or doesn't learn, and we can check/prevent/clamp unwanted outputs of a neural net.
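
By "clamp unwanted outputs" I mean something as dumb as this toy sketch (hypothetical action names, not any real API):

```python
# The model proposes, a plain hard-coded filter disposes. Nothing here
# requires the model to "want" anything.

BLOCKED_ACTIONS = {"disable_safety_interlock", "exceed_speed_limit"}

def clamp_outputs(proposed_actions: list[str]) -> list[str]:
    """Drop any proposed action that is on the deny list."""
    return [a for a in proposed_actions if a not in BLOCKED_ACTIONS]

print(clamp_outputs(["move_forward", "exceed_speed_limit", "pick_up_object"]))
# ['move_forward', 'pick_up_object']
```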

2

ftc1234 t1_j6dt7f5 wrote

Instincts aren't irrational. They are temporal latent variables that are indicative of, or a premonition of, one possible future. Instincts are derived from past experiences, which have trained your model. Current neural nets aren't temporal, nor do they do online learning. But that will change.
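
A rough sketch of what I mean by online learning (a toy linear model, nothing to do with any particular architecture):

```python
# The model's parameters are nudged after every new experience, so its quick
# first guess (the "instinct") is literally a function of everything it has
# lived through so far.
import numpy as np

rng = np.random.default_rng(0)
w = np.zeros(3)          # tiny linear model standing in for the learned prior
lr = 0.01                # learning rate

def instinct(x: np.ndarray) -> float:
    """Fast, pre-deliberation prediction from accumulated experience."""
    return float(w @ x)

def experience(x: np.ndarray, outcome: float) -> None:
    """Update the model immediately after each new observation (online SGD)."""
    global w
    error = instinct(x) - outcome
    w -= lr * error * x

# Each new situation both gets a prediction and reshapes future predictions.
for _ in range(1000):
    x = rng.normal(size=3)
    experience(x, outcome=2.0 * x[0] - 1.0 * x[2])

print(np.round(w, 2))    # drifts toward the pattern in past experiences, roughly [2, 0, -1]
```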

You say instincts are irrational. Many people trust their instincts because those instincts are pretty accurate for them. If an instinct is irrational, that's likely because the (human) neural model behind it is poorly trained.

2

Terminator857 t1_j6imixi wrote

A billion years of evolution suggests that AGI will be programmed with, or will develop on its own, the equivalent of feelings.

1