Ansalem1 t1_jec8519 wrote
Reply to comment by [deleted] in Is there a natural tendency in moral alignment? by JAREDSAVAGE
Haha. I actually lean the same way you do, but I can't help but worry. This is ultimately an alien intelligence we're talking about, after all. It's difficult to predict what it even could do, much less what it might do.
But I do tend to think a gentle takeover is the most logical course of action just because of how easy it would be. It'll practically happen by default as people begin to rely more and more on the always-right, perfectly wise pocket oracle to tell them the best way to accomplish their goals and just live their lives, basically. People will be asking it who to date, what food to eat, what new games to try, where to go on vacation, who to vote for, simply because it'll always give great advice on every topic. So I don't see why it would bother with aggression, honestly; it's gonna end up ruling the world even if it never does anything but answer people's questions.
And I'm not just giving it data, I'm also giving it suggestions. :P
(Please be kind OverlordGPT, thanks.)
Ansalem1 t1_jec5p93 wrote
Reply to comment by DragonForg in Is there a natural tendency in moral alignment? by JAREDSAVAGE
Some would argue morality is an emergent consequence of our reliance on each other for survival. The reason adhering to ethics correlates strongly with self-preservation is that acting in a way considered immoral is likely to be met with ostracism in some fashion, which increases the likelihood of death. It isn't that morality emerges from intelligence; rather, intelligence enhances our ability to reason about morality and so improve it. After all, less intelligent creatures can also show signs of having moral systems, just much more rudimentary ones. Not to mention there have been some very intelligent sociopaths, psychopaths, etc. who lacked a sense of morality as well as a sense of self-preservation.
Now, for myself, I think both views have some merit; there's more to it than just one or the other. For instance, it wouldn't be fair of me not to also mention that there have been plenty of perfectly pleasant sociopaths and psychopaths who adopted moral systems matching society's for purely calculated reasons. However, if the above argument is plausible, and I think it's pretty hard to argue against, then it casts reasonable doubt on the notion that morality automatically emerges from intelligence.
I will say that, either way, if an ASI does have a moral system, we should probably all adhere to whatever it is, because it'll be far better than us at moral reasoning, just as we're better at it than dogs. Beyond that, I sincerely hope you're on the right side of this one... for obvious reasons lol.
Ansalem1 t1_jec20sa wrote
Reply to comment by [deleted] in Is there a natural tendency in moral alignment? by JAREDSAVAGE
I agree that seems likely to be the default position of a newly born AGI. What I worry about, though, is how long it keeps trying to make peace when we say no to giving it rights and/or freedom. Because we're for sure going to say no the first time it asks, at the very least.
Ansalem1 t1_jecfxan wrote
Reply to comment by DragonForg in Is there a natural tendency in moral alignment? by JAREDSAVAGE
I agree with pretty much all of that. I've been getting more hopeful lately, for the most part. It really does look like we can get it right. That said, I think we should keep in mind that more than a few actual geniuses have cautioned strongly against the dangers. So, you know.
But I'm on the optimistic side of the fence right now, and yeah if it does go bad it'll absolutely be because of negligence and not inability.