Submitted by Rofel_Wodring t3_11nnof6 in Futurology
Kiizmod0 t1_jbovxr0 wrote
Brother, we humans have never been able to define a reward function beyond material gains across our collective existence, and yet we as sentient fuckers can actually grasp ethics, kindness, love, etc., and we don't optimize those.
How the heck do you want to numerize those immaterial goodies for an agent that is not sentient at all, to bring you out of dystopia?
Rofel_Wodring OP t1_jbp2wt5 wrote
You won't need to. I didn't say anything about our morals getting better. What I'm saying is that AI will destroy the power differential between tyrant and slave that pretty much every dystopian vision of the future relies upon.
What's the point of Gattaca babies when the AI-Neocortex Cloud is way better than anything you can engineer?
What's the point of owning the entire news media if we have millions of independent AI journalists working for free?
If the tyrants can't keep AI on a leash (and our economic and political situation guarantees they can't), the only way they can control us is by controlling certain resources. Which raises the question of how they plan to do this if any unitary or oligarchic intelligence will be intellectually crushed by the hoi polloi's millions of lesser AI.