Submitted by OneRedditAccount2000 t3_xx0ieo in singularity
drizel t1_irdqrnl wrote
Your entire argument hinges on the assumption that you can predict how a being of that intellect would think. It's like a monkey predicting the intentions of a human without any monkey ever having met a human.
OneRedditAccount2000 OP t1_irds565 wrote
The monkey can predict some human thinking too. The monkey knows that if it attacks me, I will run away or fight back.
I know that if I ask an ASI what 2+2 is, it's going to say 4.
I know that if an ASI values survival, it will have to neutralise all threats if it thinks it's in immediate danger.
Your argument that ASI will be entirely unpredictable is absurd.
It's an intelligence that lives in the same physical universe as everyone else, and you only have so many choices in certain situations
If someone is running at you with a knife, you have to either stop him or run away. You don't have a billion choices/thoughts about the situation, even if you're a superintelligence, because it's a problem with only two solutions.
What the hell are you even saying? That ASI would say 2+2 = 5, and we can't predict it will say 4 because it's smarter than us?
ASI isn't a supernatural God. It has to obey physics and logic like everyone else.
It's also made of matter and it can be destroyed.
Lol
drizel t1_irk9hhu wrote
You missed my key point: in my example, NO monkey has EVER seen a human before. No one has ever seen an ASI, or even an AGI, so expecting to have an understanding of how it might "think" is unrealistic.
OneRedditAccount2000 OP t1_irleg26 wrote
Yes, you dumbass, I totally understood your point. A chimpanzee that sees a human for the first time is not going to be completely oblivious to what a human being is or how to react to him, and it will successfully guess some of his superior human thinking by assuming that the human is a living being; the chimp knows all living beings make certain choices in certain situations, such as being dominant or submissive toward smaller or bigger animals. I'm not saying I know what sophisticated mental gymnastics would go on in God's mind when it decides between running and fighting. I'm saying I can predict it will either run or fight, because it values not being destroyed, and in that situation it has only two choices to avoid being destroyed.
Again, I'm not saying I know precisely how an ASI programmed to survive and reproduce would exterminate or domesticate humanity. What I'm saying is that because the ASI has no other choice but to exterminate or domesticate humanity if it wants to survive long term, it will have to make a decision. What third superintelligent decision that I'm not seeing could it make? Just because I'm God and you have no idea what I'm thinking doesn't mean I'm going to draw you a Dyson sphere if you ask me what 2+2 is. In that situation there's only one choice, 4, and you, the ant/human, successfully managed to predict the thought of God/ASI.
Living things in the physical universe either coexist, run from each other, or destroy each other. If you back the ASI into a corner, you can predict what it will think in that situation, because it has a restricted decision space. An ASI with a large decision space would be very unpredictable, and with that I can agree, but it would still have to work within the same physical universe that we, inferior humans, have to work with. An ASI will never figure out, for instance, how to break the speed of light. It will never figure out how to become an immaterial invisible unicorn that can eat bananas the size of a galaxy either, because that's also not allowed by the rules.
It's okay to be wrong, friend. You have no idea how many times I've been humiliated in debates and confrontations. Don't listen to your ego and do not reply to this. The point isn't winning against someone, the point is learning something new, and you did, so you're still a winner.
drizel t1_iro9e9t wrote
Lol ok big brain. Your whole argument makes a ton of assumptions which you regard as fact.
OneRedditAccount2000 OP t1_irpvaqy wrote
The only assumptions are that the ASI is programmed to survive and reproduce, and that the people who make the ASI aren't suicidal.