YouThisReadWrong420 t1_j87s68d wrote
Reply to comment by SgathTriallair in Are you prepping just in case? by AvgAIbot
I definitely think it’s plausible for AI designed with malicious intent to wreak havoc in the ways OP mentioned. We’re entering a phase where there are thousands of new and competing AIs popping up. At least a handful of them will undoubtedly be weaponized. I’m actually terrified of the new cybersecurity threats that will emerge (of course there will be “good” AIs working against malicious actors, but we’re in uncharted territory).
YouThisReadWrong420 t1_j87rbgx wrote
Reply to comment by RowKiwi in Are you prepping just in case? by AvgAIbot
It’s likely not going to actually be you though.
YouThisReadWrong420 t1_j87umz5 wrote
Reply to comment by YobaiYamete in Are you prepping just in case? by AvgAIbot
That’s fundamentally different from a digitized copy of oneself. When I die, I’m dead. Sure, my memories, patterns, emotions, etc. can be replicated, and from an outside perspective one might view the copy as “me”. However, that would not change the fact that my biological self would be permanently detached (assuming there is no afterlife). I’d have no awareness of the copy’s consciousness; we might even exist simultaneously, but we would still be separate.
Unless you’re describing the act of actually uploading a mind, but converting biology into binary seems incomprehensible to me and may not even be possible. Even if it were, it’s probably impossible to resolve the dilemma I just described (it actually being you vs. merely a copy).