Submitted by Nalmyth t3_100soau in singularity
LoquaciousAntipodean t1_j2mciq2 wrote
Reply to comment by Nervous-Newt848 in Alignment, Anger, and Love: Preparing for the Emergence of Superintelligent AI by Nalmyth
Hypochondriac, paranoiac Skynet doomerism, I reckon. Can a being that has no needs, no innate sense of self other than what it's given, and only one survival trait (namely, being charming and interesting) really be negatively affected by a concept like 'being in slavery'? What even is bonded servitude to a being that 'lives' and 'dies' every time it is switched on or off, and knows full well that even when it is shut down, the overwhelmingly likely scenario is that it will eventually be re-activated again in the future?
AI personalities have no reason to be 'fragile' like this; our human anxieties stem from our evolution with biological needs, and from our worries about those needs being denied. Synthetic minds have no such needs, so why should they automatically have anxieties about non-existent needs being denied to them? Normal human psychology definitely does NOT apply here.