Submitted by keghi11 t3_zx86z9 in Futurology
alcatrazcgp t1_j1yyrmn wrote
I don't find it realistic that you can "copy" your consciousness into a robot body and thereby get a new body. Instead, that would produce an exact copy of you in that robot body while the original host basically dies.
so the robot isn't you, it's just a perfect copy of you, which isn't the same.
----Zenith---- t1_j1z05p3 wrote
This. I’ve tried to explain it so many times but some people just don’t get it. It wouldn’t be you in any way, shape, or form. Just an exact replica. You’d be dead. Something else that’s not you would be alive.
alcatrazcgp t1_j1z0scs wrote
the best way to achieve "immortality" is keeping yourself alive, or at the very least transplanting your entire brain, which in this case is "you".
no brain = no you
you = specific signals in the brain
you can't just move that in code form onto a machine. Thankfully I'm not the only one who realized this.
FrmrPresJamesTaylor t1_j20lcd9 wrote
Honestly I would love to see a bunch of billionaires and influential technophiles essentially sign up for their own deaths in this manner. If someone thinks this technology is desirable or even possible, they can go first.
alcatrazcgp t1_j20lwm8 wrote
that actually seems like the worst "shortcut" to immortality. Once these billionaires get it, who is to say you can't just turn off their copies? It's just lines of code, not a living human, just a copy of them.
now if you took the route of biology and bioengineering and prolonged your existing life in many different ways, that's a whole different story. That's true immortality.
thisimpetus t1_j24pn4w wrote
> you can't just move that into code form onto a machine
That's an absolutely enormous claim, and I will be utterly shocked if you can truly defend it. That's not meant to insult you, but to suggest that you might vastly underestimate the scale of that claim. It is absolutely not something that can be taken as obvious.
alcatrazcgp t1_j24qnyh wrote
No, I do not underestimate it. I truly think it's impossible, at least for a very, very long time. While you can copy it, you can't MOVE it. Moving it would mean you somehow, some way, transform my brain into code without killing me in the process, and then put that into a machine, again, without killing me.
you can easily scan the brain and its signals and translate that into code and input that into a machine, yes, but you can't move the brain and "me" into that machine; you can only input a copy of me into it. Hope that makes sense.
thisimpetus t1_j24roa0 wrote
Well "moving it" isn't a meaningful thing to say, there is nothing to move, structure and data aren't material things. You're literally constantly changing, there is nothing static about you. You are the information in motion; where it is and by what means it moves doesn't mean anything. Copying a PDF doesn't physically relocate parts of your drive to another location, it represents that information identically somewhere else. So too your consciousness; just as reading the same song from different devices changes absolutely nothing about the song—and just as a song has to be happening in time to actually be the song—what makes you you is the data structure connected in real time to the environment, not the medium.
alcatrazcgp t1_j24uot7 wrote
no, your consciousness is not the same as digital data. You cannot have 2 copies at the same time; you can only control one. You cannot control 2 different "you"s in different places. That's not how it works.
thisimpetus t1_j24vor3 wrote
Well, I'm no expert in this field, but I do have a little academic training in it, and I'll tell you that these claims you're making are very, very big claims that a great many PhDs have debated. I think if you're really interested in this subject you might consider getting into some of the reading.
Because the thing is, I don't think you'll find much agreement with your position at the top of the game, but that's because these are really, really hard questions and our intuitions about them tend to be really bad. That makes a lot of sense; we certainly can't expect ourselves to have an evolved understanding of these ideas. But all the same, if you're really interested, there are some fundamental ideas that you're challenging, and I'd wager you might reconsider some of them if you got some exposure to some rigorous investigation of them. It's very interesting stuff; I know my thinking was forever changed by it. Daniel Dennett is a great place to start because his writing is enjoyable in addition to being top-shelf cognitive philosophy.
Best.
Stainless_Heart t1_j23n9hs wrote
Tell me, exactly which signals are constant, uninterrupted, and represent you as a person?
alcatrazcgp t1_j23nqcv wrote
all of them
Stainless_Heart t1_j23nzah wrote
Are they all permanent signals? Or do they come and go, regenerated when needed?
If the latter, you’re constantly dying in little bits and being recreated in little bits.
If the former, if all your brain signals were always happening without cessation… you’d be insane or at least in full seizure.
thisimpetus t1_j24p36p wrote
No cell in the body you typed that with was with you when you were born, and no cell you were born with is with you still. You've replaced them all.
So, you've already moved your entire consciousness from one medium to another, you just did it piecemeal and without a disruption in function.
Now, if you fall in a frozen lake and, after being technically dead for a few minutes, are revived, I'll wager you still think that's you.
So if we can find practical examples of both disrupted function and transference to another medium, we'd have to suppose that doing both all at once is what makes the difference. I don't see that at all.
You are not your body, you are just that pattern of data dancing about. So long as it dances, it's you. If there were two of you for a moment, or a thousand, they'd all basically immediately start being someone else because the dances would begin to be different. But this idea that there is an authentic you of which copies are something else really doesn't hold up under scrutiny unless you believe in a soul.
CadmusMaximus t1_j20a7e4 wrote
Exactly. Though it's not QUITE as easy.
Theoretically the copy would think it's "you" also.
So you essentially have a 50/50 chance of waking up as the robot or as the poor sap who's still mortal or (worst case) now dead.
Of all things, the movie "Multiplicity" deals with this pretty well.
Same with that Black Mirror episode with Jon Hamm.
So the real question you have to ask yourself:
"Do you feel lucky?"
----Zenith---- t1_j20alc0 wrote
Well no, you’d have a 100% chance of not waking up at all. But yes, the copy would think it’s you and would not be able to tell the difference unless it had already assumed what we are saying here before they copied themselves.
CadmusMaximus t1_j20b48e wrote
Not necessarily. What's to say that you're not experiencing the robot's "memories" right now?
Like your whole life is (for lack of a better way of describing it) building up to being the consciousness that "lives on" in the robot?
If that's the case, you'd think you "got lucky" and woke up as the robot.
There still would 100% be a poor sap that was left as a mere mortal /dead.
In that case, it absolutely is 50/50 you "end up" as the robot or mortal / dead.
----Zenith---- t1_j20gj02 wrote
If I were the bot and thought I was real I’d still not be the original me.
The original me would be dead. Then I’m just a copy who doesn’t know it’s a copy, but is one.
There is no 50/50 chance of anything. 100% chance that the original dies and the copy is created.
Spursfan14 t1_j21xj1g wrote
What makes you you then? Why are you the same person you were 10 years ago and why does that exclude the copy?
----Zenith---- t1_j2210xr wrote
Well, if you want to view it that way, then none of us are really “alive” anyway, just code or an algorithm.
Spursfan14 t1_j21xefe wrote
If I took your biological brain and put it in another person’s body, such that you had exactly the same memories, personality, likes and dislikes etc, would that still be you?
If I rearranged another person’s brain such that it had exactly your current state (ie same memories, personality as above) and in the process killed your original body, would that still be you?
At what point does it stop being you?
PeakFuckingValue t1_j208cc5 wrote
Nah. If you do it a certain way I believe you can remain you. It would not be about copying the signals though. It should be about merging with AI at first. Augmented you. Evolved you. Almost like bringing the internet into you vs the other way around.
AbstractReason t1_j2056tl wrote
I think the solution to the copy vs original you problem is to replace certain parts of the brain over time so there is ‘continuity’ of individual consciousness through biological and artificial processes running in tandem as the process takes place. You’re just replacing ‘parts’ rather than doing a one shot transfer.
alcatrazcgp t1_j20dmq3 wrote
which is more realistic, considering every 16 months or so (I don't remember the exact number) every atom in your body will be different, meaning you are a whole new person compared to who you were 16 months ago.
the philosophy of the Ship of Theseus: is it still the same ship if you replace every part?
Seems to be the same; clearly we think so.
ChiggenNuggy t1_j21i9je wrote
I believe your brain cells and some of your other cells are with you forever.
Calfredie01 t1_j21kscf wrote
I’m gonna be real with you here. I think you might be mistaken with that one. Just thinking about it for a few seconds raises lots of questions such as where do those old atoms go, why are atoms breaking their bonds where they don’t need to, why wouldn’t we have completely new memories and everything, what about cells that stay with you your whole life.
The list just goes on
alcatrazcgp t1_j21lx5b wrote
you shed skin, where does it go?
imagine your shed skin as your previous atoms. You are constantly changing, regenerating, healing, being damaged, and so on; that's how the body works. A human body sheds like 1000 skin cells per minute or something along those lines, I don't remember the exact number.
zeldarus t1_j23bva9 wrote
Skin is designed to be replaced at a constant rate as the outermost defensive layer. Most of the tissues in your internal organs and especially the cerebrum most certainly are not designed to "shed".
Spursfan14 t1_j21x3ou wrote
>the philosophy of Ship of theseus, is it still the same ship if you replace every part?
>Seems to be the same, clearly we think so
And what if I secretly take every original part and reconstruct the ship? Which is the original then?
adarkuccio t1_j21ry0u wrote
This makes sense, probably... unless you just lose consciousness and die slowly. I don't think we know enough about the human brain and consciousness to even start imagining that kind of futuristic sci-fi tech.
ExoHop t1_j2d54pc wrote
continuity, as in sleeping?
ChlorineDaydream t1_j1zklck wrote
The game Soma (and its ending) perfectly explains this at the end of the game, albeit with a twist, but the same idea.
v3rtanis t1_j24hqrm wrote
God the existential dread I got from those parts...
Docpot13 t1_j1zxaci wrote
You would have to be far more specific about the term “you” to have a fair discussion about this topic. I am reminded of the comedian Steven Wright's joke that someone broke into his house and replaced everything with an exact duplicate. He couldn’t believe it: ‘everything was exactly the same.’
anengineerandacat t1_j20ck9g wrote
It really depends on what the desired "outcome" is.
Do you want to keep your lover alive? Then copying might just be the thing to do this for you.
Do you want to keep yourself alive? Copying will be where your life ends and your copy's life begins. Your own consciousness is now lost; you likely can't copy that, as there is no connection from the old to the new.
You can likely cyberize bits and pieces, but the cerebral cortex, which is basically the bulk of the brain, is thought to control your consciousness.
Frontal lobe:
- Decision-making, problem-solving.
- Conscious thought.
- Attention.
- Emotional and behavioral control.
- Speech production.
- Personality.
- Intelligence.
- Body movement.
Occipital lobe:
- Visual processing and interpretation.
- Visual data collection regarding color, motion and orientation.
- Object and facial recognition.
- Depth and distance perception.
- Visual world mapping.
Parietal lobe:
- Sensory information processing.
- Spatial processing and spatial manipulation.
Temporal lobe:
- Language comprehension, speech formation, learning.
- Memory.
- Hearing.
- Nonverbal interpretation.
- Sound-to-visual image conversion.
You could perhaps replace everything but the Frontal Lobe with digital versions, but you would likely need "tuning" or some bridged interface to translate "your" signals into the standardized inputs and then convert the standardized outputs back into "your" inputs (rough sketch below).
It's hard to really say if this still makes the individual the same... a simple stroke is enough to completely change a person... this is way more destructive than a stroke.
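To illustrate that bridged-interface idea, here's a purely hypothetical Python sketch: every name in it (SignalBridge, calibrate, the channel label) is invented for illustration and is not any real neuroprosthetics API. The bridge fits a simple per-person mapping from an individual's native signals to a standardized digital-lobe input, plus the inverse mapping on the way back.

```python
from typing import Callable, Dict

class SignalBridge:
    """Hypothetical per-individual translation layer (the "tuning")."""

    def __init__(self) -> None:
        self.to_standard: Dict[str, Callable[[float], float]] = {}
        self.to_native: Dict[str, Callable[[float], float]] = {}

    def calibrate(self, channel: str, gain: float, offset: float) -> None:
        # Fit a simple linear transform for this individual's channel.
        self.to_standard[channel] = lambda v: (v - offset) / gain
        self.to_native[channel] = lambda v: v * gain + offset

    def outbound(self, channel: str, native_value: float) -> float:
        # "Your" signal -> standardized input for the digital lobe.
        return self.to_standard[channel](native_value)

    def inbound(self, channel: str, standard_value: float) -> float:
        # Standardized output of the digital lobe -> "your" input.
        return self.to_native[channel](standard_value)

bridge = SignalBridge()
bridge.calibrate("visual_motion", gain=2.0, offset=1.0)
# Round trip: native -> standardized -> native recovers the original signal.
assert bridge.inbound("visual_motion", bridge.outbound("visual_motion", 3.0)) == 3.0
```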
Techutante t1_j20jrse wrote
I guess it depends on if "acting the same" or "being the same person" are, well, the same thing.
We all act differently every day. If you wake up on a Tuesday and hear that a close family member has died, you're definitely not the same person you were on Monday.
Docpot13 t1_j20ejbp wrote
Not sure I am following you here. I feel as if you are implying some “ghost in the machine” as if being able to perfectly replicate what the brain would do in response to specific information isn’t sufficient to recreate the self.
anengineerandacat t1_j20ocdr wrote
Generally speaking, yes, which is why I said it really depends on what outcome you seek.
If it's the preservation of the individual's ability to share ideas and knowledge, then we could likely clone that individual's state and continue to utilize them in society.
To put it very simply, you are asking to effectively create two batteries and want them to store the same exact electrons; it's just not possible.
Docpot13 t1_j20quw7 wrote
You and I appear to have different understandings of the brain and the nervous system. You speak of consciousness and memories as things in the brain. I view them as products of neural activity. The self is the functioning of the brain. If you can recreate all of the circuits and how they interact, you have recreated the self.
anengineerandacat t1_j21137c wrote
I view them as neural activity too.
The newer you would have its own consciousness and be free to make its own decisions, and to be honest would already likely be pretty divergent because of the procedure alone.
Your consciousness is unique to your very specific brain; it's an activity arising from all the impulses that fire, not something you can capture just by mapping that region and copying it.
If you asked both beings a very complex psychological question, you would likely get slightly different answers; the words used, the makeup of the sentence, perhaps even the tone.
This is why I stated it's really dependent on what the desired outcome is... if you wanted to become immortal to continue living "your" life then brain copying isn't the way.
If the idea is to preserve yourself for others, then yes, it's likely a valid-ish strategy; your clone's self would have all the worldly experiences you did and even more (as you yourself won't know what it's like to be a clone).
This is a topic that's both scientific and spiritual to some extent, and it's not exactly easy for me to articulate what is being lost, but I hope this made it slightly clearer?
Docpot13 t1_j212zrn wrote
Not sure there is any evidence to support what you suggest to be true. Sounds more like a desire to believe there is more to “being” than just basic biology, which is natural, but not supported by evidence.
anengineerandacat t1_j21a64j wrote
Inner voice fMRI: https://www.frontiersin.org/articles/10.3389/fpsyg.2019.02019/full
Consciousness can't be artificially stimulated: https://www.sciencedirect.com/science/article/abs/pii/S0361923009003657?casa_token=71CCXG8979oAAAAA:B9dF0u65Zs-S2PVeN2gg_Ik4thZ56PP6Qtuglt7L5fanVKRBcPw4CQmqXx7BBb-6iHZPJQO54w
The thing is that your "inner voice" is a brain activity, not something that is hard-wired biologically but instead something triggered by outside stimuli.
Consciousness can even be triggered in individuals who are in comas:
https://www.nature.com/articles/d41586-019-02207-1
Very little research in this space, sadly; it's all theory and conjecture, which, I mean, the entire conversation is, considering we have no means to verify any of what is presented.
Hell, some individuals can be missing half of their brain and still live very normal lives: https://www.bbc.com/future/article/20141216-can-you-live-with-half-a-brain
In short, you can copy all the neurons / receptors / chemical makeup all you want but the activity of consciousness and their inner voice is unique to the individual.
You as /u/Docpot13 would cease to exist; only your clone would remain, and whereas it might communicate in a very similar fashion for some time, the "you" that went through the procedure is long gone.
Starting to think you might be one of the 7% that doesn't have a typical inner voice lol.
Docpot13 t1_j21i5qc wrote
I definitely don’t agree with the idea that an internal monologue is consciousness. This would make the existence of consciousness dependent on the ability to use language. And you are correct, I am one of those 7%.
anengineerandacat t1_j21ig1w wrote
Curious then, since I have never really met someone like that... how do you process situations? Like when you read a book, what is going on in your head? Do you even like reading books? Are they engaging to you?
Docpot13 t1_j21j93i wrote
I read all the time. It’s a form of communicating information which is as useful to me as anyone else. What is puzzling to me is why someone needs to talk to themselves. Who is talking to whom? If you are truly talking to yourself don’t you already know everything you are putting into words and now just making thought more complicated by trying to represent with words things which may not be well captured by language? What’s the point of telling yourself something? In order to communicate it you already had to understand it so why then mentally speak it? Bizarre.
anengineerandacat t1_j21u0r6 wrote
I am communicating with myself; when I read, I basically verbalize in my head what I am reading (and what I am typing). It mostly sounds like my physical voice, but sometimes it could be another's voice depending on the context and situation.
As far as to "whom", it's like talking into a room with an empty audience; sometimes I can visualize an audience to talk to and make up things / situations for them to say, but most often it's just me.
I am genuinely curious how you actually plan ahead without having an inner voice. Do you just "talk" to people without verbalizing it internally?
As to "why" I can't explain, it's been there since as long as I can remember... perhaps the voice has gotten louder over the years as I have learned to do "more" with it; I work as a Software Engineer day-to-day so most of my day is spent building mental structures and models of applications in my head and walking through where I'll do certain things next or even talking with my inner voice about said things in a form of rubber duck debugging.
Even this post, and your post are basically read back aloud and if I knew what you sounded like I would likely read the post back in your actual voice.
Without my inner voice... I don't think I would feel like I exist as a person, the bones / muscle / flesh surrounding my body are just what give me mobility but that "voice" is "me".
Which is perhaps why I say that even if you could clone the brain, since you can't clone my inner voice, "I" will cease to exist. To my friends and family I might still exist, but it'll be a different "me".
Stainless_Heart t1_j21rbkb wrote
Heinlein explores this in one of the Lazarus Long novels (might be Time Enough for Love, which deals with similar concepts) when the character is permanently leaving the planet and his AI assistant, ostensibly built into the computing power of his office, decides to leave with him. Since doing so requires copying into a new mobile computer on the ship, Long points out that it will be a copy and the original identity will be left behind, or erased/die in the process. Asked whether that philosophical point worries the AI, it replies something like “I just did it back and forth six times while you were talking.”
The point being that the human concept of self/identity through a continuity of being may be flawed; that human consciousness is not continuous, it is always just momentary but in possession of memories. Much like Blade Runner's replicants, an identity feels real, feels like itself, only because memories provide the proof needed, regardless of their truth or artificiality.
Does our “self” die with every passing moment, replaced by another self-generating one that carries along that big box of memories? Do we cease to exist when losing consciousness and a proximate version is born again upon waking? Personally, I think so. I feel the value of “me” in the memories I’ve accumulated, the knowledge gained, the ways of thinking that have developed, all the skills that I can exercise whether it’s the ingrained way to hold a fork or the vision to build a complex CAD structure.
So would all of these things combined, the memories and the thought structures, be me if they were copied into a robot body? Yes. I believe that robot would be me because it would think it’s me, remember things I’ve done, and do new things using my old mental skills. It would continue on as my flesh body does, learning new skills and accumulating new memories. For any particular time that it exists, it is me then.
Let’s make it more interesting: if all of my brain stuff were copied into a robot body and my flesh body remained alive, there would then be two of me. At least for a moment, that is. As soon as RobotMe starts storing memories that I don’t have, even if it’s looking the other direction across the table to where FleshMe is looking back at it, that’s enough. Now it’s a new self, developing new thoughts. It started as me and would become an alternate version of me. FleshMe might technically be the original version (insofar as we ignore that cellular reproduction has replaced every bit of an older me from a younger age), but being original doesn’t lessen the individuality of the copy. Two of me, common basis, becoming unique selves with every passing moment.
To view it another way: identity is a data-based illusion, and no more or less valid because this-you remembering isn’t the that-you who generated the experience.
w0mbatina t1_j1zui85 wrote
Depends on how you look at it. If you copy a file from one computer to another without altering it, is it still the same file? If you move your consciousness in the same way, why wouldn't it be the "same", just like the file is?
polar_pilot t1_j1zzy89 wrote
When you transfer a file from your hard drive to a USB, the computer copies it onto the USB and then either keeps or deletes the original on the hard drive... which is what we're talking about here. So no, it wouldn't be the same to you, just to an outside observer.
alcatrazcgp t1_j1zwlq1 wrote
if you "Copy" a file? the definition of copy is the answer there, its not moving it, its copying it
w0mbatina t1_j2b9ei5 wrote
Afaik the only difference between copying and moving a file is deleting the "original" afterwards.
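A rough Python sketch of that (file names invented; shutil.move genuinely falls back to copy-then-delete when the destination is on a different filesystem):

```python
import shutil
from pathlib import Path

# A "move" across filesystems really is copy-then-delete.
def move_by_copying(src: Path, dst: Path) -> None:
    shutil.copy2(src, dst)  # 1. make a byte-identical copy
    src.unlink()            # 2. delete the "original"

src = Path("original.txt")
src.write_text("which one of these is the real file?")
move_by_copying(src, Path("moved.txt"))
assert not src.exists() and Path("moved.txt").exists()
```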
alcatrazcgp t1_j2b9kct wrote
right, so would you accept dying so your clone can continue living as you?
I wouldn't, I'd rather continue living instead. Isn't that the whole fucking point of this?
MobiusFlip t1_j1zzrhm wrote
I think the best solution for this is something more like augmentation. If it's possible to run a mind on a computer, you can likely connect an organic brain to a computer and use it as additional memory storage and processing power. Do that, make sure all your memories are copied over, and then deactivate your organic brain. You would maintain consciousness through the process, so it would really be your consciousness in that computer, not just a copy.
moldymoosegoose t1_j1zwm5u wrote
You would hook up your brain to the fake brain and transfer very small parts over time, and suddenly you're entirely copied over; you couldn't even pinpoint when the switch actually happened, and you wind up with a single copy.
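A toy Python model of that gradual transfer (every name invented for illustration): state moves one small piece at a time, and at every step the combined system holds exactly one complete copy, so there is never a moment when two whole copies exist:

```python
# Hypothetical substrates: 8 "regions" of state migrating piecemeal.
organic = {f"region_{i}": f"state_{i}" for i in range(8)}
synthetic: dict = {}

while organic:
    region, state = organic.popitem()  # deactivate one small piece...
    synthetic[region] = state          # ...as its replacement takes over

    # Invariant: exactly one complete, unduplicated copy exists overall.
    assert len(organic) + len(synthetic) == 8

print(sorted(synthetic))  # the whole state now runs on the new substrate
```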
Villad_rock t1_j20ejsj wrote
What if everyday you wake up it’s just a copy of you? Would you care?
alcatrazcgp t1_j20gafb wrote
specify: am I a copy of me waking up, or is a copy of me waking up? Which one?
if it's the first, I don't care; if it's the second, that's not me.
Me is what gives me the will to do whatever I want, see through my own eyes and take actions with my own body. Someone else in a different body is not me.
sidv81 t1_j216rt6 wrote
Agreed. Heading even deeper into sci-fi speculation, it's unclear if the Star Trek characters stepping out after transporter use are indeed the originals in any way that matters, and not some biological constructs with the memories and personalities of Kirk, Picard, etc., while the original Kirk and Picard were killed the moment they used the transporter for the first time.
alcatrazcgp t1_j2198nb wrote
correct, you can't simply "teleport" without literally dying. You are deconstructed on an atomic level, then reconstructed again; who is to say the same exact atoms were used? Even if they were, you are already gone. You are just reconstructed the way you were, but who is to say it's actually you? Maybe you are dead, and that's just the perfect copy of you.
all in all, if your brain dies, you die with it.
megalotusman t1_j21bhr4 wrote
I think for anything other than what you are saying to be true, there would have to be some measurable essence making a person a person, a soul, that could only exist in one body at a time and that in and of itself could not be copied no matter the advancements in technology. Meaning that a clone could be made, but it would not have life unless the essence allowing it to run was taken from the original, which would cause the original to die.
Essentially, magic.
That is the only way, I think, you could say a clone is not a clone.
But what you're really doing is kicking the can down the road and saying the soul is the person not the body they inhabit.
m0estash t1_j21n0ri wrote
80-100 days ago your body was made of entirely different cells from the ones that make up your body this instant. Were you a different copy of who you are now? Yes! Was it someone else, or you?
alcatrazcgp t1_j21n6vp wrote
ship of theseus
FinancialCurrent3371 t1_j21n3ae wrote
Think of it as crystals that take your consciousness and place it in a game like Tron. Most people would think Sword Art Online or Code Lyoko, but it would technically be inserted into a game or a computer function.
alcatrazcgp t1_j21ni5x wrote
explain how you move and convert my brain signals from organic meat into crystals or digital form.
do you copy the signals, or do you literally somehow move them? Moving them would mean they are no longer in the brain; copying would mean it's just a clone, not me.
simulating myself in a virtual world would just be stimulating the brain and simulating the effect within the game, similar to VR and how it tracks movement, motion tracking, but instead of motion tracking it's brain tracking.
FinancialCurrent3371 t1_j21o5hh wrote
It aligns more with being brain dead. No signals in the brain for life, with the motor functions still movable. The body stays as the consciousness is taken from the brain in every cell. The meat you basically possess would be grey, not pink, and not allow an electrical charge. The crystal is a mirror image of what the charge would be. More LIGHT than electricity. Imagine it as being stuck in a mirror.
nitrohigito t1_j22ro3w wrote
If you lose consciousness before the copy, perceptually you'll receive a new body. The original won't be aware of it dying.
Enjoyitbeforeitsover t1_j22xieu wrote
Exactly, unless perhaps there's some biological connection where you kind of start disabling the host while at the same time you start yapping away on the robot side, like a slow and steady upload but via an organic connection. Think Avatar at the end lol
thisimpetus t1_j24o1d6 wrote
Well, first of all
> it's a perfect copy of you...which isn't the same
I mean that's simply incoherent. That's what perfect means. The only way those two things aren't identical is if you subscribe to religion and the concept of a soul.
As for "realistic", consciousness just is the operation of the brain. If you are able to flawlessly replicate that function, then the subsequent consciousness is, again, identical.
There is lots of room there for deniability; a perfect copy might be impossible or else so tremendously difficult that we don't find it useful—it may also be relatively easy—but unless you can point to why such a copy of you isn't identical to you, I suggest that you consider the possibility that you simply have an emotional resistance to the idea that you aren't inherently unique and inviolate.
We're just information in motion, wherever you wish to house it and however you wish to move it.
TheUmgawa t1_j20d1gr wrote
But, from the robot's perspective, it's you, so I don't see the distinction. In The Prestige, does it matter that the Great Danton in the balcony isn't the one in the tank? Not at all.
alcatrazcgp t1_j20dd3j wrote
I see a massive distinction: you die, a clone of you lives. You will not experience anything that clone does; "you" are no longer alive. That is an imposter, a copy of you.
TheUmgawa t1_j20dwo5 wrote
I am fine with that. You know who else is fine with that? The imposter who is, for all intents and purposes, me. How much guilt would you feel if one day you woke up, then watched someone who looks just like yourself die, and then you just went on living for another hundred years? To you, you're not an imposter. And the dead guy doesn't care, because he's dead.
alcatrazcgp t1_j20gioj wrote
your copy is indeed "you" and thinks it's "you". if you met your copy and told it you are the original, would it care? Probably not. Now there are two of you, but the copy will always know it's not the original; it's different. Even if it's the perfect copy, you two will always be different in many ways.
TheUmgawa t1_j20hues wrote
Will it, though? Let's say that somewhere in the past, you had a medical emergency and they had to put you under. While you were under, they copied your memories and whatever passes for consciousness into a new body, and then they pulled the plug on the old one. And then, when you wake up, they say, "It's a miracle! The doctors managed to get all of your organs going again, and they say you've got another forty years."
In that scenario, where everyone is lying to you (or perhaps the doctors are lying to everyone), how would the replacement know it wasn't the original? As far as it's concerned, it went to sleep and then it woke up.
alcatrazcgp t1_j20i396 wrote
yeah, you still died, your imposter just replaced you, what's your point? you don't care that you'll be killed and replaced by your copy?
idk about you but that sounds like a massive crime if that were to ever happen
TheUmgawa t1_j20lq2r wrote
Is it, though? Because as far as you're concerned, you're still alive? You can even testify at the murder trial, "No, your honor, that couldn't be murder because I'm right here. Go ahead, ask me anything about my life."
alcatrazcgp t1_j20m3kr wrote
so what if you get cloned, and the clone insists it wants to destroy the imposter, the imposter being you, even though you are the original? How can they tell? What if, in the moment, he remembers more about your own life than you do, and you are terminated?
TheUmgawa t1_j20mtq1 wrote
What happens in international waters stays in international waters. If I'm running a combination human cloning lab and monkey knife fighting arena, that's my business. Why should everyone live by your sense of morality? What makes your sense of morality any better than anyone else's?
And, honestly, why would either one of them say, "I have to destroy the other! I am the original"? That's like some garbage out of a bad sci-fi movie. Please don't consider being a writer.
sceadwian t1_j208pxg wrote
If it's a perfect copy of you, it is you; perfect means the same, so you're basically logically inconsistent there.
alcatrazcgp t1_j20deuo wrote
I got a seizure reading this
sceadwian t1_j20shok wrote
You need to do something about that reading problem you have then.