Brain Recording

SLAM: debunk creationism, pseudoscience, and superstitions. Discuss logic and morality.

Moderator: Alyrium Denryle

User avatar
Akkleptos
Jedi Knight
Posts: 643
Joined: 2008-12-17 02:14am
Location: Between grenades and H1N1.
Contact:

Re: Brain Recording

Post by Akkleptos »

Kitsune wrote:Are we actually the same person who went to bed last night? Our own memories are constantly rewriting themselves inside our brains already.
Now that's an interesting point. Yes, you're not exactly the same person you were the day before, but that's still you, changing. Not someone else who happens to think he's you. "Invasion of the Body Snatchers", anyone?
B5B7 wrote:Then you must have died years ago - the brain you have now is not the brain you had as a baby. It consists of new matter as well as new neural connections, memories, etc.
New neural connections and memories, yes. New matter, that's debatable, as the neurons you have will be essentially the same, minus the ones you've been losing by simply living. See below (emphasis mine):
Wiki wrote:Although neurons do not divide or replicate in most parts of the adult vertebrate brain, it is often possible for axons to regrow if they are severed. This can take a long time: after a nerve injury to the human arm, for example, it may take months for feeling to return to the hands and fingers.
B5B7 wrote:Incidentally, I notice you have The Doctor as your avatar - when the Doctor regenerates is he still the Doctor?
Yes, but then again, not quite. He does undergo some changes. Besides, regeneration is a change that happens in the same organism, no funky "recording mind then uploading it to a different host" shenanigans involved.
Acidburns wrote:Akkleptos, what is it about the recording process that leads you to your conclusion?
The fact that the result is a copy. You are still you and your copy is exactly that: a copy, regardless of how indistinguishable it may be from the original. As I said before, a twin brother who thinks he's you.
Acidburns wrote:Do you believe that in order to be considered the same person a consciousness must always reside on the same physical medium? Or do you object to the idea of interrupting consciousness by storing it?
Yes and yes. Because (for the first) it's still just a copy, and (for the latter) because it's not an "interruption" as much as it is "termination" as far as the original is concerned.
Starglider wrote:
Akkleptos wrote:Okay, take the example I just mentioned, and exclude the part where you're cloned and have your mind uploaded to the clones. Isn't it clear that *you die*?
Yes, because there are now zero copies of me in the universe; even the 'generalised me' is no longer experiencing anything, and never will again.
Well probably. If many-worlds is correct (and any kind of physical infinity will probably do it, not just the branching timelines resolution to quantum uncertainty) then you can't actually kill anyone anyway in either the subjective or truly global sense. You can just confine most future versions of you into a class of universes where they aren't present. But that's a whole other debate.
I'm quite sure I can't follow you there. But what I meant is pretty much the same as what Zod said.
Duchess of Zeon wrote:Of course the recording is you; there's no way to tell the difference between the two, is there? They're absolutely perfect recordings, thus the recording is absolutely, unquestionably you, containing all of the details of your brain that exist... At the exact instance of transfer.
Nope, the recording is not you. It's identical, yes. The people around you probably wouldn't care whether they deal with you or with nextyou. But should they decide to terminate the original, I'm quite sure you (not nextyou) would probably have an objection or two:

Cloning technician: There, Ms. Zeon, your clone is ready and the mind recording transfer is complete.
Clone Duchess of Zeon: Whoa! It really worked! So you're... me! Or I am you? Cool!
Duchess of Zeon: Heh heh... No, you're me. Well, it doesn't matter.
Cloning technician: Now, the original is superfluous. Goodbye, Ms. Zeon. *points a gun at you*
Duchess of Zeon: Wait --WHAT?
*BANG*
*You die*

... and nothing more...
General Zod wrote: Isn't that just semantics? Regardless of other people's perceptions, the original entity ceases to exist on the moment of transfer. While things might be continuous from the clone's perspective, from the original's they just ended completely. It's like copying a CD. The disc might otherwise be completely indistinguishable from the original in every detail, but if the original were destroyed when it was transferred it's gone for good and the new disc would still just be a copy.
Precisely. Say you take the best ST replicator and reproduce the Mona Lisa, molecule by molecule. Good luck trying to get a thousandth of the money for it when you sell it. The original was the fruit of Leonardo's genius and hard work, painted with his own hands, for weeks or months the focus of all his creative effort. He sweated, grew frustrated, rejoiced when he completed a detail, and suffered before it. The copy, regardless of how perfect, is the product of a fancy machine. And we're not even talking "consciousness" here.
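The copy-versus-original distinction being argued here maps directly onto value equality versus object identity in programming. A toy Python sketch (the `Painting` class and its fields are invented purely for illustration):

```python
import copy

class Painting:
    def __init__(self, title, strokes):
        self.title = title
        self.strokes = strokes  # every brushstroke, "molecule by molecule"

    def __eq__(self, other):
        # Two paintings are "indistinguishable" if all their details match.
        return (self.title, self.strokes) == (other.title, other.strokes)

original = Painting("Mona Lisa", ["stroke-1", "stroke-2"])
replica = copy.deepcopy(original)

print(replica == original)  # True  -- indistinguishable in every detail
print(replica is original)  # False -- still a second, distinct object
```

Whether the `is`-style distinction matters for persons is exactly what the thread is disputing; the sketch only shows that "perfect copy" and "same object" are separable notions.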
Duchess of Zeon wrote:There's no difference of perspective; you just go to sleep like for a normal surgical procedure and wake up as a computer.
Uh, no. The computer wakes up and thinks it's you. I'm surprised people here have such a difficulty grasping that.
Life in Commodore 64:
10 OPEN "EYES",1,1
20 GET UP$:IF UP$="" THEN 20
30 GOTO BATHROOM
...
GENERATION 29
Don't like what I'm saying?
Take it up with my representative:
User avatar
The Duchess of Zeon
Gözde
Posts: 14566
Joined: 2002-09-18 01:06am
Location: Exiled in the Pale of Settlement.

Re: Brain Recording

Post by The Duchess of Zeon »

Akkleptos wrote:
Duchess of Zeon wrote:Of course the recording is you; there's no way to tell the difference between the two, is there? They're absolutely perfect recordings, thus the recording is absolutely, unquestionably you, containing all of the details of your brain that exist... At the exact instance of transfer.
Nope, the recording is not you. It's identical, yes. The people around you probably wouldn't care whether they deal with you or with nextyou. But should they decide to terminate the original, I'm quite sure you (not nextyou) would probably have an objection or two:

Cloning technician: There, Ms. Zeon, your clone is ready and the mind recording transfer is complete.
Clone Duchess of Zeon: Whoa! It really worked! So you're... me! Or I am you? Cool!
Duchess of Zeon: Heh heh... No, you're me. Well, it doesn't matter.
Cloning technician: Now, the original is superfluous. Goodbye, Ms. Zeon. *points a gun at you*
Duchess of Zeon: Wait --WHAT?
*BANG*
*You die*

... and nothing more...
And this little display of ignorance changes my argument how? You completely ignored my argument, Sir, which is that the recording is you--at the exact instance in time that the recording is made. By the time the recording is done, the passage of time has already caused differences, if you are still conscious and alive. The obvious solution to this is simply to terminate the old body as part of the transferring process, seamlessly and painlessly while unconscious.

Your argument relies on a soul, a fundamental essence in the original body which makes it unique. It isn't, and there's no such thing as a soul. Time must pass, two perspectives must simultaneously exist, for there to be two unique individuals existent--and in the case of an elderly person whose mind is deteriorating, the copy on a computer core is in fact a better representation of the person than the fleshy, decaying form, which should be terminated as a matter of course in the midst of the transfer. I would have no objection to being "terminated" in advance, because of course I would continue to exist immediately afterwards--there would be no gap for me. Just the end of my biological form.

The only time you have problems is if you're dumb enough to create the copy and then allow the original instantiation to persist, at which time it starts developing a diverged personality from the moment of transfer. Simultaneous destruction of the old body removes this objection. There is no new unique personality to carry on a distinct existence in that body; there's just you, in the nearby computer mainframe, free of the pain and suffering of old age and not demanding any cropland anymore to keep yourself alive. The only way your argument has validity is if two identical mental forms are allowed to develop separate personalities. This is easily prevented by terminating the old and inferior body that is decaying during the procedure. At that moment, nobody is being killed since the total coherent sapience of the person is preserved, and an argument to the contrary is specious and ridiculous and requires some sort of magic knowledge of a uniqueness from the original which doesn't exist at that moment of time anyway.

Of course an even easier way to do this would be to just get yourself so utterly cyborged, filled with data storage and parallel processing, that the process simply involves shifting some of your brain processes to additional mechanical parts and shutting down the blanked, essentially mind-wiped brain. That might in fact be how such a transfer would be conducted, allowing for full, awake continuity of consciousness and leaving the old body a conveniently mindless husk, which for that matter is a good description of your argument: a mindless husk, devoid of any reality and relying on simple appeals to emotion.

Uh, no. The computer wakes up and thinks it's you. I'm surprised people here have such a difficulty grasping that.
No, you wake up and you are the computer. I'm frankly shocked that anyone intelligent enough to handle themselves on this board could fall into such a soul-trap as you've managed to. Come off it for a moment; how is the computer unique from you at the moment of the transfer? It isn't, and therefore it is you, and unless the incompetence of the transferring personnel allows it to happen, there is no further unique person in the old body, which can thus be terminated simultaneously. Ideally the process would indeed involve a transfer of consciousness, leaving a consciousless husk to be disposed of, probably through the mediation of steady cyberneticization before the event of full transfer. But this is not genuinely necessary, not even that, for you to remain you. There is nothing special about your body. Get over it.
The threshold for inclusion in Wikipedia is verifiability, not truth. -- Wikipedia's No Original Research policy page.

In 1966 the Soviets find something on the dark side of the Moon. In 2104 they come back. -- Red Banner / White Star, a nBSG continuation story. Updated to Chapter 4.0 -- 14 January 2013.
User avatar
General Zod
Never Shuts Up
Posts: 29211
Joined: 2003-11-18 03:08pm
Location: The Clearance Rack
Contact:

Re: Brain Recording

Post by General Zod »

The Duchess of Zeon wrote: No, you wake up and you are the computer. I'm frankly shocked that anyone intelligent enough to handle themselves on this board could fall into such a soul-trap as you've managed to. Come off it for a moment; how is the computer unique from you at the moment of the transfer? It isn't, and therefore it is you, and unless the incompetence of the transferring personnel allows it to happen, there is no further unique person in the old body, which can thus be terminated simultaneously. Ideally the process would indeed involve a transfer of consciousness, leaving a consciousless husk to be disposed of, probably through the mediation of steady cyberneticization before the event of full transfer. But this is not genuinely necessary, not even that, for you to remain you. There is nothing special about your body. Get over it.
Except for how your personality is largely formed thanks to biological experiences and impulses over the course of several years, no soul required? It doesn't sound like you're taking into account how much of someone's personality and memories are formed through physiology alone. Unless you're using the assumption that this theoretical computer will be advanced enough to simulate every conceivable detail and experience a given human body is capable of conceiving as well, anyway.
"It's you Americans. There's something about nipples you hate. If this were Germany, we'd be romping around naked on the stage here."
User avatar
Akkleptos
Jedi Knight
Posts: 643
Joined: 2008-12-17 02:14am
Location: Between grenades and H1N1.
Contact:

Re: Brain Recording

Post by Akkleptos »

Duchess of Zeon wrote:Your argument relies on a soul, a fundamental essence in the original body which makes it unique.
No it doesn't. I couldn't care less about souls (in the metaphysical sense). And I don't see where you find souls in my argument.
Duchess of Zeon wrote:The obvious solution to this is simply to terminate the old body as part of the transferring process, seamlessly and painlessly while unconscious.
I think I see where this confusion comes from. I'm not talking about a "you" in the sense of anyone who walks, talks, and acts like you, shares all of your memories up to the moment of the mind-copying, etc. I'm talking about a personal "you", which means the consciousness, the experiencing of your life. Let's see, as per your scenario: you fall asleep, due to the effects of the anesthetic. Clone-you or computer-you wakes up (notice the use of a third person verb). For you (personal-experience you, that is) it's all like: "Great! I'm so going to like this... It's grea... smngns... zzzzzzzzz". Then, nothing more. You don't wake up. Fin. Game over. There is no thinking, no seeing or hearing, not even blackness. It was computer-you or clone-you that woke up, even though as far as anybody else is concerned, that's you, 100%.
Duchess of Zeon wrote:At that moment, nobody is being killed since the total coherent sapience of the person is preserved, and an argument to the contrary is specious and ridiculous and requires some sort of magic knowledge of a uniqueness from the original which doesn't exist at that moment of time anyway.
Ah. Total coherent sapience of a person=the person. Riiiight. What if you clone a cat? Original cat dies. You get a perfect facsimile though. I honestly can't fathom why it is so difficult to get that it's making a perfect copy, then killing the original. It's not about uniqueness. Imagine your scenario but without the anesthetic. Clone is ready, *bang*, nothing more for you. Upon completion of the transfer clone/computer-you wakes up feeling like a million bucks. She does. Not you. For the same reason that if I cut myself with a piece of broken glass it doesn't affect you, nor should it in any way. Because we're two different organisms. Having the same memories, thoughts, reactions etc. doesn't make it any different.
Duchess of Zeon wrote:Of course an even easier way to do this would be to just get yourself so utterly cyborged, filled with data storage and parallel processing, that the process simply involves shifting some of your brain processes to additional mechanical parts and shutting down the blanked, essentially mind-wiped brain.
Now we're getting somewhere! That one might work, as it is happening within the same organism, the same mind (brain functions), not information copied elsewhere.
Duchess of Zeon wrote:There is nothing special about your body. Get over it.
Well, you haven't seen it :lol:
No, seriously, there's nothing particular about my body or my mind, but the copy would still be a copy and not me, in the sense that if you pinch him I wouldn't feel it and vice versa. It would be me for all practical purposes, granted, but if you kill personal-me, I cease to experience anything, even if someone exactly like me lives on. No souls, no uniqueness involved.

EDITs: Added one "if", a hyphen, and a period.
User avatar
The Duchess of Zeon
Gözde
Posts: 14566
Joined: 2002-09-18 01:06am
Location: Exiled in the Pale of Settlement.

Re: Brain Recording

Post by The Duchess of Zeon »

General Zod wrote: Except for how your personality is largely formed thanks to biological experiences and impulses over the course of several years, no soul required? It doesn't sound like you're taking into account how much of someone's personality and memories are formed through physiology alone. Unless you're using the assumption that this theoretical computer will be advanced enough to simulate every conceivable detail and experience a given human body is capable of conceiving as well, anyway.

You are correct, that is one of my fundamental assumptions: that any computer powerful enough to run a sapient's full consciousness will also be able to (and in fact probably trivially) emulate all of the physiological aspects of that person's sapience, with the capability to access a full virtual reality environment which could perfectly mimic every detail of a human life. Of course the person in question will start to change and grow toward a more Computational Intelligence mode of existence over time and adaptation to their new environment, but then people don't remain five years old, either.
User avatar
The Duchess of Zeon
Gözde
Posts: 14566
Joined: 2002-09-18 01:06am
Location: Exiled in the Pale of Settlement.

Re: Brain Recording

Post by The Duchess of Zeon »

Akkleptos wrote:No it doesn't. I couldn't care less about souls (in the metaphysical sense). And I don't see where you find souls in my argument.
Because they're required for your body to have any magical knowledge of the termination of its existence separate from your transferred sapience.
I think I see where this confusion comes from. I'm not talking about a "you" in the sense of anyone who walks, talks, and acts like you, shares all of your memories up to the moment of the mind-copying, etc. I'm talking about a personal "you", which means the consciousness, the experiencing of your life.
Yes, and that you would still be there, exactly the same, just with a wider option of data-inputs and processing protocols that would ultimately cause changes, but, again, growing up from a five year old into an adult also causes changes to your cognitive functions.
Let's see, as per your scenario: you fall asleep, due to the effects of the anesthetic. Clone-you or computer-you wakes up (notice the use of a third person verb). For you (personal-experience you, that is) it's all like: "Great! I'm so going to like this... It's grea... smngns... zzzzzzzzz". Then, nothing more. You don't wake up. Fin. Game over. There is no thinking, no seeing or hearing, not even blackness. It was computer-you or clone-you that woke up, even though as far as anybody else is concerned, that's you, 100%.
Except that's not what happens. You just seamlessly continue to function as a computer. Again, the claim that you'd simply cease to exist is ridiculous, because you cannot be defined in that way. The human sapience is just a collection of electromechanical impulses which, if duplicated, remain you. You put someone in very cold water for thirty minutes and get their heart restarted again: what makes them the same person? You look at a person who is 20 years older than they were the last time you met them, so their body has undergone total cell replacement: how are they the same person? You look at someone who used to have the cognitive abilities of a five year old, and now has the cognitive abilities of a thirty year old, like all humans: how are they the same person?

You seem unable to realize that the change from your fleshy to computational existence is just as trivial as many of the changes which go on throughout our biological lives anyway.
Akkleptos wrote: Ah. Total coherent sapience of a person=the person. Riiiight. What if you clone a cat? Original cat dies. You get a perfect facsimile though.
This isn't about cloning the body, and yes, the coherent total sapience of a person = that person. It's about the consciousness, which makes us who we are, instead of a simple biological machine.
I honestly can't fathom why it is so difficult to get that it's making a perfect copy, then killing the original.
Because it's not, it's the perfect continuation of consciousness.
It's not about uniqueness. Imagine your scenario but without the anesthetic. Clone is ready, *bang*, nothing more for you. Upon completion of the transfer clone/computer-you wakes up feeling like a million bucks. She does. Not you.
There's no difference. You're assigning a uniqueness to my fleshy body for which there is no empirical evidence and no rational reason to believe actually exists.
For the same reason that if I cut myself with a piece of broken glass it doesn't affect you, nor should it in any way. Because we're two different organisms.
And that's the key to this: we're not talking about different organisms, we're talking about the sapience. The biological organism is irrelevant here; it's the sapience that matters, and that's identical (note this DOES presume the computer you're uploaded onto can perfectly simulate all physiological inputs into human behaviour).
Having the same memories, thoughts, reactions etc. doesn't make it any different.
It means you're the exact same person, my friend, and I think it's time you take a breath and realize that, yes, human sapience really does just come down to a bunch of data. Once I believed the same way you do--then I realized that position was false and without evidence, and gave it up.
Now we're getting somewhere! That one might work, as it is happening within the same organism, the same mind (brain functions), not information copied elsewhere.
Why does the copying process make a difference? Why is the Ship of Theseus still the Ship of Theseus when you replace parts gradually over time instead of all at once? What magical soul-quality do the remaining old parts provide to the new parts to make them the same rather than different, as they'd be if every part was replaced all at once the way you're claiming things work?
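The Ship of Theseus question raised here can be restated as in-place mutation versus reconstruction. A minimal Python sketch (plank names invented for illustration): replacing parts one at a time preserves the container's identity, while rebuilding from scratch yields an equal but distinct object.

```python
# Gradual replacement: the container's identity persists while every part changes.
ship = ["plank-%d" % i for i in range(5)]
original_id = id(ship)

for i in range(len(ship)):
    ship[i] = "new-plank-%d" % i   # replace one part at a time, in place

print(id(ship) == original_id)      # True  -- same object throughout

# All-at-once replacement: a new object, even with identical contents.
rebuilt = ["new-plank-%d" % i for i in range(5)]
print(rebuilt == ship)              # True  -- same contents
print(rebuilt is ship)              # False -- different identity
```

Note the sketch only formalizes the distinction being debated; it doesn't by itself settle whether personal identity tracks `is` or `==`.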
Well, you haven't seen it :lol:
No, seriously, there's nothing particular about my body or my mind, but the copy would still be a copy and not me, in the sense that if you pinch him I wouldn't feel it and vice versa. It would be me for all practical purposes, granted, but if you kill personal-me, I cease to experience anything, even if someone exactly like me lives on. No souls, no uniqueness involved.

EDITs: Added one "if", a hyphen, and a period.
Unfortunately, you have simply not demonstrated with any kind of defensible argument that this is the case.
User avatar
Khaat
Jedi Master
Posts: 1043
Joined: 2008-11-04 11:42am

Re: Brain Recording

Post by Khaat »

Now, in the debate I see going on here, no one is questioning that the construct/clone/awareness has "full memory" and is not mystically aware of any discontinuity. This construct/clone/awareness might very well believe it is the original without some outside information. But the original is ended, though granted: also unaware of any discontinuity, since continuity (and awareness) has ended for it.

Consider: the Original is not terminated (for whatever reason) and the Copy is activated (call this time the "split"). Is there a net doubling of data value? Is the Copy's only value the data it has acquired since the split, or does it have full value, just as the Original? Does the Original lose its value, since the Copy carries its data up to the point of the split?
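The split scenario can be sketched in Python: after a deep copy, both instances carry the full pre-split history, but everything accumulated afterwards is unshared and diverges. All the "memory" strings here are invented for illustration.

```python
import copy

original = {"memories": ["childhood", "yesterday"]}
clone = copy.deepcopy(original)          # the "split": full history duplicated

# Post-split, each accumulates its own, unshared experiences.
original["memories"].append("saw the clone wake up")
clone["memories"].append("woke up in a new body")

# Both carry the complete pre-split data...
print(original["memories"][:2] == clone["memories"][:2])   # True
# ...but from the split onward they are diverging records.
print(original["memories"] == clone["memories"])           # False
```

Which of the two "has full value" is the philosophical question; the sketch only shows that the data genuinely exists twice after the split.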
Rule #1: Believe the autocrat. He means what he says.
Rule #2: Do not be taken in by small signs of normality.
Rule #3: Institutions will not save you.
Rule #4: Be outraged.
Rule #5: Don’t make compromises.
User avatar
The Duchess of Zeon
Gözde
Posts: 14566
Joined: 2002-09-18 01:06am
Location: Exiled in the Pale of Settlement.

Re: Brain Recording

Post by The Duchess of Zeon »

Now, in the debate I see going on here, no one is questioning that the construct/clone/awareness has "full memory" and is not mystically aware of any discontinuity. This construct/clone/awareness might very well believe it is the original without some outside information. But the original is ended, though granted: also unaware of any discontinuity, since continuity (and awareness) has ended for it.
The problem with this argument, again, is that it assumes that the original has some kind of special property which makes it a distinct sapience from the 'copy'. Nothing of the sort is true, though. Is the body dead? Certainly. Is the person dead? No, because everything we define as a person has been continuously existent.

Please remember, gentlemen, everything, in physical terms, is ultimately just information. Since the information of that person has always existed, the person has always existed. There is no discontinuity. The entire cosmos is ultimately just information, and your information being temporarily stored while it's compiled somewhere else is frankly not all that different from sleeping.
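The "temporarily stored, then compiled somewhere else" claim is, in programming terms, a serialization round-trip. A toy sketch with `pickle` standing in for the recording process (the dictionary contents are invented for illustration):

```python
import pickle

person = {"name": "Zeon", "memories": ["first snow", "last argument"]}

stored = pickle.dumps(person)    # the information is "temporarily stored"
del person                       # the original instantiation is gone

restored = pickle.loads(stored)  # "compiled somewhere else"
print(restored == {"name": "Zeon", "memories": ["first snow", "last argument"]})  # True
```

The round-trip is lossless for the data; whether lossless data transfer amounts to continuity of the person is precisely what the two sides disagree about.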
User avatar
frogcurry
Padawan Learner
Posts: 442
Joined: 2005-03-13 06:34am

Re: Brain Recording

Post by frogcurry »

Would it be possible to achieve the brain recording effect desired indirectly, by steadily replacing the biological components of your brain over time with biomechanical ones, until there is no human flesh left? This also seems less difficult than making a copy, which requires the ability not only to read a brain but to make a whole new one, biological or otherwise, in the exact same pattern.

I'm not suggesting slapping a computer circuit into your brain or anything sudden, more a slow takeover that changes the balance of material continuously without your being aware of it. If you introduced machines that could replace the actual cell organelles over time with new machinery without destroying the neurons as they did so, I have to assume that you could replace quite a lot of the actual matter without losing fundamental mental function; e.g. basic stuff like the Golgi apparatus and the cytoskeletons that are just part of the cell's life support could be replaced with suitable bio-related nano-tech. A lot of the basic brain matter that isn't neurons or other cells but holds it all together could also be replaced with suitable materials, e.g. to make your brain less fragile to shocks or strokes by replacing the blood supply vessels and the blood itself with synthetic systems.

The thing is, you don't need to "read" the brain to do this, as your machinery can be dumb instruments that just push through the brain, replacing all the targeted organic parts as they encounter them. I don't know enough about how neurons work, though, to judge how much of a neuron cell you actually need to have for it to still do its wonderful thinking stuff. The limitation of how beneficial this would be is how much you need to retain as original old you to keep you still mentally functional (and able to learn, etc.).
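The "dumb replacement" idea above amounts to swapping each node's internals while never reading or copying the connection pattern that encodes the mind. A minimal Python sketch; the `Neuron` class and its fields are invented purely for illustration:

```python
class Neuron:
    def __init__(self, name):
        self.name = name
        self.substrate = "organic"   # organelles, cytoskeleton, life support
        self.links = []              # synapses to other neurons

# A tiny circular network standing in for the connectome.
a, b, c = Neuron("a"), Neuron("b"), Neuron("c")
a.links = [b]
b.links = [c]
c.links = [a]

# "Dumb" machinery: visit each cell and swap its substrate,
# without ever inspecting or rebuilding the wiring.
for cell in (a, b, c):
    cell.substrate = "synthetic"

# The connection pattern -- the part that encodes the mind -- is untouched.
print([n.name for n in a.links + b.links + c.links])  # ['b', 'c', 'a']
```

The point of the sketch: no global "read" of the graph ever happens; each replacement is purely local, which is what makes the approach less demanding than copying.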
User avatar
The Duchess of Zeon
Gözde
Posts: 14566
Joined: 2002-09-18 01:06am
Location: Exiled in the Pale of Settlement.

Re: Brain Recording

Post by The Duchess of Zeon »

frogcurry wrote:Would it be possible to achieve the brain recording effect desired indirectly, by steadily replacing the biological components of your brain over time with biomechanical ones, until there is no human flesh left? This also seems less difficult than making a copy, which requires the ability not only to read a brain but to make a whole new one, biological or otherwise, in the exact same pattern.
Yes, that would be quite viable. It would even remove the objections of the people in this thread. It's just not genuinely necessary for the same person to be existent on both sides; that's what I'm arguing.
I'm not suggesting slapping a computer circuit into your brain or anything sudden, more a slow takeover that changes the balance of material continuously without your being aware of it.
Actually, why wouldn't a series of surgeries with periods of adaptation after each one be acceptable?
If you introduced machines that could replace the actual cell organelles over time with new machinery without destroying the neurons as they did so, I have to assume that you could replace quite a lot of the actual matter without losing fundamental mental function. i.e. basic stuff like Golgi apparatus and the cytoskeletons that are just part of the cell life support could be replaced with suitable bio-related nano-tech.
Yes, but is it necessary? I doubt it--though it might proceed that way, just progressively adding implants which replace brain functions will work fine.
A lot of the basic brain matter that isn't neurons or other cells but holds it all together could also be replaced with suitable materials, i.e. to make your brain less fragile to shocks or strokes by replacing the blood supply vessels and the blood itself with synthetic systems.
That might well be ideal, but it also requires very sophisticated nanotechnology.
The thing is, you don't need to "read" the brain to do this, as your machinery can be dumb instruments that just push through the brain, replacing all the targeted organic parts as they encounter them. I don't know enough about how neurons work, though, to judge how much of a neuron cell you actually need to have for it to still do its wonderful thinking stuff. The limitation of how beneficial this would be is how much you need to retain as original old you to keep you still mentally functional (and able to learn, etc.).
None, ultimately. Consciousness is just a representation of information, and if we can emulate the environment that information is being run in inside the body, we can create a functional environment for a person to continue on normally without their body--though certainly after a period of adaptation they'd be liable to move more and more away from normal human behaviour patterns.
User avatar
frogcurry
Padawan Learner
Posts: 442
Joined: 2005-03-13 06:34am

Re: Brain Recording

Post by frogcurry »

The Duchess of Zeon wrote: Actually, why wouldn't a series of surgeries with periods of adaptation after each one be acceptable?

Yes, but is it necessary? I doubt this--though it might proceed that way, just progressively adding implants which replace brain functions will work fine.
I favour a cellular-level Borg approach over a more intrusive method which uses large-scale implants, mostly because that way you are not artificially restructuring the brain in one moment with total, irreversible replacement of the cells affected. The mental structure of neurons is retained, or at least minimally impacted, whereas what you're suggesting would lead to immediate loss of the affected neurons forever, which might be a problem depending on which neurons exactly you just lost.

You're probably partly right even in my scenario - I don't imagine that you need all the neurons in your brain equally, i.e. those which deal with control of certain bodily functions, so those could be more rapidly replaced. However, things like memory, stimuli interpretation, etc., which play an important role in consciousness - those I don't think you want to lose to a possibly non-identical surgically implanted replacement, as it might fundamentally change your personality. But replacing the cells in a progressive manner means that there isn't going to be a point where you're the Duchess, and then the next second something significant has changed and you are... not quite that person.

The other benefit is that you could probably reverse it if it didn't work out well, and as it'd be progressive and observable you'd have the chance to tell it was going awry before your whole brain was lost.
User avatar
The Duchess of Zeon
Gözde
Posts: 14566
Joined: 2002-09-18 01:06am
Location: Exiled in the Pale of Settlement.

Re: Brain Recording

Post by The Duchess of Zeon »

Uhm, well, you couldn't just cut out part of the brain and replace it with a cybernetic component. You'd rather install the cybernetic component in such a way that you could shift brain functions over to it, and then replace the part of the brain it replaces, with another component, which takes over operations for a second sector of the brain, and so on. Just like when you're rebuilding a highway you build the new stretch before diverting traffic and demolishing the old stretch.
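A toy sketch of this handover scheme (hypothetical class and region names, purely illustrative - not a claim about real neurosurgery, just the highway analogy in code): each sector's replacement is built and verified before the old part is retired, so the whole keeps functioning at every step.

```python
# Toy model of incremental replacement. The Region/Brain classes and the
# region names are hypothetical stand-ins; the point is that behaviour is
# preserved at every step because a replacement is built and checked
# before the old part is demolished.

class Region:
    def __init__(self, name, substrate):
        self.name = name
        self.substrate = substrate  # "biological" or "cybernetic"

    def process(self, signal):
        # Functionally identical regardless of substrate.
        return f"{self.name}:{signal}"


class Brain:
    def __init__(self, region_names):
        self.regions = {n: Region(n, "biological") for n in region_names}

    def think(self, signal):
        return [r.process(signal) for r in self.regions.values()]

    def replace_region(self, name):
        # Build the new "stretch of highway" first, verify it carries the
        # same traffic, and only then retire the old one.
        old = self.regions[name]
        new = Region(name, "cybernetic")
        assert new.process("probe") == old.process("probe")
        self.regions[name] = new


brain = Brain(["cortex", "hippocampus", "cerebellum"])
baseline = brain.think("hello")
for name in list(brain.regions):
    brain.replace_region(name)               # one sector at a time
    assert brain.think("hello") == baseline  # behaviour unchanged throughout

print(all(r.substrate == "cybernetic" for r in brain.regions.values()))  # True
```

At no point is the system as a whole offline, which is the crux of the argument: there is never a single moment of wholesale replacement.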
User avatar
Akkleptos
Jedi Knight
Posts: 643
Joined: 2008-12-17 02:14am
Location: Between grenades and H1N1.
Contact:

Re: Brain Recording

Post by Akkleptos »

Duchess of Zeon wrote:The problem with this argument, again, is that it assumes that the original has some kind of special property which makes it a distinct sapience from the 'copy'. Nothing of the sort is true, though. Is the body dead? Certainly. Is the person dead? No, because everything we define as a person has been continuously existent.
You've just hit the nail on the head. The problem here seems to be that we've been using "person" in different meanings. For you, it's the distinct sapience, memories, reactions, ideas... the mind, for short. For me, a person is all that plus the body. No, I'm not trying to attach any metaphysical or magical importance to the body itself. In fact, let's call it instead a "host", be it biological or otherwise (computer, bio-droid, software...).

Having thus defined the terms, what I'm saying is that while the mind (the person, by your definition) clearly continues to exist, the "host" is killed. This would result in the person (by my definition) getting killed, even though there is a mind that is exactly the same that carries on living, interacting, thinking, etc., since the original (let's call this one "subject A") ceases to experience anything. There is a termination of the continuity of experience. Sure, there is another instance of this same mind (as someone said previously in this same thread, our language is not fit to talk about these things as accurately as we would like) and for all anyone else cares, it's subject A. Only subject A (the body and the whole set of phenomena we call the mind characteristic of it) ceased to exist - the physical body and that instance of the mind.

So I interpret that the person continues to exist as an individual (subject B). Yes, he can still think, perceive, communicate, pay taxes and work (well, depending on the vehicle, I guess). In that sense, well, yeah, of course. But as far as subject A is concerned, well, he just ceased to perceive, feel, communicate and everything (I'm talking about the 1st instance of the same mind). I don't see how you take the consciousness, self-awareness and - most importantly in my definition - the continuity of experience from a person (in a human host, or even in computer form, for that matter) going to sleep and then waking up in a new host. That's magic to me, since those would actually be the memories from instance 1 of the mind being recalled by instance 2 of the same mind. And that's okay, since as per your definition, we could say that the individual (memories, personality, mental abilities and processes) is still around. That's not what I'm arguing. What I'm saying is that poor instance 1 of the mind never wakes up. In its continuity of experience, there's nothing after going to sleep. That means that a sentient being, instance 1, was effectively eliminated.

What you're saying is especially interesting, since it opens a whole brave new world of possibilities. Imagine Bob hates John's guts and wants to kill him. If John gets... uh... "duplicated" in body and mind (or just in mind, since that's what the person - your definition - is), then we could let Bob have his way and dispose of John's original body and the first instance of his mind. I wonder if the courts of the future would not consider it "murder":

Prosecution: This man has killed a human being, a living sentient being, John!
Defence: What are you talking about? John's right here!
John (instance 2, host 2): Hello there!

If what you're saying is that since the mind persists, well, yeah, nobody is arguing that. What I'm saying is that mind instance 1, host 1, very effectively dies. We could say, by what I think I understand from your argument, that only one set of the person has been eliminated.

What if we eliminate the problem of sentience, or a human mind altogether? A cat, subject A, is cloned and has the entirety of its mental processes (or whatever is necessary to ensure the preservation of its mental, personal uniqueness) transferred to a fresh clone, subject B. If you kill subject A, of course instance 1 of the common mind is over.

That's why I used the pinching example. Instance 1 (or original mind/person/personality) does not experience anything further after being killed. Instance 2 does, in its host. Taking it back to humans: if you pinch, say, frogcurry, I won't feel a thing, I will not experience the same. Oh, yeah, you might say, but your minds are different. Well, how exactly does having the same exact mind make a difference? Clearly we would be talking about 2 physical (or even logical, if you prefer) subjects. The fact that they both have the same mind is utterly irrelevant. Two living subjects. I think the right expression is that a copied mind in a new host does not produce "you", but rather "another you".

You may be familiar with those claims that helping just one person, saving just one person's life (from drugs, from starvation in a Third World country), doesn't make a real difference. And of course, one could always counter: well, for that person, it meant a world of difference. That's what I'm talking about. The continuity of experience. What do I care if my mind/personal uniqueness survives in a new host, if I'm going to die? As you said, my "person" (in your definition) still lives on. But I, Akkleptos, born of woman on the 30th of March, 1974 - in my continuity of experience - die. No more pizza for me. :(
Would it be possible to achieve the desired brain-recording effect indirectly, by steadily replacing the biological components of your brain over time with biomechanical ones, until there is no human flesh left? This also seems less difficult than making a copy, which requires the ability not only to read a brain but to make a whole new one, biological or otherwise, in the exact same pattern.

I'm not suggesting slapping a computer circuit into your brain or anything sudden, more a slow takeover that changes the balance of material continuously without your being aware of it. If you introduced machines that could replace the actual cell organelles over time with new machinery without destroying the neurons as they did so, I have to assume that you could replace quite a lot of the actual matter without losing fundamental mental function. E.g. basic stuff like the Golgi apparatus and the cytoskeleton that are just part of the cell's life support could be replaced with suitable bio-related nanotech. A lot of the basic brain matter that isn't neurons or other cells but holds it all together could also be replaced with suitable materials, e.g. to make your brain less fragile to shocks or strokes by replacing the blood supply vessels and the blood itself with synthetic systems.
This would work. It all happens within the same instance of the mind. No continuity of experience termination.
Duchess of Zeon wrote:Please remember, gentlemen, everything, in physical terms, is ultimately just information. Since the information of that person has always existed, the person has always existed. There is no discontinuity. The entire cosmos is ultimately just information, and your information being temporarily stored while it's compiled somewhere else is frankly not all that different from sleeping.
Actually, your mind is hard at work while you sleep. Even if not in a waking state, continuity of experience is still in play.
Duchess of Zeon wrote:Uhm, well, you couldn't just cut out part of the brain and replace it with a cybernetic component. You'd rather install the cybernetic component in such a way that you could shift brain functions over to it, and then replace the part of the brain it replaces, with another component, which takes over operations for a second sector of the brain, and so on. Just like when you're rebuilding a highway you build the new stretch before diverting traffic and demolishing the old stretch.
This would also work, for the reasons previously stated. The person (mind instance) is not copied elsewhere then terminated. It doesn't matter if you end up being 100% cybernetic. There was never a termination of continuity of experience.

EDIT: spelling :P
Life in Commodore 64:
10 OPEN "EYES",1,1
20 GET UP$:IF UP$="" THEN 20
30 GOTO BATHROOM
...
GENERATION 29
Don't like what I'm saying?
Take it up with my representative:
Junghalli
Sith Acolyte
Posts: 5001
Joined: 2004-12-21 10:06pm
Location: Berkeley, California (USA)

Re: Brain Recording

Post by Junghalli »

Well, I'm not going to get into a philosophy argument, but personally I sure as hell would never consent to the kind of operation the Duchess described (copy my mind, kill my body under anaesthesia).

Personally, my hope for immortality is bio-immortality. I happen to like being a meatbag, and as far as the continuity of consciousness thing is concerned, let's just say I'd rather not take any chances.
User avatar
Eris
Jedi Knight
Posts: 541
Joined: 2005-11-15 01:59am

Re: Brain Recording

Post by Eris »

Akkleptos wrote:You've just hit the nail on the head. The problem here seems to be that we've been using "person" in different meanings. For you, it's the distinct sapience, memories, reactions, ideas... the mind, for short. For me, a person is all that plus the body. No, I'm not trying to attach any metaphysical or magical importance to the body itself. In fact, let's call it instead a "host", be it biological or otherwise (computer, bio-droid, software...).
I can sympathise with your general scheme as you lay it out here, and I can certainly understand some of your concerns, but given the limitations of the gedankenexperiment, I'm not sure you can avoid attaching some kind of special importance to bodies, as a host or otherwise, without some kind of metaphysical principle of identity.

But first, let me make sure I'm clear on what you mean. Marina has proposed that the only ultimate (that is, common and peculiar) trait of a person (that is, something we'd call sapient; let's ignore what that might really entail for now) is the collection of their experiences, thoughts, perceptions, and other mental characteristics. And thus, if you could shift all those characteristics from our brains to a computer, or from our brain to another brain, without any non-trivial loss, then that person could be considered, at the moment of transferral, to be the same person. They would not remain the same person, since the experiences would start diverging, which is why you euthanise the first host during transfer to avoid ethical issues. She then claims that since one person went in, and the same person came out with a different extension, no one has been killed in the process.

You agree with her to the point where you also claim that our collection of experiences, perceptions, and other mental phenomena are vital to who we are, but want to deny that we could perform such a transfer in principle and come out with the same people, since the particular physical extension is part of who that person is. That is, even if we had the arbitrary technology required to perform such an operation, it would still be killing a person - not just a body or a host, but a person - because that person would disappear with their body, and a new, if in many ways very similar, person would come out of the process.

Is that fair? (I'll pretend that it is until corrected for purposes of responding to the rest of your post.)

Note: it occurred to me as I wrote this post that the true difference between the two viewpoints is not simply one of mental vs. physical properties, but of a notion of identity as a process vs. identity as an object. The former being Marina's (and mine, if probably not in the exact same way) position of an identity as a collection of information, and the latter being yours: not necessarily a physical object, but some thing that you can attach to a certain person and in virtue of which you can say a person is self-same (themself). Is this still fair? I'll explain a little more about what I mean later on in the post.
Akkleptos wrote:Having thus defined the terms, what I'm saying is that while the mind (the person, by your definition) clearly continues to exist, the "host" is killed. This would result in the person (by my definition) getting killed, even though there is a mind that is exactly the same that carries on living, interacting, thinking, etc., since the original (let's call this one "subject A") ceases to experience anything. There is a termination of the continuity of experience. Sure, there is another instance of this same mind (as someone said previously in this same thread, our language is not fit to talk about these things as accurately as we would like) and for all anyone else cares, it's subject A. Only subject A (the body and the whole set of phenomena we call the mind characteristic of it) ceased to exist - the physical body and that instance of the mind.

So I interpret that the person continues to exist as an individual (subject B). Yes, he can still think, perceive, communicate, pay taxes and work (well, depending on the vehicle, I guess). In that sense, well, yeah, of course. But as far as subject A is concerned, well, he just ceased to perceive, feel, communicate and everything (I'm talking about the 1st instance of the same mind).
Some of this should be quoted above, since I incorporated it while trying to understand part of your fundamental position, but bear with me, as it also applies here. I'm not precisely sure what you mean to say, but that's mostly the fault of some of our language being slippery while trying to talk about this. The initial host of subject A would indeed be killed, as their body would be euthanised during the hypothetical process. And thus you claim that the person dies, since their continuity of experience ceases to be, even though an identical mind shows up in subject B. This is somewhat confusing to me, as it appears to be - and correct me if I'm wrong - treating a mind as a discrete entity. It is true that with the death of host body A, the physical instantiation of those chemical processes that give rise to A's consciousness ceases, and that a new set of processes (chemical, or electronic, or otherwise) shows up elsewhere. But how are you claiming that these are identical? They're clearly not "identical" in the sense of being the self-same processes, since they're discontinuous. They're functionally identical as per the experiment's stipulations, but if they're functionally identical, then isn't the person functionally the same person?

To elaborate, if the new collection of consciousness-giving processes is functionally identical in subject B as in subject A, what basis is there for treating that new person differently? They function the same way, and consequently give the appearance of being the same person, even to the subject herself. I am not the same person I was when I was born, as my body is continuously regenerating itself, so in a sense I am not the same host as I was back then. Then what is the important difference between the two subjects? Subject A has indeed ceased to perceive, but I can conceptually slice myself up into time-dependent slices, each thinking it's connected to all the past time slices, which progressively cease to perceive as well. What's the difference between the two cases?
Akkleptos wrote:I don't see how you take the consciousness, self-awareness and - most importantly in my definition - the continuity of experience from a person (in a human host, or even in computer form, for that matter) going to sleep and then waking up in a new host. That's magic to me, since those would actually be the memories from instance 1 of the mind being recalled by instance 2 of the same mind. And that's okay, since as per your definition, we could say that the individual (memories, personality, mental abilities and processes) is still around. That's not what I'm arguing. What I'm saying is that poor instance 1 of the mind never wakes up. In its continuity of experience, there's nothing after going to sleep. That means that a sentient being, instance 1, was effectively eliminated.
This does make your earlier reluctance somewhat clearer, but I think you're conflating two issues. You speak both of continuity of consciousness and continuity of the host. You claim that it's like magic, because the memories are somehow false ones, due to a change in instantiation, despite preserving the functional identity of the mind. If I'm reading you right, it's the continuity that's important here, so let's split that up. If continuity is necessary to preserve identity through time, it must be because either (i) the mind must be continuous, (ii) the body must be continuous, or (iii) both must be continuous. Let's break this down case by case.

If the mind must be continuous, we have to make some awkward concessions, as we already have cases of discontinuous minds. Let's take the case of a viciously evil murderer. One who rapes babies, tortures black people to death because he thinks it's funny, and does any other acts such that we can all agree that he is not very nice, and that he is responsible for his actions and should be held accountable for them. Now, let's say that he dies, and his consciousness ceases to be conscious (for lack of terminology). Five minutes later, he's resuscitated by some skilled EMTs and comes through with essentially no brain damage, still considers himself himself, remembers everything that happened, and so on. But his consciousness is discontinuous. If we hold mental continuity as a stipulation for self-identity, we cannot prosecute that person for his crimes, or even hold him morally responsible, because he did not do anything. At most we can say that he has false memories of a different person, but he no more did those acts than I performed any acts that I dreamt up. I remember doing them, but I did not perform them.

Let's say then that it's physical continuity. This is harder to find a real-life example of, since we cannot, at the moment, experience a physical discontinuity. So let me propose a subsidiary thought experiment instead. Take a hypothetical intelligent computer. It seems reasonable to think that these could exist, since we're intelligent, and there's nothing special about our bodies. They're complex, but in theory you could represent us as a long series of messy functions on a computer, even if it meant simulating electron and neuron interactions in a physics modeller. So there is a computer, and it controls a mechanical body. It uses this mechanical body to murder a family, then saves its current operating state (the sum of its data saved to storage, operating memory, current instructions, etcetera) and has this saved state transferred to a new and unrelated physical computer, wiping the previous body clean of all trace of its presence, so it's a blank string of zeroes on a computer. It then has itself turned on. Seeing as it just suspended its operations and restarted them, I think it's persuasive to say (and correct me if you do not agree) that this is the same set of programmes running, that is, the same intelligence, which has just moved bodies. Do we want to say it never committed those murders? We can fix up the example so that not a single atom is shared in common with the original body, only the information that makes the consciousness operate. If there's anything special about the body that makes a person a person, that is lost, and we can no longer hold this intelligence responsible for the acts committed by the previous body.
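For what it's worth, the save-state-and-move step of that thought experiment can be sketched in a few lines (a toy model: the `Agent` class is hypothetical, and `pickle` stands in for whatever the arbitrary technology would be):

```python
import pickle

# Toy sketch of the thought experiment: an agent's entire operating
# state is serialised, the original "body" is wiped to zeroes, and the
# same state is restarted on unrelated hardware.

class Agent:
    def __init__(self):
        self.memories = []

    def act(self, event):
        self.memories.append(event)


original = Agent()
original.act("committed the crime")

# Freeze the complete operating state...
saved_state = pickle.dumps(original)

# ...wipe the first body clean (a blank string of zeroes)...
original.memories = None
blanked_substrate = bytes(len(saved_state))  # stand-in for the zeroed body

# ...and restart the same state on a "new computer".
restored = pickle.loads(saved_state)

print(restored.memories)  # ['committed the crime'] -- same state, new substrate
```

Not a single byte of the original substrate survives, yet the restored instance carries the full history, which is exactly the intuition the example trades on.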

Of course, it could be a mixture of both. However, seeing as it can't be either alone, I find it implausible to think that it's a combination of both, and cannot think of a decent way to try to argue the case. If you have a method in mind, I entreat you to share your ideas.
Akkleptos wrote:What you're saying is especially interesting, since it opens a whole brave new world of possibilities. Imagine Bob hates John's guts and wants to kill him. If John gets... uh... "duplicated" in body and mind (or just in mind, since that's what the person - your definition - is), then we could let Bob have his way and dispose of John's original body and the first instance of his mind. I wonder if the courts of the future would not consider it "murder":

Prosecution: This man has killed a human being, a living sentient being, John!
Defence: What are you talking about? John's right here!
John (instance 2, host 2): Hello there!

If what you're saying is that since the mind persists, well, yeah, nobody is arguing that. What I'm saying is that mind instance 1, host 1, very effectively dies. We could say, by what I think I understand from your argument, that only one set of the person has been eliminated.
I agree it opens up a whole brave new world of possibilities, this conception of an information-based personal identity. I don't think you have one of them right there, though. I think what you're saying is rather like the defence claiming that because John has a new car, Bob could never have destroyed his old car. Allow me to back up and illustrate the important distinction.

In this example, Bob killed John's body, and John was given a new body with a copy of his mind inserted into it. But that's just it - Bob hasn't killed a human being (a person), he's killed a human body. John is still there, but his old body is not. I agree this would lead to interesting new case law about the status of a body versus a mind, but I don't think it poses a problem for the notion that we can have a discontinuous but still self-identified and functionally identical person. Even if we agree the mind died, the mind was also brought back, rather like someone who was killed by a murderer but brought back to life by very skilled EMTs. We can still charge the attempted murderer with all sorts of things, but we don't treat the case as if there is suddenly a new person running around instead of John. The only difference between the EMT case and the postulated case is that the flesh is different. But since the flesh in itself isn't important, as shown by the example above, then either it must be because there is some metaphysical specialness to flesh, which both of us want to deny, or because of some flaw in my argument, which I beg you to point out to me should there be one.
Akkleptos wrote:What if we eliminate the problem of sentience, or a human mind altogether? A cat, subject A, is cloned and has the entirety of its mental processes (or whatever is necessary to ensure the preservation of its mental, personal uniqueness) transferred to a fresh clone, subject B. If you kill subject A, of course instance 1 of the common mind is over.
Yes, the continuity of consciousness ceases, and then starts up again at some point in the future. I still am not seeing why this is a problem for identity, unless you want to fiat some special attribute of some particular lump of flesh or silicon or what have you.
Akkleptos wrote:That's why I used the pinching example. Instance 1 (or original mind/person/personality) does not experience anything further after being killed. Instance 2 does, in its host. Taking it back to humans: if you pinch, say, frogcurry, I won't feel a thing, I will not experience the same. Oh, yeah, you might say, but your minds are different. Well, how exactly does having the same exact mind make a difference? Clearly we would be talking about 2 physical (or even logical, if you prefer) subjects. The fact that they both have the same mind is utterly irrelevant. Two living subjects. I think the right expression is that a copied mind in a new host does not produce "you", but rather "another you".
I would ask you to clarify what you mean by having the same mind. So, in the case of a simple discontinuous jump between bodies like Marina has proposed, you don't run into this problem, I don't think, any more than you run into it in the EMT case. If I pinch a person, then they die and are resuscitated, have I retroactively not pinched them? If we're talking about a single conscious mind that is duplicated into two media, and I pinch one, well, then they're no longer the same person. Their experiences have diverged, making them distinct from one another. The concept of "you" I do not take in this context to be static. An identity, since it's just information over time within the scheme being thought of, is a process, not an object, and so consequently you can have a fork in identity, resulting in two people from one common source, just as much as you can have a code fork produce two distinct but related programmes from a common source.

This is an important difference. Some of the confusion, I think, is due to a definition mismatch. For the notion of "another you" to make sense at all, you have to think about an identity as an object in some sense. Obviously not a physical one, but there has to be some definite property, or mental concept, or metaphysical object that states, basically, "This thing here is Meredith", for example. Then when you have the duplication described in the thought experiment being considered, you get the further step, "This new thing over here is another Meredith". This I think gets confusing, and I want to reject that as a valid move. Instead, I want to claim that the notion that I am the kind of thing to which you can attach the appellation "This thing here is Meredith" is a category mistake. Instead, "Meredith" is a process, a collection of information defined by its content as it changes through time. You can have a fork that produces two collections of information that change through time, both of which were at one point identical. Does this make them identical? No, of course not. They're related, but as time moves forward they will drift further apart from each other. Likewise, you could freeze that collection of information, analyse it, copy it over to a new body, and restart it. (Loosely speaking, do a brain scan and transfer that information to a new body/computer.) It would still be the same person insofar as a computer programme can be frozen and restarted again and still be, in some sense, the same programme.
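The fork analogy can be made concrete with a minimal sketch (the `Identity` class is hypothetical, and `copy.deepcopy` stands in for the duplication step): the two copies are indistinguishable at the instant of the fork and then diverge as their "experiences" differ.

```python
import copy

# Minimal sketch of an identity fork: at the moment of copying the two
# are indistinguishable; afterwards their histories diverge, so they are
# related but distinct processes -- a code fork from a common source.

class Identity:
    def __init__(self):
        self.experiences = ["childhood", "learned to read"]

    def live(self, event):
        self.experiences.append(event)


a = Identity()
b = copy.deepcopy(a)          # the fork: identical at this instant
assert a.experiences == b.experiences

a.live("stayed in host 1")    # from here on, the histories diverge
b.live("woke up in host 2")

print(a.experiences == b.experiences)  # False -- related, but no longer identical
```

Neither copy has a privileged claim to being "the" original process; both are continuations of the same collection of information up to the fork point.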
Akkleptos wrote:You may be familiar with those claims that helping just one person, saving just one person's life (from drugs, from starvation in a Third World country), doesn't make a real difference. And of course, one could always counter: well, for that person, it meant a world of difference. That's what I'm talking about. The continuity of experience. What do I care if my mind/personal uniqueness survives in a new host, if I'm going to die? As you said, my "person" (in your definition) still lives on. But I, Akkleptos, born of woman on the 30th of March, 1974 - in my continuity of experience - die. No more pizza for me. :(
And yet, we don't have continuity of experience in the real world in all cases - death and resuscitation being a very absolute example, ignoring fuzzy cases like comas. My very point is that your mind/personal uniqueness is all you are. Your physical host is just something you're attached to, literally and sentimentally. That may be a good reason to keep it, and for you it appears to be so, even if alternatives were available in feasible form, but if you didn't keep it for whatever reason, you would still be you in the important sense of the notion of 'you'.
"Hey, gang, we're all part of the spleen!"
-PZ Myers
Junghalli
Sith Acolyte
Posts: 5001
Joined: 2004-12-21 10:06pm
Location: Berkeley, California (USA)

Re: Brain Recording

Post by Junghalli »

Eris wrote:They would not remain the same person, since the experiences would start diverging, which is why you euthanise the first host during transfer to avoid ethical issues.
Personally, killing the original seems to me a rather odd way to try to avoid ethical issues. It's a way of (arguably) avoiding the sticky problems with the idea of uploading as immortality, but I'd say it raises more ethical questions than it answers, especially if it's not an inevitable part of the uploading process so it requires people to proactively and unnecessarily destroy one of the "twins". It would seem more ethically "safe" to leave the original alive.
User avatar
Akkleptos
Jedi Knight
Posts: 643
Joined: 2008-12-17 02:14am
Location: Between grenades and H1N1.
Contact:

Re: Brain Recording

Post by Akkleptos »

Eris wrote:I'm not sure you can avoid attaching some kind of special importance to bodies without some kind of metaphysical principle of identity, as a host or otherwise.
I see what you mean, and to that point, I agree. But I hadn't mentioned "Continuity-of-Experience Termination" at that point in my post.
Eris wrote:But first, let me make sure I'm clear on what you mean. Marina has proposed that the only ultimate (that is, common and peculiar) trait of a person (that is, something we'd call sapient; let's ignore what that might really entail for now) is the collection of their experiences, thoughts, perceptions, and other mental characteristics. And thus, if you could shift all those characteristics from our brains to a computer, or from our brain to another brain, without any non-trivial loss, then that person could be considered, at the moment of transferral, to be the same person.
Right. The two would be the same person (by Duchess of Zeon's definition), which I agree with. The thing is how you do the shift. You're not "transplanting" the mind, you're copying it, and then deleting the original. As long as the original suffers Continuity-of-Experience Termination, that's a person who died, right there, even if a perfect copy (or new instance, if you prefer) is still around.
Eris wrote:They would not remain the same person, since the experiences would start diverging, which is why you euthanise the first host during transfer to avoid ethical issues. She then claims that since one person went in, and the same person came out with a different extension, no one has been killed in the process.
Yes, they wouldn't remain exactly the same person, since experiences would start diverging. Imagine you live in universe A, and there is some other universe B that happens to be identical to ours, down to the direction and momentum of every single electron. Could you care less what happens to the "you" in universe B, as long as you are still alive? But what if someone were to tell you that it's okay if they kill you, since there's already a fully identical "you" in universe B? And if they were to seamlessly integrate him into your own universe in your stead, wouldn't you mind? Even if your family, friends et al. can't tell the difference, and for all they care it's you they're interacting with?

And as for the ethical issues: isn't the issue of a sentient being (and a human, at that) being killed or erased because you happen to have a backup much heavier than the issue of having two identical people running around? Some would say that "diverging experiences" would make them different persons, eventually. How long does that take? How many nanoseconds of different memories (and the reactions to them) does it take for the copy and the original to be considered "different people"? Is it okay to kill someone because at a certain moment there is a perfectly accurate copy of him? Even worse, if you're the original, would you be okay with them killing you because there is already a backup?
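For what it's worth, the copy-then-diverge picture is easy to put in programmer's terms. A toy Python sketch (the `Mind` class and its "memories" are invented purely for illustration, not a claim about how uploading would work): the two instances are indistinguishable at the moment of copying, and the very first differing experience splits them.

```python
import copy

class Mind:
    """Toy stand-in for a recorded mind: just an ordered list of memories."""
    def __init__(self, memories):
        self.memories = list(memories)

    def experience(self, event):
        self.memories.append(event)

original = Mind(["childhood", "yesterday"])
duplicate = copy.deepcopy(original)  # a perfect copy at the moment of recording

# At the instant of copying, the two are functionally indistinguishable...
assert original.memories == duplicate.memories

# ...but the first differing experience makes them diverge.
original.experience("saw the copy wake up")
duplicate.experience("woke up in a new body")
assert original.memories != duplicate.memories
```

On this sketch the "how many nanoseconds" question becomes: divergence starts with the first differing input, however small.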

Zeon mentioned something interesting: you go to sleep, then you wake up as a clone/computer. Now THIS is magic and metaphysical. Isn't it more like "you go to sleep, then another you wakes up as a clone/computer"? Again, as far as everyone around is concerned, that's you. Only that you never woke up. That happened in another mind, sustained by thought processes separate from those occurring in your original mind.
Eris wrote:You agree with her to the point where you also claim that our collection of experiences, perceptions, and other mental phenomena are vital to who we are, but want to deny that we could perform such a transfer in principle and come out with the same people, since the particular physical extension is part of who that person is.
Again, by Zeon's definition: yes, the person (the whole lot of things you mention) is still with us. But for the original, if killed/brainwiped, it's night-night, baby. He would most certainly object, if he understood the implications.
Eris wrote:They're functionally identical as per the experiment's stipulations, but if they're functionally identical, then isn't the person functionally the same person?
You said it, not me. Functionally identical does not mean the same, just as one lightbulb might be functionally identical to the next, and that doesn't mean they're the same thing, even if they're completely interchangeable. They're two separate things. Whatever happens to each of them affects one and only one of them, not both; thus, they're separate entities, regardless of how identical and interchangeable we see them to be.

In other words, say that for some weird reason you become functionally identical to me in a nanosecond. At that point, Q/God/The Doctor/Hiro Nakamura stops time, gets us in the same room and asks us which one of us should be terminated. Is there any doubt as to where either of our right index fingers would be pointing?
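The lightbulb point maps neatly onto the distinction most programming languages draw between equality and identity. A minimal Python sketch (the `Lightbulb` class is an invented stand-in): `==` tests "functionally identical", while `is` tests "one and the same object".

```python
class Lightbulb:
    def __init__(self, watts):
        self.watts = watts

    def __eq__(self, other):
        # "Functionally identical": same observable properties.
        return isinstance(other, Lightbulb) and self.watts == other.watts

a = Lightbulb(60)
b = Lightbulb(60)

assert a == b      # functionally identical, fully interchangeable
assert a is not b  # yet two separate objects

# "Whatever happens to each of them affects one and only one of them":
a.watts = 0        # burn out bulb a
assert b.watts == 60
```

Nothing here settles whether persons work like bulbs, of course; it only shows the two notions of "same" coming apart.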

Ah! The thing is that a person is not a program file that can be copied. Or maybe it is, in a certain way, just as certain OSes won't let you copy or delete a file that's in use (try deleting "explorer.exe" while Windoze is running). The file is being used and run by the processor, the processor being the infinitesimal point of reality (here/now perception) in the continuity of experience. You can't move/delete the file until you make the processor stop using it. That means Continuity-of-Experience Termination. When you do that, experience ceases. If you kill/wipe the brain, that instance of the Zeon-person is gone, even though there might be a million others just like it out there. That one and its perception of here-and-now are gone. As far as he goes, it's zero, zilch, nada forever more.
Junghalli wrote:Personally, killing the original seems to me a rather odd way to try to avoid ethical issues. It's a way of (arguably) avoiding the sticky problems with the idea of uploading as immortality, but I'd say it raises more ethical questions than it answers, especially if it's not an inevitable part of the uploading process so it requires people to proactively and unnecessarily destroy one of the "twins". It would seem more ethically "safe" to leave the original alive.
Precisely.

EDIT: Added a "w" in "two"
Life in Commodore 64:
10 OPEN "EYES",1,1
20 GET UP$:IF UP$="" THEN 20
30 GOTO BATHROOM
...
GENERATION 29
Don't like what I'm saying?
Take it up with my representative:
Darth Ruinus
Jedi Master
Posts: 1400
Joined: 2007-04-02 12:02pm
Location: Los Angeles

Re: Brain Recording

Post by Darth Ruinus »

Junghalli wrote: Personally, killing the original seems to me a rather odd way to try to avoid ethical issues. It's a way of (arguably) avoiding the sticky problems with the idea of uploading as immortality, but I'd say it raises more ethical questions than it answers, especially if it's not an inevitable part of the uploading process so it requires people to proactively and unnecessarily destroy one of the "twins". It would seem more ethically "safe" to leave the original alive.
Think about all the trouble another you could cause you. This other "instance" of Junghalli has access to bank accounts, social security accounts, health care information, credit card information, etc. He could commit crimes, and DNA evidence (if he has a body that leaves DNA evidence) would point to both of you. Suppose he sleeps with your girlfriend and gets her pregnant. How would you determine who is the father? Who gets parental rights? Both of you, or only one? What if you don't like sharing parental rights? He may also suddenly decide he no longer likes sharing a family with "Junghalli Original Instance" and might attempt to drive you away. Should we clone your family too, just so he can have his own? Of course not; that would open up a whole lot of new problems.

Removing the first "instance" would remove all those problems.
"I don't believe in man made global warming because God promised to never again destroy the earth with water. He sent the rainbow as a sign."
- Sean Hannity Forums user Avi

"And BTW the concept of carbon based life is only a hypothesis based on the abiogensis theory, and there is no clear evidence for it."
-Mazen707 informing me about the facts on carbon-based life.
Akkleptos
Jedi Knight
Posts: 643
Joined: 2008-12-17 02:14am
Location: Between grenades and H1N1.

Re: Brain Recording

Post by Akkleptos »

Darth Ruinus wrote:Killing the first "instance" would remove all those problems.
There, fixed it for you :wink:

How about killing the duplicate? You have seniority, after all.

Wouldn't it be much easier to grant the clone a legal status similar to a child's, or a relative's, rather than your own (even though it is you, to any practical effect)? How about saving the procedure only for times when a body becomes almost completely non-viable? Then, even though the first instance dies (it was going to die anyway), another instance of the same mind can carry on, with no legal conflicts.

EDIT: Oh, question marks!
Life in Commodore 64:
10 OPEN "EYES",1,1
20 GET UP$:IF UP$="" THEN 20
30 GOTO BATHROOM
...
GENERATION 29
Don't like what I'm saying?
Take it up with my representative:
Eris
Jedi Knight
Posts: 541
Joined: 2005-11-15 01:59am

Re: Brain Recording

Post by Eris »

Akkleptos wrote:
Eris wrote:But first, let me make sure I'm clear on what you mean. Marina has proposed that the only ultimate (that is, common and peculiar) trait of a person (that is, something we'd call sapient; let's ignore what that might really entail for now) is the collection of their experiences, thoughts, perceptions, and other mental characteristics. And thus, if you could shift all those characteristics from our brains to a computer, or from our brain to another brain, without any non-trivial loss, then that person could be considered, at the moment of transferral, to be the same person.
Right. The two would be the same person (by Duchess of Zeon's definition), which I agree with. The thing is how you do the shift. You're not "transplanting" the mind; you're copying it, and then deleting the original. As long as the original suffers Continuity-of-Experience Termination, that's a person who died, right there, even if a perfect copy (or new instance, if you prefer) is still around.
Well, the exact circumstances around how exactly you do this are subject to some very fine distinctions, but that's the case with many, many medical processes. I would agree you're not transplanting the person, but that's because I don't believe there is a person to transplant, in the sense of moving a person from one body to the other. The reason I was making such a big row over the notion of process-rooted identity is exactly this issue. If you save all the details of the state of a process, shut down that process temporarily, then transfer the information to a second body and restart it, how is the result an essentially different process? And if the process itself doesn't object to being shut down on a first body and restarted in a second, how can you say that the process (person), rather than just the body, was killed? Some people won't want to go through this because they might feel squirrelly about losing their own continuity, but people also feel squirrelly about receiving blood transfusions. Whether or not people feel right about it is orthogonal to whether or not they're the same person.
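That save-the-state, shut-down, restart-elsewhere picture is, on this view, essentially what programmers call serialisation. A hedged Python sketch using `pickle` (the `MindState` class and its fields are invented for illustration; a real mind is obviously not a two-field object):

```python
import pickle

class MindState:
    """Invented toy: a process state that can be snapshotted and restored."""
    def __init__(self, memories, traits):
        self.memories = memories
        self.traits = traits

    def __eq__(self, other):
        return (self.memories, self.traits) == (other.memories, other.traits)

# 1. Save all the details of the state of the process.
host_a = MindState(memories=["first kiss", "exam day"], traits={"patient": True})
snapshot = pickle.dumps(host_a)

# 2. Shut down the first host.
del host_a

# 3. Transfer the information to a second body and restart it.
host_b = pickle.loads(snapshot)
assert host_b.memories == ["first kiss", "exam day"]
```

Whether the restored state counts as the same *person*, or merely an equal new instance, is exactly what the thread is disputing; the code only shows that state-equality survives the round trip.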

You are also still running this argument based on the notion that continuity is somehow essential to a person remaining that self-same person. Given that I rebutted the notion that that's the case, I object to your using that argument till you get a coherent picture of why that's important.
Eris wrote:They would not remain the same person, since the experiences would start diverging, which is why you euthanise the first host during transfer to avoid ethical issues. She then claims since one person went in, and the same person came out with a different extension, no one has been killed in the process.
Yes, they wouldn't remain exactly the same person, since experiences would start diverging. Imagine you live in universe A, and some other universe B happens to be identical to yours, down to the direction and momentum of every single electron. Would you care what happens to the "you" in universe B? Not much, as long as you are still alive, right? But what if someone were to tell you that it's okay to kill you, since there's already a fully identical "you" in universe B? And if they were to seamlessly integrate him into your own universe in your stead, wouldn't you mind? Even if your family, friends et al. couldn't tell the difference, and for all they care it's you they're interacting with?
You're confusing convergent and divergent notions of identity. Given how I've built up process identity, what you're describing is two identical processes that are thermodynamically isolated from each other, with one then, in some sense, overwriting the other. The case I described, and the case brought up in the thought experiment, is where two individuals split apart from one, not where two merge into one. Furthermore, you're still confusing the issue between processes and entities. How would "I" even notice the change? One moment, there's a me whom I can self-identify with through all my memories. The next moment there's a me with all the same memories whom I can likewise self-identify with. Those states of affairs are one and the same, and you could argue that the situation you describe happens every single femtosecond with no one noticing. Even without alternate worlds, one could argue that we are only ever experiencing our moment in time, and we are not, in a way that preserves identity, the same person as we were before. How is this case different from the one you describe? Never mind whether I would mind or not; how would I notice? In the situation you describe it's not even clear that you are introducing a relevantly different person than myself.
And as for the ethical issues: isn't the issue of a sentient being (and a human, at that) being killed or erased because you happen to have a backup much heavier than the issue of having two identical people running around?
The notion that you make a special exception for a human above and beyond a sapient creature smacks of speciesism, but I'll ignore it for now. The notion that a person is getting killed and erased when this happens is still missing my point. If you made a new "person object" and then destroyed the old "person object" then yes, we could take moral issue against it, since a person object would be destroyed. But that's not what I'm suggesting. I'm suggesting that we shuffle around a process onto a new instantiation medium. We put one person to sleep, make some adjustments to the world, and then wake up that person, just with a new body. We still have the person around, so no one dies in a morally important sense. And since I don't subscribe to a person object theory I simply don't have this moral problem you're bringing up. You have to show first that there aren't such process identities before you can show me wrong.
Some would say that "diverging experiences" would make them different persons, eventually. How long does that take? How many nanoseconds of different memories (and the reactions to them) does it take for the copy and the original to be considered "different people"?
How should I know? No research has been done, since we're talking about the nature of identity itself, not the details of how long it takes between identity shifts. But observe this important point: if you're arguing with me about how long it takes for identity to become differentiable after a process split, you've already agreed with my main point that identity is a process.
Is it okay to kill someone because at a certain moment there is a perfectly accurate copy of him? Even worse, if you're the original, would you be okay with them killing you because there is already a backup?
Without his or her permission? Of course not, no more than you're permitted to destroy his or her car just because there's another perfect replica, or you're allowed to cut off their arm just because they can get a replacement. The laws of morality are still in force with process people just as much as they are with object people.
Zeon mentioned something interesting: you go to sleep, then you wake up as a clone/computer. Now THIS is magic and metaphysical. Isn't it more like "you go to sleep, then another you wakes up as a clone/computer"? Again, as far as everyone around is concerned, that's you. Only that you never woke up. That happened in another mind, sustained by thought processes separate from those occurring in your original mind.
I have already stated that I don't believe the notion of "another you" is coherent, and argued for it by proposing the notion of process identity, backed up with arguments about why continuity (required for object identity) is not tenable. Show me why I'm wrong.
Eris wrote:You agree with her to the point where you also claim that our collection of experiences, perceptions, and other mental phenomena are vital to who we are, but want to deny that we could perform such a transfer in principle and come out with the same people, since the particular physical extension is part of who that person is.
Again, by Zeon's definition: yes, the person (the whole lot of things you mention) is still with us. But for the original, if killed/brainwiped, it's night-night, baby. He would most certainly object, if he understood the implications.
I would not object, if I understood the implications, so your notion that any original would object is not true. Remember, as I said, morality still applies. We can't do this if you don't agree, so if you're attached to your current body, feel free to keep it. But that does not mean that you're suddenly not you if you have a new one. You're also still assuming that identity is an object. I have argued why this isn't tenable. Show me how I'm wrong.
Eris wrote:They're functionally identical as per the experiment's stipulations, but if they're functionally identical, then isn't the person functionally the same person?
You said it, not me. Functionally identical does not mean the same, just as one lightbulb might be functionally identical to the next, and that doesn't mean they're the same thing, even if they're completely interchangeable. They're two separate things. Whatever happens to each of them affects one and only one of them, not both; thus, they're separate entities, regardless of how identical and interchangeable we see them to be.

In other words, say that for some weird reason you become functionally identical to me in a nanosecond. At that point, Q/God/The Doctor/Hiro Nakamura stops time, gets us in the same room and asks us which one of us should be terminated. Is there any doubt as to where either of our right index fingers would be pointing?
We're now two separate processes with differing experiences. Why should one of us be terminated? Just because Q/whoever is being a dick doesn't mean that the theory of process identity is false. Also, the notion of functional identity allows a way out: if the two processes are functionally identical, just merge them. Neither will notice a cessation of existence, and you conflate the two into one. Now, that is, mind, totally beside the point. Just because you can fiat the swamp-man weirdness into existence doesn't mean that our identities aren't processes, for which I have given an argument, and which you still have not shown to be silly.

Further notice that this argument you pose can be used against you as well. Say you are suddenly cloned by Q and asked which of you should die. Which one, under your theory, has primacy?
Ah! The thing is that a person is not a program file that can be copied. Or maybe it is, in a certain way, just as certain OSes won't let you copy or delete a file that's in use (try deleting "explorer.exe" while Windoze is running). The file is being used and run by the processor.
We can't do this at the moment, it's true. Imagine that. We can't build practical fusion reactors either, which is why we use thought experiments like this one instead of producing technological implementations. But the thing that is a person is most eminently, in theory, replicable. Simply build a computer programme that will emulate physics, put in all the data about the current locations of the atoms in your body, and run it. In essence, emulate reality. We can't do that right now! Even if we could do it, maybe you're right, and we'd have to put the person into a state of death but minimal decay while we did the analysis. We aren't technologically advanced enough right now to know these things. But that doesn't mean we never can, or that the process theory of identity is incorrect. Again, you're not addressing the core of my argument.

Of course, if you want to argue that we can't copy people, you're really not arguing against my notion of identity at all, but against the thought experiment itself: that it can't be instantiated and is thus absurd. I gave a three-point argument about why continuity can't be part of identity, and thus proposed process-based identity to fix up the problems that arise without continuity. One of the consequences is that if people consent, these kinds of procedures are moral, and the people that come out of them are the same people as went in. Either argue against why process identity is wrong, or why this isn't a consequence of the theory. You are currently doing neither.
The processor being the infinitesimal point of reality (here/now perception) in the continuity of experience. You can't move/delete the file until you make the processor stop using it. That means Continuity-of-Experience Termination. When you do that, experience ceases. If you kill/wipe the brain, that instance of the Zeon-person is gone, even though there might be a million others just like it out there. That one and its perception of here-and-now are gone. As far as he goes, it's zero, zilch, nada forever more.
And yet, I already argued why termination of continuity of experience does not mean the person is gone. Have you a rebuttal for that argument, other than repeating ad nauseam that I must be wrong because it feels that way to you?
Junghalli wrote:Personally, killing the original seems to me a rather odd way to try to avoid ethical issues. It's a way of (arguably) avoiding the sticky problems with the idea of uploading as immortality, but I'd say it raises more ethical questions than it answers, especially if it's not an inevitable part of the uploading process so it requires people to proactively and unnecessarily destroy one of the "twins". It would seem more ethically "safe" to leave the original alive.
Precisely.
If either of you has an argument against process identity or the consequences of process identity, please share it, as I've seen none so far. The very point is that you aren't "killing" the person, even if it doesn't intuitively look that way. Plenty of other things are unintuitive: fixed points yield true sentences in formal logical systems that cannot be proven within those systems, but that doesn't make them any less true.
"Hey, gang, we're all part of the spleen!"
-PZ Myers
Darth Ruinus
Jedi Master
Posts: 1400
Joined: 2007-04-02 12:02pm
Location: Los Angeles

Re: Brain Recording

Post by Darth Ruinus »

Akkleptos wrote:How about killing the duplicate? You have seniority, after all.
Why would you kill the duplicate? The duplicate is the one that is going to have the upgraded body and everything. The duplicate is the one with the superior hosting system, so the original instance would be the logical choice to remove.

Or kill, whatever term you think fits.
Wouldn't it be much easier to grant the clone a legal status similar to a child, or a relative, rather than your own (even though it is you, to any practical effect)?
I'm pretty sure no sentient grown man/woman/being would find it reasonable to be labeled a child and not be able to vote, get married, have sex, buy beer/cigarettes, drive etc etc.
How about saving the procedure only for times when a body becomes almost completely non-viable? Then, even though the first instance dies (it was going to die anyway), another instance of the same mind can carry on, with no legal conflicts.
Why? It is my choice to improve myself. There is no basis for you to come up to me and tell me "You cannot read! You cannot work out! You cannot practice art!" This is the same thing. If I want to be transferred over to a better body that can think faster/better, then that is my choice.

Even if you think this kills me, it is my choice when I should die too.
"I don't believe in man made global warming because God promised to never again destroy the earth with water. He sent the rainbow as a sign."
- Sean Hannity Forums user Avi

"And BTW the concept of carbon based life is only a hypothesis based on the abiogensis theory, and there is no clear evidence for it."
-Mazen707 informing me about the facts on carbon-based life.
Junghalli
Sith Acolyte
Posts: 5001
Joined: 2004-12-21 10:06pm
Location: Berkeley, California (USA)

Re: Brain Recording

Post by Junghalli »

Eris wrote:If either of you has an argument against process identity or the consequences of process identity, please share it, as I've seen none so far. The very point is that you aren't "killing" the person, even if it doesn't intuitively look that way.
Allow me to put it this way.

Let's say that some guy walks up to you right now and puts two guns to your head. He informs you that one is loaded and one is not, and you won't know which is which. He also informs you that there is a copy of you in another universe, which is completely identical to you, has identical life experiences, memories, everything, and is experiencing the exact same thing you are right now. He will shoot you with one gun and the person in the other universe with the other, so your experiences will be completely the same up to the moment of death or continued consciousness.

He asks whether it matters to you if he uses the loaded gun or not.

Will you say no? You should. After all, by your definition, as long as your lives do not diverge until the moment of death, the person in the other universe is you, and you will not die.

If he does kill you, should he be acquitted of murder, on account of the fact that you didn't actually die, since there's an identical you out there somewhere? Again, the answer should be yes.

I suspect that realistically your answers will probably be different though.

Heck, maybe you're right, and survival of a duplicate is survival of you. But then maybe not. Let's just say for myself I'm real paranoid when it comes to wanting not to die and would rather not take the chance. Other people can, if they feel like it, but you'll never get me to sign a consent form for a copy/euthanize operation on myself.
The Duchess of Zeon
Gözde
Posts: 14566
Joined: 2002-09-18 01:06am
Location: Exiled in the Pale of Settlement.

Re: Brain Recording

Post by The Duchess of Zeon »

Junghalli wrote: Other people can, if they feel like it, but you'll never get me to sign a consent form for a copy/euthanize operation on myself.

What if you're 96 and your kidneys just failed?
The threshold for inclusion in Wikipedia is verifiability, not truth. -- Wikipedia's No Original Research policy page.

In 1966 the Soviets find something on the dark side of the Moon. In 2104 they come back. -- Red Banner / White Star, a nBSG continuation story. Updated to Chapter 4.0 -- 14 January 2013.
Junghalli
Sith Acolyte
Posts: 5001
Joined: 2004-12-21 10:06pm
Location: Berkeley, California (USA)

Re: Brain Recording

Post by Junghalli »

The Duchess of Zeon wrote:What if you're 96 and your kidneys just failed?
I curse how non-uploading related medical technology has apparently remained stagnant, so I can't get a cloned organ transplant.

More seriously, can't I go on dialysis?
The Duchess of Zeon
Gözde
Posts: 14566
Joined: 2002-09-18 01:06am
Location: Exiled in the Pale of Settlement.

Re: Brain Recording

Post by The Duchess of Zeon »

Junghalli wrote:
The Duchess of Zeon wrote:What if you're 96 and your kidneys just failed?
I curse how non-uploading related medical technology has apparently remained stagnant, so I can't get a cloned organ transplant.

More seriously, can't I go on dialysis?
The point is that biological organisms have fixed lifespans, whereas information has a functionally unlimited lifespan. You could maybe live 40 billion years as a CI with some planning, until the universe has decayed to the point that, with essentially everything dispersed, there's just not enough energy left, and you run out of power and die.
The threshold for inclusion in Wikipedia is verifiability, not truth. -- Wikipedia's No Original Research policy page.

In 1966 the Soviets find something on the dark side of the Moon. In 2104 they come back. -- Red Banner / White Star, a nBSG continuation story. Updated to Chapter 4.0 -- 14 January 2013.