Designing a test to determine continuity of consciousness

SLAM: debunk creationism, pseudoscience, and superstitions. Discuss logic and morality.

Moderator: Alyrium Denryle

User avatar
PeZook
Emperor's Hand
Posts: 13237
Joined: 2002-07-18 06:08pm
Location: Poland

Post by PeZook »

Resinence wrote: c) Set a high standard for personhood such that you should be terrified of "dying" every time you go to bed at night. Which is actually a really creepy thought :? Though at least it's not Fucking Scary like Shroom's story; if the integration between the new components and the wetware was that bad, I'd become a Luddite and go live in a shack far away from the Insane Transhumanists.
Hmmpf. Ya know, this poses an interesting question...

How do you know if you are still you after you wake up? After all, if somebody ground up your old body into paste while you were sleeping, and made a perfect clone (or even not that perfect) with all the memories you had while going to sleep...

How are you going to tell the difference? PeZook2312342 has all the memories of former PeZooks, and thus...can't say if he's a clone or not.

Scary thought.
JULY 20TH 1969 - The day the entire world was looking up

It suddenly struck me that that tiny pea, pretty and blue, was the Earth. I put up my thumb and shut one eye, and my thumb blotted out the planet Earth. I didn't feel like a giant. I felt very, very small.
- NEIL ARMSTRONG, MISSION COMMANDER, APOLLO 11

Signature dedicated to the greatest achievement of mankind.

MILDLY DERANGED PHYSICIST does not mind BREAKING the SOUND BARRIER, because it is INSURED. - Simon_Jester considering the problems of hypersonic flight for Team L.A.M.E.
User avatar
Shroom Man 777
FUCKING DICK-STABBER!
Posts: 21222
Joined: 2003-05-11 08:39am
Location: Bleeding breasts and stabbing dicks since 2003
Contact:

Post by Shroom Man 777 »

It's not the same book. Even if there are two identical books made by the same publisher, and I lay them to my left and to my right, they might be exactly the same, but they aren't. There is still the book on my left, and the book on my right. If the book on the left catches fire and the book on the right doesn't catch fire...

Slap me with a fish and attack my (exact) clone with an elephant, and I feel the wet fish hitting my cheek. I don't feel my face getting bitten off by an elephant; my elephant-bitten clone dies, not me. Sucks to be him. You can ask me what it feels like to have my face bitten off by an elephant, and I wouldn't know the answer, because it wasn't me who got bitten by an elephant. That's how I know I'm me.

My mental identity did not get eaten by an elephant. His did. I don't care if he has the same thoughts or memories as me. I don't care if the entire world is populated by Shroom Man 777s.

Unless our minds are interlinked via a Zerg hive mind or by telepathy or by a soul, I am me. He is he. Different. Me is me.

The mind is inseparable from the organic brain; can you prove that wrong?

I think the burden of proof lies with anyone claiming there is something intangible that can be separated from the body/brain.

PeZook wrote:Hmmpf. Ya know, this poses an interesting question...

How do you know if you are still you after you wake up? After all, if somebody ground up your old body into paste while you were sleeping, and made a perfect clone (or even not that perfect) with all the memories you had while going to sleep...

How are you going to tell the difference? PeZook2312342 has all the memories of former PeZooks, and thus...can't say if he's a clone or not.

Scary thought.
You can't tell the difference. Why? Because you're not waking up. You are dead. Game over, man. No continues. And you're out of coins.

And you wouldn't give a shit what PeZook 2312342 thought, what his memories are, or anything. Because you're dead. He won't know he's a clone, it doesn't matter. You're either gonna be watching him from Heaven or Hell, or you're not because there's no afterlife and you've returned to that mental state you were in before you were born.

It is a scary thought.

I think it would be less scary if PeZook2312342 woke up and found YOUR (PeZook) corpse lying beside him. Or if he woke up and walked to the bathroom and found YOU (PeZook) drowning in the bathtub and both your eyes met in sudden realization.

Either way, you're still dead. It's just about as scary as being eaten by an elephant. You're dead, you won't care or give two shits if PeZook2312342 woke up and found out his life has been a dystopian lie of cyberpunk proportions. You wouldn't give a fuck if PeZook2312342 fought against the corporation and the guvmint and brought freedom from the clones and delivered them from The Island. You won't because you are dead. Dead. Dead. Dead.

None of it is happening to you. You have ceased to exist.

Even IF PeZook2312342 is you.

PeZook2312342 is you.

You are not PeZook2312342.

How can you be? You are not reincarnated into his body. There is no intangible factor that allows such existence.
"DO YOU WORSHIP HOMOSEXUALS?" - Curtis Saxton (source)
shroom is a lovely boy and i wont hear a bad word against him - LUSY-CHAN!
Shit! Man, I didn't think of that! It took Shroom to properly interpret the screams of dying people :D - PeZook
Shroom, I read out the stuff you write about us. You are an endless supply of morale down here. :p - an OWS street medic
Pink Sugar Heart Attack!
User avatar
PeZook
Emperor's Hand
Posts: 13237
Joined: 2002-07-18 06:08pm
Location: Poland

Post by PeZook »

Shroom, you don't get it.

How can you tell if you're the exact same Shroom who went to sleep yesterday? You can't. You never will, because if we postulate the capability for a perfect upload, then some mad genius may just as well be killing you every night, and tomorrow's Shroom will think he peacefully went to sleep last night, but in fact he is only a few hours old, grown in a vat during the night.

The gist of this is: there is no magical quality which makes you somehow special, perception-wise. If you have all the memories and experience you accumulated over the years, and a brain capable of thought, that makes "you".

That said, I still want to take the Ship of Theseus to my shiny new robot body, because...well, because I'm still scared of the possibility of not waking up :P
User avatar
Shroom Man 777
FUCKING DICK-STABBER!
Posts: 21222
Joined: 2003-05-11 08:39am
Location: Bleeding breasts and stabbing dicks since 2003
Contact:

Post by Shroom Man 777 »

I wouldn't know right now, with my poor pod person brain - I would never know. But tell that to the poor sorry Shroom who's now decomposing in the ditch while his unaware doppelganger carries on with his life. Will that make him feel any better?

I don't know shit. But he isn't exactly happy about it.

Now I will go to sleep and die. Your subsequent posts will be replied to by another person who is the same person.

Goodbye.



So, yes. There's no way to tell. Unless the Original Shroom knew he was being killed. Now, that won't make a lick of difference to me right now - since I'm not dead, and I'm living willy-nilly happily as a babelicious little body snatcher. It won't matter. Won't matter especially if I won't have any recollection of how the Original Shroom was screaming in abject horror as his intestines slid out of that hideous gash in his abdomen, as the writhing tentacles of that great cabbage constricted around his arms and legs while he watched feebly as the overgrown potato birthed an exact replica of him.

It won't matter to me if the Original Shroom's organs were liquefied while he was still alive and screaming, converted to precious bodily fluids so that the Space Cabbage would have something to nourish her beautiful newborn baby Shroom Clone (me) with.

It honestly doesn't. To me.

Don't know about that other poor sap though. Too bad he's now a compost heap.
User avatar
Steel
Jedi Master
Posts: 1122
Joined: 2005-12-09 03:49pm
Location: Cambridge

Post by Steel »

PeZook wrote:Shroom, you don't get it.

How can you tell if you're the exact same Shroom who went to sleep yesterday? You can't. You never will, because if we postulate the capability for a perfect upload, then some mad genius may just as well be killing you every night, and tomorrow's Shroom will think he peacefully went to sleep last night, but in fact he is only a few hours old, grown in a vat during the night.

The gist of this is: there is no magical quality which makes you somehow special, perception wise. If you have all the memories and experience you accumulated over the years, and a brain capable of thought, that makes "you".

That said, I still want to take the Ship of Theseus to my shiny new robot body, because...well, because I'm still scared of the possibility of not waking up :P
No one is arguing that you couldn't have been replaced in your sleep without knowing it. That is certainly true.

What is certain is that the person who got replaced the night before DID know about it. For them, death was exactly the same as if the new replica had not been created.

If there exists an event which we know as death, the existence or creation of a clone does nothing to change that event from the perspective of the deceased.
User avatar
PeZook
Emperor's Hand
Posts: 13237
Joined: 2002-07-18 06:08pm
Location: Poland

Post by PeZook »

:P

Mindfuck accomplished! :D
User avatar
Shroom Man 777
FUCKING DICK-STABBER!
Posts: 21222
Joined: 2003-05-11 08:39am
Location: Bleeding breasts and stabbing dicks since 2003
Contact:

Post by Shroom Man 777 »

Is that to me, or to Steel?

Either way...


JOIN US
User avatar
RogueIce
_______
Posts: 13387
Joined: 2003-01-05 01:36am
Location: Tampa Bay, Florida, USA
Contact:

Post by RogueIce »

So on this subject, what if there are two of me, RogueIce and RogueIce 2.0? Say I have a job, house, family, car, stock portfolio, whatever. Who gets it? The old me or the new me? What if new me never knows it's a clone, and old me falls off a cliff on the way home and nobody ever finds the body, while new me makes it there safe and sound. Is he thus entitled to my job, house, family, car, and stock portfolio?

What if we both show up at the same time, and there's no way to tell the difference (we're both metal constructs or something)? Then what? Do we flip a coin? Alternate?
"How can I wait unknowing?
This is the price of war,
We rise with noble intentions,
And we risk all that is pure..." - Angela & Jeff van Dyck, Forever (Rome: Total War)

"On and on, through the years,
The war continues on..." - Angela & Jeff van Dyck, We Are All One (Medieval 2: Total War)
"Courage is not the absence of fear, but rather the judgment that something else is more important than fear." - Ambrose Redmoon
"You either die a hero, or you live long enough to see yourself become the villain." - Harvey Dent, The Dark Knight
petesampras
Jedi Knight
Posts: 541
Joined: 2005-05-19 12:06pm

Post by petesampras »

Shroom Man 777 wrote:It's not the same book. Even if there are two identical books made by the same publisher, and I lay them to my left and to my right, they might be exactly the same, but they aren't. There is still the book on my left, and the book on my right. If the book on the left catches fire and the book on the right doesn't catch fire...
The books are physically different, but information-wise they are identical. This is the key issue in this whole discussion. For people of the cognitive science or AI persuasion, mental identity is purely a product of information. If you adopt that viewpoint, then destroying one copy of the same mind does not destroy that mind, since the information is not lost. If you believe that the mind depends, in some manner, on the exact matter used in a particular manifestation, then it's a whole different debate. The trouble with this very esoteric debate is that no one clearly defines what they mean by the different terms they use. I'd also make the claim that, despite what they might say, a lot of proclaimed materialists believe, implicitly, in some sort of soul. They just use terms like "personal perspective" or "conscious experience", but what they are speaking about is just as inscrutable as any religious definition of soul.
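The two-books point maps neatly onto code. As a quick illustration (the sample text and the choice of hash function are mine, not from the thread), two objects can be physically distinct yet carry byte-identical information, and destroying one leaves that information intact in the other:

```python
# Two "books": physically distinct objects carrying identical information.
# The sample text and the hash function are illustrative choices.
import hashlib

text = b"It's not the same book."
left = bytearray(text)      # the book on my left
right = bytearray(text)     # the book on my right

# Information-wise, the two are indistinguishable:
assert hashlib.sha256(left).digest() == hashlib.sha256(right).digest()

# Physically, they are separate instances:
assert left is not right

# "Burn" the left book; the information survives in the right one.
left[:] = b"ashes"
assert bytes(right) == text
```

On the information-centric view sketched above, what the hash captures is what matters, and it survives the "burning" of one physical instance.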
User avatar
Sikon
Jedi Knight
Posts: 705
Joined: 2006-10-08 01:22am

Post by Sikon »

Shroom Man 777 wrote:I mean, a perfect copy and replacement is possible...theoretically. But in implementation, there are going to be problems
For an engineering challenge as difficult as this, the large problem will tend to be solved by breaking it down into a series of smaller and easier problems, perfecting the solution for each, only then gradually progressing up to working on a hundred-billion neuron human brain.

That may mean first managing gradual replacement of a 300-neuron nematode brain, comprehensively testing and refining until the result approaches perfection; then someday managing the same on a brain as complex as a fish's; eventually, on brains as complex as those of mice, of cats, etc.

That includes reproducing the effect of hormones, everything involved in emotions, etc.

(I'm assuming a perspective like my own of desiring initially no noticeable changes to how one was biologically, then only later trying improvements like the ability to moderate anger or fear with a thought).

Besides, if one is the average person undergoing the procedure in a future scenario where the technology is available, one probably isn't the first person undergoing it or even one of the first million people undergoing it. The procedure may have been tested and refined over countless past human subjects, after the prior implementation on animals.

This is not to say the replacement process would be absolutely perfect, but the brain isn't absolutely perfect to begin with, e.g. a few out of one's 100 billion neurons dying all the time. Given enough technological refinement, it could be good enough that one wouldn't notice the difference.

The brain has some amount of plasticity, adaptability, and ability to remap itself, so to speak, as observed in studies of people recovering from brain injuries. Biological organisms are imperfect from top to bottom but remarkably good at functioning despite those imperfections. For example, one of a person's legs can even be a little longer than the other, by up to millimeters if I recall correctly.

Of course, self-replicating nanorobots are quite a complex engineering challenge, not something I would anticipate in the near-term future short of Singularity-style general AI development or a vast allocation of R&D resources over time, but I'm just observing that the preceding is definitely possible within the laws of physics. (Strictly speaking, they don't have to be self-replicating, but having them grow and reproduce like biological cells is the probable route for producing a hundred billion of them affordably, more like growing a hundred billion bacteria than manually assembling robots one by one.)
Shroom Man 777 wrote:You are trapped in a body that's no longer your own, as the parts of "you" are being replaced by more cybernetic components while the fleshy bits of your brain are removed.
"Fleshy bits" of brain being removed is poor terminology, sounding almost like a ghoul tearing out chunks. This is just a matter of a small number of the hundred billion microscopic neurons in one's brain being replaced per day, with no pain, no noticeable effect, and no harm observed by the individual.

If viewed in an x-ray, it might look a little like a tumor slowly growing in one's brain over the years, except it wouldn't be harmful.

Even in the adult brain, it is known now that there is a little continuing neurogenesis of thousands of new neurons a day (contrary to some outdated info in decades-old books). Every day some of your 100 billion neurons die and some new ones appear. You remain yourself. Gradual neural replacement would be little different in principle.

That's most obvious in an alternate possible technique of gradual brain replacement through new neurons genetically engineered to be less susceptible to senescence, given sufficiently advanced biotechnology. However, it's also true if they are nanorobotic replacements, which provide the potential for improved capabilities and longevity beyond anything biological. (After everything is replaced, the fun can begin, like "slowing down time" by speeding up one's brain to think many times faster if or when desired, since biological neuron networks operate on timeframes of milliseconds, but artificial ones could be capable of functioning on timeframes such as microseconds).
Shroom Man 777 wrote:And as less and less of you remain, you become less and less of a person while the cybernetic component becomes more you (to the outside world).

In the end, all that's left of "you" inside your mind is a feeble blabbering marginalized thought-process no different from a baby or that of an old person and darkness encroaches you as you finally achieve "immortality" as that last fleshy bit of your brain is replaced
No, your thought-process is based on the new nanotech neurons just as it incorporated the new biological neurons when you grew from a child into an adult. You'd just need to watch out that you weren't starting to notice anything going wrong, like memory loss; though it would still beat dying, you don't want to end up with the equivalent of Alzheimer's disease, but good enough technology should be able to avoid that.

Besides, this is a voluntary medical procedure, not a horror movie. If the person experienced anything wrong, if negative side-effects were noticed, he or she could stop the gradual process at any time.

Predating modern plastic surgery, The Island of Doctor Moreau is typical of the dystopic perspective common in fiction by portraying the ability to surgically modify animals and humans being used for horror and creating monsters. Yet, in the real world, there's a lack of motivation for such, and the actual use of plastic surgery is often to restore victims of serious burns and other injuries to a more normal appearance ... aside from other uses of less practical value or necessity but still not horrible.

Likewise, if the technology for gradual neuron replacement is eventually developed, it will tend not to be used to create horrible experiences but to extend the enjoyment of life, like other medical technologies.
Shroom Man 777 wrote:that's gonna be interesting (story) material
The tendency to look for dystopic perspective for fiction is understandable. After all, a story that is all happiness, peace, and light without conflict would tend to be boring compared to entertaining fiction with suffering, angst, and violence. The mirror universe Star Trek episodes were among the best.

However, it isn't necessary for technology to be the problem. For example, Star Trek is sometimes criticized for how often the conflict in the plot comes from events that seem to be the result of engineering incompetence. Technology is the backdrop of a story's setting, but conflict can come from other sources, like war, interpersonal conflict, crime, etc.

To me, it seems that the dystopic portrayal of new technologies has become so common as to be a cliché. For example, it's boringly predictable that when genetically-engineered people appear in a TV show, they are usually portrayed as the bad guys, with racist/speciesist discrimination against them condoned.

I'm reminded of how one web page illustrates the common tendency towards looking more for problems than for solutions:
THE DOCTOR'S DILEMMA

Suspend your disbelief and imagine the following miracle to have occurred: A young doctor working in a hospital discovers that he has the power to cure anyone under the age of seventy of any sickness or injury simply by touching the patient. Any contact, however brief, between any part of his skin and the skin of the patient will cure the disease.

He has always been devoted to his work, and he wants to use his gift to benefit humanity as much as possible. However, he knows that the gift is absolutely non-transferable, will last for his lifetime only, and will not persist in tissue separated from his body. This was explained by the angel or flying saucerite who gave it to him.

What will happen if he uses his gift?

What should he try to do and how should he go about it?

What is the most favorable result that can be expected?

I consider myself a member of the scientific rather than the literary culture, and my idea of the correct answers to the above questions reflects this. However, in order to mislead the reader, I shall give some pessimistic scenarios and related literary exercises.

LITERARY EXERCISES IN PESSIMISM AND PARANOIA

1. The doctor uses his gift, the other doctors are jealous and disbelieving and drive him from the hospital. He cures patients outside, they get him for quackery and put him in jail where he can't practice. Even in jail, he cures people, and the prison doctor has him put in solitary confinement. Even there he cures a guard of cancer and then the little daughter of the warden of the prison. This arouses the fears of the insecure, narrow minded, brutalized and bureaucratized prison doctors to the extent that they have him sent to a hospital for the criminally insane to be cured of his delusion. There, they lobotomize him. Write scenes in which doctors disbelieve cures taking place before their eyes, self justifying speeches by people who decide to imprison him even though they know better, and the report justifying his commitment to the mental hospital.

2. His gift is judged sacrilegious by the church of your choice. Fanatics are aroused by preachers, and our hero is burned at the stake. Write a speech justifying burning the doctor as a lesser evil compared to letting him go on violating God's law that man must suffer disease and death.

3. His gift is judged holy by a religion that gets control of him, and its use is surrounded by so much ritual that hardly anyone gets cured. Describe the ritual; make it beautiful.

4. People keep coming to him until he is exhausted, but there is always an emergency case more touching than all that have gone before and eventually he dies of exhaustion. Write his speech saying that he realizes he can cure more people if he gets some sleep, but true morality requires him to treat the immediate emergency.

5. He forms an organization for curing people and at first works very hard but gradually gets lazy, is corrupted by desire for money, power, fame and women, requires more and more flattery and obsequiousness, eventually strives single-mindedly for power, develops cruel tastes, comes to dominate the country, and is finally assassinated. Write speeches for him justifying his increased demands at various stages. Write the self-justifying speech of the assassin.

6. He is taken over by the U.S. government which either:

a. keeps him to cure members of the ruling military-industrial complex and to co-opt leaders of the people. Describe the subtle way in which a revolutionary is co-opted in the guise of being given a say in how the gift shall be used. Write the speech of a revolutionary refusing to be cured of his wounds after unsuccessfully trying to blow up the doctor.

b. devises a system of boards to allocate the use of his ability in the fairest possible way, but its operation is frustrated by injunctions and demonstrations by paranoid groups (your choice as to whether the groups are left, right or center) that cannot be convinced that his services are being allocated fairly. Write speeches charging that any of the following groups are not getting their fair share: Blacks, veterans, the poor, Southerners, policemen. Make up lists of demands on behalf of these groups.

[...]

12. In order to destroy his gift the doctor tricks some scientists into skinning him alive. Explain why he does this.

13. He brings about universal health and the population explodes.

14. Universal health is achieved, but when he dies medicine has been neglected, immunities are gone and plague wipes us out.

15. Write a great American novel combining as many of the above catastrophes as possible.

16. Write an impassioned letter to him urging him to keep his gift secret.

I believe that all the above catastrophes would be avoided and the gift made into a great benefit. Those readers who consider themselves as members of C. P. Snow's scientific culture should try to work out the best solution for a day or so before going to the solution page.

[...]

[...]

[...]

[...]

[...]

[...]

SOLUTION TO THE DOCTOR'S DILEMMA

A solution requires morality, common sense, and technology.

You flunk on moral grounds if you propose not to cure anybody.

Any attempt to cure as many people as possible gets a B. To get an A, you must do the arithmetic and see that it is possible to cure almost everybody for a while.

Clearly the gift is finite. The doctor will eventually die, and his patients will face disease again as they will anyway when they reach seventy. This is no reason not to get the maximum benefit.

It turns out that he can cure everyone in the world whose disease or injury can be diagnosed in time to bring him to the doctor. The solution is technological.

Approximately 60,000,000 people under seventy die each year, i.e. two people die each second. We build a machine that can move 12 people per second past him on each of ten moving belts. A mechanism should be provided to stop the motion of the finger of the patient momentarily so that it touches the doctor rather than brushes his skin.

On the basis of the arithmetic the doctor need only spend 1/60 th of his time curing people, i.e. 24 minutes per day.

In order to reduce transportation costs it might be desirable to build a number of machines in different regions of the world and for the doctor to make trips to these machines, say once a month, to get the slow diseases, and to fly the emergency cases to wherever he happens to be.

It would not be very difficult for the doctor to get this solution adopted given a reasonable degree of persuasiveness either on his own part or on the part of some former patients he could recruit to help him. Doctors are often skeptical, but we have postulated a miracle that would convince almost all of them. Politicians are often shortsighted and bureaucrats bumbling, but what would be required in this case is simple enough so that they could do it. It is not possible to predict whether any important opposition to the use of the gift would develop. If so, it might be necessary to protect the doctor from assassination and the equipment from sabotage, and even then, there would be some risk of disaster.

I have not postulated any mental or physical side effects but it would be necessary to watch for them as well as for possible adverse social side effects.

The use of this gift would contribute to the population problem but not so much as one might think. In the U.S. 4,000,000 people are born each year but less than 1,000,000 under 70 die each year and most of these are past the child-bearing age. Elimination of death under 70 would require for stabilising the population that couples limit themselves to an average of say 2.1 children rather than the 2.2 children that might be allowable otherwise.

In countries with larger death rates of young people the population effect would be larger, but ordinary medicine is already having a similar effect.

Some people find the above solution repulsive because it involves a big machine with moving belts which would probably be noisy. Maybe they don't like a technological solution to what has been conceived as a moral problem.

Other people think that a law of nature is surely being violated - namely, a law that says that any apparently worthwhile innovation involving technology surely must have harmful side effects at least equal in magnitude to the apparent benefit.

There remains, however, the literary problem. Namely, imagine that the above analysis is correct and that the problem would be solved. Imagine further that the doctor, while possessing the gift of healing, is not a super-organizer or super-hero of any sort. How could one make literature of such a situation? The pessimistic and paranoid fantasies of the previous section make much better literature, at least by present literary standards.
From here.
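The throughput arithmetic in the quoted solution checks out; a few lines reproduce it (all figures come from the quote itself):

```python
# Reproduce the arithmetic from the quoted "Doctor's Dilemma" solution.

deaths_per_year = 60_000_000              # people under seventy, per the quote
seconds_per_year = 365 * 24 * 3600
deaths_per_second = deaths_per_year / seconds_per_year
print(f"deaths per second: {deaths_per_second:.2f}")  # roughly 2

# Machine throughput: ten moving belts, 12 patients per second each.
throughput = 10 * 12                      # patients per second
duty_cycle = 2 / throughput               # the quote rounds deaths to 2/s

minutes_per_day = duty_cycle * 24 * 60
print(f"fraction of the doctor's time: 1/{throughput // 2}")
print(f"minutes per day: {minutes_per_day:.0f}")  # 24
```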
Earth is the cradle of humanity, but one cannot live in the cradle forever.

― Konstantin Tsiolkovsky
User avatar
Ohma
Jedi Knight
Posts: 644
Joined: 2008-03-18 10:06am
Location: Oregon
Contact:

Post by Ohma »

petesampras wrote:I'd also make the claim that, despite what they might say, a lot of proclaimed materialists believe, implicitly, in some sort of soul. They just use terms like "personal perspective" or "conscious experience", but what they are speaking about is just as inscrutable as any religious definition of soul.
I fail to see how people who argue that creating an exact robot replica of someone, and killing the original in the process, results in the original being dead while the robot doppelganger carries on, relatively indistinguishable to the outside world yet still a separate entity from the dead biological original, can be said to be the ones who believe in a soul.
Oh, Mister Darcy! <3
We're ALL Devo!
GALE-Force: Guardians of Space!
"Rarr! Rargharghiss!" -Gorn
petesampras
Jedi Knight
Posts: 541
Joined: 2005-05-19 12:06pm

Post by petesampras »

Ohma wrote:
petesampras wrote:I'd also make the claim that, despite what they might say, a lot of proclaimed materialists believe, implicitly, in some sort of soul. They just use terms like "personal perspective" or "conscious experience", but what they are speaking about is just as inscrutable as any religious definition of soul.
I fail to see how people who argue that creating an exact robot replica of someone, and killing the original in the process, results in the original being dead while the robot doppelganger carries on, relatively indistinguishable to the outside world yet still a separate entity from the dead biological original, can be said to be the ones who believe in a soul.
Right, but that is a strawman. If I kill you then you are dead. That is biology. If I scramble your cerebral cortex, but leave your lower functions intact, then you are not dead. You have no mind, no 'consciousness', but you are not dead. Dead and alive are biological states. The discussion here is about conscious identity, not life or death.

The question is not - "If I kill you, but there is a perfect copy of you, are you still alive?" - that would be a pointless question, since life and death are biological terms.

The question is - "If I kill you, but there is a perfect copy of you, has anything meaningful been lost?" - Where the term 'meaningful' here refers to the perspective of consciousness.

If you take the view of AI and cognitive science, that the mind is defined purely by the information it contains and processes, then the answer must be - no.
User avatar
Steel
Jedi Master
Posts: 1122
Joined: 2005-12-09 03:49pm
Location: Cambridge

Post by Steel »

petesampras wrote:
Ohma wrote:
petesampras wrote:I'd also make the claim that, despite what they might say, a lot of proclaimed materialists believe, implicitly, in some sort of soul. They just use terms like "personal perspective" or "conscious experience", but what they are speaking about is just as inscrutable as any religious definition of soul.
I fail to see how the people who argue that creating an exact robot replica of them and killing the original in the process will result in the original them being dead, with their robot doppelganger going on to be relatively indistinguishable to the outside world but still a separate entity from the original biological person (who is dead), can be said to be the ones who believe in a soul.
Right, but that is a strawman. If I kill you then you are dead. That is biology. If I scramble your cerebral cortex, but leave your lower functions intact, then you are not dead. You have no mind, no 'consciousness', but you are not dead. Dead and alive are biological states. The discussion here is about conscious identity, not life or death.

The question is not - "If I kill you, but there is a perfect copy of you, are you still alive?" - that would be a pointless question, since life and death are biological terms.

The question is - "If I kill you, but there is a perfect copy of you, has anything meaningful been lost?" - Where the term 'meaningful' here refers to the perspective of consciousness.

If you take the view of AI and cognitive science, that the mind is defined purely by the information it contains and processes, then the answer must be - no.
I don't think anyone here has ever disagreed on the case of a net loss of information.

All people are saying is that the individual has perished, in exactly the same way as if you stick a spoon in their ear and give their brains a good twirl: they are dead regardless of whether there is a clone backup / computer backup / identical version in a mirror universe.

You seem to be saying that people have a magical spirit, and if you destroy their physical body yet at the same time create a new one, the undetectable spirit flies into it and the person is none the wiser. However, if there is no new body created, then the magical spirit goes off to play with the elves. Patently shit.
petesampras
Jedi Knight
Posts: 541
Joined: 2005-05-19 12:06pm

Post by petesampras »

Steel wrote:
petesampras wrote:
Ohma wrote: I fail to see how the people who argue that creating an exact robot replica of them and killing the original in the process will result in the original them being dead, with their robot doppelganger going on to be relatively indistinguishable to the outside world but still a separate entity from the original biological person (who is dead), can be said to be the ones who believe in a soul.
Right, but that is a strawman. If I kill you then you are dead. That is biology. If I scramble your cerebral cortex, but leave your lower functions intact, then you are not dead. You have no mind, no 'consciousness', but you are not dead. Dead and alive are biological states. The discussion here is about conscious identity, not life or death.

The question is not - "If I kill you, but there is a perfect copy of you, are you still alive?" - that would be a pointless question, since life and death are biological terms.

The question is - "If I kill you, but there is a perfect copy of you, has anything meaningful been lost?" - Where the term 'meaningful' here refers to the perspective of consciousness.

If you take the view of AI and cognitive science, that the mind is defined purely by the information it contains and processes, then the answer must be - no.
I don't think anyone here has ever disagreed on the case of a net loss of information.

All people are saying is that the individual has perished, in exactly the same way as if you stick a spoon in their ear and give their brains a good twirl: they are dead regardless of whether there is a clone backup / computer backup / identical version in a mirror universe.

You seem to be saying that people have a magical spirit, and if you destroy their physical body yet at the same time create a new one, the undetectable spirit flies into it and the person is none the wiser. However, if there is no new body created, then the magical spirit goes off to play with the elves. Patently shit.
I have said nothing of the sort. Care to provide exact quotes of mine to back up that accusation?

There is no need for a magic spirit, for the simple reason that there is no special individual experience that needs to get transferred in the first place! All that exists to mental phenomena, based on all the available evidence, is information. Subjective experiences, the redness of red, qualia, whatever you want to call them: there is zero evidence that they exist. Our brains are machines processing information. No need for any magic uplink to a new body if you die. All that was ever there, in terms of your mental identity, was information.
Gigaliel
Padawan Learner
Posts: 171
Joined: 2005-12-30 06:15pm
Location: TILT

Post by Gigaliel »

Ohma wrote: I fail to see how the people who argue that creating an exact robot replica of them and killing the original in the process will result in the original them being dead, with their robot doppelganger going on to be relatively indistinguishable to the outside world but still a separate entity from the original biological person (who is dead), can be said to be the ones who believe in a soul.
Because you're assigning the Original special properties due to its place in space/time and what medium it's running on? This being opposed to the mental side, which suggests a mind is independent of the brain, as it can be run on a computer with no experimental difference at all?

Remember the first giganto Starglider post? At the end of it, the conclusion is that an AI would have to model the possible futures of itself and the other self as the same person in order to answer what its subjective future will be.

As for the robot clone thing, consider if we have two identical copies of you made from a template. Then they have different experiences and their minds diverge. One has its mind erased and then replaced with the template, and the other has its mind wiped back to the point it diverged from the template.

Why does the one who suffered amnesia have any special claim to its perspective surviving over the one who is now 'dead' and has been replaced? The differences between the two were their new experiences; those are now gone. Where did their perspectives go? Why is there any difference? The physical body?

This is the point I'm not getting. Why is your material body imperative to your perspective? People have said that if an identical copy of your mind replaces it in your sleep, you die. Right. So, if I turn your brain off and then turn it back on later, you died? Despite no empirical difference?

To emphasize, if we take for granted that all phenomena are by definition physical, then so is the self. This is why identical copies must be considered the same person. Thus, death is identical to amnesia.

Consider the death scenario: You were also 'alive' (frozen/paused) somewhere else when the other you died. You do not remember this death because there was no chance to sync your memories. The other you is the same person, but with more memories. When the other you died, all the new memories were destroyed. The other you still has all the old ones. Amnesia.

There, no magical transportation or spirit bull you keep bringing up. All I did was identify their minds as the same person because their identities were, well, identical.

Now, where is this different perspective coming from, if not the minds, which are identical? I can't see a definition that doesn't involve an AI dying billions of times if it switches which folder it's in billions of times. Biting this bullet would be acceptable, but I'm not sure how you'd go about showing it's an objective fact (as is being argued) rather than something subjective.
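The folder-switching case can be sketched concretely under the information-only premise (file names here are purely hypothetical, and the "mind" is stood in for by a byte string): a byte-for-byte copy followed by deletion of the original preserves every bit of the stored state, which is the sense in which this camp says nothing is lost.

```python
import hashlib
import os
import shutil
import tempfile

# A serialized "AI state" moved between folders: on the information-only view,
# the bytes are what persists, not any particular location on disk.
state = b"memories + traits + dispositions"

tmp = tempfile.mkdtemp()
src = os.path.join(tmp, "folder_a", "mind.bin")
dst = os.path.join(tmp, "folder_b", "mind.bin")
os.makedirs(os.path.dirname(src))
os.makedirs(os.path.dirname(dst))

with open(src, "wb") as f:
    f.write(state)

shutil.copy(src, dst)   # copy into the new folder...
os.remove(src)          # ...then destroy the original

with open(dst, "rb") as f:
    moved = f.read()

print(hashlib.sha256(moved).hexdigest() == hashlib.sha256(state).hexdigest())
# prints True: not a single bit was lost in the "move"
```

Whether deleting `folder_a` nonetheless counted as a death is, of course, exactly what the other camp disputes; the sketch only shows that no *information* is lost.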
User avatar
Resinence
Jedi Knight
Posts: 847
Joined: 2006-05-06 08:00am
Location: Australia

Post by Resinence »

I'm epic failing at my attempts to explain what I'm thinking (and how to argue it to others), so now I've just dumped how I see things in my head into a slideshow, since pictures make it easier for me (and maybe others) to understand. I'm mostly trying to explain how I see the issue of flash clones and how it doesn't necessarily mean death, which the last slide has most of my thoughts on, but it's all related.

Link (300 kilobytes, Flash)
“Most people are other people. Their thoughts are someone else's opinions, their lives a mimicry, their passions a quotation.” - Oscar Wilde.
petesampras
Jedi Knight
Posts: 541
Joined: 2005-05-19 12:06pm

Post by petesampras »

Gigaliel wrote: There, no magical transportation or spirit bull you keep bringing up. All I did was identify their minds as the same person because their identities were, well, identical.
This, to my mind, is key to the whole thing. If you have two copies of your mental identity, nothing needs to be transferred if one is destroyed. The information was already in the other copy.

It comes down to whether you accept mental phenomena as purely information constructs. I think, actually, a better question to get the two sides to see each other's position is not to do with death and re-birth issues but...

"If you have two identical copies of a mind, both completely unaware of each other's existence, are they the same mind?"

I think one side of the camp will view them as not being the same, since they would think that to be the same mind some shared mental connection would be needed. For the other side, the same mind means the same information. It is no different than asking whether two copies of Moby Dick are the same story.
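The two readings can each be stated in one line of code. The toy sketch below (hypothetical names, and it assumes for the sake of illustration that a mind reduces to a plain data structure, which is precisely the premise under dispute) shows that both camps can be right about different relations:

```python
import copy

# Two "minds" instantiated from the same template: identical by content,
# distinct as instances. Each side of the thread emphasizes one relation.
template = {"memories": ["learned chess", "read Moby Dick"], "forgetful": True}

mind_left = copy.deepcopy(template)   # the book on my left
mind_right = copy.deepcopy(template)  # the book on my right

print(mind_left == mind_right)   # True:  same information, "the same story"
print(mind_left is mind_right)   # False: two separate instances, "two books"
```

Value equality (`==`) is the "same mind means the same information" reading; object identity (`is`) is the "book on my left vs. book on my right" reading. The argument is over which relation matters when one instance is destroyed.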
petesampras
Jedi Knight
Posts: 541
Joined: 2005-05-19 12:06pm

Post by petesampras »

I've tried to lay out as simply as possible the cognitive science / AI position on this, as I understand it...

Premise 1 - A mind is defined purely as a product of the information it stores and processes.

Premise 2 - If you have two copies of the same mind and destroy one, no information is lost.

It is a purely logical consequence of these two premises that..

If you have two copies of the same mind and destroy one, the mind is not lost.

So, if you disagree with that statement, you must disagree with one, or both, of the two premises stated.
User avatar
Sikon
Jedi Knight
Posts: 705
Joined: 2006-10-08 01:22am

Post by Sikon »

As far as I know, the many-worlds interpretation of quantum mechanics with parallel universes is possibly, maybe true, so there might be multiple copies of my mind right now. That doesn't change my interest in avoiding death. I still want to avoid destruction of this particular copy of the information pattern, whether or not other identical or near-identical consciousnesses exist out there too.
Earth is the cradle of humanity, but one cannot live in the cradle forever.

― Konstantin Tsiolkovsky
User avatar
Singular Intellect
Jedi Council Member
Posts: 2392
Joined: 2006-09-19 03:12pm
Location: Calgary, Alberta, Canada

Post by Singular Intellect »

petesampras wrote:I've tried to lay out as simply as possible the cognitive science / AI position on this, as I understand it...

Premise 1 - A mind is defined purely as a product of the information it stores and processes.

Premise 2 - If you have two copies of the same mind and destroy one, no information is lost.

It is a purely logical consequence of these two premises that..

If you have two copies of the same mind and destroy one, the mind is not lost.

So, if you disagree with that statement, you must disagree with one, or both, of the two premises stated.
Pointing out that premise one is incorrect is easily done, which also invalidates number two.

Tell me, say you know a fellow named 'Bob'. Bob is forgetful, which most people would consider a significant attribute of who he is. Is that merely information, or something else? What if Bob has very fast reflexes, which make him a champion at ping pong, a sport he loves? This is arguably also something very significant that makes Bob the person he is. Is that also just a result of information? What about his disposition? What if he's prone to violent outbursts, or suffers bouts of depression? Is this merely information in his mind, or something else?

What makes up an individual is obviously much more complicated than the ridiculously simplified idea of just 'information'.

It's even far more complicated when you take into account that people do change. Maybe Bob suffered episodes of depression, but has since changed and is now much happier than he was. What if Bob gained weight by working out, becoming healthier and more fit? It's extremely easy to point out how all these changes might constitute a 'new' Bob...but do we consider him an entirely different entity because of that, or an altered version? Improved version? What?

If you intend to define a person as merely information, then wow, my two hard drives are two different people, since both contain large quantities of information.
petesampras
Jedi Knight
Posts: 541
Joined: 2005-05-19 12:06pm

Post by petesampras »

Bubble Boy wrote:
petesampras wrote:I've tried to lay out as simply as possible the cognitive science / AI position on this, as I understand it...

Premise 1 - A mind is defined purely as a product of the information it stores and processes.

Premise 2 - If you have two copies of the same mind and destroy one, no information is lost.

It is a purely logical consequence of these two premises that..

If you have two copies of the same mind and destroy one, the mind is not lost.

So, if you disagree with that statement, you must disagree with one, or both, of the two premises stated.
Pointing out that premise one is incorrect is easily done, which also invalidates number two.
Invalidating premise one would not invalidate premise two; they are separate statements.

Tell me, say you know a fellow named 'Bob'. Bob is forgetful, which most people would consider a significant attribute of who he is. Is that merely information, or something else? What if Bob has very fast reflexes, which make him a champion at ping pong, a sport he loves? This is arguably also something very significant that makes Bob the person he is. Is that also just a result of information? What about his disposition? What if he's prone to violent outbursts, or suffers bouts of depression? Is this merely information in his mind, or something else?
Everything you have expressed here can be represented adequately using information, with no need to invoke any special mind properties. Depression can be described via the processing of information, as can violent outbursts. There is no reason, in theory, that you could not program a sufficiently powerful computer to have violent outbursts according to the same pattern that Bob does. Thus the violent outbursts can be generated purely through information processing. The brain does process information. There is no need to invoke special mind properties; apply Occam's razor, and information processing is all you need.

What makes up an individual is obviously much more complicated than the ridiculously simplified idea of just 'information'.

It's even far more complicated when you take into account that people do change. Maybe Bob suffered episodes of depression, but has since changed and is now much happier than he was. What if Bob gained weight by working out, becoming healthier and more fit? It's extremely easy to point out how all these changes might constitute a 'new' Bob...but do we consider him an entirely different entity because of that, or an altered version? Improved version? What?
This is circular reasoning, because it is only a problem if you insist that your mental identity is a fixed and non-continuous phenomenon, which in turn is only necessary if you take the view that there exists a fundamentally unique and individual experience. There is no reason why one must do this. I am clearly less the person I was 10 years ago than the person I was 5 years ago. Were we to live forever, one might imagine that we might eventually become entirely different people.
If you intend to define a person as merely information, then wow, my two hard drives are two different people, since both contain large quantities of information.
False reasoning. That the mind is information does not mean that everything containing information is a mind.
User avatar
Ohma
Jedi Knight
Posts: 644
Joined: 2008-03-18 10:06am
Location: Oregon
Contact:

Post by Ohma »

petesampras wrote:I am clearly less the person I was 10 years ago than the person I was 5 years ago. Were we to live forever, one might imagine that we might eventually become entirely different people.
However, that doesn't mean that the 10-years-younger you is equivalent to dead. Were that the case, you would not exist (unless something more or less impossible happened).
Oh, Mister Darcy! <3
We're ALL Devo!
GALE-Force: Guardians of Space!
"Rarr! Rargharghiss!" -Gorn
petesampras
Jedi Knight
Posts: 541
Joined: 2005-05-19 12:06pm

Post by petesampras »

Ohma wrote:
petesampras wrote:I am clearly less the person I was 10 years ago than the person I was 5 years ago. Were we to live forever, one might imagine that we might eventually become entirely different people.
However, that doesn't mean that the 10-years-younger you is equivalent to dead. Were that the case, you would not exist (unless something more or less impossible happened).
As has been stated repeatedly, alive and dead are biological phenomena. If I got into a car accident and lost all higher brain functions, I could still be alive, but I would have no mind. Equally, there is no reason a non-living machine could not, in theory, have a mind.
User avatar
Singular Intellect
Jedi Council Member
Posts: 2392
Joined: 2006-09-19 03:12pm
Location: Calgary, Alberta, Canada

Post by Singular Intellect »

To sum up my point of view: if I was asked what I would lose by being copied 100% mentally and physically (while my original self is destroyed in the process), I would assert that I'm losing my specific subjective frame of reference of self. My copy wouldn't lose that, but would actually just be obtaining it, since his subjective frame of reference of self only begins after the completion of his creation.

Obviously, a subjective frame of reference of self is utterly impossible to transfer with the destruction method (copying would be easy), by virtue of two of me being capable of existing at once, both possessing such a frame.

The only way such a frame of reference isn't lost is if the process is not a method of duplication but a method of alteration. Hence, things like learning to play ping pong or growing up don't make me a separate entity, because the frame of reference for self is unchanged. Destruction of an entity would clearly destroy that subjective frame of reference, even if another subjective frame of reference is created.
User avatar
Ohma
Jedi Knight
Posts: 644
Joined: 2008-03-18 10:06am
Location: Oregon
Contact:

Post by Ohma »

petesampras wrote:As has been stated repeatedly, alive and dead are biological phenomena. If I got into a car accident and lost all higher brain functions, I could still be alive, but I would have no mind. Equally, there is no reason a non-living machine could not, in theory, have a mind.
And I or others have argued that AIs couldn't?
Oh, Mister Darcy! <3
We're ALL Devo!
GALE-Force: Guardians of Space!
"Rarr! Rargharghiss!" -Gorn
Post Reply