Brain Recording

SLAM: debunk creationism, pseudoscience, and superstitions. Discuss logic and morality.

Moderator: Alyrium Denryle

User avatar
Kuroneko
Jedi Council Member
Posts: 2469
Joined: 2003-03-13 03:10am
Location: Fréchet space
Contact:

Re: Brain Recording

Post by Kuroneko »

Stas Bush wrote:
Kuroneko wrote:What's the importance of the replacement being slow as opposed to arbitrarily fast or even (hypothetically) instantaneous?
The creation of a second instance of yourself.
What are you talking about? No copies were involved at that point. The only purpose of that scenario was to ensure consistency with the case of copying--the later variation.
Stas Bush wrote:So as long as two instances do not exist simultaneously, the transfer is different from when both exist simultaneously. Is that really a "nebulous" criterion? I don't think so.
If that's your criterion of personhood, which is the real issue, then you contradict yourself in the very next breath:
Stas Bush wrote:
Kuroneko wrote:If the original matter is reassembled into the same configuration, ... .
Both are the same person.
If both are the same person, then one person exists prior to the "clone-disposal" and one afterward. How does that square with your use of "murder"? Do you mean that such terms (incl. personal pronouns, presumably) refer to something more specific than individual persons?
Stas Bush wrote:You may say that from the human perspective, something existing less than the timeframe of a neuron firing may as well be nonexistent - but that's not so from the objective perspective.
That has absolutely nothing to do with anything I've said here. The short timeframe is only to ensure there is no divergence between mind states and hence (by one definition) the same personal identity and hence (also by one definition) the same person.
Stas Bush wrote:In that case you seem to think that if two copies exist simultaneously for less than the timeframe of human nerve activity, one of them in fact has not objectively existed.
Perhaps it might seem that way, but if you think logically about what I've actually said, I'm sure you'll realize that it is neither stated in nor logically implied by any of my posts in this thread. I'm only talking about personhood; that physically separated objects can nevertheless be the same person is something you've explicitly admitted above, so at this point I'm not even sure what you're arguing against. It would help quite a bit if you explicitly defined your position (cf. (1)-(3)).
"The fool saith in his heart that there is no empty set. But if that were so, then the set of all such sets would be empty, and hence it would be the empty set." -- Wesley Salmon
User avatar
K. A. Pital
Glamorous Commie
Posts: 20813
Joined: 2003-02-26 11:39am
Location: Elysium

Re: Brain Recording

Post by K. A. Pital »

Kuroneko wrote:What are you talking about? No copies were involved at that point. The only purpose of that scenario was to ensure consistency with the case of copying--the later variation.
Replacement is different from copying, since no copies are created at this point. That's a critical difference, and it's easy to grasp; it holds for conscious entities and simple material objects alike.
Kuroneko wrote:If both are the same person, then one person exists prior to the "clone-disposal" and one afterward. How does that square with your use of "murder"? Do you mean that such terms (incl. personal pronouns, presumably) refer to something more specific than individual persons?
One personality exists in multiple instances. This is uncommon for sentient entities, because no examples of such have hitherto been produced. But it's fairly common with inanimate material objects - identical tools, etc. Why do I use "murder"? Well, I can use "destruction" - I don't really think we're dealing with anything other than just matter and energy here.

The closest we can come to an experimental simulation of this situation is, say, biological twins at the moment of conception. Quite likely there are no unique differences in their biological structure for a timeframe after it happens; it is also likely that their brains are not self-aware. If we terminate one of them before any differences manifest, can we say that no destruction has occurred and that we're dealing with the same object, "man", whatever, as the destroyed one?
Kuroneko wrote:The short timeframe is only to ensure there is no divergence between mind states and hence (by one definition) the same personal identity and hence (also by one definition) the same person.
Hah. But the subjective perception of both instances of one person is irrelevant here, since we objectively know both exist: they occupy different positions in space.
Kuroneko wrote:I'm only talking about personhood; that physically separated objects can nevertheless be the same person is something you've explicitly admitted above
It can be the "same" person, but it's definitely not the same material object. The existence of two material objects, even identical ones, doesn't allow us to call it some sort of "uploading" instead of admitting that this is the creation of a physical copy and the destruction of the other copy.

Yeah, I see your position. You seem to consider it irrelevant for "personhood" because brain activity has not occurred and the "persons" have not diverged. I prefer not to use "personhood" or "youness" or other descriptors of self-perception. From the point of view of an omniscient objective observer, or a logical machine observing the process, a copy was created and one of the copies was destroyed. That is all. The self-perceptions of both copies are irrelevant.

My position is fairly materialistic: I don't even see a need to differentiate between conscious and plain material objects here.
There are churches, rubble, mosques and police stations; there, borders, unaffordable prices and biting cold
There, swamps, threats, snipers with rifles, papers, nighttime queues and clandestine migrants
Here, meetings, struggles, synchronized steps, colours, unauthorized huddles,
Migratory birds, networks, information, squares mad with passions...

...Tranquillity is important, but freedom is everything!
Assalti Frontali
User avatar
K. A. Pital
Glamorous Commie
Posts: 20813
Joined: 2003-02-26 11:39am
Location: Elysium

Re: Brain Recording

Post by K. A. Pital »

To explain the use of terms, let's consider identical weapons - say, the V-1 missile. All missiles are serially produced; for the sake of a logical machine we can assume they are absolutely identical (human manufacturing error is irrelevant here, because there would be a margin of error in copying the brain as well).

Why do we refer to the surviving examples of the missile as "surviving"? Does it imply that the destroyed instances of the same missile are different in some way from those which were not destroyed?

This is why I used "murder" when referring to the destruction of an identical copy of a human. The destruction of humans is most of the time called "murder". You may object and say that the termination occurred in such a small timeframe that the subject copy did not experience consciousness. That is irrelevant - if someone destroys a person during their sleep, or loss of consciousness, or coma, it would still be called "murder", wouldn't it?

You may feel the word has bad connotations; in that case I'd like to see a solid explanation why the destruction of a second human being is not murder.
TheLostVikings
Padawan Learner
Posts: 332
Joined: 2008-11-25 08:33am

Re: Brain Recording

Post by TheLostVikings »

Stas Bush wrote:
Kuroneko wrote:What are you talking about? No copies were involved at that point. The only purpose of that scenario was to ensure consistency with the case of copying--the later variation.
Replacement is different from copying, since no copies are created at this point. That's a critical difference, and it's easy to grasp; it holds for conscious entities and simple material objects alike.
Replacement and copying are the exact same process, and you cannot prove otherwise.

Each time you go to replace an individual neuron with a replacement copy, it logically follows that you must first have made a copy of the original - otherwise, how exactly are you getting that identical replacement?

Whether you can safely copy them all at once without risking loss of data, or must slowly copy them one by one, is irrelevant and largely a matter of engineering and available technology. Thus if you believe that gradual replacement is possible, you must also concede that sufficiently advanced technology would make the "copy" action identical.

Q.E.D.
User avatar
K. A. Pital
Glamorous Commie
Posts: 20813
Joined: 2003-02-26 11:39am
Location: Elysium

Re: Brain Recording

Post by K. A. Pital »

TheLostVikings wrote:Replacement and copying are the exact same process, and you cannot prove otherwise.
Really? Has it occurred to you that the existence of two different instances of an object in spacetime is copying, while when no such copy exists, it is replacement? In replacement, two different sentient entities do not exist at different points of spacetime; ergo, there has not been a copy of the human in question.
TheLostVikings wrote:Each time you go to replace an individual neuron with a replacement copy, it logically follows that you must first have made a copy of the original, otherwise how exactly are you getting that identical replacement?
Copying a part of an object does not mean we have created a copy of the entire object. To demonstrate: if we created half of a glass ball, that's not a copy of the ball. It's even more striking in the case of a brain - individual copies of neurons aren't a copy of the brain until they are assembled in the exact same pattern as the brain in question.
TheLostVikings wrote:Thus if you believe that gradual replacement is possible you must also concede that sufficiently advanced technology would make the "copy" action identical.
I never assumed the copy is not identical. You might as well say the copies of V-1 are identical, in my example above. After all, they are just copies, right?

To elaborate more on the "identical" scenario: assume we made two clones via Angier's Tesla machine, and both awake in an identical maze, totally incapacitated. Both are driven on a railway along the maze, receiving the exact same sensory input - for the sake of the logical machine, individual variations of sight, hearing and smell here are just a margin of error.

At the end of the maze stand two identical men with guns. One kills the arriving man, the other does not. Was murder committed?
User avatar
Kuroneko
Jedi Council Member
Posts: 2469
Joined: 2003-03-13 03:10am
Location: Fréchet space
Contact:

Re: Brain Recording

Post by Kuroneko »

Stas Bush wrote:Replacement is different from copying, since no copies are created at this point.
Some answers to one have logical implications for possible answers to the other (a matter of consistency), which is (as I've said more than once before) the only reason I'm bothering to ask about it in the first place. That is because copying can be achieved by replacement immediately followed by reassembly of the original bits. Your continued avoidance of the no-copy case is therefore unwarranted.
Stas Bush wrote:One personality exists in multiple instances.
Fair enough. A huge chunk of this exchange could've been avoided if you had simply said that you interpret personal pronouns (and, I gather, various terms like 'murder') as referring to instances of persons rather than persons. That was question (3) in the first post you replied to, and an issue I've repeatedly asked you to clarify.
Stas Bush wrote:If we terminate one of them before any differences manifest, can we say that no destruction has occurred and we're dealing with the same object, "man", whatever, as the destroyed one?
Inasmuch as there was a person at that point in the first place, we can say that no destruction of a person has occurred in those circumstances, but that person was in a sense diminished by that destruction. Your position implies this as well, since you've stated that having the same state implies being the same person.
Stas Bush wrote:Hah. But the subjective perception of both instances of one person is irrelevant here, since we objectively know both exist: they occupy different positions in space.
For the third time: my argument has absolutely nothing to do with the subjective perceptions of those instances, whether self-perception or perception by any other being. That they have the same state is an objective criterion.
Stas Bush wrote:It can be the "same" person, but it's definitely not the same material object.
I've no idea why you keep pressing that straw man. No one here has claimed that they can't be materially distinguished; on the contrary, the Duchess and Starglider seem to have repeatedly gone out of their way to acknowledge this fact.
Stas Bush wrote:From the point of an omniscient objective observer, or a logical machine observing the process, a copy was created and one of the copies was destroyed. That is all.
It's not quite all--from the same viewpoint, no relevant information was lost or gained in that process.
Stas Bush wrote:The self-perceptions of both copies are irrelevant.
I fully agree with that statement.
Stas Bush wrote:I don't really think we're dealing with anything other than just matter and energy here.
And that's really the crux of the matter. I don't feel attached to the particular atoms I'm made of, and indeed any part of me could hypothetically be replaced with something else entirely as long as it functions in a sufficiently close manner. Hence I see "me", and personal identities in general, in terms of information or functional structure, so that we are dealing with something other than matter and energy: information and function. Matter and energy are only important insofar as they enable that function.
User avatar
K. A. Pital
Glamorous Commie
Posts: 20813
Joined: 2003-02-26 11:39am
Location: Elysium

Re: Brain Recording

Post by K. A. Pital »

Kuroneko wrote:A huge chunk of this exchange could've been avoided if you had simply said that you interpret personal pronouns (and, I gather, various terms like 'murder') as referring to instances of persons rather than persons. That was question (3) in the first post you replied to, and an issue I've repeatedly asked you to clarify.
The distinction between "person" and "instance of person" is really moot. Consider the twins example: both are delivered by a surgeon. He stabs one in the neck, but the other lives. Until the point of the destruction of one, their sensory input has been identical; their brains and bodies are genetically identical. Is that infanticide, or is it the erasure of "another me"?

Do we consider the potential continued existence of Copy A a worthwhile characteristic with its own personal implications, or not? What if Q decided that Earth is not to his liking, created in an instant another fully identical Earth 200,000 LY away - making 14 billion humans exist at one point in time - and then "erased" the original 7 billion along with their Earth? Is that just erasure, or mass murder? Is the potential future existence of those other 7 billion a personal characteristic or not?
Kuroneko wrote:It's not quite all--from the same viewpoint, no relevant information was lost or gained in that process.
Do we decide only on the basis of gain or loss of information? In the infanticide of a twin, no genetic or memory information that would not be duplicated was lost. Does that make it valid?

The question is: does the future existence of Copy A alongside Copy B have FUTURE informational value - i.e., would more information be gained if both exist into the future than if just one survives? Does it have moral value as well?

If not, then thank you for the discussion - I believe we hold roughly the same position but have different views on the value of the existence of identical objects.
Forum Troll
Youngling
Posts: 104
Joined: 2009-02-15 05:00pm

Re: Brain Recording

Post by Forum Troll »

Starglider wrote:It's really quite tiring; hopefully future intelligences designed with sensible and comprehensive reflection capabilities will regard these debates with the same bemused amusement they regard religion.
You believe this is going to happen in reality?

What scans and uploads the brain? This isn't a microns-thick specimen slice but an object billions of times the volume.

MRI has a resolution of a fraction of a millimeter when doing an object that big. Field strength of equipment went from 0.3 tesla to 3 tesla to increase the resolution. Some units are 14 tesla, but too high is unsafe for humans. Even a sloppy virtual copy of an individual's brain would take thousands of times the resolution.
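To put "thousands of times the resolution" in numbers, here's a back-of-the-envelope sketch (all figures are my own rough assumptions: a ~1.2 litre brain, ~0.5 mm clinical MRI voxels, and a 1 micron voxel target for even a sloppy structural copy):

```python
# Back-of-envelope: voxels resolved by clinical MRI over a whole brain
# versus what even a crude structural copy would need. All inputs are
# rough illustrative assumptions, not measured values.

BRAIN_VOLUME_UM3 = 1.2e15   # ~1.2 litres expressed in cubic microns
MRI_VOXEL_UM = 500.0        # ~0.5 mm: typical clinical MRI voxel edge
TARGET_VOXEL_UM = 1.0       # assumed 1 micron edge for a "sloppy" copy

mri_voxels = BRAIN_VOLUME_UM3 / MRI_VOXEL_UM ** 3       # ~9.6e6 voxels
target_voxels = BRAIN_VOLUME_UM3 / TARGET_VOXEL_UM ** 3  # ~1.2e15 voxels
linear_factor = MRI_VOXEL_UM / TARGET_VOXEL_UM           # 500x per axis
volume_factor = target_voxels / mri_voxels               # ~1.25e8x data

print(f"MRI voxels:               {mri_voxels:.1e}")
print(f"Target voxels:            {target_voxels:.1e}")
print(f"Linear resolution factor: {linear_factor:.0f}x")
print(f"Data volume factor:       {volume_factor:.1e}x")
```

So even a modest micron target means hundreds of times the linear resolution and a hundred-million-fold increase in data volume - before you even ask for molecular detail.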

You might as well ask for a quadrillion teslas. Not going to happen.

Usage of x-rays, CAT scans? Again, there's a reason those have fraction-of-a-millimeter resolution instead of perfect resolution. Putting a relatively mild x-ray dose into the target brain to get current resolution is one thing, but seeking literally billions of times more bits of data and hitting a 1E15 cubic micron object with too many rads? Hot irradiated goo wouldn't constitute a brain any more.

Particle beams? There are 1E14 neural connections in the brain, and it'd take vastly more than one particle impact and scattering per connection to even roughly map them out, even if one accepts a sloppy, imperfect copy that skips molecular resolution and any knowledge of chemical distributions at that level of detail.

You can't use low-energy particles here, as they have to be able to burn through the centimeters of material in their way. So again you would probably dump way too many particles and joules of hard radiation into what you were trying to scan.

The Standard Model of current physics may or may not be right, and there might be new particles discovered in the future. Yet it would be foolish to assume that one of them would somehow be far better in this application than current particle choices - an idea screaming of wishful thinking rather than unbiased extrapolation.

Brain scanning for perfect uploads screams of a giant no-limits fallacy.

Ultrasound? Its frequency can't be made astronomically high enough to obtain the resolution desired here. Infrared? It wouldn't pass through centimeters of brain tissue.

Some futurists propose nanorobots to examine the brain from the inside up close, but the tech to do that would imply the tech to keep the brain alive indefinitely, without needing to attempt a sudden upload.
Forum Troll
Youngling
Posts: 104
Joined: 2009-02-15 05:00pm

Re: Brain Recording

Post by Forum Troll »

Starglider wrote:The thing is though that you can always conduct a series of thought experiments that show that if you accept so much as replacing a single neuron with a transistor without 'killing the self', you must inevitably accept flash uploading as logically equivalent.
I'm not convinced they are equivalent.

Killing one neuron doesn't kill a person. A few neurons out of 1E11 die every week. Killing them all at once, vaporizing the whole brain, would be totally different. If one neuron at a time is replaced, one couldn't be said to die in a particular moment even if the replacement was terribly imperfect in reality, but a terribly imperfect replacement of the whole brain at once couldn't even be truly called a copy, just a vague resemblance.
User avatar
Starglider
Miles Dyson
Posts: 8709
Joined: 2007-04-05 09:44pm
Location: Isle of Dogs
Contact:

Re: Brain Recording

Post by Starglider »

Forum Troll wrote:
Starglider wrote:It's really quite tiring; hopefully future intelligences designed with sensible and comprehensive reflection capabilities will regard these debates with the same bemused amusement they regard religion.
You believe this is going to happen in reality?
Creation of rational general AI: yes, extremely likely. There are various things that could kill the current human civilisation (e.g. biowarfare), but that just means the next civilisation, or if necessary the next evolved sapient species, builds one. Human uploading: extremely likely if we build AGI and don't kill ourselves in the process.
What scans and uploads the brain? This isn't a microns-thick specimen slice but an object billions of times the volume.
It is not necessary to scan the entire brain volume in one go as long as you are not concerned about preserving the human copy. The standard approach (typically discussed by transhumanists as the 'first generation' of the technique) is to perfuse the brain with vitrification (and optionally staining) agents then cool it below glassification temperature. You can then scan the brain in a large number of flat layers using mechanical or laser ablation. This is rather time consuming even with a massively parallel scanner array, but that's ok, there's no particular rush. Simple optical microscopy will work if the structure of the dendrite/axon trees is all that's required; some of the brain simulation people say that it is, because they can infer synapse properties from network structure, but conservatively you're going to want to measure synapse properties (specifically, relative density of the different neurotransmitter receptors). That means electron microscopy (SEM or REM) or AFM; AFM is easier to make massively parallel but what you gain in that you probably lose by making your layers so thin.
MRI has a resolution of a fraction of a millimeter when doing an object that big. Field strength of equipment went from 0.3 tesla to 3 tesla to increase the resolution. Some units are 14 tesla, but too high is unsafe for humans.
This is true only for single-coil machines that rely on magnetic gradient for localisation. Modern phased-array machines have already exceeded the resolution possible by magnetic gradient alone; I am certainly not an expert on MRI, but I have debated the software side of brain simulation with people who are and most of them seem quite confident that massively parallel phased arrays will eventually deliver the resolution required. Of course this still assumes that synapse properties can be inferred from structure and a predictive model; fMRI could potentially help with this, but neuronal scale fMRI would be very hard.
Usage of x-rays, CAT scans?
I've never seen anyone (seriously) propose CAT scanning as an upload technique, but actually it probably could work on a vitrified brain. You could dice it into small cubes and scan as slowly as necessary to avoid heating it so much that it thaws. Similarly you could in principle slice it into relatively thick (say millimeter) layers and scan each of those with infrared; the problem is not so much reassembling the scans into a whole-volume model, which is a relatively straightforward stitching operation - the problem would be doing the slicing without losing significant volumes of brain. Really it doesn't seem worth the hassle; a full-brain progressive ablative scan is the best bet if you don't have adequate MRI.
Brain scanning for perfect uploads screams of a giant no-limits fallacy.
Actually the limiting factors are simply cost and time. We already have technology sufficient to image the entire brain at a molecular level, but a scan with current AFM technology would take an impossibly long time (I have not done the maths but I suspect billions of years). Fortunately nearly all of the relevant technology is subject to Moore's-law type scaling, which we have already observed in the rapid progress in imaging over the last two decades or so.
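For what it's worth, the missing maths can be sketched. With my assumptions - a ~1.2e15 cubic micron brain, "molecular level" taken as 1 nm voxels, and a single AFM sustaining an optimistic 1e6 voxels per second - you get:

```python
# Rough scan-time estimate for a single AFM imaging a whole brain at
# molecular (~1 nm) resolution. All inputs are loose assumptions.

BRAIN_VOLUME_UM3 = 1.2e15   # ~1.2 litres in cubic microns
VOXEL_EDGE_NM = 1.0         # "molecular level" taken as 1 nm voxels
VOXELS_PER_SEC = 1.0e6      # optimistic sustained single-AFM rate
SECONDS_PER_YEAR = 3.15e7

NM3_PER_UM3 = 1.0e9         # (1000 nm)^3 in one cubic micron
total_voxels = BRAIN_VOLUME_UM3 * NM3_PER_UM3 / VOXEL_EDGE_NM ** 3
scan_years = total_voxels / VOXELS_PER_SEC / SECONDS_PER_YEAR
afms_for_one_year = total_voxels / (VOXELS_PER_SEC * SECONDS_PER_YEAR)

print(f"Voxels to image:        {total_voxels:.1e}")   # ~1.2e24
print(f"Single-AFM scan time:   {scan_years:.1e} years")
print(f"AFMs to finish in 1 yr: {afms_for_one_year:.1e}")
```

Under these assumptions a single instrument needs on the order of tens of billions of years - consistent with the "billions of years" suspicion - and finishing within a year means tens of billions of parallel instruments.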
Some futurists propose nanorobots to examine the brain from the inside up close, but the tech to do that would mean the tech to keep it alive without trying to attempt a sudden upload.
I'm not sure what you're implying here. Some of the more optimistic nanotech people seriously believe that nanorobotics is going to deliver this capability before external bulk imaging, but frankly I don't think those people are being realistic about the timescales for engineering development of nanorobotics (unless general AI comes along). However, the physical plausibility of the relevant nanotechnology has been demonstrated quite adequately by computer models. There is no need for any special technology to 'keep the brain alive'; you simply attach an interface to a human and allow it to start threading their neural network with nanofibres. The model used in thought experiments tends to be 'a robot swims up to a neuron and measures or replaces it', but that's oversimplified for clarity; realistic designs tend to work by creating an artificial network in parallel with the existing biological one (it can be much smaller and lighter, since it's externally powered and can use real processors and conductors). In passive mode this can be used to flash-upload at any time simply by capturing a structure and activity snapshot (this is a prerequisite for 'perfect' uploading of a conscious human; bulk scanning techniques imply actual or effective unconsciousness). In active mode you can replace neurons (with simulated equivalents) on a progressive basis, e.g. for 'gradual uploads' for the squeamish, or implement any of the interesting hypothetical splits discussed here.
Killing one neuron doesn't kill a person. A few neurons out of 1E11 die every week. Killing them all at once, vaporizing the whole brain, would be totally different.
In that case please answer my question of 'exactly how much of your brain can I flash upload in one go without killing you'. Of course you may not subscribe to the ludicrous binary notion of selfhood that certain other individuals in this thread hold to.
If one neuron at a time is replaced, one couldn't be said to die in a particular moment even if the replacement was terribly imperfect in reality, but a terribly imperfect replacement of the whole brain at once couldn't even be truly called a copy, just a vague resemblance.
If the difference is externally noticeable, then the technology is imperfect, no question. Early human-developed techniques will undoubtedly exhibit significant deviation, but that's fine; the technology will improve steadily (AGI-developed versions may bypass this).
User avatar
K. A. Pital
Glamorous Commie
Posts: 20813
Joined: 2003-02-26 11:39am
Location: Elysium

Re: Brain Recording

Post by K. A. Pital »

I once again must ask what the hell "selfhood" is, and whether we are truly justified in killing a second human if both have absolutely identical genetic information and nigh-identical informational input to their brains. Do we hold their future existence as having any value? Do we look at it from an objective viewpoint, or are we just happy that the brain cannot subjectively notice, due to biological limitations, that it was copied and one of the copies was immediately destroyed, making the being unable to comprehend its own destruction?

I assume it would be perfectly reasonable for me to go and kill one of the twins at the moment of birth; after all, no unique genetic or memory information is lost, and an extra mouth to feed is an unnecessary burden on human civilization. Right? Does someone who has no time or no ability to comprehend his own destruction have "selfhood"? If yes, why - since it logically follows from your assumption that he has none. If no, see above.
Forum Troll
Youngling
Posts: 104
Joined: 2009-02-15 05:00pm

Re: Brain Recording

Post by Forum Troll »

Starglider wrote:Human uploading: extremely likely if we build AGI and don't kill ourselves in the process.
Upgrading for immortality is more tolerant of imperfection and more reasonable than trying flash uploading.
MRI has a resolution of a fraction of a millimeter when doing an object that big. Field strength of equipment went from 0.3 tesla to 3 tesla to increase the resolution. Some units are 14 tesla, but too high is unsafe for humans.
This is true only for single-coil machines that rely on magnetic gradient for localisation. Modern phased-array machines have already exceeded the resolution possible by magnetic gradient alone; I am certainly not an expert on MRI, but I have debated the software side of brain simulation with people who are and most of them seem quite confident that massively parallel phased arrays will eventually deliver the resolution required. Of course this still assumes that synapse properties can be inferred from structure and a predictive model; fMRI could potentially help with this, but neuronal scale fMRI would be very hard.
Can you reference a publication for the assumptions and math? "Brain simulation" is a bit nebulous here, since what is useful for medical or AI research is a lot different from what could debatably constitute perfect uploading and immortality.

I'd be curious to see it, but so far it looks like they may just be people who incorrectly assume Moore's Law applies everywhere.

If you have a billion coils instead of the handful in current phased array coil equipment, do you get a corresponding increase like a billion times the data per unit volume, or is scaling less than linearly proportional, subject to diminishing returns or running into physical limits long before?
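As a toy illustration of the linear-versus-sublinear question: a common rule of thumb is that array-coil SNR grows roughly with the square root of the number of independent coils, not linearly. Under that assumption (and it is only a rule of thumb - real arrays also hit g-factor penalties and physical limits), the gain from adding coils falls off fast:

```python
import math

# Toy comparison: resolution/SNR gain if coil scaling were linear versus
# the sqrt(N) rule of thumb often quoted for phased-array SNR. Purely
# illustrative; not a model of any real scanner.

def snr_gain_linear(n_coils: int) -> float:
    """Hypothetical best case: gain proportional to coil count."""
    return float(n_coils)

def snr_gain_sqrt(n_coils: int) -> float:
    """Rule-of-thumb case: gain proportional to sqrt(coil count)."""
    return math.sqrt(n_coils)

for n in (8, 1_000, 1_000_000_000):
    print(f"{n:>13,} coils: linear {snr_gain_linear(n):.1e}, "
          f"sqrt rule {snr_gain_sqrt(n):.1e}")
```

Under the sqrt rule, a billion coils buys a factor of about 3e4, nowhere near a billion-fold increase in data per unit volume.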

Nanometer resolution has long been possible from nanometers away, but nanometer resolution from centimeters (many millions of nanometers) away on a 1E15 cubic micron object?
It is not necessary to scan the entire brain volume in one go as long as you are not concerned about preserving the human copy. The standard approach (typically discussed by transhumanists as the 'first generation' of the technique) is to perfuse the brain with vitrification (and optionally staining) agents then cool it below glassification temperature. You can then scan the brain in a large number of flat layers using mechanical or laser ablation. This is rather time consuming even with a massively parallel scanner array, but that's ok, there's no particular rush.
Much more likely to be possible.

However, even aside from the terrible economics and astronomical resource requirements, in the process the person would suffer a clear time of death, with disruption of the original electrochemical potentials and activity.

Afterward, depending upon the amount of imperfection in the scan, either that person is restored to life or, arguably, an AI imperfectly resembling him, with partially similar personality and memories, is created.

Vitrification (which only somewhat reduces freezing damage, as in cryopreservation), ablation, and the rest of the process are all going to introduce imperfections on the micro scale in the very inhomogeneous frozen brain.
Actually the limiting factors are simply cost and time. We already have technology sufficient to image the entire brain at a molecular level, but a scan with current AFM technology would take an impossibly long time (I have not done the maths but I suspect billions of years). Fortunately nearly all of the relevant technology is subject to Moore's-law type scaling, which we have already observed in the rapid progress in imaging over the last two decades or so.
Moore's Law? Not quite so applicable.

Even for CPUs, over recent decades, a common PC processor made for hundreds of dollars has remained roughly a 1-square-centimeter die with an active layer of microns to sub-micron thickness. Transistor count in that area has increased due to finer lithography, but it is as impossible now as it was 30 years ago to cheaply produce a large volume of anything manufactured to sub-micron complexity.

Expressed in terms of volume, the cost of that tiny, thin CPU layer has remained on the order of billions of dollars per liter.
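That cost density is a one-liner to check; the die size, active-layer thickness, and price below are assumed round figures for illustration:

```python
# Cost density of sub-micron CPU manufacturing, expressed per liter.
# Assumed figures: 1 cm^2 die, ~1 micron of actively patterned thickness, $300 per chip.
die_area_cm2 = 1.0
active_layer_cm = 1e-4                                # 1 micron expressed in cm
price_usd = 300.0

volume_l = die_area_cm2 * active_layer_cm / 1000.0    # 1 liter = 1000 cm^3
print(f"${price_usd / volume_l:.0e} per liter")       # ~$3e+09: billions of dollars
```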

More directly relevant here: 50 years ago, electron microscopes already existed that could image regions of a few cubic microns at nanometer scale. Electron microscopes today are not thousands of times cheaper; fancy ones still cost millions of dollars each. They are not subject to the same Moore's Law scaling.

To image an entire cross-section of a frozen brain at once with AFMs, each covering roughly a 100 micron by 100 micron region at a time, you'd need around a million of them, or else spend eons moving a smaller number around; and that's just the tip of the iceberg when going through so many thousands of layers.

A quadrillion-cubic-micron object is about as unaffordable to image at sufficient resolution now as it was several decades ago.
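For concreteness, here is a sketch of the parallelism and time involved. All of the scan-field, step-size, layer-thickness, and pixel-rate figures are assumptions, and generous ones:

```python
# Back-of-envelope: layer-by-layer AFM scan of a vitrified brain.
# Assumptions: 100x100 micron field per AFM, 10 nm lateral step, 100 nm layers,
# 1.4e15 um^3 of tissue with ~1.4e10 um^2 per cross-section, 1e6 pixels/s per AFM.
field_um = 100.0
step_nm = 10.0
layer_um = 0.1
cross_section_um2 = 1.4e10
brain_volume_um3 = 1.4e15
pixel_rate = 1e6                                            # per instrument, optimistic

pixels_per_field = (field_um * 1000.0 / step_nm) ** 2       # 1e8 pixels per field
fields_per_layer = cross_section_um2 / field_um ** 2        # ~1.4e6 AFMs to tile one slice
layers = brain_volume_um3 / (cross_section_um2 * layer_um)  # ~1e6 layers
total_pixels = pixels_per_field * fields_per_layer * layers

years_single_afm = total_pixels / pixel_rate / 3.15e7       # seconds per year
print(f"AFM fields per layer: {fields_per_layer:.1e}")
print(f"single AFM, full scan: {years_single_afm:.1e} years")
```

Even with a million instruments running in parallel, that still works out to years of continuous scanning under these generous assumptions; at finer, truly molecular resolution the single-instrument figure balloons further toward the "billions of years" ballpark.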

You need technology so advanced that the slightest resemblance to economic and practical limits as known today can be thrown out the window. That may be possible someday, but other methods of life extension are easier.
In active mode you can replace neurons (with simulated equivalents) on a progressive basis, e.g. for 'gradual uploads' for the squeamish, or implement any of the interesting hypothetical splits discussed here.
Yes, once the tech existed.

Before then, it could be possible to counter the decline of aging biologically. The flexibility of the brain and its ability to restructure are enormous. In a famous case of extreme hydrocephalus, a man survived the loss of most of his original brain volume, with it reduced to a layer a centimeter or two thick. Moderately advanced biotechnology might allow adding neurons to the brain, or even to its exterior after removing or rearranging some of the skull.

Eventually going electronic would offer obvious advantages, although sufficient nanorobotic technology is probably a far harder goal on near-term timeframes than using biological cells, which already self-replicate. Nanorobots would have to be self-replicating; otherwise there's no damn way you could economically afford to make billions of them per person, since such complex machines can't be made for even 1 cent each.
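The economics are easy to sketch; the per-robot cost here is an assumption, and a wildly optimistic one for a complex machine:

```python
# Why non-self-replicating nanorobots are economically hopeless.
# Assumed figures: billions of robots per person, 1 cent each (absurdly cheap).
robots_per_person = 1e9
cost_per_robot_usd = 0.01
print(f"${robots_per_person * cost_per_robot_usd:,.0f} per person")  # $10,000,000
```

Ten million dollars per person even at a cent apiece, and scaling up toward one robot per neuron (~1e11) would make it a billion dollars. Self-replication, as with biological cells, is the only way the unit cost approaches zero.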
Killing one neuron doesn't kill a person. A few neurons out of 1E11 die every week. Killing them all at once, vaporizing the whole brain, would be totally different.
In that case please answer my question of 'exactly how much of your brain can I flash upload in one go without killing you'. Of course you may not subscribe to the ludicrous binary notion of selfhood that certain other individuals in this thread hold to.
A non-binary view may be appropriate.

A perfect copy is doubtful magic: for anything operating in the real world on messy 3D biological structures and chemical distributions, there are far more limits at larger scales than the Heisenberg uncertainty principle alone. So, if one assumes imperfections, the goal should be to minimize the degree of imperfection, error, and damage.

Analogy and example: how great a brain injury can somebody take and stay the same person? That's about impossible to define precisely. People have survived a lot in recorded medical cases, but are they the same person, especially when their personality changed drastically in the more extreme cases?

Yet I know enough for pragmatic decision-making. I know I want to minimize the degree of a brain injury I take.

I would feel better about anything that killed 0.1% of my neurons annually over many years than about suddenly losing 10% in an accident. Likewise, I would prefer an imperfect replacement of a small fraction of neurons per year over trying to (imperfectly) upload them all at once.

There might be some personality change and memory loss even in the former case, but at least then it would be so gradual that one could try to track it, keep journals, etc. Either one never died, or one died so slowly, spread out over the years, that it would be hard to call it death at all, with never any particular day to fear.
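The gradual-versus-sudden comparison is easy to quantify; the 0.1% annual rate is just the illustrative figure from above:

```python
# Cumulative loss from slow attrition vs. a sudden 10% insult (illustrative only).
annual_loss = 0.001                          # 0.1% of neurons per year
for years in (10, 30, 100):
    lost = 1.0 - (1.0 - annual_loss) ** years
    print(f"{years:3d} years: {100 * lost:.1f}% lost")
```

Even after a full century at that rate, the cumulative loss (~9.5%) stays under the sudden 10% accident, and it arrives thinly enough spread for the brain, and the person, to adapt.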
Last edited by Forum Troll on 2009-02-20 11:16am, edited 1 time in total.
Forum Troll
Youngling
Posts: 104
Joined: 2009-02-15 05:00pm

Re: Brain Recording

Post by Forum Troll »

Stas Bush wrote:I once again must ask what the hell is "selfhood" and are we truly justified in killing a second human if both have absolutely identical genetic information and nigh-identical information input in their brain? Do we hold their future existence as having any value?
I'd value the life of a hypothetical mental clone of myself as I would a brother's, actually more than a random stranger's. The identical-twins example is a good one. Similarity doesn't eliminate value.

Three methods and their ethics:

A) Imagined flash-uploading with a perfect whole-brain scan.

If such were physically possible, the only ethical thing would be to let both the original and the copy continue to live afterward, unless it is viewed as acceptable for one to voluntarily end his own existence.

The physical possibility is rather debatable anyway, even for an imperfect reproduction, let alone a perfect one: it means getting billions of times as much data from an object of that volume as current MRI resolution limits allow.

B) Vitrification, then electron microscopy / AFM scanning of a frozen brain ablated layer-by-layer.

The person's brain is automatically destroyed; there is no decision to make while performing the process, as the destruction is unavoidable.

If the copy is considered the same person, the original just transferred their mind to a new substrate. If it is not considered necessarily the same person, or not truly a full copy (perhaps especially considering imperfections), they committed suicide.

C) Gradual replacement, the Ship of Theseus approach.

No real ethical problems. Otherwise there is just loss of original neurons from aging and the brain shriveling anyway, so adding new neurons or nanotech is better.