MarshalPurnell wrote:Human thought is chemical patterns firing in a biological computer, the brain.
Just to clarify, human thought is primarily ionic conduction along neuron cell membranes, with chemical messengers transmitting activation from one cell to another via diffusion (a laughably inefficient scheme by technological standards, but diffusion is evolution-friendly and elemental metals aren't). However, there are many complex chemical reaction chains that play a role in longer-term processes by influencing synapse properties (e.g. glial cells are at least as numerous as neurons in the brain, and are known to have some role in information processing, but as yet we have a very limited understanding of it).
That organic computer does weird things like drawing associations between data that have no logical connection, requires periods of down-time to rest, and so on.
That isn't particularly weird. A lot of AI systems do the same thing; the stereotype of computers being 'logical' is perhaps a legacy of 1960s/70s AI research and the origins of computer science in formal logic. I personally believe that pure probabilistic logic is the best way to do (high-level) thought, and that with correct structuring it is also the most hardware-efficient, but that is probably a minority view in AI right now. As for resting, modern computers do go into a low-power state when not needed. The brain chews up a lot of biochemical resources, so minimising that at night (when historically humans couldn't do much anyway) makes sense.
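By 'probabilistic logic' I mean, at its absolute simplest, weighting hypotheses by evidence rather than deducing hard true/false verdicts. Not that anyone posts code on here, but the core operation is a one-liner Bayesian update; here's a toy sketch in Python (the problem and all the numbers are invented purely for illustration):

```python
def bayes_update(prior, likelihood):
    """Posterior over hypotheses: P(h|e) is proportional to P(h) * P(e|h)."""
    unnorm = {h: prior[h] * likelihood[h] for h in prior}
    total = sum(unnorm.values())
    return {h: p / total for h, p in unnorm.items()}

# Made-up toy problem: did it rain last night, given that the grass is wet?
prior = {"rain": 0.3, "no_rain": 0.7}
evidence_likelihood = {"rain": 0.9, "no_rain": 0.2}  # P(wet grass | h)

posterior = bayes_update(prior, evidence_likelihood)
# 'rain' now carries most of the probability mass: 0.27 / 0.41, about 0.66
```

No theorem-proving, no binary verdicts; belief just shifts smoothly as evidence comes in, which is exactly the 'illogical' association-drawing behaviour being complained about.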
But functionally, "you" exist as a kind of program, shaped by life-experiences and genetic predispositions, being run on this organic computer.
Yes. I do wonder if it would be easier to shut these crypto-dualist 'continuity flaw' people up if the brain operated digitally (i.e. it was clocked and synapse properties were calibrated in discrete increments), so there would be literally no difference between a copy and an original instead of an insignificantly small one. Probably not; it isn't a rational argument, after all.
A computer program that is shut down on one computer, deleted, and then transferred to another computer is still objectively the same program, at least until it gets patched, and I think the position of those in favor of uploading is that the same applies to the "you" that is a program run on a brain.
The lengths to which some people go to deny this are amusing. I recall some fluff text for the game 'Mass Effect' that stated that 'true AIs rely on a quantum black box that cannot be copied or transmitted over a network'. Of course there was no sane reason for it, just a ludicrous attempt to rationalise the 'one body == one unique individual' intuition. To be fair, the respected mathematician and cod philosopher Roger Penrose wrote a couple of books saying how the human mind just had to be dependent on quantum effects, and lots of otherwise intelligent people believed him (some still do), despite the fact that his argument was really nothing more than 'the mind is mysterious, and quantum entanglement is mysterious, so they must be closely linked!'. His bullshit was quickly disproved by actual neurologists, yet those books are
still popular with faux-philosophers.
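The 'objectively the same program' point above isn't even philosophy, it's trivially checkable: hash the bytes before and after the transfer. A toy sketch in Python (the 'program' here is just a stand-in byte string, obviously, not a real binary):

```python
import hashlib

def fingerprint(program: bytes) -> str:
    """SHA-256 digest: two blobs have the same digest iff their bits match."""
    return hashlib.sha256(program).hexdigest()

# Stand-in for a compiled binary: an arbitrary blob of bytes.
program_on_machine_a = b"example program bytes"

# 'Transfer': the bytes are copied to another machine, after which the
# original copy could be deleted. What arrives is bit-for-bit identical.
program_on_machine_b = bytes(program_on_machine_a)

assert fingerprint(program_on_machine_a) == fingerprint(program_on_machine_b)
```

There is no 'quantum black box' anywhere in that process, and nothing for one to even do.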
On the other hand, computer programs do not, so far as we know, have any subjective experiences.
Subjectivity is something you have to explicitly design in. It's also a complicated thing that exists in many forms and levels, rather than a simple binary property. Does a chimp have subjective experience? A dog? A cockroach? We already have many AI programs with more subjective experience than a cockroach; they have an explicit, internal self-environment embedding model (and in some rare cases reflective self-models as well). Most computer software doesn't have it because it simply doesn't need it.
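For anyone wondering what a 'self-environment embedding model' means at its absolute simplest: the program's world model includes an entry for the program itself, and it keeps that entry consistent with its actions. A deliberately trivial sketch (the class and its names are mine, not from any real system):

```python
class ToyAgent:
    """Minimal 'self-in-environment' model: the agent's model of a
    one-dimensional world includes the agent's own position in it."""

    def __init__(self, world_size: int = 5):
        self.world_size = world_size
        self.self_position = 0  # the agent, as modelled *by the agent*

    def move(self, delta: int) -> None:
        # Act on the world, then keep the self-model consistent with it
        # (positions are clamped to the world's bounds).
        self.self_position = max(0, min(self.world_size - 1,
                                        self.self_position + delta))

    def distance_to(self, target: int) -> int:
        # Queries about the world are answered relative to the self-model.
        return abs(target - self.self_position)
```

A cockroach doesn't have even this much; reflective self-models (the model containing a model of the modelling) are the interesting next level up, and much rarer.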
Would you, the program being run in a particular instance, subjectively experience life if deleted and then booted onto a new instance?
Of course. Any attempt to argue otherwise is venturing into the same realm of nonsensical rationalisation as 'qualia'. It's really quite tiring; hopefully future intelligences designed with sensible and comprehensive reflection capabilities will regard these debates with the same bemused amusement with which they regard religion.
And I'm not sure there's any way to prove that subjective experience would match the objective reality,
Maybe, but dammit, we're going to try. We're going to deconstruct the brain to the molecular level, we're going to document every little process, and we're going to shove it in the faces of these crypto-dualists until they can't hide behind pseudoscientific rationalisations any more. They can still just be blatantly superstitious, of course, but at least then we can point and laugh.