AI Ethics

SLAM: debunk creationism, pseudoscience, and superstitions. Discuss logic and morality.

Moderator: Alyrium Denryle

User avatar
Starglider
Miles Dyson
Posts: 8709
Joined: 2007-04-05 09:44pm
Location: Isle of Dogs
Contact:

AI Ethics

Post by Starglider »

I was debating with a user called 'Tarzing' here, who displayed a fair amount of cluelessness followed by a moral position I find both bizarre and disgusting. I gave up on this thread out of disgust, hoping that it was obvious to everyone with a clue how wrong this guy is, but maybe I was too hasty. If I owned the forum, I would consider his statements logically equivalent to advocating slavery and mass-murder of humans, not to mention simple racism, and instaban him. The fact that he's a sado-masochist and claims he doesn't mind the same being done to him just makes it worse.

This train wreck started when someone claimed that the best way to make AIs was to let them evolve in a simulated world, create some sort of test to find out whether they meet your needs (including loyalty), then delete all the ones that fail and implant the ones that pass into robot bodies. Apparently this is moral regardless of how humanlike/sapient these intelligences are. Here are some choice statements of his (bolding mine). Anyone here willing to defend this twit, or care to help in destroying his position?

The solipsist crap at the beginning really should've tipped me off:
Tarzing wrote:
Starglider wrote:
Whitehawke wrote: I must be missing something here. When the AIs die, some (most?) simply die. The ones who would be thrilled to do a particular job are given the opportunity to do that job. How, exactly, is this immoral?
So killing sapients is just fine in your book as long as it isn't particularly painful? And it's fine to decree arbitrary lifespans whether they like it or not. OK, I decree that you will have a 40-year lifespan, and then you will have a massive brain hemorrhage and die instantly. Don't bother complaining; it's perfectly moral by your own definition, and since I'm actually an avatar of the intelligence running the simulation you believe is reality, you can't do anything to stop me.
Um how do you know reality isn't EXACTLY like that?

Actually oddly enough I've certainly followed that train of thought before. "Reality" is a kind of test, if you are enlightened enough to figure out the true nature of reality, you pass and leave the reality simulation, into the real world. If you don't figure it out, you either get kicked out of the simulation and failed, deleted, or reborn into the simulation. Part of the reason I liked (well, liked) this theory is that the world often makes very little sense, you'd have to be pretty stupid to believe in it sometimes.

Oh, what kind of test? It's like a test, to see if your mind is strong enough to recognize when it's being deceived. Imagine like, a test for secret agents or something, who have to have tremendous mental willpower to resist sophisticated interrogation techniques. If you can't figure out that reality isn't real (and the way out of reality), you just suck, you fail.

In the end I decided I liked the Buddhism philosophy better (which doesn't attempt to answer questions about the nature of reality, only questions about the nature of the mind).
While I've seen plenty of people waste time with pointless navel gazing, this is the first time I've seen someone navel-gaze their way into a rationalisation for torturing and killing an indefinite number of sentient beings for personal benefit and amusement. His true position quickly becomes apparent:
Tarzing wrote:It becomes clear, that the Intelligences with physical bodies, are "real" and the ones without, are mere simulations, to be treated as such. To me this morality sounds good enough, an intelligence which can't be separated from the physical realm without killing it, has the right to live. Those which can be freely copied, deleted etc, don't have any right to live, because their existence is just so damn shallow (this would actually make a good definition for "Synthetic AI's", if an AI can be copied whole-cloth into a hardware device, and doesn't have significant capacity to evolve in that device, it can be morally killed towards any ends).
Yep, being 'shallow' (i.e. sanely designed) means you have no rights, no matter how intelligent you are, how human your emotions are, whether you have self-awareness etc.

I start pulling his arguments apart:
Starglider wrote:
Tarzing wrote:I strongly suspect that the innately non-linear nature of hardware (or wetware, if you please), gives it much more processing power than software
'Innately non-linear' = what? Do you mean 'analogue'? If so, the benefits of digital operation massively outweigh those of analogue operation - note that brains are digital in the amplitude domain anyway; they're just analogue in their gate settings and clockless (advanced hardware is probably clockless/asynchronous too).

In any case software is massively more powerful than fixed function hardware, because it can adapt itself exactly to the task. Completely reconfigurable hardware (i.e. advanced FPGAs) would effectively remove the hardware/software distinction.
Tarzing wrote:As such (real) AI's will be bound to bodies, it wont be possible to download, copy or delete them.
This is nonsense. It would have to be a hopelessly awful hardware design not to be able to give a state dump. It would be a nightmare to develop, you couldn't mass produce it, it would just be pointless.
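
To make the state-dump point concrete, here's a minimal sketch (the AgentState class is purely hypothetical, a stand-in for whatever the AI's real runtime state would be) of why any sanely designed software intelligence can be dumped and copied:

Code:
import copy
import pickle

class AgentState:
    """Hypothetical stand-in for an AI's complete runtime state."""
    def __init__(self, weights, memories, goals):
        self.weights = weights      # learned parameters
        self.memories = memories    # episodic memory store
        self.goals = goals          # current goal stack

original = AgentState(weights=[0.1, 0.7], memories=["event A"], goals=["task 1"])

snapshot = pickle.dumps(original)    # full state dump to a byte string
restored = pickle.loads(snapshot)    # lossless reconstruction elsewhere
duplicate = copy.deepcopy(original)  # or an in-memory clone

assert restored.memories == original.memories
assert duplicate.goals == original.goals
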
Tarzing wrote:I also say that it wont be possible to copy humans into computers! While you might be able to emulate an existing human in a computer, it would be the whole clone/death thing.
Well you're wrong. Probably in several different ways at once, but particularly in how continuity of consciousness works. Even your silly model doesn't rule out gradual uploading anyway.
Tarzing wrote:If all the intelligences bound to physical bodies, are magnitudes more powerful than AI's in software, then morality probably gets a lot simpler. The software AI's become unimportant, sort of like Sims really.
You cannot distinguish morally between intelligences based on minor implementation details.
He initially presents his notions as semi-hypothetical, but he quickly drops the pretense of 'if it works like this... then these conclusions follow' and starts declaring his "slavery is a-ok if I can declare it 'not real'" position to be objective truth.
Tarzing wrote:
Starglider wrote:
Tarzing wrote: Oh I do deal with it. If I am a simulation and I freely acknowledge that possibility, I'm perfectly happy to be terminated at any time! I can thus condone terminating simulations under me without any hypocrisy.
That is a highly bizarre moral position. You're saying it's fine to kill you or not based on an implementation detail which neither of us has any way of detecting? Despite it having zero effect on your cognitive capabilities, your ability to feel and empathise, your dreams and desires, etc?
It's NOT a bizarre moral position. It's basically saying "I don't care what the reality of nature is, I'm going to act as if my actions are significant".
Tarzing wrote:
Starglider wrote:
Tarzing wrote:Is it wrong to wake up in the morning because that wipes out all the beings in the dream world? Wink.
There aren't any 'beings in the dream world', there are just memories your brain has confabulated. Humans have nowhere near the ability to completely encapsulate/simulate another sapient.
What if dreams are reality and reality is dreaming?

I say it is okay but not on the basis of the feeble processing power of my brain. I say it's okay because the dream characters exist in isolation from reality, I'm the only one who can interact with them, thus I can do what I please with them without feeling guilt.

If I myself am in that position then whoever is dreaming me can do whatever they want with me including ending my existence, I don't mind.
Tarzing wrote:
Starglider wrote:
Tarzing wrote: I really don't see what the big deal is. THE SAPIENTS IN THE HIGHEST REALITY ARE THE ONLY ONES WHO MATTER.
You haven't done anything to justify this statement as anything other than a bizarre personal pronouncement. Though I'd note that going around torturing your simulations for personal amusement is going to ensure that you're never let out of the box into reality Wink
Tarzing wrote:May sound harsh but anything else brings up too many ethical issues.
You can't be serious; you're saying 'I've adopted this utterly ridiculous ethical system because I can't be bothered to think things through. It's too hard! My head hurts! Let's just declare the problem irrelevant!'
Are you familiar with the "Fourteen Unanswerable Questions" (of Buddhism, which deal with the nature of reality)? Wink
(I personally was rather impressed and relieved that the Buddha had declared a class of questions to be unanswerable, that's some good honest intellectual integrity so rarely seen)

I'm talking about the unknowable nature of reality. If you can find a way to end my existence without my suffering and without making any other sapient suffer, then go ahead. I don't mind. Because if you do do it, I wont know it. You see: My suffering does not depend on the ability of a higher being to delete me, because I don't know the probability. This holds true as long as the nature of reality is unknowable.
More semi-solipsist idiocy. Incidentally I had classed Buddhism as one of the most harmless religions, but this guy is actually trying to use it to justify (currently theoretical, but plausible in the near future) horrors on a scale beyond the dreams of the worst burn-the-heretics Christians. Later Tarzing tries to show something approaching mercy, but fails pathetically:
Tarzing wrote:It is not in fact okay to go around deleting sapients at will if other sapients who knew them, endure. You can create a being in a void, then delete that being. Or you can shut down an entire simulation. But you can't go around torturing beings willy-nilly, unless they perceive that as "Just the way reality is". If we are a simulation, then it's morally okay in the Upper-Reality, to make simulations of worlds which aren't paradise, because our reality is not paradise, and they made it. If we are Alpha-Reality, I don't see why we should be obligated to make simulations which are nicer than our reality.
I haven't even bothered including his many and gross technical errors, in both hardware and AI design, despite the fact he claims:
I am an AI programmer (job title I didn't choose it Razz), although I only do expert systems and have the intellectual integrity to not claim that as AI programming.
Nor have I bothered copying over his inane and unworkable ideas about how to ensure AI loyalty, which I also ripped apart and which he wisely chose not to try and support further. Of course the fucker has been picking and choosing which of my morality points he bothers to respond to as well.
User avatar
Starglider
Miles Dyson
Posts: 8709
Joined: 2007-04-05 09:44pm
Location: Isle of Dogs
Contact:

Post by Starglider »

I'd note that it's my position that it's certainly possible, although somewhat difficult, to make very intelligent and apparently 'self-aware' AGIs that do not actually have humanlike sapience (or emotions or goals or 'qualia' or desires to better themselves or anything else like that), which we can use as tools and not worry about causing harm to a being worthy of moral treatment. I smacked down Ender's pathetic objections to this in a debate in the HAB a while back (following on from the best design for robotic soldiers). However that is a red herring with regard to the above debate, which is focusing on the morality of using and abusing intelligences even if they are completely humanlike in structure (e.g. human uploads). I take some comfort from the fact that in practice any attempt to do this will almost certainly fail horribly in the long run and end up with people like Tarzing being tried for 'crimes against sapient life' by the very intelligences they tried to enslave.
User avatar
dragon
Sith Marauder
Posts: 4151
Joined: 2004-09-23 04:42pm

Post by dragon »

Who knows - the AIs that end up living in the new bodies might just be a tad upset if they find out we killed others of their kind as a form of test.
User avatar
Turin
Jedi Master
Posts: 1066
Joined: 2005-07-22 01:02pm
Location: Philadelphia, PA

Post by Turin »

Starglider wrote:
Tarzing wrote: I also say that it wont be possible to copy humans into computers! While you might be able to emulate an existing human in a computer, it would be the whole clone/death thing.
Well you're wrong. Probably in several different ways at once, but particularly in how continuity of consciousness works. Even your silly model doesn't rule out gradual uploading anyway.
While I agree that he's wrong that it isn't (in principle) possible to copy humans into computers, I'm not sure what that has to do with continuity of consciousness. If I non-destructively copy you, there are now two diverging consciousnesses, no?
Tarzing wrote:If we are a simulation, then it's morally okay in the Upper-Reality, to make simulations of worlds which aren't paradise, because our reality is not paradise, and they made it. If we are Alpha-Reality, I don't see why we should be obligated to make simulations which are nicer than our reality.
Note that this position is nearly identical to that of the Christian who feels that it's okay for God to kill us because he made us. You're arguing with a religious line of "reasoning," which means I doubt you're going to get very far.
User avatar
NecronLord
Harbinger of Doom
Posts: 27384
Joined: 2002-07-07 06:30am
Location: The Lost City

Re: AI Ethics

Post by NecronLord »

Starglider wrote:The fact that he's a sado-masochist and claims he doesn't mind the same being done to him just makes it worse.
Oi. S&M doesn't have anything to do with the perverse position this guy's espousing. At all.
Superior Moderator - BotB - HAB [Drill Instructor]-Writer- Stardestroyer.net's resident Star-God.
"We believe in the systematic understanding of the physical world through observation and experimentation, argument and debate and most of all freedom of will." ~ Stargate: The Ark of Truth
User avatar
Starglider
Miles Dyson
Posts: 8709
Joined: 2007-04-05 09:44pm
Location: Isle of Dogs
Contact:

Post by Starglider »

Turin wrote:While I agree that he's wrong that it isn't (in principle) possible to copy humans into computers, I'm not sure what that has to do with continuity of consciousness. If I non-destructively copy you, there are now two diverging consciousnesses, no?
'Continuity of consciousness' is one of those irritating philosophical terms that doesn't have a hard definition and is normally invoked by either the delusional (e.g. outright religious dualists or crypto-dualists) or the outright ignorant. Not that I'm letting this forum off in that regard, because I've seen plenty of people bleating about transporters 'interrupting continuity of consciousness' (when they don't just whine about it killing people) in the archives.

But anyway, he's claiming that the upload cannot be regarded as the same person as the original in any moral or legal sense. In actual fact when you clone someone perfectly the two intelligences both have perfectly valid (and equal) claims to be a continuation of the original; their cognitive content is (initially) identical and causally dependent on all the same past events (assuming the cloning is perfect). Subjectively, you experience a 50/50 chance of 'becoming' the original or the clone.
Tarzing wrote:If we are a simulation, then it's morally okay in the Upper-Reality, to make simulations of worlds which aren't paradise, because our reality is not paradise, and they made it. If we are Alpha-Reality, I don't see why we should be obligated to make simulations which are nicer than our reality.
Note that this position is nearly identical to that of the Christian who feels that it's okay for God to kill us because he made us. You're arguing with a religious line of "reasoning," which means I doubt you're going to get very far.
Correct. One of the nice things about Buddhism is that AFAIK it does not normally suffer from the Stockholm syndrome inherent in Abrahamic religions, but this guy has managed to put it right back in. I didn't bother bringing this point up though, since the mod for that forum is a Mormon preacher with a demonstrated intolerance for anti-religious speech, and it probably would've gotten me banned.
Paradox244
Youngling
Posts: 84
Joined: 2007-09-02 05:45pm

Post by Paradox244 »

It sounds like he's saying there that because he wouldn't mind it being done to him, it's ok for him to do it to others. Kind of like justifying child abuse by saying that you were abused as a child yourself.
Admiral Valdemar wrote:Ooh, I'm quaking in my boots. What's he going to do? Rhetoric us to death?
User avatar
Starglider
Miles Dyson
Posts: 8709
Joined: 2007-04-05 09:44pm
Location: Isle of Dogs
Contact:

Re: AI Ethics

Post by Starglider »

NecronLord wrote:Oi. S&M doesn't have anything to do with the perverse position this guy's espousing. At all.
*laughs*

I apologise. :) Still, what's your term for people who say 'I'm ok with torturing and killing people because I'm ok with people torturing and killing me! This is the only view that makes sense!' Other than 'fuckwit'?
Paradox244
Youngling
Posts: 84
Joined: 2007-09-02 05:45pm

Post by Paradox244 »

S&M doesn't make statements about the moral validity of torture, does it? It's similar, but I wouldn't say it's the right way of describing his position.
Admiral Valdemar wrote:Ooh, I'm quaking in my boots. What's he going to do? Rhetoric us to death?
User avatar
Resinence
Jedi Knight
Posts: 847
Joined: 2006-05-06 08:00am
Location: Australia

Post by Resinence »

While I agree that he's wrong that it isn't (in principle) possible to copy humans into computers, I'm not sure what that has to do with continuity of consciousness. If I non-destructively copy you, there are now two diverging consciousnesses, no?
Yes and no. It has to do with the disturbing fact that the "you" that existed 10 years ago is "dead" in that you're probably not the same consciousness. The brain copies over/forgets old stuff that you never use, so you go through several versions of yourself over a lifetime. Like how when you meet someone again after a number of years they seem somewhat different - they are. I believe what Starglider is talking about when he says "gradual" is allowing the consciousness to move itself into the construct or whatever without forcibly copying it. You could accomplish this by making the brain somehow write all new information into the artificial area, but obviously allowing access to the old wetware. Eventually you will have ALL of the consciousness on the hardware, and it won't be a "death" thing any more than the you of now is constantly "dying" through changes.
“Most people are other people. Their thoughts are someone else's opinions, their lives a mimicry, their passions a quotation.” - Oscar Wilde.
User avatar
Starglider
Miles Dyson
Posts: 8709
Joined: 2007-04-05 09:44pm
Location: Isle of Dogs
Contact:

Post by Starglider »

Resinence wrote:It has to do with the disturbing fact that the "you" that existed 10 years ago is "dead" in that you're probably not the same consciousness.
I don't see how this is disturbing. Consciousness literally cannot exist without time and change; it's a time-extended phenomenon by its very nature, because pulses take time to travel around the brain and you can't think a single thought in less than a few hundred milliseconds. Structural change in the brain takes longer, but I'd find someone stuck in exactly the same mindset indefinitely much more disturbing than gradual growth and change.
I believe what Starglider is talking about when he says "gradual" is allowing the consciousness to move itself into the construct or whatever without forcibly copying it. You could accomplish this by making the brain somehow write all new information into the artificial area, but obviously allowing access to the old wetware.
Pretty much; I was in fact referring to a standard rebuttal to the anti-upload people, the 'Moravec transfer'. This involves replacing neurons with functionally equivalent electronic replacements one by one. Eventually the whole brain is running on digital neurons and you can losslessly copy the lot onto any substrate you like (or just leave it where it is and enjoy the benefits of backups, easy interfacing, full reflection, easy expansion of cognitive capacity etc). You can do this as quickly or slowly as you like. Clearly this is more difficult than a 'flash upload' (e.g. freeze someone's brain, destructively scan it with a laser to the molecular level, create a computer simulation from the data). Given sufficiently advanced technology there's (in principle) no measurable or perceptible difference in the results, and I'd consider them equivalent with the exception of some highly speculative many-worlds stuff.
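
A toy sketch of the idea (illustrative only; the BioNeuron/DigitalNeuron classes are hypothetical placeholders, not a model of real neurons): swap elements one at a time and the network's observable behaviour is unchanged at every intermediate step.

Code:
import random

class BioNeuron:
    def __init__(self, threshold):
        self.threshold = threshold
    def fire(self, signal):
        return signal >= self.threshold

class DigitalNeuron(BioNeuron):
    """Functionally equivalent replacement: identical input/output behaviour."""
    pass

brain = [BioNeuron(random.random()) for _ in range(1000)]
probe = 0.5
behaviour_before = [n.fire(probe) for n in brain]

# Moravec-style transfer: replace one neuron at a time with an equivalent digital one.
for i, neuron in enumerate(brain):
    brain[i] = DigitalNeuron(neuron.threshold)
    # At every step the part-biological, part-digital brain behaves identically.
    assert [n.fire(probe) for n in brain] == behaviour_before

# The whole brain now runs on digital neurons, with behaviour preserved throughout,
# and its state can be copied losslessly onto any substrate.
assert all(isinstance(n, DigitalNeuron) for n in brain)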

But anyway, the point is that 'oh noes you have to kill someone to upload them' is not a valid objection to uploading in principle, unless you think that replacing neurons with functionally equivalent engineered microstructures is a gradual form of death. As you say, this is pretty much impossible to justify without classifying normal human existence as 'constant ongoing personality death'.
User avatar
Turin
Jedi Master
Posts: 1066
Joined: 2005-07-22 01:02pm
Location: Philadelphia, PA

Post by Turin »

Starglider wrote:'Continuity of consciousness' is one of those irritating philosophical terms that doesn't have a hard definition and is normally invoked by either the delusional (e.g. outright religious dualists or crypto-dualists) or the outright ignorant.
Fair enough, but for purposes of discussion I'm fairly certain there's a distinction that can be made between my subjective consciousness and the objective continuation of "an intelligence." (I think you're agreeing with this below, in fact.)
Starglider wrote:But anyway, he's claiming that the upload cannot be regarded as the same person as the original in any moral or legal sense.
Which is obvious nonsense unless one persists in retaining some dualist foolishness about a soul. Although, I wonder what he would claim as a mechanism behind the soul not being duplicated / copied / uploaded. Does he think you could duplicate a human being perfectly (again, in principle) and it would result in... what, an "empty" dead body? Or some kind of soulless zombie?
Starglider wrote:In actual fact when you clone someone perfectly the two intelligences both have perfectly valid (and equal) claims to be a continuation of the original; their cognitive content is (initially) identical and causally dependent on all the same past events (assuming the cloning is perfect).
Ok, I agree with all of this, right up to...
Starglider wrote:Subjectively, you experience a 50/50 chance of 'becoming' the original or the clone.
What's the mechanism behind the shift in perspective? If you're non-destructively copied, why would what you perceive as your consciousness shift to the copy? Your copy would certainly perceive having been shifted (so long as there's some discontinuity in its physical state/environment as well, which there always would be unless you copy it into a simulation). If consciousness is simply a chain of causal states, why does the state of the original "you" change?
Starglider wrote:I didn't bother bringing this point up though, since the mod for that forum is a Mormon preacher with a demonstrated intolerance for anti-religious speech, and it probably would've gotten me banned.
Sounds like a fun place. :roll:
User avatar
Turin
Jedi Master
Posts: 1066
Joined: 2005-07-22 01:02pm
Location: Philadelphia, PA

Post by Turin »

You guys both posted while I was posting. If you're performing a gradual upload, then my point is moot.
User avatar
Turin
Jedi Master
Posts: 1066
Joined: 2005-07-22 01:02pm
Location: Philadelphia, PA

Post by Turin »

Turin wrote:You guys both posted while I was posting. If you're performing a gradual upload, then my point is moot.
(Apologies for the triple post.) Note that I recognize that "gradual" is an arbitrary measure. I use the word only to distinguish it from "instantaneous" copying, which isn't really possible in reality (just sci-fi), unless the copying is destructive.
User avatar
Resinence
Jedi Knight
Posts: 847
Joined: 2006-05-06 08:00am
Location: Australia

Post by Resinence »

Turin wrote:
Turin wrote:You guys both posted while I was posting. If you're performing a gradual upload, then my point is moot.
(Apologies for the triple post.) Note that I recognize that "gradual" is an arbitrary measure. I use the word only to distinguish it from "instantaneous" copying, which isn't really possible in reality (just sci-fi), unless the copying is destructive.
True instantaneous copying also wouldn't really be like the "clone/death" scenario, your viewpoint would simply shift. You could think of it like a computer with 2 processors where only one is used, but then it switches to the second one instantly on the next cycle, copying all the cache to the new processor instantly as well - will the running software be "deleted"? It's pretty much philosophical bullshit once you get down to it, but really the only "destructive" upload is one where you are killed and a perfect copy is not made. However, if you think of "your consciousness" as part of your body as well as your brain, and not just software, that's where this "oh noes i will die keep it away" stuff comes from. Ugh, this stuff gives me a headache :(
“Most people are other people. Their thoughts are someone else's opinions, their lives a mimicry, their passions a quotation.” - Oscar Wilde.
User avatar
Starglider
Miles Dyson
Posts: 8709
Joined: 2007-04-05 09:44pm
Location: Isle of Dogs
Contact:

Post by Starglider »

Turin wrote:Fair enough, but for purposes of discussion I'm fairly certain there's a distinction that can be made between my subjective consciousness and the objective continuation of "an intelligence." (I think you're agreeing with this below, in fact.)
Only in the sense that you-like intelligences are an extreme special case of the general category of 'sapient' or more specifically 'human-like' causal processes. There's only one perceivable Turin intelligence around at any one time right now, but over time clearly there are lots of slightly different 'yous'. All mind-copy technology is really doing is extending that to having multiple you-like intelligences around at once. Of course 'Turin-like' is a continuous (and highly multidimensional) metric measured relative to a specific static snapshot of 'Turin'. Gradual split/merge/clone processes are possible as well as instant ones, in the same sense that gradual uploading is (theoretically) possible as well as instant uploading. The ethics of the former are a little more complex but not too bad.
Turin wrote:Although, I wonder what he would claim as a mechanism behind the soul not being duplicated / copied / uploaded.
His claim centers on analogue being king and anything digital being a meaningless fake (even if the difference is impossible to measure). He resembles one of those idiot hi-fi elitists who decided to transfer their broken arguments from audio reproduction to philosophy ('you will never be able to beat the warmth of vinyl with your cold mechanical DVD-Audios!' -> 'you will never be able to replicate the spirit of real neurons with your cold digital computers!').
Does he think you could duplicate a human being perfectly (again, in principle) and it would result in... what, an "empty" dead body? Or some kind of soulless zombie?
Apparently not; that's the really scary part. He's admitted that the uploads are just as sapient as the humans, they have real emotions, qualia etc. But because they're digital and easily manipulated (given the power to do so), they have no moral value. Later he implies that once you've cut someone's senses off from external reality, you can morally do whatever you like with them, because 'you are their world'. It's simultaneously relativist bullshit and 'might makes right' fascism, a pretty impressive accomplishment in moral idiocy.
Starglider wrote:Subjectively, you experience a 50/50 chance of 'becoming' the original or the clone.
What's the mechanism behind the shift in perspective? If you're non-destructively copied, why would what you perceive as your consciousness shift to the copy?
I'm abstracting a bit here; imagine dematerialising someone on a transporter pad, then rematerialising two identical people. Or freezing someone cryogenically, such that all neural and chemical activity stops, making a molecular-level copy via nondestructive means, then thawing out and reviving both humans. Neither would experience any more 'shift' than the other. Subjectively, both perceive themselves as being the same person as the original, down to continuing the very same train of thought they were having when they were copied.

If they go on to have different experiences, then the correct prediction for the original (if they know this is going to happen) is that 'they' will have a 50% chance of experiencing what the original or first transporter clone experiences and a 50% chance of experiencing what the copy or second transporter clone experiences. However this is strictly a subjective simplification of the objective reality (something humans do a lot - usually it's useful and harmless, but it's misleading, even worse-than-useless, in cases like this that violate the intuitive assumption of singular personal identity). The universe itself doesn't have probabilities except at the level of measurable patterns in which quantum events occur; it just has frequencies (a fact that makes a lot of poor little orthodox statisticians' heads explode when they try to ground probability logic - get with the program guys, learn some information theory). The frequency distribution over you-like intelligences experiencing particular things translates into a subjective probability of 'you' experiencing these things.
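
A throwaway sketch of that frequency framing (purely illustrative, assuming nothing beyond the scenario above): each copying event yields two successors with identical memories, and the 'chance of waking up as the copy' is just the relative frequency of copy-successors.

Code:
import random

runs = 100_000
successors = []
for _ in range(runs):
    # One copying event produces two equally valid continuations of the original.
    successors.append("original")
    successors.append("copy")

# Model 'your' next observer-moment as a draw from the successor population.
sample = random.choices(successors, k=10_000)
print(sample.count("copy") / len(sample))  # ~0.5, the subjective 50/50 prediction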

This is actually one of the reasons many-worlds is so attractive: it makes the grounding of all this a lot neater, though of course that's no more reason to assume it's actually correct than it is to assume superstring theory is correct because the maths is cute.
Turin wrote:Sounds like a fun place. :roll:
It's the forum for the Schlock Mercenary comic. The author is somewhat odd. He's pretty good for science and technology accuracy, and the comic itself is quite 'liberal' as religion goes - it at least accepts that aliens and AIs can be sapient and that uploading is valid. There's a priest in the main cast, but he doesn't take the job seriously, and the author at least answers the question 'why is there still religion in the far future' sensibly (i.e. the psychological need for it, not 'because god is real, silly!').

Despite all that, he's still a Mormon preacher, so he really fell for and promotes one of the silliest and most obviously fraudulent (not to mention antifeminist) brands of Christianity. A series of ads from his ad provider were nearly banned for daring to offer a critique of organised religion (with the implication that if the product on offer had critiqued faith itself, they'd be censored). It's a damned shame really.
User avatar
Turin
Jedi Master
Posts: 1066
Joined: 2005-07-22 01:02pm
Location: Philadelphia, PA

Post by Turin »

Resinence wrote:True instantaneous copying also wouldn't really be like the "clone/death" scenario, your viewpoint would simply shift
Um... how? That was the question I was asking above. The copy would certainly have a continuous subjective experience, and would be indistinguishable from the original, but the original's subjective experience would cease.
User avatar
NecronLord
Harbinger of Doom
Posts: 27384
Joined: 2002-07-07 06:30am
Location: The Lost City

Re: AI Ethics

Post by NecronLord »

Starglider wrote:*laughs*

I apologise. :) Still, what's your term for people who say 'I'm ok with torturing and killing people because I'm ok with people torturing and killing me! This is the only view that makes sense!' Other than 'fuckwit'?
It doesn't sound like a sexual thing to him at all. More like 'intellectual dishonesty', I expect. I would imagine he's not really honest about his claims that he has no problem with being killed - either that, or he's got a religious attitude: 'god is good, if god sends me to hell, I deserve it for not being pleasing to him'. I remember once having great fun asking a Calvinist whether it would still be right if God decided to change the criteria for 'elect' at whim and send him to hell. :lol:
Last edited by NecronLord on 2007-09-13 12:08pm, edited 1 time in total.
Superior Moderator - BotB - HAB [Drill Instructor]-Writer- Stardestroyer.net's resident Star-God.
"We believe in the systematic understanding of the physical world through observation and experimentation, argument and debate and most of all freedom of will." ~ Stargate: The Ark of Truth
User avatar
Turin
Jedi Master
Posts: 1066
Joined: 2005-07-22 01:02pm
Location: Philadelphia, PA

Post by Turin »

Hopefully you're not writing at the same time I am again... :lol:
Starglider wrote:
Turin wrote:Fair enough, but for purposes of discussion I'm fairly certain there's a distinction that can be made between my subjective consciousness and the objective continuation of "an intelligence." (I think you're agreeing with this below, in fact.)
Only in the sense that you-like intelligences are an extreme special case of the general category of 'sapient' or more specifically 'human-like' causal processes. There's only one perceivable Turin intelligence around at any one time right now, but over time clearly there are lots of slightly different 'yous'.
Sure. My "current state" changes over time and there's no special thing that makes this continuous except that it's a causal chain of states. The subjective perspective is an emergent property of that state.
Starglider wrote:Apparently not; that's the really scary part. He's admitted that the uploads are just as sapient as the humans, they have real emotions, qualia etc. But because they're digital and easily manipulated (given the power to do so), they have no moral value. Later he implies that once you've cut someone's senses off from external reality, you can morally do whatever you like with them, because 'you are their world'. It's simultaneously relativist bullshit and 'might makes right' fascism, a pretty impressive accomplishment in moral idiocy.
Yeah, that's pretty fucked up. But probably bullshit, too. I'm pretty sure he would object to being instantaneously and painlessly destroyed if it came down to it.
Starglider wrote:
Turin wrote:
Starglider wrote:Subjectively, you experience a 50/50 chance of 'becoming' the original or the clone.
What's the mechanism behind the shift in perspective? If you're non-destructively copied, why would what you perceive as your consciousness shift to the copy?
I'm abstracting a bit here; imagine dematerialising someone on a transporter pad, then rematerialising two identical people. Or freezing someone cryogenically, such that all neural and chemical activity stops, making a molecular-level copy via nondestructive means, then thawing out and reviving both humans. Neither would experience any more 'shift' than the other. Subjectively, both perceive themselves as being the same person as the original, down to continuing the very same train of thought they were having when they were copied.
I guess the specific example matters. If my subjective experience tells me that I am strapped to a chair that copies my brain, and when my consciousness resumes, I'm strapped to a different chair (because I've been copied into a different body), I've experienced a shift that the original has not, which immediately differentiates the two intelligences.
Starglider wrote:If they go on to have different experiences, then the correct prediction for the original (if they know this is going to happen) is that 'they' will have a 50% chance of experiencing what the original or first transporter clone experiences and a 50% chance of experiencing what the copy or second transporter clone experiences. However this is strictly a subjective simplification of the objective reality
The subjective experience is what I'm concerned about from an ethical stance, however. I don't disagree with you in any way that the copy is "the same" as the original at the point of copying, and is a "real" intelligence. What doesn't follow from the above is what happens in the case of destructive copying. The copy believes it is the original, but the original's subjective experience has ceased.
Starglider wrote:It's the forum for the Schlock Mercenary comic. The author is somewhat odd.
Sounds like from your description that there's some cognitive dissonance going on. Or he just doesn't care about whether or not his fiction reflects his real beliefs.
User avatar
Starglider
Miles Dyson
Posts: 8709
Joined: 2007-04-05 09:44pm
Location: Isle of Dogs
Contact:

Post by Starglider »

Turin wrote:Sure. My "current state" changes over time and there's no special thing that makes this continuous except that it's a causal chain of states. The subjective perspective is an emergent property of that state.
'Emergent'? That word gets abused a lot by wannabe AI designers to cover for the fact that they don't understand what's going on. The human subjective perspective isn't really 'emergent', it's the result of specific cognitive structures that have evolved because a self-centric model provided a survival advantage (it's simple, but effective, at least for normal human social interaction). The only thing I'd call 'emergence' going on is the self-organising process by which human neural networks are set up (during brain development in childhood; babies have only very limited self-awareness, it develops as they grow up). Adult consciousness is the result of billions of neurons operating together, but it isn't any more emergent than (say) the Windows Vista desktop interface is an 'emergent property' of powering on your computer.
I'm pretty sure he would object to being instantaneously and painlessly destroyed if it came down to it.
Personally I'd like to lock him in a sensory deprivation chamber, tell him I won't let him out until he a) learns to play chess to grandmaster level and b) agrees to be my chessplaying slave, and then say "I control your world, so anything I do to you is moral by definition" every time he objects. A week of that should snap him out of it.
Turin wrote:I guess the specific example matters. If my subjective experience tells me that I am strapped to a chair that copies my brain, and when my consciousness resumes, I'm strapped to a different chair (because I've been copied into a different body), I've experienced a shift that the original has not, which immediately differentiates the two intelligences.
Yes, clearly. But as you say, if the experiences were identical (and they can be exactly identical if the intelligences are running on a deterministic digital substrate) the intelligences will be identical and experience no differentiating 'shift'.
Turin wrote:
Starglider wrote:However this is strictly a subjective simplification of the objective reality
The subjective experience is what I'm concerned about from an ethical stance, however. I don't disagree with you in any way that the copy is "the same" as the original at the point of copying, and is a "real" intelligence. What doesn't follow from the above is what happens in the case of destructive copying. The copy believes it is the original, but the original's subjective experience has ceased.
The copy/original distinction is purely a matter of space-time co-ordinates. The content is the same, the behaviour is the same, and critically the causal relationships to past experiences are the same. You've just inserted some causal dependency on the copying process, which will be 'transparent' if the copying is lossless. I don't see why a discontinuity in space/time co-ordinates, or the particular mass-energy you're made out of, should have any ethical significance. The question is exactly equivalent to 'what if the memory manager moves a sentient AI program from one part of system memory to another part, suspending the application briefly while it does it'. No sensible person could have an ethical issue with this, and the fact the human case involves shuffling molecules about instead of bits is really just ethically irrelevant implementation detail.
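
To make the memory-manager analogy concrete, here's a minimal sketch (the dictionary is a toy stand-in for an AI's state, not any real runtime): the program's content is copied to a new region, the old region is released, and nothing about its content or causal history changes.

Code:
import copy

# Toy stand-in for a suspended intelligence: just its complete state.
old_region = {"memories": ["woke up", "solved a puzzle"], "current_thought": "step 7"}

# 'Memory manager' relocates it: copy to a new region, release the old one.
new_region = copy.deepcopy(old_region)  # identical content at a new address
old_region = None                       # original storage released

# Content and behaviour are unchanged; only the location in memory differs.
assert new_region["current_thought"] == "step 7"
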
Turin wrote:Sounds like from your description that there's some cognitive dissonance going on.
Pretty much true for any intelligent, scientifically literate religious person. At minimum you've got to somehow accept religion without ever seriously asking where the doctrine and/or your mysterious feeling that it must be correct came from.
User avatar
Turin
Jedi Master
Posts: 1066
Joined: 2005-07-22 01:02pm
Location: Philadelphia, PA

Post by Turin »

Starglider wrote:
Turin wrote:Sure. My "current state" changes over time and there's no special thing that makes this continuous except that it's a causal chain of states. The subjective perspective is an emergent property of that state.
'Emergent'? That word gets abused a lot by wannabe AI designers to cover for the fact that they don't understand what's going on. The human subjective perspective isn't really 'emergent', it's the result of specific cognitive structures that have evolved because a self-centric model provided a survival advantage <snip> Adult consciousness is the result of billions of neurons operating together, but it isn't any more emergent than (say) the Windows Vista desktop interface is an 'emergent property' of powering on your computer.
Maybe/probably I'm just misusing a technical term. (I'm not even a wannabe AI designer :) ). I'm using a more-or-less dictionary definition, something like "arising causally and perhaps unexpectedly". Which is to say, consciousness is generated by the structure of the brain, not by some ethereal quality like a "soul." It's "unexpected" in the same sense that if one were given a pile of organic molecules, one would not easily come to the conclusion that one could generate what we call life out of it (without knowing that it was possible beforehand).
Starglider wrote:
Turin wrote:The subjective experience is what I'm concerned about from an ethical stance, however. I don't disagree with you in any way that the copy is "the same" as the original at the point of copying, and is a "real" intelligence. What doesn't follow from the above is what happens in the case of destructive copying. The copy believes it is the original, but the original's subjective experience has ceased.
The copy/original distinction is purely a matter of space-time co-ordinates. The content is the same, the behaviour is the same, and critically the causal relationships to past experiences are the same. You've just inserted some causal dependency of the copying process, which will be 'transparent' if the copying is lossless. I don't see why a discontinuity in space/time co-ordinates, or the particular mass-energy you're made out of, should have any ethical significance.
Okay, maybe I'm just not sure what the justification is that copying would be lossless. In principle, this is somewhat easy for an intelligence that's already in a digital state. For meat brains, though, I'm not sure how this would be possible even in principle -- you can't actually freeze something in magical time-halted stasis. You'd have to be able to freeze the brain and "scan" it within a span of time shorter than the brain's "clock cycle."
Starglider wrote:The question is exactly equivalent to 'what if the memory manager moves a sentient AI program from one part of system memory to another part, suspending the application briefly while it does it'. No sensible person could have an ethical issue with this and the fact the human case involves shuffling molecules about instead of bits is really just ethically irrelevant implementation detail.
This is where the "emergent" part from earlier comes in. If the subjective experience of consciousness is generated by the structure of the substrate (digital, brain, whatever), then a discontinuity in the existence of the substrate is a discontinuity in the subjective experience as well... shouldn't it?

I'm working on writing up a thought experiment here that might help me see your point, but it's taking me longer than I thought. Would you indulge me in walking through it with me?
User avatar
Turin
Jedi Master
Posts: 1066
Joined: 2005-07-22 01:02pm
Location: Philadelphia, PA

Post by Turin »

Okay, here's the thought experiment. Please excuse the length of this post. Hopefully I'm contributing something useful or interesting out of all this.

Experiment #1

Okay, assume in our Evil Genius Laboratory we have a functioning human brain of ~100 billion neurons (connection to sexy human body of hero's romantic interest optional). Assume we also have a technology that will allow us to replace neurons with their electronic equivalent, in situ; presumably we're talking about some virus-like nanotech here. The replacement is destructive, but with only one neuron being replaced at a time, it shouldn't have an appreciable effect on the total state of the brain (and/or the subjective state of consciousness created by that brain). For the sake of argument, let's assume that our technology doesn't cause damage to neurons not being immediately replaced -- we're going to ignore heat/mechanical damage, etc.

If we replace neurons at a gradual pace, I think we both agree that the subjective state of consciousness experiences no discontinuity whatsoever. In fact, in many ways our brains do this on an ongoing basis throughout our lives as they constantly rewire and the molecules that make up the neurons are replaced.

Of course with such a large number of neurons involved, we'll want to replace them at a very fast rate in order to finish in any kind of time we'd want to wait around for (although in theory there's nothing stopping us from lengthening the process out over years). If we replace 1 million neurons per second, it would take us a little over a day.
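
A quick sanity check of that figure, using the numbers above:

Code:
neurons = 100e9            # ~100 billion neurons
rate = 1e6                 # 1 million replacements per second
seconds = neurons / rate   # 100,000 seconds
print(seconds / 3600)      # ~27.8 hours, i.e. a little over a day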

Apart from the obvious technological/mechanical/heat problems (which we're hand-waving at the moment), in principle we can arbitrarily increase the rate at which we replace neurons. But at some point, the replacement of all the neurons happens at a rate fast enough that the time involved falls within the "time resolution" of the subjective conscious state. (I'm probably inventing a term here, but my understanding is that the subjective consciousness can only be "updated" with information from the environment/brain at a given albeit variable rate.) At this moment, the change is effectively instantaneous from a subjective standpoint. Right so far?

Alternately, we can slow down the brain (by freezing it, let's say) and thereby increase the "time resolution" of the subjective conscious state enough to make the one-by-one replacement of neurons at a slower rate, but still have the change be subjectively instantaneous. As far as I can tell, it doesn't matter which way we go, and we'll probably do both.

At the end of this process, we have a completely functioning electronic "human" brain, which we can in theory use to create non-destructive copies at will (if our electronic components are properly designed for it.)

Experiment #2

Now let's repeat the experiment, but with a twist. This time instead of replacing the neuron in situ, we remove it and make an electronic copy in another location (maybe even in a computer simulation).

This time, if we gradually remove the neurons one by one, we eventually reach a point where we no longer have a functioning human brain (killing our hero's girlfriend). Yet this point is likely to be reached well before we've completed our new electronic brain, so we don't have a functioning electronic brain yet either. Because the original is now dead (and therefore in a much different state which succumbs rapidly to entropy), we can't complete the copying process to our electronic brain.

If we increase the speed of the copying, as in our previous experiment, we eventually reach a point where our speed is such that we have copied the entire brain and created a copy within the span of time given by our "time resolution" (again, whether by increasing speed or slowing down the brain, or both). The brain is destroyed and recreated before the subjective consciousness has a chance to experience it.

Your argument seems to be that in this case, there will be continuity of the subjective conscious state, even though a slower version of the same process resulted in the irreversible destruction of the subject.

This is where I'm having trouble. If the physical structure is what generates the subjective state, then why does a completely different structure generate a continuation of the same subjective state, rather than a subjective state that merely appears to be a continuation of the same subjective state (with the original destroyed)?

Just so that I'm clear, I'm not differentiating between electronic and meat intelligences here. It would be the same if you moved an electronic intelligence from one memory area to another. And so interrupting a general AI, removing it from a computer, and reinstalling it somewhere else, would be just as destructive to its continuity of consciousness (or not destructive, if I'm wrong!)
User avatar
Ender
Emperor's Hand
Posts: 11323
Joined: 2002-07-30 11:12pm
Location: Illinois

Post by Ender »

Starglider wrote:I smacked down Ender's pathetic objections to this in a debate in the HAB a while back (following on from the best design for robotic soldiers).
No, you claimed there was an inherent difference between biological and mechanical processes and then I deployed. Let's at least try to maintain some semblance of honesty.
بيرني كان سيفوز
*
Nuclear Navy Warwolf
*
in omnibus requiem quaesivi, et nusquam inveni nisi in angulo cum libro
*
ipsa scientia potestas est
User avatar
Starglider
Miles Dyson
Posts: 8709
Joined: 2007-04-05 09:44pm
Location: Isle of Dogs
Contact:

Post by Starglider »

Ender wrote:No, you claimed there was an inherent difference between biological and mechanical processes and then I deployed. Let's at least try to maintain some semblance of honesty.
You lying sack of shit. Not only did I do no such thing, that is completely inconsistent with my stance above (where I am arguing that there is no moral difference between an accurate simulation of a human and a biological human) and every single AI ethics argument I have ever posted to the Internet in the last ten years (and there have been many).

Your claim was that it is wrong to enslave anything that looks intelligent, regardless of how it actually works and whether it actually has any morally-relevant cognitive structure.
User avatar
Ender
Emperor's Hand
Posts: 11323
Joined: 2002-07-30 11:12pm
Location: Illinois

Post by Ender »

Starglider wrote:
Ender wrote:No, you claimed there was an inherent difference between biological and mechanical processes and then I deployed. Let's at least try to maintain some semblance of honesty.
You lying sack of shit. Not only did I do no such thing,
link

You dismissed comparisons to biological intelligences as irrelevant on the basis that "humans are not made out of microchips", and tried to dismiss my point that that did not suffice with the statement "The nearest common 'fundamental level' is atoms - and even there AIs use different elements", which fails because it doesn't change the fact that how they think is irrelevant when the discussion was about their abilities, not the mechanism.

Accusing someone of lying when one needs only go back 3 pages to see the text is not a good idea.
that is completely inconsistent with my stance above (where I am arguing that there is no moral difference between an accurate simulation of a human and a biological human) and every single AI ethics argument I have ever posted to the Internet in the last ten years (and there have been many).
Yes, I did note that. I found it curious.
Your claim was that it is wrong to enslave anything that looks intelligent, regardless of how it actually works and whether it actually has any morally-relevant cognitive structure.
Absolute lie. I very clearly stated that I was referring to strong AIs which possessed human-type cognitive abilities - the ability to "model reality and use this model to concieve [sic] and plan actions and predict their outcome at a rate and to a level of detail and accuracy matching or surpassing human abilities". You even agreed that the cognitive potential was similar, though you dismissed it by saying the lack of a desire to do things meant it lacked human potential, even though my argument was that restricting its desires was denying it its full potential and thus wrong.


I leave again this weekend - I'm not interested in a debate because I won't be around for it. But don't try and make the situation out to be something it was not. It is beneath you.
بيرني كان سيفوز
*
Nuclear Navy Warwolf
*
in omnibus requiem quaesivi, et nusquam inveni nisi in angulo cum libro
*
ipsa scientia potestas est