Posted: 2008-06-15 05:12am
by fb111a
Zim wrote:I wonder what will happen to Beelzebub. He can't return to Satan; Abigor was sentenced to a fate tantamount to death, along with his entire family, for losing sixty legions. Beelzebub is about to lose over eight times that number, and in his own backyard to boot. Even if he retreats with the remnants of his army intact, he's going to have lots of explaining to do.
Abigor's probably going to have a new roommate soon.
Either that, or he commits seppuku or whatever the equivalent in Hell is, OR he decides to commit suicide by T-72/M1A1/Challenger/BMP/Bradley/Warrior/[insert AFV of choice here].
Posted: 2008-06-15 05:16am
by Peptuck
Adrian Laguna wrote:Peptuck wrote:I think we've already hit this point; Beelzebub just thought in the last chapter that the sarin bombardment the humans unleashed easily outdid whatever plagues Uriel could possibly manage.
That assumes Beelzebub has accurate intel as to Uriel's abilities. It's entirely possible that Yahweh has always kept him on a short leash, as a secret reserve, just in case.
Good point. I think one of the things Stuart said at the start of this story that really struck me was how his work tends to focus not only on the perceptions of the participants, but also on their misperceptions.
Posted: 2008-06-15 08:52am
by gtg947h
Shroom Man 777 wrote:Also: Why are General Petraeus and the Russian General calling each other funny names?

Umm... according to Google, "tovarish" is Russian for "comrade", and "bratishka" means "brother".
Posted: 2008-06-15 09:26am
by Bayonet
Darth Wong wrote:Why must you resort to fiction? You live in Toronto; remember Anthony Brooks in 2004, who took a hostage in downtown Toronto and caught a bullet to the head from the police for his troubles?
Yeah, but fiction usually comes with better one-liners.

Posted: 2008-06-15 09:28am
by Bayonet
Shroom Man 777 wrote:Also: Why are General Petraeus and the Russian General calling each other funny names?

It's polite to use each other's terms of respect. It would be like, if you were Japanese, calling you Shroom San, even though it means nothing in English.
Posted: 2008-06-15 03:06pm
by Starglider
Darth Wong wrote:That's not precisely what I was trying to say; I was trying to say that merciless conduct actually is ethical in this particular context,
Granted, up to a point. Clearly we want to maximise the demon:human casualty ratio, and pretty much any means are justified for that.
because the survival of the human race is the prime value in any ethical system.
That isn't correct. You can generalise to 'sentience' or some subset (e.g. 'humanlike sentience', 'ethical sentience', whatever). This particular debate has come up before on this board and a substantial minority of the participants said that if it was a choice between humanity being wiped out and 10 times as many sapient aliens being wiped out, they'd want humanity to be wiped out. I certainly agree that if either nothing is known about the aliens or they're at least as ethical as we are (on average, adjusting for technical development level etc.), then you should disregard species and save as many sapient beings as possible. 'My species is best' as a general principle is really no more rational than 'my tribe is best'.
In this case I wish I could cheer for the baldricks, because they're biologically so much more interesting than the humans, but of course culture and ethics have to come first, so I'm not.
It's one of the reasons that utilitarianism is a superior ethics system; it can adapt to new situations smoothly, whereas other systems often cannot.
Yes, although simple utilitarianism can have problems generalising to unbounded conditions. If human life is good and the universe could potentially hold a nearly infinite amount of human life, then any and all actions that preserve and increase the amount of humanity become desirable, up to and including pre-emptively wiping out any potential threats or competitors (if the chance of failure/retaliation is low enough to make it a net plus). This kind of thing isn't an issue for normal human purposes, but it's a serious issue in formal philosophy (e.g. general AI goal system theory).
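As a rough sketch of that 'net plus' arithmetic (every number below is invented purely for illustration, not taken from the story or the post), a naive expected-utility comparison looks like this:
[code]
# Invented utilities illustrating the "net plus" reasoning described above.
U_UNCHECKED_EXPANSION = 1e12   # near-unbounded upside of eliminating a future competitor
U_STATUS_QUO = 1e6             # upside of leaving the potential competitor alone
U_RETALIATION = -1e9           # downside if the pre-emptive strike fails
P_FAILURE = 0.10               # assumed 10% chance the strike fails

ev_strike = (1 - P_FAILURE) * U_UNCHECKED_EXPANSION + P_FAILURE * U_RETALIATION
ev_restraint = U_STATUS_QUO

# 0.9 * 1e12 - 0.1 * 1e9 = 8.999e11, which dwarfs 1e6:
# the unbounded upside swamps even a serious risk of retaliation.
print(ev_strike > ev_restraint)  # True
[/code]
Which is exactly the pathology: once an effectively unbounded good sits in the utility function, almost any risk looks worth taking.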
Stuart wrote:I'd have thought that was self-obvious. One can't be ethical and extinct.
You can't be ethical and dead either, but people still sacrifice their lives for kin and country. Obviously in this case the demonic ethics are repugnant (hopefully a cultural issue rather than a biological one), but in the general case the single most important thing is that someone is around who is ready and able to perpetuate your ethical system, not that you, your family, your tribe or your species is still around (though of course, all of those are good too).
Posted: 2008-06-15 03:15pm
by JN1
This particular debate has come up before on this board and a substantial minority of the participants said that if it was a choice between humanity being wiped out and 10 times as many sapient aliens being wiped out, they'd want humanity to be wiped out.
An interesting point, though if I were about to face being wiped out, I suspect I personally would see it the other way.
Posted: 2008-06-15 03:32pm
by Shroom Man 777
Bayonet wrote:Shroom Man 777 wrote:Also: Why are General Petraeus and the Russian General calling each other funny names?

It's polite to use each other's terms of respect. It would be like, if you were Japanese, calling you Shroom San, even though it means nothing in English.
That's Shroom-kun, Bayonet-chan
Anyway, that's awesome. So General Petraeus and the Ruskies are gonna be great drinking buddies now that they've nuked Satan

Posted: 2008-06-15 03:34pm
by Starglider
JN1 wrote:This particular debate has come up before on this board and a substantial minority of the participants said that if it was a choice between humanity being wiped out and 10 times as many sapient aliens being wiped out, they'd want humanity to be wiped out.
An interesting point, though if I were about to face being wiped out, I suspect I personally would see it the other way.
It's OK, I don't blame you; you evolved that way.
However, that particular excuse only applies as long as you don't have the ability to rewrite your own brain structure. If you did (and eventually humans will), then there would be no good reason for your 'in theory...' answers to differ from your 'in practice' answers.
Out of curiosity if you had a choice between say all the English-speaking countries in the world being wiped out, and all the other humans on earth being wiped out, would you still save yourself (and your family)? If not, why not?
In practice the 'selfishness threshold' varies all the way from 'I only care about myself' through 'family', 'tribe', 'nation (slightly bigger tribe)', 'near-human', 'ethical biological sentience' etc all the way out to 'sentience of any kind'. The bizarre thing for me really is that there's no real objective justification for any of them (as much as morality can have an objective justification) except 'myself', 'people I know personally and like' and 'all ethical sentient beings'. Yet somehow people still manage to pick (and self-justify) other points on the spectrum.
Posted: 2008-06-15 03:54pm
by JN1
Out of curiosity if you had a choice between say all the English-speaking countries in the world being wiped out, and all the other humans on earth being wiped out, would you still save yourself (and your family)? If not, why not?
That's a more difficult moral question to answer, as the aliens previously mentioned are purely theoretical while the non-English-speaking countries are very real. However, if I am being 100% honest, then I'd have to put the safety and survival of my family, my nation and nation state before anyone else, but that's more of an emotional than a logical reaction.
A more logical reaction is to say that the larger number of humans should be saved. On the other hand, removing many of the world's democracies and most important industrialised nations may not be good for the survivors, but that also works in reverse if everybody but the English speakers (does that mean killing some Scots, Welsh and Irish, btw?) is killed off.
It might give us the chance to re-colonise the rest of the world and create an even bigger Anglosphere and make the world safe for democracy and liberal capitalism.
AIGF, of course.

Posted: 2008-06-15 04:36pm
by KlavoHunter
You do realize, guys, that the only thing that's been holding back the close-air support - the Warthogs, Su-25s, and all manner of helicopters - has been the Harpies?
And they're all dead now...

Posted: 2008-06-15 04:46pm
by Mr Bean
KlavoHunter wrote:You do realize, guys, that the only thing that's been holding back the close-air support - the Warthogs, Su-25s, and all manner of helicopters - has been the Harpies?
And they're all dead now...

Don't forget the single greatest close air support craft in a safe sky...
Spooky...
Posted: 2008-06-15 06:14pm
by Darth Wong
If it was a choice between us and some psychopathic alien civilization bent on genocide, I see no reason to conclude that the ethical decision is to let the other side win, even if it has a greater population, regardless of whether you call it tribalism. Utilitarianism also includes rule utilitarianism after all.
Posted: 2008-06-15 07:05pm
by tim31
Mr Bean wrote:
Spooky...
I actually did a frat-boy cheer when I saw that image!
Posted: 2008-06-15 07:07pm
by Fyrwulf
Bayonet wrote:
It's polite to use each other's terms of respect. It would be like, if you were Japanese, calling you Shroom San, even though it means nothing in English.
Ah, first, the proper use is Shroom-san. And it does mean something, being the equivalent of Mister or Missus. You also have honorifics such as -sensei, which literally means "teacher", or -sempai/-kohai, which refer to a formalized teacher/student or master/apprentice relationship. -Sama is a formalized variant of -san which refers to somebody astronomically senior, and -dono is used to refer to somebody who in the old days would have been a lord (mayor, governor, PM/President, etc.).
There are a couple of instances where honorifics aren't used at all. One used to be when referring to somebody convicted of a crime, although that's changed recently; the other is when a married couple is speaking to each other in private (although most will still slip into the -kun/-chan thing out of habit).
Shroom Man 777 wrote:That's Shroom-kun, Bayonet-chan
Only if you know each other offline fairly well. Both forms are diminutive and are often used to tease somebody, not to mention being very familiar.
Posted: 2008-06-15 07:39pm
by Bayonet
Starglider wrote:
Out of curiosity if you had a choice between say all the English-speaking countries in the world being wiped out, and all the other humans on earth being wiped out, would you still save yourself (and your family)? If not, why not?
In a minute. And I would seek to kill anyone who stood in my way. Our side winning is the only law.
Posted: 2008-06-15 08:13pm
by Starglider
Darth Wong wrote:If it was a choice between us and some psychopathic alien civilization bent on genocide, I see no reason to conclude that the ethical decision is to let the other side win,
Neither do I. My point wasn't that utilitarianism implies pacifism or rules out genocide (when that inflicts the least harm in the long run); it doesn't. My point was that utilitarianism does not have to be species-centric, and does not rule out self-sacrifice on a pan-species scale either. It depends on the situation and exactly which version of utilitarianism you're using.
Utilitarianism also includes rule utilitarianism after all.
Frankly, I've long since lost track of which philosophers mean what by 'utilitarianism'. Strictly, using any sort of additive utility formula - ideally incorporating probabilities to get expected utility - is utilitarianism, even if you're assigning utility to each instance of baby eating (5) and dog kicking (3) and nothing else. But the normal/popular sense of the word, which I was using above, is 'assign utility to individual life, freedom and/or comfort'. That's still broad enough for endless arguments, of course.
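To spell out the strict, additive reading as a minimal sketch (the utilities reuse the toy values above; the probabilities are invented for illustration):
[code]
# Expected utility under the strict additive reading: sum of P(outcome) * U(outcome).
# Toy utilities from the post (baby eating = 5, dog kicking = 3); probabilities invented.
outcomes = {
    "baby_eating": {"utility": 5, "probability": 0.2},
    "dog_kicking": {"utility": 3, "probability": 0.8},
}

expected_utility = sum(o["probability"] * o["utility"] for o in outcomes.values())
print(expected_utility)  # 0.2 * 5 + 0.8 * 3 = 3.4
[/code]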
Bayonet wrote:In a minute. And I would seek to kill anyone who stood in my way. Our side winning is the only law.
Since language was a deliberately arbitrary criterion, would you care to specify exactly what your criterion for 'my side' is?
Posted: 2008-06-15 09:38pm
by Adrian Laguna
Starglider wrote:In practice the 'selfishness threshold' varies all the way from 'I only care about myself' through 'family', 'tribe', 'nation (slightly bigger tribe)', 'near-human', 'ethical biological sentience' etc all the way out to 'sentience of any kind'. The bizarre thing for me really is that there's no real objective justification for any of them (as much as morality can have an objective justification) except 'myself', 'people I know personally and like' and 'all ethical sentient beings'. Yet somehow people still manage to pick (and self-justify) other points on the spectrum.
It's easy to do if you wilfully discard logic and make your decision on whim and emotion. For example, I tell you that I would sacrifice any number of aliens for the sake of humanity. I don't bother coming up with ethical explanations as to why, because I don't care, and because there probably isn't one. I think the truly bizarre thing is how few people are willing to admit they are being irrational. Humans can drop logic easily, but they hold on to the pretence of it with incredible tenacity.
Posted: 2008-06-15 10:41pm
by Nova Andromeda
Adrian Laguna wrote:Starglider wrote:In practice the 'selfishness threshold' varies all the way from 'I only care about myself' through 'family', 'tribe', 'nation (slightly bigger tribe)', 'near-human', 'ethical biological sentience' etc all the way out to 'sentience of any kind'. The bizarre thing for me really is that there's no real objective justification for any of them (as much as morality can have an objective justification) except 'myself', 'people I know personally and like' and 'all ethical sentient beings'. Yet somehow people still manage to pick (and self-justify) other points on the spectrum.
It's easy to do if you wilfully discard logic and make your decision on whim and emotion. For example, I tell you that I would sacrifice any number of aliens for the sake of humanity. I don't bother coming up with ethical explanations as to why, because I don't care, and because there probably isn't one. I think the truly bizarre thing is how few people are willing to admit they are being irrational. Humans can drop logic easily, but they hold on to the pretence of it with incredible tenacity.
-Speak for yourself. I'm firmly on Starglider's side of this argument. Preserving humanity at any cost to other sentiences is silly. I think I've raised similar issues previously as well. In fact, I'd prefer to see humanity go extinct and be replaced by something (assuming it becomes technically possible) that fixes and improves all the 'God-given' crap.
-I do wonder if you realize that such a stance makes you an ethically legitimate target for preemptive extermination. Perhaps you should reconsider?
Posted: 2008-06-15 10:43pm
by MKSheppard
This is what the baldricks are having to smash through:
Map of a single Rifle Division's defensive positions at Kursk
Bigger original map, but in Russian
Makes you feel maybe 0.1% sorry for them
Posted: 2008-06-15 10:50pm
by Surlethe
Nova Andromeda wrote:Adrian Laguna wrote:Starglider wrote:In practice the 'selfishness threshold' varies all the way from 'I only care about myself' through 'family', 'tribe', 'nation (slightly bigger tribe)', 'near-human', 'ethical biological sentience' etc all the way out to 'sentience of any kind'. The bizarre thing for me really is that there's no real objective justification for any of them (as much as morality can have an objective justification) except 'myself', 'people I know personally and like' and 'all ethical sentient beings'. Yet somehow people still manage to pick (and self-justify) other points on the spectrum.
It's easy to do if you wilfully discard logic and make your decision on whim and emotion. For example, I tell you that I would sacrifice any number of aliens for the sake of humanity. I don't bother coming up with ethical explanations as to why, because I don't care, and because there probably isn't one. I think the truly bizarre thing is how few people are willing to admit they are being irrational. Humans can drop logic easily, but they hold on to the pretence of it with incredible tenacity.
-Speak for yourself. I'm firmly on Starglider's side of this argument. Preserving humanity at any cost to other sentiences is silly.
I disagree. Ethics codes are like mathematics and theology: self-contained, axiomatic systems. I see no reason why I should accept your system of ethics -- one that apparently embraces the extinction of humanity as a good thing (!) -- in favor of mine, which notes that ethics exists to serve humanity.
Posted: 2008-06-15 10:56pm
by Darth Wong
Starglider wrote:Frankly, I've long since lost track of which philosophers mean what by 'utilitarianism'. Strictly, using any sort of additive utility formula - ideally incorporating probabilities to get expected utility - is utilitarianism, even if you're assigning utility to each instance of baby eating (5) and dog kicking (3) and nothing else. But the normal/popular sense of the word, which I was using above, is 'assign utility to individual life, freedom and/or comfort'. That's still broad enough for endless arguments, of course.
I'm referring to the distinction between act utilitarianism and rule utilitarianism. Act utilitarianism says "what decision would preserve the greatest utility in this particular case", whereas rule utilitarianism says "what rule would preserve the greatest utility if it were applied universally". In your hypothetical scenario where some genocidal species which outnumbers us is attempting to wipe us out, act utilitarianism might actually arrive at the conclusion that it is ethical to let them wipe us out, while rule utilitarianism would not, because the rule that smaller groups should simply lie back and passively allow larger genocidal groups to wipe them out would cause far more problems in the long term than it solves.
Of course, if one interprets utilitarianism to apply only to humans, all of this is a moot point.
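To make the act/rule distinction above concrete, here is a minimal sketch; the scenario, the function names and every utility number are invented for illustration, not taken from the thread:
[code]
# Hedged sketch of act vs. rule utilitarianism; all numbers are invented.

def act_utilitarian_choice(utilities_by_action):
    """Pick the action with the highest utility in this particular case."""
    return max(utilities_by_action, key=utilities_by_action.get)

def rule_utilitarian_choice(rules, cases):
    """Pick the rule whose universal application yields the highest total utility."""
    totals = {name: sum(score(case) for case in cases) for name, score in rules.items()}
    return max(totals, key=totals.get)

# Single case: a small group facing a larger, genocidal one.
# Act utilitarianism only sees the headcount in this one case.
this_case = {"resist": -10, "submit": -1}
print(act_utilitarian_choice(this_case))  # "submit"

# Rule utilitarianism scores each rule across all the cases it would govern,
# where a norm of passive submission keeps rewarding aggressors later on.
cases = range(100)
rules = {
    "always_resist": lambda case: -10,
    "always_submit": lambda case: -50,   # assumed long-run cost of rewarding aggression
}
print(rule_utilitarian_choice(rules, cases))  # "always_resist"
[/code]
The only point of the second function is that the rule is evaluated over every case it would govern, not just the case at hand, which is why the two approaches can disagree.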
Posted: 2008-06-15 11:12pm
by Nova Andromeda
Surlethe wrote:Nova Andromeda wrote:Adrian Laguna wrote:
It's easy to do if you wilfully discard logic and make your decision on whim and emotion. For example, I tell you that I would sacrifice any number of aliens for the sake of humanity. I don't bother coming up with ethical explanations as to why, because I don't care, and because there probably isn't one. I think the truly bizarre thing is how few people are willing to admit they are being irrational. Humans can drop logic easily, but they hold on to the pretence of it with incredible tenacity.
-Speak for yourself. I'm firmly on Starglider's side of this argument. Preserving humanity at any cost to other sentiences is silly.
I disagree. Ethics codes are like mathematics and theology: self-contained, axiomatic systems. I see no reason why I should accept your system of ethics -- one that apparently embraces the extinction of humanity as a good thing (!) -- in favor of mine, which notes that ethics exists to serve humanity.
-Your concept of ethics is narrow-minded indeed if it can only be applied to humans. In fact, it is so narrow-minded that it would probably increase the likelihood of humanity's extermination by making humanity a greater threat than necessary to every other sentient.
Posted: 2008-06-15 11:18pm
by Surlethe
Nova Andromeda wrote:-Your concept of ethics is narrow-minded indeed if it can only be applied to humans.
So? The point I'm trying to get across is that no code of ethics is intrinsically "better" than any other -- even the authoritarian moral system of fundamentalists. All you can do is compare what different moral systems are better at doing, and then make your choice based on your preference. When it comes to something as esoteric as dealing with other sentiences in the universe, you'll have difficulty finding people who like the consequences of your assumptions, especially if you proudly trot out the fact that you'd like to see humanity extinct as a consequence of your moral code.
In fact, it is so narrow-minded that it would probably increase the likelihood of humanity's extermination by making humanity a greater threat than necessary to every other sentient.
Why?
Posted: 2008-06-15 11:21pm
by Darth Wong
Nova Andromeda wrote:-Your concept of ethics is narrow-minded indeed if it can only be applied to humans.
The whole point of an ethics system is to define what you can and can't do. It's not supposed to be open-minded.
In fact, it is so narrow-minded that it would probably increase the likelihood of humanity's extermination by making humanity a greater threat than necessary to every other sentient.
That is only true if you assume that an ethics system which puts humanity first would necessarily recommend unprovoked hostile action against other intelligent, powerful species. There is no reason for that assumption.