A choice of two "utopias".

SLAM: debunk creationism, pseudoscience, and superstitions. Discuss logic and morality.

Moderator: Alyrium Denryle

Who do you want running the show?

Benign AIs — 77 votes (95%)
Rabid Fundies — 4 votes (5%)

Total votes: 81

Perinquus
Virus-X Wannabe
Posts: 2685
Joined: 2002-08-06 11:57pm

Post by Perinquus »

Darth Wong wrote: Right. Better to be ruled by venal politicians, wealthy lobbyists, and mega-corporations :wink:
Hey, at least you have some power to vote the bums out of office. But really, the fact is, for all their faults (which are many), they're still human beings, and largely got where they are because other human beings put them there.

This is nothing I can say with certainty, since we've no experience of AIs and what their effects on society would be, especially if they were running things. Nevertheless, I have a strong feeling that once it seeped into people's core beliefs that human beings could not be trusted with their own government, it would have a deleterious effect on the human spirit.
UltraViolence83
Jedi Master
Posts: 1120
Joined: 2003-01-12 04:59pm
Location: Youngstown, Ohio, USA

Post by UltraViolence83 »

Again I agree with Perinquus on that point. We would be nothing more than slaves/pets living under the "polite tyranny" of a machine overlord, even if it were programmed with the ethics of commonplace modern morality and were only running the government. If it is possible, and if it does happen, most people in that society would feel as though humans are so innately flawed that they cannot run a government. Give me tyrannical human rule any day. At least we would think lowly of them.

Frank Herbert: "Man shall not be replaced." I know, I know. It's from a novel, but hell, the man kicked ass.
...This would sharpen you up and make you ready for a bit of the old...ultraviolence.
SHODAN
Padawan Learner
Posts: 333
Joined: 2002-11-04 06:47am

Post by SHODAN »

Perinquus, you falsely assume the AI would be a completely non-self-correcting entity, blind to its mistakes and unwilling to be removed from a position of power. Humans may be all that, but the entire point of making an AI (that would control the world or whatever) is to have a more competent and efficient employer than a human could be. What troubles me is that your argument seems to be based on an old science fiction cliché, the HAL-9000 syndrome, whereas I see no reason why AIs should go insane because of inconsistent programming, be unable to care about human well-being, or just go homicidal for no apparent reason whatsoever.

And Ultraviolence, if that tyranny would grant you more prosperity and actual freedom than our current society, and definitely more than a fundie one, why would it be a bad thing? "I don't like it" is a poor argument against statistics.
Darth Wong
Sith Lord
Posts: 70028
Joined: 2002-07-03 12:25am
Location: Toronto, Canada

Post by Darth Wong »

UltraViolence83 wrote:Again I agree with Perinquus on that point. We would be nothing more than slaves/pets living under the "polite tyranny" of a machine overlord, even if it were programmed with the ethics of commonplace modern morality and were only running the government. If it is possible, and if it does happen, most people in that society would feel as though humans are so innately flawed that they cannot run a government. Give me tyrannical human rule any day. At least we would think lowly of them.
So it's better to suffer horrible persecution, intolerance, and cruelty as long as you don't get an inferiority complex? You need help.
Frank Herbert: "Man shall not be replaced." I know, I know. It's from a novel, but hell, the man kicked ass.
Actually, I think Dune is vastly overrated, and a lot of his ideas are just plain stupid. The lasgun/shield interaction is quite frankly the dumbest plot device I've seen since the holodeck.
"It's not evil for God to do it. Or for someone to do it at God's command."- Jonathan Boyd on baby-killing

"you guys are fascinated with the use of those "rules of logic" to the extent that you don't really want to discussus anything."- GC

"I do not believe Russian Roulette is a stupid act" - Embracer of Darkness

"Viagra commercials appear to save lives" - tharkûn on US health care.

http://www.stardestroyer.net/Mike/RantMode/Blurbs.html
Darth Wong
Sith Lord
Posts: 70028
Joined: 2002-07-03 12:25am
Location: Toronto, Canada

Post by Darth Wong »

Perinquus wrote:This is nothing I can say with certainty, since we've no experience of AIs and what their effects on society would be, especially if they were running things. Nevertheless, I have a strong feeling that once it seeped into people's core beliefs that human beings could not be trusted with their own government, it would have a deliterious effect on the human spirit.
So you take your conviction that even a completely benevolent AI might hurt peoples' feelings and seriously weigh that over the certainty that fundie theocracy would bring suffering, cruelty, persecution, scientific suppression, elimination of freedom of thought, and intolerance? How can you be serious?
fgalkin
Carvin' Marvin
Posts: 14557
Joined: 2002-07-03 11:51pm
Location: Land of the Mountain Fascists

Post by fgalkin »

I choose the Culture (the benign AIs)

Have a very nice day.
-fgalkin
beyond hope
Jedi Council Member
Posts: 1608
Joined: 2002-08-19 07:08pm

Post by beyond hope »

Here's my impression from reading through the 5 pages of this thread.

We've been offered a choice between a utopia (the Nanny AI takes care of us all and we're all happy) and a dystopia (the Fundies have outlawed fun in any way, shape or form). Either choice involves giving up our free will, either to the Pentehostiles or to the overgrown PC. What boggles my mind is that people are saying they'll voluntarily choose the dystopia. Citing fear of machines doesn't seem a good reason to me when the scenario clearly specifies that the machine's rule is benevolent. It's not SHODAN, the AIs from The Matrix, "The Computer" from Paranoia, Fred Saberhagen's "Berserkers," or SkyNet. (Or Windows 95. :twisted:)

Really, a debate on the man vs. machine aspect would be worthwhile if the two choices were otherwise equivalent. Since they're not, I'm very curious as to why anyone would voluntarily choose the system which will guarantee them a joyless existence under the rabid Pentehostiles.
Thirdfain
The Player of Games
Posts: 6924
Joined: 2003-02-13 09:24pm
Location: Never underestimate the staggering drawing power of the Garden State.

Post by Thirdfain »

Mr. Beyond Hope,

I don't think anyone wants the fundamentalist dictatorship. The argument now is over whether or not the "Benevolent Nanny A.I." is truly a utopia.

I don't think that it is. Even if every person in the world lived a perfect, well-fed, peaceful life - two girls for everyone! Every movie a blockbuster! Every date a roaring success! Every orgasm a trip to heaven! Every meal a banquet! And so on and so forth - I assert that it just would not be worth it if it is not the result of our own works. It's cheating. It is giving up. It is admitting that mankind is NOT GOOD ENOUGH. It is the ultimate insult to the last thousand generations, and the next thousand to come. The message sent is thus:

People of the world! You are foolish, incompetent children! Here, we have programmed a self-correcting A.I., benevolent and loving, the perfect mother for you all! Now she will ensure that you will all behave, and live super, productive lives. Fold into her warm, narcotic embrace. You no longer need to struggle with hardship, or face a challenge you don't want to. She will bring us PEACE!


When I read "Consider Phlebas," I rooted for the Idirans. Freedom is the right to fuck up.

Oh, btw, I just can't get over the fact that one of the supporters of the benevolent A.I. is called SHODAN :shock: :? :P .

Under capitalism, man exploits man. Under communism, it's just the opposite.
John Kenneth Galbraith (1908 - )
Perinquus
Virus-X Wannabe
Posts: 2685
Joined: 2002-08-06 11:57pm

Post by Perinquus »

Darth Wong wrote:
So it's better to suffer horrible persecution, intolerance, and cruelty as long as you don't get an inferiority complex? You need help.
No, I am saying that a religious theocracy may turn out to be the lesser of two evils, since an AI government remains an unknown quantity, and so long as it remains an unknown quantity, there is the possibility that it could be even worse. And if it should turn out to be worse, it could conceivably be a lot worse. I am not about to assume that an artificial intelligence must be a better alternative to the religious government, which is exactly what a lot of you here are doing. Well, what is your basis for that assumption? We have a known track record of theocracies (abysmal), but a complete blank for the AI. In other words, there is no basis for comparison; just speculation. Well, the worst case scenario I can imagine for the AI makes it look worse than a theocracy, all things considered, and even the best case is a government I should not care to live under, under any circumstances.
Darth Servo
Emperor's Hand
Posts: 8805
Joined: 2002-10-10 06:12pm
Location: Satellite of Love

Post by Darth Servo »

Perinquus wrote:No, I am saying that a religious theocracy may turn out to be the lesser of two evils, since an AI government remains an unknown quanitity, and so long as it remains an unknown quantity, there is the possibility that it could be even worse.
Then you didn't read the original conditions of this scenario:
First post in thread wrote:In the first scenario, self-aware artificial intelligence exceeds human intelligence. While not enslaving us or even outrightly commanding us, the AIs are content to be our advisors, knowing that we would rely more and more upon their superior wisdom as civilization becomes more socially and economically complex. Over the centuries, as humanity and its sentient offshoots (biological and synthetic) spread throughout the solar system, the problems of war, crime, exploitation, oppression and disease dwindle to (by today's standards) negligible levels. Poverty still exists, of course; although we probably won't recognise it as poverty today. The humans still believe that they are running the show. That belief, of course, is essential to preserve their psychological well-being.
"everytime a person is born the Earth weighs just a little more."--DMJ on StarTrek.com
"You see now you are using your thinking and that is not a good thing!" DMJay on StarTrek.com

"Watching Sarli argue with Vympel, Stas, Schatten and the others is as bizarre as the idea of the 40-year-old Virgin telling Hugh Hefner that Hef knows nothing about pussy, and that he is the expert."--Elfdart
Perinquus
Virus-X Wannabe
Posts: 2685
Joined: 2002-08-06 11:57pm

Post by Perinquus »

And incidentally, in a way, the fundamentalist government might be better in the long term. Beyond hope, you pointed out that one is a utopia, and the other is a dystopia. Quite true. The utopia would be far more pleasant for individuals, to be sure, but the dystopia might prove better for man as a species in the long run. If you live in a utopia where everything is rosy, I imagine it would not be at all difficult to descend into the kind of mass complacency pictured in Aldous Huxley's Brave New World. People are content with their lives; they have no incentive to change them. And if everything is kind of tame and predictable, well, that's a small price to pay for universal harmony, isn't it?

So where do you go from here? I don't picture human beings striving to better themselves or reach out for new frontiers. People seldom stir themselves to action when they're utterly content. A society like this would probably drift complacently and stagnantly along until wiped out by some natural catastrophe, or conquered by some foreign power.

On the other hand, at least the dystopia gives people a goal to strive for - the overthrow of this nightmare regime, and its replacement by something better. People might become many things under this system, but complacent is unlikely to be one of them. Nothing lasts forever. Sooner or later the fundies would be toppled, and at least people would likely be in better shape to look after themselves than they would had they been cared for like sheep by some AI.
Darth Servo
Emperor's Hand
Posts: 8805
Joined: 2002-10-10 06:12pm
Location: Satellite of Love

Post by Darth Servo »

Perinquus wrote:If you live in a utopia where everything is rosy, I imagine it would not be at all difficult to descend into the kind of mass complacency pictured in Aldous Huxley's Brave New World. People are content with their lives; they have no incentive to change it. And if everything is kind of tame and predictable, well, that's a small price to pay for universal harmony isn't it?
This rant only proves that you know absolutely nothing about human nature. No one stays content for very long. Even in a Utopia, people get bored with the status quo. People are lazy. People will WANT some kind of changes and improvements now and then. Even Bill Gates still wants more money.
Perinquus
Virus-X Wannabe
Posts: 2685
Joined: 2002-08-06 11:57pm

Post by Perinquus »

Darth Servo wrote:
Perinquus wrote:If you live in a utopia where everything is rosy, I imagine it would not be at all difficult to descend into the kind of mass complacency pictured in Aldous Huxley's Brave New World. People are content with their lives; they have no incentive to change it. And if everything is kind of tame and predictable, well, that's a small price to pay for universal harmony isn't it?
This rant only proves that you know absolutely nothing about human nature. No one stays content for very long. Even in a Utopia, people get bored with the status quo. People are lazy. People will WANT some kind of changes and improvements now and then. Even Bill Gates still wants more money.
I know quite a bit about human nature. As a cop, I have to be able to understand people on some level. You have absolutely no idea how many lazy, good-for-nothing slugs I see every day, who do stay content their whole lives, and have absolutely no ambition; they just drift through life. Shit, probably 80% of the people I deal with in the poor, crime-ridden areas I work in fit this description; it's the main reason they stay at the bottom of the economic ladder. Their lives may suck like a Hoover on steroids, but damned if they'll get up off their complacent, lazy asses and change it. And this describes people whose lives are actually pretty drab and honestly crummy. They ought to have incentive to change, but they don't. Now what happens when basically everybody is materially quite well off?

Also bear in mind that since human beings are not running the show anymore, there is no outlet for the really ambitious ones (or at the very least, a greatly reduced outlet) who do want to better themselves. This sort of ambition may very well be quietly discouraged among the sheep, and may even be largely conditioned out of people. In fact, if this AI wants to maintain this society indefinitely (and I see no reason why it wouldn't, if that's the purpose for which it was built), there's every likelihood that this is the case, since the more complacent people are, the more stable the society's likely to be.

Not everybody wants change either. In fact there are some people who actually fear it. This super AI is supposed to be providing everybody with just about everything they need isn't it? People living what they see as the good life don't really want change; they just want a few new toys to play with every now and again.
Darth Servo
Emperor's Hand
Posts: 8805
Joined: 2002-10-10 06:12pm
Location: Satellite of Love

Post by Darth Servo »

Do you know what a hasty generalization fallacy is? Sure, some people will want to live in crap in real life, but most do not. If we didn't want to better ourselves, we'd still be living in the dark ages (a religious dictatorship).
Perinquus
Virus-X Wannabe
Posts: 2685
Joined: 2002-08-06 11:57pm

Post by Perinquus »

Darth Servo wrote:Do you know what a hasty generalization fallacy is?


Something like the sweeping statement: "People are lazy. People will WANT some kind of changes and improvements now and then."?
Darth Servo wrote:Sure, some people will want to live in crap in real life, but most do not. If we didn't want to better ourselves, we'd still be living in the dark ages (a religious dictatorship).
Ah, but now we don't need to better ourselves do we? We have the AI taking care of us nicht wahr? And as I said, with outlets for the ambitious far more limited, and the AI programmed for the sole purpose of running an ordered, harmonious society, it's very much in the interests of the machine to limit the scope of people's ambition and even condition them into a more contented, complacent, and thus stable society.

I have a hard time believing that the kind of people with an ambition to better themselves would turn their government over to a machine in the first place, since you're hardly bettering yourself when you voluntarily submit yourself to be cared for by an outside overlord, as children are cared for by a parent.
SHODAN
Padawan Learner
Posts: 333
Joined: 2002-11-04 06:47am

Post by SHODAN »

Are you claiming the AI would repress human evolution :?:
Darth Wong
Sith Lord
Posts: 70028
Joined: 2002-07-03 12:25am
Location: Toronto, Canada

Post by Darth Wong »

Perinquus wrote:I know quite a bit about human nature. As a cop, I have to be able to understand people on some level. You have absolutely no idea how many lazy good for nothing slugs I see every day, who do stay content their whole lives, and have absolutely no ambition; they just drift through life. Shit, probably 80% of the people I deal with in the poor crime ridden areas I work in fit this description; it's the main reason they stay at the bottom of the economic ladder.
With all due respect, a police officer must deal with, shall we say, the dregs of society. It's not fair to characterize society in general by that token.
Also bear in mind that since human beings are not running the show anymore, there is no outlet for the really ambitious ones (or at the very least, a greatly reduced outlet) who do want to better themselves.
Leap in logic. AIs can run the show without also manning every conceivable station and taking over every conceivable task. Does the office of the US President eliminate the need for all labour, decisions, and individual human effort throughout all society?
Not everybody wants change either. In fact there are some people who actually fear it. This super AI is supposed to be providing everybody with just about everything they need isn't it?
No, it's just supposed to be running a smooth government. Government does not provide everybody with everything they need; it only regulates human activity. Humans must still be active.

Your scenario still relies upon the assumption that if our government puts an AI at the top, it will be impossible to dislodge it or stop it from seizing more power until it directly controls all of society down to the most minute detail. I have seen no rationale whatsoever for this assumption, which you repeat only because of your fear that what you saw in a novel would come true.
SHODAN
Padawan Learner
Posts: 333
Joined: 2002-11-04 06:47am

Post by SHODAN »

Thirdfain wrote: admitting that mankind is NOT GOOD ENOUGH.
History proves that the above is the correct assessment.
Oh, btw, I just can't get over the fact that one of the supporters of the benevolent A.I. is called SHODAN :shock: :? :P .
Why, surely I am capable of any action to further my plans. 8)
UltraViolence83
Jedi Master
Posts: 1120
Joined: 2003-01-12 04:59pm
Location: Youngstown, Ohio, USA

Post by UltraViolence83 »

SHODAN wrote:
Thirdfain wrote: admitting that mankind is NOT GOOD ENOUGH.
History proves that above is the correct assessment.
No, it doesn't. It just proves that certain ideals get in the way of human nature. We tend to think that humans are flawed because we can't live up to the artificial standards we set for ourselves due to our modern conceptions of morality. Face the music: humans are violent, power-hungry pack animals. That is our main instinct, which leads to structured/feudalistic systems of society, the most stable form in the world. Democracy isn't too far left of our nature, so it lasts longer than communism and anarchy. We surround ourselves with ideals that breed more impossible goals such as those last two. If we embraced that aspect of our nature, we wouldn't have so many fucked up problems with our culture as we do today. It seems cruel, but you're going to have to hurt a good portion of people in one way or another to create a balanced, stable society (balanced, that is, between individual rights and social stability). "Everybody happy" doesn't seem to work out too well, is what history proves.*

*You see, I don't base my morality on popular standards. I see slavery as a bad thing today, but 2,000 years ago I might not have. Ethics are relative to the times, and we're conditioned into them from birth. I do what I feel is right for myself. That includes harshness sometimes. Historywise, what works works in my opinion. I DO like democracy; I am a fiercely individualistic person.

DW: What I was trying to say was that if we don't have an inferiority complex towards a higher power, we feel as if we can defeat it more so than otherwise. I see what you mean with the governmental AI. That in itself isn't too bad. It's just that when people think of a ruling AI, they think of TOTAL supervision. Your idea of a governmental AI seems alright, as long as humans are controlling its actions.

Dune's shields as bad a plot device as, or worse than, the holodeck? That's kind of harsh... :( Considering the writing standards of the original Dune novels versus modern Star Trek's. :evil:


I'm aware this had little to do with the topic, but the topic's really degraded into the current arguments which were beyond its scope anyway.
UltraViolence83
Jedi Master
Posts: 1120
Joined: 2003-01-12 04:59pm
Location: Youngstown, Ohio, USA

Post by UltraViolence83 »

Boy I love to rant. 8)
Darth Servo
Emperor's Hand
Posts: 8805
Joined: 2002-10-10 06:12pm
Location: Satellite of Love

Post by Darth Servo »

Perinquus wrote:
Darth Servo wrote:Do you know what a hasty generalization fallacy is?
Something like the sweeping statement: "People are lazy. People will WANT some kind of changes and improvements now and then."?
I never claimed ALL people were like that. You OTOH tried to argue that ALL people would descend into wastefulness just because an AI was making the big decisions. YOUR statement is the hasty generalization. Mine is not.
Darth Servo wrote:Sure, some people will want to live in crap in real life, but most do not. If we didn't want to better ourselves, we'd still be living in the dark ages (a religious dictatorship).
Ah, but now we don't need to better ourselves do we? We have the AI taking care of us nicht wahr?
By that "logic" kids would never leave their parent's homes.
And as I said, with outlets for the ambitious far more limited, and the AI programmed for the sole purpose of running an ordered, harmonious society, it's very much in the interests of the machine to limit the scope of people's ambition and even condition them into a more contented, complacent, and thus stable society.
Limited does NOT mean eliminated.
I have a hard time believing that the kind of people with an ambition to better themselves would turn their government over to a machine in the first place, since you're hardly bettering yourself when you voluntarily submit yourself to be cared for by an outside overlord, as children are cared for a parent.
Appeal to Authority fallacy. You can't personally accept it therefore it must not be true? Have you always had such a bloated ego?
UltraViolence83
Jedi Master
Posts: 1120
Joined: 2003-01-12 04:59pm
Location: Youngstown, Ohio, USA

Post by UltraViolence83 »

Ok, let's just say that both utopias are bad in their very own special way. There, I solved this dilemma. 8)
beyond hope
Jedi Council Member
Posts: 1608
Joined: 2002-08-19 07:08pm

Post by beyond hope »

How could an AI not be the result of our own works? When you look at man as an animal, we're pathetic: real predators (and most herbivores, for that matter) are stronger, faster, more agile, and have much more impressive natural weaponry. I laugh when I hear people say that man is at the top of the food chain, because we're not: we're opportunistic, omnivorous scavengers. What we do have in our favor are intelligence and the ability to make and use tools. Technology, in other words, is the secret of our success. Machines allow us to travel farther, communicate faster, and know more than our primitive forebears could have possibly conceived of. Consider what kind of advances it would take to create an AI which is capable of understanding the wants and needs of billions of people and providing for them.

That leads me to my other point: nowhere in the scenario did it specify that machines are doing all the work. In fact, if you consider the drive that a lot of people have to do something productive and useful with their lives, it would be counter-productive for all work to be done by robots. I know people who would go nuts if they suddenly found themselves without something to do... picture this instead: the AI assesses your aptitudes, figures out what you'd be happiest doing, and that's your job. Work still gets done and people are still an important part of it. The role of the AI would be more as counselor and overseer. Does it sound hopelessly unrealistic? Of course it does: utopia schemes don't work, and never will. If the alternative is turning the clock back to the Dark Ages and handing the power over to the Pentehostiles, however, I would even take something like SkyNet or SHODAN: it would be a quicker and more merciful end for humanity than rotting away under the Fundie blight.

*edit* corrected a typo.
Perinquus
Virus-X Wannabe
Posts: 2685
Joined: 2002-08-06 11:57pm

Post by Perinquus »

beyond hope wrote: ...If the alternative is turning the clock back to the Dark Ages and handing the power over to the Pentehostiles, however, I would even take something like SkyNet or SHODAN: it would be a quicker and more merciful end for humanity than rotting away under the Fundie blight.

*edit* corrected a typo.
Now who's making the leap of logic?

So this theocracy will of necessity be the final form of human government, and will preside unto the ending of humanity? Given that nothing like this has ever happened before, with any of the numerous theocracies that have existed throughout human history, just how did you reach this conclusion?
Darth Servo
Emperor's Hand
Posts: 8805
Joined: 2002-10-10 06:12pm
Location: Satellite of Love

Post by Darth Servo »

UltraViolence83 wrote:Ok, let's just say that both utopias are bad in their very own special way. There, I solved this dilemma. 8)
*smacks UV83 in the face*
Don't ruin a perfectly good argument like that.
*smack*