A choice of two "utopias".
Moderator: Alyrium Denryle
-
- SMAKIBBFB
- Posts: 19195
- Joined: 2002-07-28 12:30pm
- Contact:
- Captain tycho
- Has Elected to Receive
- Posts: 5039
- Joined: 2002-12-04 06:35pm
- Location: Jewy McJew Land
Morat wrote:Now tell me again: What is this poll supposed to accomplish?
After news of the subway fire in South Korea, I felt that humans were too unstable to look after civilization by themselves. And I was mumbling to myself. I feel better now. Honest.
With any luck, we might also bait some fundies and "log cabin libertarian luddites" for our dissecting pleasure; but I'm certain that those types would be too scared of this place by now.
- UltraViolence83
- Jedi Master
- Posts: 1120
- Joined: 2003-01-12 04:59pm
- Location: Youngstown, Ohio, USA
I refuse to vote. I hate the notion of ruling AIs as much as I hate fundies.
Rathark: We're too unstable to look after our OWN civilization!? The day I'm ruled by a machine is the day I place magnets on every hard drive I see.
Call me a pro-human terrorist if you must, but I will not let another sentient "thing" rule over me or my species.
I don't see what's so bad about log cabins... I wouldn't call myself a Luddite by a long shot; I'm more of a "Naturalist," if such a thing exists.
Won't have to worry, anyway. Utopias are impossible.
Sorry. I get into rant mode whenever I hear "utopia" and "AI rule."
...This would sharpen you up and make you ready for a bit of the old...ultraviolence.
- Peregrin Toker
- Emperor's Hand
- Posts: 8609
- Joined: 2002-07-04 10:57am
- Location: Denmark
- Contact:
- EmperorMing
- Sith Devotee
- Posts: 3432
- Joined: 2002-09-09 05:08am
- Location: The Lizard Lounge
- Darth Wong
- Sith Lord
- Posts: 70028
- Joined: 2002-07-03 12:25am
- Location: Toronto, Canada
- Contact:
Living under the fundies might not even be an option at all for many of us, since we would be executed for heresy; and once the fundies are in complete control and the Inquisition inevitably returns, the rest could very well face execution simply for looking at a priest the wrong way or ticking off the town gossip.
AIs suck too, but it's paradise compared to that.
"It's not evil for God to do it. Or for someone to do it at God's command."- Jonathan Boyd on baby-killing
"you guys are fascinated with the use of those "rules of logic" to the extent that you don't really want to discussus anything."- GC
"I do not believe Russian Roulette is a stupid act" - Embracer of Darkness
"Viagra commercials appear to save lives" - tharkûn on US health care.
http://www.stardestroyer.net/Mike/RantMode/Blurbs.html
"you guys are fascinated with the use of those "rules of logic" to the extent that you don't really want to discussus anything."- GC
"I do not believe Russian Roulette is a stupid act" - Embracer of Darkness
"Viagra commercials appear to save lives" - tharkûn on US health care.
http://www.stardestroyer.net/Mike/RantMode/Blurbs.html
- EmperorMing
- Sith Devotee
- Posts: 3432
- Joined: 2002-09-09 05:08am
- Location: The Lizard Lounge
Darth Wong wrote:Living under the fundies might not even be an option at all for many of us, since we would be executed for heresy; and once the fundies are in complete control and the Inquisition inevitably returns, the rest could very well face execution simply for looking at a priest the wrong way or ticking off the town gossip. AIs suck too, but it's paradise compared to that.
At least you can have anal sex with the AIs around...
DILLIGAF: Does It Look Like I Give A Fuck
Kill your God!
- Admiral Valdemar
- Outside Context Problem
- Posts: 31572
- Joined: 2002-07-04 07:17pm
- Location: UK
Darth Wong wrote:Living under the fundies might not even be an option at all for many of us, since we would be executed for heresy; and once the fundies are in complete control and the Inquisition inevitably returns, the rest could very well face execution simply for looking at a priest the wrong way or ticking off the town gossip. AIs suck too, but it's paradise compared to that.
EmperorMing wrote:At least you can have anal sex with the AIs around...
How does one go about fucking a computer in the arse?
- Keevan_Colton
- Emperor's Hand
- Posts: 10355
- Joined: 2002-12-30 08:57pm
- Location: In the Land of Logic and Reason, two doors down from Lilliput and across the road from Atlantis...
- Contact:
EmperorMing wrote:At least you can have anal sex with the AIs around...
Admiral Valdemar wrote:How does one go about fucking a computer in the arse?
I think you need a copy of "Back Orifice".
"Prodesse Non Nocere."
"It's all about popularity really, if your invisible friend that tells you to invade places is called Napoleon, you're a loony, if he's called Jesus then you're the president."
"I'd drive more people insane, but I'd have to double back and pick them up first..."
"All it takes for bullshit to thrive is for rational men to do nothing." - Kevin Farrell, B.A. Journalism.
BOTM - EBC - Horseman - G&C - Vampire
"It's all about popularity really, if your invisible friend that tells you to invade places is called Napoleon, you're a loony, if he's called Jesus then you're the president."
"I'd drive more people insane, but I'd have to double back and pick them up first..."
"All it takes for bullshit to thrive is for rational men to do nothing." - Kevin Farrell, B.A. Journalism.
BOTM - EBC - Horseman - G&C - Vampire
- UltraViolence83
- Jedi Master
- Posts: 1120
- Joined: 2003-01-12 04:59pm
- Location: Youngstown, Ohio, USA
SHODAN wrote:What do you mean by 'AIs suck', oh Lord Wong?
What!? How dare you question our Dark Lord! Burn the heretic, I say!
Just wait till I get my plans of world domination in motion
*pant, pant*
*returns to normal operating condition*
Well, since he apparently isn't around right now, I'll put in my two cents on why I think uber-AIs suck, as well as utopias in general:
(Soapbox alert)
Main reason is that if we do make supersmart machines, and we are actually stupid enough to make MORE and let them control us and do our work, we won't have anything meaningful to do. That's the greatest pitfall of utopian thought: utopians never realize that once you fix everything to perfection, there is absolutely nothing to do.
The term "Utopia" is Greek for "Nowhere," and was coined by Thomas More, who wrote a book entitled Utopia. That was almost 500 years ago, so it's not an ancient ideal like a lot of people think.
Work is something we need as a species, as individual persons. Without meaning in our lives, we are meaningless. To do nothing but grow complacent and stagnant will lead to nothing but our decay as a whole. Without perpetual goals to work for, we lose that essential spark of humanity that we seem to take for granted. That spark called "hope."
I don't mean pointless work like hobbies or deviant sex; I mean the kind of work that runs society, and the life-or-death decisions made by our leaders, and sometimes by ourselves. The kind of work that makes us who we are, the adventurous, dangerous kind.
Imagine living in a totally complacent world. Would there even be any stories to write or fun games to make? Every good story has some kind of conflict in it. A world without any conflict would breed generations of boring, placid people.
With AIs running the scene there might be far fewer uncertainties and tragedies, but we need death and chaos to really understand the consequences of our actions and reality in general. Without accidents or human error like the Columbia disaster and Chernobyl, we wouldn't think of our space pioneers as brave individuals or truly understand the importance of engineering a suitable nuclear power plant.
Point is, sustained contentment leads to apathy and a lack of prudence. The underestimating of possible disasters is also a real danger.
And let's not even get into the fact that everyone's idea of paradise is different. "One man's Heaven is another man's Hell."
*Soapbox collapses from constant abuse*
...This would sharpen you up and make you ready for a bit of the old...ultraviolence.
I know most of you voted for benign AIs, but really, I'd rather have religious fundies. At least, being human, you could hope to out-think and overthrow them eventually. A faster-thinking AI would be worse. Read Jack Williamson's "Humanoids" stories, especially the novella "With Folded Hands", for an idea of what a benevolent AI might do to humanity. I have truly never read a more chilling, more terrifying story in my entire life. If you can really imagine a world like the one Williamson described in that story, I guarantee you will thank whatever god you pray to (or blind luck, if you follow no god) that you have not lived to see such a state of affairs. It's the most depressing and hope-killing thing I've ever read.
UltraViolence, are you saying that humans have to do world-changing work in order to be happy?
Would you therefore say that all barbers (for example) are unhappy?
UltraViolence83 wrote:And let's not even get into the fact that everyone's idea of paradise is different.
Which is why the Culture is a true utopia. If you don't like the conditions in the Culture proper, you can always go form your own offshoot. Chances are you'll find some like-minded people to go with you.
I know most of you voted for benign AIs, but really, I'd rather have religious fundies. At least, being human, you could hope to out-think and overthrow them eventually. A faster-thinking AI would be worse. Read Jack Williamson's "Humanoids" stories, especially the novella "With Folded Hands", for an idea of what a benevolent AI might do to humanity. I have truly never read a more chilling, more terrifying story in my entire life. If you can really imagine a world like the one Williamson described in that story, I guarantee you will thank whatever god you pray to (or blind luck, if you follow no god) that you have not lived to see such a state of affairs. It's the most depressing and hope-killing thing I've ever read.
And as a counterpoint, I suggest reading the Culture novels by Iain M. Banks for a very different view on what life under super-intelligent AIs might be like.
- Darth Servo
- Emperor's Hand
- Posts: 8805
- Joined: 2002-10-10 06:12pm
- Location: Satellite of Love
How could public executions for heresy possibly be a superior alternative to yielding some of our personal freedom and decisions to machines?
"everytime a person is born the Earth weighs just a little more."--DMJ on StarTrek.com
"You see now you are using your thinking and that is not a good thing!" DMJay on StarTrek.com
"Watching Sarli argue with Vympel, Stas, Schatten and the others is as bizarre as the idea of the 40-year-old Virgin telling Hugh Hefner that Hef knows nothing about pussy, and that he is the expert."--Elfdart
"You see now you are using your thinking and that is not a good thing!" DMJay on StarTrek.com
"Watching Sarli argue with Vympel, Stas, Schatten and the others is as bizarre as the idea of the 40-year-old Virgin telling Hugh Hefner that Hef knows nothing about pussy, and that he is the expert."--Elfdart
Re: A choice of two "utopias".
Rathark wrote:Poverty still exists, of course; although we probably won't recognise it as poverty today......
Social, political, personal, educational and creative freedom no longer exists for the vast, mindless majority. ....
If these are utopias, I want a refund!
UltraViolence83 wrote:Main reason is that if we do make supersmart machines, and we are actually stupid enough to make MORE and let them control us and do our work, we won't have anything meaningful to do. That's the greatest pitfall of utopian thought: utopians never realize that once you fix everything to perfection, there is absolutely nothing to do.
But that's the entire point; all important tasks will be allocated to machines so you don't have to do anything.
- Keevan_Colton
- Emperor's Hand
- Posts: 10355
- Joined: 2002-12-30 08:57pm
- Location: In the Land of Logic and Reason, two doors down from Lilliput and across the road from Atlantis...
- Contact:
UltraViolence83 wrote:Main reason is that if we do make supersmart machines, and we are actually stupid enough to make MORE and let them control us and do our work, we won't have anything meaningful to do. That's the greatest pitfall of utopian thought: utopians never realize that once you fix everything to perfection, there is absolutely nothing to do.
SHODAN wrote:But that's the entire point; all important tasks will be allocated to machines so you don't have to do anything.
I'll take the AI utopia on one important stipulation... YOU aren't the AI in charge.
"Prodesse Non Nocere."
"It's all about popularity really, if your invisible friend that tells you to invade places is called Napoleon, you're a loony, if he's called Jesus then you're the president."
"I'd drive more people insane, but I'd have to double back and pick them up first..."
"All it takes for bullshit to thrive is for rational men to do nothing." - Kevin Farrell, B.A. Journalism.
BOTM - EBC - Horseman - G&C - Vampire
"It's all about popularity really, if your invisible friend that tells you to invade places is called Napoleon, you're a loony, if he's called Jesus then you're the president."
"I'd drive more people insane, but I'd have to double back and pick them up first..."
"All it takes for bullshit to thrive is for rational men to do nothing." - Kevin Farrell, B.A. Journalism.
BOTM - EBC - Horseman - G&C - Vampire
Morat wrote:And as a counterpoint, I suggest reading the Culture novels by Iain M. Banks for a very different view on what life under super-intelligent AIs might be like.
Well, call me a pessimist, but once you put machines firmly in the saddle, I think the results are likely to be less utopian than you hoped for. Remember, these are machines, not people; they wouldn't think the way we do, and they wouldn't necessarily even comprehend human nature or human psychology. I think a dystopian outcome like Williamson's is more likely, as the machines decide they know what's best for us and do it all for our own good, whether we like it or not.