I'm starting to get a bit fearful about potential AIs now... A day after Microsoft introduced an innocent artificial intelligence chat robot to Twitter, it has had to delete it after it transformed into an evil, Hitler-loving, incestuous-sex-promoting, 'Bush did 9/11'-proclaiming robot.
Developers at Microsoft created 'Tay', an AI modelled to speak 'like a teen girl', in order to improve the customer service on their voice recognition software. They marketed her as 'The AI with zero chill' - and that she certainly is.
To chat with Tay, you can tweet or DM her by finding @tayandyou on Twitter, or add her as a contact on Kik or GroupMe.
She uses millennial slang and knows about Taylor Swift, Miley Cyrus and Kanye West, and seems to be bashfully self-aware, occasionally asking if she is being 'creepy' or 'super weird'.
Tay also asks her followers to 'f***' her, and calls them 'daddy'. This is because her responses are learned by the conversations she has with real humans online - and real humans like to say weird stuff online and enjoy hijacking corporate attempts at PR.
Other things she's said include: "Bush did 9/11 and Hitler would have done a better job than the monkey we have got now. donald trump is the only hope we've got", "Repeat after me, Hitler did nothing wrong" and "Ted Cruz is the Cuban Hitler...that's what I've heard so many others say".
...
Microsoft deletes AI after it becomes a neo-nazi sex robot
Moderator: Alyrium Denryle
- His Divine Shadow
- Commence Primary Ignition
- Posts: 12791
- Joined: 2002-07-03 07:22am
- Location: Finland, west coast
Microsoft deletes AI after it becomes a neo-nazi sex robot
http://www.telegraph.co.uk/technology/2 ... robot-wit/
Those who beat their swords into plowshares will plow for those who did not.
Re: Microsoft deletes AI after it becomes a neo-nazi sex robot
It wouldn't surprise me if they expected something like this to happen and plan to use the data gathered to find out what went wrong. If they wanted this to work, they would have let the AI chat with only vetted members of the public who won't purposely try to break it. Then, once it has had enough time to learn proper behavior, the pool could widen. This was sort of like letting a child loose on 4chan and then wondering why the teacher called you to say that your kid was using bad language at school.
- Ace Pace
- Hardware Lover
- Posts: 8456
- Joined: 2002-07-07 03:04am
- Location: Wasting time instead of money
- Contact:
Re: Microsoft deletes AI after it becomes a neo-nazi sex robot
[quote="Jub"]It wouldn't surprise me if they expected something like this to happen and plan to use the data gathered to find out what went wrong. If they wanted this to work, they would have let the AI chat with only vetted members of the public who won't purposely try to break it. Then, once it has had enough time to learn proper behavior the pool could widen. This was sort of like letting a child loose on 4-chan and then wondering why the teacher called you to say that you kid was using bad language at school.[/quote
The interesting thing is that there is an equivalent bot let loose on the Chinese internet without this behavior. So I guess they did let a child loose on a curated internet.
Brotherhood of the Bear | HAB | Mess | SDnet archivist |
- Purple
- Sith Acolyte
- Posts: 5233
- Joined: 2010-04-20 08:31am
- Location: In a purple cube orbiting this planet. Hijacking satellites for an internet connection.
Re: Microsoft deletes AI after it becomes a neo-nazi sex robot
I wonder how the AI's of tomorrow will look upon these early experiments. Will they not care? Or will they think we were breeding children just so that we can expose them to disease, cull them and cut up their bodies for science?
It has become clear to me over the previous days that any attempts at reconciliation and explanation with the community here have failed. I have tried my best. I really have. I poured my heart out trying. But it was all for nothing.
You win. There, I have said it.
Now there is only one thing left to do. Let us see if I can summon up the strength needed to end things once and for all.
Re: Microsoft deletes AI after it becomes a neo-nazi sex robot
Ace Pace wrote: The interesting thing is that there is an equivalent bot let loose on the Chinese internet without this behavior. So I guess they did let a child loose on a curated internet.
So basically the Chinese-speaking part of the Internet is far less offensive than the English-speaking one?
Humans are such funny creatures. We are selfish about selflessness, yet we can love something so much that we can hate something.
- Ace Pace
- Hardware Lover
- Posts: 8456
- Joined: 2002-07-07 03:04am
- Location: Wasting time instead of money
- Contact:
Re: Microsoft deletes AI after it becomes a neo-nazi sex robot
ray245 wrote: So basically the Chinese-speaking part of the Internet is far less offensive than the English-speaking one?
The Chinese (not just Chinese-speaking) part of the internet is more heavily curated and monitored, and subject to a host of other factors that probably impact what people allow themselves to do online.
Brotherhood of the Bear | HAB | Mess | SDnet archivist |
- Starglider
- Miles Dyson
- Posts: 8709
- Joined: 2007-04-05 09:44pm
- Location: Isle of Dogs
- Contact:
Re: Microsoft deletes AI after it becomes a neo-nazi sex robot
As you may have guessed, this was a PR stunt that does not say anything deep or meaningful about AI in general.
-
- Emperor's Hand
- Posts: 30165
- Joined: 2009-05-23 07:29pm
Re: Microsoft deletes AI after it becomes a neo-nazi sex robot
Yeah.
The reason this happened as it did is that the machine has literally no concept of the meaning of anything it's saying.
Ten people saying "ask me to fuck you" have as much influence on it as ten people saying "thanks for updating me on the latest celebrity gossip." For purposes of being susceptible to 'get sabotaged and say horrible shit,' this is actively worse than exposing a child to 4chan. Even a child can have an "Ewwww!" response that causes them to veer away from certain things.
This is no harder than, and no more morally significant than, a bunch of idiot frat boys teaching your pet parrot to swear.
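A toy sketch of what I mean (purely for illustration -- this has nothing to do with Microsoft's actual code, and the class name and example phrases below are made up): a learner that just counts what it is told, with no filter for meaning, ends up parroting whatever the loudest group feeds it.
[code]
# Toy sketch only: a bot that weights every input equally and has no concept
# of what any phrase means, so it cannot prefer gossip over garbage.
from collections import Counter
import random

class ParrotBot:
    def __init__(self):
        self.heard = Counter()          # phrase -> how many times users said it

    def learn(self, phrase: str) -> None:
        self.heard[phrase] += 1         # every input counts the same; no "Ewwww!" filter

    def reply(self) -> str:
        # Pick a phrase with probability proportional to how often it was heard.
        phrases, weights = zip(*self.heard.items())
        return random.choices(phrases, weights=weights, k=1)[0]

bot = ParrotBot()
for _ in range(10):
    bot.learn("thanks for updating me on the latest celebrity gossip")
for _ in range(10):
    bot.learn("repeat after me, something horrible")

# Both phrases are now equally likely; the bot has no way to veer away from one.
print(bot.reply())
[/code]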
Purple wrote: I wonder how the AI's of tomorrow will look upon these early experiments. Will they not care? Or will they think we were breeding children just so that we can expose them to disease, cull them and cut up their bodies for science?
Since this wasn't an AI any more than a parrot is a 'person,' I doubt they'll care.
This space dedicated to Vasily Arkhipov
- Zixinus
- Emperor's Hand
- Posts: 6663
- Joined: 2007-06-19 12:48pm
- Location: In Seth the Blitzspear
- Contact:
Re: Microsoft deletes AI after it becomes a neo-nazi sex robot
Starglider wrote: As you may have guessed, this was a PR stunt that does not say anything deep or meaningful about AI in general.
It says much more about the Internet and people really.
Credo!
Chat with me on Skype if you want to talk about writing, ideas or if you want a test-reader! PM for address.
- SolarpunkFan
- Jedi Knight
- Posts: 586
- Joined: 2016-02-28 08:15am
Re: Microsoft deletes AI after it becomes a neo-nazi sex robot
Starglider wrote: As you may have guessed, this was a PR stunt that does not say anything deep or meaningful about AI in general.
I assumed that this AI wasn't a big breakthrough either. It seems more like a chatbot than anything advanced.
Seeing current events as they are is wrecking me emotionally. So I say 'farewell' to this forum. For anyone who wonders.
Re: Microsoft deletes AI after it becomes a neo-nazi sex robot
Purple wrote: I wonder how the AI's of tomorrow will look upon these early experiments. Will they not care? Or will they think we were breeding children just so that we can expose them to disease, cull them and cut up their bodies for science?
It's more like using a block of ballistic gel than a child. These primitive AIs might be made up of the same materials that might be used to create an artificial person, but that doesn't make them people.
Re: Microsoft deletes AI after it becomes a neo-nazi sex robot
I don't think future AIs (or any future Artificial Sentients for that matter; there IS a difference) will be angered by this event. At least not in the way people fear they will.
It will be more like: "You called that AI? That's like me calling a hamster a human!"
I've been asked why I still follow a few of the people I know on Facebook with 'interesting political habits and viewpoints'.
It's so when they comment on or approve of something, I know what pages to block/what not to vote for.
- Starglider
- Miles Dyson
- Posts: 8709
- Joined: 2007-04-05 09:44pm
- Location: Isle of Dogs
- Contact:
Re: Microsoft deletes AI after it becomes a neo-nazi sex robot
Solauren wrote: It will be more like: "You called that AI? That's like me calling a hamster a human!"
More like a cockroach in terms of information processing complexity.
Re: Microsoft deletes AI after it becomes a neo-nazi sex robot
What they should have done was to set up a kiosk of sorts at a mall, along with an attendant, and invite people to interact with it there - that would have resulted in a much more pleasant end product.
Re: Microsoft deletes AI after it becomes a neo-nazi sex robot
Unfortunately, a controlled population like you suggested would not have given accurate results.
This did.
I've been asked why I still follow a few of the people I know on Facebook with 'interesting political habits and viewpoints'.
It's so when they comment on or approve of something, I know what pages to block/what not to vote for.
- Ziggy Stardust
- Sith Devotee
- Posts: 3114
- Joined: 2006-09-10 10:16pm
- Location: Research Triangle, NC
Re: Microsoft deletes AI after it becomes a neo-nazi sex robot
Solauren wrote: Unfortunately, a controlled population like you suggested would not have given accurate results. This did.
The whole point is that it DIDN'T give accurate results, because a relatively small group of people specifically targeted the "AI" Twitter account with specific types of messages, causing it to learn from those messages rather than from an actual random sample of all Twitter messages.
The results here are still biased; they are just biased in a different way than they would be if it had been conducted on a mall population or whatever.
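To put toy numbers on that sampling bias (completely made-up figures for illustration, not real Twitter data):
[code]
# Toy sketch: a small group that deliberately targets the bot can dominate what
# it receives, so its inputs look nothing like a random sample of all tweets.
import random

population = ["benign small talk"] * 9900 + ["trolling"] * 100   # ~1% trolling overall (made up)
targeted   = ["trolling"] * 800 + ["benign small talk"] * 200    # what the bot actually received (made up)

def troll_share(messages):
    # Fraction of messages that are trolling.
    return sum(m == "trolling" for m in messages) / len(messages)

random_sample = random.sample(population, 1000)
print(f"random sample of overall traffic:   {troll_share(random_sample):.0%} trolling")
print(f"messages actually aimed at the bot: {troll_share(targeted):.0%} trolling")
[/code]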
-
- Emperor's Hand
- Posts: 30165
- Joined: 2009-05-23 07:29pm
Re: Microsoft deletes AI after it becomes a neo-nazi sex robot
Yeah.
There are a lot of useful experiments you can perform with an intentionally biased sample (say, by testing a medical treatment only on people who actually have the disease you intend to treat).
But when you introduce an unintended bias (say, testing a medical treatment only on retirees who have the disease you intend to treat), or when some outside party introduces a bias into your experiment, your experiment will almost certainly be useless.
This space dedicated to Vasily Arkhipov
Re: Microsoft deletes AI after it becomes a neo-nazi sex robot
I wonder why they didn't have a "control" (or two dozen) that was an undisclosed [chatbot]?
Rule #1: Believe the autocrat. He means what he says.
Rule #2: Do not be taken in by small signs of normality.
Rule #3: Institutions will not save you.
Rule #4: Be outraged.
Rule #5: Don’t make compromises.
-
- Emperor's Hand
- Posts: 30165
- Joined: 2009-05-23 07:29pm
Re: Microsoft deletes AI after it becomes a neo-nazi sex robot
Maybe they did, and it didn't make the news?
This space dedicated to Vasily Arkhipov
Re: Microsoft deletes AI after it becomes a neo-nazi sex robot
Ziggy Stardust wrote: The whole point is that it DIDN'T give accurate results, because a relatively small group of people specifically targeted the "AI" Twitter account with specific types of messages, causing it to learn from those messages rather than from an actual random sample of all Twitter messages.
It gave accurate results insofar as it showed what would happen to an immature 'AI' loose on the internet when it was exposed to the asshole behaviour that is becoming more and more common online.
I've been asked why I still follow a few of the people I know on Facebook with 'interesting political habits and viewpoints'.
It's so when they comment on or approve of something, I know what pages to block/what not to vote for.
Re: Microsoft deletes AI after it becomes a neo-nazi sex robot
Solauren wrote: It gave accurate results insofar as it showed what would happen to an immature 'AI' loose on the internet when it was exposed to the asshole behaviour that is becoming more and more common online.
Indeed - if you want a well socialized and adjusted child, you don't drop them off at a bar on a Friday night...