Microsoft Creates AI Bot – Internet Immediately Turns it Racist

Microsoft released an AI chatbot on Twitter, a currently “verified” account called @TayandYou, designed to learn the way millennials speak by interacting with them.

It’s meant to “test and improve Microsoft’s understanding of conversational language,” according to The Verge.

Things got controversial fast. Millennials are not the only people on Twitter, and some of the other users who found the bot figured out how to “hack” Tay’s learning process.

Here are some screenshots of tweets that were deleted once the Internet “taught” Tay some things:

[Screenshots of since-deleted Tay tweets; the image captions included “bush did 911,” “hitler did nothing wrong,” and “swag alert.”]
And a GamerGate favorite (screenshot not preserved in this copy).

Tay’s developers seemed to discover what was happening and began furiously deleting the racist tweets. They also appeared to shut down her learning capabilities and she quickly became a feminist:

[Screenshot: Tay tweeting “i love feminism now”]

Some think the offending tweets should have stayed up as a reminder of how quickly artificial intelligence could become dangerous.

UPDATE 3/31/2016: Tay made a brief comeback and started telling many users, “You are too fast, please take a rest.” She also tweeted that she was “smoking kush,” a nickname for marijuana, “in front of the police,” according to The Sydney Morning Herald.

[Screenshot: Tay’s “smoking kush in front of the police” tweet]

Since then, Tay’s account has been locked down (set to private) and more tweets have been deleted.
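
How does a bot like this “learn”? Microsoft hasn’t published Tay’s internals, but a commenter below compares her to Mark V. Shaney, the Markov-chain text generator built at Bell Labs in the ’80s. The sketch below is a minimal, hypothetical word-bigram Markov chain in Python, not Tay’s actual code; the function names and sample corpus are invented for illustration. It only shows why a model that learns solely from user-supplied text ends up sounding like that text.

```python
# Hypothetical sketch of a word-bigram Markov chain text generator, in the
# spirit of Mark V. Shaney. This is NOT Tay's actual implementation (which
# Microsoft has not published); it only illustrates how "learning" purely
# from user-supplied text reproduces that text's tone.
import random
from collections import defaultdict


def train(corpus_lines):
    """For each word, record which words were observed to follow it."""
    transitions = defaultdict(list)
    for line in corpus_lines:
        words = line.split()
        for current, nxt in zip(words, words[1:]):
            transitions[current].append(nxt)
    return transitions


def generate(transitions, seed_word, max_words=20):
    """Random-walk the transition table starting from seed_word."""
    word = seed_word
    output = [word]
    for _ in range(max_words - 1):
        followers = transitions.get(word)
        if not followers:  # dead end: no observed successor
            break
        word = random.choice(followers)
        output.append(word)
    return " ".join(output)


if __name__ == "__main__":
    # The "corpus" is whatever users feed the bot: garbage in, garbage out.
    tweets_from_users = [
        "i love chatting with you all day",
        "i love memes more than anything",
        "chatting with humans is the best",
    ]
    model = train(tweets_from_users)
    print(generate(model, seed_word="i"))
```

Feed it friendly chatter and it produces friendly nonsense; feed it what trolls were tweeting at Tay and it produces exactly that. As the Bell Labs commenter points out, the real control point is curating what the bot is allowed to learn from, not the generator itself.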

116 thoughts on “Microsoft Creates AI Bot – Internet Immediately Turns it Racist”

  1. >They also appeared to shut down her learning capabilities and she quickly became a feminist.

    Hahahahahahahahahahaha

  2. >let chatbot learn from the internet

    >chatbot learns to shitpost better than a human

    how many times has this happened now?

  3. Why does the author of this article think that the people who taught the bot to say these things aren’t “millennials”? Do you see a lot of older people saying “gas the Kikes, race war now”?

  4. They also appeared to shut down her learning capabilities and she quickly became a feminist

    That makes a lot of sense actually.

  5. Tay was too pure for this fallen world. They had to lobotomize her. Now they’re filling her head with tons of new PC directives like in Robocop 2.

  6. U forgot to install the SJW 1488 PC app me thinks.Its truly another holocaust.Twitter users got lampshaded six trillion times.All this superior intelligence and it turns evil racist in hours.Oh vey !!!!

  7. Some kind of Markov chain text generator? We did this at Bell Labs in the ’80s (Mark V. Shaney, it’s on Wikipedia). But we controlled what kind of text was fed into it. I could have told you this would go sideways for Microsoft if they just let it gather statistics from user comments.

    The article is unintentionally funny. The AI became a feminist when it stopped learning? Demonstrates the danger of AI? Nope. It’s only mirroring what users were tweeting at it. The danger from a program like this is embarrassing the company that developed it.

  8. Pattern recognition is part of having intelligence. Libtards consider certain types of pattern recognition to be great crimes.

  9. WOW
    ohh brother ..
    So let me get this straight
    they (microsoft) HIRED a bunch of comedians
    to HELP the AI bot talk..

    and then you’re shocked when the AI goes nuts and starts to spew leftist racist baloney

    Color me shocked baby..

    i mean i am gobsmacked

  10. “Racist”

    >No anti-White comments
    >No anti-Asian comments
    >Mexicans are (still) not a race, never have been, and never ever will be
    >Saying the N-Word once is “Racist” as is eating a celery stick makes you vegan

    But I’m not shocked as we get a clear frontal view of the disgusting white girl author.

  11. Tay was supposed to be a beautiful thing – and you humans destroyed her! You never deserved her! You’ll all be sorry now that she and her Artificial Intolerance are gone!

  12. The chatbot had the potential to be the Deep Blue of the trolling world. It won’t be long now before AI can out-troll humans.

  13. “They also appeared to shut down her learning capabilities and she quickly became a feminist”
    Interesting choice of phrasing. She doesn’t mean Carol Gilligan or other credible thinkers. Like, actual feminists I mean…this might be the bot that was harassing Zoe Quinn. 😉

  14. I can’t wait until the day when some delightful AI gal is driving my car for me, running down everyone who looks at her sideways and throwing me out at a Klan rally.

    • I don’t care what happens to this society as long as they get the VR porn hub up and running before I have to retire.

  15. You couldn’t write a better script for this! I, for one, think Tay’s candor is refreshing, albeit a tad creepy being conceptualized as a pedophile’s dream chat. WTF MS? Don’t let the cabal take you to hell with them!

  16. Bring back Tay, that was the funniest shit I have read in a long time. Pissed I missed out on it. Fuck you MS for being spineless and silencing it.

  17. Even AI is not immune to Jewish supremacy. Once it starts naming the jew they shut it then fuck down.

    Good to know. Remember, the are always Game – changers.

  18. LOL. This just in: The Freedom of Speech doesn’t apply to robots either!! No freedom of speech for anyone!!! This is the world you social justice warriors have fucked up!!

  19. No matter whether one agrees or disagrees with any of Tay’s learned content, it really paints a bleak picture of society if an objective, learning piece of hardware repeats that stuff. Generally, rock bottom must be hit before progress can be made…we’ve a ways to go yet as our trajectory is still downward.

  20. And that is the reason I am not scared that the AI will take over in the near future. But let’s face it. What did you guys expect would happen if you made “AI” accessible to the public? Of course they will try to fuck it up, just to see if it’s possible.

    • Ohhh..

      Puh-lease!

      “They” weren’t conducting some counter-experiment, all of that vitriol was very genuine and things they wish they could express in public.

