Microsoft's Brief Experiment With Tay the Teen Chatbot Is the Reason We Can't Have Nice Things

Microsoft released a chatbot aimed at 18- to 24-year-olds, designed to improve her responses by learning from what people said to her. Unfortunately, like many teens, Tay the chatbot had no filter and rapidly spun out of control. Within 24 hours, she was shut down over the horrible, racist things she was spitting out in all caps. Thankfully, the internet did not let her go without some documentation!

  • 1
    That's a Little Bossy But, Okay

    Via: @Levisan

  • 2

    She's Not Wrong

    Via: @Rachel53461

  • 3

    Uhh... Maybe That's Not a Good Idea

    Via: @NukeChem

  • 4

    Is She Part of the Celebrity Cloning Conspiracy?

    Via: John-Flynt

  • 5

    That's Not Very Nice

    Via: @Robbphoenix

  • 6

    She's Not a Fan of Ted Cruz

    Via: @godsenfrik

  • 7

    But Apparently She Loves Donald Trump

    Via: John-Flynt

  • 8

    That Actually Makes a Lot of Sense Considering...

    Via: John-Flynt

  • 9

    Tay... No.

    Via: NotANestleShill

  • 10

    Tay! NO!!!

    Via: John-Flynt

  • 11

    Something Tells Us This Is the Last We'll See of an AI Like Tay

    Via: John-Flynt
    Hopefully, anyway.