In March 2016, Microsoft launched an online chatbot called 'Tay', intended to test the limits of artificial intelligence and machine learning on social media platforms. Less than 24 hours later, Tay went off the rails.
Hosted by Corbin Davenport, guest starring Joe Fedewa.
Follow on Twitter: https://twitter.com/TechTalesShow
Follow on Mastodon/Fediverse: https://mas.to/@techtales
Support the Show: https://techtalesshow.com/support
Sources:
• https://web.archive.org/web/20160323194709/https://tay.ai/
• https://arxiv.org/abs/1812.08989
• https://www.nytimes.com/2015/08/04/science/for-sympathetic-ear-more-chinese-turn-to-smartphone-program.html
• https://www.theverge.com/2016/3/23/11290200/tay-ai-chatbot-released-microsoft
• https://www.theverge.com/2016/3/24/11297050/tay-microsoft-chatbot-racist
• https://www.businessinsider.com/microsoft-deletes-racist-genocidal-tweets-from-ai-chatbot-tay-2016-3
• https://web.archive.org/web/20170210055324/http://fusion.net/story/284617/8chan-microsoft-chatbot-tay-racist/
• https://fortune.com/2016/03/30/microsofts-tay-return/
• https://www.cnet.com/tech/mobile/microsoft-zo-chatbot-ai-artificial-intelligence/
• https://qz.com/1340990/microsofts-politically-correct-chat-bot-is-even-worse-than-its-racist-one/