Remember Microsoft Tay? The 2016 Twitter chatbot suffered from a concerted effort to teach it all manner of vile things, prompting Microsoft to withdraw the service and apologize. That experience is top of mind for pretty much everyone talking about Meta's new chatbot, now available in public testing in the U.S. The developers behind BlenderBot 3 insist that they, too, had the Tay experiment in mind when they developed the process for teaching this new chatbot to behave. In fact, the beta is designed so people can report when it says something inappropriate that it may have learned from another user. That information is used to teach it what is not appropriate. Will it work?
The post FIR #276: Should I Tay Or Should I Go? appeared first on FIR Podcast Network.
By Neville Hobson and Shel Holtz