Remember Microsoft Tay? The 2016 Twitter chatbot suffered from a concerted effort to teach it all manner of vile things, prompting Microsoft to withdraw the service and apologize. That experience is top of mind for pretty much everyone talking about Meta's new chatbot, now available in public testing in the U.S. The developers behind BlenderBot 3 insist that they, too, had the Tay experiment in mind when they developed the process for teaching this new chatbot to behave. In fact, the beta is designed so people can report when it says something inappropriate that it may have learned from another user. That information is used to teach it what is not appropriate. Will it work?
Continue Reading →
The post FIR #276: Should I Tay Or Should I Go? appeared first on FIR Podcast Network.
By Neville Hobson and Shel Holtz
