Super Data Science: ML & AI Podcast with Jon Krohn

695: NLP with Transformers, feat. Hugging Face's Lewis Tunstall

07.11.2023 - By Jon Krohn


What are transformers in AI, and how do they help developers run LLMs efficiently and accurately? This is a key question in this week’s episode, in which Hugging Face ML Engineer Lewis Tunstall sits down with host Jon Krohn to discuss encoder and decoder architectures, and the importance of continuing to foster open, democratic environments like GitHub for creating open-source models.

This episode is brought to you by the AWS Insiders Podcast (https://pod.link/1608453414), by https://WithFeeling.ai, the company bringing humanity into AI, and by Modelbit (https://modelbit.com), for deploying models in seconds. Interested in sponsoring a SuperDataScience Podcast episode? Visit JonKrohn.com/podcast for sponsorship information.

In this episode you will learn:

• What a transformer is, and why it is so important for NLP [04:34]

• Different types of transformers and how they vary [11:39]

• Why it’s necessary to know how a transformer works [31:52]

• Hugging Face’s role in the application of transformers [57:10]

• Lewis Tunstall’s experience of working at Hugging Face [1:02:08]

• How and where to start with Hugging Face libraries [1:18:27] (see the code sketch after this list)

• The necessity to democratize ML models in the future [1:25:25]
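For listeners who want a head start on the topic at [1:18:27], here is a minimal sketch of getting started with the Hugging Face transformers library via its pipeline API, one common entry point. The task, model checkpoint, and example text are illustrative choices, not specifics from the episode.

    # Minimal sketch: running a pretrained transformer with Hugging Face transformers.
    # Install first with: pip install transformers torch
    from transformers import pipeline

    # pipeline() bundles the tokenizer, model, and post-processing for a task.
    # The checkpoint below is an illustrative choice; any compatible
    # sentiment-analysis model on the Hugging Face Hub would work.
    classifier = pipeline(
        task="sentiment-analysis",
        model="distilbert-base-uncased-finetuned-sst-2-english",
    )

    # Run inference on an example sentence (illustrative text).
    result = classifier("Transformers make it easy to get started with NLP.")
    print(result)  # e.g. [{'label': 'POSITIVE', 'score': 0.99...}]

The pipeline abstraction hides tokenization and model loading; once that feels comfortable, the lower-level AutoTokenizer and AutoModel classes expose the same steps explicitly.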

Additional materials: www.superdatascience.com/695
