


In this episode of Generative AI 101, we explore how Transformers break down text into tokens. Imagine turning a big, colorful pile of Lego blocks into individual pieces to build something cool—this is what tokenization does for AI models. Emily explains what tokens are, how they work, and why they're the magic behind GenAI's impressive outputs. Learn how Transformers assign numerical values to tokens and process them in parallel, allowing them to understand context, detect patterns, and generate coherent text. Tune in to discover why tokenization is important for tasks like language translation and text summarization.
Connect with Emily Laird on LinkedIn
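To make the idea concrete, here is a minimal sketch of the token-to-number step the episode describes. It uses a simplified word-level vocabulary built from a made-up corpus; real Transformer models use subword tokenizers (such as BPE), so treat this as an illustration rather than how any particular model does it.

```python
# Minimal sketch: split text into tokens and map each token to a numeric ID,
# mirroring the episode's point that models work on numbers, not raw text.
# The corpus, vocabulary, and example sentences are hypothetical.

def build_vocab(corpus):
    """Assign each unique token an integer ID."""
    vocab = {}
    for text in corpus:
        for token in text.lower().split():
            if token not in vocab:
                vocab[token] = len(vocab)
    return vocab

def tokenize(text, vocab):
    """Convert text into the list of token IDs a model would process."""
    return [vocab[token] for token in text.lower().split() if token in vocab]

corpus = [
    "transformers break text into tokens",
    "tokens become numbers the model can process",
]
vocab = build_vocab(corpus)
print(vocab)                                    # token -> ID mapping
print(tokenize("transformers process tokens", vocab))  # e.g. [0, 10, 4]
```

Once every token is an ID, the model can look up an embedding for each one and process the whole sequence in parallel, which is what lets Transformers pick up on context and patterns across a sentence.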
By Emily Laird
