


Our guest today is Sebastian Raschka, Senior Staff Research Engineer at Lightning AI and bestselling book author.
In our conversation, we first talk about Sebastian's role at Lightning AI and what the platform provides. We also dive into two great open-source libraries they've built to train, fine-tune, deploy, and scale LLMs: PyTorch Lightning and LitGPT.
In the second part of our conversation, we dig into Sebastian's new book, "Build a Large Language Model From Scratch". We discuss the key steps needed to train LLMs, the differences between GPT-2 and more recent models like Llama 3.1, multimodal LLMs, and the future of the field.
If you enjoyed the episode, please leave a 5-star review and subscribe to the AI Stories YouTube channel.
Build a Large Language Model From Scratch Book: https://www.amazon.com/Build-Large-Language-Model-Scratch/dp/1633437167
Blog post on Multimodal LLMs: https://magazine.sebastianraschka.com/p/understanding-multimodal-llms
Lightning AI (with the PyTorch Lightning and LitGPT repos): https://github.com/Lightning-AI
Follow Sebastian on LinkedIn: https://www.linkedin.com/in/sebastianraschka/
Follow Neil on LinkedIn: https://www.linkedin.com/in/leiserneil/
---
(00:00) - Intro
(02:27) - How Sebastian got into Data & AI
(06:44) - Regressions and loss functions
(13:32) - Academia to joining Lightning AI
(21:14) - Lightning AI vs other cloud providers
(26:14) - Building PyTorch Lightning & LitGPT
(30:48) - Sebastian’s role as Staff Research Engineer
(34:35) - Build an LLM From Scratch
(45:00) - From GPT-2 to Llama 3.1
(48:34) - Long Context vs RAG
(56:15) - Multimodal LLMs
(01:03:27) - Career Advice
By Neil Leiser
