


Merged LLMs are the future, and in this episode Jon Krohn explores how with Mark McQuade and Charles Goddard of Arcee AI. Learn how to combine multiple LLMs without adding bulk, train more efficiently, and dive into different expert approaches. Discover how smaller models can outperform larger ones and how open-source projects can deliver big enterprise wins. This episode is packed with must-know insights for data scientists and ML engineers. Don't miss out!
Interested in sponsoring a SuperDataScience Podcast episode? Email [email protected] for sponsorship information.
In this episode you will learn:
• Explanation of Charles' job title: Chief of Frontier Research [03:31]
• Model Merging Technology combining multiple LLMs without increasing size [04:43]
• Using MergeKit for model merging [14:49]
• Evolutionary Model Merging using evolutionary algorithms [22:55]
• Commercial applications and success stories [28:10]
• Comparison of Mixture of Experts (MoE) vs. Mixture of Agents [37:57]
• Spectrum Project for efficient training by targeting specific modules [54:28]
• Future of Small Language Models (SLMs) and their advantages [01:01:22]
Additional materials: www.superdatascience.com/801
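To give a flavor of the model-merging topic discussed in the episode: the simplest merge method is a weighted ("linear") average of two checkpoints' parameters. This is a toy sketch only, using plain Python dicts in place of tensors; tools like MergeKit implement this and far more sophisticated methods (SLERP, TIES, DARE), and nothing here represents Arcee AI's actual implementation.

```python
def linear_merge(params_a, params_b, weight_a=0.5):
    """Merge two parameter dicts with a weighted average.

    Both models must share the same architecture (identical
    parameter names), which is a basic requirement for merging.
    """
    assert params_a.keys() == params_b.keys(), "models must share architecture"
    return {
        name: weight_a * params_a[name] + (1 - weight_a) * params_b[name]
        for name in params_a
    }

# Two tiny "models", each with a single scalar parameter
model_a = {"layer.weight": 1.0}
model_b = {"layer.weight": 3.0}

merged = linear_merge(model_a, model_b, weight_a=0.5)
print(merged["layer.weight"])  # 2.0
```

Note that the merged model is the same size as either parent, which is the key property the episode highlights: merging combines capabilities without increasing parameter count.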
By Jon Krohn • 4.6 (295 ratings)
