


How does Meta AI's natural language model, LLaMA, compare to the rest? Following the Chinchilla scaling laws, LLaMA is designed to be smaller yet more performant. How exactly does it achieve this feat? By training a smaller model on many more tokens, for longer. Discover how LLaMA stacks up against its competition, including GPT-3, in this week's episode.
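For context, here is a rough sketch of our own (not from the episode) of the idea behind that trade-off: the Chinchilla scaling law of Hoffmann et al. (2022) predicts pretraining loss from parameter count N and training-token count D. Plugging in a GPT-3-like budget versus a LLaMA-7B-like budget, using the paper's published fitted constants, suggests why a smaller model trained on far more tokens is attractive: the predicted loss lands close to the much larger model's at a fraction of the training compute, and the small model is far cheaper to run afterward.

```python
# Sketch of the Chinchilla scaling law (Hoffmann et al., 2022):
#     L(N, D) = E + A / N**alpha + B / D**beta
# with the paper's fitted constants. Model sizes and token counts
# below are illustrative, rounded figures.

def chinchilla_loss(n_params: float, n_tokens: float) -> float:
    """Predicted pretraining loss for n_params trained on n_tokens."""
    E, A, B = 1.69, 406.4, 410.7          # fitted constants from the paper
    alpha, beta = 0.34, 0.28
    return E + A / n_params**alpha + B / n_tokens**beta

def train_flops(n_params: float, n_tokens: float) -> float:
    """Common approximation: training compute C ~ 6 * N * D FLOPs."""
    return 6 * n_params * n_tokens

# GPT-3-like budget: ~175B parameters on ~300B tokens.
# LLaMA-7B-like budget: ~7B parameters on ~1T tokens.
for name, n, d in [("175B params / 300B tokens", 175e9, 300e9),
                   ("  7B params /   1T tokens", 7e9, 1e12)]:
    print(f"{name}: predicted loss {chinchilla_loss(n, d):.3f}, "
          f"~{train_flops(n, d):.1e} training FLOPs")
```

Running this predicts a loss of about 2.00 for the 175B-parameter configuration versus about 2.05 for the 7B one, while the small, long-trained model uses roughly 7.5x less training compute.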
Additional materials: www.superdatascience.com/670
Interested in sponsoring a SuperDataScience Podcast episode? Visit JonKrohn.com/podcast for sponsorship information.
By Jon Krohn · 4.6 (295 ratings)
