
How does Meta AI's natural language model, LLaMa, compare to the rest? Guided by the Chinchilla scaling laws, LLaMa is designed to be smaller yet more performant. How exactly does it achieve this feat? By training a smaller model on more data, for longer. Discover how LLaMa stacks up against its competition, including GPT-3, in this week's episode.
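For context on the Chinchilla scaling laws mentioned above, here is a brief sketch of the loss model from Hoffmann et al. (2022); the fitted constants are as reported in that paper, not from this episode:

```latex
% Chinchilla parametric loss model (Hoffmann et al., 2022):
% predicted loss L for a model with N parameters trained on D tokens.
L(N, D) = E + \frac{A}{N^{\alpha}} + \frac{B}{D^{\beta}},
\qquad E \approx 1.69,\ A \approx 406.4,\ B \approx 410.7,\
\alpha \approx 0.34,\ \beta \approx 0.28.

% For a fixed compute budget C \approx 6ND, the compute-optimal N and D
% both grow roughly as the square root of C:
N_{\mathrm{opt}} \propto C^{a}, \qquad D_{\mathrm{opt}} \propto C^{b},
\qquad a \approx b \approx 0.5,
% which works out to roughly 20 training tokens per parameter. LLaMa
% deliberately trains far past this optimum (e.g., ~1T tokens for the
% 7B model, about 140 tokens per parameter), spending extra training
% compute to get a smaller model that is cheaper at inference time.
```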
Additional materials: www.superdatascience.com/670
Interested in sponsoring a SuperDataScience Podcast episode? Visit JonKrohn.com/podcast for sponsorship information.
By Jon Krohn · 4.6 (294 ratings)
