
Phi-1 is a new, smaller language model for code with 1.3B parameters, trained on a filtered selection of web data and synthetically generated textbooks and exercises, achieving high pass@1 accuracy on the HumanEval and MBPP code benchmarks.
By Igor Melnyk
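Benchmarks like HumanEval and MBPP score a model by functional correctness: a generated completion counts as solved only if the problem's unit tests run without error. The sketch below illustrates that scoring scheme with a toy problem; the prompt, completion, and tests are illustrative, not drawn from the real benchmark, and a production harness would sandbox the execution.

```python
def run_candidate(prompt: str, completion: str, test_code: str) -> bool:
    """Execute prompt + completion, then the unit tests.

    The candidate passes iff nothing raises (HumanEval-style scoring).
    A real harness runs this in a sandbox with timeouts; exec() here
    is only for illustration.
    """
    program = prompt + completion + "\n" + test_code
    try:
        exec(program, {})  # isolated namespace, no sandboxing
        return True
    except Exception:
        return False


def pass_at_1(results: list[bool]) -> float:
    """With one sample per problem, pass@1 is the fraction solved."""
    return sum(results) / len(results)


# Toy problem: one correct and one buggy completion.
prompt = "def add(a, b):\n"
good = "    return a + b\n"
bad = "    return a - b\n"
tests = "assert add(2, 3) == 5\nassert add(-1, 1) == 0\n"

results = [
    run_candidate(prompt, good, tests),  # passes the asserts
    run_candidate(prompt, bad, tests),   # assertion error -> fail
]
print(pass_at_1(results))  # -> 0.5
```

Scoring by execution rather than text similarity is what makes these benchmarks meaningful for small models like phi-1: the completion must actually work, not merely resemble reference code.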
