

Phi-1 is a new, smaller language model for code with 1.3B parameters, trained on a filtered selection of web data together with synthetically generated textbooks and exercises, and it achieves high accuracy on the HumanEval and MBPP benchmarks despite its small size.
By Igor Melnyk
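The benchmarks mentioned above, HumanEval and MBPP, report results with the pass@k metric: the probability that at least one of k sampled completions for a problem passes its unit tests. A minimal sketch of the standard unbiased estimator (from the paper that introduced HumanEval), assuming n total samples per problem of which c are correct:

```python
from math import comb

def pass_at_k(n: int, c: int, k: int) -> float:
    """Unbiased pass@k estimator: 1 - C(n-c, k) / C(n, k),
    i.e. one minus the probability that all k drawn samples
    come from the n - c incorrect generations."""
    if n - c < k:
        # Fewer incorrect samples than k: some draw must be correct.
        return 1.0
    return 1.0 - comb(n - c, k) / comb(n, k)

# Example: 200 generations for a problem, 101 of them correct.
print(round(pass_at_k(200, 101, 1), 3))  # → 0.505
```

With k=1 the estimator reduces to the plain fraction of correct samples, c/n; the combinatorial form matters for larger k, where naively averaging over random subsets would add variance.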
