


“Microsoft is making a bet that we’re not going to need a single AI; we’re going to need many different AIs,” Sébastien Bubeck, Microsoft’s vice president of generative-AI research, tells Bloomberg senior technology analyst Anurag Rana. In this Tech Disruptors episode, the two examine the differences between a large language model such as OpenAI’s GPT-4o and a small language model such as Microsoft’s Phi-3 family. Bubeck and Rana walk through use cases for each type of model across different industries and workflows, and they compare the costs and compute/GPU requirements of SLMs and LLMs.
By Bloomberg
