


“Microsoft is making a bet that we’re not going to need a single AI; we’re going to need many different AIs,” Sebastien Bubeck, Microsoft’s vice president of generative-AI research, tells Bloomberg senior technology analyst Anurag Rana. In this Tech Disruptors episode, the two examine the differences between a large language model such as GPT-4o and a small language model such as Microsoft’s Phi-3 family. Bubeck and Rana walk through use cases for each type of model across a range of industries and workflows, and they compare the costs and compute/GPU requirements of SLMs versus LLMs.
By Bloomberg · 4.6 (1,414 ratings)