
Is pre-training a thing of the past? In Episode 34 of Mixture of Experts, host Tim Hwang is joined by Abraham Daniels, Vagner Santana and Volkmar Uhlig to debrief this week in AI. First, OpenAI cofounder Ilya Sutskever said that “peak data” has been reached; does this mean there is no longer a need for model pre-training? Next, IBM released Granite 3.1 with a slew of features, and we cover them all. Then, there is a new way to steal AI models; how do we protect against model exfiltration? Finally, can NVIDIA Jetson really make AI hardware more accessible to developers? Tune in for more!
The opinions expressed in this podcast are solely those of the participants and do not necessarily reflect the views of IBM or any other organization or entity.
00:01 — Intro
00:49 — Is pre-training over?
10:25 — Granite 3.1
22:23 — AI model stealing
33:38 — NVIDIA Jetson