
This episode of The New Stack Makers, co-hosted by Alex Williams, TNS founder and publisher, and Adrian Cockcroft, partner and analyst at OrionX.net, discusses Nvidia's GH200 Grace Hopper superchip. Industry expert Sunil Mallya, co-founder and CTO of Flip AI, weighs in on how it is reshaping hardware for AI workloads by centralizing GPU communication, reducing networking overhead, and creating a more efficient system.
Mallya notes that despite its innovative design, challenges remain in adoption due to interface issues and the need for software to catch up with hardware advances. Still, optimism persists for the future of AI-focused chips, with Nvidia leading the charge in building large-scale coherent memory systems. Meanwhile, Flip AI's DevOps large language model aims to interpret observability data to troubleshoot incidents effectively across various cloud platforms. In discussing the latest chip innovations and the challenges of training large language models, the episode sheds light on the evolving landscape of AI hardware and software integration.
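For context on what the "coherent memory" idea discussed in the episode means for developers, here is a minimal, hypothetical CUDA sketch. It is not from the episode: the kernel and variable names are illustrative, and it simply shows a single managed allocation that both the CPU and GPU touch without explicit copies; on GH200-class systems such allocations are backed by the chip's coherent CPU-GPU address space.

// Minimal sketch (hypothetical example): one allocation shared by CPU and GPU.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void scale(float *data, int n, float factor) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= factor;                 // GPU writes the shared buffer
}

int main() {
    const int n = 1 << 20;
    float *data = nullptr;

    // A single managed allocation visible to both host and device;
    // on coherent systems like GH200 no cudaMemcpy staging is needed.
    cudaMallocManaged(&data, n * sizeof(float));

    for (int i = 0; i < n; ++i) data[i] = 1.0f;   // CPU initializes in place

    scale<<<(n + 255) / 256, 256>>>(data, n, 2.0f);
    cudaDeviceSynchronize();                      // wait before the CPU reads back

    printf("data[0] = %f\n", data[0]);            // expect 2.0
    cudaFree(data);
    return 0;
}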
Learn more from The New Stack about Nvidia and the future of chip design
Nvidia Wants to Rewrite the Software Development Stack
Nvidia GPU Dominance at a Crossroads
Join our community of newsletter subscribers to stay on top of the news and at the top of your game.
Hosted by Simplecast, an AdsWizz company. See pcm.adswizz.com for information about our collection and use of personal data for advertising.