
Learn More from the AI Infrastructure Field Day Presentations
AI demands specialized data center designs due to its unique hardware utilization and networking needs. This Tech Field Day Podcast episode features Denise Donohue, Karen Lopez, Lino Telera, and Alastair Cooke. Network design has been a consistent theme in the AI infrastructure discussions at Tech Field Day events. The need for a dedicated network to interconnect GPUs differentiates AI training and fine-tuning networks from general-purpose computing. The vast power demands of high-density GPU servers point to a further need for different data centers with liquid cooling and massive power distribution. Model training is only one part of the AI pipeline; business value is delivered by AI inference, which has a different set of needs and a closer eye on financial management. Inference will likely require servers with GPUs and high-speed local storage, but not the same networking density as training and fine-tuning. Inference servers will also need to sit adjacent to the existing general-purpose infrastructure running current business applications. Some businesses may be able to fit their AI applications into their existing data centers, but many will need to build or rent new infrastructure.
Host:
Alastair Cooke, Tech Field Day Event Lead
Panelists:
Karen Lopez
Lino Telera
Denise Donohue
Follow the Tech Field Day Podcast on X/Twitter or on Bluesky and use the hashtag #TFDPodcast to join the discussion. Listen to more episodes on the podcast page of the website.
Follow Tech Field Day for more information on upcoming and current event coverage on X/Twitter, on Bluesky, and on LinkedIn, or visit our website.