


In this Intel on AI podcast episode: FPGA (field-programmable gate array) technology can offer a high degree of flexibility and performance with low latency. Yet, with steep programming requirements, limited performance optimization, and difficult power management, FPGA solutions can also be challenging to implement. Bob Anderson, General Manager of Sales for Strategic Accounts at Inspur, joins Intel on AI to talk about the Inspur TensorFlow-supported FPGA Compute Acceleration Engine (TF2). Bob explains how TF2 helps customers deploy FPGA solutions more easily and take advantage of the customization and performance of FPGAs for AI inference applications. He also describes how TF2 is especially well suited to image-based AI applications with strict real-time requirements.
To learn more, visit: inspursystems.com
Visit Intel AI Builders at: builders.intel.com/ai
