
In this Intel on AI podcast episode: FPGA (field-programmable gate array) technology can offer a high degree of flexibility and performance with low latency. Yet FPGA solutions can be challenging to implement, given the steep software development learning curve, limited scope for performance optimization, and difficult power management. Bob Anderson, General Manager of Sales for Strategic Accounts at Inspur, joins Intel on AI to talk about the Inspur TensorFlow-supported FPGA Compute Acceleration Engine (TF2). Bob explains how TF2 helps customers deploy FPGA solutions more easily and take advantage of the customization and performance of FPGAs for AI inference applications. He also describes why TF2 is especially well suited to image-based AI applications with strict real-time requirements.
To learn more, visit: inspursystems.com
Visit Intel AI Builders at: builders.intel.com/ai