


Try OCI for free at http://oracle.com/eyeonai
This episode is sponsored by Oracle. OCI is the next-generation cloud designed for every workload – where you can run any application, including any AI projects, faster and more securely for less. On average, OCI costs 50% less for compute, 70% less for storage, and 80% less for networking.
Join Modal, Skydance Animation, and today's innovative AI tech companies who upgraded to OCI…and saved.
Why is AI moving from the cloud to our devices, and what makes on-device intelligence finally practical at scale?
In this episode of Eye on AI, host Craig Smith speaks with Christopher Bergey, Executive Vice President of Arm's Edge AI Business Unit, about how edge AI is reshaping computing across smartphones, PCs, wearables, cars, and everyday devices.
We explore how Armv9 enables AI inference at the edge, why heterogeneous computing across CPUs, GPUs, and NPUs matters, and how developers can balance performance, power, memory, and latency. Learn why memory bandwidth has become the biggest bottleneck for AI, how Arm approaches its Scalable Matrix Extension (SME), and what trade-offs exist between dedicated accelerators and traditional CPU-based AI workloads.
You will also hear real-world examples of edge AI in action, from smart cameras and hearing aids to XR devices, robotics, and in-car systems. The conversation looks ahead to a future where intelligence is embedded into everything you use, where AI becomes the default interface, and why reliable, low-latency, on-device AI is essential for creating experiences users actually trust.
Stay Updated: Craig Smith on X: https://x.com/craigss Eye on A.I. on X: https://x.com/EyeOn_AI
By Craig S. Smith · 4.7 (5555 ratings)
