As generative and agentic AI become table stakes, enterprises are facing a new question: not whether to deploy AI, but where it should run. In this episode of Putting the I in AI, hosted by CIO.com’s Chris Pullam, leaders from HCLTech and Intel discuss the rise of the AI PC and why on-device intelligence is becoming a cornerstone of enterprise AI strategies.
“Hybrid AI is reality,” says Sarah Wieskus, general manager for commercial client and data center sales at Intel. “There will be workloads that make sense in the cloud, and there will be workloads that make sense on the edge or on device. Running AI where it makes the most sense delivers lower latency, better privacy, and even offline intelligence.”
Zuber Khan, who leads the Intel ecosystem business unit at HCLTech, notes that the shift is being driven as much by users as by IT. “AI consumption is no longer centralized,” he explains. “Developers, engineers, analysts, and frontline workers want intelligence embedded directly where they work. AI PCs mark the transition from asking ‘can we do AI?’ to ‘how fast can we enable thousands of users with it?’”
For regulated industries, that proximity to the user can be transformative. “In life sciences and healthcare, we deal with highly sensitive data,” says Joelien Jose, global leader for Digital Foundation Services at HCLTech. “The ability to do local inference, where data never leaves the device, makes it easier for organizations to innovate while maintaining compliance and trust.”
Throughout the conversation, all three guests emphasized that success with AI PCs requires more than simply rolling out new hardware. Leaders must align use cases to real workflows, invest in change management, and measure the outcomes that matter.
Listen to the full episode to hear how AI PCs are helping enterprises move AI from experimentation to everyday impact—and why intelligence closer to the user is shaping the future of work.