
How do you make AI inference affordable enough to deliver real ROI in the enterprise?
In this episode of The Deep View Conversations, we talk with Rob May, founder and CEO of Neurometric AI, to break down one of the most urgent challenges in AI today: the soaring cost of inference, and how to bring it down without sacrificing performance.
Today’s AI is increasingly powerful, but it’s also expensive. For enterprises to see real returns, inference costs have to drop dramatically. Neurometric believes the answer lies in “thinking algorithms” paired with small, specialized models and workload-specific optimization. This approach can significantly reduce costs while often improving accuracy and efficiency. Rob walks through how this works in practice and why it matters as AI moves from experimentation to scaled deployment.
We also talk about:
+ Why the current AI boom pulled Rob back into operating a startup after multiple exits and a move into investing
+ How founders should think about AI infrastructure, efficiency, and long-term economics
+ What startup leaders can do to get journalists to pay attention — and a pivotal early-career conversation that led to coverage that changed the trajectory of one of Rob’s companies
If you’re building, deploying, or investing in AI and wrestling with the economics of inference, this conversation offers a clear, practical perspective on what comes next.
Thank you to our sponsor, Deel, an AI-native platform for HR, IT, and payroll. Hire, manage, pay, and equip anyone, anywhere. https://www.deel.com/deepview
Subscribe to the podcast for more unique conversations with the brightest minds solving the biggest challenges in AI.
And don't miss The Deep View daily newsletter. We don’t just cover AI — we decode it. In a world flooded with hype, we deliver sharp, no-nonsense insights that keep our audience ahead of the curve and help them put AI to work every day: https://subscribe.thedeepview.com/