
AI agents are set to transform software development, but software itself isn’t going anywhere—despite the dramatic predictions. On this episode of The New Stack Makers, Mark Hinkle, CEO and Founder of Peripety Labs, discusses how AI agents relate to serverless technologies, infrastructure-as-code (IaC), and configuration management.
Hinkle envisions AI agents as “dumb robots” that handle tasks like querying APIs and exchanging data, while the real intelligence remains in the large language models (LLMs) that direct them. These agents, likely implemented as serverless functions in Python or JavaScript, will automate software development processes dynamically. Because LLMs are trained on vast amounts of open-source code, they will enable AI agents to generate bespoke, task-specific tools on the fly—a departure from traditional cloud tools from HashiCorp or configuration management tools like Chef and Puppet.
As AI-generated tooling becomes more prevalent, managing and optimizing these agents will require strong observability and evaluation practices. According to Hinkle, this shift marks the future of software, where AI agents dynamically create, call, and manage tools for CI/CD, monitoring, and beyond. Check out the full episode for more insights.
Learn more from The New Stack about emerging trends in AI agents:
Lessons From Kubernetes and the Cloud Should Steer the AI Revolution
AI Agents: Why Workflows Are the LLM Use Case to Watch
Join our community of newsletter subscribers to stay on top of the news and at the top of your game.