Diamond Bishop spent 15 years building AI systems at Microsoft (Cortana), Amazon (Alexa), and Facebook (PyTorch) before founding an AI DevOps startup that Datadog acquired. Now running Datadog's AI Skunk Works, a deliberately small interdisciplinary team modeled on Lockheed's original, he's focused on a question most enterprise AI teams aren't asking yet: what does your product look like if humans are no longer the primary customer?
That question drives everything from Bits AI, their production SRE and security agent, to a set of longer-range bets organized around three pillars: personalized agent learning, enterprise agent infrastructure, and eval. Diamond breaks down how he structures each one, why the demo-to-production gap comes down to data and eval rather than model capability, and where the real unsolved problems in agent development still sit.
Topics discussed:
Bits AI's capabilities in production across SRE incident response, security analysis, and code generation
Three-pillar agent development framework: personalized learning, enterprise infrastructure, and eval
LoRA-style adapter architecture for layering custom per-user agents on top of first-party agents
Why SRE agent startups without proprietary observability data face a structural disadvantage at production scale
Service graph and entity relationship context as a structured alternative to RAG for DevOps agents
Skunk Works team design: staying small and interdisciplinary to move like a startup inside a public company
The shift from human-operated cloud services to ambient AI-native services built to run with fewer humans over time
Crawl-walk-run path for enterprise agent adoption: from LangGraph-based Python agents to continuously learning systems
Why concentrating AI research investment in transformer scaling creates long-term architectural risk
Building agent-native tooling rather than repurposing interfaces designed for humans
By Front Lines