


In this episode of AI x DevOps, Rohit sits down with Görkem Ercan, CTO at Jozu, a company building a DevOps platform for AI agents and models. Görkem, a veteran with over two decades of software experience (including contributions to the Eclipse Foundation), explains why MLOps is fundamentally different from traditional, deterministic DevOps, and why that difference has led to extreme pipeline fragmentation.
Here are some of our favourite takeaways:
• Standardization is Key: Why OCI is the recognized standard for packaging AI/ML artifacts, and how the Model Packs project (with ByteDance, Red Hat, and Docker) is defining the artifact structure.
• Open Source Headaches: The critical challenge maintainers face when receiving large amounts of untested, verbose, AI-generated code.
• LLM Economics: Discover why running small, fine-tuned LLMs in-house can be cheaper and provide more predictable, consistent results than generic large providers.
• KitOps Solution: How KitOps creates an abstraction that allows data scientists to focus on training while leveraging existing DevOps platforms for deployment.
Tune in now to understand the standardization movement reshaping the future of AI development!
By Facets.cloud