In this episode, I sit down with industry veteran Robin Moffatt, Sr. Principal Advisor in Streaming Data Technologies (Kafka, etc.) and a longtime voice in the data engineering community, to unpack the journey from old-school data architectures to today's real-time streaming ecosystems. From early mainframe data processing and COBOL through the rise of Apache Kafka, streaming ETL, and event-driven systems, Robin shares lived experience from decades of building, scaling, and evolving data platforms.
We dive into:
* 🧠 How the role of software engineering has shifted with the rise of distributed, real-time systems
* 📊 Why event streaming and platforms like Kafka aren’t just messaging systems, but the backbone of modern data architectures
* 🚀 How the community’s tooling and mental models have had to evolve — from static databases and nightly jobs to continuous, always-on streaming applications
* 🤖 A candid look at how AI and real-time data are intersecting, shaping both tooling and expectations for the next decade
* 🔮 Robin’s perspective on where the industry is headed — beyond buzzwords toward real engineering maturity
Along the way, we get historical context, real-world lessons from conference stages and community forums, and a perspective on building resilient, scalable systems that power today’s data-rich applications.
If you’ve ever wondered how we got from batch jobs to continuous event streams, or what it really takes to build modern pipelines that support AI workflows, this conversation with Robin is a must-listen.
For more from Robin:
* 📍 Personal blog & talks: https://rmoff.net/
* 🔗 LinkedIn profile: https://www.linkedin.com/in/robinmoffatt
Thanks for reading Data Engineering Central! This post is public so feel free to share it.
By Data Engineering in Real Life