Data Engineering Central Podcast

From Industrial Data at BASF to Delta Lake Committer



In this episode, Robert Pack walks through his journey from engineering and simulation work to building large-scale data systems across 900+ plants at BASF.

We break down what those systems actually looked like, including ingestion, modeling, and the realities of batch vs real-time in industrial environments.

We also dive into:

* AI workflows for developers

* His work as a committer on Delta Lake

* Where lakehouse architecture works and where it falls short

* The transition into Developer Relations at Databricks

This is a grounded, practical conversation about what actually matters when building data platforms.

Today’s podcast is sponsored by Estuary.

Without them, content like this isn’t possible. The best way to support this Newsletter is to check out what Estuary has to offer and click the links below.

Build millisecond-latency, scalable, future-proof data pipelines in minutes.

Estuary is the Right-Time Data Platform that integrates all of the systems you use to produce, process, and consume data, and provides best-in-class CDC (Change Data Capture).

Estuary unifies today’s batch and streaming paradigms so that your systems, current and future, are synchronized around the same datasets, updating in milliseconds.

You can find Robert on LinkedIn and GitHub below.

Thanks for reading Data Engineering Central! This post is public so feel free to share it.

Come follow me on YouTube!!



This is a public episode. If you'd like to discuss this with other subscribers or get access to bonus episodes, visit dataengineeringcentral.substack.com/subscribe

Data Engineering Central Podcast, by Data Engineering in Real Life