
Turning complex datasets into meaningful analysis requires robust data infrastructure and seamless orchestration. In this episode, we’re joined by Jennifer Melot, Technical Lead at the Center for Security and Emerging Technology (CSET) at Georgetown University, to explore how Airflow powers data-driven insights in technology policy research. Jennifer shares how her team automates workflows to support analysts in navigating complex datasets.
Key Takeaways:
(02:04) CSET provides data-driven analysis to inform government decision-makers.
(03:54) ETL pipelines merge multiple data sources for more comprehensive insights.
(04:20) Airflow is central to automating and streamlining large-scale data ingestion.
(05:11) Larger-scale databases create challenges that require scalable solutions.
(07:20) Dynamic DAG generation simplifies Airflow adoption for non-engineers.
(12:13) DAG Factory and dynamic task mapping can improve workflow efficiency (see the sketch after this list).
(15:46) Tracking data lineage helps teams understand dependencies across DAGs.
(16:14) New Airflow features enhance visibility and debugging for complex pipelines.
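For listeners who want a concrete picture of the dynamic DAG generation and dynamic task mapping patterns mentioned above, here is a minimal, hypothetical sketch using Airflow's TaskFlow API. It illustrates the general techniques discussed in the episode, not CSET's actual pipeline code; the DAG names, source list and helper tasks are invented for illustration.

```python
from datetime import datetime
from airflow.decorators import dag, task

# Hypothetical ingestion config; in a DAG Factory-style setup this would
# typically live in a YAML or JSON file that non-engineers can edit.
SOURCES = ["source_a", "source_b", "source_c"]

for source in SOURCES:
    # Dynamic DAG generation: one DAG per configured source.
    @dag(
        dag_id=f"ingest_{source}",
        schedule=None,
        start_date=datetime(2024, 1, 1),
        catchup=False,
    )
    def ingest_dag(src: str = source):
        @task
        def list_partitions(s: str) -> list[str]:
            # Placeholder: discover the records or files to pull for this source.
            return [f"{s}/part-1", f"{s}/part-2"]

        @task
        def load(path: str) -> None:
            # Placeholder: fetch and load a single partition.
            print(f"loading {path}")

        # Dynamic task mapping: one mapped task instance per discovered partition.
        load.expand(path=list_partitions(src))

    # Register each generated DAG under a unique name so the scheduler discovers it.
    globals()[f"ingest_{source}"] = ingest_dag()
```

In a DAG Factory-style setup, the loop and per-source parameters would come from declarative config files rather than Python, which is what lowers the barrier for analysts who are not Airflow engineers.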
Resources Mentioned:
Jennifer Melot -
https://www.linkedin.com/in/jennifer-melot-aa710144/
Center for Security and Emerging Technology (CSET) -
https://www.linkedin.com/company/georgetown-cset/
Apache Airflow -
https://airflow.apache.org/
Zenodo -
https://zenodo.org/
OpenLineage -
https://openlineage.io/
Cloud Dataplex -
https://cloud.google.com/dataplex
Thanks for listening to “The Data Flowcast: Mastering Airflow for Data Engineering & AI.” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.
#AI #Automation #Airflow #MachineLearning