
Efficient orchestration and maintainability are crucial for data engineering at scale. Gil Reich, Data Developer for Data Science at Wix, shares how his team reduced code duplication, standardized pipelines, and improved Airflow task orchestration using a Python-based framework built within the data science team.
In this episode, Gil explains how this internal framework simplifies DAG creation, improves documentation accuracy, and enables consistent task generation for machine learning pipelines. He also shares lessons from complex DAG optimization and maintaining testable code.
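As a rough illustration of the kind of shared, framework-style helper discussed in the episode (a minimal sketch only; the function, DAG and model names below are hypothetical and not Wix's actual framework), a single factory function can generate the same Airflow task consistently across pipelines instead of copy-pasting operator definitions:

```python
# Minimal sketch of a shared task factory (hypothetical names, not Wix's framework).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def train_model(model_name: str, **context):
    # Placeholder for real training logic.
    print(f"Training {model_name} for run {context['ds']}")


def make_training_task(dag: DAG, model_name: str) -> PythonOperator:
    """Shared factory so every pipeline defines its training task the same way."""
    return PythonOperator(
        task_id=f"train_{model_name}",
        python_callable=train_model,
        op_kwargs={"model_name": model_name},
        dag=dag,
    )


with DAG(
    dag_id="example_ml_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    training_tasks = [make_training_task(dag, name) for name in ["churn", "ltv"]]
```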
Key Takeaways:
(03:23) Code duplication creates long-term problems.
(08:16) Frameworks bring order to complex pipelines.
(09:41) Shared functions cut down repetitive code.
(17:18) Auto-generated docs stay accurate by design.
(22:40) On-demand DAGs support real-time workflows.
(25:08) Task-level sensors improve run efficiency (see the sketch after this list).
(27:40) Combine local runs with automated tests.
(30:09) Clean code helps teams scale faster.
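A minimal sketch of the task-level sensor idea from the (25:08) takeaway, using standard Airflow components (the DAG and task IDs here are hypothetical): rather than gating an entire DAG on upstream data, only the task that needs it waits, so independent work can start immediately.

```python
# Hypothetical example of a task-level sensor inside a downstream DAG.
from datetime import datetime

from airflow import DAG
from airflow.operators.empty import EmptyOperator
from airflow.sensors.external_task import ExternalTaskSensor

with DAG(
    dag_id="scoring_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Independent work runs without waiting.
    prepare_features = EmptyOperator(task_id="prepare_features")

    # Only the scoring step waits for the upstream training DAG run.
    wait_for_training = ExternalTaskSensor(
        task_id="wait_for_training",
        external_dag_id="example_ml_pipeline",
        external_task_id=None,  # wait for the whole upstream DAG run
        mode="reschedule",      # free the worker slot while waiting
    )

    score_models = EmptyOperator(task_id="score_models")

    prepare_features >> score_models
    wait_for_training >> score_models
```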
Resources Mentioned:
Gil Reich
https://www.linkedin.com/in/gilreich/
Wix | LinkedIn
https://www.linkedin.com/company/wix-com/
Wix | Website
https://www.wix.com/
DS DAG Framework
https://airflowsummit.org/slides/2024/92-refactoring-dags.pdf
Apache Airflow
https://airflow.apache.org/
Astronomer Roadshow | London
https://www.astronomer.io/events/roadshow/london/
Astronomer Roadshow | New York
https://www.astronomer.io/events/roadshow/new-york/
Astronomer Roadshow | Sydney
https://www.astronomer.io/events/roadshow/sydney/
Astronomer Roadshow | San Francisco
https://www.astronomer.io/events/roadshow/san-francisco/
Astronomer Roadshow | Chicago
https://www.astronomer.io/events/roadshow/chicago/
Thanks for listening to “The Data Flowcast: Mastering Apache Airflow® for Data Engineering and AI.” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.
#AI #Automation #Airflow #MachineLearning