
Many organizations turned to HDFS to address the challenge of storing growing volumes of semi-structured and unstructured data. However, Hadoop never managed to replace the data warehouse for enterprise-grade Business Intelligence and Reporting, and most teams ended up with separate monolithic architectures, including data lakes and data warehouses, with siloed data and analytic workloads. That is why data teams are increasingly considering a data lakehouse architecture, which combines the flexibility and scalability of data lake storage with the data management, data governance, and enterprise-grade analytic performance of the data warehouse. In this episode, Jorge A. Lopez, Product Specialist for Analytics at AWS, and Dremio's Jeremiah Morrow will discuss best practices for modernizing analytic workloads from Hadoop to an open data lakehouse architecture, including: