Don't Panic! It's Just Data
Author: EM360Tech
Turn data overwhelm into data-driven success. Led by an ensemble cast of expert interviewers offering in-depth analysis and practical advice to make the most of your organization's data.
Language: en
Genres: Business, Management, Technology
How enterprises can enable the Agentic AI Lakehouse on Apache Iceberg
Episode 49
Wednesday, 29 October, 2025
"A flaw of warehouses is that you need to move all your data into them so you can keep it going, and for a lot of organisations that's a big hassle," says Will Martin, EMEA Evangelist at Dremio. "It can take a long time, it can be expensive, and you ultimately can end up ripping up processes that are there."

In this episode of the Don't Panic It's Just Data podcast, recorded live at Big Data LDN (BDL) 2025, Will Martin joins Shubhangi Dua, Podcast Host and Tech Journalist at EM360Tech. They discuss how enterprises can enable the Agentic AI Lakehouse on Apache Iceberg and why query performance is critical for efficient data analysis.

"If you have a data silo, it exists for a reason—something's feeding information to it. You usually have other processes feeding off of it. So if you shift all that to a warehouse, it disrupts a lot of your business," Martin tells Dua.

This is where a lakehouse comes into play. Organisations can federate access through a lakehouse approach: access is centralised in the organisation's lakehouse while the data stays in its original location, which helps teams get started quickly.

On data quality, accessing everything from one place, even across separate data silos, gives you visibility over all of your data. That visibility lets you identify issues, address them, and improve data quality, which benefits AI as well, Martin explains.

Lakehouse Key to AI Infrastructure?

The lakehouse has been recognised for unifying and simplifying governance. A key feature of a lakehouse is the data catalogue, which helps an organisation browse and find information; it also secures access and manages permissions.

"You can access in one place, but you can do all your security and permissions in one place rather than all these individual systems, which is great if you work in IT," reflects Martin. "There are some drawbacks to lakehouses. So, a big component of a lakehouse is metadata. It can be quite big, and it needs managing. Certain companies and vendors are trying to deal with that."

With AI and AI agents, optimising analytics on a lakehouse has become even harder. At the same time, technical barriers to access are disappearing: anyone can prompt a question. An enterprise CEO, for instance, could ask questions about the data and demand justifications directly. In the past, a request would have to be submitted, and a data scientist or engineer would create the dataset and hand it over. Now, engineers' roles have shifted towards optimisation: helping queries run smoothly and keeping tables efficient. Agents cannot assist with that.

Also Listen: Dremio: The State of the Data Lakehouse

Optimise Lakehouse

Vendors such as
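The federation idea Martin describes, one central access point over data that stays in its source systems, can be sketched in a few lines of Python. This is a toy illustration only, not Dremio's or Apache Iceberg's actual API: the `FederatedCatalog` class, its table names, and the two "silos" below are all invented for the example.

```python
# Toy sketch of federated access: one catalog resolves table names
# to their original sources, and data is read in place, never copied
# into a central warehouse. All names here are hypothetical.

class FederatedCatalog:
    def __init__(self):
        # table name -> callable that reads the data where it lives
        self._sources = {}

    def register(self, name, reader):
        """Register a silo's table without moving its data."""
        self._sources[name] = reader

    def query(self, name):
        """Read a table from wherever it lives, via one entry point."""
        return self._sources[name]()

# Two "silos" that keep their own data in place:
crm_orders = [{"id": 1, "total": 120.0}, {"id": 2, "total": 80.0}]
erp_stock = [{"sku": "A", "qty": 3}]

catalog = FederatedCatalog()
catalog.register("crm.orders", lambda: crm_orders)
catalog.register("erp.stock", lambda: erp_stock)

# Analysts hit one place; the silos and their feeds are untouched.
print(sum(row["total"] for row in catalog.query("crm.orders")))  # 200.0
```

Because every silo is visible through the same entry point, quality problems that were hidden inside individual systems become easy to spot, which is the visibility benefit described above.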











