On 2 December 2025, AI Talk host Kevin Craine was joined by Christian Hull, Chief Technology Officer, Tax Canary; Nabeel Nawaz, Global CIO M&A Leader, IBM; and Jerod Johnson, Senior Technology Evangelist, CData Software, Inc.
Views on news
Data architecture has evolved dramatically over the past two decades – and for good reason. What began as centralised data warehouses designed for structured reporting has transformed into a complex ecosystem of lakes, lakehouses, fabrics, and meshes, each responding to new challenges as they emerged. This evolution wasn’t arbitrary. Each architectural shift happened because organisations hit real limitations: exploding data volumes, new data types, stricter governance requirements or the need for faster experimentation. In this story, some may see the centralised data team itself as the problem rather than the solution.
In a data mesh architecture, by contrast, ownership moves away from the centralised data team to subject matter experts – whereas previously, weeks could pass before the data team landed the necessary data in a data lake. To put data into the hands of those who create value from it, businesses can adopt technologies and introduce processes that ensure real-time access through either a centralised data layer or a real-time iPaaS (integration platform as a service). Removing the divide between the business and technology sides of the company can enable better data practices.
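To make the ownership shift concrete, here is a minimal sketch in Python – every name in it (MeshCatalogue, DataProduct, the finance domain) is a hypothetical illustration, not something from the discussion – in which domain teams publish their own data products to a shared catalogue that consumers query directly, instead of waiting for a central team to load a lake.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class DataProduct:
    """A domain-owned dataset, published with its own owner and contract."""
    domain: str
    name: str
    owner: str
    fetch: Callable[[], List[dict]]  # the domain team controls how data is served

class MeshCatalogue:
    """Minimal catalogue: domains register products, consumers self-serve."""
    def __init__(self) -> None:
        self._products: Dict[str, DataProduct] = {}

    def register(self, product: DataProduct) -> None:
        self._products[f"{product.domain}.{product.name}"] = product

    def query(self, key: str) -> List[dict]:
        return self._products[key].fetch()

# The finance domain publishes its own product -- no central data team in the loop.
catalogue = MeshCatalogue()
catalogue.register(DataProduct(
    domain="finance",
    name="invoices",
    owner="finance-data@example.com",
    fetch=lambda: [{"invoice_id": 1, "amount": 120.0}],
))
print(catalogue.query("finance.invoices"))
```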
How to achieve data readiness
The three key components of good data management are data strategy, delivery and operations. Some of the data a business collects is of little or no value, so first you have to identify where your useful data sits. The next stage is data operations, including governance: understanding what you can tag and which data contains PII (personally identifiable information) or SPI (sensitive personal information). This approach also helps you avoid the “garbage in, garbage out” pitfall.
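As a sketch of what the tagging step might look like in practice – the detection rules and field names below are illustrative assumptions, far simpler than real governance tooling – a single pass over sampled rows can flag which columns appear to contain PII:

```python
import re

# Hypothetical detection rules -- production governance tools would be far richer.
PII_PATTERNS = {
    "email": re.compile(r"[^@\s]+@[^@\s]+\.[^@\s]+"),
    "phone": re.compile(r"\+?\d[\d\s\-]{7,}\d"),
}

def tag_columns(rows: list[dict]) -> dict[str, set[str]]:
    """Return a {column: tags} map by checking sampled values against PII patterns."""
    tags: dict[str, set[str]] = {}
    for row in rows:
        for column, value in row.items():
            for tag, pattern in PII_PATTERNS.items():
                if isinstance(value, str) and pattern.search(value):
                    tags.setdefault(column, set()).add(tag)
    return tags

sample = [
    {"name": "A. Smith", "contact": "a.smith@example.com", "note": "renewal"},
    {"name": "B. Jones", "contact": "+44 20 7946 0000", "note": "new lead"},
]
print(tag_columns(sample))  # {'contact': {'email', 'phone'}}
```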
Legal is one of the sectors that works with huge amounts of unstructured data. The crucial choice, though, is between real-time and batch data delivery, and the latter – particularly micro-batches, run every minute or every 15 seconds – fits most use cases better, with only a few genuinely calling for real-time. Real-time, meanwhile, is mostly “call and response”: you must ask a question to get the fresh data. Fraud detection, where real-time data plays a key role, relies more on machine learning and pattern recognition than on LLMs.
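To illustrate the micro-batch pattern, the sketch below (the interval and event source are assumptions made for the example) drains an event buffer on a fixed cadence and processes each drain as one batch, rather than handling every event the moment it arrives:

```python
import time
from collections import deque

def run_micro_batches(buffer: deque, interval_seconds: float, cycles: int) -> None:
    """Drain the buffer on a fixed cadence, processing each drain as one batch."""
    for _ in range(cycles):
        time.sleep(interval_seconds)
        batch = [buffer.popleft() for _ in range(len(buffer))]
        if batch:
            print(f"micro-batch of {len(batch)} events")

events: deque = deque()
events.extend({"id": i} for i in range(5))   # events arriving between ticks
run_micro_batches(events, interval_seconds=0.1, cycles=1)  # e.g. 15s in production
```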
The key design principles for building a sustainable data architecture are asking what data connectivity you need for your project and whether you can build that in-house or must procure it from a third party – CData, for instance, is a data connectivity solution. Typically, 20% of your data needs to be stored long term in a data warehouse, and this is what you must build data pipelines for; the rest can be handled in a call-and-response fashion. The tools you use will have access to the data warehouse. In terms of connectivity tech, you must decide whether a quick but brittle solution that won’t scale is acceptable for your use case, and whether data must stay on-premises or can live in the cloud as well.
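A rough sketch of that split follows, with hypothetical dataset names and a stubbed live connector: datasets flagged for long-term analysis flow through a pipeline into the warehouse, while everything else is fetched call-and-response at query time.

```python
# Hypothetical illustration of the warehouse-vs-live split described above.
WAREHOUSE_DATASETS = {"sales_history", "customer_master"}  # the ~20% kept long term

warehouse: dict[str, list[dict]] = {}

def pipeline_load(dataset: str, rows: list[dict]) -> None:
    """Pipeline path: persist curated data into the warehouse."""
    warehouse.setdefault(dataset, []).extend(rows)

def live_query(dataset: str) -> list[dict]:
    """Call-and-response path: fetch from the source system on demand (stubbed)."""
    return [{"dataset": dataset, "fetched": "just now"}]

def read(dataset: str) -> list[dict]:
    if dataset in WAREHOUSE_DATASETS:
        return warehouse.get(dataset, [])
    return live_query(dataset)

pipeline_load("sales_history", [{"order": 1, "amount": 99.0}])
print(read("sales_history"))    # served from the warehouse
print(read("support_tickets"))  # fetched call-and-response
```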
The panel’s advice
