Business Reporter

Moving to a unified data stack

Dael Williamson at Databricks makes the case for a unified data stack in the age of AI agents 

Many organisations still operate on architectures in which enterprise data is split between operational systems (OLTP) and analytical systems (OLAP). This separation was dictated by legacy infrastructure, which made it difficult to run day-to-day applications and analytical workloads on the same platform. Today, that divide creates friction, waste and delays across teams.

 

This split produced a disconnect, as developers focused on keeping applications running while analysts were stuck working with data that was either outdated or incomplete. Modern cloud architecture has removed some of the technical barriers, yet the divide continues, driven by legacy software, vendor lock-in and established ways of working. It’s time to rethink this approach and put in place a unified data stack that reflects the emergence of AI agents and applications.

Confronting the legacy bottleneck 

Once data lands in a transactional system, it becomes hard and expensive to move. Proprietary storage formats and tightly coupled architectures trap data inside operational systems and block integration with modern data and AI workflows. Organisations end up working around infrastructure that no longer fits their needs.

 

Today’s AI agents and applications require fast, reliable access to live data. But when operational data is stuck in legacy environments, enabling automation, personalisation or real-time decision-making becomes much harder. This not only slows development but also limits responsiveness, scalability and the ability to extract timely insights from rapidly growing data volumes.

 

More organisations are now seeking alternatives that remove these constraints and offer a unified, responsive foundation for modern data-driven systems.

The gap between operations and analytics 

The original OLTP/OLAP split made sense when compute was limited. Running analytics alongside operational workloads simply wasn’t viable. But with cloud-native storage, such as open table formats, organisations no longer need separate pipelines to make operational data available for analytics. And yet many enterprises still rely on architectures where operational data must be extracted, transformed and loaded before it can be analysed, introducing delays, duplication and overhead.
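The extract-transform-load overhead described above can be illustrated with a toy sketch (all names here are hypothetical, not any vendor's API): operational rows must be duplicated into a separate analytics copy by a batch job, so anything written after the last batch is invisible to analysts until the next one runs.

```python
from dataclasses import dataclass, field

@dataclass
class Order:
    order_id: int
    amount: float

@dataclass
class OperationalStore:
    """Stands in for an OLTP system holding live orders."""
    rows: list = field(default_factory=list)

    def insert(self, order):
        self.rows.append(order)

@dataclass
class AnalyticsStore:
    """Stands in for an OLAP copy, refreshed only by an ETL batch."""
    rows: list = field(default_factory=list)

def run_etl_batch(oltp, olap):
    """Extract-transform-load: duplicate every operational row into the analytics copy."""
    olap.rows = list(oltp.rows)

oltp = OperationalStore()
olap = AnalyticsStore()

oltp.insert(Order(1, 120.0))
run_etl_batch(oltp, olap)

# A new order arrives after the last batch: the analytics copy is now stale.
oltp.insert(Order(2, 80.0))

print(len(oltp.rows))  # 2 live rows
print(len(olap.rows))  # 1 row - analysts see pre-batch data until the pipeline runs again
```

The duplication (two copies of every row) and the staleness window between batches are exactly the delays and overhead the article describes; a unified foundation removes the copy step entirely.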

 

The impact is significant. Analysts base decisions on outdated information. Developers spend time maintaining fragile pipelines instead of building new capabilities. Innovation slows and opportunity costs mount.

 

In response, more organisations are moving to unified data architectures, where operational and analytical workloads share a single data foundation, utilising engines optimised for each specific task. This reduces complexity, improves efficiency and enables faster iteration—all critical benefits in the AI era.

Preparing the stack for intelligent agents 

AI agents are driving a step-change in application development. These intelligent systems can perform complex, multi-step tasks by reasoning over proprietary data and interacting with other components in real time. With the ability to coordinate decisions and actions throughout an entire data ecosystem, these technologies are evolving beyond basic automation to become fundamental parts of organisational operations.

 

To support this shift, infrastructure must evolve. AI agents need low-latency access to live data, smooth integration across systems and modern development workflows. A new concept known as a lakebase tackles these problems head-on. It delivers the reliability of an operational database and the openness of a data lake in one place, so teams can run transactions and analytics without juggling systems. It gives fast access to data, scales easily through separated storage and compute, and fits modern development habits like instant branching and versioning. Built for today’s AI-driven workloads, a lakebase lets both developers and AI agents build, test, and ship applications quickly, without the constraints of old OLTP setups.
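The "instant branching" habit mentioned above can be sketched as copy-on-write versioning: creating a branch is just a new reference to an existing immutable snapshot, so it costs the same regardless of table size, and writes on the branch never touch production data. This is an illustrative toy, not Databricks' actual lakebase implementation.

```python
class VersionedTable:
    """Toy copy-on-write versioned table: each branch points at an immutable snapshot."""

    def __init__(self):
        self._snapshots = {"main": ()}  # branch name -> immutable tuple of rows

    def append(self, branch, row):
        # Writing produces a new snapshot; earlier snapshots stay untouched.
        self._snapshots[branch] = self._snapshots[branch] + (row,)

    def branch(self, source, name):
        # "Instant" branch: a new reference to the existing snapshot,
        # with no copying of the underlying rows.
        self._snapshots[name] = self._snapshots[source]

    def rows(self, branch):
        return self._snapshots[branch]

t = VersionedTable()
t.append("main", {"id": 1, "status": "paid"})

# An AI agent tries a risky change on its own branch...
t.branch("main", "agent-experiment")
t.append("agent-experiment", {"id": 2, "status": "refunded"})

print(len(t.rows("main")))              # 1 - production data unaffected
print(len(t.rows("agent-experiment")))  # 2 - the branch sees its own writes
```

Because snapshots are immutable and shared, branching and versioning become cheap enough for agents and developers to build and test against live data without risking the production copy.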

Designing the next-generation data platform

It’s becoming clear that a unified data stack is the future of modern systems. As AI becomes integrated into all aspects of a business, infrastructure that removes silos and brings both operational and analytical systems together will become essential to enable teams to innovate and grow without limitations.

 

Because of their rigid, complex architectures, legacy OLTP systems are out of step with what AI-driven organisations demand. Unified, open platforms that support transactional operations and real-time intelligence without compromise are vital for AI-native applications.

 

This shift will take time, but organisations that start to reduce fragmentation, adopt open standards and build for agent-driven systems will be best placed to succeed in the AI age. 

Dael Williamson is EMEA CTO at Databricks

 

Main image courtesy of iStockPhoto.com and alexsl

© 2025, Lyonsdown Limited. Business Reporter® is a registered trademark of Lyonsdown Ltd. VAT registration number: 830519543