How technology and familiarity are paving the way for real-time data system adoption in IoT and beyond
Some of the most impressive AI-driven solutions today are powered by real-time data. Whether it is fraud-detection systems identifying anomalies, marketing tools tailoring recommendations to a customer's current behaviour and sentiment, or usage-based insurance policies that activate only when an item is in use, all of them rely on processing data in real time.
In all these cases, data is collected from a multitude of sources – sensors, applications, enterprise systems – and aggregated within milliseconds.
From data update to decision-making and action (such as sending alerts or automated responses), the whole process typically takes one to two seconds. For the most critical applications, response times must stay under one second; for general IoT use cases, they can stretch to five seconds.
The streaming ecosystem and EDA
Although technologies based on real-time data streaming are now reaching maturity, they did not emerge as the latest stage in the evolution of traditional data processing and storage techniques; rather, they evolved independently as streaming data systems.
Legacy extract-transform-load (ETL) systems manage data in batches: they extract data, transform it into the required format and load it into a data warehouse.
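As a rough illustration of the batch model, the minimal Python sketch below runs all three steps in a single scheduled pass (the CSV source, field names and SQLite warehouse are stand-ins for real systems):

```python
# A minimal batch-ETL sketch (illustrative only): extract rows from a
# hypothetical CSV export, transform them into the target format, then
# load them into a warehouse table in one scheduled pass.
import csv
import sqlite3

def run_nightly_batch(csv_path: str, db_path: str) -> None:
    # Extract: read the whole source file at once
    with open(csv_path, newline="") as f:
        rows = list(csv.DictReader(f))

    # Transform: normalise fields into the (assumed) warehouse schema
    records = [(r["order_id"], float(r["amount"])) for r in rows]

    # Load: write the complete batch into the warehouse table
    with sqlite3.connect(db_path) as db:
        db.execute("CREATE TABLE IF NOT EXISTS orders (order_id TEXT, amount REAL)")
        db.executemany("INSERT INTO orders VALUES (?, ?)", records)
```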
In contrast, real-time data streaming systems must collect, aggregate and enrich captured data as it is generated by often thousands or even millions of sensors, cameras or smart appliances – particularly in large-scale IoT deployments.
Managing this complexity requires a dedicated data-processing architecture, one that allows the seamless integration of diverse data sources and real-time decision-making. In event-driven architecture (EDA), data is treated as a stream of events, produced by event creators, processed by event brokers and consumed by event consumers. Event creators might include IoT devices, business activity monitoring systems or user interfaces.
Event brokers receive incoming events and route them to the appropriate event consumers based on predefined rules. Event consumers – which may be applications, services, devices or people – then act on the events, updating databases, triggering workflows or sending outputs to data analytics engines, dashboards or user interfaces.
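To make the pattern concrete, here is a minimal Python sketch of the creator-broker-consumer flow; all names are illustrative, and the in-process broker stands in for a dedicated system such as Apache Kafka or an MQTT server:

```python
# A minimal event-driven sketch: producers publish events to a broker,
# which routes each event to the consumers subscribed to its topic.
from collections import defaultdict
from typing import Callable

Event = dict
Consumer = Callable[[Event], None]

class Broker:
    def __init__(self) -> None:
        self.subscribers: dict[str, list[Consumer]] = defaultdict(list)

    def subscribe(self, topic: str, consumer: Consumer) -> None:
        self.subscribers[topic].append(consumer)

    def publish(self, topic: str, event: Event) -> None:
        # Route the event to every consumer registered for this topic
        for consumer in self.subscribers[topic]:
            consumer(event)

broker = Broker()
broker.subscribe("sensor.temperature", lambda e: print("dashboard:", e))
broker.publish("sensor.temperature", {"device": "sensor-42", "celsius": 21.5})
```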
To translate this process into the use cases mentioned earlier: a cluster of large-value payment events is routed to fraud-detection consumer systems, which trigger alerts. Customer clicks and views on an online marketplace become events that generate personalised product recommendations.
IoT devices embedded in telematics-equipped cars send speed, braking and mileage data, which event brokers forward to the servers of telematics service providers and insurers – both consumers of these data events.
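A predefined routing rule of the kind a broker might apply in the fraud-detection case could look like the following sketch (the threshold, field names and consumers are assumptions for illustration):

```python
# Sketch of a predefined routing rule: every payment event reaches the
# ledger consumer, while large-value payments are additionally routed
# to the fraud-detection consumer to trigger an alert.
FRAUD_THRESHOLD = 10_000.0  # assumed cut-off for "large-value"

def ledger_consumer(event: dict) -> None:
    print("ledger recorded:", event)

def fraud_consumer(event: dict) -> None:
    print("fraud alert raised for:", event)

def route_payment_event(event: dict) -> None:
    ledger_consumer(event)
    if event["amount"] >= FRAUD_THRESHOLD:
        fraud_consumer(event)

route_payment_event({"account": "A-17", "amount": 25_000.0})
```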
EDAs present an alternative architectural design to application programming interfaces (APIs) for enabling data integration and communication between systems. While APIs provide standardised, synchronous interfaces for system communication, event-driven architecture offers looser, asynchronous integration between components.
In EDA, the decoupling of event producers and consumers means that one part of the system can be modified or updated without affecting the rest, allowing for far greater flexibility and scalability. This approach also enables multiple analytics consumers to process the same data stream independently.
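The sketch below illustrates that decoupling under simplified assumptions: each consumer subscribes to its own queue, so new analytics consumers can be added without touching the producers, and each processes the same stream independently and at its own pace:

```python
# Sketch of decoupled fan-out: each consumer receives its own copy of
# every event via a private queue, so consumers can be added, removed
# or upgraded without affecting producers or one another.
from queue import Queue

class FanOutBroker:
    def __init__(self) -> None:
        self.queues: list[Queue] = []

    def add_consumer(self) -> Queue:
        # New consumers subscribe without any change to the producers
        q: Queue = Queue()
        self.queues.append(q)
        return q

    def publish(self, event: dict) -> None:
        # Every consumer gets its own copy of the event
        for q in self.queues:
            q.put(event)

broker = FanOutBroker()
dashboard_q = broker.add_consumer()
analytics_q = broker.add_consumer()
broker.publish({"clicks": 3})
print(dashboard_q.get(), analytics_q.get())  # both see the same event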
The nature of data flow in real-time systems also demands a different approach to storage. To ensure events are processed in the correct sequence and can be retraced if needed, EDAs rely on specialised, time-stamped databases. These systems automatically update time stamps with each event, eliminating the need for manual data refreshes.
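A toy version of such a store, with automatic time-stamping on arrival and replay for retracing events (a real deployment would use a purpose-built time-series or log store), might look like this:

```python
# Sketch of a time-stamped, append-only event store: each event is
# stamped automatically as it arrives, so the sequence can be
# replayed later for auditing or recovery.
import time

class EventLog:
    def __init__(self) -> None:
        self.events: list[tuple[float, dict]] = []

    def append(self, event: dict) -> None:
        # The store stamps each event on arrival; no manual refresh
        self.events.append((time.time(), event))

    def replay(self, since: float = 0.0) -> list[dict]:
        # Retrace events in arrival order from a given point in time
        return [e for ts, e in self.events if ts >= since]

log = EventLog()
log.append({"device": "cam-7", "status": "online"})
print(log.replay())
```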
Considering trade-offs
However, while well suited to high-volume, real-time environments, event-driven systems come with their own challenges. Their distributed nature, which comes in handy when managing huge waves of real-time data, can create complexities that make tracing events along their life cycle daunting, especially when troubleshooting to find the root cause of a malfunction.
Sequencing errors can also occur if the system fails to maintain the correct order of events under heavy load.
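One common safeguard, sketched below with assumed field names, is to attach per-producer sequence numbers so that a consumer can flag gaps or reordering as they occur:

```python
# Sketch of per-producer sequence checking: the consumer tracks the
# last sequence number seen from each producer and flags any gap or
# reordering, which can surface under heavy load.
last_seen: dict[str, int] = {}

def check_order(event: dict) -> bool:
    producer, seq = event["producer"], event["seq"]
    expected = last_seen.get(producer, -1) + 1
    last_seen[producer] = max(last_seen.get(producer, -1), seq)
    if seq != expected:
        print(f"ordering anomaly from {producer}: got {seq}, expected {expected}")
        return False
    return True

check_order({"producer": "sensor-1", "seq": 0})
check_order({"producer": "sensor-1", "seq": 2})  # flags a skipped event
```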
According to Paul Butterworth, Co-founder and CTO of VANTIQ, the inflection point for real-time data processing in the 2020s was driven by a combination of factors. Technological advancements such as faster networks, enhanced computing power and the rise of edge computing laid the foundation.
However, an equally important factor was the growing familiarity of IT experts with these new architectures.
For a long time, many developers continued to build traditional IT applications simply because they were not yet familiar with the special demands of real-time systems. But as their understanding deepened and real-time architectures became easier to deploy, the industry saw a significant acceleration in adoption.
Today, with the support of event-driven architecture and a more open developer community, real-time data systems are more accessible than ever before.
Yet the classic rule of technology procurement still holds true: businesses must ensure that any technology they adopt fully addresses their real use case, and that they avoid investing in shiny new solutions that will see only limited practical use.