Back To The Future of Real-Time Applications
For the past few decades, operations and analytics have been firmly separated in enterprise architectures, with different systems serving different needs, even as the rise of Big Data and Data Lakes has blurred the line between “real-time operations” and “batch-driven analytics.”
But in-memory hybrid transaction/analytical processing (HTAP) systems are now allowing organizations to do real-time, powerful analytics directly on day-to-day operational data, using a single architecture.
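To make the HTAP idea concrete, here is a minimal sketch, not SAP HANA or any specific HTAP product, using Python’s built-in SQLite in-memory database. The table and column names are invented for illustration. The point is simply that transactional writes and analytical queries hit the same in-memory store, with no separate batch ETL pipeline in between:

```python
import sqlite3

# A toy stand-in for an in-memory HTAP store: one database serves both
# operational writes and analytical reads. All names here are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, amount REAL)"
)

# Operational side: individual transactions land as they happen.
with conn:
    conn.executemany(
        "INSERT INTO orders (region, amount) VALUES (?, ?)",
        [("EMEA", 120.0), ("APAC", 75.5), ("EMEA", 300.0)],
    )

# Analytical side: the same live data is aggregated immediately,
# with no copy to a separate warehouse and no batch-window lag.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('APAC', 75.5), ('EMEA', 420.0)]
```

Real HTAP platforms add columnar storage, massive parallelism, and transactional isolation at scale, but the architectural simplification is the same: one system, one copy of the data, for both workloads.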
This idea is far from new — for example, here’s the text from a Univac computer ad in 1956:
Image source: http://www.dvq.com
“…there’s only one commercially available platform capable of real time performance… It’s the ideal system for…simulation and on-line data reduction. It solves complex problems from purely sensed data at speeds that are compatible with real-time control.
Because of its ability to reduce large volumes of data at tremendous speeds, the…system easily handles even the most difficult problems.
Furthermore, it offers many other outstanding characteristics, including: superb operating efficiency, large capacity, great versatility, the ability to interface with a wide variety of different types of data, and far greater reliability than any other computer of its type….”
Just like some of today’s “internet of things” systems, the Univac scientific machine was designed to process information in-memory, directly from sensors (it could also access information from magnetic tape or punch cards).
Note how perfectly the ad ticks all the modern buzzword boxes, promising the 3Vs of big data (volume, variety, and velocity), along with powerful analytics and data compression.
Unfortunately for businesses in the 1950s, growing data volumes and cheaper disk storage quickly upended the economics of computing — and we have had to deal with the complexity and latency of separate operational and analytic systems ever since.
We’re now seeing a new tipping point. The falling price of memory and the rising cost of complexity mean that in-memory platforms like SAP HANA are once again the simplest and cheapest way to run a business.
The result is that although moving to HTAP entails “upheaval in the established architectures” according to Gartner, we’re actually getting back to the fundamentals of what we’ve always wanted.
And what could be simpler than that?
[This post first appeared in my Business Analytics Blog]