Enterprises need to exploit data to make “in the moment” decisions that drive the real-time enterprise. However, the typical enterprise keeps its data in many databases. Most of these are transactional, some are analytic and a few are very large enterprise data warehouses. To enrich operational applications and the transactions they process with up-to-the-moment business intelligence, enterprises need to find ways to bring their data together from different sources and perform both analytics and transactions on a single platform.

Although the initial benefit of such a blended platform is that of running analytic applications against live operational data, this capability promises much more. A new class of application is on the horizon; one that performs analytics on a mixture of live operational data and contextual business intelligence data during the course of performing business operations, and modifies those operations based on the results of such analyses. In effect, data-driven intelligence is being integrated into what were formerly transaction-only processes, and the result is a new class of application that is highly flexible and responsive to changing business conditions. Such functionality can be enabled only by an analytic-transactional data platform.

The Memory-Optimized Core

At its core, the analytic-transactional data platform has a memory-optimized database of the most relevant and active data, running on a database management system (DBMS) optimized for blending transactional and analytic data operations.

The memory-optimized core must blend the capability to perform transactions with the ability to perform complex analytic queries at high speed. Although there are many approaches to delivering this capability today, technology trends suggest that over time three will prevail:

  1. The core database keeps data simultaneously in both row and column format, synchronized internally. In this approach, data subject to frequent row-wise updates is held in the row store; data for high-performance complex queries is kept in the column store; and data that fits both kinds of operations is present in both stores.
  2. The core database has transactional data moving through a delta store that is optimized to update the column store, which is the main structure. The delta store is used to ensure fast transaction processing and hold the data for queries until it can be properly blended into the appropriate columns. This approach is needed in cases where the transactional and analytic data cannot be separated cleanly. Also, because the data is in a single structure in memory, there is no duplication (meaning less memory is required), and there is no latency due to synchronization between row and column forms of the data.
  3. The core database holds the data in atomic elements that can be resolved as table structures, optimizing both transactions and analytic queries. This approach serves similar use cases to approach 2 but involves a very different internal architecture from the norm. This approach holds promise for a best-of-both-worlds capability but has not been fully tested against enough real-world examples to be considered a proven general solution.
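The delta-store design in approach 2 can be illustrated with a minimal sketch. The class and method names below are hypothetical, and real column-store DBMSs implement the merge with far more sophistication (versioning, compression, concurrency control); the sketch only shows the essential data flow: writes land in a row-oriented delta, analytic reads scan both structures, and a periodic merge folds the delta into the column store.

```python
class DeltaColumnStore:
    """Sketch of a delta-store architecture (approach 2): writes go to a
    row-oriented delta; the column store is the main structure; queries
    scan both; a merge folds delta rows into the columns."""

    def __init__(self, columns):
        self.columns = columns
        self.main = {c: [] for c in columns}  # column store (read-optimized)
        self.delta = []                       # row-oriented delta (write-optimized)

    def insert(self, row):
        # Fast transactional path: append the whole row to the delta only.
        self.delta.append(row)

    def scan(self, column):
        # Analytic reads must also see delta rows not yet merged.
        return self.main[column] + [row[column] for row in self.delta]

    def merge(self):
        # Periodic merge: fold each delta row into the appropriate columns.
        for row in self.delta:
            for c in self.columns:
                self.main[c].append(row[c])
        self.delta.clear()


store = DeltaColumnStore(["product", "qty"])
store.insert({"product": "widget", "qty": 3})
store.insert({"product": "gadget", "qty": 5})
print(sum(store.scan("qty")))  # 8 -- the query sees unmerged delta rows
store.merge()
print(sum(store.scan("qty")))  # 8 -- same result after the merge
```

Note that queries return identical results before and after the merge; the merge exists purely to keep the read-optimized columnar structure current, which is why there is no synchronization latency between separate row and column copies of the data.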

Some database vendors, in addition to providing an analytic-transactional data platform, also provide advanced analytic capabilities such as predictive, temporal, spatial, streaming, text analytics, search and graph capabilities without duplicating data. These additional capabilities not only enable better analytics but also help to simplify the IT landscape.

In seeking to develop the right analytic-transactional data platform strategy, it makes sense to favor a vendor that can offer the full range of analytic-transactional database platform technology, especially one with which the enterprise already has a business relationship. This sidesteps practical issues such as data conversion and staff retraining, and simplifies operations, configuration management and contract management. Enterprises should choose a single vendor as the focal point for building an analytic-transactional data platform: one that has both technology that works today and a vision for where this functionality is going. To make this approach a reality, the chosen vendor should have a plan for delivering not just the platform but also the analytic-transactional application functionality, either on its own or through partners.

To learn more about the analytic-transactional platform and its role in realizing the real-time enterprise, please see the related IDC study.

  1. Dan Lahl

    Thanks Carl for such a clear picture of how managing transactions and analytics together in one system makes the case for real-time processing and true business transformation.

  2. Ajay Kalra

    Thanks Carl. This is a great read. But a question keeps coming to mind when we talk about these recent analytic-transactional technologies taking center stage: how should the data platform evolve to incorporate the much-needed "warehousing" of data? A shift to memory means rising costs, and data warehousing remains a basic requirement for any organization with its data in many disparate systems. Can the data platforms of the future manage ever-growing volumes of data while also providing best-in-class analytic and transactional features? Even with the redundancy of row and column stores, aren't these platforms overshooting without consideration for data warehouse needs?

