Planning for the future is an essential part of success. The database and data management market is constantly changing and evolving, and we must evaluate the changes we are seeing to understand where they will take us next. With that in mind, I’d like to share three market predictions my team and I believe will have a significant impact in the coming years.

  1. Expectations for database systems have expanded beyond relational to include alternative models.


Non-relational database technologies, such as NoSQL and Hadoop, have emerged over the last few years. The expectation now, however, is that leading database platforms can provide a wider range of capabilities and address the broader range of use cases and workloads that these non-relational technologies have enabled.

This has resulted in a “new normal” definition of capabilities for a general-purpose database platform: support for new data types and multiple data models, in-memory processing, data virtualization, distributed storage, and extended capabilities such as graph and spatial. Customers are looking for a modern database platform that can natively support these additional workloads and functions.
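To make the multi-model idea concrete, here is a minimal sketch of one engine answering both a relational query and a JSON-document query in a single statement. It uses Python’s built-in sqlite3 purely as a stand-in engine, and all table and column names are invented for illustration; a full multi-model platform would extend the same idea to graph and spatial data.

```python
import sqlite3

# One engine serving two data models: relational rows and JSON documents.
# sqlite3 is only a stand-in here; json_extract requires a SQLite build with
# JSON functions (standard in recent versions). All names are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("CREATE TABLE events (customer_id INTEGER, payload TEXT)")  # JSON as text

conn.execute("INSERT INTO customers VALUES (1, 'Acme Corp')")
conn.execute("INSERT INTO events VALUES (?, ?)",
             (1, '{"type": "login", "device": "mobile"}'))

# A single query mixing the relational model (join) and the document model (JSON path).
row = conn.execute(
    """
    SELECT c.name, json_extract(e.payload, '$.device') AS device
    FROM customers c JOIN events e ON e.customer_id = c.id
    WHERE json_extract(e.payload, '$.type') = 'login'
    """
).fetchone()
print(row)  # ('Acme Corp', 'mobile')
```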

  2. Real-time analytics on transactional data will see a rise in demand.


Driven by the need to perform “transaction window” analytics with a simplified technology architecture, hybrid transactional analytic database systems, enabled by in-memory technology, are seeing increasing adoption.

Increasing demand to support operational workloads that incorporate real-time analysis, such as recommendations, targeting, and fraud analysis, is driving this adoption. Industry analysts are recognizing the trend: Gartner uses the term “hybrid transaction/analytical processing (HTAP),” IDC uses “analytic transactional processing (ATP),” and Forrester uses “translytical data platforms,” for which it recently published its first Wave report, naming SAP a leader. In-memory technology is the key enabler of hybrid transactional analytic databases, and it brings the added benefit of architectural simplicity: one system to maintain, with no data movement.
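As a rough illustration of the hybrid transactional/analytical pattern, the sketch below keeps a single in-memory table that accepts transactional inserts and is immediately queried for an analytic aggregate, with no ETL hop in between. It uses sqlite3 only to stand in for an in-memory HTAP engine; the schema, window, and fraud threshold are invented for the example.

```python
import sqlite3
from datetime import datetime, timedelta, timezone

# One in-memory store handles both the transactional writes and the analytics,
# so there is no batch copy into a separate warehouse. sqlite3 is only a
# stand-in for an in-memory HTAP engine; schema and threshold are illustrative.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE payments (card TEXT, amount REAL, ts TEXT)")

def record_payment(card: str, amount: float) -> None:
    """Transactional path: a normal OLTP insert."""
    db.execute("INSERT INTO payments VALUES (?, ?, ?)",
               (card, amount, datetime.now(timezone.utc).isoformat()))

def looks_fraudulent(card: str, window_minutes: int = 5, limit: float = 1000.0) -> bool:
    """Analytic path: aggregate over the same live rows, inside the 'transaction window'."""
    since = (datetime.now(timezone.utc) - timedelta(minutes=window_minutes)).isoformat()
    (total,) = db.execute(
        "SELECT COALESCE(SUM(amount), 0) FROM payments WHERE card = ? AND ts >= ?",
        (card, since),
    ).fetchone()
    return total > limit

record_payment("card-123", 600.0)
record_payment("card-123", 550.0)
print(looks_fraudulent("card-123"))  # True: real-time analysis on just-written data
```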

Forrester has identified SAP as a leader in its first ever report on translytical data platforms. Forrester describes a translytical data platform as having the capabilities to “support many types of use cases, including real-time insights, machine learning, streaming analytics, and extreme transactional processing. The sweet spot is the ability to perform all of these workloads within a single database.” SAP earned perfect 5/5 scores on 18 of the report’s evaluation criteria, and the report goes on to note that “SAP crushes translytical workloads.”

SAP is excited to be recognized as a leader in this new category of database platforms that we expect will be indispensable for businesses in the digital economy.

For more information on why SAP has set the industry standard, check out Forrester’s report here.

  3. Information management will evolve to manage disparate siloed data sources.


Companies are swimming in a sea of data right now, and to this point it’s been nearly impossible to make sense of it. And with the Internet of Things and additional sources emerging every day, more data means more problems.

A major challenge is that this data is increasingly fragmented across multiple data lakes and disparate, siloed sources. The pressure is compounded by new data regulations, such as GDPR, which mandate enterprise-wide data governance.

A new approach, as part of a modern data architecture, lets data professionals as well as line-of-business (LoB) users create, orchestrate, and manage data flow pipelines, with processing pushed down to the data where it resides.
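Here is a minimal sketch of the push-down idea: instead of pulling every row out of a source system and filtering centrally, the pipeline composes the filter into the query it sends to the source, so the work happens where the data lives. The Pipeline class and the sqlite3 “source” are invented for illustration and are not the SAP Data Hub API.

```python
import sqlite3

# Stand-in for a remote source system (a data lake, Hadoop cluster, another database, ...).
source = sqlite3.connect(":memory:")
source.execute("CREATE TABLE sensor_readings (device TEXT, value REAL)")
source.executemany("INSERT INTO sensor_readings VALUES (?, ?)",
                   [("pump-1", 7.2), ("pump-1", 93.5), ("pump-2", 1.1)])

class Pipeline:
    """Illustrative pipeline step that pushes its filter down to the source."""

    def __init__(self, table: str):
        self.table = table
        self.predicates: list[str] = []

    def where(self, predicate: str) -> "Pipeline":
        self.predicates.append(predicate)
        return self

    def run(self, connection) -> list:
        # The filter becomes part of the query executed at the source, so only
        # matching rows ever leave it (push-down processing). Predicates here are
        # trusted literals for illustration only, not user input.
        sql = f"SELECT * FROM {self.table}"
        if self.predicates:
            sql += " WHERE " + " AND ".join(self.predicates)
        return connection.execute(sql).fetchall()

rows = Pipeline("sensor_readings").where("value > 50").run(source)
print(rows)  # [('pump-1', 93.5)] -- only the filtered rows cross the wire
```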

Previously, it was difficult for organizations to address this challenge; it required a build-it-yourself approach combined with piecemeal commercial products. New commercial solutions are now emerging to address this opportunity. This isn’t a nice-to-have product; these days, it’s an absolute necessity. That is why we unveiled SAP Data Hub, a simpler, more scalable approach to data landscape management that eases integration burdens and provides better enterprise-wide landscape visibility and governance.

SAP Data Hub is how we are rearchitecting the handling of complex data landscapes and the management of data, so that data becomes a catalyst for business transformation rather than a barrier to business execution. SAP Data Hub tackles three major components: governance, data pipelines, and data sharing.

We’re addressing governance with centralized visibility into the lineage of data, giving users a unified view of how data was sourced, how it was changed, who changed it, why they changed it, and when they changed it. We’re offering powerful pipelines with distributed, push-down processing, which lets users move from a world of centralized data to one of centralized governance: the data stays where it is, while myriad sources, some we may not even know of yet, are brought together. Lastly, SAP Data Hub is designed to enable data sharing, leveraging existing connections and integration tools to combine these data sources. Each system’s data catalog is registered with SAP Data Hub, which makes it easier to add new systems over time and to use them quickly alongside existing systems.
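The data structures below sketch, in plain Python, the kind of metadata such a hub centralizes: a catalog entry registering a source system, and a lineage record capturing what changed, who changed it, when, and why. The field names are invented for illustration and are not SAP Data Hub’s actual schema or API.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Illustrative metadata records only; field names are invented and are not
# SAP Data Hub's actual schema.

@dataclass
class CatalogEntry:
    """A source system and dataset registered with the central catalog."""
    system: str          # e.g. "iot-data-lake", "crm-database"
    dataset: str         # table, file set, or topic exposed by that system
    owner: str
    connection_uri: str

@dataclass
class LineageRecord:
    """One governed change to a dataset: what, who, when, why, and from where."""
    dataset: str
    changed_by: str
    changed_at: datetime
    reason: str
    source_systems: list = field(default_factory=list)

catalog = [
    CatalogEntry("iot-data-lake", "sensor_readings", "ops-team", "hdfs://lake/sensors"),
]
lineage = [
    LineageRecord("sensor_readings_clean", "jsmith",
                  datetime.now(timezone.utc),
                  reason="removed duplicate device readings",
                  source_systems=["iot-data-lake"]),
]
print(catalog[0].dataset, "->", lineage[0].dataset)
```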

What do you think? Are we on the right track with these ideas? Do you see trends playing out differently? Please share your thoughts below or Tweet me at @McStravickGreg.