Sangeetha_K
Product and Topic Expert

Over the last six months, our conversations with customers have evolved from “How do we best manage all of our data across these disparate systems of record and public data signals?” … to “How should we modernize our overly customized, mission-critical SAP BW stack?” … and more recently to “I have an idea for using GenAI within my business process; how do I get it implemented?” Google and SAP are jointly addressing all of the above, and more, with our recently announced open data offering across SAP Datasphere and Google BigQuery.


Our customers are continuously looking for better ways to gain insights into their business and drive innovation, but are challenged by the fragmentation of data across multiple applications and data warehouses. For example, an electric utility company may be using SAP for its customer relationship management and billing functions and a non-SAP solution for outage management and connected meters. During a major weather event, the electric utility will want a 360-degree view of customer impact, and this usually involves connecting data from SAP and non-SAP systems to provide a consolidated view of their customers.


Customers want a single pane of glass for their data. Traditionally, to solve this problem, enterprises first build a data lake to bring together data from all their sources and then apply data cleansing, deduplication and normalization to build a data and analytics platform. While this approach (data replication) meets the need to bring all the data into a single landing zone, it typically creates other challenges around data governance, data freshness, reconciliation and loss of semantic context, all of which come at a cost to manage. We now see customers looking more and more at hybrid approaches, where “time-sensitive” and “access-sensitive” data is federated from the source while other data is replicated to the target data platform. Google Cloud and SAP see customer value in these hybrid approaches, and we are working together to make it easy for our customers to use our data and analytics solutions.


Google Cloud and SAP have been partnering for many years to help customers run their business-critical SAP workloads on Google Cloud. We’re seeing strong adoption of SAP Business Technology Platform (BTP) on Google Cloud, as captured previously in SAP BTP on Google Cloud Announces 5 new capabilities and SAP Build Process Automation is better with Google Document AI and Google Workspace. Today, we provide prescriptive guidance on reference architectures for building data integration between SAP BTP data and analytics solutions (SAP Datasphere and SAP Analytics Cloud) and Google Cloud data analytics solutions (BigQuery, Google Cloud Cortex Framework and Vertex AI).




Example Joint Reference Architecture


We illustrate the reference architecture here using a few sample business use cases. Please note that the architecture discussed here is not static: it evolves as new business cases arise and new features are added to the products. You can expect to hear updates on this architecture as SAP and Google Cloud continue to collaborate on more use cases. Also, while these use cases might be specific to an industry or line of business (LoB) process, the architecture patterns are the same for similar use cases in other industries and LoBs. Let’s take a high-level view of an example reference architecture before diving deeper into the use cases.



In this architecture, data from SAP sources like SAP S/4HANA, SAP BW/4HANA and other SAP cloud applications can be federated and/or replicated to SAP Datasphere using the available connectors. Once data is available in SAP Datasphere (virtually or physically), customers can use native capabilities such as data flows and views to enrich, deduplicate and normalize the data. They can also build SAP-specific semantic models on top of the data. On the other hand, customers can bring data from sources like Google Analytics, Google Ads and other enterprise applications into BigQuery. Further, customers can use Google Cloud Cortex Framework to accelerate insights with packaged data analytics content for common business scenarios. Customers are then able to federate data from BigQuery into SAP Datasphere using connectors like the Google BigQuery Datasphere connector. With SAP and non-SAP data available in SAP Datasphere, customers can build operational reporting, planning and visualizations using SAP Analytics Cloud.
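
To make the non-SAP side of this flow a little more concrete, here is a minimal sketch of loading an exported marketing dataset into BigQuery with the google-cloud-bigquery Python client, so that the table can then be federated into SAP Datasphere via the BigQuery connection configured in the tenant. The project, dataset and table names are hypothetical placeholders for illustration only.

```python
# Minimal sketch (illustrative only): load an exported, non-SAP marketing dataset
# into BigQuery with the google-cloud-bigquery client.
# The project, dataset and table names below are hypothetical placeholders.
import pandas as pd
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # assumed project ID

# Example DataFrame standing in for a Google Analytics / Google Ads export.
df = pd.DataFrame(
    {
        "date": pd.to_datetime(["2023-10-01", "2023-10-02"]),
        "campaign_id": ["cmp-001", "cmp-002"],
        "sessions": [1250, 980],
    }
)

table_id = "my-gcp-project.marketing.campaign_sessions"  # hypothetical table
client.load_table_from_dataframe(df, table_id).result()  # wait for the load job

# From here, the table can be exposed to SAP Datasphere through the Google BigQuery
# connection configured in the Datasphere tenant (no additional code needed there).
print(f"Loaded {client.get_table(table_id).num_rows} rows into {table_id}")
```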


Machine learning also plays a key role in delivering meaningful data insights. Vertex AI brings together Google Cloud services for building machine learning under one unified user interface and API. It is no secret that high-quality training data is critical to getting machine learning right, particularly as GenAI concepts evolve and customers start to rely on them to make decisions: your machine learning models are only as accurate as the training data behind them. Traditionally, customers create their training datasets by moving the data into Google Cloud Storage or BigQuery. However, with the SAP FedML library for Google Vertex AI, customers now have the option to federate data from SAP Datasphere to create their training datasets, further empowering their ability to leverage GenAI, AI and ML alongside their SAP applications.
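
As a rough illustration of that flow, the sketch below stages a training set in BigQuery and trains a Vertex AI AutoML tabular model with the google-cloud-aiplatform SDK. The Datasphere retrieval step is shown only as a stand-in, since the exact SAP FedML (fedml_gcp) calls depend on the library version, and every project, table and column name here is an assumption for the example.

```python
# Minimal sketch, under stated assumptions: stage a training set in BigQuery and
# train a Vertex AI AutoML tabular model with the google-cloud-aiplatform SDK.
# The Datasphere retrieval below is a stand-in only; the real flow would use the
# SAP FedML library for Google Cloud (fedml_gcp), whose exact calls are not shown.
# All project, table and column names are hypothetical.
import pandas as pd
from google.cloud import aiplatform, bigquery

PROJECT = "my-gcp-project"                     # assumed project ID
STAGING_TABLE = f"{PROJECT}.ml.training_data"  # hypothetical staging table


def fetch_training_data_from_datasphere() -> pd.DataFrame:
    """Stand-in for a FedML/Datasphere federation call returning a DataFrame."""
    return pd.DataFrame(
        {
            "feature_a": [1.0, 2.0, 3.0],
            "feature_b": [0.4, 0.1, 0.9],
            "target_value": [10.0, 12.5, 14.0],
        }
    )


df = fetch_training_data_from_datasphere()

# Stage the federated data in BigQuery so Vertex AI can read it as a managed dataset.
bigquery.Client(project=PROJECT).load_table_from_dataframe(df, STAGING_TABLE).result()

aiplatform.init(project=PROJECT, location="us-central1")
dataset = aiplatform.TabularDataset.create(
    display_name="datasphere-training-data",
    bq_source=f"bq://{STAGING_TABLE}",
)

job = aiplatform.AutoMLTabularTrainingJob(
    display_name="datasphere-regression-job",
    optimization_prediction_type="regression",
)
model = job.run(
    dataset=dataset,
    target_column="target_value",  # hypothetical target column
    model_display_name="datasphere-regression-model",
)
```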


Now that we have a high-level understanding of the architecture components, let’s look at a relevant sample use case.



Use Case Example: Predict & Analyze Retail Inventory using SAP Datasphere, BigQuery & FedML on Google Vertex AI


Retailers are constantly looking to maintain optimal inventory levels, striking a balance between capital locked up in excess inventory and revenue lost to stock-out situations.


It is commonplace for retailers to have multi-layer supply chains, resulting in inventory data spread across disconnected and fragmented systems and, in turn, limited inventory visibility. Retailers are looking for ways to continuously assess inventory positions relative to demand, production capacities and existing supply positions in order to maintain optimal inventory levels. They want to accurately predict inventory demand and take proactive steps to mitigate stock-out situations or capital locked up in overstocking. Using the reference architecture above, retailers looking to gain inventory intelligence can bring data from their SAP ERP and warehouse management systems into SAP Datasphere. Furthermore, they can bring data from non-SAP applications like logistics and point-of-sale systems, as well as external datasets such as weather patterns and carbon footprint (GDELT), into BigQuery. Then, they can stitch together SAP and non-SAP data for a continuous and accurate representation of real-time inventory positions across a multi-layer supply chain network. Further, customers can create accurate inventory forecasting machine learning models in Vertex AI based on training datasets created from data in SAP Datasphere using FedML and/or data in BigQuery.
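
As a simplified sketch of that stitching step, the query below joins a hypothetical SAP-sourced inventory snapshot table (for example, one modeled through Google Cloud Cortex Framework) with an external weather table in BigQuery. Every dataset, table and column name here is an illustrative assumption, not a delivered content model.

```python
# Minimal sketch of the "stitching" step: join a hypothetical SAP-sourced inventory
# snapshot (e.g., a table modeled via Google Cloud Cortex Framework) with an external
# weather table in BigQuery. Every dataset, table and column name is an assumption.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # assumed project ID

query = """
SELECT
  inv.plant_id,
  inv.material_id,
  inv.snapshot_date,
  inv.stock_qty,
  w.avg_temp_c,
  w.storm_warning
FROM `my-gcp-project.cortex_sap.inventory_snapshot` AS inv   -- hypothetical SAP-sourced table
LEFT JOIN `my-gcp-project.external_data.daily_weather` AS w  -- hypothetical weather table
  ON inv.plant_region = w.region
 AND inv.snapshot_date = w.weather_date
WHERE inv.snapshot_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 90 DAY)
"""

# The joined result can be federated back into SAP Datasphere via the BigQuery
# connector or used as a feature table for a Vertex AI forecasting model.
features = client.query(query).to_dataframe()
print(features.head())
```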


This use case allows retailers to accurately predict their inventory safety stock requirements and take appropriate actions to mitigate stock-out situations and avoid locking up working capital. With the power of SAP Datasphere and the highly scalable architecture of BigQuery, retailers can tap into huge datasets to make meaningful predictions about their inventory needs.



Follow this SAP Discovery Center mission to try this scenario yourself.




Summary and Next steps


Our customers are generating data at an exponential rate, but a vast majority of it remains ‘dark data’ because it is not used in time to drive better business outcomes. Google Cloud and SAP are bringing simplified data and analytics capabilities that use a hybrid architecture, allowing our customers to tap into their data. Our goal is to make it easy for our customers to gain visibility into their data, generate insights and drive meaningful actions, increasingly with a lens toward becoming predictive and prescriptive through AI/ML and GenAI technologies. The sample reference architecture shared here is the first step in this journey.


So let’s get started!




Google Cloud and SAP are committed to further advancing the portfolio of analytics options that customers can leverage as they continuously improve their processes to meet evolving business needs. You can expect more exciting updates in the near future.



Credits


The blog content, use cases and reference architectures above are the result of teamwork and contributions from both SAP and Google Cloud. I would like to thank the following SAP colleagues for their guidance and support: Anirban Majumdar, Christian Thisgaard, Rahul Tiwari, Sivakumar N and Aaron Graber, with special thanks to the following team members from Google Cloud: Michael Harding, Blake Lanning, Jesper Christensen and KK Ramamurthy.


If you have any questions, please leave a comment below or contact us at paa@sap.com.


 