Gustavo Calixto

SAP Business Technology Platform – Consume SAP Data Warehouse Cloud's assets using SAP Analytics Cloud

Overview

This is the last blog post of the SAP Business Technology Platform Showcase series of blogs and videos. We invite you to check this overall blog, so you can understand the full end-to-end story and the context involving multiple SAP BTP solutions.
Here we will see how to provide self-service business insights to the business community. We will also demonstrate how to use SAP Analytics Cloud's augmented analytics features, Smart Insights and Smart Discovery.
Smart Discovery and Smart Insights let users take advantage of advanced contribution, classification, and regression techniques powered by machine learning. These features empower anyone to surface hidden patterns and complex relationships within their data, even without any data science knowledge or experience.

Below you can see that this is the 7th and final step of the “Solution Map” prepared for the journey described in the overall blog:

SAP BTP Showcase – Overall Technical Architecture

You can also follow these scenarios by watching this technical demonstration video.

Prerequisites

• Having completed the previous blog of this series (Blog 5: Provide governed business semantics with SAP Data Warehouse Cloud).

Scenario: Consume SAP Data Warehouse Cloud's Analytical Datasets

For this first scenario, we will consume the Analytical Datasets that were created in the previous blog of this series.

We will use the Analytical Dataset views to build our stories. Once the models are available, business users can access the story builder to create dashboards and extract valuable insights from the data. Create a new story following the arrows, as shown below:

In the example below, we will add a new chart to an existing dashboard using the Analytical Dataset from SAP Data Warehouse Cloud. To do this, click the Edit button, then click on the Insert Chart button, as shown below:

Now, we will add the data source to this chart. Click on the chart that was added, click on the Designer button, then click on the Change primary model button to add a new data source for the chart.

Now we can choose from the models already imported into this story or select another model.

For this example, we will use a previously imported data source. However, if you are building the dashboard from scratch, you would need to select a new data source from the option list, which contains the Analytical Datasets created in SAP Data Warehouse Cloud.

As an example, we will create a chart that shows the average daily energy consumption per time range. To do this, we will have to create a calculated measure.

This calculated measure will apply the average operation to the Energy Consumption (actual) measure, using Date as an Aggregation Dimension.
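To make the logic of this calculated measure concrete, here is a minimal pandas sketch of the equivalent computation (the column names `Date` and `EnergyConsumption` are illustrative assumptions, not the actual field names of the model): using Date as the Aggregation Dimension means the measure is first aggregated per date, and the average is then taken across dates.

```python
import pandas as pd

# Hypothetical readings; the real data comes from the Analytical
# Dataset in SAP Data Warehouse Cloud.
df = pd.DataFrame({
    "Date": ["2021-01-01", "2021-01-01", "2021-01-02", "2021-01-02"],
    "EnergyConsumption": [10.0, 14.0, 8.0, 12.0],
})

# Aggregate the measure per Date first, then average across dates --
# the effect of an AVERAGE operation with Date as the Aggregation
# Dimension.
daily_totals = df.groupby("Date")["EnergyConsumption"].sum()
avg_daily_consumption = daily_totals.mean()
print(avg_daily_consumption)  # (24 + 20) / 2 = 22.0
```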

We will need to create a calculated dimension as well to show the time ranges on the chart.

This calculated dimension will be a measure-based dimension, using the Hour (integer) dimension. For each measure value range, we are going to assign a member name. And last but not least, we will add a dimension context with the time dimensions available in our dataset.

Once the calculated dimension has been created and selected for the chart, the chart will plot the average consumption grouped by the time range of the day. Add a white background to make the result more visible.

To finish this example, we will sort the chart by the average energy consumption, from highest to lowest.
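For readers who prefer to see the logic spelled out, the sketch below approximates the whole chart in pandas: bin the Hour (integer) dimension into named time-range members, average the consumption per range, and sort from highest to lowest. The bin boundaries and member names are assumptions for illustration only.

```python
import pandas as pd

# Hypothetical hourly readings; Hour is an integer from 0 to 23.
df = pd.DataFrame({
    "Hour": [1, 7, 9, 13, 15, 19, 21, 23],
    "EnergyConsumption": [5.0, 9.0, 12.0, 14.0, 13.0, 18.0, 16.0, 6.0],
})

# Measure-based dimension: assign a member name to each Hour range.
bins = [0, 6, 12, 18, 24]                              # assumed boundaries
labels = ["Night", "Morning", "Afternoon", "Evening"]  # assumed members
df["TimeRange"] = pd.cut(df["Hour"], bins=bins, labels=labels, right=False)

# Average consumption per time range, sorted highest to lowest,
# mirroring the final chart.
chart = (
    df.groupby("TimeRange", observed=True)["EnergyConsumption"]
      .mean()
      .sort_values(ascending=False)
)
print(chart)
```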

Congratulations! We finished this scenario, where we showed how business users can use SAP Analytics Cloud to explore their own data in a self-service fashion, extracting rich insights and adding value to their businesses.

Using Smart Insights and Smart Discovery

For the next scenario, we will show two features that are exclusive to SAP Analytics Cloud: Smart Insights and Smart Discovery. Smart Discovery uses machine learning to analyze and explore your data and uncover valuable insights. Smart Insights picks a data point or a variance in the data, examines what is behind it, and adds context to the visualization, helping you understand what is going on.

The first step to run a Smart Discovery is to add a new model in SAP Analytics Cloud.

In this example, we will import a CSV dataset extracted from the energy production example. Feel free to use your own dataset.

Once the dataset is imported, we can validate the quality of our data, create the model, and make adjustments to the measures and dimensions from the dataset. In this example, we will only validate and create the model.
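As a rough stand-in for this validation step, the following pandas sketch shows the kind of checks involved (data types, missing values); the CSV content is a hypothetical placeholder for the real energy production export.

```python
import pandas as pd
from io import StringIO

# Hypothetical CSV content standing in for the exported energy
# production dataset.
csv_data = StringIO(
    "Date,City,EnergyProduction\n"
    "2021-01-01,Berlin,120\n"
    "2021-01-02,Berlin,\n"
    "2021-01-02,Paris,95\n"
)
df = pd.read_csv(csv_data, parse_dates=["Date"])

# Quick data-quality checks before creating the model:
print(df.dtypes)        # are measures numeric and dates parsed?
print(df.isna().sum())  # missing values per column (one here)
```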

Select a project folder and save the model in it.

After the model is created, open the project folder and create a new story.

We will start our story by running a Smart Discovery.

To run the Smart Discovery, select the model we just created as the data source, then select the measures and dimensions for which you want to find correlations.

Smart Discovery will automatically generate a dashboard that identifies key influencers for the selected target/outcome and explores the key differentiators between your attributes.
As simple as that, we are able to extract the total number of cities in our dataset, the minimum and maximum total value for a city, the sum of all the values for the cities, a distribution of cities by ranges of the total value, the top 10 cities with the greatest total value, and even a forecast of the total value over time.
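These aggregates are straightforward to reproduce by hand, which helps clarify what the generated dashboard shows. Here is a minimal pandas sketch over a hypothetical City/TotalValue dataset (the forecast itself comes from Smart Discovery's time-series models and is not reproduced here):

```python
import pandas as pd

# Hypothetical dataset with a City dimension and a TotalValue
# measure; the real data comes from the imported model.
df = pd.DataFrame({
    "City": ["Berlin", "Paris", "Rome", "Berlin", "Paris", "Madrid"],
    "TotalValue": [120.0, 80.0, 45.0, 60.0, 30.0, 95.0],
})

per_city = df.groupby("City")["TotalValue"].sum()

print("Number of cities:", per_city.size)
print("Min / max total for a city:", per_city.min(), "/", per_city.max())
print("Sum of all values:", per_city.sum())

# Distribution of cities by ranges of the total value.
print(pd.cut(per_city, bins=3).value_counts().sort_index())

# Top cities by total value (the dashboard shows the top 10).
print(per_city.sort_values(ascending=False).head(10))
```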

Now we will add a Smart Insights analysis to a chart. Click the More Actions button on a chart, then click Add Smart Insights.

SAP Analytics Cloud will automatically uncover top contributors and various insights for your selected value or variance.
In this example, Smart Insights automatically highlighted the day with the highest total value and the customers, orders, cities, product types, and top products that contributed the most to this result.

To wrap up the Smart Insights and Smart Discovery scenario, we were able to build a holistic dashboard about the total value measure and the cities dimension in under 3 minutes. This allows business users to spend more time on what really matters: analyzing data and extracting valuable insights that could otherwise be missed, since the machine learning capabilities analyze complex relationships in the underlying model. In this scenario, using Smart Discovery, a business user without any data science knowledge was able to extract a forecast of the total value for the next 4 months and identify a downward trend in the total value. Without the aid of machine learning, those insights could easily have been missed.

Furthermore, with Smart Insights, this same business user with no deep knowledge of data science can easily identify the multiple correlations that contribute to this result. Instead of having to build all of those charts and risk missing an important aspect of the data, Smart Insights uses machine learning to point out how each variable in the dataset contributes to the result displayed on the chart. In this example, the business user can easily identify all the aspects that contributed to the exceptional result in December 2017. These tools are a great ally for boosting productivity, giving any business user access to capabilities that were previously available only to data scientists.
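As a simplified illustration of a "top contributors" analysis, the sketch below finds the day with the highest total and ranks each dimension's members by their share of that day's total. The column names are hypothetical, and Smart Insights' actual machine-learning analysis goes well beyond these simple shares.

```python
import pandas as pd

# Hypothetical transactional data; column names are illustrative.
df = pd.DataFrame({
    "Date": ["2017-12-01"] * 4 + ["2017-12-02"] * 2,
    "City": ["Berlin", "Paris", "Berlin", "Rome", "Paris", "Rome"],
    "Product": ["A", "B", "B", "A", "A", "C"],
    "TotalValue": [50.0, 30.0, 40.0, 20.0, 25.0, 15.0],
})

# The data point to explain: the day with the highest total value.
top_day = df.groupby("Date")["TotalValue"].sum().idxmax()
day_df = df[df["Date"] == top_day]
day_total = day_df["TotalValue"].sum()

# Rank each dimension's members by their share of that day's total.
for dim in ["City", "Product"]:
    share = day_df.groupby(dim)["TotalValue"].sum() / day_total
    print(f"Top contributors by {dim} on {top_day}:")
    print(share.sort_values(ascending=False).round(2))
```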

Summary

Congratulations! We have completed both scenarios. Together, they demonstrate powerful data analysis capabilities that help businesses make faster, better decisions with SAP Analytics Cloud.

Recommended reading: 

Blog 1: Location, Location, Location: Loading data into SAP Data Warehouse Cloud: how to easily consume data from systems of record (e.g. SAP ERP), cloud and on-premise databases (e.g. SAP HANA, SQL Server, Oracle, Athena, Redshift, BigQuery, etc.), OData services, CSV/text files available on your own local computer, or any file/object store (e.g. Amazon S3). We will leverage SAP Data Warehouse Cloud's Replication and Data Flow capabilities, as well as demonstrate how to access remote sources using data virtualization.

Blog 2: Access the SAP HANA Cloud database underneath SAP Data Warehouse Cloud: how to create an SAP HANA Deployment Infrastructure (HDI) container on SAP HANA Cloud, and persist actual sales data originating from an external system in SAP Data Warehouse Cloud's existing persistence area. We will show how to provide bi-directional access between SAP Data Warehouse Cloud and SAP HANA Cloud's managed datasets. You will also see how to expose SAP HANA Cloud and SAP Data Warehouse Cloud artifacts, like a table or a graphical view, as OData services. You should also take a look at this additional blog, which provides hands-on instructions for exposing SAP Data Warehouse Cloud artifacts as OData services and a complete Git repository to kick-start implementation.

Blog 3: SAP Continuous Integration & Delivery (CI/CD) for SAP HANA Cloud: how to develop and trigger a pipeline using either Jenkins or SAP Continuous Integration and Delivery to automate the deployment of the above SAP HANA Cloud application on multi-target (DEV/PRD) landscapes.

Blog 4: Run future sales prediction using SAP HANA Cloud Machine Learning algorithms: how to create an SAP HANA Cloud HDI container, load training and testing historical data, and run Predictive Analytics Library (PAL) procedures for just-in-time prediction of future sales (energy consumption) values.

Blog 5: Develop an SAP HANA Cloud native application: how to create an SAP Cloud Application Programming Model project to manage additional data values, working on the back-end application (HDI providing OData services) as well as the front-end SAP Fiori/SAPUI5 application, deployed on dedicated services in SAP BTP.

Blog 6: Provide governed business semantics with SAP Data Warehouse Cloud: how to consume all of the multiple data sources referenced in this series, providing business users with self-service data modeling, harmonization, transformation, and persistence.
