A complete journey from a CDS view in S/4HANA cloud to an SAP Analytics Cloud story
In this blog I will cover an end-to-end development scenario of a CDS view in S/4HANA Cloud, from creation up to the point where it can be consumed in SAP Analytics Cloud as a story, while focusing on some of the key development guidelines and recommendations.
S/4HANA Cloud supports different embedded analytics options that can be developed and consumed directly in S/4HANA Cloud. With the SAP Analytics Cloud embedded version, the options are now wider than ever. Creating CDS views and consuming them in SAP Analytics Cloud is one of the scenarios that supports query customization, adding logic to the underlying query, creating calculations and connecting to additional views with joins.
As the development environment in S/4HANA Cloud is quite different from the one in Eclipse, it’s recommended to first understand which use cases can be covered in S/4HANA Cloud and which ones require the side-by-side extension framework, e.g. connecting to external data sources.
An overall comparison between the in-app and SAP Cloud Platform (side-by-side) extensibility options can be viewed in the following table:
The SAP S/4HANA Extensibility Patterns can be further explored here:
The Virtual Data Model (VDM) in S/4HANA
In S/4HANA the data is stored in HANA tables, but we don’t connect to them directly. On top of the source tables we will find the virtual components, which make the tables more business-friendly and give us more flexibility when working with table objects.
In the virtual data model, reusable components are built around objects that make business sense: a customer name, a location, a G/L account and so on. The main reason is to allow business users to understand the business meaning of the objects so they can drive their queries by themselves.
As a standard flow, the VDM generally ends with an analytical query, although other options are available, such as a dimensional cube or a standard CDS view. The result is a component that is ready for consumption: its data can be read by, or displayed in, an application such as a Fiori app or a multidimensional query.
The consumption view can be compared to a BW query, and when creating an analytical query we will find some similar BW functionality available, such as creating restricted and calculated measures.
The Virtual Data Model flow looks like this:
So a real scenario would be products that were sold to clients, where further information is required to enrich the customer view; in that case the virtual data model would look like this:
The use case
In the following scenario, a custom CDS view based on the journal entry data will be built, adding custom due-period logic and date range filters in order to address a requirement to analyse the ageing of debts over the last two years.
Creating the custom CDS view
The first step in this process will be creating the custom CDS view by using the corresponding tile:
Once this option is picked, the existing list of CDS views will be visible; in our scenario the create option will be used:
From the various options available, the analytical cube will be picked. This option enables creating an analytical query, which can be further consumed in SAP Analytics Cloud:
The different use cases when creating a CDS view are described in the following note:
In the next step, a primary data source will be picked after searching for it; in this scenario the operational journal entry view will be used for this CDS view:
In essence, knowing which view to use comes from a combination of the existing documentation and help guides, experience, knowledge, working closely with functional leads and testing where feasible. I would recommend starting with the documentation, where each business area is covered with its main CDS views:
In the CDS view, we will define which of the available objects to use, along with filters, calculations and joins if required:
As the view is quite rich in objects it is recommended to combine both search and scrolling in order to pick the relevant objects.
When creating an analytical cube it’s mandatory to pick at least one measure. Any additional calculations can be added from the elements process step using the add button; in our scenario we will create a days-between calculation that will enable us to build a series of custom due periods covering additional periods.
Our main piece of code is the days-between calculation, which counts how many days have passed between the net due date (when the debt was set to be paid) and the system date.
(DATS_DAYS_BETWEEN(I_OperationalAcctgDocCube.NetDueDate, cast( $session.system_date as ABAP.DATS )))
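For readers less familiar with DATS_DAYS_BETWEEN, the arithmetic it performs can be sketched in plain Python; the function name and sample dates below are illustrative only, not part of the CDS view:

```python
from datetime import date

def days_between(net_due_date: date, system_date: date) -> int:
    """Mirror of DATS_DAYS_BETWEEN: days elapsed from the net due
    date to the current system date (positive = overdue)."""
    return (system_date - net_due_date).days

# An invoice due on 2024-01-31, evaluated on 2024-03-01,
# has been overdue for 30 days (2024 is a leap year).
print(days_between(date(2024, 1, 31), date(2024, 3, 1)))  # 30
```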
If you are wondering why this calculation is performed at the CDS level and not in the SAC story, it’s mainly because date difference calculations aren’t supported yet for live connections to S/4HANA, and that’s one major consideration when developing a CDS view that will eventually be consumed in SAP Analytics Cloud.
The currently supported calculations and the roadmap can be found here:
The aggregation type is set to none to keep the object qualified as a dimension, so it can be used as a restriction in the next stage, where the analytical query will be developed.
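Why the aggregation type matters can be illustrated with a small Python sketch (the sample rows are invented for illustration): left unaggregated, the days-gap value behaves like a dimension that amounts can be grouped and restricted by, which is exactly what the restricted measures in the next step rely on.

```python
from collections import defaultdict

# Sample rows of (customer, days_gap, amount) - illustrative data only
rows = [("A", 15, 100.0), ("A", 45, 50.0), ("B", 15, 70.0)]

# Because days_gap carries no aggregation, it acts as a dimension:
# we can group the amounts by its values instead of summing it away.
amount_by_gap = defaultdict(float)
for _customer, gap, amount in rows:
    amount_by_gap[gap] += amount

print(dict(amount_by_gap))  # {15: 170.0, 45: 50.0}
```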
Once the query is ready it can be published. The publish process includes query structure and object syntax parsing; if there are no errors, the query will be published. My recommendation is to publish it from the elements process screen, as it allows viewing the query errors and warnings without navigating to the error log screen.
The CDS view can still be published when warning messages appear in the following cases:
- Join warning messages
- Syntax warnings
- General warnings, e.g.
Creating the analytical query
The analytical query makes it possible to implement more advanced calculations, organize the dimensions and measures in a row and column structure, and use sorts, filters and advanced time filters. Once the custom analytical queries tile is accessed, the analytical query can be given a name and use the custom CDS view that was created in the previous step.
As it’s important to use naming conventions, I would recommend using prefixes for each layer (custom and analytical) and being consistent with the existing naming convention:
In this case the AL prefix stands for analytical layer.
Once the fields are selected and placed in the row, column and free query areas, several calculations will be added based on the days gap calculation built in the custom CDS.
A new restricted measure named due period 1 will be added, and in the fixed value area the days gap dimension will be used as a restriction on the amount in company currency, for days between 0 and 30:
This step will be repeated to create all the required due periods. Unfortunately, copy and paste isn’t available, but the process is quite simple and fast. After creating all the due periods:
Tip: don’t forget to remove the default restriction from the restricted measure you just created.
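Conceptually, each restricted measure buckets the amount by the days-gap dimension. The same bucketing logic can be sketched in Python; note that only the 0–30 boundary comes from the query above, while the remaining period boundaries are assumptions for illustration:

```python
def due_period(days_gap: int) -> str:
    """Map a days-gap value to a due-period bucket, mirroring the
    restricted measures in the analytical query. Boundaries beyond
    0-30 are illustrative, not taken from the actual query."""
    if days_gap <= 30:
        return "due period 1 (0-30)"
    elif days_gap <= 60:
        return "due period 2 (31-60)"
    elif days_gap <= 90:
        return "due period 3 (61-90)"
    return "due period 4 (90+)"

print(due_period(15))  # due period 1 (0-30)
print(due_period(75))  # due period 3 (61-90)
```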
Our last calculation will ensure that the query shows only postings that were created in the last two years, to correspond with the due-period calculation series and the report requirement.
Through the manage date functions tile we can create date functions that can be applied to the analytical query.
When accessing this tile, a list of existing date functions will be available; through the create function, additional date functions can be added:
Here, by using a simple range definition from the previous year until the current date, the date function can be defined and simulated. Once the date function is saved, it becomes available for further time calculations in the analytical query.
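The range the date function produces can be sketched in Python; the exact lower boundary (the start of the previous year) is an assumption for illustration, as is the sample evaluation date:

```python
from datetime import date

def date_function_range(today: date):
    """Sketch of the dynamic date function: a range from the start
    of the previous year up to the current date. The 'start of the
    previous year' boundary is an illustrative assumption."""
    return date(today.year - 1, 1, 1), today

start, end = date_function_range(date(2021, 6, 15))
print(start, end)  # 2020-01-01 2021-06-15
```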
Going back to our analytical query, by marking the posting date in the filter area the date function can be applied, changing the default value to the date function:
The dynamic date function that we created in the previous step will be picked
The query can now be published. As publishing generates an SQL view name, it’s recommended to use that name when searching in SAP Analytics Cloud while creating the data model.
Creating a data model and a story in SAP Analytics Cloud
In SAP Analytics Cloud we will create a model, as this is the first step when connecting to an S/4HANA live connection. When working with live data connections there is no data preparation and wrangling stage, but rather a mapping of the underlying CDS view, which is saved as a data model in SAP Analytics Cloud.
In the latest TechEd during the SAP Analytics Cloud roadmap session I asked the following question regarding the smart wrangler:
And the answer was..
So we might get some additional simplifications and parity enablement across the different types of connections.
Back to our development process: based on the existing live connection setup, the analytical query will be picked for creating the data model.
Once the CDS view is picked, a data model will be created. Here we can add several definitions to the data model that will improve its usability and performance:
1. Use grouping where possible to simplify the navigation across the dimensions in the story:
2. Once the model is saved, through the model properties, optimize the performance behaviour and create an index for the data model to enable Search to Insight.
When the story is created, the data range function that was set for the analytical query will appear and capture the last two years range:
After the data is retrieved, we can start creating our story on top of the analytical CDS data model and analyse the due periods we created, with some additional thresholds, tables and charts.
As you can see, the development process is quite straightforward and enables building, step by step, queries that can be used in SAP Analytics Cloud and S/4HANA Cloud.
The main challenge lies, as in many cases, in the query logic, its complexity and the required calculations. But this emphasizes another strong point of the cloud analytics landscape: you can fully leverage the filters, hierarchies, attributes and query logic in SAP Analytics Cloud.
Hope you enjoyed reading, and you are very welcome to post your comments, your own experience and questions.
Stay safe and happy