SAP Commissions – Smart Data Integration [SDI] – Part 2
Dear All,
This article is intended for database admins, consultants, customers, and partners who want to understand the SAP Commissions SDI architecture flow and prepare for the various available data integration processes.
First Things First
You need to install the DP Agent to connect to the client's/customer's source systems, bring in the data the business needs, and transform it into the SAP Commissions staging tables.
SAP Commissions – SDI Architecture Flow
Direct Approach using DP Agent (Most Recommended)
SDI is packaged as part of HANA Enterprise which comprises the following components:
- HANA Server (SDI Data Provisioning Server) hosted by SAP/Callidus Datacenter
- Data Provisioning Agent (DP Agent) hosted on-premise by customers
The DP Agent is installed on the customer's premises and is set up to connect with SDI and the HANA DB Server. The DP Agent connects to the data source systems (SQL Server, Hadoop, flat files, OData, RDBMS, or others) and passes information between the source systems and the HANA Server using built-in adapters that are packaged with the DP Agent. Data is transmitted over HTTPS to and from the HANA Server.
Every SAP Commissions tenant is provisioned with a separate HANA tenant database. Web IDE facilitates access to the HANA database and SDI components. A development space is provisioned in Web IDE XSA (Extended Application Service) to enable application developers to manage content integration development. The Web IDE DB Explorer plugin allows users to connect to the tenant database and access the schemas.
| Term | Definition |
|---|---|
| Adapter | SDI component that allows connectivity to external sources. |
| Commissions Stage Tables | Temporary data storage area in Commissions where data from SDI is placed during export. Data is validated in the staging area before being transferred into the Commissions tables and before the pipeline is executed. See the Commissions Administrator online help and Data Dictionaries for more details. |
| DP Agent | The Data Provisioning Agent hosts all SDI adapters and acts as the communication interface between HANA and the adapters. You can also install the DP Agent from the HANA Tools site. |
| Flat File | A flat file lets you specify data attributes, such as columns and data types, table by table, and stores the data in plain text format. |
| Remote Source | A remote source can be used to create virtual tables or to use the linked database feature. |
| Flow Graph | A graphical user interface for developing data integration mappings and transformations. |
| EXT Schema | The EXT schema in the HANA database is a temporary area that facilitates data validation, transformation, aggregation, and cleansing for large volumes of data. It allows creating custom tables and stored procedures to process bulk data. |
| HANA Database | The SAP HANA database used by Commissions for data storage and processing. |
| Pipeline | A compensation computation process initiated from the Pipeline workspace in the Job Queue view or from the command-line utility. The pipeline produces compensation and pay results for payees assigned to variable compensation plans. See the Commissions online help for more details. |
| Commissions Workspace | A designated area in Commissions where related compensation objects are grouped together so that a user can perform related tasks from the same place. |
| OData | A protocol for building and consuming REST APIs. |
| Virtual Table | A HANA component that allows reading and writing data from external sources. |
| Web IDE | SAP Web IDE is a browser-based integrated development environment (IDE) comprising web-based UIs, business logic, and extensive SAP HANA data models that are leveraged by SDI. Web IDE facilitates access to the HANA database and SDI components. It is also a web-based development environment for SAP Fiori, SAPUI5, and full-stack business apps. |
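As a rough illustration of the Flat File idea above (columns and data types declared table by table), here is a minimal Python sketch that type-checks CSV rows against a hypothetical column definition. All column names, types, and sample values are invented for illustration; the real SDI FileAdapter uses its own format-definition files:

```python
import csv
import io
from datetime import datetime

# Hypothetical column definitions for one staging table, in the spirit of
# a flat-file format description (names and types are illustrative only).
COLUMNS = [
    ("ORDER_ID", str),
    ("SALE_DATE", "date"),   # expected as YYYY-MM-DD
    ("AMOUNT", float),
]

def validate_rows(text):
    """Parse CSV text and type-check each row against COLUMNS.

    Returns a list of typed row dicts; raises ValueError on a bad cell.
    """
    reader = csv.DictReader(io.StringIO(text))
    rows = []
    for raw in reader:
        typed = {}
        for name, kind in COLUMNS:
            value = raw[name]
            if kind == "date":
                typed[name] = datetime.strptime(value, "%Y-%m-%d").date()
            else:
                typed[name] = kind(value)
        rows.append(typed)
    return rows

sample = "ORDER_ID,SALE_DATE,AMOUNT\nA-100,2021-03-01,1250.50\n"
```

In a real integration this kind of validation happens inside SDI and the flowgraph; the sketch only shows why declaring columns and data types per table matters before data reaches the stage tables.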
Available Adapters
Data Provisioning Adapters
Custom Adapters
- SAP HANA SDI ServiceNow Adapter
- SAP HANA SDI Salesforce Adapter
- SAP HANA SDI Kafka Adapter
- SAP HANA SDI Amazon SQS Adapter
Alternative approach using Commissions Data Loader (CDL) without DP Agent
In this alternative approach, used when the client/customer does not allow their database network to be connected via the DP Agent:
The customer pushes the raw data to the SAP Commissions sFTP inbound folder in a secured way. The file goes directly into the Commissions Data Loader (CDL) as custom inbound file processing, gets loaded into a temporary table for pre-processing through a flowgraph, and is then loaded into the Commissions staging tables. The virtual table is triggered automatically when the data is ready for the Validate & Transfer job to process.
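The temp-table to stage-table hand-off in this alternative approach can be sketched with a toy example. The sketch below uses SQLite purely as a stand-in for the HANA EXT schema, and every table and column name is invented; in a real tenant a flowgraph and a HANA stored procedure do this work:

```python
import sqlite3

# Simulate the CDL flow with SQLite instead of HANA: inbound file rows land
# in a temporary table, then a procedure-like step moves them into a stage
# table. Table and column names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE EXT_TEMP_TXN (order_id TEXT, amount REAL)")
conn.execute("CREATE TABLE STAGE_TRANSACTION (order_id TEXT, amount REAL)")

# 1. Contents of the inbound file are loaded into the temporary table.
file_rows = [("A-100", 1250.50), ("A-101", 310.00)]
conn.executemany("INSERT INTO EXT_TEMP_TXN VALUES (?, ?)", file_rows)

# 2. "Stored procedure" step: cleanse/transform, then move to the stage table.
conn.execute(
    "INSERT INTO STAGE_TRANSACTION "
    "SELECT order_id, ROUND(amount, 2) FROM EXT_TEMP_TXN WHERE amount > 0"
)
conn.execute("DELETE FROM EXT_TEMP_TXN")  # temp table is cleared after the move

staged = conn.execute("SELECT COUNT(*) FROM STAGE_TRANSACTION").fetchone()[0]
```

After this point, the Validate & Transfer job would take over and move the staged records into the Commissions tables.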
References
SAP Commissions – Smart Data Integration[SDI] – Part 1
SDI Product Availability Matrix (PAM)
Uninstall the Data Provisioning Agent
Hello Yogananda, thanks for the article, pretty interesting.
I'm working on an implementation and I need to perform a transformation on inbound data, for which the alternative approach you mentioned above seems to fit perfectly.
I have some doubts, so, I would like to ask you for a short guidance.
To create a custom inbound datatype I need to create a stored procedure, right? Do you have a sample of this procedure to share? I could find the sample for outbound, but not for inbound.
I imagine I need to create a temporary table in EXT; for this, must the table structure match the target stage table, or must it fit the fields in the CSV file?
I would really appreciate it if you could give some advice.
Thank you,
Alberto
Thank you Alberto Silva for following all the blogs and taking the time to read 🙂
SDI – we need to have the DP Agent with a file-based connector... it can pull the CSV from your local directory into a temporary table, with a procedure to insert into the stage tables. The virtual table will pick up the records for import.
Part 6 of the SDI blog series covers your question in detail.
In CDL – without the DP Agent, the customer drops a file in the sFTP folder; it moves to a temp table, then to the stage tables via a procedure call, and it loads again through Validate and Transfer.
I will write a new blog later for an inbound file with a sample procedure.
Thank you very much Yogananda, using your insights and some research I was able to implement it.
There is a tricky part related to the table sequences, for which I could not fully understand how the CDL generates the next sequence; however, I was able to find a way that worked.
Thanks again,
Alberto