SAP Data Services Code Review using SAP Data Intelligence
This blog demonstrates how we can leverage the power of SAP Data Intelligence for DevOps. In particular, it gives a glimpse of how to check whether the ATL code of SAP Data Services developed by a user follows the naming standards of an organization.
Using an SAP Data Intelligence pipeline and the Metadata Explorer, we can proactively keep an eye on whether development standards are being followed in SAP Data Services code. Every company should define and follow development standards; otherwise the code eventually becomes confusing and fragile.
For this blog we have taken an example: according to the naming standards of an organization, a datastore name should start with DST*, e.g. DST_SAP or DST_LOOKUP. The same approach can also be applied to tables, job names, project names, and workflow names; for example, staging tables should start with STG_EXT*. We could also check whether naming standards are being followed by directly accessing the repository metadata tables, but that approach has not been considered in this blog because those tables are not accessible to everyone. We could likewise extract and deploy the SAP Data Services code and call AL_engine from a DI pipeline, but that is out of scope for this blog.
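To make the convention concrete, here is a minimal standalone sketch (not part of the DI pipeline) of how such a prefix check can be expressed in Python. The prefixes DST_ and STG_EXT_ come from the example above and DS_STG_MGMT_LKP appears later in this blog; everything else is illustrative:

```python
import re

# Hypothetical naming standards: object type -> required prefix pattern.
# Further object types (jobs, workflows, projects) can be added the same way.
NAMING_STANDARDS = {
    "DATASTORE": re.compile(r"^DST_"),
    "STAGING_TABLE": re.compile(r"^STG_EXT_"),
}

def is_compliant(object_type, object_name):
    """Return True if the object name matches the prefix rule for its type."""
    pattern = NAMING_STANDARDS.get(object_type)
    if pattern is None:
        return True  # no rule defined for this object type
    return bool(pattern.match(object_name))

# Example: DST_SAP passes, DS_STG_MGMT_LKP fails the datastore rule
print(is_compliant("DATASTORE", "DST_SAP"))          # True
print(is_compliant("DATASTORE", "DS_STG_MGMT_LKP"))  # False
```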
There are two prerequisites:
ATL Code of SAP Data Services
Naming Standard Document Followed by an Organization.
Overview of the SAP Data Intelligence pipeline processing the SAP Data Services ATL file to check whether the development standards are being followed (*Created by me)
Processing of SAP Data Services ATL file
The ATL file is processed in the Data Intelligence pipeline to extract the datastore information. In the pipeline below we read the input ATL file from a WASB location, process it with a Python script to extract the datastore information, and then load the result into a HANA database.
SAP Data Intelligence Pipeline processing SAP Data Services ATL file
We have set up the following configurations for the different operators in the above pipeline:
Source Connection to WASB (Windows Azure Storage BLOB) for storing ATL files of SAP BODS
Read File Operator Configuration
After that we have included two converters: a ToBlob converter followed by a ToString converter.
Python Operator Configuration
Python Operator Script
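As an illustration only (the actual script used in the pipeline is the one shown in the screenshot), a minimal sketch of such a datastore-extraction script could look like the following. It assumes that datastore definitions in the exported ATL appear on lines beginning with CREATE DATASTORE, and the port names input1 and output are assumptions that must match the ports defined on the operator:

```python
import re

# Assumption about the ATL export format: datastore definitions appear as
# CREATE DATASTORE <name> ...; adjust the pattern to your ATL file.
DATASTORE_PATTERN = re.compile(r'CREATE\s+DATASTORE\s+"?(\w+)"?', re.IGNORECASE)

def on_input(data):
    # "data" is the full ATL file content as a string (from the ToString converter)
    datastores = sorted(set(DATASTORE_PATTERN.findall(data)))
    for name in datastores:
        # Emit one datastore name per message; the ToMessage converter and
        # SAP HANA client operator downstream write it into the target table.
        api.send("output", name)

# "api" is injected by the SAP Data Intelligence Python3 operator at runtime.
api.set_port_callback("input1", on_input)
```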
We can enhance this further as per our needs, for example by passing in the string to search for dynamically, as sketched below.
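For instance, the pattern could be read from an operator configuration parameter instead of being hard-coded. The parameter name searchPattern below is hypothetical and would first have to be declared in the Python3 operator's configuration:

```python
import re

# "searchPattern" is a hypothetical configuration parameter; api.config
# exposes whatever parameters are declared for the operator.
pattern = re.compile(api.config.searchPattern, re.IGNORECASE)

def on_input(data):
    for name in sorted(set(pattern.findall(data))):
        api.send("output", name)

api.set_port_callback("input1", on_input)
```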
After the Python operator we have included a ToMessageConverter.
HANA Operator Configurations
HANA Operator Configurations continued
HANA Operator Configurations continued
HANA Operator Configurations continued
Output After Processing the ATL File: Fetched Datastore Names
In the same way, we have loaded the organization's SAP Data Services naming conventions into a table.
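Purely for illustration (the real content comes from the organization's naming standard document, loaded by the pipeline below), the table can be as simple as a mapping from object type to required prefix; the column names are hypothetical:

```python
# Hypothetical shape of the naming-standard table loaded into HANA;
# the column names OBJECT_TYPE and NAME_PREFIX are illustrative only.
# Further object types (jobs, workflows, projects) can be added the same way.
NAMING_STANDARD_ROWS = [
    ("DATASTORE", "DST_"),
    ("STAGING_TABLE", "STG_EXT_"),
]
```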
SAP Data Intelligence Pipeline Processing Naming Standard Document of an Organization
Read File Operator Configuration
HANA Operator Configurations
HANA Operator Configurations continued
HANA Operator Configurations continued
HANA Operator Configurations continued
Output of the Data Services naming convention stored in a table
To continuously check whether the standards are being followed, we can develop rules and report the failed data to the respective users via mail or dashboard analytics.
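Besides the Metadata Explorer rule shown below, the same check could also be expressed directly against the HANA table using the hdbcli Python client; the host, credentials, and table/column names here are placeholders:

```python
from hdbcli import dbapi  # SAP HANA Python client

# Connection details and table/column names are placeholders.
conn = dbapi.connect(address="hana-host", port=30015,
                     user="DI_USER", password="********")
cursor = conn.cursor()

# Flag datastore names that do not start with the required DST prefix.
cursor.execute(
    "SELECT DATASTORE_NAME FROM BODS_ATL_DATASTORES "
    "WHERE DATASTORE_NAME NOT LIKE 'DST%'"
)
failed_rows = [row[0] for row in cursor.fetchall()]
print("Datastores violating the naming standard:", failed_rows)

cursor.close()
conn.close()
```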
Rules Dashboard
Rules Dashboard continued: depicting the trend and keeping an eye on whether the development standards are being followed
Definition of the Rule
Output of the rule, showcasing the datastore names that do not follow the naming standards
The following datastores are not following the naming standards. We ran the check multiple times; after the second-to-last run we directed the user to correct the name of the datastore, the user corrected it, and in the last run DS_STG_MGMT_LKP moved from the failed rows to the passed rows.
Output showing failed rows of Initial run
Output of Last run after the correction of Datastore name by the Developer
So, 2 out of 4 datastores are still not complying with the organization's SAP Data Services naming standards.
Check with the users and inform them either through mail, configured in the SAP DI pipeline using the Send Email operator, or through a dashboard to which the users are given access.
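As a rough sketch (the port names and message layout are assumptions, and the exact input the Send Email operator expects should be verified in its documentation), a small Python operator could prepare the notification text before passing it on:

```python
def on_input(failed_names):
    # "failed_names" is assumed to arrive as a comma-separated list of
    # datastore names that failed the naming-standard rule.
    names = failed_names.split(",")
    body = ("The following datastores do not follow the DST naming standard, "
            "please rename them:\n" + "\n".join(names))
    # Pass the text on to the Send Email operator; verify the message
    # attributes it expects (recipients, subject, body) in its documentation.
    api.send("output", body)

api.set_port_callback("input1", on_input)
```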
Thank you for reading this blog 😊. Feel free to try it on your own and share your feedback with us.
Important assets
Related Blog
Yogesh Verma
Nice blog cheers!
"Check with the user and inform either through mail configured in SAP DI pipeline using Send Email operator or through dashboard and give access to the user "
From the above statement, did you mean that we should create a new graph for the notification?
I have a similar requirement where purchase orders from users are updated in the HANA DB via SDI. I now want to build a dashboard for successful and failed POs and give users access for viewing. How can this be achieved?
Also, how can data in the HANA DB be exposed as a dashboard in SDI?