
SAP Data Services Code Review using SAP Data Intelligence

This blog demonstrates how to leverage the power of SAP Data Intelligence for DevOps around SAP Data Services. It gives a glimpse of how to check whether the ATL code developed by a user in SAP Data Services follows an organization's naming standards.

Using a SAP Data Intelligence pipeline and the Metadata Explorer, we can proactively monitor whether development standards are being followed in SAP Data Services code. Every company should define and enforce development standards; otherwise confusion accumulates over time and the code base becomes fragile.

For this blog we have taken an example: according to the organization's naming standards, datastore names should start with the prefix DST_, e.g. DST_SAP or DST_LOOKUP. The same approach can be applied to tables, job names, project names, and workflow names; for example, staging tables should start with STG_EXT. We could also verify naming standards by querying the repository metadata tables directly, but that approach is not covered in this blog because those tables are not accessible to everyone. Extracting and deploying SAP Data Services code by calling al_engine from a DI pipeline is likewise out of scope here.
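The prefix rules above can be expressed as a small check. The following is a minimal sketch, not part of the original pipeline; the rule table and function names are illustrative:

```python
import re

# Hypothetical naming rules, mirroring the examples in the blog:
# datastore names must start with "DST_", staging tables with "STG_EXT".
NAMING_RULES = {
    "datastore": re.compile(r"^DST_"),
    "staging_table": re.compile(r"^STG_EXT"),
}

def complies(object_type: str, name: str) -> bool:
    """Return True if the object name matches the prefix rule for its type."""
    rule = NAMING_RULES.get(object_type)
    return bool(rule and rule.match(name))

print(complies("datastore", "DST_SAP"))          # True
print(complies("datastore", "DS_STG_MGMT_LKP"))  # False
```

The same dictionary can be extended with one entry per object type (jobs, projects, workflows) without changing the check itself.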

There are two prerequisites:

The ATL code of SAP Data Services
The naming standard document followed by the organization


Overview of SAP Data Intelligence pipeline processing of SAP Data Services ATL file to check the development standards being followed or not (*Created by me)

 

Processing of SAP Data Services ATL file

The ATL file is processed in the Data Intelligence pipeline to extract the datastore information. In the pipeline below, we read the input ATL file from the WASB location, process it with a Python script to extract the datastore names, and then load the result into a SAP HANA database.
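The extraction step inside the Python script can be sketched as follows. This is a minimal sketch: it assumes datastore definitions appear as `CREATE DATASTORE <name>` lines in the ATL export, which may differ from the exact ATL grammar of your repository version, so adjust the pattern to match your file:

```python
import re

# Assumption: each datastore definition in the exported ATL begins with a
# line like `CREATE DATASTORE <name>`; tune the regex to your actual export.
DATASTORE_PATTERN = re.compile(r'CREATE\s+DATASTORE\s+"?(\w+)"?', re.IGNORECASE)

def extract_datastores(atl_text: str) -> list:
    """Return all datastore names found in the ATL file content."""
    return DATASTORE_PATTERN.findall(atl_text)

sample_atl = """
CREATE DATASTORE DST_SAP
CREATE DATASTORE DS_STG_MGMT_LKP
"""
print(extract_datastores(sample_atl))  # ['DST_SAP', 'DS_STG_MGMT_LKP']
```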

 


SAP Data Intelligence Pipeline processing SAP Data Services ATL file

We have set up the following configurations for the different operators in the above pipeline:


Source Connection to WASB (Windows Azure Storage BLOB) for storing ATL files of SAP BODS

 


Read File Operator Configuration

After that we have included two converters: ToBlob followed by ToString.

 


Python Operator Configuration

 


Python Operator Script

We can enhance this further as needed, for example by supplying the search string dynamically.
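Supplying the search string dynamically could look like the sketch below. It assumes the Gen1 Python3 operator API (`api.config`, `api.set_port_callback`, `api.send`) and a hypothetical configuration property named `search_pattern`; the pure helper is separated out so it can be tested outside Data Intelligence:

```python
import re

def find_matches(atl_text, pattern):
    """Pure helper: return all names in atl_text matching the given pattern."""
    return re.findall(pattern, atl_text)

def on_input(data):
    # `api.config.search_pattern` is a hypothetical operator config property;
    # it lets the pattern be set per graph run instead of being hard-coded.
    matches = find_matches(data, api.config.search_pattern)
    api.send("output", "\n".join(matches))

# Inside the SAP DI Python operator, register the callback on the input port:
# api.set_port_callback("input", on_input)
```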

After the Python operator we have included a ToMessageConverter.


HANA Operator Configurations


HANA Operator Configurations continued


HANA Operator Configurations continued

 


HANA Operator Configurations continued


Output After Processing the ATL File, Fetched Datastore Names

 

In the same way, we have loaded the organization's Data Services naming conventions into a table.


SAP Data Intelligence Pipeline Processing Naming Standard Document of an Organization

 


Read File Operator Configuration


HANA Operator Configurations

 


HANA Operator Configurations continued

 


HANA Operator Configurations continued


HANA Operator Configurations continued

 


Output of the Data Services naming conventions stored in a table

 

To continuously check whether the standards are being followed, we can define rules and report the failed data to the respective users via e-mail or dashboard analytics.
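The core of such a rule is splitting the fetched datastore names into passed and failed rows, as the Metadata Explorer rules dashboard does. A minimal sketch, with an illustrative name list echoing the run described later in this blog:

```python
import re

# Rule from the naming standard: datastore names must start with "DST_".
RULE = re.compile(r"^DST_")

def evaluate(names):
    """Split datastore names into (passed, failed) rows against the rule."""
    passed = [n for n in names if RULE.match(n)]
    failed = [n for n in names if not RULE.match(n)]
    return passed, failed

# Illustrative sample: two compliant names, two violations.
names = ["DST_SAP", "DST_LOOKUP", "DS_STG_MGMT_LKP", "SAP_DS"]
passed, failed = evaluate(names)
print(failed)  # ['DS_STG_MGMT_LKP', 'SAP_DS']
```

The failed list is what would be routed to the Send Email operator or surfaced on the dashboard.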

 


Rules Dashboard

 

 


Rules Dashboard continued: Depicting the trend and keeping an eye on the development standard being followed or not

 


Definition of the Rule

 

Output of the Rule to showcase those Datastore names which are not following Naming standards

The following datastores do not follow the naming standards. We ran the rule multiple times; after the second-to-last run we asked the user to correct the datastore name, and in the last run DS_STG_MGMT_LKP moved from the failed rows to the passed rows.

 


Output showing failed rows of Initial run

 


Output of Last run after the correction of Datastore name by the Developer

 

So, 2 out of 4 datastores still do not comply with the organization's SAP Data Services naming standards.

We can then follow up with the user, either through e-mail configured in the SAP DI pipeline using the Send Email operator, or through a dashboard to which the user is given access.

 

Thank you for reading this blog 😊. Feel free to try it on your own and share your feedback with us.

 

Important assets

Related Blog

Overview of SAP HANA Operators in SAP Data Intelligence
