Hana as a Service Integration with CPI
Introduction: In this blog, we will see how to create a HANA DB instance in the SAP BTP cockpit and integrate it with CPI to insert and select records in a JDBC scenario. I have used BTP trial instances for both CPI and the HANA Cloud service.
Create Hana DB Instance:
- Go to Subaccount -> Cloud Foundry -> Spaces and click on "SAP HANA Cloud".
- Click on the Create button and a new window will open as below.
- Provide a meaningful instance name and a password as per the password policies.
- Continue to the next step; in step 3 (Advanced Settings), set the Allowed connections to "Allow all IP addresses" or as per your requirement.
- If the DB creation is successful, you can see the below screen.
- Click the Actions icon and select the Execute SQL statement option. A new window will open. Now execute the below statements to create the schema and table. The schema is optional; the default schema is the same as the username.
- CREATE SCHEMA TESTDB;
- CREATE COLUMN TABLE TESTDB.EMPLOYEES (EMPID CHAR(10) PRIMARY KEY, ENAME CHAR(30) NOT NULL, LEVEL CHAR(5), DESIGNATION CHAR(30));
- Before moving to the CPI configuration, click on the DB name and then Properties. A popup will open showing the DB hostname and DB ID as below.
- Go to the CPI Overview tab and click on the JDBC Material tile. Under JDBC Data Source, click on the Add button.
- Do the configuration as below: provide a meaningful name, which will be referenced in the JDBC receiver channel, and the other properties as follows.
- Database Type: Hana as a Service
- User/Password: A valid user created in the HANA DB. I am using the default user "DBADMIN" with its password.
- JDBC URL: Format - "jdbc:sap://<hostname>:<port>/?databaseName=<database-name>&encrypt=true"
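For example, with the hostname taken from the instance properties, the URL looks like the sketch below (the hostname is a made-up placeholder; SAP HANA Cloud accepts SQL connections on port 443, and the optional databaseName parameter is omitted here):

```
jdbc:sap://abc12345-6789.hana.trial-us10.hanacloud.ondemand.com:443/?encrypt=true
```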
IFlow to Insert records in DB:
- The JDBC channel supports the XML SQL format, which is similar to the PI default message type structure. It supports INSERT, UPDATE, SELECT, DELETE, etc. For more information, click here
- Set the payload in a Content Modifier in the format the JDBC channel accepts, and use a General Splitter, since the adapter cannot process more than one record in a single call. For more information on limitations and the databases supported in the Neo and Cloud Foundry environments, click here
- In the JDBC receiver channel, provide the name of the JDBC data source created in the Overview.
- The response can be logged with a script, but since only one record is inserted per call, the splitter produces as many attachments as there are records in the input.
Note: Since I created the table in the TESTDB schema, the table name is given as TESTDB.EMPLOYEES.
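To make the insert payload concrete, a single-record body in the XML SQL format can look like the sketch below (the root and statement element names are free-form, and the column values here are made-up examples):

```xml
<root>
  <StatementName>
    <dbTableName action="INSERT">
      <table>TESTDB.EMPLOYEES</table>
      <access>
        <EMPID>1001</EMPID>
        <ENAME>John Doe</ENAME>
        <LEVEL>L2</LEVEL>
        <DESIGNATION>Consultant</DESIGNATION>
      </access>
    </dbTableName>
  </StatementName>
</root>
```

With the General Splitter splitting on the statement element, each call to the adapter then carries exactly one such record.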
Records in Database:
IFlow to Select records from DB:
- Using a Content Modifier, set the SELECT statement in the message body as below. The configuration of the JDBC channel is the same as above.
- SELECT EMPID, ENAME, LEVEL, DESIGNATION FROM TESTDB.EMPLOYEES_SELECT WHERE FLAG = 'N'
- Using a Sequential Multicast, set up two branches. Branch 1 will be used to update the flag field so that the same records won't be fetched again.
- UPDATE TESTDB.EMPLOYEES_SELECT SET FLAG = 'Y' WHERE FLAG = 'N';
- Branch 2 will be used for the actual message processing. Here I have logged the payload for reference.
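The Branch 1 update can also be expressed in the XML SQL format instead of plain SQL; a sketch of the equivalent body (the access element carries the columns to set, and the key element builds the WHERE condition):

```xml
<root>
  <StatementName>
    <dbTableName action="UPDATE">
      <table>TESTDB.EMPLOYEES_SELECT</table>
      <access>
        <FLAG>Y</FLAG>
      </access>
      <key>
        <FLAG>N</FLAG>
      </key>
    </dbTableName>
  </StatementName>
</root>
```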
Response XML from JDBC:
The response for the UPDATE query will be "SQL statement returned OK, 2 row(s) affected". No XML is returned in this scenario.
- Batch statement execution is not supported. Only the first statement will be executed; the others will be ignored.
- Batch record processing is not supported in the insert scenario. If multiple records are sent without splitting, only the first record will be inserted into the DB and the others will be ignored.