Anil Yakkali

HANA as a Service Integration with CPI

Introduction: In this blog, we will see how to create an SAP HANA database instance in the SAP BTP cockpit and integrate it with CPI to insert and select records in a JDBC scenario. I have used BTP trial instances for both CPI and the HANA cloud service.

Create Hana DB Instance: 

  • Go to Subaccount -> Cloud Foundry -> Spaces and click on “SAP HANA Cloud”.

  • Click the Create button; a new window opens, as shown below.

  • Provide a meaningful instance name and a password that meets the password policies.
  • Continue to the next step. In step 3 (Advanced Settings), set Allowed connections to “Allow all IP addresses”, or restrict it based on your requirement.
  • If the DB creation is successful, you will see the screen below.

  • Click the Actions icon and select the Execute SQL statement option; a new window opens. Now execute the statements below to create a schema and a table. The schema is optional; the default schema is the same as the username.
    •  CREATE SCHEMA TESTDB;
    • CREATE COLUMN TABLE TESTDB.EMPLOYEES (EMPID CHAR(10) PRIMARY KEY, ENAME CHAR(30) NOT NULL, LEVEL CHAR(5), DESIGNATION CHAR(30));
  • Before moving to the CPI configuration, click on the DB name and then on Properties. A popup opens showing the DB hostname and DB ID, as below.

SCPI Configuration:

  • Go to the CPI Overview tab and click on the JDBC Material tile. Under JDBC Data Source, click the Add button.
  • Do the configuration as below: a meaningful name, which will be used in the JDBC receiver channel, and the other properties as follows.
    • Database Type: Hana as a Service
    • User/Password: A valid user created in the HANA DB; I am using the default user “DBADMIN” with its password.
    • JDBC URL: Format – “jdbc:sap://<hostname>:<port>/?databaseName=<database-name>&encrypt=true”
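For reference, a filled-in URL for a HANA Cloud trial instance typically looks like the following. The hostname below is a hypothetical placeholder, not a value from this tenant; HANA Cloud instances listen on port 443:

```
jdbc:sap://a1b2c3d4-0000-4abc-8def-1234567890ab.hana.trial-us10.hanacloud.ondemand.com:443/?encrypt=true
```

Both the hostname and the port can be copied from the instance's Properties popup mentioned above.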

IFlow to Insert records in DB: 

  • The JDBC channel supports the XML SQL format, which is similar to the default PI message type structure. It supports INSERT, UPDATE, SELECT, DELETE, etc. For more information, see the SAP Cloud Integration documentation.
  • Set the payload in a Content Modifier in the format the JDBC channel accepts, and use a General Splitter, since the adapter cannot process more than one record in a single call. For more information on the limitations and the databases supported in the Neo and Cloud Foundry environments, see the SAP documentation.
  • In the JDBC receiver channel, provide the name of JDBC resource created in overview.
  • The response can be logged with a script, but since only one record is inserted per call, the log will contain as many attachments as there are records in the input.

Request XML:

Note: Since I created the table in the TESTDB schema, the table name is given as TESTDB.EMPLOYEES.
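The request payload screenshot from the original post is not reproduced here. As a sketch, an XML SQL INSERT request for the EMPLOYEES table would look roughly like the following; the root and statement element names are free-form, and the record values are made up for illustration:

```xml
<root>
  <StatementName>
    <dbTableName action="INSERT">
      <table>TESTDB.EMPLOYEES</table>
      <access>
        <EMPID>1001</EMPID>
        <ENAME>John Doe</ENAME>
        <LEVEL>L1</LEVEL>
        <DESIGNATION>Consultant</DESIGNATION>
      </access>
    </dbTableName>
  </StatementName>
</root>
```

The action attribute selects the operation, the table element names the target table (including the schema prefix, as noted above), and each child of access maps to a column of the table.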

Records in Database:

IFlow to Select records from DB: 

  • Using a Content Modifier, set the SELECT statement in the message body as below. The configuration of the JDBC channel is the same as above.
    • SELECT EMPID, ENAME, LEVEL, DESIGNATION FROM TESTDB.EMPLOYEES_SELECT WHERE FLAG = 'N'
  • Using a sequential multicast, set up 2 branches. Branch 1 is used to update the flag field so that the same records won't be fetched again.
    • UPDATE TESTDB.EMPLOYEES_SELECT SET FLAG = 'Y' WHERE FLAG = 'N';
  • Branch 2 is used for the actual message processing. Here I logged the payload for reference.
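The blog does not show the DDL for the EMPLOYEES_SELECT table used in this scenario. Assuming it mirrors EMPLOYEES with an additional FLAG column (an assumption, since the original screenshot is not available), it could be created as:

```sql
-- Hypothetical DDL: same columns as EMPLOYEES plus a FLAG processed-indicator
CREATE COLUMN TABLE TESTDB.EMPLOYEES_SELECT (
  EMPID CHAR(10) PRIMARY KEY,
  ENAME CHAR(30) NOT NULL,
  LEVEL CHAR(5),
  DESIGNATION CHAR(30),
  FLAG CHAR(1) DEFAULT 'N'
);
```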

Response XML from JDBC:
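The response screenshot is not reproduced here. For a SELECT statement, the adapter returns the result set as an XML document with one row element per record; the shape below is an illustrative sketch, and the element names and values are not taken from the original screenshots:

```xml
<root>
  <row>
    <EMPID>1001</EMPID>
    <ENAME>John Doe</ENAME>
    <LEVEL>L1</LEVEL>
    <DESIGNATION>Consultant</DESIGNATION>
  </row>
  <row>
    <EMPID>1002</EMPID>
    <ENAME>Jane Doe</ENAME>
    <LEVEL>L2</LEVEL>
    <DESIGNATION>Senior Consultant</DESIGNATION>
  </row>
</root>
```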

The response for the UPDATE query will be “SQL statement returned OK, 2 row(s) affected”. No XML is returned in this scenario.

Limitations:

  1. Batch statement execution is not supported. Only the first statement will be executed; the others are ignored.
  2. Batch record processing is not supported in the insert scenario. If multiple records are sent without splitting, only the first record is inserted into the DB and the others are ignored.

Happy Learning!

      10 Comments
      Yogananda Muthaiah

      Great Article and keep posting ...

      Denys van Kempen

      Clear and concise, Anil!

      One minor remark: to avoid any confusion, consider using the actual product names (and tag accordingly).

      HANA-as-a-Service (HaaS) was used in the past to reference the SAP HANA Service.

      For those less familiar and for the differences, see

      • https://blogs.sap.com/2020/03/13/at-your-service-sap-hana-2.0-an-introduction-2/
      Anil Yakkali
      Blog Post Author

      Thanks, Denys, for the insight; I will keep it in mind.

      Sanjaya Kumar Bihari

      Nice blog Anil.

      M Nivas

      Hi Anil,

      Great Article and Very clear Explanation !

      Only one request from my end: kindly share the iFlow files. That would be a really great help!

      Thank you in advance!

      Sudhir Kamra

      Great blog Anil, appreciate it.

      Nan Wang

      Hi Anil,

      Nice blog! One further question: do you have any comments on the performance of the JDBC adapter? On HANA Cloud we can build an OData service easily. Which will be faster, the JDBC adapter or the OData service?

      Thanks so much.

      Anil Yakkali
      Blog Post Author

      Hi Nan,

      I would lean more towards OData in this case. If we look at the insert mechanism, the current JDBC adapter does not support batches of records, so we had to use a splitter, whereas the OData service supports batch processing. The number of calls to the DB is therefore considerably reduced.

      Also, because of pagination in the OData service, records are fetched and processed as pages, whereas for the execution of a SELECT query, if the number of records is huge, the adapter may run into a timeout error.

      Regards,

      Anil

      Dinesh M

      Hi,

      Great blog! Could you let me know how we can retrieve the records from the HANA DB?

      Thanks,

      Dinesh

      Artiom Sargesjan

      Hi Anil,

      According to OSS note 3073748, this functionality is only available for Cloud Integration tenants in the Neo environment. In the blog it looks like you are working in the Cloud Foundry environment, but you don't mention the Access Token, which is required in that case.

      Any comment on this?

      JDBC Data Source

      Thanks,

      Artiom