SAP has recently published a new release of CPI with some new features, including JDBC on-premise connectivity. This is a long-awaited feature that will be useful in many scenarios. This blog post covers the steps that you need to follow to configure and test the connectivity in an end-to-end scenario.

Note: Some steps are skipped because the documentation and other blog posts already describe them in sufficient depth. In any case, links to those blog posts and documentation are provided.

Index



  1. Cloud Connector configuration

  2. Data Source configuration

  3. iFlow configuration

  4. Test E2E scenario

  5. Troubleshooting


 

1. Cloud Connector configuration


This section covers the configuration of the mapping to the internal system, which allows inbound calls from CPI to be redirected to the proper internal host and port.


The initial configuration of the Cloud Connector can be found in this post: How to setup Cloud Connection.

Once everything is set up on the on-premise server, it's time to add the mapping that connects the cloud to the on-premise system. As you can see in the screenshot below, I have added a new TCP connection.


Cloud To On-Premise access control table


The new inbound TCP connection will use the following data:

Back-end Type: Non-SAP System
Protocol: TCP¹
Virtual Host: <virtual host>² (e.g. mssqlserver)
Virtual Port: <virtual host port> (typically 1433)
Internal Host: <your internal host> (e.g. your-server.com)
Internal Port: <your internal host port> (typically 1433)
Principal Type: None
Description: <up to you>
Check Internal Host: Checked/Unchecked (you can always check it manually later)

 

¹TCP is the protocol used for default JDBC connections.

²You can use any virtual host name, since this is the address you will call from CPI. In this scenario, the virtual host mirrors the internal host name for simplicity, but it is recommended to use a different naming convention so that no back-end details are exposed.


Mapping virtual to internal system data
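
Before moving on, it can be useful to verify that the internal host and port are actually reachable from the machine where the Cloud Connector runs. The following snippet is only a minimal sketch of such a check in plain Java; the host name and port are placeholders and must match the Internal Host and Internal Port values from the table above.

import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

// Minimal reachability check. Run it on the Cloud Connector machine with the
// same Internal Host / Internal Port values used in the access control entry.
public class SqlServerPortCheck {
    public static void main(String[] args) {
        String internalHost = "your-server.com"; // placeholder internal host
        int internalPort = 1433;                 // default SQL Server port
        try (Socket socket = new Socket()) {
            socket.connect(new InetSocketAddress(internalHost, internalPort), 5000);
            System.out.println("TCP connection to " + internalHost + ":" + internalPort + " succeeded.");
        } catch (IOException e) {
            System.out.println("TCP connection failed: " + e.getMessage());
        }
    }
}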


 

2. Data Source configuration


Now that the Cloud Connector mapping is in place, it's time to add the MS SQL Server Data Source in CPI.

You just have to follow some simple steps to configure the JDBC Data Source:

2.1. Download SQL Server Driver

2.2. Upload/Deploy the Driver into CPI's JDBC Material

2.3. Create the JDBC Data Source

These steps are explained in more detail in the SAP documentation section Managing JDBC Material.

2.1. Download SQL Server Driver


Follow this link and extract the zip file; you should see something similar to the screenshot below:


MS SQL Server Driver folder
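
As an optional sanity check before uploading anything to CPI, the extracted driver can be tested locally against the database. The snippet below is only a sketch: it assumes the jar is on the classpath and that the machine running it can reach the internal host directly; the URL, database name, and credentials are placeholders.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Local smoke test of the downloaded driver and the database credentials.
// Run it with the extracted jar on the classpath, for example:
//   javac JdbcDriverSmokeTest.java
//   java -cp .:mssql-jdbc-8.4.1.jre8.jar JdbcDriverSmokeTest
public class JdbcDriverSmokeTest {
    public static void main(String[] args) throws Exception {
        // Placeholder host, database, and credentials.
        String url = "jdbc:sqlserver://your-server.com:1433;DatabaseName=TestDB;";
        try (Connection conn = DriverManager.getConnection(url, "db_user", "db_password");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT 1")) {
            if (rs.next()) {
                System.out.println("Driver and credentials work, SELECT 1 returned " + rs.getInt(1));
            }
        }
    }
}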



2.2. Upload/Deploy the Driver into CPI's JDBC Material


Now go to the CPI cockpit and navigate to "Operations View" > "JDBC Material" > "JDBC Driver".

On this screen you can upload new drivers if you have the correct roles. Click Add; in the pop-up that appears, select Microsoft SQL Server as the Database Type, choose the jar file, and click Deploy.


Select database type


Note: I have chosen mssql-jdbc-8.4.1.jre8.jar

After deploying the driver, a new entry will appear in the JDBC Driver list, as in the screenshot below.


JDBC Driver successfully deployed


It will take a few seconds for the status to change to "Started", but once that is done, you don't have to do anything else in this menu.

 

2.3. Create the JDBC Data Source


Now that the driver has been deployed, the next step is to create the JDBC Data Source that will be consumed from the iFlow. Switch to the JDBC Data Source tab and click the Add button. A new form will pop up that needs to be filled in.

The configuration of the JDBC Data Source will have these fields:

Name: <the name of the Data Source> (e.g. TEST_DATA_SOURCE)
Description: <meaningful description>
Database Type: Microsoft SQL Server
User: <username of the SQL Server database>
Password: <password of the selected user>
Repeat Password: <same as above>
JDBC URL¹: jdbc:sqlserver://<Virtual Host>:<Virtual Port>;DatabaseName=<Schema Name>;
Location ID: <only if you have more than one Cloud Connector configured>
Cloud Connector: Checked (it must be checked so the connection goes through the Cloud Connector)

¹Note that the virtual host and virtual port must match the ones configured earlier in the Cloud Connector.
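
For example, with the virtual host mssqlserver, virtual port 1433, and a hypothetical database schema called TestDB, the JDBC URL would look like this:

jdbc:sqlserver://mssqlserver:1433;DatabaseName=TestDB;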

The result will be something like this:


Add new Data Source


Once you have filled in all the required fields, click Deploy; the configuration will be deployed and the Data Source created. In the screenshot below you can see how it should look after deployment. The Data Source will initially appear as "Stored", but after a few seconds its status should change to "Deployed".


Data Source successfully added


 

3. iFlow configuration


Everything is now set up to consume database data from a CPI iFlow using the new connection. This section won't cover how to create the package and the iFlow, so those steps are skipped.

Once the iFlow is created, connect the Sender to the Start point with an HTTPS connection and expose the URL endpoint as you wish. Add a new Request-Reply step and connect it to the SQL Server Receiver.

HTTPS Connection


This connection allows us to call the process whenever we want. Just don't forget to assign the required roles to your user (or to the one you will use to call the process; we will need it later).


HTTPS Connection



JDBC Connection


This connection forwards whatever the CPI message body contains to the database and tries to process it. Also, since we have exposed the process over HTTP, the result of each call is returned to the caller.


JDBC Connection


The whole iFlow will look like the following screenshot:


Integration Process


Once everything is configured in the iFlow, save the process and deploy it. If all goes well, you should have an available endpoint that represents your iFlow.

If you want to get the entry point of the Integration Process, go to "Operations View" > "Manage Integration Content" > "All". Select your iFlow and copy the endpoint shown there.


Classical option to get the endpoint


Another option to get the URL is the ConVista CPI Helper tool: just click Info while inside the iFlow and you will see the URL there.



ConVista Helper tool


Note: This process follows the Request-Reply pattern. If you want to learn more about patterns, you can follow my colleague bhalchandra.wadekar and his series of blog posts about Enterprise Integration Patterns (EIPinCPI).

4. Test E2E scenario


With the iFlow deployed, you can consume the endpoint using Postman or any other REST client tool (in this case I will be using Postman).


Postman insert call


As the message above shows, you can send any SQL statement and it will be executed in the database. After the insert has been triggered, you can see that a select statement also returns values.


Postman Select
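
If you prefer a scripted call over Postman, the snippet below is a minimal sketch of the same kind of request in Java. The endpoint URL, credentials, table name, and SQL statement are placeholders; the user must have the roles you assigned to the HTTPS endpoint.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

// Sends an SQL statement as the request body to the deployed iFlow endpoint.
public class CpiJdbcCall {
    public static void main(String[] args) throws Exception {
        // Placeholder endpoint and credentials; use the endpoint copied from
        // "Manage Integration Content" and a user with the required roles.
        String endpoint = "https://my-tenant.cfapps.eu10.hana.ondemand.com/http/sqlserver";
        String basicAuth = Base64.getEncoder()
                .encodeToString("user:password".getBytes(StandardCharsets.UTF_8));
        String sql = "SELECT TOP 10 * FROM dbo.Employees"; // hypothetical table

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(endpoint))
                .header("Authorization", "Basic " + basicAuth)
                .POST(HttpRequest.BodyPublishers.ofString(sql))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode());
        System.out.println(response.body());
    }
}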



5. Troubleshooting


I will add more issues, and how to solve them, as I come across them. So far, these are the most important ones I have faced.





  • If you are getting this issue in the iFlow:


com.sap.it.rt.adapter.http.api.exception.HttpResponseException: An internal server error occured: The TCP/IP connection to the host localhost, port 1105 has failed. Error: "The driver received an unexpected pre-login response. Verify the connection properties and check that an instance of SQL Server is running on the host and accepting TCP/IP connections at the port. This driver can be used only with SQL Server 2005 or later.".

--> You have to change the protocol from TCP SSL to TCP in the Cloud Connector mapping.





  • If you are getting this issue in the iFlow:


com.sap.it.rt.adapter.http.api.exception.HttpResponseException: An internal server error occured: Error while getting Datasource Service: ServiceReference is null.

--> The JDBC URL in the JDBC Data Source configuration is incorrect and probably has not been deployed properly: the port is missing from the JDBC URL.





  • If you are getting this issue in the iFlow:


com.sap.it.rt.adapter.http.api.exception.HttpResponseException: An internal server error occured: The port number 1108/CPI is not valid..

--> The JDBC URL in the JDBC Data Source configuration is incorrect and probably has not been deployed properly: ";DatabaseName=<DatabaseSchema>;" is missing from the JDBC URL or was not added correctly.





  • If you are not getting any response from the DB:


--> See this reply from david.ruizdeazua




 

Disclaimer: The setup described in this post is for experimental purposes only, and some of the configurations might need further security review.

This is my first post and it may contain some mistakes; any feedback is appreciated.