If I'm doing something for the first time and I know I will need to replicate it multiple times, I tend to take screenshots and notes for my future self. Since I got this question a couple of times in the last few weeks, I'm publishing those notes.

This post is about setting up Smart Data Integration from an on-premise SAP HANA instance (HANA Express in my case) to my tenant of SAP Cloud Platform, SAP HANA Service on Cloud Foundry.



(as usual, check your license agreements...)

Check the existing tutorials first


I will not duplicate information that I have already published here: https://developers.sap.com/group.haas-dm-sdi.html 

I use the FileAdapter there and the HanaAdapter here, so I will only cover the differences.

You will still need to set up the cross-container access, a user for the connection and a remote source. If you are struggling to understand these concepts and how the wiring fits together, you are not alone: here is a post with some drawings that will hopefully help.

Prereqs:



  • You have an instance of SAP HANA Service in Cloud Foundry (the trial HDI containers will unfortunately not work... this makes me sad too...)

  • You have access to the cockpit for your SAP HANA Service

  • You have a HANA instance that can be accessed from the outside world

  • You have whitelisted the IP address of your HANA express or on-premise instance

  • You have admin rights to create users in both HANA instances


Download the Data Provisioning Agent


I got it from the Software Downloads, but you can also get it from the SAP Development Tools. Just make sure you have version 2.3.5.2 or higher so you can connect through WebSockets.

At the time of writing this blog post, the version of the SDI adapter in HXE's download manager was not high enough, but that would have been the best option:
cd /usr/sap/HXE/home/bin
./HXEDownloadManager_linux.bin -X

You may already have it in your /Downloads folder in HXE too.

If your file ends with the extension ".SAR" or ".CAR", you will need to use SAPCAR to inflate it first. Otherwise, good old tar -xvzf on your HXE machine will do.
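For reference, this is roughly what the two options look like on the HXE machine (the file names below are placeholders for whatever you actually downloaded):

# .SAR / .CAR archives need SAPCAR to extract them
./SAPCAR -xvf <downloaded_dpagent_file>.SAR

# .tgz / .tar.gz archives only need tar
tar -xvzf <downloaded_dpagent_file>.tgz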

Check the ports


The DP Agent will ask for two ports to listen on. If you are running in the cloud like me, make sure you open those ports. The defaults are 5050 and 5051.
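My HXE instance runs on Google Cloud (more on that below), so for me this meant adding a firewall rule. A sketch of what that could look like with the gcloud CLI; the rule name and target tag are made-up examples, so adjust them to your own network setup:

# Open the default DP Agent listener ports (example names, GCP-specific)
gcloud compute firewall-rules create dpagent-listeners \
  --allow=tcp:5050-5051 \
  --target-tags=hxe-server \
  --description="SDI Data Provisioning Agent listener ports"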

Install the Data Provisioning Agent


My on-premise SAP HANA, express edition instance lives on Google Cloud, so I uploaded the extracted dpagent files first and these commands did the trick:
sudo su - hxeadm
cd /extracted_folder/HANA_DP_AGENT_20_LIN_X86_64
./hdbinst


I hit enter on everything as I wanted the default directory and ports, but of course you can change these.

Note the installation path is /usr/sap/dataprovagent.

Configure the Data Provisioning Agent


Set the environment variables first and navigate into the directory where you installed DPAgent:
export DPA_INSTANCE=/usr/sap/dataprovagent
cd $DPA_INSTANCE
cd bin



And start the configuration of the Data Provisioning Agent
 ./agentcli.sh --configAgent

Go with option 2 first, Start or Stop Agent:



Don't forget! You will need to have whitelisted the public IP address of your source machine for access to your HANA service.
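If you are not sure which public IP address the agent's traffic leaves from, you can check directly from the HXE machine. ifconfig.me is just one of several services that echo your public IP back; using it here is my choice, not anything the DP Agent requires:

# Print the public egress IP address of this machine
curl -s https://ifconfig.me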

Go back to the main menu, choose 5. SAP HANA Connection to log in to your SAP HANA Service:



You will need your instance ID and the websocket endpoint. You get these from the Cockpit of your database instance:



 

Once you have connected successfully, register the agent with option 7. Agent Registration



You should now see the agent in the Database Explorer in HANA as a Service



Finally, register the HanaAdapter with 8. Adapter Registration:


Create a user in the source database


You will need a user with the right access in the source database (HANA Express in my case). Something like:
create user remousr password P455w0Rd NO FORCE_FIRST_PASSWORD_CHANGE;
grant select, trigger on schema FOOD to REMOUSR;
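If you want to double-check what the new user ended up with before wiring up the remote source, the GRANTED_PRIVILEGES system view on the source database is handy (REMOUSR is the user from the snippet above):

-- See which privileges the replication user holds
SELECT GRANTEE, OBJECT_TYPE, SCHEMA_NAME, OBJECT_NAME, PRIVILEGE
FROM SYS.GRANTED_PRIVILEGES
WHERE GRANTEE = 'REMOUSR';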



 

 

I also want to replicate data from HDI containers, so I open the SQL console as Admin:



And run something like:
set schema "ANALYTICS_1#DI";
CREATE LOCAL TEMPORARY TABLE #PRIVILEGES LIKE _SYS_DI.TT_SCHEMA_PRIVILEGES;
INSERT INTO #PRIVILEGES (PRIVILEGE_NAME, PRINCIPAL_SCHEMA_NAME, PRINCIPAL_NAME) values ('SELECT','','REMOUSR');
CALL "ANALYTICS_1#DI".GRANT_CONTAINER_SCHEMA_PRIVILEGES(#PRIVILEGES, _SYS_DI.T_NO_PARAMETERS, ?, ?, ?);
DROP TABLE #PRIVILEGES;



Some notes:

  • ANALYTICS_1#DI: this schema name comes from right-clicking a table in the HDI container, choosing "Generate SELECT statement" and adding "#DI" to the schema name in the generated statement.

  • You can also grant roles, which is preferable; a sketch of that variant follows right after this list. More information here.
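The role-based variant follows the same pattern as the privilege grant above. A sketch, assuming your container exposes a role called FOOD_READ (replace it with a role that actually exists in your container):

set schema "ANALYTICS_1#DI";
CREATE LOCAL TEMPORARY TABLE #ROLES LIKE _SYS_DI.TT_SCHEMA_ROLES;
-- FOOD_READ is a made-up role name for illustration
INSERT INTO #ROLES (ROLE_NAME, PRINCIPAL_SCHEMA_NAME, PRINCIPAL_NAME) VALUES ('FOOD_READ', '', 'REMOUSR');
CALL "ANALYTICS_1#DI".GRANT_CONTAINER_SCHEMA_ROLES(#ROLES, _SYS_DI.T_NO_PARAMETERS, ?, ?, ?);
DROP TABLE #ROLES;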


Create a remote source


Finally, in the Database Explorer connected to your SAP HANA Service on Cloud Foundry, right-click on Remote Sources --> Add Remote Source:



You will see the agent and adapter you registered before.

Enter the host of the source machine (in my case, the public static IP address of my HXE machine).

Enter the port number where the indexserver of your source tenant database is listening (39015 for the default first tenant on HXE, 39013 for the SYSTEMDB, which you should not be using anyway).

If you don't know the port for your database, querying the system view M_SERVICES will tell you the SQL_PORT of your indexserver.
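For example, run this on the source database:

-- Find the SQL port of the indexserver on the source system
SELECT SERVICE_NAME, SQL_PORT
FROM SYS.M_SERVICES
WHERE SERVICE_NAME = 'indexserver';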

Set the credentials to Technical User and enter the user and password for the user you created in your source database:



 

And you are ready to roll!



If you want to consume the remote source you have just created in Replication Tasks, Flowgraph and Virtual Tables from an HDI container in Web IDE, follow the steps in this tutorial group: https://developers.sap.com/group.haas-dm-sdi.html 
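If you just want to smoke-test the remote source with plain SQL before going the HDI route, a virtual table works too. A sketch, assuming the remote source is named HXE, MY_SCHEMA is a schema you can write to, and FOOD.SALES exists on the source; browse Remote Sources in the Database Explorer to confirm the exact remote object path for your adapter:

-- Create a virtual table pointing at a remote table exposed by the remote source
CREATE VIRTUAL TABLE "MY_SCHEMA"."VT_SALES"
  AT "HXE"."<NULL>"."FOOD"."SALES";

-- Query it like a local table; the statement is federated to HANA Express
SELECT COUNT(*) FROM "MY_SCHEMA"."VT_SALES";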

Was this useful? Hit that "Like" button and remember to keep encouraging other community members who go out of their way to share their knowledge.