Lost In Replication? Or, How to Replicate Data to Your App in the Cloud
By Martina Galabova, Valeri Sedevchev, Petra Lazarova and Daniel Vladinov
So now what?
Easy-peasy: The solution is to replicate the necessary backend data to your productive SAP HANA instance on SAP HANA Cloud Platform and use the data in your application.
As you may have already guessed, in this blog you will learn how to get the data you need, when you need it, and put it where you need it! Or, in simple words (for developers only 🙂 ): how to replicate your data from an on-premise system to your productive SAP HANA instance in the cloud via SLT (SAP Landscape Transformation Replication Server).
The following diagram illustrates the process of replicating data on SAP HANA Cloud Platform:
What you have:
- On-premise ERP system that holds your data – let’s call it Source;
- Productive SAP HANA instance – let’s call it Target system, available in your SAP HANA Cloud Platform account;
- SAP HANA Cloud Platform SDK: it brings the needed DB tunnel tool;
- SLT system installed as on-premise to replicate the data from Source to Target system via the DB tunnel;
- SAP HANA Studio to configure and manage the Target schema, users, roles and privileges.
What you want:
- SAP HANA XS application that consumes and processes the backend ERP data
- Near real-time data availability
What to do:
- Set the Users.
- Configure the Replication.
- Go and Replicate.
- Enjoy! 🙂
You need to have:
- An ERP system with the latest version of the DMIS add-on (version 2011_1_*) installed and all related SAP Notes applied.
- An SLT server with the latest version of the DMIS add-on (version 2011_1_*) installed, all related SAP Notes applied, and an SAP HANA secondary database library installed. The latest information can be found in the SAP LT Replication Server blog.
NOTE: You can see the version of the client via the cockpit -> Database Schemas.
- An account on SAP HANA Cloud Platform
- A productive SAP HANA instance on the SAP HANA Cloud Platform
- Downloaded and installed the SAP HANA Studio Developer Edition in accordance with the SAP HANA Studio Installation Guide
- Downloaded the latest version of SAP HANA Cloud Platform SDK (Java Web), in accordance with the Installing the SDK page, both on your local and your SLT machine. (You will need Java version 1.6 or higher previously installed.)
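Before installing the SDK, you can quickly verify the Java prerequisite on your local and SLT machines. A minimal sketch for a Linux/macOS shell (the exact message text is our choice):

```shell
# Check that a Java runtime is available and print its version;
# the SDK requires Java 1.6 or higher.
if command -v java >/dev/null 2>&1; then
  java -version 2>&1 | head -n 1
else
  echo "Java not found - install JRE 1.6 or higher first"
fi
```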
I. Set the Users
Desired Result: A replication user with all the permissions required for the actual replication.
The procedure involves three types of users:
- Your SCN user, which you need to open the DB connectivity tunnel (for example, p1234567)
- Once the tunnel is open, it returns an SAP HANA DB user that grants you access to the dedicated SAP HANA instance on SAP HANA Cloud Platform via your SAP HANA Studio (it has the same name as your SCN user; in our example, p1234567)
- In the SAP HANA Studio, you create a third user (<DR_ADM_USER>) that you need for the replication configuration itself.
- Open a DB tunnel for a secure connection from your local machine to SAP HANA Cloud Platform. To do this, use your SCN user and enter the command neo open-db-tunnel. Then enter the required information, such as:
- Landscape host
- Your account
- Your landscape account User
- Target HANA database name – you will get this name once you order your productive HANA in your landscape.
neo open-db-tunnel -i <target_hanadb_name> -h hana.ondemand.com -a <my_account> -u <my_SCN_user>
For more information, see the Documentation.
RESULT: The DB tunnel makes your productive HANA system locally accessible via the properties displayed in the console output: host name (usually localhost), JDBC URL, instance number, the newly created database user, and its initial password.
- Open your SAP HANA Studio and add a new system, using the credentials provided by the tunnel. When prompted, change the initial password to a permanent one. Hint: this is the automatically created HANA database user with a name equal to your SCN user.
NOTE: Remember the specified permanent password – you’ll need it later to configure the Target credentials for replication.
- In the SAP HANA Studio, create a new user that will be used as SLT administrator. This automatically creates a new schema with the same name. Provide the name and initial password for it. In our example, we will use the name <DR_ADM_USER>. Note that the initial password will be changed in just a moment (step 5 below)!
- Configure the following settings for <DR_ADM_USER>:
- In the Granted Roles tab, add HCP_SYSTEM role.
- In the Object privilege tab, add REPOSITORY_REST SQL object with EXECUTE privilege.
- If SYS_REPL schema already exists, the EXECUTE, SELECT, INSERT, UPDATE and DELETE privileges should be grantable to others. SYS_REPL is a system schema used by SLT to store replication configuration data. Skip this step if SYS_REPL schema is not present!
- Save the <DR_ADM_USER> user settings.
- In the SAP HANA Studio, right-click on your system (from tab Systems), and select the “Add Additional User…” option to log on with the <DR_ADM_USER>. SAP HANA Studio will prompt you in a pop-up window to change the initial password.
A second entry will appear in the Systems tab, so you can log off from the p1234567 login session.
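For reference, the HANA Studio GUI steps above correspond roughly to the following statements in the SQL console. This is a sketch only: the user name DR_ADM_USER and the password are our example values, and granting the HCP_SYSTEM role requires that your own user is allowed to grant it.

```sql
-- Create the SLT administrator user; this also creates a schema with the same name.
-- The initial password is changed at first logon.
CREATE USER DR_ADM_USER PASSWORD Initial1Password;

-- Grant the platform role required on SAP HANA Cloud Platform.
GRANT HCP_SYSTEM TO DR_ADM_USER;

-- Grant EXECUTE on the REPOSITORY_REST SQL object.
GRANT EXECUTE ON SYS.REPOSITORY_REST TO DR_ADM_USER;
```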
II. Configure the Replication
Desired Result: A secure connection from your SLT system to the dedicated SAP HANA instance on SAP HANA Cloud Platform, with the replication configured so that everything is set and ready to go.
To establish this secure connection, you need to open a DB tunnel from your on-premise SLT machine to the dedicated SAP HANA instance on SAP HANA Cloud Platform. The replication configuration you create afterwards connects your on-premise ERP system to your dedicated HANA instance through this tunnel.
- Log in to your SLT machine.
- Download and install the latest version of SAP HANA Cloud Platform SDK (Java Web) in accordance with page Installing the SDK.
- Open a DB tunnel with your SCN user to your target schema. The tunnel allows only SQL statements from the owner of that schema to pass through it. To keep the tunnel permanently alive, add the additional parameter --timeout 0
neo open-db-tunnel -h hana.ondemand.com -a <my_account> -u <my_SCN_user> -i <target_hanadb_name> --timeout 0
For more information, see the Documentation.
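Since the tunnel must stay open for as long as the replication runs, you may want to start it in the background on the SLT machine so it survives your logout. A minimal sketch, assuming a Linux host with the neo tool on the PATH (the log file name db-tunnel.log is our choice):

```shell
# Start the DB tunnel in the background and keep it running after logout;
# --timeout 0 disables the automatic tunnel timeout.
nohup neo open-db-tunnel -h hana.ondemand.com -a <my_account> \
  -u <my_SCN_user> -i <target_hanadb_name> --timeout 0 \
  > db-tunnel.log 2>&1 &

# The connection details (host, instance number, user, initial password)
# appear in db-tunnel.log once the tunnel is open.
tail -f db-tunnel.log
```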
- Create Replication Configuration
NOTE: This document applies to SP05 of DMIS add-on. In previous versions, all the data is populated on one screen.
- Log in to your SLT system and open transaction LTR.
- In the opened browser window, log on and open the New Configuration wizard.
- Enter a replication name. The wizard creates a new schema with that name in the target HANA database in your SAP HANA Cloud Platform account; this schema will hold your replicated data.
- Choose Next.
- Specify Source System: Choose your RFC destination connecting to the Source (ERP) system.
NOTE: By default, only one replication is allowed per source system. If you want to create multiple replications (that is, replication to multiple SAP HANA systems) from your source system, select the “Allow multiple usage” checkbox.
- Specify Target System: In the System Data form, enter the connection details returned by the DB tunnel on the SLT machine (host name, instance number, user, and password).
- Choose Next.
- Continue the configuration as described in the Documentation.
III. Go and Replicate
Desired Result: Sitting back and watching the seamless and safe replication process, and enjoying your app anytime, anywhere.
- Open your SAP HANA Studio and switch to the Modeler Perspective.
- Open Quick Launch with your <DR_ADM_USER> user.
Your system and username are listed there. You can switch the system by choosing the Select System… button.
- Open the Data Provisioning… editor from the Data pane in the bottom-middle part of the screen.
- Wait for your system tables (DD0xx) to be replicated. You can press the refresh button to see the latest status of the replication tables. The tables will go through a few different actions and statuses. When they reach the status “In Process” with action “Replicate” – an indication that they are already (and will stay) replicated – you are ready to proceed with the replication of the tables needed for your cloud application.
- Choose the Replicate button and select the tables you want to replicate. For more details you can check this HANA Academy video.
- Enjoy! 🙂
So What Did You Just Do?
You can now quickly and easily replicate your on-premise data in SAP HANA Cloud Platform, where you can develop apps using this data.
Basically, you have conquered your own small data cloud in the big SAP HANA Cloud Platform family! 🙂
Martina, this is great. This is exactly the information I was looking to find.
I have one outstanding question, however, with regard to the DB tunnel. Typically in an on-premise HANA solution, as long as you are in the network, end users with BI clients (think SAP Lumira, Analysis for Office, etc.) would of course be able to connect to HANA via an ODBC/JDBC connection.
In the case of HCP, all connections require a tunnel to access (both Studio and SLT). Does this mean that business users wishing to use client based BI tools would also need a local DB tunnel configured like this to access HANA?
I am thinking the answer is yes, just thought I would ask.
The tunnel is needed to bridge the isolated networks (customer network and HCP network) in a secure way. ODBC/JDBC are not protocols that offer features such as encryption on their own; this is added by the SSL tunnel.
We will soon integrate the DB tunnel into the SAP HANA Cloud Connector. Once available, one SCC instance can be shared by all users of a company: you would then connect your local DB tools against the host + DB port of the cloud connector.
Best regards, Timo
Thanks for the info, very helpful. Technically, even when using the tunnel, the applications are still using the same protocol, except they 'see' the host as localhost when in reality it is a remote system, correct? And this would be true of ANY system wanting to connect, period.
Not too familiar with HCC quite yet, I'll have to dig into the documentation on that one. Many of these components are new to me! Any hints on when this change would happen? Just wondering how to communicate these components to potential customers, and it sounds like HCC would remove some complexity in the use case?
Yes, your description is correct. It would act as a local proxy to the remote HANA DB, and tools connect to the local proxy instead of directly to the DB.
I can't make concrete statements about the timeline, but we plan to add this feature soon, so we are speaking of a few months, i.e. the near future.
Fair enough. Until then, all access still requires a tunnel from each individual app needing access, i.e. you would need to set up the tunnel on each end-user machine?
Any preferred reading material around HCC?
Using the command line to open the DB tunnel locally to your application is the best approach at the moment. This form of the DB tunnel was meant for development scenarios, in which developers want to connect SAP HANA Studio to HANA in the cloud. For other scenarios, like yours, the upcoming solution with the cloud connector should be better suited, especially as the SCC-integrated solution will provide further enterprise-level features, like monitoring, auditing, etc.
Regarding further reading, I am only aware of the official help about the DB tunnel (https://help.hana.ondemand.com/help/frameset.htm?6930850a8f9a40489c01ed1aa381946d.html), and this blog describing how to use it with SLT for replication.
Best regards, Timo
Thank you for your time today, really helped me clear my doubts.
I was referring to HCC with regard to further reading, are there any particularly good docs you are aware of?
There is an operator's guide published on SCN: SAP HANA Cloud Connector: Operator's Guide
Furthermore, I recently documented a sample application in SCN:
SFlight sample application showing how to extend an on-premise ABAP system using JCo/RFC
A further question: how is what you describe for future development different from what is described here? I assume that only a handful of protocols, mainly HTTP/S, are currently supported?
Then what you describe here as future development is ODBC/JDBC, RFC support, etc., correct?
RFC is supported already in the Cloud Connector, JDBC/ODBC is supported in form of the DB tunnel / command line tool. As said, we plan to add this directly to the connector as well.
Thank you for entertaining my questions, just trying to understand the various component players here.
I hate to keep dragging this out, but what is the difference between Cloud Connector and Cloud Integration (HCI)?
HCI is a process integration solution – that's integration on a higher level of abstraction. The SAP HANA Cloud Connector and its counterpart in the cloud, the Connectivity Service, offer technical connectivity.
Thanks Markus, looks like this is specific to BODS too.
I would imagine that BODS too could leverage Cloud Connector once the DB tunnel is included too.
This is why I love SCN, so many people willing to pitch in for an answer. Many thanks gentlemen!
We have HANA on premise, hence the role HCP_SYSTEM does not exist. Can you tell us the role content, in order to re-build it on premise?