
By Martina Galabova, Valeri Sedevchev, Petra Lazarova and Daniel Vladinov

Here’s the deal:
You have a great application in the cloud that showcases your products or services to thousands of users on their mobile devices (or any other scenario you might have in mind). Yet the all-important data that feeds this cloud application (e.g. your public product catalog) is still maintained in your on-premise ERP backend system. You want to replicate only a subset of the data and, once such secure replication is set up, have it run automatically and in near real time. That way, all your users get fresh information about your latest and greatest products.

So now what?

Easy-peasy: The solution is to replicate the necessary backend data to your productive SAP HANA instance on SAP HANA Cloud Platform and use the data in your application.

As you may have already guessed, in this blog you will learn how to get the data you need, when you need it, and put it where you need it! Or, in simple words (for developers only 🙂 ): how to replicate your data from an on-premise system to your productive SAP HANA instance in the cloud via SLT (SAP Landscape Transformation Replication Server).

The following diagram illustrates the process of replicating data to SAP HANA Cloud Platform:

The situation:

What you have:

  • On-premise ERP system that holds your data – let’s call it Source;
  • Productive SAP HANA instance – let’s call it Target system, available in your SAP HANA Cloud Platform account;
  • SAP HANA Cloud Platform SDK: it brings the needed DB tunnel tool;
  • SLT system installed as on-premise to replicate the data from Source to Target system via the DB tunnel;
  • SAP HANA Studio to configure and manage the Target schema, users, roles and privileges.

What you want:

  • SAP HANA XS application that consumes and processes the backend ERP data
  • Near real-time data availability

What to do:

  1. Set the Users.
  2. Configure the Replication.
  3. Go and Replicate.
  4. Enjoy! 🙂



Let’s start!

I. Set the Users

Desired Result: A replication user with all the permissions required for the actual replication.

The setup procedure involves three types of users:

  • Your SCN user, which you need to open the DB connectivity tunnel (for example, p1234567)
  • The SAP HANA DB user returned once the tunnel is open; it grants you access to the dedicated SAP HANA instance on SAP HANA Cloud Platform via SAP HANA Studio (it has the same name as your SCN user; in our example, p1234567)
  • A third user (<DR_ADM_USER>) that you create in SAP HANA Studio and need for the replication configuration itself.


  1. Open a DB tunnel for a secure connection from your local machine to SAP HANA Cloud Platform. To do this, use your SCN user and enter the command neo open-db-tunnel. Then enter the required information, such as:
    • Landscape host
    • Your account
    • Your landscape account User
    • Target HANA database name – you will get this name once you order your productive HANA in your landscape.
      neo open-db-tunnel -i <target_hanadb_name> -h <landscape_host> -a <my_account> -u <my_SCN_user>
      For more information, see the Documentation.

      RESULT: The DB tunnel makes your productive HANA system locally accessible via the properties displayed in the console output: Host name (usually localhost), JDBC Url, Instance number, newly created database User, and Initial password for it.

  2. Open your SAP HANA Studio and add a new system, using the credentials provided by the tunnel. When prompted, change the initial password to a permanent one.  Hint: this is the automatically created HANA database user with a name equal to your SCN user.
    NOTE: Remember the specified permanent password – you’ll need it later to configure the Target credentials for replication.
  3. In the SAP HANA Studio, create a new user that will be used as SLT administrator. This automatically creates a new schema with the same name. Provide the name and initial password for it. In our example, we will use the name <DR_ADM_USER>. Note that the initial password will be changed in just a moment (step 5 below)!
  4. Configure the following settings for  <DR_ADM_USER> :
    1. In the Granted Roles tab, add HCP_SYSTEM role.
    2. In the Object privilege tab, add REPOSITORY_REST SQL object with EXECUTE privilege.
    3. If the SYS_REPL schema already exists, make sure the EXECUTE, SELECT, INSERT, UPDATE, and DELETE privileges on it are grantable to others. SYS_REPL is a system schema used by SLT to store replication configuration data. Skip this step if the SYS_REPL schema is not present!
    4. Save the <DR_ADM_USER> user settings.
  5. In the SAP HANA Studio, right-click on your system (from tab Systems), and select the “Add Additional User…” option to log on with the <DR_ADM_USER>. SAP HANA Studio will prompt you in a pop-up window to change the initial password.
    A second entry will appear in the Systems tab, so that you can log off from the p1234567 login session.
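For reference, the user setup from steps 3–4 can also be scripted instead of clicking through the Studio UI. The sketch below is not part of the original walkthrough: it writes the equivalent SQL to a file, and the (commented-out) hdbsql call would run it against the tunnel endpoint. The password and port are placeholders; port 30015 assumes instance number 00 following the 3<NN>15 convention, so use the instance number your tunnel actually returned.

```shell
# Sketch only: script the <DR_ADM_USER> setup from Section I, steps 3-4.
# HCP_SYSTEM and SYS.REPOSITORY_REST are the role and SQL object named above;
# the password is a placeholder you should replace.
cat > setup_dr_adm_user.sql <<'SQL'
CREATE USER DR_ADM_USER PASSWORD "InitialPwd1";
GRANT HCP_SYSTEM TO DR_ADM_USER;
GRANT EXECUTE ON SYS.REPOSITORY_REST TO DR_ADM_USER;
SQL
# Run against the open DB tunnel with your HANA DB user, e.g.:
# hdbsql -n localhost:30015 -u p1234567 -p '<permanent_password>' -I setup_dr_adm_user.sql
```

As in the Studio flow, the initial password is changed at first logon (step 5 above).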

II. Configure the Replication

Desired Result: A secure connection from your SLT system to the dedicated SAP HANA instance on SAP HANA Cloud Platform, and a replication configuration, so that everything is set and ready to go.

To establish a secure connection from your on-premise SLT system to your dedicated SAP HANA instance on SAP HANA Cloud Platform, you have to open a DB tunnel from the SLT machine. You then create a replication configuration that connects your on-premise ERP system to the dedicated HANA instance on the platform.


  1. Log in to your SLT machine.
  2. Download and install the latest version of the SAP HANA Cloud Platform SDK (Java Web), as described on the Installing the SDK page.
  3. Open a DB tunnel with your SCN user to your target schema. The tunnel allows only SQL statements from the owner of that schema to pass through it. To keep the tunnel permanently alive, add the additional parameter --timeout 0:
    neo open-db-tunnel -h <landscape_host> -a <my_account> -u <my_SCN_user> -i <target_hanadb_name> --timeout 0
    For more information, see the Documentation.
  4. Create Replication Configuration
    This section applies to SP05 of the DMIS add-on. In previous versions, all the data is entered on one screen.
    1. Log in to your SLT system and open transaction LTR.
    2. In the opened browser window, log on and open the New Configuration wizard.
    3. Enter a replication name. The wizard will create a new schema with that name in the target HANA box in your SAP HANA Cloud Platform account; it holds your replicated data.
    4. Choose Next.
    5. Specify Source System: Choose your RFC destination connecting to the Source (ERP) system.
      NOTE: By default, only one replication is allowed per source system.  If you want to create multiple replications (that is, replication to multiple SAP HANA systems) from your source system, select the “Allow multiple usage” checkbox.
    6. Specify Target System: In the System Data form, enter the following:
      • Administration User Name:<DR_ADM_USER>
      • Password:  the password for <DR_ADM_USER>
      • Host Name: localhost – the tunnel enables connectivity to the cloud via requests to localhost
      • Instance Number: the instance number returned by the DB tunnel you opened from your SLT machine
    7. Choose Next.
    8. Continue the configuration as described in the Documentation.
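The tunnel from step 3 has to stay open for as long as the replication runs, so on the SLT host it can be convenient to wrap the command in a small launcher and start it detached. A sketch only, assuming the neo tool from the SDK is on PATH; the bracketed values are the same placeholders as above:

```shell
# Sketch: launcher for the long-lived DB tunnel on the SLT machine.
cat > start_tunnel.sh <<'EOF'
#!/bin/sh
# Args: landscape_host account scn_user target_hanadb_name
exec neo open-db-tunnel -h "$1" -a "$2" -u "$3" -i "$4" --timeout 0
EOF
chmod +x start_tunnel.sh
# Start it detached, logging to tunnel.log:
# nohup ./start_tunnel.sh <landscape_host> <my_account> <my_SCN_user> <target_hanadb_name> > tunnel.log 2>&1 &
```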

III. Go and Replicate

Desired Result: Sitting back and enjoying the seamless and safe replication process, and enjoying your app anytime and from anywhere.


  1. Open your SAP HANA Studio and switch to the Modeler Perspective.
  2. Open Quick Launch with your <DR_ADM_USER> user.
    Your system and username are listed there. You can switch the system by choosing the Select System… button.
  3. Open the Data Provisioning… editor from the Data pane in the bottom-middle part of the screen.
  4. Wait for your system tables (DD0xx) to be replicated. You can press the refresh button to see the latest status of the replication tables. The tables will go through a few different actions and statuses. When they reach status “In Process” with action “Replicate” – an indication that they are already (and will stay) replicated – you are ready to proceed with the replication of the tables needed for your cloud application.
  5. Choose the Replicate button and select the tables you want to replicate. For more details, check this HANA Academy video.
  6. Enjoy! 🙂
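Once your tables show “In Process” / “Replicate”, a quick sanity check from the command line can confirm that data is actually arriving. A hedged sketch: the schema name MY_REPLICATION and the table MARA (the ERP material master) are placeholders for whatever configuration name and tables you chose; the hdbsql call runs over the open DB tunnel.

```shell
# Sketch: count rows in one replicated table via the open DB tunnel.
# Schema and table names are placeholders for your own configuration.
cat > check_replication.sql <<'SQL'
SELECT COUNT(*) AS ROWS_REPLICATED FROM "MY_REPLICATION"."MARA";
SQL
# hdbsql -n localhost:30015 -u DR_ADM_USER -p '<password>' -I check_replication.sql
```

A rising count on repeated runs is a good sign that the initial load and delta replication are working.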

So What Did You Just Do?

You can now quickly and easily replicate your on-premise data in SAP HANA Cloud Platform, where you can develop apps using this data.

Basically, you have conquered your own small data cloud in the big SAP HANA Cloud Platform family! 🙂


  1. Former Member

    Martina, this is great. This is exactly the information I was looking to find.

    I have one outstanding question however, with regard to the DB tunnel. Typically in an on premise HANA solution, as long as you are in the network, of course end users with BI clients (think SAP Lumira, Analysis for office, etc) would be able to connect to HANA via an ODBC/JDBC connection.

    In the case of HCP, all connections require a tunnel to access (both Studio and SLT). Does this mean that business users wishing to use client based BI tools would also need a local DB tunnel configured like this to access HANA?

    I am thinking the answer is yes, just thought I would ask.



    1. Former Member

      Hello Justin,

      the tunnel is needed to bridge the isolated networks (customer network and HCP network) in a secure way. ODBC/JDBC by themselves do not offer features such as encryption; this is added by the SSL tunnel.

      We will soon integrate the DB tunnel into the SAP HANA Cloud Connector. Once available, one SCC instance can be shared by all users of a company: you would then connect your local DB tools to the host + DB port of the cloud connector.

      Best regards, Timo

      1. Former Member

        Thanks for the info, very helpful. Technically, even when using the tunnel, the applications are still using the same protocol, except they ‘see’ the host as localhost when in reality it is a remote system, correct? And this would be true of ANY system wanting to connect, period.

        Not too familiar with HCC quite yet, I’ll have to dig into the documentation on that one. Many of these components are new to me! Any hints on when this change would happen? Just wondering how to communicate these components to potential customers, and it sounds like HCC would remove some complexity in the use case?



        1. Former Member

          Yes, your description is correct. It would act as a local proxy to the remote HANA DB, and tools connect to the local proxy instead of to the DB directly.

          I can’t make concrete statements about the timeline, but we plan to add this feature soon, so we are speaking of a few months, i.e. the near future.

          1. Former Member

            Fair enough. Until then, all access still requires a tunnel from each individual app needing access, i.e. you would need to set up a tunnel on each end-user machine?

            Any preferred reading material around HCC?

            Many thanks,


            1. Former Member

              Using the command-line to open the DB tunnel locally to your application is the best approach at the moment. This format of the DB tunnel was meant for development scenarios, in which developers want to connect the HANA Studio against HANA in the cloud. For other scenarios, like yours, the upcoming solution with the cloud connector should be better suited, also as the SCC integrated solution will provide further enterprise level features, like monitoring, auditing, etc.

              Regarding further reading, I am only aware of the official help about the DB tunnel, and this blog describing how to use it with SLT for replication.

              Best regards, Timo

              1. Former Member

                Thank you for your time today, really helped me clear my doubts.

                I was referring to HCC with regard to further reading, are there any particularly good docs you are aware of?



        1. Former Member

          Hi Justin,

          RFC is supported already in the Cloud Connector, JDBC/ODBC is supported in form of the DB tunnel / command line tool. As said, we plan to add this directly to the connector as well.

          Regards, Timo

            1. Markus Tolksdorf

              Hi Justin,

              HCI is a process integration solution – that’s integration on a higher level of abstraction. The SAP HANA Cloud Connector and its counterpart in the cloud, the Connectivity Service, are offering technical connectivity.

              Best regards,


