
Real-Time Replication between SAP HANA Cloud and SAP HANA on-premises | Hands-on Video Tutorials

 

In this blog series, we provide an update with the latest information on getting started with SAP HANA Cloud on SAP Cloud Platform.

  1. About SAP HANA Cloud
  2. SAP HANA Cloud Getting Started
  3. SAP HANA Cloud and SAP Business Application Studio
  4. HDI with SAP HANA Cloud
  5. SAP Analysis for Microsoft Office and SAP HANA Cloud
  6. Cloud Foundry Advanced
  7. SAP HANA Cloud and SAP BTP Trust
  8. Data masking and data anonymization
  9. Predictive Analysis Library (PAL) and Automated Predictive Library (APL)
  10. Remote data sources and virtual tables
  11. OData with SAP HANA Cloud
  12. SAP HANA Cloud Graph
  13. Role Attributes
  14. SAP HANA Cloud and Smart Data Integration (this blog post)

For more information about the free trial, see

For the new features overview posts, see

Questions? Post as comment.

Useful? Give us a like and share on social media.

Thanks!


Hands-On Video Tutorials

Smart Data Integration Master, Tahir Hussain Babar (a.k.a. Bob), just added a new series to the SAP HANA Cloud playlist on the SAP HANA Academy YouTube channel with nine new videos showing how we can configure real-time data replication between an SAP HANA platform system (on-premises) and SAP HANA Cloud.

In this blog, you will find the videos embedded with some additional information and resources. You can watch the nine video tutorials in a little over an hour. What you will learn:

  • How to set up the SAP Cloud Platform environment and create an SAP HANA Cloud instance
  • How to configure the SAP Web IDE development environment to work with SAP HANA Cloud
  • How to install and configure an SAP HANA smart data integration (SDI) Data Provisioning Agent (DPA) and register the SAP HANA Adapter
  • How to create a remote source and virtual table (runtime)
  • How to create an HDI container and access external objects
  • How to create a virtual table (design time)
  • How to create and execute a Flowgraph
  • How to create a replication task for real-time replication, with an SAP HANA platform system on-premises as source and SAP HANA Cloud as target

To bookmark the playlist on YouTube, go to > SAP HANA Cloud.


About SAP HANA Smart Data Integration 

Documentation

For the documentation, see the SAP HANA Cloud Getting Started Guide and the SAP HANA Administration Guide for how to configure remote data sources and virtual objects. See the SAP HANA Smart Data Integration guide(s) for information about how to use SAP Web IDE to develop the artifacts for real-time replication.

SAP Community Blog Posts

Highly recommended as well are the blog posts from Takao Haruki on this topic:

Learning Track

In this learning track, you will find a series of tutorials on how to incorporate SAP HANA Cloud into your existing data landscape based on SAP HANA on-premises.


1. Series Overview

The first video introduces the series with two short demos showing SAP HANA Smart Data Access (SDA) and virtual tables and SAP HANA Smart Data Integration (SDI) with flowgraphs for batch mode or real-time replication. An architecture discussion explains the systems, technologies, and activities.

For this demo, AWS is used as cloud provider and the following resources have been created:

  • SAP HANA Cloud (SAP Cloud Platform, Cloud Foundry environment)
  • SAP HANA, express edition: simulating the on-premises system
  • Windows system with Data Provisioning Agent (DPA)

In the subsequent videos, Bob explains how to set this up.

[0:00] – Introduction, about SAP HANA Cloud, documentation

[3:00] – Demo showing Smart Data Access virtual table

[4:00] – Demo showing Smart Data Integration with flowgraphs in SAP Web IDE for an ETL job, for execution in batch mode or for real-time replication.

[5:20] – Architecture explanation for DPA, Remote Source, HDI Containers, user-provided service, virtual tables, flowgraphs, and replication tasks


2. Create SAP HANA Cloud Instance

In the second video tutorial, we learn how we can create an SAP HANA Cloud instance.

The API endpoints for the Cloud Foundry Org and for the SAP HANA Cloud instance service are needed for later reference.

For the documentation, see

[0:00] – Connect to the SAP Cloud Platform

[1:00] – Create SAP Cloud Platform subaccount (AWS)

[2:00] – Enable Cloud Foundry and create space. API endpoint for Cloud Foundry org

[3:30] – Configure Entitlements: add service plans: hana, hdi-shared, MEMORY 4GB

[6:00] – Create instance

[7:30] – API endpoint for SAP HANA Cloud service

[8:00] – Connect to SAP HANA cockpit


3. Security and SAP Web IDE

In this video, a development user account is created in the SAP HANA Cloud database. We also enable SAP Web IDE as an SAP Cloud Platform service and configure the tool for our project.

For the documentation, see

[00:00] – Introduction

[01:00] – Create a new database user in the SQL console of the SAP HANA database explorer, using a script from the SAP HANA Academy GitHub repository. The system privileges CREATE REMOTE SOURCE, ADAPTER ADMIN, and AGENT ADMIN are granted.
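
The script lives in the SAP HANA Academy GitHub repository; as a rough sketch of the kind of statements it contains (user name and password below are placeholders, not the actual values from the script):

CREATE USER DEVUSER PASSWORD "Welcome1!" NO FORCE_FIRST_PASSWORD_CHANGE;

-- System privileges required to manage SDI agents and adapters and to create remote sources
GRANT AGENT ADMIN TO DEVUSER;
GRANT ADAPTER ADMIN TO DEVUSER;
GRANT CREATE REMOTE SOURCE TO DEVUSER;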

[02:00] – Connect as devuser.

[03:20] – Create a new subaccount using the SAP Cloud Platform Cockpit (HANA Tools) in the Neo environment.

[04:10] – Enable service SAP Web IDE Full-Stack

[04:40] – Configure Service: add user as DIDeveloper

[05:10] – Configure Security > Trust Management: Local Service Provider: Principal propagation = Enabled

[06:00] – Launch SAP Web IDE

[06:30] – Workspace Preferences > Extensions: SAP EIM Smart Data Integration Editors and SAP HANA Database Development Tools

[07:00] – Workspace Preferences > Cloud Foundry: API Endpoint of Cloud Foundry subaccount hosting the SAP HANA Cloud environment (video #2 [02:00]).

[08:45] – Launch database explorer embedded in SAP Web IDE.

[09:30] – Change schema to devuser


4. Data Provisioning Agent Installation and Configuration

In this video, Bob shows how to install the SDI Data Provisioning Agent (DPA) on a Windows computer and register the agent and the HanaAdapter with the SAP HANA Cloud database.

For the documentation, see

[00:00] – Introduction

[01:00] – Download SDI Data Provisioning Agent for your platform from SAP Development Tools [tools.hana.ondemand.com/#cloudintegration]

[01:30] – Install DPA and specify: Agent unique name, service account, default ports 5050 and 5051

[04:00] – Launch DPA. Specify HANA hostname [SAP HANA Cloud endpoint], port [443], agent admin user [DEVUSER]

[05:00] – Connect and register the agent with agent name [sha_agent] and hostname [public IP address]

[06:20] – Verify registration in the catalog browser of the database explorer (SAP Web IDE)

[06:40] – Register HanaAdapter

-- Verify in the SQL console that the agent and the HanaAdapter have been registered
SELECT * FROM ADAPTERS;
SELECT * FROM AGENTS;
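
In the video, the registration is done from the Data Provisioning Agent configuration tool; for reference, the SQL equivalents executed against the SAP HANA Cloud database look roughly like this (the agent name is the one used in the video, the protocol value is an assumption):

-- Register the agent and then the HanaAdapter running on it
CREATE AGENT "sha_agent" PROTOCOL 'HTTP';   -- assumption: agent-initiated (HTTP) connection, as used for cloud targets
CREATE ADAPTER "HanaAdapter" AT LOCATION AGENT "sha_agent";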


5. Creating Remote Sources

In this video tutorial, we learn how to create a remote source and a virtual table in the SAP HANA Cloud database connecting to SAP HANA, express edition.

For the documentation, see

[00:00] – Introduction and recap

[01:00] – Showing SALES source table in SAP HANA, express edition

[02:20] – Create remote source in SAP HANA Cloud providing name, adapter, host, port number, database name, schema, and credentials. Highlighting CDC (Change Data Capture) settings.

[05:00] – Connect to remote source and create virtual object for the remote SALES table.
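
For reference, the same objects can also be created in plain SQL. A minimal sketch, with assumed names (HXE_SOURCE, SOURCE_SCHEMA) and the adapter-specific connection details left out:

CREATE REMOTE SOURCE "HXE_SOURCE" ADAPTER "HanaAdapter" AT LOCATION AGENT "sha_agent"
  CONFIGURATION '...'      -- adapter-specific XML: host, port, CDC settings, and so on
  WITH CREDENTIAL TYPE 'PASSWORD'
  USING '...';             -- credentials of the database user in the source system

-- Virtual table pointing at the remote SALES table, followed by a quick access check
CREATE VIRTUAL TABLE "DEVUSER"."V_SALES" AT "HXE_SOURCE"."<NULL>"."SOURCE_SCHEMA"."SALES";
SELECT * FROM "DEVUSER"."V_SALES";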

[06:00] – Validate access to remote object and open data.

[07:00] – Validate update change from source to target.

[08:00] – Validate update change from target to source.


6. HDI Containers

SAP HANA HDI uses containers to store design-time artifacts and the corresponding deployed run-time (catalog) objects. The design-time container is used to store the design-time representations of the catalog objects that you want to create during the deployment process. The database run-time container (RTC) stores the deployed objects built according to the specification stored in the corresponding design-time artifacts.
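
As a small illustration (file, table, and column names below are made up for this example): a design-time artifact tgt_sales.hdbtable in the project's database module could contain the definition below, and building the module then creates a matching TGT_SALES column table in the container's run-time schema.

-- tgt_sales.hdbtable (design-time artifact; deployed as a catalog table on build)
COLUMN TABLE "TGT_SALES" (
  "ID"     INTEGER,
  "REGION" NVARCHAR(20),
  "AMOUNT" DECIMAL(15, 2)
)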

In this video tutorial, we learn how to create an HDI container in SAP Web IDE.

For the documentation, see

[00:00] – Introduction and recap

[02:15] – Build new project in SAP Web IDE: Cloud Foundry > SAP HANA Database Application for SAP HANA Cloud

[03:30] – Build project

[04:00] – Build database artifact

[04:30] – Show hdi-shared service instance and application

[05:15] – Add the database for the HDI container to the database explorer. Remote sources are not available for HDI containers, so we will need to create a user-provided service.


7. HDI to Classic Schema Access

In this video tutorial, we learn how to access objects stored in a classic SAP HANA schema from an HDI container. User-provided services, mta.yaml files, and “grants” files are discussed.

For the documentation, see

[00:00] – Introduction and recap

[01:00] – Create user-provided service with the credentials to the schema in SAP HANA Cloud where the remote table resides.

[02:00] – Update the configuration file for the multitarget application (mta.yaml) created in the previous video, adding the user-provided service to the module requirements and as a resource

[03:45] – Rebuild database module

[04:00] – Create hdbgrants artifact with code snippet providing object owner rights and application user rights and build.
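
The .hdbgrants file itself is a small JSON artifact; what it effectively asks the HDI deployer to execute, through the user-provided service, is a set of GRANT statements roughly like the ones below (grantee and remote source names are assumptions, the real technical user names are generated by HDI):

-- Object owner of the container needs grantable access plus rights on the remote source
GRANT SELECT ON SCHEMA "DEVUSER" TO "MYPROJECT_HDI_DB_1#OO" WITH GRANT OPTION;
GRANT CREATE VIRTUAL TABLE, CREATE REMOTE SUBSCRIPTION ON REMOTE SOURCE "HXE_SOURCE" TO "MYPROJECT_HDI_DB_1#OO";

-- Application user only needs to read the data at runtime
GRANT SELECT ON SCHEMA "DEVUSER" TO "MYPROJECT_HDI_DB_1";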


8. Virtual Tables and Flowgraphs

In video 5, we created a virtual table as a catalog object (runtime) to validate our configuration. In this video, we perform the same task, but this time as a design-time development artifact.

For the documentation, see

[00:00] – Introduction and recap

[02:00] – Create new virtual table in the project providing virtual table name, remote source, database, schema, and object.
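
The design-time artifact is a small text file, for example src/V_SALES.hdbvirtualtable (file and object names assumed), containing roughly:

VIRTUAL TABLE "V_SALES" AT "HXE_SOURCE"."<NULL>"."SOURCE_SCHEMA"."SALES"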

[04:00] – The remote table will be visible in the HDI container as a table

[05:30] – Create a Flowgraph in the project, add data source and select virtual table.

[07:00] – Add data target and build the artifact.

[08:00] – Execute the Flowgraph.

[08:30] – Target table and task (procedure) have been created.


9. Replication Tasks

In the last video tutorial, Bob introduces creating real-time replication tasks in an HDI container when using Smart Data Integration (SDI) in SAP HANA Cloud.

The SAP HANA Administration Guide documents the equivalent SQL statements and system views:
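
A replication task generates these objects for you; hand-coded, the core statements and monitoring views look roughly as follows (subscription and table names are assumptions):

-- Subscribe the target table to changes captured on the virtual (source) table
CREATE REMOTE SUBSCRIPTION "SUB_SALES" ON "SOURCE_SALES" TARGET TABLE "TARGET_SALES";

ALTER REMOTE SUBSCRIPTION "SUB_SALES" QUEUE;              -- start capturing changes on the source
INSERT INTO "TARGET_SALES" SELECT * FROM "SOURCE_SALES";  -- initial load while changes are queued
ALTER REMOTE SUBSCRIPTION "SUB_SALES" DISTRIBUTE;         -- apply queued changes and continue in real time

-- Monitoring
SELECT * FROM REMOTE_SUBSCRIPTIONS;
SELECT * FROM M_REMOTE_SUBSCRIPTIONS;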

[00:00] – Introduction and recap

[01:30] – Create a new replication task in the project

[02:00] – Select remote source and object (dialog similar to video #5), and specify replication behavior.

[04:50] – Build the object

[06:00] – SOURCE_SALES and TARGET_SALES virtual tables are created in the target system (SAP HANA Cloud).

[06:45] – Remote subscription, replication procedure, and task are created in the target

[07:30] – Call procedure.

[08:00] – CDC (Change data capture) tables and triggers are created in the source system (SAP HANA, express edition).

[08:45] – Confirm that replication works with UPDATE and INSERT statements.
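
For example (schema, table, and column names assumed), changes made on the source system should appear in the target table in SAP HANA Cloud within moments:

-- Executed on SAP HANA, express edition (the source)
INSERT INTO "SOURCE_SCHEMA"."SALES" VALUES (1001, 'EMEA', 250.00);
UPDATE "SOURCE_SCHEMA"."SALES" SET "AMOUNT" = 300.00 WHERE "ID" = 1001;

-- Checked on SAP HANA Cloud (the target)
SELECT * FROM "TARGET_SALES" WHERE "ID" = 1001;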


Share and Connect 

Questions? Post as comment.

Useful? Give us a like and share on social media. Thanks!

If you would like to receive updates, connect with me on

For the author page of SAP Press, visit

For the SAP HANA Cloud e-bite, see

5 Comments
  • Hello everyone,

    currently I face an issue which I can't solve myself. Maybe I will find a solution by posting my challenge here.

    Scenario:

    • DB2 is connected via DB2LogReader Adapter to XSA using Smart Data Integration
    • Created a replication task which loads one single table
    • Load mode is "initial + realtime" and preserve all

    Note: The scenario described above worked fine for about two hours. We captured every single data change coming from the DB2 source system. At one point, we truncated the table we were using for testing directly within DB2. Since then, I am no longer able to get a replication task in mode "initial + realtime" running, although "initial only" works as well as before.

    So the moment I execute the replication task in "initial + realtime", I receive the following error message:

    Could not execute 'CALL "DELTA_XXX"."DataWareHouse.Database.Replication ...'
    Error: (dberror) [686]: start task error: "DELTA_XXX"."DataWareHouse.Database.Replication Tasks::XXX_DELTA.START_REPLICATION": line 27 col 6 (at pos 1110): [140038] Error executing SQL command in task.;Error executing ALTER_REMOTE_SUBSCRIPTION_QUEUE with command: 'ALTER REMOTE SUBSCRIPTION "DELTA_XXX"."SUB_XXX" QUEUE'. ,exception 71000129: SQLException
    exception 71000256: QUEUE: SUB_XXX: Failed to add subscription for remote subscription SUB_XXX[id = 16579466] in remote source XXX_LogReader[id = 14918173]. Error: exception 151050: CDC add subscription failed: RS[XXX_LogReader]: Failed to add the first subscription. Error: Failed to subscribe table ["SCHEMA"."XXX"]. Error: Initialization for replication of database <XXX> has not been done.

    Maybe you guys have a working solution for me. Thanks in advance.

    • Hi Sunny,

      Would you mind posting this comment as a question on the forum (answers.sap.com)? I have a few ideas, but the Answers section of the Community is more suitable for an exchange. Thanks!