Technical Articles
Real-Time Replication between SAP HANA Cloud and SAP HANA on-premises | Hands-on Video Tutorials
With this blog series we provide an update with the latest information on getting started with SAP HANA Cloud on the SAP Cloud Platform.
For more information about the free trial and for the new features overview posts, see the related posts in this series.
Hands-On Video Tutorials
Smart Data Integration Master, Tahir Hussain Babar (a.k.a. Bob), just added a new series to the SAP HANA Cloud playlist on the SAP HANA Academy YouTube channel with nine new videos showing how we can configure real-time data replication between an SAP HANA platform system (on-premises) and SAP HANA Cloud.
In this blog, you will find the videos embedded with some additional information and resources. You can watch the nine video tutorials in a little over an hour. What you will learn:
- How to set up the SAP Cloud Platform environment and create an SAP HANA Cloud instance
- How to configure the SAP Web IDE development environment to work with SAP HANA Cloud
- How to install and configure an SAP HANA smart data integration (SDI) Data Provisioning Agent (DPA) and register the SAP HANA Adapter
- How to create a remote source and virtual table (runtime)
- How to create an HDI container and access external objects
- How to create a virtual table (design time)
- How to create and execute a Flowgraph
- How to create a replication task for real-time replication between an SAP HANA platform system on-premises as source and SAP HANA Cloud as target
To bookmark the playlist on YouTube, go to the SAP HANA Cloud playlist on the SAP HANA Academy channel.
About SAP HANA Smart Data Integration
Documentation
For the documentation, see the SAP HANA Cloud Getting Started Guide and the SAP HANA Administration Guide about how to configure remote data sources and virtual objects. See the SAP HANA Smart Data Integration guides for information about how to use SAP Web IDE to develop the artifacts for real-time replication.
- SAP HANA Cloud Getting Started Guide
- Data Access with SAP HANA Cloud, SAP HANA Administration Guide
- Installation and Configuration Guide, SAP HANA Smart Data Integration and SAP HANA Smart Data Quality
- Modeling Guide for SAP HANA Smart Data Integration and SAP HANA Smart Data Quality, SAP HANA Smart Data Integration and SAP HANA Smart Data Quality
SAP Community Blog Posts
Highly recommended as well are the blog posts from Takao Haruki on this topic:
- Transferring data from On-Premise SAP HANA to SAP HANA Cloud using SDI
- Transferring large amount of data to SAP HANA Cloud
- Checking the behavior of SAP HANA Cloud virtual table replica feature
Learning Track
In this learning track, you will find a series of tutorials on how to incorporate SAP HANA Cloud into your existing data landscape based on SAP HANA on-premises.
1. Series Overview
The first video introduces the series with two short demos showing SAP HANA Smart Data Access (SDA) and virtual tables and SAP HANA Smart Data Integration (SDI) with flowgraphs for batch mode or real-time replication. An architecture discussion explains the systems, technologies, and activities.
For this demo, AWS is used as cloud provider and the following resources have been created:
- SAP HANA Cloud (SAP Cloud Platform, Cloud Foundry environment)
- SAP HANA, express edition: simulating on-premises system
- Windows system with Data Provisioning Agent (DPA)
In the subsequent videos, Bob explains how to set this up.
[0:00] – Introduction, about SAP HANA Cloud, documentation
[3:00] – Demo showing Smart Data Access virtual table
[4:00] – Demo showing Smart Data Integration with flowgraphs in SAP Web IDE for an ETL job, for execution in batch mode or for real-time replication.
[5:20] – Architecture explanation for DPA, Remote Source, HDI Containers, user-provided service, virtual tables, flowgraphs, and replication tasks
2. Create SAP HANA Cloud Instance
In the second video tutorial, we learn how to create an SAP HANA Cloud instance.
The API endpoints for the Cloud Foundry Org and for the SAP HANA Cloud instance service are needed for later reference.
For the documentation, see
[0:00] – Connect to the SAP Cloud Platform
[1:00] – Create SAP Cloud Platform subaccount (AWS)
[2:00] – Enable Cloud Foundry and create space. API endpoint for Cloud Foundry org
[3:30] – Configure Entitlements: add service plans: hana, hdi-shared, MEMORY 4GB
[6:00] – Create instance
[7:30] – API endpoint for SAP HANA Cloud service
[8:00] – Connect to SAP HANA cockpit
3. Security and SAP Web IDE
In this video a development user account is created in the SAP HANA Cloud database. We also enable SAP Web IDE as SAP Cloud Platform service and configure the tool for our project.
For the documentation, see
[00:00] – Introduction
[01:00] – Create new database user using the SAP HANA database explorer SQL console using a script from the SAP HANA Academy GitHub repository. System privileges CREATE REMOTE SOURCE, ADAPTER ADMIN, and AGENT ADMIN are granted.
[02:00] – Connect as devuser.
[03:20] – Create new subaccount using SAP Cloud Platform Cockpit (HANA Tools) in Neo environment.
[04:10] – Enable service SAP Web IDE Full-Stack
[04:40] – Configure Service: add user as DIDeveloper
[05:10] – Configure Security > Trust Management: Local Service Provider: Principal propagation = Enabled
[06:00] – Launch SAP Web IDE
[06:30] – Workspace Preferences > Extensions: SAP EIM Smart Data Integration Editors and SAP HANA Database Development Tools
[07:00] – Workspace Preferences > Cloud Foundry: API Endpoint of Cloud Foundry subaccount hosting the SAP HANA Cloud environment (video #2 [02:00]).
[08:45] – Launch database explorer embedded in SAP Web IDE.
[09:30] – Change schema to devuser
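The database-user step at [01:00] can be sketched in SQL. This is a minimal illustration, not the actual script from the SAP HANA Academy GitHub repository; the user name and password are hypothetical:

```sql
-- Hypothetical development user; adjust name and password policy as needed
CREATE USER DEVUSER PASSWORD "Xxxxxxx1" NO FORCE_FIRST_PASSWORD_CHANGE;

-- System privileges mentioned at [01:00], needed to administer SDI objects
GRANT CREATE REMOTE SOURCE TO DEVUSER;
GRANT ADAPTER ADMIN TO DEVUSER;
GRANT AGENT ADMIN TO DEVUSER;
```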
4. Data Provisioning Agent Installation and Configuration
In this video, Bob shows how to install the SDI Data Provisioning Agent (DPA) on a Windows computer and register the agent and the HanaAdapter with the SAP HANA Cloud database.
For the documentation, see
[00:00] – Introduction
[01:00] – Download SDI Data Provisioning Agent for your platform from SAP Development Tools [tools.hana.ondemand.com/#cloudintegration]
[01:30] – Install DPA and specify: Agent unique name, service account, default ports 5050 and 5051
[04:00] – Launch DPA. Specify HANA hostname [SAP HANA Cloud endpoint], port [443], agent admin user [DEVUSER]
[05:00] – Connect and register agent with agent name [sha_agent] and hostname [IP Public address]
[06:20] – Verify registration in the catalog browser of the database explorer (SAP Web IDE)
[06:40] – Register HanaAdapter
SELECT * FROM ADAPTERS;
SELECT * FROM AGENTS;
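Registration is normally done from the DPA configuration tool, as shown in the video. For reference, a rough SQL equivalent of what the tool does behind the scenes looks like this (agent name taken from the video; protocol 'HTTP' applies when the agent connects outbound to SAP HANA Cloud):

```sql
-- Register the agent and the HanaAdapter with the SAP HANA Cloud database
CREATE AGENT "sha_agent" PROTOCOL 'HTTP';
CREATE ADAPTER "HanaAdapter" AT LOCATION AGENT "sha_agent";
```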
5. Creating Remote Sources
In this video tutorial, we learn how to create a remote source and virtual table in the SAP HANA Cloud database, connecting to SAP HANA, express edition.
For the documentation, see
[00:00] – Introduction and recap
[01:00] – Showing SALES source table in SAP HANA, express edition
[02:20] – Create remote source in SAP HANA Cloud providing name, adapter, host, port number, database name, schema, and credentials. Highlighting CDC (Change Data Capture) settings.
[05:00] – Connect to remote source and create virtual object for the remote SALES table.
[06:00] – Validate access to remote object and open data.
[07:00] – Validate update change from source to target.
[08:00] – Validate update change from target to source.
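The remote source and virtual table at [02:20] and [05:00] are created in the UI in the video. A SQL sketch of the same steps looks roughly like this; all names, hosts, and credentials are illustrative, and the HanaAdapter CONFIGURATION XML (host, port, CDC properties) is abbreviated:

```sql
-- Remote source over the SDI HanaAdapter registered in the previous video
CREATE REMOTE SOURCE "HXE" ADAPTER "HanaAdapter" AT LOCATION AGENT "sha_agent"
  CONFIGURATION '<?xml version="1.0" encoding="UTF-8"?><ConnectionProperties name="configurations">...</ConnectionProperties>'
  WITH CREDENTIAL TYPE 'PASSWORD'
  USING '<CredentialEntry name="credential"><user>TECH_USER</user><password>***</password></CredentialEntry>';

-- Virtual table over the remote SALES table, then validate access
CREATE VIRTUAL TABLE "DEVUSER"."VT_SALES" AT "HXE"."<NULL>"."SCHEMA"."SALES";
SELECT * FROM "DEVUSER"."VT_SALES";
```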
6. HDI Containers
SAP HANA HDI uses containers to store design-time artifacts and the corresponding deployed run-time (catalog) objects. The design-time container is used to store the design-time representations of the catalog objects that you want to create during the deployment process. The database run-time container (RTC) stores the deployed objects built according to the specification stored in the corresponding design-time artifacts.
In this video tutorial, we learn how to create an HDI container in SAP Web IDE.
For the documentation, see
[00:00] – Introduction and recap
[02:15] – Build new project in SAP Web IDE: Cloud Foundry > SAP HANA Database Application for SAP HANA Cloud
[03:30] – Build project
[04:00] – Build database artifact
[04:30] – Show hdi-shared service instance and application
[05:15] – Add database to database explorer for HDI container. Remote sources are not available for HDI containers so we will need to create a user-provided service.
7. HDI to Classic Schema Access
In this video tutorial, we learn how to access objects stored in an SAP HANA classic schema from an HDI container. User-provided services, mta.yaml files, and "grants" files are discussed.
For the documentation, see
[00:00] – Introduction and recap
[01:00] – Create user-provided service with the credentials to the schema in SAP HANA Cloud where the remote table resides.
[02:00] – Update the configuration file for the multitarget application (MTA.yaml) created in the previous video adding the user-provided service to the modules and as resource
[03:45] – Rebuild database module
[04:00] – Create hdbgrants artifact with code snippet providing object owner rights and application user rights and build.
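The hdbgrants artifact at [04:00] grants the HDI object owner and the application user access to objects reachable through the user-provided service. A minimal sketch, assuming a service named hxe-ups and a schema SALES_SCHEMA; field names follow the .hdbgrants syntax, which you should verify against the HDI artifact reference for your HANA version:

```json
{
  "hxe-ups": {
    "object_owner": {
      "schema_privileges": [
        { "reference": "SALES_SCHEMA", "privileges": [ "SELECT" ] }
      ]
    },
    "application_user": {
      "schema_privileges": [
        { "reference": "SALES_SCHEMA", "privileges": [ "SELECT" ] }
      ]
    }
  }
}
```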
8. Virtual Tables and Flowgraphs
In video 5 we created a virtual table as catalog object (runtime) to validate our configuration. In this video we perform the same task but this time as development artifact.
For the documentation, see
[00:00] – Introduction and recap
[02:00] – Create new virtual table in the project providing virtual table name, remote source, database, schema, and object.
[04:00] – The remote table will be visible in the HDI container as a table
[05:30] – Create a Flowgraph in the project, add data source and select virtual table.
[08:00] – Execute the Flowgraph.
[08:30] – Target table and task (procedure) have been created.
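The flowgraph is executed from SAP Web IDE in the video, but the generated runtime objects can also be started from the SQL console. Container, flowgraph, and table names below are illustrative, not the names used in the video:

```sql
-- If the flowgraph deploys as a procedure:
CALL "MYPROJECT_HDI_DB_1"."sales_flowgraph_SP";

-- If the flowgraph deploys as a task:
START TASK "MYPROJECT_HDI_DB_1"."sales_flowgraph";

-- Check the result in the generated target table
SELECT COUNT(*) FROM "MYPROJECT_HDI_DB_1"."TARGET_SALES";
```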
9. Replication Tasks
In the last video tutorial, Bob introduces creating real-time replication tasks in an HDI container when using Smart Data Integration (SDI) in SAP HANA Cloud.
The SAP HANA Administration Guide documents the equivalent SQL statements and system views:
[00:00] – Introduction and recap
[01:30] – Create a new replication task in the project
[02:00] – Select remote source and object (dialog similar to video #5), and specify replication behavior.
[04:50] – Build the object
[06:00] – SOURCE_SALES and TARGET_SALES virtual tables are created in the target system (SAP HANA Cloud).
[06:45] – Remote subscription, replication procedure, and task are created in the target
[07:30] – Call procedure.
[08:00] – CDC (Change data capture) tables and triggers are created in the source system (SAP HANA, express edition).
[08:45] – Confirm replication works with UPDATE and INSERT statements.
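To verify that the remote subscription created by the replication task is replicating, you can query the monitoring views documented in the SAP HANA Administration Guide; a subscription applying changes in real time shows state APPLY_CHANGE_DATA:

```sql
-- Remote subscriptions and their current state (e.g. CREATED,
-- MATERIALIZE_*, AUTO_CORRECT_CHANGE_DATA, APPLY_CHANGE_DATA)
SELECT SUBSCRIPTION_NAME, STATE FROM M_REMOTE_SUBSCRIPTIONS;

-- Exceptions raised for a remote source or subscription, if any
SELECT * FROM M_REMOTE_SUBSCRIPTION_EXCEPTIONS;
```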
Share and Connect
Questions? Post as comment.
Useful? Give us a like and share on social media. Thanks!
If you would like to receive updates, connect with me on
- LinkedIn > linkedin.com/in/dvankempen
- Twitter > @dvankempen
For the author page of SAP Press, visit
For the SAP HANA Cloud e-bite, see
Great blog - right when I needed to share this information!
Much appreciated.
Amazing!! Thanks Denys...
Hello everyone,
currently I face an issue which I can't solve myself. Maybe I will find a solution by posting my challenge.
Scenario:
Note: The scenario described above worked fine for about two hours. We captured every single data change coming from the DB2 source system. At one point, we truncated the table we had been using for testing directly within DB2. Since then, I am no longer able to get a replication task in mode "initial + realtime" running, although "initial only" works as well as before.
So the moment I execute the replication task in "initial + realtime" mode, I receive the following error message:
Could not execute 'CALL "DELTA_XXX"."DataWareHouse.Database.Replication ...'
Error: (dberror) [686]: start task error: "DELTA_XXX"."DataWareHouse.Database.Replication Tasks::XXX_DELTA.START_REPLICATION": line 27 col 6 (at pos 1110): [140038] Error executing SQL command in task.;Error executing ALTER_REMOTE_SUBSCRIPTION_QUEUE with command: 'ALTER REMOTE SUBSCRIPTION "DELTA_XXX"."SUB_XXX" QUEUE'. ,exception 71000129: SQLException
exception 71000256: QUEUE: SUB_XXX: Failed to add subscription for remote subscription SUB_XXX[id = 16579466] in remote source XXX_LogReader[id = 14918173]. Error: exception 151050: CDC add subscription failed: RS[XXX_LogReader]: Failed to add the first subscription. Error: Failed to subscribe table ["SCHEMA"."XXX"]. Error: Initialization for replication of database <XXX> has not been done.
Maybe you guys have a working solution for me. Thanks in advance.
Hi Sunny,
Would you mind posting this comment as a question on the forum, answers.sap.com? I have a few ideas, but the Answers section of the Community is more suitable for an exchange. Thanks
Hi Dennis,
thanks for the advice. You can check on the issue HERE.
Hi Dennis,
This blog explains how to establish real-time replication between SAP HANA on-premises and SAP HANA Cloud.
We are planning to establish real-time replication between SAP HANA Service and SAP HANA Cloud. Is it possible to set up a real-time replication connection between SAP HANA Service and SAP HANA Cloud?
How can we achieve that?
Hi Varnshi,
According to the SAP HANA SDI PAM (page 7), batch processing is supported but not real-time replication
https://support.sap.com/content/dam/launchpad/en_us/pam/pam-essentials/TIP/PAM_HANA_SDI_2_0.pdf
Hi Denys van Kempen
Thanks for the blog!
When I try to create the remote subscription, I get an error:
"invalid adapter name: HANAODBC"
I am using SDA to create the remote source, and the adapter I chose is hanaodbc. Here is the query I used to create the remote source, as mentioned here.
And I am able to create virtual tables and query them.
Your advice would be really helpful.
Hi Varun,
Suggest to post the comment as question to the forum > https://answers.sap.com
Tags: #saphana #saphanacloud
Those that follow the tag are notified (experts from SAP, partners, customers).
This might help to resolve the issue faster
Thanks for the response Denys van Kempen.
I have posted it here
Great Blog Denys!
One question, is "SAP HANA Smart Data Integration" independent from "SAP Integration Suite"?
So if I don't have the integration suite can I use SDI?
Thank you
Hi Andrea,
Thanks!
Yes and yes.
FYI
SAP HANA smart data integration (SDI) is a "version/edition" of the generic SAP Data Services data integration tool, specific to SAP HANA and released around 2014. The technology originates from ActaWorks (late 1990s) via BusinessObjects.
SAP Integration Suite is a portfolio (box) of integration services gathered around Cloud Integration, previously marketed as SAP Cloud Platform Integration (CPI), SAP HANA Cloud Platform integration service, and HANA Cloud Integration (HCI), released in 2013. The technology originates from the same data integration tool, complemented with process integration/orchestration, a web UI, an Eclipse plug-in, etc.
I am not happy with the hint "SDI is a version/edition of the generic SAP Data Services". The two do not share the technology, the use case, or the capabilities.
Hi Werner,
Thanks for your input. You would know 😉
Would you mind sharing how SAP (BusinessObjects) Data Services technology relates to HCI, SDI, SDA, etc. (if at all)?
Thanks
Denys van Kempen I can try.
Use case:
Technology:
Relationship between SDI and BODS:
When we designed SDI, the thought was that BODS would be used as the ETL tool for hybrid use cases, and that for cases where Hana is the center of the world, it would be better to have something with a better fit for Hana's realtime story. Hence SDI uses Hana data types, supports federation & batch & realtime, focuses on transactional consistency, etc. (As said before, BODS is batch only. Certainly no federation, almost no realtime, and no real option to load data transactionally.)
To enable that, SDI/Hana needs functionalities to ...
So the only possible relationship between BODS and SDI is the Flowgraph editor.
Because of the ease of use of BODS transforms to solve any data integration problem (and some more reasons), the Flowgraph transforms look mostly the same. Some transforms are missing in the Flowgraph editor even today, though, and no plans to add them as far as I am aware.
If you concur with my opinion that the Flowgraph editor lacks too much for being used in enterprise projects, no commonality between the two remains.
As of today I recommend to my customers:
HCI is a different beast. Its origins are in the Enterprise Application Integration space, where webservice calls are chained together in order to implement a simpler API, e.g. mark an employee as retired in SuccessFactors and at the same time remove the credentials in the Active Directory to prevent further login with this account. Very low level, where you manually set http headers, parse responses, etc.
You can obviously ask why there are different tools to call webservices but SAP is not special here. Almost all companies have a hard split between Data Integration tools and EAI tools despite the huge overlap. That is mostly for historical reasons but today we have the technologies to provide a single tool for both and the first combined products start to appear.
Thanks for taking the time for this extensive comment. Much appreciated.
Hello All,
we have a central cloud HANA database, and from there data is being sent to multiple other, smaller application-related databases. Is it possible to achieve real-time data transfer from cloud to cloud?
If yes can anyone share the readings related to the same?
Thanks!
Regards,
Ratish
Hi Ratish,
Best to post this comment as a question on the community forum
This allows for a better reach and more timely response.
(copy/paste is fine)
Hello Denys,
may I ask what the next steps are after data replication into tables?
The tables will be part of the HDI container. Is it possible to have these tables permanently stored in the SAP HANA Cloud database under a specified schema instead of the schema of the HDI container?
Many thanks for your answer!
Best Regards
Veronika Bajus