Enterprise Data Delivery

Many enterprise customers are building, or planning to build, an enterprise-level data delivery platform with analytics solutions on top of it, using SAP HANA. Along with streaming / IoT data, enterprise data from multiple transaction systems can be brought into one (or more) enterprise data warehouse systems. With its many capabilities, SAP HANA is one of the best options for the enterprise data warehouse system.

To build this, data from multiple source systems can be delivered to the target SAP HANA system either in real time or through batch processes. There are currently multiple delivery tool options, such as SAP Data Services, SLT, and Smart Data Integration (SDI), to replicate or load the data. Among these, SDI has a clear advantage for a few reasons. First, it is built natively into SAP HANA, so data integration can start much more quickly without depending on an external solution. Second, it supports integration with varied types of source systems through the many adapters available with the product. For example, SDI can integrate data from database systems based on Oracle, DB2, or MS SQL Server, as well as from other sources such as social media portals like Facebook, Twitter, and Google+.

SDI Functional components

There are three major functional components in SDI. First, the remote data capture functionality, which is configured in the source system itself. Second, the Data Provisioning (DP) Agent, which hosts the adapters connecting the source system and the target HANA system. The agent can reside on the source system itself, but for ease of management and to avoid performance issues, the preference is to host it on a separate Linux or Windows system. Third, the DP Server, which is native to SAP HANA itself. Please refer to the Architecture blog if you want to understand more about the SDI architecture framework.

LogReader Adapter vs. ECC Adapter

The focus of this blog is mainly to explain the functional differences between two specific adapters provided with SDI: the LogReader Adapter and the ECC Adapter.

The LogReader Adapter reads changes in the remote system from the logs it generates and replicates those changes to the target SAP HANA system through the DP Agent and the DP Server.

The ECC Adapter reads changes from the remote system in the same way as the LogReader Adapter, but it provides additional functionality when the source system is an ECC system.

At a high level, one can safely assume that the LogReader Adapter acts as a data provisioning adapter for any database application system, including SAP ECC and its extensions. This is the correct option in many scenarios, but it is not the right adapter if you plan to replicate logical tables that exist in the source application yet are not available as physical objects in the underlying database.

In SAP ABAP-based systems, there are certain table objects called cluster and pool tables. These are special table types that exist only in the ABAP dictionary and are not available at the database level. For this reason, if a source system is ABAP-based and these special tables need to be replicated, they will not be available through the standard LogReader Adapter. In these cases, if replication has to happen at the database level, one can configure the ECC Adapter (specific to the underlying database) against the ECC system for real-time replication. (Of course, there are other adapters that read at the application level, such as the ABAP Adapter, but they fetch the data through the application layer. I will cover these in detail in another blog in the near future.) It is a good idea to confirm the certification of the adapter before implementing the ECC Adapter in any of the SAP ECC extension systems such as CRM, SCM, etc.

Replication of Cluster and Pool tables

To understand this better, let us take an example scenario. The tables BSEG and BSEC are two of the cluster tables under the table cluster RFBLG. These tables are available only through the ECC Adapter when consumed at the database layer.

Fig 1. Table cluster definition in ECC, with cluster tables defined under it.

With just the LogReader Adapter, if you try to replicate any of these cluster tables using SDI's Web Workbench, you will not even see the cluster tables available for replication.

Fig 2. Missing Cluster table using LogReader adapter

Now, the ECC adapters are built on top of the Data Provisioning LogReader adapters as wrappers, essentially using the same LogReader adapter for the given database. In fact, the ECC Adapter still uses the same LogReader API (the same one used by the LogReader Adapter) to capture remote data changes. The difference lies in which adapter you register at the DP Agent level. In addition to supporting the normal transparent tables in the ECC database, the ECC Adapter also supports ECC metadata browsing as well as cluster and pool tables in SAP ECC. The ECC Adapter also provides declustering and depooling functionality, creating the cluster and pool tables as physical tables in the target SAP HANA system.

The following databases are currently supported:

  • IBM DB2
  • Oracle
  • MS SQL Server
  • SAP ASE

To emphasize, the ECC Adapter has to be registered and configured in the DP Agent for the ECC remote source system. After this, a remote source can be created in the SAP HANA system using the ECC Adapter, pointing to the specified remote system.
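As a rough sketch, creating the remote source in HANA looks like the following. Note that the adapter name, agent name, and configuration payload shown here are illustrative assumptions — use the exact adapter name registered in your own DP Agent and the configuration properties documented for your SDI release:

```sql
-- Illustrative sketch only: "OracleECCAdapter", "MY_DP_AGENT", and the
-- configuration/credential payloads are placeholder names, not verified values.
CREATE REMOTE SOURCE "ECC_SOURCE" ADAPTER "OracleECCAdapter"
  AT LOCATION AGENT "MY_DP_AGENT"
  CONFIGURATION '<connection properties for the ECC source database>'
  WITH CREDENTIAL TYPE 'PASSWORD'
  USING '<credentials for the ECC source database>';
```

Once the remote source exists, it shows up in the Web Workbench when you define replication tasks against it.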

One other requirement needs to be completed before cluster and pool tables can be replicated: their metadata needs to be loaded into the target SAP HANA schema. SAP provides a procedure for this purpose in a script called replicate_dictionary.sql. This procedure has to be executed with specific parameters for five data dictionary tables in ECC, which loads their metadata from the source system into the specified schema in the target HANA system. The following ECC dictionary tables are loaded when the procedure is called:

  • DD02L
  • DD03L
  • DD16S
  • DDNTT
  • DDNTF

Before you run the procedure, please make sure the HANA_SCHEMA, remote_source_name, and table_name parameters are updated in the procedure, one call for each of these five tables. Running the procedure materializes the tables under the specified schema:

call materialize_dictionary_table('<HANA_SCHEMA>', '<remote_source_name>', '<table_name>');
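Spelled out for all five dictionary tables, the calls look like this (the schema name ECC_REPL and remote source name ECC_SOURCE are example values — substitute your own):

```sql
-- Materialize the ECC dictionary metadata needed for cluster/pool table
-- replication; ECC_REPL and ECC_SOURCE are illustrative names.
call materialize_dictionary_table('ECC_REPL', 'ECC_SOURCE', 'DD02L');
call materialize_dictionary_table('ECC_REPL', 'ECC_SOURCE', 'DD03L');
call materialize_dictionary_table('ECC_REPL', 'ECC_SOURCE', 'DD16S');
call materialize_dictionary_table('ECC_REPL', 'ECC_SOURCE', 'DDNTT');
call materialize_dictionary_table('ECC_REPL', 'ECC_SOURCE', 'DDNTF');
```

Each call copies that dictionary table's contents from the ECC source into the specified HANA schema.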

After this is executed, all the dictionary information is loaded into the specified schema of the SAP HANA target system and becomes available to the Web Workbench, where you can create replication tasks and choose the required tables, including transparent tables, to replicate the data.

Now, when you create a replication task and search for a cluster table, you can see the table available for replication.

Fig 3. Cluster table available using ECC adapter

After creating the replication task, you can run it, and the data will be replicated using the ECC Adapter. Here is a screen grab showing the successful data load in the Task Monitor screen of the replication task.


A few additional points to note:

For certain source database types, such as DB2, you cannot have both the LogReader Adapter and the ECC Adapter configured at the same time for the same source system. Please confirm this restriction for your specific source database type.

SQL pushdown is limited for cluster tables and is not available for pool tables.

Summary

The following summarizes the brief functional differences between the LogReader and ECC adapters.

LogReader adapter

  • It can read ECC’s normal tables
  • Uses the LogReader for reading the source data

ECC Adapter

  • It can read ECC’s normal tables
  • Uses the LogReader for reading the source data
  • Can read Cluster and Pool tables
  • Helps in de-clustering and de-pooling