Hana Smart Data Integration – Adapters
This post is part of a series:
- Hana Smart Data Integration – Overview
- Hana Smart Data Integration – Adapters
- Hana Smart Data Integration – Batch Dataflows
- Hana Smart Data Integration – Realtime Table Replication
- Hana Smart Data Integration – Realtime Sources with Transformations
- Hana Smart Data Integration – Realtime Sources with History Preserving
- Hana Smart Data Integration – Architecture
- Hana Smart Data Integration – Fun with Transformation Services
The foundation of Data Integration is being able to connect to various sources. Looking at the SPS08 Smart Data Integration option and its connectivity, you can see the usual suspects: Oracle, SQL Server, etc.
With SPS09 and its SDA adapter extensions not much changed, except for one thing: there is an Adapter SDK now, and you can write your own adapters in Java!
The question about connectivity is an obvious one. Without the ability to connect to a source system directly, a workaround has to be used, e.g. writing files, copying them, and then loading them into Hana. That files are cumbersome to handle is obvious as well. Or take Twitter: is it supported? A workaround might be to use its underlying RESTful protocol. Common to all these workarounds is that they put the entire development burden on the user: the user has to write a program to create the files, the user has to create and parse the RESTful messages.
For the common sources that is no problem; all tools support the relational databases of the various vendors. But even there you might find features unique to one database that are either supported or not.
While that is the case for Smart Data Integration adapters as well, thanks to the Adapter SDK every Java developer can write adapters for Hana without compromising Hana's stability.
The most important change, from an architectural point of view, was to move as much code as possible out of Hana’s IndexServer into separate processes.
All that remains in the IndexServer is the optimizer code, translating the SQL the user entered into an execution plan containing remote execution and local execution in – hopefully – the most efficient manner. The part of the SQL that should be sent to the source is handed over to the Data Provisioning Server, another Hana process. This process contains all the logic common to all adapters; most importantly, it contains the communication protocol used to talk to an agent process, the host of all adapters.
This architecture has multiple positive side effects:
- If anything happens to the remote source, the Hana Index Server is not impacted. Since the Index Server is the core process of Hana, a crash in any line of adapter code could otherwise have brought down the entire Hana instance.
- Because the agent is an installation of its own, you can install it anywhere. One option is to place it on the Hana server itself, but that might not be preferred, because then the entire middleware of all sources has to be installed there and the network has to allow passage of those middleware protocols. More likely the agent will be installed near the specific source and talk to Hana via the network – no problem, as one Hana instance can have as many agents as required. Or a dedicated server is used – possible as well.
- Because the agent can be installed anywhere, it can even be installed on-premise and connected to a Hana cloud instance. Not even a VPN tunnel has to be used, as the supported protocols include https as well. In this case the agent establishes an https connection to the cloud Hana instance, just as any web browser would.
- Developing an adapter is much easier. Hana Studio has a plugin so that it acts as an agent, and the developer can watch the internals easily.
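For illustration, registering an agent with Hana boils down to a couple of SQL statements. This is a sketch only: the agent name, host, and port are placeholders, and the exact syntax should be verified against the administration guide for your release.

```sql
-- Register an agent reachable from Hana via TCP (name, host, port are placeholders)
CREATE AGENT "myagent" PROTOCOL 'TCP' HOST 'agenthost.mydomain.com' PORT 5050;

-- In the cloud scenario the agent connects inbound via https instead,
-- so no host or port is specified on the Hana side
CREATE AGENT "mycloudagent" PROTOCOL 'HTTP';

-- List the registered agents
SELECT * FROM AGENTS;
```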
Deploying existing Adapters
All the SAP-provided adapters are part of the Hana agent installation, which is a separate download on the SAP Service Marketplace (SMP):
see Installations and Upgrades -> Index H -> SAP HANA SDI
in A-Z Index | SAP Support Portal
Once the agent is installed, deploying an adapter is as easy as copying a jar file.
see SAP HANA Enterprise Information Management Administration Guide – SAP Library for details
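Once the jar is in place, making the adapter known to Hana is essentially one SQL statement – the same create-adapter SQL the tooling executes when you register an adapter. A sketch with placeholder names:

```sql
-- Announce the adapter deployed on the agent to Hana
CREATE ADAPTER "FileAdapter" AT LOCATION AGENT "myagent";

-- Verify which adapters are known and where each one runs
SELECT * FROM ADAPTER_LOCATIONS;
```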
Writing your own Adapter
The most important question is how difficult it is to write your own adapter. The goal of development was to make it as simple as possible, of course, but more importantly: you do not need to be an expert in Java, Hana, and the source system all at once. Any Java developer should be able to write new adapters easily, just by implementing or extending some adapter base classes.
Frankly, it is quite simple: you start a new project in Hana Studio – a Java OSGi/Equinox plugin project – a wizard builds the base class for you, and your task is to add the code to open a connection to the source, list the source tables and their structure, and so on. The SAP Help Portal has an excellent manual describing everything step by step.
Create a Custom Adapter Using a New Plug-in Project – SAP HANA Data Provisioning Adapter SDK – SAP Library
Building an Adapter by Extending the BaseAdapterClass – SAP HANA Data Provisioning Adapter SDK – SAP Library
Using the Adapters
All Adapters follow the Hana Smart Data Access paradigm.
In Hana Studio you create a new remote source and browse the remote tables. For selected tables you then create virtual tables, so that these look and feel like any other Hana table and can be queried. All the complexity underneath is hidden.
Just imagine the power of this! You deploy, for example, the File Adapter on your file server.
Obviously the adapter needs to follow certain security rules. In the case of the FileAdapter, the developer decided that you cannot query the entire server, only files and directories within a specified root directory. In addition, the adapter requires the consumer to authenticate as a valid user.
Then the allowed Hana instance – local, remote, or in the cloud – can see files as regular tables with a user-defined table layout. Each of these tables can be instantiated as a virtual table, and when selecting from a virtual table, all files – or the files specified in the where clause of the query – are parsed using the defined file format and their data is shown.
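As a sketch of the look and feel: all names below are placeholders (the remote source, schema, table, and the YEAR column are assumptions for illustration, not actual FileAdapter metadata).

```sql
-- Expose one remote table of the file adapter's remote source as a virtual table
CREATE VIRTUAL TABLE "myschema"."v_sales" AT "myfilesource"."<NULL>"."<NULL>"."sales";

-- From now on it queries like any Hana table; the where clause limits
-- which files the adapter actually has to read and parse
SELECT * FROM "myschema"."v_sales" WHERE "YEAR" = 2015;
```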
We have a use case to replicate realtime data from SAP Business Suite (on MaxDB) to HANA, and we plan to use SDI. Can you please let me know how to access the ECC adapter in HANA Studio?
Hi Sandhya, you activate the ECC adapter in the Data Provisioning Agent tool. But as there is no generic way to read changes from ABAP, not to mention in realtime, each adapter reads the database transaction log. That is the reason why you will find an ECC adapter for Oracle, DB2, etc. For MaxDB there is no way to read the transaction log, hence we have a problem.
Option 1: You provide us with a way to read the MaxDB transaction log and we build an adapter, or you do. Writing an adapter is rather simple.
Option 2: You use other means of getting the changes, e.g. triggers and an adapter is built for that.
Option 3: SLT is using triggers, we write an adapter connecting to SLT. That is something the SLT team and I are working on at the moment.
Is there a possibility to get the SAP HANA Data Provisioning Agent as a developer, for testing purposes? The download is hidden in the support portal.
Not sure about the current status. I can download it from the Service Marketplace, in the A-Z dictionary under H, SAP Hana SDI.
But without a database it does not help much.
From what I know Product Management is setting up Cloud instances, I have asked them to respond to this post here.
Could you please help me resolve the below issue while registering adapter with SAP HANA Data Provisioning Agent.
My guess is, you did not copy the SQL Server jdbc jar file. This is not included in the adapter itself for legal reasons. Doesn't the install manual talk about that?
Your guess is absolutely right. Issue resolved.
Thank you Werner.
Hello Werner Daehn,
What should the JDBC file name be for the OracleLogReaderAdapter? We are getting a similar error, but for the Oracle adapter. Please advise.
Adapter "OracleLogReaderAdapter" could not be registered with HANA server.
Context: Executing SQL sattement "Create adapter "OracleLogReaderAdapter" at location agent 'dpagent_<servername> failed with SAP DBTech JDBC:: Internal error: cannot get adapter capabilities. Failed to install and start Oracle JDBC driver bundle.
Let's start the other way round. I assume you went to the Oracle page, downloaded the JDBC driver bundle, and placed it all in the dpagent directory specified in the manual?
I like this!
and I have been able to install the DP Agent.
I see the Twitter/Hive adapters available.
Where can I find the files to deploy/register the File or RSS adapter?
The file adapter is part of the shipment, just like Twitter etc.
The RSS adapter will be the first adapter to be released as open source in SAP's GitHub repository. We are going through the final stages with the legal department.
I am trying to register the OracleLogReaderAdapter with HANA but I got the below issue (it says the agent has shut down, but it actually started):
Could you please help to advise what could be the reason here? I already tried to copy the Oracle JDBC driver ojdbc6.jar to the folder C:\usr\sap\dataprovagent\lib according to the chapter 6.2 "JDBC libraries" in EIM Administration Guide.
Is it possible that the Hana server does not know, by hostname, the computer where you are running this application and want to run the agent?
In that case you would have to use the IP address in the agent's hostname field.
If your computer's IP address does change, you have to update the Hana setting again.
Thanks for your reply. I am running the HANA on Cloud (HCP's HANA instance) and the agent in local network (tried both on Windows 2008 Server and my laptop - Window 7). I connected to HANA via DB tunnel. I tried both the IP address and the host name but none of them are working. Does the agent need to have the public IP address (internet accessible)?
When you use the TCP protocol and not the https protocol, then yes, the Hana server does ping the agent by whatever you specified as the hostname when registering it. That can be a short hostname (e.g. host1), a long one (e.g. host1.mydomain.com), or an IP address (e.g. 192.168.0.4).
I am not sure about the DB tunnel. I assume it does not open the full port range, in our case port 5050 and 5051.
You could use the https protocol instead. But be aware of the requirements: an additional Delivery Unit needs to be installed, an XS application that routes the incoming https commands to the Data Provisioning Server of your Hana instance.
And that's actually my biggest concern. A normal HCP instance does not have the dpserver activated yet. Did you do that?
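For reference, on a system where you control the configuration, the dpserver can be enabled via an ini parameter. This is a sketch per the EIM administration guide and should be verified for your release:

```sql
-- Start one Data Provisioning Server process; it is off by default
ALTER SYSTEM ALTER CONFIGURATION ('daemon.ini', 'SYSTEM')
  SET ('dpserver', 'instances') = '1' WITH RECONFIGURE;

-- Confirm the process is up
SELECT SERVICE_NAME, ACTIVE_STATUS FROM M_SERVICES WHERE SERVICE_NAME = 'dpserver';
```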
Creating an inbound firewall rule to open TCP port 5050 in source system solved my similar issue of MS SQL adapter registration failure.
For using the https protocol, I have downloaded and installed the DU HANA_IM_DP.
This created an XS application, available at 'https://myaccountname.hana.ondemand.com/sap/hana/im/dp/proxy/dpproxy.xsjs' after providing the required accesses.
Any idea what my next step should be? Or where can I find the 'Proxy' details which are expected to be created after importing the DU?
Let me check. HANA_IM_DP is something I have to play with myself some time soon.
Very nice blog; the member comments and issues are very common ones, as I too faced most of these.
I have HANA SP09 revision 91 and source system as MSSQL 2012.
Using MssqlLogReaderAdapter I have created remote source for MSSQL DB, connection is working fine.
My issue is that when I try to add a table as a virtual table, I get the below error.
I am unable to find any helpful link or SAP Note for this issue; has anyone faced such an error?
If I am not mistaken, you are using the SDA (= ODBC) adapters. You can validate that easily: in the remote source, which adapter are you using and what is its location? My guess is it is the ODBC adapter located in the index server.
What this blog post is about are the DP adapters, the ones that require the DPAgent and which support realtime push as well.
I am using MssqlLogReaderAdapter and its source location is Agent.
Also, just for information in case it is relevant here: at SUSE Linux OS level I have applied the update shown in the screenshot.
The MssqlLogReaderAdapter is not using ODBC, it is using JDBC.
Can you show a screenshot of your remote source? Just to be sure.
Here is the screenshot.
Issue I am discussing is for this adapter only.
Now I have created one more remote source, using the MSSQL ODBC adapter, for the same source system.
But that too gives the below error message.
On my SUSE server I log in using the Hana user sidadm; when I try the below isql command to connect, that also fails.
I was able to correct the issue in the new remote source I created using the ODBC MSSQL adapter.
On SUSE Linux I logged in as sidadm and ran the below command.
Then, based on the result, I cross-checked and updated the odbc file, as I found the .odbc file in /home was missing some details.
Now I again cross-checked my first remote source, created using the MssqlLogReader adapter, believing it might start working too, but it has the same issue.
But with this I am now sure there is no privilege or authorization issue, which I had suspected before.
Using ODBC I can create virtual table now and getting them in catalog to view the content.
Philip, can we start all over again, please? The mixing of ODBC and the LogReaderAdapter confuses me.
You have shown me that you tried to create a remote source using the LogReaderAdapter. What is the exact error message you get? Reading between the lines it is something around quotes, but I am not sure, hence please post the exact error message. Okay?
Ok, let's close the ODBC part, as it is now working for me.
Now let me explain the issue I am facing while using the "MssqlLogReaderAdapter" for my remote source, which is MSSQL 2012 (non-SAP). My HANA system is SP09 revision 91.
Now this connection is connecting successfully with source SQL 2012 DB.
But when I right-click on any table and try to add it as a virtual table, it fails with the below error.
I hope that clears up the issue...
Can you try creating the virtual table via SQL? Just to narrow down the issue.
That would be
CREATE VIRTUAL TABLE virtualtable AT "remotesourcename"."<NULL>"."<NULL>"."database.owner.table";
I used the command to check the owner
select * from ownership where object_name='myremotesource';
further I used the command you provided.
CREATE VIRTUAL TABLE xxxxxx AT "Myremotesource"."<NULL>","<NULL>"."remotedbname.ownername.table";
but it failed with below error message.
Further, I found SAP Note 2170913 for my error message, which suggests changing the SAP HANA client to match the target HANA database.
In my case I have HANA DB 1.00.091.00 and my remote source has HANA client 1.00.097.02. So I updated the client on the HANA server in the hope it would work, but it did not.
But I think, as per the note, I have to reinstall the HANA client on the remote source as 1.00.091.00.
In this post
Hana Adapter SDK - Interaction via SQL
I went through the SQL statements the UIs execute, in particular the procedure which returns the list of unique names.
Can you use that to get a unique name back and then use it in the create virtual table statement?
My hope is we can isolate the problem before I ask development for support.
Following your suggestion, I referred to the SQL command you provided in your blog.
I ran the SQL command below to get the list of unique names.
Here I can see a lot of UNIQUE names (the same ones were visible directly in the HANA GUI remote source).
So in my case, what SQL command should be used to add a virtual table? (For a test I can add any table as a virtual table.)
I see there is a difference between the example you showed in the blog and the case here (in your command the UNIQUE name and the table name are the same, but here the UNIQUE names are not the tables).
CREATE VIRTUAL TABLE V_XXXX AT "MetabuilderSQL","<NULL>","<NULL>","XXXX";
What should I put at XXXX? For instance, I put Class at XXXX and it gave the below error message.
FYI, I have updated my HANA revision from SP09 rev 91 to 97, and the remote system has HANA client SP09 rev 97, but the issue is still the same.
What the procedure returned was the first level only. Call it again, this time with the second parameter being 'dbo' instead of ''. Okay?
The issue is resolved; I had opened a message with SAP.
"They replied back that double quotes are not supported in SP09 and there is no plan to fix it there. The issue is fixed in SP10, so they recommend upgrading to SP10."
I have upgraded to SP10 and am now able to add tables as virtual tables.
I have installed the DP agent in a source system (an Azure cloud VM with SQL Server) and I am trying to 'connect to HANA', i.e. HCP in my case
(I am using the cloud connector).
DP server is up as below,
Another issue I found is that I could not see the sap.hana.im.dp.admin::Administrator role when checking from SAP HANA Studio. Please let me know your thoughts.
Connected after unchecking 'Hana on cloud'
Very nice blog.
We want to replicate data to SAP HANA from our SAP ECC. Our ECC runs on Oracle 11; is there any adapter in SDI to connect to Oracle 11?
According to the PAM for Hana SP9 EIM Option the OracleLogReaderAdapter supports the Oracle 11 versions (11.1, 11.2), yes.
Thanks for your reply. We installed the DP agent, and when we look at the Oracle adapter, by default it shows the adapter version as "12c". Are there any additional steps to be done to connect to different versions?
This is the ODBC based SDA adapter. No idea why the SDA team supports 12c only.
I am installing the DP Agent in an SAP ECC 6.0 EHP7 system with an MSSQL 2012 database.
I am able to connect to the HANA system via the DP agent, but unable to see the full adapter list in the DP Agent console.
I have also installed the DP agent on an MSSQL 2012 DB (non-SAP system), where it showed the full list of adapters, but I cannot figure out why in this system it does not.
Odd. Need to check if anybody else has an idea...
Got this issue solved.
The issue was that the system had JDK 1.6.20 installed; I compared versions with the non-SAP system where the DP Agent shows all adapters, and there the JDK was 1.8.60.
I upgraded Java and restarted the server, and after that the DP Agent showed all the adapters.
Gee, I had considered asking that. But then I thought no, that would have materialized in other error messages.
We should really add a test for the Java version. Isn't that supposed to be the value of Java OSGi plugin development?
We are also trying to bring data from SQL Server 2012 into the HANA DB using SDI and the Data Provisioning technologies. We have created HANA virtual tables on the SQL Server database's tables, with the same structures and data types as the tables in the SQL Server database.
But we have encountered an issue, and it is very critical for us in our project.
Issue: when we do a data preview on a HANA virtual table which has the same table structure and data types as SQL Server, we see that columns of numeric data types (SMALLINT) may get inserted into the database with incorrect values.
For example, a SMALLINT value of 1 may be inserted into the database (HANA) as the value 256.
We have installed our HANA SPS 11 on an IBM Power 8 system running Linux. I am not sure if it is a bug in the JDBC driver or an OS-level issue.
So I wonder if you have encountered a similar issue in your system, as you have used similar technologies.
It would be really great if you have any suggestions.
Looking forward to hearing from you.
I am facing an error when I try to connect the Twitter adapter:
SAP DBTech JDBC: : internal error: Cannot get remote source objects: Agent has shutdown.
Thanks in advance.
That means Hana cannot find the agent. Assuming you are not using https (the cloud connector), then very likely the hostname of the computer running the agent is not known to the Hana server. Either use the IP address of your agent computer when registering the agent in Hana, or check the name resolution.
I have created a remote source using file adapter but do not see the configured CFG files as remote source tables under the remote source in HANA Studio. Am I missing some configuration steps?
Your Hana Studio version is broken. I have the same issue, but thought it was because of an odd combination of versions I am using. Anyway, if you look carefully at the create-remote-source screen, none of the settings are related to the file adapter, and the adapter is set to run in the index server?!? Can't be.
I haven't figured out what the problem with Hana Studio is – lack of time – so I create the remote sources manually instead. I would appreciate it if you could file a bug for Hana Studio, saying that the screen does not match the selected adapter. It seems to show the screen of the next adapter in the drop-down list, so if you pick FileAdapter it shows the Hadoop adapter screen.
The syntax to create a remote source manually for the file adapter is:
CREATE REMOTE SOURCE "mysource" ADAPTER "FileAdapter" AT LOCATION AGENT "myagent"
CONFIGURATION '<?xml version="1.0" encoding="UTF-8"?>
WITH CREDENTIAL TYPE 'PASSWORD' USING
The placeholder entries (source name, agent name, and the configuration contents) you need to adapt to your values.
Thanks a lot, Werner. creating remote source through SQL console worked.
I am convinced that the issue was due to HANA Studio.
btw, the WebIDE catalog editor works fine. I just checked.
And I wrote a post about the File adapter today, just for you.
Hana Smart Data Integration - The File Adapter
Thanks so much. What more can I ask for!!!
Thanks for this blog.
I am facing difficulty in registering an agent:
Please help me proceed.
Amit Kumar Pathak
Did you fix it? You need to use the FQDN of your host (INLN5...) and not just the hostname.
Thanks for the amazing blog.
I am unable to start the SAP HANA Data Provisioning Agent. When I start the agent it gives the following error:
"Failed to start Agent service. Return code: 5"
Can you suggest what might be wrong?
You could start the agent manually by simply executing the dpagent.exe program. An educated guess though would be to check the Java version. Is JVM 1.7 or greater installed?
Thanks Werner. I started the agent manually and the error still pops up but now the agent is running.
Another error I am facing is that the HANA on cloud option is not working for me. I am getting the following error when I try to connect the agent to my HANA instance.
Could you please help me resolve the below issue while registering adapter with SAP HANA Data Provisioning Agent.
Hi, I too faced same issue..
I just removed all registered adapters and started fresh, and it worked for me.
I have created a remote data source using the ABAP adapter. But when I try to access the tables, I get the following error:
Could you help me on how to proceed?
Likely a permission problem. What the adapter uses under the covers is the function module /BODS/MODEL_NAVIGATE. I just checked the source code; it is the only function module called when expanding the ABAP Tables node.
So: connect to SAP via SAPGUI with the same username, run transaction /nse37, and simply execute the above function module with 'SAP_ANW' as the parameter. Somewhere along those steps you will get a permission error, I guess.
Thanks for your prompt reply, you were right about the permission error.
I tried executing the /BODS/MODEL_NAVIGATE function module; it looks like it does not exist.
I assume none of the /BODS/* functions exist? Then you are missing an SAP Note for the SAP_NW component related to Data Services. Yes, this adapter uses some functions from Data Services, and these should be installed on any recent ABAP system. But you can add them without installing the latest NW version via an SAP Note.
Let me know if you can't find it, then I will try to locate it.
I found the following 4 SAP Notes (0001775077, 1775080, 1914980, 1863694); which one should I implement?
On HANA SPS09, is there any reason why I cannot see my registered agent in the "source location" for a new remote source?
The installation and everything appeared to go fine, and I see my agent in SYS.AGENTS.
My guess is you are using a non-matching Hana Studio. I have seen that occasionally. The Remote Source screen is not fault tolerant (or buggy) in certain releases of Hana Studio.
This post describes the SQL way of creating remote sources and others, maybe you can use that as a workaround?
Hana Adapter SDK - Interaction via SQL
Hi Werner, well, your guess was good.
However, I went straight to the Web IDE, looked there, and sure enough, the DP agents now appear.
1. I could not see any existing remote sources in the Web IDE.
2. I could not see the indexserver as an option, and therefore none of the previous, older adapters.
So am I to assume the older adapters are no longer supported?
It's getting odd now. The old (SDA-provided) ODBC adapters running in the Index Server are still supported. And the Web IDE should show all remote sources. But I am not aware of any issue in the Web IDE, and as the Web IDE is part of the Hana install, it cannot be a wrong version either.
yes, very odd. take a look at this, and let me know what you think ?
Eclipse on the left, Web IDE on the right
No DP Agents
All adapters are present in the system, with corresponding tables, but the (ODBC) indexserver adapters are not visible in the Web IDE, as I mention above.
1. No matter what you do, WebIDE and Studio should show the same remote source. Always.
2. When creating a new remote source as your screen shows, you pick the Adapter Name first, e.g. FileAdapter. Then Location and Agent will show where this adapter is available. The FileAdapter is not available in the indexserver; it is an agent-hosted adapter, not an ODBC adapter coming from SDA (see select * from adapter_locations;). So that part is understandable.
3. An agent which got dropped cannot have a remote source. We do not allow dropping an agent while keeping its remote sources.
Are you sure it is the same system????
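To see the two worlds side by side, the catalog views can be queried directly. A quick sanity check (select * to stay version-neutral regarding column names):

```sql
-- Adapters and where they run: indexserver = SDA/ODBC, agent = DP adapter
SELECT * FROM ADAPTER_LOCATIONS;

-- Remote sources and the adapter each one uses
SELECT * FROM REMOTE_SOURCES;
```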
1. yes, i agree
2. With the Eclipse IDE, going through your steps, you cannot choose a source location pertaining to the DP adapters. See below.
3. As you can see above, we have 4 agents registered, and only dev****agent2 has a FileAdapter configured (still not working, but that's another story). I thought I could de-register the DP agents I no longer need and they would be removed from the system – I suppose by performing the reverse of the registration steps. But to no avail.
Oh, and yes, we are definitely in the same system.
Okay, I have to give up here. I drop and create agents every other day. Can you switch to the forums or open a support call?
Hi Werner, so in the end...
I have access to another system on SP11; the one I am having issues with is SP09.
The SP11 system works fine and shows none of the issues I encountered with SP09.
Even in SP11, connections created in Studio are now visible in the Web IDE.
So I guess the best thing to do in future is to leave SP09 alone. Hopefully it can only get better.
Thanks for your help.
While creating a remote source using the HANA Adapter for a HANA Instance of an ABAP system, I am getting the following error message.
I was able to add the same system on eclipse. Please help with the same.
The agent is registered, but the file adapter is not getting registered; it says agent not found.
When you click on "register adapter", the create-adapter SQL is executed in Hana. This in turn triggers reading the adapter settings from the agent... it seems Hana cannot reach the agent.
Can you check the Hana Cockpit or via SQL, select * from m_agents;?
A typical problem is when the agent is installed on a DHCP enabled computer and the Hana server cannot resolve the hostname. Or you used the current IP address when registering the agent but the IP address changed meanwhile.
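A diagnostic sketch following the above; the agent name and IP address are placeholders, and per this thread, re-registering is the way to change the address:

```sql
-- What Hana currently knows about the registered agents, including status
SELECT * FROM M_AGENTS;

-- Re-register the agent under an address the Hana server can actually reach
-- (note: an agent that still owns remote sources cannot be dropped; remove those first)
DROP AGENT "myagent";
CREATE AGENT "myagent" PROTOCOL 'TCP' HOST '192.168.0.4' PORT 5050;
```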
I am working on SDI for HCP. I have successfully configured the FileAdapter and the HanaAdapter, but when I try to configure the MssqlECCAdapter I get the below issue.
Since this is the expected error when the JDBC driver is not registered with the DP agent, I followed the steps mentioned above and tried to include the JDBC driver in the C:\usr\sap\dataprovagent\lib path as below.
As per the guidelines I should create the */lib folder manually, but in my case it already exists under the given path. I tried copying the JDBC jar file into it, but even after this I am not able to register the MssqlECCAdapter in my data provisioning agent.
Note: I am trying to do this on a Windows desktop system instead of a server.
I kindly request you to suggest any possible fix. Thanks.
I don't like the name sqljdb4-3.0.jar.
The download I used (and the one named on help.sap.com) is a plain sqljdbc4.jar file.
My bad, it worked with the sqljdbc4.jar file. I downloaded the wrong one earlier. Thank you.
I am facing an issue while creating a replication task using the SDI ABAP adapter. I request you to please share your input on this.
I am able to create a remote source using the ABAP adapter, and I can see all the objects in the remote source panel as below.
But when I create a replication task and try to add objects for replication, I don't see all the subfolders under ABAP Tables. Please find the below screenshot of the same. I need to replicate the Logistics tables, but I don't see them here. Am I missing any specific privileges?
Thanks.
We are looking at the possibility of a two-way connection between a Windows-based Java server and our InfoCube based on the HANA DB. There is a calculation engine on the Java server; we need to send it a parameter from HANA and receive the calculated value back into our cube in HANA.
Would it be possible to achieve this using HANA SDI? If so, which adapter would help establish such a connection?
Looking for an adapter to upload XML file data into HANA using SDI. Can we use any of the currently available adapters to upload XML files?
Thanks for the fantastic blog.
I am writing a custom Google BigQuery adapter. After exporting it as a deployable plugin and fragments, when I try to register it with HANA using the DP agent running on my laptop, I get the following error:
I have put the relevant jars in C:\usr\sap\dataprovagent\lib as well. Currently the deployable jar file doesn't have the jar files packed in it.
I also tried to pack them by adding the \lib folder to bin.includes in the MANIFEST file. All jars were present in the lib folder in the project directory.
How do I pack and deploy an adapter with third-party jars?
Also, if the adapter is deployed using the DPConfig tool, how do I debug the plugin in Eclipse?
The issue is resolved now: I added the external jars to the runtime classpath in the manifest file,
exported the project as a deployable plugin, and it deployed successfully.
When the adapter is running and an exception is raised, where can I see the logs for more details?
Also, logger.info is used in the source code; where does the log file get stored?
Have you succeeded in deploying a BigQuery connector for SAP HANA SDA?
We are experiencing the same issue; could you share your steps to make it work, please?
Thanks in advance.
Can we use SDI with an S/4HANA ABAP CDS view (source) to insert data into a HANA table (target)?
If yes, please provide details.
We are using SAP HANA 2.0.
Thanks in advance.
We are facing the below issue while registering the ABAP adapter with Hana. Please help.
You are connected with Hana; that means the dpagentconfig tool was able to create a JDBC connection to your Hana. Your Hana, on the other hand, wants to ping the dpagent but cannot find it.
That can have multiple reasons, e.g. the host running the dpagent is not known to the server running Hana. Very often the problem is that you installed the dpagent on your laptop and the Hana server does not know laptops like that by name. Or it would know the name if you had specified the fully qualified name including the domain and maybe a DHCP part.
So the solution is to log in to the operating system the Hana server runs on and enter ping commands, pinging your local computer (assuming it responds to pings in general; usually it does).
Then unregister the agent and register it again with the correct hostname. A nice test is to use the IP address instead of the name. This is a proper long-term solution only in cases where the dpagent computer does not get its IP address via DHCP but has a fixed IP assignment.
Does SDI support a PeopleSoft adapter to extract data?
We have a requirement where we want to extract data from PeopleSoft and then use MC to load the data into S/4HANA on cloud.
If yes, please provide details.
Thanks in advance.
Werner, could you please suggest us a way? That would be a great help.
I have to admit I have not used the REST Adapter yet. There are two colliding approaches.
Take Twitter as an example. Does Twitter support REST APIs? Sure it does. Anything special about those? Absolutely. The login call returns a URL that you have to use for this session. It has a fixed table layout with timestamp, id, user, location, text.
When using a REST Adapter, the adapter needs to allow for the dynamic URL provided by the login, it needs the login call to be performed first, and it returns the payload as a string in JSON format.
Not nice. Hence we built a Twitter adapter to shield the user from these complexities.
To cut a long story short: Do you want to use the REST Adapter because you have so many completely different sources? Or do you have a single type of source, in which case it would be better to build an adapter that hides all the complexities, supports everything you need, and internally uses REST calls to interact with the server?
I have a requirement to extract data from an ECC system via HANA SDI (using the ABAP adapter) into the HANA database of SAP Cloud Platform. I have created a remote source and a replication task (with initial load) and am able to connect to the ECC system successfully.
Now I want to capture the changes made to the backend ECC tables using a filter. I have created a REPTASK with replication behavior "Initial + realtime" but unfortunately am getting the below error message:
[com.shell.TestSDI:DOCTP.hdbreptask] column not allowed: “MANDT” in select clause: line 1 col 8 (at pos 7)
Please refer to the image below:
If I remove the filter ( “ATLAS_DOCTYP” = ‘I’ ), then the REPTASK is saved, but during execution it gives the below error:
InternalError: dberror($.hdb.Connection.executeProcedure): 256 – SQL error, server error code: 256. sql processing error: “TESTSDI”.”com.shell.TestSDI::DOCTP.START_REPLICATION”: line 42 col 4 (at pos 1812):  (range 3) sql processing error exception: sql processing error: QUEUE: com.shell.TestSDI::DOCTP.SUB_VT_ABAPTABLES_DS1_FI_MT_DOCTP: Failed to optimize the subscription query string for remote subscription com.shell.TestSDI::DOCTP.SUB_VT_ABAPTABLES_DS1_FI_MT_DOCTP.Error: column not allowed: “MANDT” in select clause at /sapmnt/ld7272/a/HDB/jenkins_prod/workspace/HANA__FA_CO_LIN64GCC48HAPPY_rel_fa~hana1sp12/s/ptime/query/checker/check_clause.cc:4987: line 1 col 1 (at pos 0) at /sapmnt/ld7272/a/HDB/jenkins_prod/workspace/HANA__FA_CO_LIN64GCC48HAPPY_rel_fa~hana1sp12/s/ptime/query/plan_executor/trex_wrapper/trex_wrapper_body/trex_llvm.cc:914
Sometimes I get this error during the replication task:
InternalError: dberror($.hdb.Connection.executeProcedure): 686 – SQL error, server error code: 686. start task error: “TESTSDI”.”com.shell.TestSDI::DOCTP.START_REPLICATION”: line 3 col 1 (at pos 148):  (range 3) start task error exception: start task error:  executor: plan operation failed;Error executing “INSERT INTO “TESTSDI”.”com.shell.TestSDI::DOCTP.ABAPTABLES_DS1_FI_MT_DOCTP” (“BLART”, “SHKZG”, “ATLAS_DOCTYP”) SELECT “BLART”, “SHKZG”, “ATLAS_DOCTYP” FROM “TESTSDI”.”_SYS_CE__popid_4_5CF0D47EC7E85229E10000000A4E721F_26″”: Could not insert into table due to a unique constraint violation. at /sapmnt/ld7272/a/HDB/jenkins_prod/workspace/HANA__FA_CO_LIN64GCC48HAPPY_rel_fa~hana1sp12/s/ptime/query/plan_executor/trex_wrapper/trex_wrapper_body/trex_llvm.cc:914
Which replication behavior should be used to get the changes made in the ECC system (remote system) into the HANA DB via the SDI ABAP adapter?
Your expert advice is really needed here.
Thanks in advance.
Hello! I have the same problem. Were you able to resolve it? What replication behavior should be used for this? Or which adapter is optimal for replicating ABAP tables with SDI in SAP HANA?