Export data from SAP IBP to SAP APO through HCI-DS via Web Services
In this post I’d like to share with you one way to transfer key figure data from the cloud solution SAP Integrated Business Planning (IBP) to the on-premise SAP Advanced Planning and Optimization (APO) by means of SAP HANA Cloud Platform, integration services for Data Integration (for simplicity I will call it HCI-DS) and a Web Service on the target (APO) side. It is a step-by-step guide for newbies with lots of pictures. So let’s start!
SAP HANA® Cloud Platform (HCP), integration services (previously called SAP HANA Cloud Integration) integrates processes and data between cloud apps, 3rd party applications and on-premises solutions. It is an open, flexible, on-demand integration system running as a core service on SAP HANA Cloud Platform.
SAP HCP, integration services consist of two parts:
Process Integration – Integrates business processes covering different companies, organisations, or departments within an organisation;
Data Integration – Allows the efficient and secure movement of data between on-premise systems and the cloud.
SAP IBP increases visibility of data to support decisions around controlling costs, improving customer service, and much more. It works with data from operational systems such as SAP ERP and SAP APO. IBP collects and then returns data to the operational systems. Data export from SAP IBP can be performed using:
– API via a RESTful WS;
– SAP HANA Cloud Integration for Data Services – this option is shown in this guide.
Prerequisites:
– A working connection to IBP and an on-premise Data Services agent in the HCI environment;
– The source planning area must be available in IBP;
– The source planning area must contain a generated calculation scenario for key figure data export.
Part 1: SAP Advanced Planning and Optimization
All steps listed below are performed in transaction RSA1.
Create a Web Service source system.
Create a DataSource for the Web Service source system.
Check that the WEBS_PUSH adapter is selected.
Maintain the fields to be extracted, then activate the DataSource.
Create an InfoPackage for the DataSource.
Check that the system has created a function module (FM) and a Web Service for the InfoPackage.
Go to the Schedule tab and click ‘Assign Daemon’.
You need to create a daemon and then assign it to the InfoPackage. Here you also maintain how often the daemon will request data.
After assigning, the daemon is activated.
Note: after sending data from IBP you have to stop the real-time data acquisition (RDA) load process in APO in order for the data to be uploaded successfully.
Next, go to transaction SOAMANAGER and choose ‘Web Service Configuration’.
Choose the function module that was created for the InfoPackage. *Yes, you are correct – the name here is different, not CQTEST, because the screenshot shows the productive FM, not the test one; in your case the names will be the same* 🙂
Here you need to create a service and then copy the WSDL URL, which will be needed to create the Datastore in one of the next steps.
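To give a feel for what HCI-DS will eventually send to this generated service, here is a minimal Python sketch of building a SOAP envelope with push data. The operation name, namespace and field names below are hypothetical placeholders – the real ones come from the WSDL URL you just copied in SOAMANAGER:

```python
# Sketch of the kind of SOAP envelope sent to the generated push web service.
# The service namespace, operation name (RsdsPushRequest) and field names are
# made-up placeholders -- take the real ones from your WSDL.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
SVC_NS = "urn:sap-com:document:sap:soap:functions:mc-style"  # placeholder

def build_push_envelope(rows):
    """Wrap key figure rows into a SOAP envelope body."""
    ET.register_namespace("soapenv", SOAP_NS)
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    # Hypothetical operation element in the service namespace
    request = ET.SubElement(body, f"{{{SVC_NS}}}RsdsPushRequest")
    data = ET.SubElement(request, "DATA")
    for row in rows:
        item = ET.SubElement(data, "item")
        for field, value in row.items():
            ET.SubElement(item, field).text = str(value)
    return ET.tostring(envelope, encoding="unicode")

xml_payload = build_push_envelope([{"YEAR": "2016", "MONTH": "01", "KEYFIGURE": "42.5"}])
```

In practice HCI-DS builds and posts this envelope for you; the sketch only shows why the hierarchical schema (generated later in the Data Flow) has to match the WSDL structure.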
Part 2: SAP HANA Cloud Integration for Data Services
In HCI-DS go to ‘Datastores’, choose your IBP datastore and click ‘Import Objects’. Expand the calculation scenario folder, choose the needed planning area and calculation scenario, and click the ‘Import’ button.
Create a new Datastore of type SOAP Web Service and click ‘Save’.
Confirm that the connection to the Web Service is working with the ‘Test’ button.
Import the definition of the Web Service: click the ‘Import Objects’ button.
Now we are ready to create Data Flow!
Create a new Task within a Project (if needed, create a new Project first) with the IBP datastore as the source and the newly created Web Service datastore as the target.
In general our Data Flow will look as follows:
In the Query transform step you need to map the columns as required from the input to the output (the web service). Optionally, you can add filters (for example, to filter out null or zero values).
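What such a filter does can be sketched in plain Python: drop rows whose key figure value is missing or zero before they reach the web service. The field names (PRDID, QTY) are made up for illustration:

```python
# Sketch of the optional Query-transform filter: keep only rows whose key
# figure is non-null and non-zero. PRDID/QTY are illustrative field names.
def filter_rows(rows, key_figure="QTY"):
    """Keep only rows with a non-null, non-zero key figure value."""
    return [r for r in rows if r.get(key_figure) not in (None, 0, 0.0)]

rows = [
    {"PRDID": "P1", "QTY": 10.0},
    {"PRDID": "P2", "QTY": 0.0},
    {"PRDID": "P3", "QTY": None},
]
filtered = filter_rows(rows)  # only the P1 row survives
```

Filtering out empty values here reduces the number of rows pushed through the web service, which matters for the timeout issue discussed below in the BATCH step.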
In my case, besides the key figures I needed to derive the YEAR and MONTH for the WS, and I did it with the help of the substr function as follows:
YEAR: substr(CALC_SCENARIO_NAME.TSTFR,0,4) || substr(CALC_SCENARIO_NAME.PERIODID,0,0)
substr(CALC_SCENARIO_NAME.PERIODID,0,0) is a FAKE part of the mapping: in my case PERIODID is not needed in the WS target structure, but without querying it the PERIODID selection won’t work. So sometimes you need to query unneeded columns as fakes.
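To make the mapping concrete, here is what it computes, sketched in Python. In Data Services, substr takes a start position and a length, and a length of 0 yields an empty string, so the PERIODID part contributes nothing to the value – it only forces PERIODID to be selected. The TSTFR sample value (year-first format) is an assumption; check it against your own planning area:

```python
# Python equivalent of: substr(TSTFR,0,4) || substr(PERIODID,0,0)
# The PERIODID slice is always empty -- it exists only so that the column
# is queried at all (the "fake" mapping described above).
def map_year(tstfr: str, periodid: str) -> str:
    """Derive YEAR from the first 4 characters of TSTFR."""
    return tstfr[:4] + periodid[:0]  # periodid[:0] is always ""

year = map_year("20160101", "M01")  # assumed TSTFR format: YYYYMMDD
```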
I think this step is optional, but in my case, without it, the transfer of several million rows failed with a web service timeout error. At first we did additional configuration of DSConfig as suggested by SAP Support, but it didn’t help, so we decided to go with the BATCH step.
In this step you must specify the number of rows per batch to transfer in one call of the Web Service.
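The idea behind the BATCH step can be sketched as follows: instead of pushing all rows in a single web service call (which timed out for several million rows), the rows are split into fixed-size batches and sent one call per batch. The batch size of 1000 below is an illustrative value, not a recommendation:

```python
# Sketch of the BATCH step: split the row set into fixed-size slices so
# that each web service call carries at most batch_size rows.
def batches(rows, batch_size=1000):
    """Yield successive slices of at most batch_size rows."""
    for start in range(0, len(rows), batch_size):
        yield rows[start:start + batch_size]

# 2500 rows with batch_size=1000 -> 3 calls of 1000, 1000 and 500 rows
calls = list(batches(list(range(2500)), batch_size=1000))
```

Smaller batches mean more calls but a shorter runtime per call, which is what avoids the timeout; tune the size to your payload width and network.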
In this step, first click the ‘Generate Schema’ button and then choose ‘Generate schema from web service function’ to create a hierarchical structure for the Web Service call.
Map all columns of the Web Service to the selected columns from the input.
After that, generate an ‘iteration rule’: select the name of the transform, choose the ‘Iteration Rule’ tab and then choose ‘Propose rule’.
Optionally, you can filter data in columns, or group and order columns, on the corresponding tabs.
Here you only need to map the input top-level node to the output top-level node.
That’s all! Save the Data Flow and test it!
Thank you for your attention! I hope you enjoyed this guide. If you think that the instructions need to be supplemented, or you find a mistake, please write in the comments.
This article was written in collaboration with my colleague Alexander Boldesh, APO/IBP expert.
If you want to load data in the other direction, from APO to IBP, read the SAP HANA® Cloud Platform, integration services – An overview of how to connect IBP with APO blog post by Sarath Chandra Pavan Suryadevara.
Comments
I will have to face this integration, but with ECC. The batch example is going to be useful.
Could anyone please provide a resolution for this error:
While creating the flow graph we are using a data source (Virtual Function) that contains SOAP URLs. While getting data from these URLs, all String data types automatically get converted into NVARCHAR(5000) columns in the template table. In this scenario we were able to successfully load up to 1000 records, but we want to load a huge amount of data (like transactional data), and here we got stuck with the error below:
(dberror) 2048 – column store error: task framework: executor: plan operation failed; Cannot execute remote function call: ::fetchResponse : Exception while processing request “SDA Request for SOAPAdapter of type FEDERATION_GET_LOB”.
Hello Kranthi. Here is a possible suggestion that might help you in this regard:
– upgraded to revision 102.06;
– changed the max channel parameter to 40k;
– changed the handler parameter;
– removed range partitioning and configured only hash partitioning.
The first step says that I need to create a new web-service-based planning area. Now the APO users already have data in their designated DataSource, which is seen in the planning area (let us call it ZPLANNINGAREA).
So how do I connect this new web-based DataSource with this existing planning area? All the products and their CVCs exist in this planning area only.