This blog is intended for beginners and anyone new to HCI DS.
Firstly, HCI DS (HANA Cloud Integration for Data Services) is a cloud integration platform for scenarios generally involving the SAP IBP, SAP ECC (on-premise), and SAP APO systems.
HCI DS has only two instances: Sandbox and Production.
Prerequisites:
- Access to the HCI tenant
- User ID and password to access the tenant, and the Organization ID to log on to it (these are provided in the access email from SAP)
A basic understanding of the following components in HCI DS is also needed:
- Agents – An agent acts as a connector/adapter between the systems and the cloud integration platform. It provides secure connectivity, and setting it up is a one-time activity generally done by the Basis team. We usually have one agent per environment, e.g. Agent_test and Agent_Prod.
- Data stores – These hold the metadata/structural definitions of the data to be transferred between the systems. For example, the tables and SAP extractors for each system are imported under its data store.
- Projects – A project acts as a folder holding all the tasks created. Similar tasks are usually grouped under the same project, module-wise or data-wise.
- Task – The object created under a project is a task; it requires a source system and a target system at the time of creation.
- Data flow – The actual flow/process is designed in the data flow, which is created inside a task.
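The hierarchy of these components can be summarized in a small sketch. This is plain Python used purely as an illustration; the class and field names are hypothetical and are not an HCI DS API:

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative model of the HCI DS object hierarchy (hypothetical names,
# not an actual HCI DS API): a project holds tasks, and each task holds
# data flows plus the source/target systems chosen at creation time.

@dataclass
class DataFlow:
    name: str                      # the actual flow/process design

@dataclass
class Task:
    name: str
    source_system: str             # selected when the task is created
    target_system: str
    data_flows: List[DataFlow] = field(default_factory=list)

@dataclass
class Project:
    name: str                      # acts as a folder for similar tasks
    tasks: List[Task] = field(default_factory=list)

# Example: one project grouping an APO-to-IBP task with one data flow
project = Project("Demand_Planning")
task = Task("APO_to_IBP_Forecast", source_system="SAP APO", target_system="IBP")
task.data_flows.append(DataFlow("Load_Forecast"))
project.tasks.append(task)
```

The point of the sketch is only the containment relationship: data flows live inside tasks, and tasks live inside projects.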
Steps to create a scenario from APO to IBP (from a developer's perspective):
1. Create data stores for APO and IBP under the Datastores tab.
2. Import the SAP extractor/table for SAP APO under the APO data store.
3. Import the target staging table under the IBP data store.
4. Create a project under the Projects tab, or use an existing project.
5. Create a task under the project, selecting the source and target systems.
6. Create a data flow under the task created in the above step.
7. Click on Import Target Table and import the required staging table from the IBP instance.
8. The target query object is created by default, along with the IBP target table and the APO source SAP extractor.
9. Drag the transform query object from the palette and apply the filters and mappings as required.
10. Connect this transform to the target query and do the required mapping to the target structure.
11. Once the design is complete, click on Validate.
12. In case of errors, check the mappings as per the error details. You can proceed to Run Test if there are only warnings.
13. Select the task, click on Run Now, and select the View History option.
14. View the monitor log in View History to check the records that are selected based on the filter criteria.
15. Check the error log in View History to view any errors for the runs.
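Conceptually, the transform query in the steps above does two things: it filters the rows coming from the source extractor and maps the source fields onto the target staging structure. A minimal sketch of that idea in plain Python (the field names and filter value are hypothetical; HCI DS itself expresses this graphically, not in code):

```python
# Conceptual sketch of what a transform query does: filter source rows
# and map source fields onto the target staging columns.
# Field names and the filter value are hypothetical, for illustration only.

source_rows = [
    {"MATNR": "FG-100", "WERKS": "1000", "MENGE": 50},
    {"MATNR": "RM-200", "WERKS": "1000", "MENGE": 30},
]

# Filter: keep only materials starting with "FG" (finished goods)
filtered = [r for r in source_rows if r["MATNR"].startswith("FG")]

# Mapping: rename source fields to the target staging columns
field_map = {"MATNR": "PRDID", "WERKS": "LOCID", "MENGE": "QTY"}
target_rows = [{field_map[k]: v for k, v in r.items()} for r in filtered]

print(target_rows)  # [{'PRDID': 'FG-100', 'LOCID': '1000', 'QTY': 50}]
```

The monitor log mentioned above would then show how many records passed the filter, and the error log would show rows that failed to load into the target.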
A few important points to remember:
- Select the task and click on Promote to move the objects from Sandbox to Production.
- Do not open an object in edit mode unless you actually intend to edit it.
- There is no version history for changes made to an object unless a version is manually saved separately after making the changes.
- To check whether the job was successful in IBP, go to the IBP portal, navigate to Administration -> Data Integration Jobs, select the run, and download the full report to check for any errors.
- The Execution Properties tab in the task needs to be filled in for the run-time properties to be assigned. For IBP as the target, a few mandatory execution properties need to be specified under this tab.
- The Promoted column under the Projects tab shows whether or not the object has been moved to Production.
Monitoring & scheduling:
- Click on the View History button to check the monitoring status of any task.
- Click on the Scheduling button under the Dashboard tab to monitor the scheduled tasks.