SAP integrations provide a framework for connecting the omni-channel commerce capabilities of SAP Hybris Commerce with other SAP products.
Architecture of SAP Integrations
SAP integrations involve various possible system landscapes, types of communication, data transfer processes, and SAP-specific configuration settings.
For now, let’s focus on the Hybris Data Hub concept.
- Data hubs are an important component in information architecture.
- A data hub is a database which is populated with data from one or more sources and from which data is taken to one or more destinations.
- A database that is situated between one source and one destination is more appropriately termed a “staging area”.
Hybris Data Hub
Receives data from, and sends data to, SAP ERP or hybris (data replication in both directions).
Example: the Hybris Data Hub can replicate product master data from SAP ERP to hybris, or send orders from hybris to SAP ERP.
- It can be used to connect hybris to non-SAP systems.
Execution takes place in the following major steps:
- Load (raw format)
- Composition (canonical format)
- Publication (target format)
- It also acts as a staging area where external data can be analysed for errors and corrected before being fed into hybris.
1. Raw items come directly from the source system.
- The data they contain may undergo some preprocessing before entering the Data Hub.
2. Canonical items have been processed by the Data Hub and no longer resemble their raw origins.
- They are organised, consistent, and ready for use by a target system.
3. Target items are transient items.
- The process of creating target items reconciles the canonical items with the target system’s requirements.
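The three item types can be illustrated with a small sketch (plain Python with invented attribute names; Data Hub itself is configured declaratively, so this is only a mental model, not its API):

```python
# Hypothetical raw fragments arriving from two source feeds,
# all sharing an integration key ("code").
raw_items = [
    {"code": "P100", "name": "Bolt M8"},                  # from a master-data feed
    {"code": "P100", "price": "0.12"},                    # from a pricing feed
    {"code": "P200", "name": "Nut M8", "price": "0.05"},
]

def compose(raw_items):
    """Composition: merge raw fragments sharing an integration key
    into one canonical item per key."""
    canonical = {}
    for fragment in raw_items:
        canonical.setdefault(fragment["code"], {}).update(fragment)
    return canonical

def publish(canonical):
    """Publication: map canonical items onto the target system's format
    (the flat attribute names here are made up for illustration)."""
    return [
        {"productCode": c["code"],
         "productName": c.get("name"),
         "unitPrice": c.get("price")}
        for c in canonical.values()
    ]

canonical = compose(raw_items)  # 3 raw fragments become 2 canonical items
targets = publish(canonical)    # target items, ready for the target system
```

Three raw fragments collapse into two canonical items (both P100 fragments merge), which are then rendered in the target format.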
Feeds and Pools
- Data Hub enables the management of data load and composition with the use of feeds and pools.
- Fragmented items can be loaded into distinct data pools using any number of separate feeds.
- Feeds and Pools allow a fine level of control over how data is segregated and processed.
- Data feeds enable you to configure how raw fragments enter Data Hub.
- When data is loaded using one of the available input channels, it is passed on to a data feed as specified in the load payload.
- The data feed then delivers raw items for processing to a specified data pool.
- Data pools enable you to isolate and manage your data within Data Hub, for composition and publishing.
- During the composition of raw items into canonical items, only data isolated within each pool is used.
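As a rough mental model of feeds and pools (again plain Python with invented feed and pool names, not an actual Data Hub API), each feed routes raw items into exactly one pool, and composition then works per pool:

```python
# Each feed is configured to deliver into one pool (names invented).
feed_to_pool = {"ERP_FEED": "ERP_POOL", "CRM_FEED": "CRM_POOL"}

pools = {}  # pool name -> raw items isolated in that pool

def load(feed, raw_item):
    """Loading: the feed named in the load payload decides the pool."""
    pools.setdefault(feed_to_pool[feed], []).append(raw_item)

load("ERP_FEED", {"code": "P100", "name": "Bolt M8"})
load("ERP_FEED", {"code": "P100", "price": "0.12"})
load("CRM_FEED", {"code": "C900", "contact": "Jane"})

# Composition would now run separately per pool: the two ERP_POOL
# fragments for P100 can be merged, while CRM_POOL's data is never
# mixed in -- that is the isolation the pools provide.
```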
Step-by-step procedure for Data Hub setup on a local system.
Prerequisites:
1. Hybris v6.2.0 (the version I’m using)
2. Apache Tomcat 7.x
Create the following folder structure on any drive: a datahub6.2 folder containing config, crm, erp, and others subfolders (these paths are referenced in the context file below).
- Open your Tomcat folder -> go to conf -> create a Catalina folder -> inside it a localhost folder -> and there create a file named datahub-webapp.xml.
Copy the following content into the datahub-webapp.xml file:
<Context antiJARLocking="true"
         docBase="<YOUR_PATH>\hybris\bin\ext-integration\datahub\web-app\datahub-webapp-220.127.116.11-RC13.war"
         reloadable="true">
    <Loader className="org.apache.catalina.loader.VirtualWebappLoader"
            virtualClasspath="<YOUR_PATH>/datahub6.2/config;
                              <YOUR_PATH>/datahub6.2/crm/*.jar;
                              <YOUR_PATH>/datahub6.2/erp/*.jar;
                              <YOUR_PATH>/datahub6.2/others/*.jar"/>
</Context>
- Go to the WAR location in your hybris installation and copy both the path and the file name into the docBase attribute above (the WAR file name must match the one actually shipped with your installation).
- Save the file in your Tomcat folder under conf -> Catalina -> localhost -> datahub-webapp.xml.
Now start Tomcat:
- Go to the Tomcat bin folder and run startup.bat from a command prompt.
- This automatically opens a new command prompt window in which Tomcat runs.
- Observe the change in the folders after the server starts.
Now go to the datahub6.2 folder which we created earlier and create the following files:
- encryption-key.txt (plain text file)
- local.properties (properties file)
- Generate a 32-character key and paste it into the encryption-key.txt file.
- Copy the following configuration into the local.properties file:
#DB Setup
dataSource.className=com.mysql.jdbc.jdbc2.optional.MysqlDataSource
dataSource.driverClass=com.mysql.jdbc.Driver
dataSource.jdbcUrl=jdbc:mysql://localhost/datahub?useConfigs=maxPerformance
dataSource.username=root
dataSource.password=root

#media storage
mediaSource.className=com.mysql.jdbc.jdbc2.optional.MysqlDataSource
mediaSource.driverClass=com.mysql.jdbc.Driver
mediaSource.jdbcUrl=jdbc:mysql://localhost/datahub?useConfigs=maxPerformance
mediaSource.username=root
mediaSource.password=root

# Not sure why we need to set it here.
datahub.extension.exportURL=http://localhost:9001/datahubadapter
datahub.extension.username=admin
datahub.extension.password=nimda

# Not sure why we need to set it here.
targetsystem.hybriscore.url=http://localhost:9001/datahubadapter
targetsystem.hybriscore.username=admin
targetsystem.hybriscore.password=nimda

# Encryption
datahub.encryption.key.path=encryption-key.txt

# enable/disable secured attribute value masking
datahub.secure.data.masking.mode=true
# set the masking value
datahub.secure.data.masking.value=*******

kernel.autoInitMode=create-drop
#kernel.autoInitMode=update

#====
sapcoreconfiguration.pool=SAPCONFIGURATION_POOL
#sapcoreconfiguration.autocompose.pools=GLOBAL,SAPCONFIGURATION_POOL
#sapcoreconfiguration.autopublish.targetsystemsbypools=GLOBAL.HybrisCore

datahub.publication.saveImpex=true
datahub.server.url=http://localhost:8080/datahub-webapp/v1
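For the encryption-key.txt file referenced above, one way to produce the 32-character key is a short Python one-off (a sketch using the standard secrets module; the post only specifies the length, so the hex format is my assumption):

```python
import secrets

# 16 random bytes rendered as 32 hexadecimal characters.
key = secrets.token_hex(16)
print(key)  # paste the printed value into encryption-key.txt
```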
Go to the following path and copy the highlighted jar files into the crm folder.
Do the same for the erp folder.
Go to the others folder:
- Download the MySQL Connector/J zip, extract it, and copy only the mysql-connector-java-5.1.30-bin.jar file into the others folder.
Log in to MySQL Workbench.
- Create a schema named datahub (the name used in the jdbcUrl in local.properties).
Start the Tomcat server.
To disable Spring Security:
- Stop the server.
- Go to the deployed Data Hub web application folder, open the web.xml file, and comment out the Spring Security configuration.
- Now start the server again.
- Go to http://localhost:8080/datahub-webapp/v1/data-feeds
- The default feeds and pools will be displayed.
Tables will be created automatically in the datahub schema we set up earlier (kernel.autoInitMode=create-drop triggers this on startup).
In the next blog, I will post how to load data into Data Hub using IDocs and CSV, and further how to send IDocs directly from SAP ERP to Hybris (Asynchronous Order Management).
Thanks for reading 🙂