
This blog is a continuation of SAP Hybris – Customizing Data Hub.

Load IDoc to Hybris

  • IDoc is an SAP object that carries the data of a business transaction from one system to another in the form of an electronic message.
  • IDoc is an acronym for Intermediate Document.
  • The purpose of an IDoc is to transfer data or information from SAP to other systems and vice versa.
  • The transfer from SAP to non-SAP system is done via EDI (Electronic Data Interchange) subsystems whereas for transfer between two SAP systems, ALE is used.

Step-by-step procedure:

Step 1: Ensure the customproduct extension from the previous blog has been loaded successfully.

Step 2: Create the XML file (spring.xml).


Note: Below is a sample IDoc (just an example; try it with your own IDoc).

<?xml version="1.0" encoding="UTF-8"?>
<ZCRMXIF_PRODUCT_MATERIAL>
 <IDOC BEGIN="1">
  <EDI_DC40 SEGMENT="1">
   <IDOCTYP>ZCRMXIF_PRODUCT_MATERIAL</IDOCTYP>
   <MESTYP>CRMXIF_PRODUCT_MATERIAL_SAVE</MESTYP>
  </EDI_DC40>
  <E101COMXIF_PRODUCT_MATERIAL SEGMENT="1">
   <PRODUCT_ID>EOS-30D-1623432_V1</PRODUCT_ID>   
  </E101COMXIF_PRODUCT_MATERIAL>
 </IDOC>
</ZCRMXIF_PRODUCT_MATERIAL>
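Data Hub extracts the segment values from this payload: the IDOCTYP value drives the routing in Step 3, and PRODUCT_ID becomes the raw attribute value. As a rough illustration of what gets pulled out (not Data Hub's actual code), here it is in a few lines of Python:

```python
import xml.etree.ElementTree as ET

# The sample IDoc from above, inlined for illustration.
IDOC_XML = """<?xml version="1.0" encoding="UTF-8"?>
<ZCRMXIF_PRODUCT_MATERIAL>
 <IDOC BEGIN="1">
  <EDI_DC40 SEGMENT="1">
   <IDOCTYP>ZCRMXIF_PRODUCT_MATERIAL</IDOCTYP>
   <MESTYP>CRMXIF_PRODUCT_MATERIAL_SAVE</MESTYP>
  </EDI_DC40>
  <E101COMXIF_PRODUCT_MATERIAL SEGMENT="1">
   <PRODUCT_ID>EOS-30D-1623432_V1</PRODUCT_ID>
  </E101COMXIF_PRODUCT_MATERIAL>
 </IDOC>
</ZCRMXIF_PRODUCT_MATERIAL>"""

root = ET.fromstring(IDOC_XML.encode("utf-8"))
# IDOCTYP decides which mapping service handles the message (Step 3);
# PRODUCT_ID is the value that ends up in the RawCustomProduct item.
idoctyp = root.findtext("IDOC/EDI_DC40/IDOCTYP")
product_id = root.findtext("IDOC/E101COMXIF_PRODUCT_MATERIAL/PRODUCT_ID")
```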

Step 3:

Now add the custom raw data to “customproduct-raw-datahub-extension.xml”. Based on the IDoc, we will add custom raw items to this XML file and also update “customproduct-raw-datahub-extension-spring.xml”.

  • customproduct-raw-datahub-extension.xml
<extension xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://www.hybris.com/schema/"
		   xsi:schemaLocation="http://www.hybris.com/schema/ http://www.hybris.com/schema/datahub-metadata-schema-1.3.0.xsd"
		   name="customproduct-raw">

	<dependencies>
		<dependency>
			<extension>customproduct-canonical</extension>
			<extension>saperpproduct-canonical</extension>
		</dependency>
	</dependencies>
	<rawItems>
		<item>
			<type>RawCustomProduct</type>
			<description>Raw representation of a sample raw item</description>
			<attributes>
				<attribute>
					<name>E101COMXIF_PRODUCT_MATERIAL-PRODUCT_ID</name>
				</attribute>
			</attributes>
		</item>
	</rawItems>

	<canonicalItems>
		<item>
			<type>CanonicalCustomProduct</type>
			<attributes>
				<attribute>
					<name>productId</name>
					<transformations>
						<transformation>
							<rawSource>RawCustomProduct</rawSource>
							<expression>E101COMXIF_PRODUCT_MATERIAL-PRODUCT_ID</expression>
						</transformation>
					</transformations>
				</attribute>
			</attributes>
		</item>
	</canonicalItems>
</extension>
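The `<transformations>` section above declares that the canonical attribute productId is filled from the raw attribute E101COMXIF_PRODUCT_MATERIAL-PRODUCT_ID. A minimal Python sketch of what that declaration means during composition (illustration only, not Data Hub's implementation):

```python
# Canonical attribute -> raw source expression, mirroring the
# <transformation> declared in the extension XML above.
TRANSFORMATIONS = {
    "productId": "E101COMXIF_PRODUCT_MATERIAL-PRODUCT_ID",
}

def compose(raw_item):
    """Map one raw item to a canonical item using the declared expressions."""
    return {canonical: raw_item.get(raw_attr)
            for canonical, raw_attr in TRANSFORMATIONS.items()}

raw = {"E101COMXIF_PRODUCT_MATERIAL-PRODUCT_ID": "EOS-30D-1623432_V1"}
canonical = compose(raw)
```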

 

 

  • customproduct-raw-datahub-extension-spring.xml
<beans xmlns="http://www.springframework.org/schema/beans" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:context="http://www.springframework.org/schema/context" xmlns:int="http://www.springframework.org/schema/integration"
xmlns:int-xml="http://www.springframework.org/schema/integration/xml"
xmlns:util="http://www.springframework.org/schema/util"
xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
                    http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context.xsd
                    http://www.springframework.org/schema/integration/xml http://www.springframework.org/schema/integration/xml/spring-integration-xml.xsd
                  http://www.springframework.org/schema/integration http://www.springframework.org/schema/integration/spring-integration.xsd
                  http://www.springframework.org/schema/util http://www.springframework.org/schema/util/spring-util.xsd">
<!-- ========================== -->
<!-- Spring-Integration Content -->
<!-- ========================== -->

	<int:channel id="idocXmlInboundChannel">
		<int:interceptors>
			<int:wire-tap channel="logger" />
		</int:interceptors>
	</int:channel>

	<int:logging-channel-adapter log-full-message="true" id="logger" level="DEBUG" />

	<bean id="idocInboundService" class="com.hybris.datahub.sapidocintegration.spring.HttpInboundService">
		<property name="idocXmlInboundChannel" ref="idocXmlInboundChannel" />
	</bean>

	<!-- Data Hub input channel for raw data -->
	<int:channel id="rawFragmentDataInputChannel" />

	<!-- Maps received IDOCs by value of header attribute: "IDOCTYP" to corresponding mapping service -->
	<int:header-value-router input-channel="idocXmlInboundChannel" header-name="IDOCTYP">
		<int:mapping value="ZCRMXIF_PRODUCT_MATERIAL" channel="ZCRMMATMAS" />
	</int:header-value-router>

	<!-- sap crm product -->
	<int:service-activator input-channel="ZCRMMATMAS" output-channel="rawFragmentDataInputChannel" ref="customproductCRMMappingService"	method="map" />
	
	<!-- Dummy implementations of mapping services implemented elsewhere -->	
	<bean id="customproductCRMMappingService" class="com.hybris.datahub.sapidocintegration.IDOCMappingService">
		<property name="rawFragmentDataExtensionSource" value="customproduct" />
		<property name="rawFragmentDataType" value="RawCustomProduct" />
	</bean>	
</beans>
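The `<int:header-value-router>` above is what decides where an incoming IDoc goes: it reads the IDOCTYP header and forwards the message to the ZCRMMATMAS channel, whose service activator then maps it into a RawCustomProduct. Conceptually (a toy Python sketch, not the Spring Integration implementation):

```python
# Header value -> output channel, mirroring the <int:mapping> above.
ROUTES = {"ZCRMXIF_PRODUCT_MATERIAL": "ZCRMMATMAS"}

def route(headers):
    """Return the output channel for a message, or None if unmapped."""
    return ROUTES.get(headers.get("IDOCTYP"))

channel = route({"IDOCTYP": "ZCRMXIF_PRODUCT_MATERIAL"})
```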

Step 4:

Now add the custom canonical data to “customproduct-canonical-datahub-extension.xml”.

  • customproduct-canonical-datahub-extension.xml
<extension xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://www.hybris.com/schema/"
		   xsi:schemaLocation="http://www.hybris.com/schema/ http://www.hybris.com/schema/datahub-metadata-schema-1.3.0.xsd"
		   name="customproduct-canonical">

	<canonicalItems>
		<item>
			<type>CanonicalCustomProduct</type>
			<description>Canonical representation of sample item</description>
			<status>ACTIVE</status>
			<attributes>
				<attribute>
					<name>productId</name>
					<model>
						<localizable>false</localizable>
						<collection>false</collection>
						<type>String</type>
						<primaryKey>true</primaryKey>
					</model>
				</attribute>
			</attributes>
		</item>
	</canonicalItems>
</extension>

Step 5:

Now add the custom target data to “customproduct-target-datahub-extension.xml”.

  • customproduct-target-datahub-extension.xml
<extension xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://www.hybris.com/schema/"
		   xsi:schemaLocation="http://www.hybris.com/schema/ http://www.hybris.com/schema/datahub-metadata-schema-1.3.0.xsd"
		   name="customproduct-target">

	<dependencies>
		<dependency>
			<extension>customproduct-canonical</extension>
		</dependency>
	</dependencies>

<targetSystems>
		<targetSystem>
			<name>HybrisCore</name>
			<type>HybrisCore</type>
			<exportURL>${datahub.extension.exportURL}</exportURL>
			<userName>${datahub.extension.username}</userName>
			<password>${datahub.extension.password}</password>
			<exportCodes>
			</exportCodes>
			<targetItems>
				<item>
					<type>TargetCustomProduct</type>
					<exportCode>Product</exportCode>
					<description>Hybris Platform representation of Product</description>
					<updatable>true</updatable>
					<canonicalItemSource>CanonicalCustomProduct</canonicalItemSource>
					<status>ACTIVE</status>
					<attributes>
						<attribute>
							<name>identifier</name>
							<localizable>false</localizable>
							<collection>false</collection>
							<transformationExpression>productId</transformationExpression>
							<exportCode>code[unique=true]</exportCode>
							<mandatoryInHeader>true</mandatoryInHeader>
						</attribute>											
					</attributes>
				</item>
			</targetItems>
		</targetSystem>
	</targetSystems>

</extension>
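During publication, Data Hub generates ImpEx from this target item definition: the item's exportCode becomes the ImpEx type and each attribute's exportCode becomes a header column. A rough sketch of that mapping (the exact ImpEx Data Hub emits may differ in detail):

```python
# Illustrative only: build an ImpEx header line from a target item's
# export code and its attributes' export codes, as declared in the
# customproduct-target extension XML above.
def impex_header(export_code, attribute_export_codes):
    return "INSERT_UPDATE {};{}".format(
        export_code, ";".join(attribute_export_codes))

# The TargetCustomProduct item above exports as "Product" with one
# attribute mapped to code[unique=true].
header = impex_header("Product", ["code[unique=true]"])
```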

Step 6: Run the command mvn clean install.

Step 7: Now go to the

<YOURPATH>\datahub6.2\archetype\customproduct\customproduct-canonical\target

Step 8: Copy the jar file generated in the target folder above to the folder below:

<YOURPATH>\datahub6.2

The same procedure is followed for the raw and target jars as well.

Step 9: Go to the path <YOURPATH>\datahub6.2\archetype\customproduct\customproduct-raw\target

Copy the jar file into crm folder

Step 10: Go to the path <YOURPATH>\datahub6.2\archetype\customproduct\customproduct-target\target

Copy the jar file into crm folder

Note: If any changes are made to the customproduct raw, canonical, or target extensions, Step 5 to Step 10 should be repeated and the Tomcat server restarted.
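Because the build-and-copy loop in Steps 6–10 recurs after every change, it is worth scripting. A hypothetical helper (the deploy_jars name and the directory layout are assumptions for illustration; adjust the paths to your installation):

```python
import shutil
import tempfile
from pathlib import Path

def deploy_jars(archetype_dir, datahub_home, extensions):
    """Copy each extension's built jar from its Maven target folder
    into the Data Hub deployment folder (Steps 7-10)."""
    copied = []
    for ext in extensions:
        target_dir = Path(archetype_dir) / "customproduct" / ext / "target"
        for jar in sorted(target_dir.glob("*.jar")):
            shutil.copy(str(jar), str(datahub_home))
            copied.append(jar.name)
    return copied

# Demonstration against a throwaway directory layout standing in for
# <YOURPATH>\datahub6.2\archetype (the jar name is made up):
tmp = Path(tempfile.mkdtemp())
jar_dir = tmp / "archetype" / "customproduct" / "customproduct-raw" / "target"
jar_dir.mkdir(parents=True)
(jar_dir / "customproduct-raw-1.0.0.jar").write_bytes(b"")
datahub_home = tmp / "datahub6.2"
datahub_home.mkdir()
copied = deploy_jars(tmp / "archetype", datahub_home, ["customproduct-raw"])
```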

Step 11: Now install the Postman app for Chrome via https://chrome.google.com/webstore/detail/postman/fhbjgbiflinjbdggehcddcbncdddomop?hl=en

Step 12: Click the Launch App button.

Step 13: Open the Postman app and add the header details as below:

Headers: Content-Type: application/xml

URL: http://localhost:8080/datahub-webapp/v1/idoc/receiver

Step 14: Go to Body -> raw, add the IDoc, and click the Send button.

  • We should get a 200 response, which means the request succeeded.
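If you prefer not to use Postman, the same POST can be issued from code. A sketch that builds the request with Python's standard library (the IDoc body here is a placeholder; the request is constructed but not sent, since sending needs a running Data Hub):

```python
import urllib.request

# Same endpoint and header as the Postman setup in Step 13.
url = "http://localhost:8080/datahub-webapp/v1/idoc/receiver"
idoc_body = b"<ZCRMXIF_PRODUCT_MATERIAL>...</ZCRMXIF_PRODUCT_MATERIAL>"  # placeholder

req = urllib.request.Request(url, data=idoc_body,
                             headers={"Content-Type": "application/xml"},
                             method="POST")
# urllib.request.urlopen(req)  # uncomment against a running Data Hub
```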

Step 15: Now go to MySQL Workbench and check the “rawitem” table data.

DATA COMPOSITION

Data composition means the transfer of data from raw items to canonical items.

Step 16: Post to the URL below:

URL: http://localhost:8080/datahub-webapp/v1/pools/GLOBAL/compositions

Step 17: Now go to MySQL Workbench and check the “canonicalitem” table data.

DATA PUBLICATION

Data publication means the transfer of data from canonical items to target items.

Step 18: Post to the URL below:

http://localhost:8080/datahub-webapp/v1/pools/GLOBAL/publications
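Steps 16 and 18 are both plain POSTs. A sketch building them with Python's standard library (the publication JSON body names the HybrisCore target system declared in Step 5; the requests are built but not sent here):

```python
import json
import urllib.request

BASE = "http://localhost:8080/datahub-webapp/v1"

# Step 16: composition needs no body.
compose_req = urllib.request.Request(
    BASE + "/pools/GLOBAL/compositions", method="POST")

# Step 18: publication takes a small JSON body naming the target system.
publication_body = {
    "poolName": "GLOBAL",
    "targetSystemPublications": [{"targetSystemName": "HybrisCore"}],
}
publish_req = urllib.request.Request(
    BASE + "/pools/GLOBAL/publications",
    data=json.dumps(publication_body).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST")
```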

Step 19: Now go to MySQL Workbench and check the “targetitem” table data.

Step 20: Start the Hybris server with the recipe “sap_aom_som_b2b_b2c”.

If the datahubadapter extension is not listed in the localextensions.xml file in the config folder, add it manually:

<extension name="datahubadapter" />

Step 21: Go to hMC, Catalog -> Products, enter the product ID, and click Search.

  • The product sent via the IDoc can now be viewed in hMC.

Thanks for reading 🙂


13 Comments


  1. Aditya Mishra

    I completed all the steps successfully except publishing the data, in the process of publishing the data, I’m getting de.hybris.platform.impex.jalo.imp.InsufficientDataException, target type Product is not permitted by current header – type Product requires missing column [catalogVersion], how to resolve this error

  2. Srinivas Jayanna

    Hi  Sharmila,

    Completed all the steps, but there is some issue with data publication. Step 18 worked fine and I got a 200 response.

    In MySQL, the Raw and Canonical items look good, but the Target item is not reflecting. Do you know what could be the issue?

     

    Thanks & Regards

    Srinivas H J

  3. Sharmila Gurubelli Post author

    Hi Srinivas/Mehamood,

    If the status is 200, the product will definitely be replicated to Hybris.

    I don’t know why it is not showing in SQL.

    But please repeat the composition and publication steps and check in Hybris -> hMC -> Products.

    In Products search your item.

     

    Thank you,

    Sharmila.

    1. Srinivas Jayanna

      Hi Mehamood,

      In my case the MySQL configuration was wrong in the local.properties file. After correcting it, it's working fine.

      Check your MySQL configuration in the local.properties file and make sure you have provided the right schema name and credentials.

      Thanks & Regards

      Srinivas H J

       

  4. Pramod Khamkar

    Hi Sharmila,

    I followed above steps and when I am following step 18 I am getting error like

    status:400 Bad Request

    Target system ‘HybrisCore’ not found for request publication data: TargetSystemPublicationData{publicationId=null, startTime=null, endTime=null, status=’null’, targetSystemName=’HybrisCore’, actionId=3, poolName=’null’, numberOfErrors=0, canonicalItemCount=null}

    and also in the MySQL table as well as in the hMC product catalog I am not able to see the product.

    In MySQL inside rawitem table and canonicalitem table I am able to see the data.

    I have set the request type as POST and the content type as application/json while posting data. The data is as follows:

    {
      "poolName": "GLOBAL",
      "targetSystemPublications": [
        { "targetSystemName": "HybrisCore" }
      ]
    }

    Not able to figure out where I am doing wrong.

    Please help.

    Thanks in advance.

    Regards,

    Pramod Khamkar

     

    1. Srinivas Jayanna

      Hi Pramod,

      The target system name should match what you provided in Step 5. Also, if you made any changes to these files, make sure you replace the jars as mentioned in Step 10.

      1. vishal patil

        Hello Srinivas,

        I followed above steps and when I am following step 18 I am getting error like

        java.lang.IllegalStateException: The target system HybrisCore at https://localhost:9002/datahubadapter is unavailable for target system publication id 2 and pool GLOBAL:
        javax.net.ssl.SSLHandshakeException: java.security.cert.CertificateException: No name matching localhost found.
        at com.hybris.datahub.service.impl.DefaultTargetSystemPublicationAvailabilityService.pingTargetSystem(DefaultTargetSystemPublicationAvailabilityService.java:70) [datahub-service-6.2.0.1-RC1.jar:6.2.0.1-RC1]
        at com.hybris.datahub.service.impl.DefaultTargetSystemPublicationAvailabilityService.lambda$isTargetSystemAvailable$55(DefaultTargetSystemPublicationAvailabilityService.java:39) [datahub-service-6.2.0.1-RC1.jar:6.2.0.1-RC1]
        at org.springframework.retry.support.RetryTemplate.doExecute(RetryTemplate.java:276) [spring-retry-1.1.3.RELEASE.jar:na]
        at org.springframework.retry.support.RetryTemplate.execute(RetryTemplate.java:172) [spring-retry-1.1.3.RELEASE.jar:na]
        at com.hybris.datahub.service.impl.DefaultTargetSystemPublicationAvailabilityService.isTargetSystemAvailable(DefaultTargetSystemPublicationAvailabilityService.java:48) [datahub-service-6.2.0.1-RC1.jar:6.2.0.1-RC1]
        at com.hybris.datahub.service.impl.DefaultPublicationActionService.lambda$filterUnavailableTargetSystems$68(DefaultPublicationActionService.java:124) [datahub-service-6.2.0.1-RC1.jar:6.2.0.1-RC1]
        at java.util.ArrayList.forEach(ArrayList.java:1249) ~[na:1.8.0_144]
        at com.hybris.datahub.service.impl.DefaultPublicationActionService.filterUnavailableTargetSystems(DefaultPublicationActionService.java:123) [datahub-service-6.2.0.1-RC1.jar:6.2.0.1-RC1]
        at com.hybris.datahub.service.impl.PublicationActionHandler.handlePublicationAction(PublicationActionHandler.java:108) ~[datahub-service-6.2.0.1-RC1.jar:6.2.0.1-RC1]
        at com.hybris.datahub.service.impl.PublicationActionHandler.handleAction(PublicationActionHandler.java:95) ~[datahub-service-6.2.0.1-RC1.jar:6.2.0.1-RC1]
        at com.hybris.datahub.service.impl.PublicationActionHandler.handleAction(PublicationActionHandler.java:70) ~[datahub-service-6.2.0.1-RC1.jar:6.2.0.1-RC1]
        at com.hybris.datahub.command.impl.AbstractPerformCommand.lambda$execute$153(AbstractPerformCommand.java:57) ~[datahub-service-6.2.0.1-RC1.jar:6.2.0.1-RC1]
        at java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1626) ~[na:1.8.0_144]
        at com.hybris.datahub.service.ExceptionHandlingAsyncTaskExecutor$2.run(ExceptionHandlingAsyncTaskExecutor.java:80) ~[datahub-service-6.2.0.1-RC1.jar:6.2.0.1-RC1]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[na:1.8.0_144]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[na:1.8.0_144]
        at java.lang.Thread.run(Thread.java:748) ~[na:1.8.0_144]
        2017-08-27 22:38:10,933 [DEBUG] [c.h.d.s.i.DefaultEventPublicationService] Publishing data hub event : TargetSystemPublicationCompletedEvent{publicationId=2}
        2017-08-27 22:38:10,965 [DEBUG] [c.h.d.s.p.i.DefaultDataHubPublicationService] setting publication status to FAILURE
        2017-08-27 22:38:10,981 [DEBUG] [c.h.d.p.i.DefaultProcessMonitor] Unregistered Publication #2, 0 processes running
        2017-08-27 22:38:10,981 [DEBUG] [c.h.d.s.i.DefaultEventPublicationService] Publishing data hub event : PublicationCompletedEvent{actionId=2}

        Thanks,

        Vishal Patil

  5. Mehamood Mastansab

    Hi Sharmila,

     

    I published the product successfully. Could you please provide a sample idoc for customer master data, like DEBMAS07 or something like that.

     

    Thanks,

    Mehamood

  6. vishal patil

    Hello Sharmila,

    I followed above steps and when I am following step 18 I am getting error like

    java.lang.IllegalStateException: The target system HybrisCore at https://localhost:9002/datahubadapter is unavailable for target system publication id 2 and pool GLOBAL:
    javax.net.ssl.SSLHandshakeException: java.security.cert.CertificateException: No name matching localhost found.
    at com.hybris.datahub.service.impl.DefaultTargetSystemPublicationAvailabilityService.pingTargetSystem(DefaultTargetSystemPublicationAvailabilityService.java:70) [datahub-service-6.2.0.1-RC1.jar:6.2.0.1-RC1]
    at com.hybris.datahub.service.impl.DefaultTargetSystemPublicationAvailabilityService.lambda$isTargetSystemAvailable$55(DefaultTargetSystemPublicationAvailabilityService.java:39) [datahub-service-6.2.0.1-RC1.jar:6.2.0.1-RC1]
    at org.springframework.retry.support.RetryTemplate.doExecute(RetryTemplate.java:276) [spring-retry-1.1.3.RELEASE.jar:na]
    at org.springframework.retry.support.RetryTemplate.execute(RetryTemplate.java:172) [spring-retry-1.1.3.RELEASE.jar:na]
    at com.hybris.datahub.service.impl.DefaultTargetSystemPublicationAvailabilityService.isTargetSystemAvailable(DefaultTargetSystemPublicationAvailabilityService.java:48) [datahub-service-6.2.0.1-RC1.jar:6.2.0.1-RC1]
    at com.hybris.datahub.service.impl.DefaultPublicationActionService.lambda$filterUnavailableTargetSystems$68(DefaultPublicationActionService.java:124) [datahub-service-6.2.0.1-RC1.jar:6.2.0.1-RC1]
    at java.util.ArrayList.forEach(ArrayList.java:1249) ~[na:1.8.0_144]
    at com.hybris.datahub.service.impl.DefaultPublicationActionService.filterUnavailableTargetSystems(DefaultPublicationActionService.java:123) [datahub-service-6.2.0.1-RC1.jar:6.2.0.1-RC1]
    at com.hybris.datahub.service.impl.PublicationActionHandler.handlePublicationAction(PublicationActionHandler.java:108) ~[datahub-service-6.2.0.1-RC1.jar:6.2.0.1-RC1]
    at com.hybris.datahub.service.impl.PublicationActionHandler.handleAction(PublicationActionHandler.java:95) ~[datahub-service-6.2.0.1-RC1.jar:6.2.0.1-RC1]
    at com.hybris.datahub.service.impl.PublicationActionHandler.handleAction(PublicationActionHandler.java:70) ~[datahub-service-6.2.0.1-RC1.jar:6.2.0.1-RC1]
    at com.hybris.datahub.command.impl.AbstractPerformCommand.lambda$execute$153(AbstractPerformCommand.java:57) ~[datahub-service-6.2.0.1-RC1.jar:6.2.0.1-RC1]
    at java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1626) ~[na:1.8.0_144]
    at com.hybris.datahub.service.ExceptionHandlingAsyncTaskExecutor$2.run(ExceptionHandlingAsyncTaskExecutor.java:80) ~[datahub-service-6.2.0.1-RC1.jar:6.2.0.1-RC1]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[na:1.8.0_144]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[na:1.8.0_144]
    at java.lang.Thread.run(Thread.java:748) ~[na:1.8.0_144]
    2017-08-27 22:38:10,933 [DEBUG] [c.h.d.s.i.DefaultEventPublicationService] Publishing data hub event : TargetSystemPublicationCompletedEvent{publicationId=2}
    2017-08-27 22:38:10,965 [DEBUG] [c.h.d.s.p.i.DefaultDataHubPublicationService] setting publication status to FAILURE
    2017-08-27 22:38:10,981 [DEBUG] [c.h.d.p.i.DefaultProcessMonitor] Unregistered Publication #2, 0 processes running
    2017-08-27 22:38:10,981 [DEBUG] [c.h.d.s.i.DefaultEventPublicationService] Publishing data hub event : PublicationCompletedEvent{actionId=2}

    Thanks,

    Vishal Patil

