Hi all,

Hybris Marketing, previously known as SAP Customer Engagement Intelligence (SAP CEI), has become a game changer, enabling individualized, contextual marketing at unlimited scale. The solution draws on a deep pool of customer data to identify each customer's unique intent, including past purchases, propensity scores, and implicit buying signals observed during recent or live browse sessions. Based on this real-time customer context, Hybris Marketing delivers unparalleled customer engagement across channels to drive growth and customer loyalty.

What is Hybris Marketing Integration?

Unlike systems such as ECC, BW, or CRM, Hybris Marketing does not have any data of its own. In any Hybris Marketing implementation, there is therefore a need to replicate or migrate data from a source system, which could be an SAP or non-SAP system such as CRM or ECC (on premise or cloud), or data from customer websites. This process of replicating or migrating data into the target Hybris Marketing system is called "integration".

There are various techniques currently available for integration in Hybris Marketing as shown below:

PS: For the latest information on these integration techniques, refer to this link.

Recently we worked on a Hybris Marketing implementation for a client in the travel industry, where we used SAP Data Services for integration. Data Services is an ETL (Extract, Transform, Load) tool that comes free with the Hybris Marketing license, so the customer wanted to use it to save cost. Also, the requirements called for a lot of transformation and massaging of the source data, for which Data Services was a great fit.

Hybris Marketing Integration via Data Services

Data Services is an ETL tool capable of extracting data from various SAP and non-SAP sources, transforming it, and loading it into a target Hybris Marketing system.

After some analysis and brainstorming, we found that there is an RDS (Rapid Deployment Solution) named "Hybris Marketing Data Load RDS", which SAP recommends for such a scenario.

Hybris Marketing Data Load Best Practices

This RDS contains Data Services jobs for loading interaction data (the source being interaction header data in flat files, the target being the Hybris Marketing system). For those who are not familiar with Data Services jobs: a job is an executable object; whatever logic we build in Data Services has to be part of a job to make it executable.

This RDS offers three Data Services jobs, which read data from CSV or Excel files and load it into the Hybris Marketing system via OData or a function module (FM).
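As a rough illustration of what the extract step in such a job does, here is a minimal Python sketch that reads interaction header records from a CSV source and shapes them into a payload for an OData POST. The column and property names are hypothetical placeholders, not the actual RDS field names.

```python
import csv
import io
import json

def csv_to_interaction_payload(csv_text):
    """Convert interaction-header rows from a CSV into a list of dicts
    suitable for posting to an OData service (field names hypothetical)."""
    rows = csv.DictReader(io.StringIO(csv_text))
    return [
        {
            "InteractionContactId": r["CONTACT_ID"],
            "InteractionType": r["IA_TYPE"],
            "Timestamp": r["TIMESTAMP"],
        }
        for r in rows
    ]

sample = """CONTACT_ID,IA_TYPE,TIMESTAMP
C001,WEB_VISIT,20170101120000
C002,EMAIL_OPEN,20170102093000
"""

payload = csv_to_interaction_payload(sample)
print(json.dumps(payload, indent=2))
# The job would then POST this payload to the Hybris Marketing OData
# service with an HTTP client.
```

The real RDS jobs do this with Data Services transforms rather than hand-written code; the sketch only shows the shape of the data involved.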

In our project, the requirement was quite similar: we had Hybris Commerce data for products and interactions (available in flat files) as the source, and the Hybris Marketing system as the target.

We analyzed the best practices in detail and found that this RDS was a good starting point that could be customized further to our specific project requirements.

 

Which job to use as a reference?

The first question in our minds was whether to use a job with OData (the first two jobs in the screenshot above) or the job with an FM (the last job in the screenshot above) to load the data.

Jobs with OData depend on Python scripts, which means another skill set is required. Also, writing these Python scripts requires creating a user-defined transform in Data Services, which in turn requires an additional Data Quality license.

Jobs with an FM use the built-in GUI for making a remote function call in Data Services; no additional coding or skill set is required. Hence we decided to use the Data Services job "JOB_DM_Interactions_FM_SCI", which loads data via a remote function call.

In this part we will talk about this job only. Information about the other two jobs will be shared in Part 2 of our blog, which will be available soon.

 

Snapshot of the standard RDS job "JOB_DM_Interactions_FM_SCI":

This job uses the standard RFC "CUAN_INTERACTIONS_POST_FLAT".

Interaction header data (similar to the screenshot below) can be easily uploaded with this standard job using this standard RFC.
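To make the call pattern concrete, here is a hedged Python sketch of how a client (for example, one built on SAP's PyRFC library) might arrange header records into a table parameter for an interactions-posting RFC. The parameter and field names below are illustrative; check the actual signature of CUAN_INTERACTIONS_POST_FLAT in SE37 before relying on them.

```python
def build_rfc_tables(header_rows):
    """Arrange header records into the table parameter expected by an
    interactions-posting RFC. Parameter and field names here are
    illustrative, not the verified signature of the standard FM."""
    return {
        "IT_INTERACTION": [
            {
                "ID_ORIGIN": r["id_origin"],
                "ID": r["contact_id"],
                "INTERACTION_TYPE": r["ia_type"],
                "TIMESTAMP": r["timestamp"],
            }
            for r in header_rows
        ]
    }

headers = [
    {"id_origin": "COOKIE_ID", "contact_id": "C001",
     "ia_type": "WEB_VISIT", "timestamp": "20170101120000"},
]
params = build_rfc_tables(headers)
# With an RFC connection (e.g. PyRFC), the call would look like:
#   conn.call("CUAN_INTERACTIONS_POST_FLAT", **params)
```

In Data Services itself this mapping is done graphically in the FM call configuration, not in code.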

Now, what if we have item-level data to load as well? What if, along with the interaction header, we have item-level details such as room data, recommendation data, pax details, etc., available in multiple worksheets? (See the screenshot below for reference.)

To load such data, the standard job and standard RFC will not help. We need to customize both the Data Services job and the RFC.

Customization of Data Services Job and RFC:

We picked up the job "JOB_DM_Interactions_FM_SCI" and customized it to read source data from multiple worksheets and pass the data to a single RFC. We also added more functionality to the job, such as extensive transformation and massaging of the data, email notifications, exception handling, and the creation of an error file for bad records.
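The bad-record handling we added can be sketched in a few lines of Python: records failing basic validation are routed to an error list instead of the load. The required-field names are illustrative.

```python
REQUIRED_FIELDS = ("contact_id", "ia_type", "timestamp")  # illustrative

def split_good_and_bad(rows):
    """Route records missing required fields to an error list, mirroring
    the 'error file for bad records' step in the customized job."""
    good, bad = [], []
    for row in rows:
        if all(row.get(f) for f in REQUIRED_FIELDS):
            good.append(row)
        else:
            bad.append(row)
    return good, bad

rows = [
    {"contact_id": "C001", "ia_type": "WEB_VISIT", "timestamp": "20170101"},
    {"contact_id": "", "ia_type": "WEB_VISIT", "timestamp": "20170102"},
]
good, bad = split_good_and_bad(rows)
# In the actual job, 'bad' would be written out as the error file and an
# email notification sent; in Data Services this is built with validation
# transforms rather than code.
```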

 

Snapshot of the Data Services job we created for interaction data loads:

In the screenshot below, we are reading data from multiple worksheets (the first worksheet has the header data, followed by other sheets with item data such as Rooms, Recommendations, Pax, etc.).

The RFC is enhanced by adding a structure for each source file (shown below).

(For example, IT_ACCO_DATA will be mapped to the header data, IT_ROOM to the room data, etc., during the data load via the RFC.)

Our custom RFC "ZCUAN_INTERACTIONS_POST_DS" has structures for all header data as well as item data (shown below).

This is how the RFC call is configured in Data Services to perform the data loads:

Each source data file is mapped to the corresponding structure in the RFC.
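Conceptually, this mapping amounts to building one multi-table parameter set for the custom RFC, sketched below in Python. IT_ACCO_DATA and IT_ROOM are the table names mentioned above, while IT_PAX and all field names are hypothetical.

```python
def build_custom_rfc_params(header_rows, room_rows, pax_rows):
    """Map each source worksheet to its table parameter in the custom RFC.
    IT_ACCO_DATA and IT_ROOM come from the post; IT_PAX and the field
    names are illustrative stand-ins."""
    return {
        "IT_ACCO_DATA": header_rows,
        "IT_ROOM": room_rows,
        "IT_PAX": pax_rows,
    }

params = build_custom_rfc_params(
    header_rows=[{"INTERACTION_ID": "I001", "CONTACT_ID": "C001"}],
    room_rows=[{"INTERACTION_ID": "I001", "ROOM_TYPE": "DELUXE"}],
    pax_rows=[{"INTERACTION_ID": "I001", "PAX_COUNT": "2"}],
)
# A single call then posts header and item data together, e.g.:
#   conn.call("ZCUAN_INTERACTIONS_POST_DS", **params)
```

The key design point is that header and item rows share a common interaction key, so one RFC call can post the whole interaction consistently.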

After all this, we were able to load the entire interaction data set (header plus item-level data) successfully into the Hybris Marketing system.

Happy Reading …..


5 Comments


  1. Margoth Del Rosario

    Hello,

    Sorry for my English.

    I am building a job using an RFC function from CRM that has table parameters.
    I've built the flow and it runs successfully, but it returns null in all the fields (it returns table parameters).

    Could you send me a complete tutorial on how to set up a DataFlow correctly with an RFC function with table parameters?

    That would be perfect for me. I think I'm missing a step or something.

    I hope you can help me.
    Thanks

    1. Rupesh Dahuja Post author

       

      Hello ,

      Thanks for your question !

      Did you perform a small test via SE37 to check what your FM returns for the same data you are loading via your job? (Maybe check for one or two records.) If you get the same behavior, then the issue is probably in the FM. Hope this helps!

       

      Do let us know.

      1. Margoth Del Rosario

        Hello, thanks for the answer. Yes, I tested with a few records and it worked. I had to take another approach, but I still have a doubt.

        I noticed something weird about the date format, because I have to send a date parameter. In SE37 the date is written as 2017.01.01, but in DS I'm sending 2017-01-01. I tried sending 2017.01.01 in DS but I got an error because it is not a valid date format. So, which is the correct way, or what do you advise?

        1. Rupesh Dahuja Post author

          Thanks for the follow up question !

          In your DS job, you can send the date as 20170101 (sending it as 2017-01-01 might be read as 10 characters) and it should be loaded into the target system successfully. When you view it in the target system, it will be shown as 2017.01.01.
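The conversion this reply describes (ISO date to the 8-character internal date format) can be sketched in Python, for anyone normalizing dates before the load:

```python
from datetime import datetime

def to_internal_date(date_str):
    """Normalize an ISO date (2017-01-01) to the 8-character internal
    date format (20170101) commonly expected by SAP function modules."""
    return datetime.strptime(date_str, "%Y-%m-%d").strftime("%Y%m%d")

print(to_internal_date("2017-01-01"))  # -> 20170101
```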

          Hope this helps.  Do let us know  !

           

          Regards

          Rupesh

          #Hybris Marketing

