
Summary:

This document provides an overview of pushing BW data to a third-party system (Informatica). In particular, it highlights the major challenges an organization faces with such an integration and how to overcome them, how to improve performance, and how to handle the issues that arise while connecting to a non-SAP system.

Present SAP BW Workflow:

We have a few standard and customized DataSources, with DSOs in the EDW layer and Propagation-layer DSOs on top of them. On top of the Propagation-layer DSOs we have built Open Hubs with “Third-Party Tool” as the target system, and their RFC destinations point to INFA (Informatica).

The INFA team converts the data and sends it on to OOBI, which is used for business reporting.

First, we create an RFC destination between BW and Informatica. Once the RFC destination is built, we can start building Open Hubs with “Third-Party Tool” as the target type, and we should maintain the same RFC destination name in all the Open Hubs that need to push data to the INFA environment.

Example: RFC destination name: ZBIBD1

[Screenshot: RFC destination ZBIBD1]

Once the RFC destination is entered, click on the Parameters tab and maintain the WORKFLOW and FOLDERNAME details, which the INFA team will provide.

Workflow: the table name that the INFA team will have created.

The workflow is maintained in a certain folder at INFA, so we need to get those details from the INFA team.

This completes the initial setup steps on the BW side.

[Screenshot: Open Hub destination with RFC destination, workflow, and folder name maintained]

We have created such Open Hubs on top of a Standard DSO and on top of InfoCubes:

Standard DSO:

Data flow: DataSource –> DSO1 –> DSO2 –> Open Hub

On top of DSO2 we have created the Open Hub.

Prerequisites to be handled while pushing data from a DSO to INFA via an Open Hub:

·    –>All the key fields of the DSO should also be made primary keys at the INFA end, because the uniqueness of the pushed data is determined by the keys.

·    –>Whenever there is a change in the structure of the DSO, we should make the INFA team aware of the changes done on the BW side, and the same changes should be made at the INFA end so that the load does not fail.

·    –>We should handle the updated data records (New, Before Image, and After Image) because we are dealing with a Standard DSO. In some projects this is handled at the INFA end, but in our case it was handled at the BW end by filtering out the before-image (“X”) values at the DTP level, as shown in the screenshot and the sketch below.

[Screenshot: DTP filter on the record mode field]
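For reference, a filter routine on the record-mode field of the DTP might look like the minimal sketch below. It follows the standard generated routine template; the field name RECORDMODE and the l_t_range/p_subrc names from the routine frame are assumptions, so adapt them to your DSO and DTP.

    * DTP filter routine: exclude before-image records (record mode 'X').
    * Assumption: the record-mode field of the DSO is named RECORDMODE;
    * l_t_range and p_subrc come from the generated routine frame.
    data: l_idx like sy-tabix.

    read table l_t_range with key fieldname = 'RECORDMODE'.
    l_idx = sy-tabix.

    clear l_t_range.
    l_t_range-fieldname = 'RECORDMODE'.
    l_t_range-sign      = 'E'.        " E = exclude
    l_t_range-option    = 'EQ'.
    l_t_range-low       = 'X'.        " X = before image

    if l_idx <> 0.
      modify l_t_range index l_idx.
    else.
      append l_t_range.
    endif.
    p_subrc = 0.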

·    –>If there is a delete-and-full-load from the BW end with newly updated data, we should keep the INFA team informed so that they can truncate the tables from their end if required.

·    –>Make sure the BW changes are transported from DEV to QUA (or PROD) first, then the INFA changes are moved from DEV to QUA, and only then trigger the load in the QUA system.

InfoCube:

Data flow: DataSource –> DSO1 –> InfoCube –> Open Hub

·    –>We have created the Open Hub on InfoCubes because the underlying data flow is a delete-and-reload of the data. If we used a Standard DSO, the activation time of the table would cause delays.

·    –>All the characteristics of the cube should be set as primary keys at the INFA end, because, unlike a Standard DSO, the cube does not have an active table with defined key fields.

·    –>The data load is a daily delete-and-reload from the BW end, so at the INFA end new records are inserted or updated and old records are rejected.

·    –>If there is a delete-and-full-load from the BW end with newly updated data, keep the INFA team informed so that they can truncate the tables from their end if required (same as for the DSO).

Error Handling:

·    –>Once the data load is triggered from BW, in a few cases it might fail due to connection issues or data issues. In that case we always need to delete the failed request with the program RSBK_DEL_DTP_REQ_FROM_OHD.

Go to transaction SE38, enter the above program name, and execute.

[Screenshot: selection screen of RSBK_DEL_DTP_REQ_FROM_OHD]

Provide the Open Hub destination name and the failed request ID from the DTP, make sure the TEST RUN checkbox is unchecked, and execute; the failed request will be deleted. A scripted variant is sketched below.
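If the deletion needs to be scripted rather than run manually in SE38, the call could look roughly like the sketch below. The selection-screen parameter names (p_dest, p_requid, p_test), the destination name, and the request ID are assumptions for illustration; check the actual names on the program’s selection screen before using this.

    * Hedged sketch: delete a failed Open Hub request programmatically.
    * p_dest / p_requid / p_test are ASSUMED selection-screen names;
    * verify them in SE38 for RSBK_DEL_DTP_REQ_FROM_OHD on your release.
    SUBMIT rsbk_del_dtp_req_from_ohd
      WITH p_dest   = 'ZOH_DEST1'   " hypothetical Open Hub destination
      WITH p_requid = '123456'      " failed request ID from the DTP
      WITH p_test   = ' '           " blank = TEST RUN unchecked (real run)
      AND RETURN.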

·    –>A few data loads might take much longer, with the request stuck in “YELLOW” status without any data update. At this point we should not start a new load; first we should set the running load’s status to RED, and only then start the new load.

To change the status to RED we can use the function module RSB_API_OHS_REQUEST_SETSTATUS.

Open SE37, enter the function module name, and execute:

[Screenshot: test screen of RSB_API_OHS_REQUEST_SETSTATUS in SE37]

Provide the request ID of the yellow request from the DTP, set the status to RED (“R”), and enter any message you wish.

Now execute; the request will turn RED, and you can then use the SE38 program mentioned above to delete it. Once this is done, we can go ahead with the next data load. A hedged call sketch follows below.
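Wrapped in a small program, the status change could look like this sketch. The parameter names follow what the SE37 test screen described above displays (request ID, status, message), but they are assumptions; verify them in SE37 on your release. The request ID value and its type are placeholders.

    * Hedged sketch: set a hanging (yellow) Open Hub request to red.
    * i_requid / i_status / i_message are ASSUMED parameter names taken
    * from the SE37 test screen; confirm them in SE37 before use.
    DATA lv_requid TYPE c LENGTH 30.      " request ID; exact DDIC type: assumption
    lv_requid = '123456'.                 " yellow request ID from the DTP

    CALL FUNCTION 'RSB_API_OHS_REQUEST_SETSTATUS'
      EXPORTING
        i_requid  = lv_requid
        i_status  = 'R'                   " R = red
        i_message = 'Set to red manually before deletion'.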

       

Process chain:

When including this Open Hub in a process chain, we should make sure to maintain the “SEND STATUS” customized program, which includes the RFC destination check, the PARAMETERS call, and the SEND STATUS (customized function module); a skeleton is sketched after the screenshot below:

  1. RFC destination check –> RFC_SYSTEM_INFO (function module)
  2. PARAMETERS –> RSB_API_OHS_DEST_SETPARAMS (function module)

[Screenshot: custom send-status step in the process chain]
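A skeleton of such a customized send-status program might look like the sketch below. The RFC_SYSTEM_INFO connectivity check is a standard idiom; the call to RSB_API_OHS_DEST_SETPARAMS is deliberately left as a comment because its exact interface should be looked up in SE37 before use. The program name is hypothetical.

    REPORT zbw_oh_send_status.            " hypothetical program name

    * Step 1: verify the RFC destination to Informatica is reachable.
    DATA lv_msg TYPE c LENGTH 80.

    CALL FUNCTION 'RFC_SYSTEM_INFO'
      DESTINATION 'ZBIBD1'                " destination from this document
      EXCEPTIONS
        communication_failure = 1 MESSAGE lv_msg
        system_failure        = 2 MESSAGE lv_msg
        OTHERS                = 3.

    IF sy-subrc <> 0.
      WRITE: / 'RFC destination check failed:', lv_msg.
      RETURN.
    ENDIF.

    * Step 2: maintain the workflow/folder parameters for the destination.
    * RSB_API_OHS_DEST_SETPARAMS is the FM named in this post; its
    * interface is not shown here because the parameter names should be
    * taken from SE37 on your release before calling it.
    WRITE: / 'Destination reachable; call RSB_API_OHS_DEST_SETPARAMS next.'.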

POLLING FLAG:

With this indicator you can control the behavior of the main process when you have distributed processes. Distributed processes, such as the loading process, are characterized by having different work processes involved in specific tasks. With the polling flag you determine whether the main process needs to be kept alive until the actual distributed process has ended. Selecting the indicator guarantees a high level of process security and allows external scheduling tools to be provided with the status of the distributed processes.

[Screenshot: polling flag setting]

Hope this document helps; please provide your feedback.

Thanks for reading


11 Comments


  1. Kanungo Gaurav

    Hi Rohith,

    Very helpful document, appreciated. I have a question here. You are using the “RSB_API_OHS_REQUEST_SETSTATUS” function module and the RSBK_DEL_DTP_REQ_FROM_OHD program to handle the error requests and delete them. Is this a standard SAP methodology? We are following the steps below (mentioned on some other blog), which are in sync with yours. But is this a workaround or just the way SAP wants us to handle this?

    1) Check table RSBREQUID3RD; there should be an entry with the OH destination and the request number.

    If there is none (for example, when the process got stuck somehow), create an entry; if there is one, change the 3rd-party status to “G”.

    2) Check table RSBKREQUEST; there should be an entry for the request number. Make sure the technical request status TSTATE is equal to 2 (processed successfully).

    3) Run function module RSB_API_OHS_REQUEST_SETSTATUS, giving as parameters the request number and STATUS = “R”.

    4) Run program RSBK_DEL_DTP_REQ_FROM_OHD, giving as parameters the OH destination and the request number; uncheck the test run. This program will backtrack and delete the indicated request and all subsequent requests in that OH destination.

    5) Verify that the data request (and the subsequent ones) were deleted from the OH destination by displaying the contents of table /BIC/OH<your OH destination name here>.

    1. Rohith D Post author

      Hi Gaurav,

      Using the FM and program (“RSB_API_OHS_REQUEST_SETSTATUS” and RSBK_DEL_DTP_REQ_FROM_OHD) is always the better and safer way to delete a request or change its status, because changing the status at the table level is not a best practice, and some projects might have limited access to do so in the PROD landscape.

      To be on the safer side, it is better to use the FM and program.

      Thanks,

      Rohith

        1. Rohith D Post author

          Hi Gaurav,

          This is how SAP suggests handling it; that is why this FM and program were created and are used in many projects.

          I believe this is not a workaround; it is the right way to deal with this in SAP, and I don’t think it is a product bug.

          Thanks,

          Rohith

    1. Rohith D Post author

      Yes Janaina, we can load data from Informatica to BW.

      It can be done by creating an RFC connection between the systems.

      Create a DataSource in BW with the required fields from the INFA table structure.

      DataSource — Extraction tab:

      Adapter — the table from which you need the data.

      DB connection type — the Basis team can provide this.

      DB user — the Basis team can provide this.

      Table/View — the required table name or workflow name.

      Thanks

      Rohith

  2. Anders Kortbæk

    Anyone with experience regarding performance?

    Currently our throughput is about 135 rows/sec when sending data to Informatica.

    This is very low. Any idea on how to increase this?

    Thanks, Anders

