Applies to:

SAP SCM 7.2 Support Pack 8 and HANA 1.00.74.390550 (NewDB100_REL)

Summary

This document describes how to configure the replication of planning data (transactional data) from SAP APO liveCache to an SAP HANA system through a secondary database connection, and how to download and install the HANA content model for business analysis.

Master data is replicated using SAP Landscape Transformation (SLT), but that is not possible for planning data, which is not stored in database tables but in SAP liveCache.

Instead, we run replication reports, or schedule delta replication, to push the Demand Planning (DP) and Supply Network Planning (SNP) planning data to the HANA system, where it is available for business analysis.

Author:          Santanu Bose

Company:      Wipro Ltd.

Created on:    26 July 2014


Table of Contents

Step 1: Install HANA DB Client in APO system

Step 2: Set up Secondary DB Connection with HANA

Step 3: Set up a replication model and activate it

Step 4: Run Full Replication

Step 5: Run Delta Replication and Scheduling

Step 6: HANA Content Model installation

Step 7: Run Business Analysis on data in HANA

Reference

Disclaimer and Liability Notice

Step 1: Install HANA DB Client in APO system

Install the SAP HANA database client on the SCM (APO) system.

Step 2: Set up Secondary DB Connection with HANA

Create a secondary DB connection (transaction /nDBCO).

[Screenshot: DBCO secondary database connection]
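For reference, the DBCO entry for the HANA connection typically looks like the sketch below; all names, hosts, and ports here are illustrative placeholders, not values from the system shown above:

```text
DB connection : APO_HDB          (logical name of the secondary connection)
DBMS          : HDB              (database system type: SAP HANA)
User name     : APO_HDB          (HANA user; see the comments below on matching the SLT schema user)
DB password   : ********
Conn. info    : hanahost:30015   (<host>:<SQL port of the HANA indexserver>)
Permanent     : X
```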

Step 3: Set up a replication model and activate it.

Create a replication model (/SAPAPO/REPL_MOD_VIE).

[Screenshot: Replication model maintenance]

Define the Key Figures:

[Screenshot: Key figure definition]

Activate the replication model (/n/SAPAPO/REPL_MOD_MNG).

[Screenshot: Replication model activation]

Upon activation, the corresponding tables are created in the HANA database.

[Screenshot: Generated table structures in the HANA database]
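To verify, you can check for the generated tables from the SQL console in HANA Studio; the schema name APO_HDB below is an assumed placeholder for your replication schema:

```sql
-- List the tables created by the replication model activation
-- (replace APO_HDB with your actual replication schema name)
SELECT TABLE_NAME, RECORD_COUNT
  FROM SYS.M_TABLES
 WHERE SCHEMA_NAME = 'APO_HDB'
 ORDER BY TABLE_NAME;
```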

Step 4: Run Full Replication.

Run the full replication of planning data to SAP HANA (/SAPAPO/REPL).

[Screenshot: /SAPAPO/REPL full replication]

Step 5: Run Delta Replication and Scheduling

Schedule delta replication (/SAPAPO/REPL_DELTA).

[Screenshot: /SAPAPO/REPL_DELTA delta replication]

[Screenshot: Scheduled delta replication job]

The replicated data should now be available in the HANA database tables.

[Screenshot: Replicated planning data in HANA]

Step 6: HANA Content Model installation

Download the HANA content model (/SAPAPO/REPL_REPLICATION_MODE).

[Screenshot: HANA content model download]

Install the HANA content model using the import function in SAP HANA Studio.

[Screenshot: Import of the content model in HANA Studio]

Step 7: Run Business Analysis on data in HANA

Run analysis on the data using the calculation view that was installed.

[Screenshot: Analysis on the calculation view]

REFERENCE

http://help.sap.com/saphelp_apo700_ehp03_on_erp/helpdata/en/94/ae821e58fa4dbdac0c71028b943696/content.htm?frameset=/en/bb/b2d7873071401e902a21e1e64a055b/frameset.htm&current_toc=/en/95/2954f96ba8403f97aaaf01954ff10d/plain.htm&node_id=5&show_children=false

http://help.sap.com/saphelp_apo700_ehp03_on_erp/helpdata/en/bb/b2d7873071401e902a21e1e64a055b/content.htm?frameset=/en/15/533802a43249208cc6e3348240ee76/frameset.htm&current_toc=/en/95/2954f96ba8403f97aaaf01954ff10d/plain.htm&node_id=6

Disclaimer and Liability Notice

This document may discuss sample coding or other information that does not include SAP official interfaces and therefore is not supported by SAP. Changes made based on this information are not supported and can be overwritten during an upgrade.

SAP will not be held liable for any damages caused by using or misusing the information, code or methods suggested in this document, and anyone using these methods does so at his/her own risk.

SAP offers no guarantees and assumes no responsibility or liability of any type with respect to the content of this technical article or code sample, including any liability resulting from incompatibility between the content within this document and the materials and services offered by SAP. You agree that you will not hold, or seek to hold, SAP responsible or liable with respect to the content of this document.


19 Comments


  1. Justin Molenaur

    Hi Santanu – great writeup of the work we had done. I’m glad I could help you resolve some of these hurdles and get it up and running.

    Happy HANA,

    Justin

  2. Justin Molenaur

    Also to add on here, it’s important that your secondary database connection has the same name as the SLT connection to HANA. The reason is that the target schema for SLT will be the same as the APO's, i.e. the liveCache data will be dropped into tables within the same SLT replication schema.

    In addition, as a prerequisite there are a ton of OSS notes to consider applying in SCM depending on what version you are on. These will affect both the replication functions as well as the Supply Chain InfoCenter if that is part of your solution.

    1784479 – OSS notes relevant if you are on SCM SP06

    1863896 – OSS notes relevant if you are on SCM SP08

    1904500 – OSS notes relevant if you are on SCM SP08

    Happy HANA,

    Justin

    1. SANTANU BOSE Post author

      Thanks Justin for sharing your thoughts and your knowledge of the OSS messages.

      But can we get some more clarity on using the same name for the secondary database connection in SCM and SLT to HANA? I think the user name parameter of the database connection is the driving factor for connecting to the target schema in HANA, and yes, that has to remain the same in both SLT and APO to get the master and transaction (planning) data under the same SLT replication schema.

      Thanks,

      Santanu

    2. Sohil Shah

      Thanks Santanu & Justin for helpful information.

      @Justin: Also to add on here, it’s important that your secondary database connection has the same name as the SLT connection to HANA. The reason being is that the target schema for SLT will be the same as the APO, ie the livecache data will be dropped into tables within the same SLT replication schema.

      How is this possible, could you please elaborate? I mean, we will create the SLT schema and it gets created in HANA Studio. But what would be the password for that schema user, as I believe we never log in with the schema user ID? Please help identify the password for the SLT schema user.

      1. Justin Molenaur

        Hey Sohil, the actual user that is used in the secondary DB connection in APO needs to be the same user as the HANA schema owned by SLT.

        You’re right, by default you won’t get the password for the SLT user defined from the initial SLT configuration, a few extra steps are required. In the example above the dbco doesn’t show a realistic user.

        1) Set up SLT connection from APO->HANA, you should now have a schema like “APO_HDB” created in HANA and also the same named user created.

        2) Ensure that the SLT master job is stopped

        3) As the SYSTEM user or a similarly provisioned security admin, change the password for user APO_HDB in HANA.

        4) In SLT, in dbco, update the credentials for the HANA connection with the new password.

        5) For the dbco configuration step in APO (as laid out by Santanu), use the same APO_HDB user to connect.

        6) In SLT, restart the master job and the new credentials should work.

        7) APO replication config can continue

        In this way, the liveCache replication will drop the data into the same schema that SLT uses for replicating actual tables from APO. This is not super clear in the reference material, it’s only lightly referred to, but the above steps should make it clearer.
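        Step 3 above can, for example, be done from the SQL console in HANA Studio (using the illustrative APO_HDB user name from step 1; the password is a placeholder):

```sql
-- Reset the password of the SLT schema user so it can be reused
-- in the APO secondary DB connection (DBCO) entry
ALTER USER APO_HDB PASSWORD "NewSecret123";
```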

        Happy HANA,

        Justin

        1. Sohil Shah

          Thanks Justin. This is indeed very helpful insights.

          Well, I have a quick question: this liveCache method takes care of ONLY time-series planning data, but is there any way we can load order-based planning data into HANA? Just curious if there is any way we can do so.

          Once again much appreciated !

        2. Purav Mehta

          I used the SYSTEM user, but I am getting an error:

          Could not open connection <CONNECTIONAME>.

          DBI error 16 occurred.

          Please note, I have used the same connection name as the APO server connection created in SLT.

  3. Kamal Mehta

    Hi Shantanu,

    Thanks.

    Can you please throw some light on how to transfer this replication model configuration from Dev to other environments such as Quality and Production?

    This would help a lot.

    Regards

    Kamal

    1. Matthew Bruckner

      Kamal,

      I created an OSS message to ask SAP this very question.  According to SAP OSS the replication model needs to be created locally.

      Thanks,

      Matt

        1. Justin Molenaur

          Yes – this is an unfortunate fact – none of the replication models are transportable. I have taken 3-4 models to production at my current client and in each system it must be created manually. This means you must take care in the model creation or risk missing a measure.

          Regards,

          Justin

  4. Matthew Bruckner

    Hi Shantanu,

    I have followed your instructions, and I am sending data out to HANA from liveCache. One gap that we find is that we cannot see the initial bucket in the HANA DB. Is it possible to send this to the HANA DB?

    Thanks,

    Matt

    1. Justin Molenaur

      You don’t have any control on the time buckets that are replicated. This is all going to be set in APO based on the time horizons for the specific planning data view (I assume). In the 4 models I have worked on (3 SNP), there was never an initial bucket available in any of them.

      I’m not sure how an INITIAL bucket would be replicated, since the C_MONTH is part of the characteristics “key”.

      It may be worth opening an OSS message on this. There are only a handful of processors that specialize on this solution and they are usually quite fast and knowledgeable.

      Regards,

      Justin

      1. SANTANU BOSE Post author

        Justin –

        Thanks much for your reply really appreciate it.

        I have not had access to SAP source systems for the last year, so I could not check on this 🙁 as I have been working with Oracle ERP as the source.

        Thanks,

        Santanu

  5. Business Warehouse Global IM

    Hi Santanu,

    Does this model support real-time reporting from APO liveCache? We have a requirement for reporting on APO liveCache data and would like to know if that is possible.

    Thanks,

    Balaji Venugopal

    1. Justin Molenaur

      The best you can get is near real time by setting the replication job frequency in APO appropriately. I don’t see a reason you couldn’t set this to run every 1 minute if you really want to.

      Regards,

      Justin

