With more than a year since SAP Profitability and Performance Management Cloud (SAP PaPM Cloud) was released to the market, customers are now starting to configure, model and import more extensive and demanding content. These environments use input data that ranges from a few hundred records up to millions. As a Modeler, you might want to know the most efficient way to upload a dataset so that you can focus your energy on actual modeling activities.


To keep it simple, I'll use record-count ranges to differentiate small and large input files. Under these two sections you'll find step-by-step procedures on how to make this data available in your Modeling environment in SAP PaPM Cloud, in what I believe is the most efficient way. So, let's begin!


Update as of 12/12/2022

The data upload functionality has been improved in SAP PaPM Cloud! The headers and text in this blog post have been updated accordingly; the old images can still serve as a procedure reference.

I. Small [Input File < 20,000 data records]


As a Modeler, you would like to upload fewer than 20,000 data records into your Modeling environment. The procedure below shows you how to make use of the Upload feature of a Model Table.




  1. In the Modeling environment, create a Model Table, maintain its properties and declare the necessary fields. Then choose Save.
    Or you can use an existing Model Table and proceed to the next step.
    *Depending on your current screen resolution, images used in this blog post might appear blurry; simply click on an image to see it at its actual resolution.*

    NOTE: If you have an XLS* file that consists of only around 1,000 records or less, a simple Copy [CTRL+C] from the source file and Paste [CTRL+V] into the data table can be performed as an alternative to the succeeding steps. The Data Editor will only appear if the Transport Data setting is checked.

  2. Under the General Property section of the Model Table, uncheck the Transport Data option.
    The Data Editor will be disabled and the Upload icon appears. Then, Save your changes.

  3. Click on the Upload icon and select the file to be uploaded from your local machine.

  4. In the Data Editor, a Data Upload prompt appears. If the upload file's first row is a header (containing the field names), check the Skip header row setting; otherwise, uncheck it. Then, choose Add. (A sample file layout is sketched at the end of this section.)

  5. A notification Upload completed will appear, and the number of data rows is shown in the Data Editor section of the Properties panel.

  6. Under the Tools section of the header, choose the Show icon to view the Model Table records.
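
For reference, here is a minimal sketch of what an upload file could look like; the field names and values below are purely illustrative and not taken from any real dataset. With a header row like the first line, check the Skip header row setting in Step 4; if the file starts directly with data rows, leave it unchecked.

    COMPANY_CODE,PRODUCT,AMOUNT
    1000,PRD_A,2500.00
    1000,PRD_B,1320.50
    2000,PRD_A,980.75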


II. Large [Input File > 20,000 data records]


As a Modeler, you would like to upload more than 20,000 data records and utilize them in your Modeling environment without introducing new fields. Since SAP PaPM Cloud is not meant to be an Extract, Transform and Load (ETL) tool, the most efficient way of uploading a large number of data records is through the tenant's underlying SAP HANA Cloud database. The following procedure will show you how to upload an input file using SAP HANA Database Explorer.




  1. From any of the SAP PaPM Cloud screens, in the upper left corner, expand the Menu > Administration > Settings

  2. Under Database Settings section, click on the HANA Dashboard URL link.

  3. In the SAP Business Technology Platform login prompt, you may opt to use the Default Identity Provider or sign in using Alternative Identity Provider credentials.

  4. In the Database Overview screen's Credentials dialog box, log in using the user SAP_PAPM_ADMIN and the password provided under Database Settings from Step 2.
    Then choose SQL Console.

  5. In SAP HANA Database Explorer, click on the Database entry dropdown > Catalog > Tables.
    View the available Schemas, then search for SAP_PAPM and select it. This will expose all Tables under the SAP_PAPM schema, which is the default schema of SAP PaPM Cloud, meaning all tables generated by the application are stored here.

    USEFUL TIP: In your web browser toolbar, you might want to bookmark the current URL, as it points directly to the database of your SAP PaPM Cloud tenant. Having a direct link will save a few steps when accessing the DB the next time around. (A SQL Console alternative to browsing the catalog is sketched right after this step.)
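
    The following query is my own convenience sketch, assuming the standard SYS.M_TABLES monitoring view is accessible to your user; it lists the generated Y-Tables in the default SAP_PAPM schema together with their record counts:

    -- convenience sketch, not a required step: list generated Y-Tables and their record counts
    SELECT TABLE_NAME, RECORD_COUNT
      FROM "SYS"."M_TABLES"
     WHERE SCHEMA_NAME = 'SAP_PAPM'
       AND TABLE_NAME LIKE 'Y105%'
     ORDER BY TABLE_NAME;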

  6.1. The goal of this step is to copy the CREATE statement of an auto-generated Y-Table from the application and use it to create another HANA table that will be used for uploading.
    NOTE: This step is optional; if you already have a HANA table with the desired fields, you may proceed to the next step.

    6.2. Using the Model Table from the previous example of small data records, use the search bar to find its Y-Table using the following naming convention:
    <Y105><ENVID><VERSION><FUNCTION_ID>


    6.3. Right-click on the Table > choose Generate CREATE Statement.

    6.4. Focus on the statement in the SQL Console. Change the schema from SAP_PAPM to SAP_PAPM_ADMIN (or to another existing schema in the database that you prefer).
    Optionally, you can change the HANA table name from "Y105XXXXXXXX" to a name that you desire.
    6.5. In the SQL statement, remove ALL "FS_PER*" columns.


    6.6. Remove the Primary Key statement.


    6.7. Execute the statement. Upon doing so, notice that a new table is created in the left pane under the SAP_PAPM_ADMIN schema. (A sketch of how the adjusted statement could look follows this step.)
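
    To illustrate steps 6.4 to 6.6, here is a minimal sketch of how the adjusted CREATE statement could end up looking. The table name UPLOAD_INPUT_DATA and its columns are hypothetical placeholders; your generated statement will contain the fields declared in your own Model Table.

    -- hypothetical example; replace the table name and columns with your own
    CREATE COLUMN TABLE "SAP_PAPM_ADMIN"."UPLOAD_INPUT_DATA" (
        -- schema changed from SAP_PAPM to SAP_PAPM_ADMIN (step 6.4)
        "COMPANY_CODE" NVARCHAR(10),
        "PRODUCT"      NVARCHAR(20),
        "AMOUNT"       DECIMAL(23, 7)
        -- all "FS_PER*" columns removed (step 6.5)
        -- PRIMARY KEY clause removed (step 6.6)
    );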




  7.1. Right-click the HANA table created earlier (which will be used for uploading the data records), then choose Import Data.

    7.2. In the Import Data wizard, choose Import Data, then proceed to Step 2.
    7.3. In the Import Source, choose the relevant settings depending on the input file, then choose Browse. Proceed to the succeeding Steps 3 and 4.
    7.4. In the Table Mapping, make adjustments as necessary, then proceed to Step 5.
    7.5. Choose the desired Error Handling option, then proceed to Review.
    In the Import Summary, double-check the details, then choose Import to Database.
    7.6. In just a few seconds, the 10,000 data records in this example were imported successfully. Note that 10,000 is not the maximum number of records that can be uploaded; it was just used as an example.
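
    As an optional check of my own (not part of the Import Data wizard), you can run a quick count in the SQL Console to confirm that the expected number of records landed in the table; the schema and table name below are the hypothetical ones from the earlier sketch, so adjust them to your own:

    -- optional verification; adjust schema and table name to the table you imported into
    SELECT COUNT(*) AS "RECORD_COUNT"
      FROM "SAP_PAPM_ADMIN"."UPLOAD_INPUT_DATA";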



  8. Still in the SAP HANA Database Explorer, execute the following statement:
    GRANT ALL PRIVILEGES ON SCHEMA "SAP_PAPM_ADMIN" TO SAP_PAPM;

    NOTE: Replace the SAP_PAPM_ADMIN schema in the statement if the HANA table was created in a different schema.
    Once this statement has executed successfully, the default schema SAP_PAPM will have the privileges to read from and write to artifacts inside the SAP_PAPM_ADMIN schema, including the previously created HANA table.



  9. Return to the SAP PaPM Cloud application and choose Menu > Administration > Connections.

  10. In the Connection Management screen, create a new connection using the Add icon (+). Maintain the connection properties and choose Confirm.

  11. Back in the Modeling environment, create a Model Table HANA by dragging it from the Palette onto the Modeling diagram.
    Then, maintain the Model Table HANA's Function ID and description as desired, and maintain the Connection Name.
    Then choose the pencil icon to enter the field mapping screen.

  12. In the Environment Fields section of the Field Mapping screen, map the fields to the existing environment fields. Then, disable the full screen.

  13. In the Modeling screen, choose Save, then Activate, then Show.
    NOTE: If the input file contains NULL records after uploading it as a HANA table and this is causing you trouble in your Model, I've published a blog post that might be helpful in such cases: https://blogs.sap.com/2022/06/07/how-to-deal-with-imported-input-datas-null-values-and-consume-it-in...


I really appreciate that you've reached this part of my blog post; I tried my best to ensure that the procedures are concise and can easily be followed with the help of the images as references.


To summarize:




  • Datasets under 1,000 records >> Copy + Paste

  • Datasets under 20,000 records >> SAP PaPM Cloud Model Table Upload functionality

  • Datasets above 20,000 records >> SAP HANA Database Explorer Import Data functionality


With the guidance of the methods shown above, I trust that you were able to import the required input data, regardless of its size, into your modeling environment and can now proceed with further modeling activities in SAP Profitability and Performance Management Cloud.


If you found this blog post helpful, a like and a quick share with your colleagues would be awesome! Feel free to comment with your questions, or with suggestions for a future blog post topic that would benefit the community.


Cheers!
