
Business Planning and Consolidation version for Netweaver extensively uses process chains for running BPC processes. These process chains are automatically invoked by the BPC application when the BPC user executes processes from the front end. Should these process chains be executed exclusively within BPC, or should we be able to execute them outside BPC using native Netweaver BW, perhaps from within custom process chains that we create? Is there any need to do so? And finally, is there any way to do it? Let us try to answer these questions in this blog. Let us begin by seeing whether we have a business reason to run the BPC process chains outside the BPC application. In order to do that, we need to understand how the optimization process works in BPC version for Netweaver.

Optimizing the BPC data model:

A dimension in BPC is equivalent to a characteristic in Netweaver BW, and dimension members in BPC are equivalent to characteristic values in Netweaver BW. Taking this further, when a user creates a dimension in BPC version for Netweaver, a Netweaver BW characteristic is generated for it in the BPC namespace. When a user creates a dimension member for that dimension, a characteristic value is generated in Netweaver BW in the master data of the characteristic corresponding to that BPC dimension. When a user creates a BPC application by selecting a few of the BPC dimensions, an infocube (as well as a multiprovider containing that infocube) is generated in the BPC namespace that includes all the characteristics corresponding to the selected BPC dimensions. (You can read more about the BPC namespace at A reservation of a different kind – why, what and how of BPC namespace.)

We should distinguish the BPC dimension from the Netweaver BW dimension. In Netweaver BW, the term dimension is used to group characteristics. How are the characteristics in a BPC infocube organized among the Netweaver BW dimensions of the generated infocube? That depends upon the number of dimensions included in the BPC application. If the BPC application contains 13 or fewer BPC dimensions, all of them are automatically modeled as line item dimensions in the BPC infocube, since Netweaver BW allows up to 13 user-defined dimensions in an infocube. If the number of BPC dimensions exceeds 13, then the BPC infocube data model is automatically generated for those BPC dimensions. The data model generated when the cube is created may not remain the most optimized one as the fact table of the cube grows. BPC version for Netweaver therefore gives the BPC user the option to optimize the data model from the front end. As shown below, there are two options – Lite optimize and Full optimize.

image

The Lite Optimize option does not make any changes to the data model. It just closes the open request, compresses and indexes the cube, and updates database statistics. The Full Optimize option is the one that may rearrange the characteristics among the 13 user-defined Netweaver BW dimensions. The Full Optimize process checks whether the size of each dimension table is less than 20% of the fact table and creates as many line item dimensions as possible. In order to do this reconfiguration, it takes the appset offline, creates a shadow cube with the optimal data model, links the new optimal cube to the multiprovider for the application, moves the data to the shadow cube, deletes the original cube, closes the open request, compresses and indexes the cube, updates database statistics, and brings the appset online again. Though this results in a new infocube, the multiprovider remains the same, and all BPC reports are built on the multiprovider and not on the underlying infocube. Hence this optimization does not affect the BPC reports on this data.

Using ETL for BPC infocubes:

Since the data that the BPC user enters from the BPC front end is stored in the underlying real time infocube for that application, one may ask whether it is possible for us to load data to that cube with normal Netweaver BW ETL process. The answer to that is ‘yes’ – but with a caveat.

We can use Netweaver BW ETL for the BPC infocubes. Here is an example of a DTP to load data through a flat file to a BPC infocube.

image 

Now if the BPC user chooses to do a ‘Full Optimize’ for this application, it may result in a new infocube with a more optimal data model. That new infocube, though it gets automatically linked to the multiprovider for the BPC application, does not at present inherit the ETL structures that were built on the original cube. So in the above example, if the BPC user executes a ‘Full Optimize’ for the Finance application, the new optimal infocube for the Finance application may not inherit the DTP created on the original /CPMB/DZID30P infocube. The source system, data source, infosource etc. will remain, but the transformation that links these to the infocube will get deleted and has to be recreated. If this optimization happens in the production system, then the transformation may have to be recreated and transported up the landscape.

A way to obviate such a situation is to execute the process chains used by BPC to load data using native Netweaver BW tools, outside the BPC application. In the above example, a flat file is loaded to the BPC infocube using Netweaver BW ETL tools. However, the BPC application itself offers the front end Data Manager functionality to load data either from a flat file or from any other Infoprovider. Data Manager uses BPC process chains in the background to load the data, as shown below.

image

If we can run these process chains outside BPC – from the EDW layer using native Netweaver BW – then not only can we integrate this with custom process chains, but we also obviate the issue of ETL structures getting deleted on ‘Full Optimize’. Running BPC process chains outside BPC is also important if we are using open hub and want to automate the flat file load to BPC cubes by creating a user-defined process chain that integrates the file creation of the open hub and the loading of that file to the BPC cube. If our user-defined (custom) process chain (created in transaction ‘rspc’) can run the BPC process chain to load the data to the BPC cube, then we have an ‘industrial strength’ solution for loading data to BPC infocubes using the Netweaver toolset. The question now becomes how to accomplish this. Let us try to understand the steps involved.

Steps in using BPC process chain within non-BPC process chain:

The first step is to upload the flat file. If we want to use open hub, then the open hub can place the file at any specified location on the BPC application server, or we can upload the flat file to the BPC file service (transaction ‘ujfs’) as shown below.

image

The second step is to create a transformation file using the BPC front end. Though we want to run the BPC process chain with native Netweaver tools, this is the only step that we have to do with the BPC front end. This is because the BPC process chain looks for the XML version of the transformation file. When we process the transformation file from the BPC front end, this XML version of the transformation file is automatically created and stored in the file service.

image

The third step is to create an answer prompt file that passes the required parameters to the BPC process chain. This file should be a tab delimited file with the following format:

      %FILE%                ‘csv file path in file service’

      %TRANSFORMATION%      ‘transformation file path in file service’

      %CLEARDATA%           1/0

      %RUNLOGIC%            1/0

      %CHECKLCK%            1/0

Here is an example of the answer prompt file:

image
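In case the screenshot does not render, here is a sketch of what such a tab delimited answer prompt file could look like. The appset name, file paths and flag values below are illustrative assumptions (your file service paths will differ), and each parameter must be separated from its value by a tab character:

```
%FILE%	\ROOT\WEBFOLDERS\FINANCE\DATAMANAGER\DATAFILES\ACTUALS.CSV
%TRANSFORMATION%	\ROOT\WEBFOLDERS\FINANCE\DATAMANAGER\TRANSFORMATIONFILES\IMPORT.XML
%CLEARDATA%	0
%RUNLOGIC%	1
%CHECKLCK%	1
```

Here %CLEARDATA% set to 0 keeps existing data, %RUNLOGIC% set to 1 runs the default logic after the load, and %CHECKLCK% set to 1 checks work status locks – mirroring the prompts the Data Manager package would otherwise ask for.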

The fourth step is to run program ujd_test_package with the right appset and application. We should use the answer prompt file created in the previous step and save the variant for the program, as shown below.

image

image

However, please note that this ujd_test_package program was originally designed to assist in debugging data manager packages. Hence it may not be a bad idea to copy it to a user-defined program and use that in the next step – just to be on the safe side, so that if future development changes the nature of this program, we don’t get unnecessary surprises!
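Such a user-defined wrapper can be as thin as submitting the standard program with the saved variant. A minimal sketch is shown below; the program name Z_BPC_RUN_PACKAGE and the variant name ZDM_LOAD are assumptions for illustration, not delivered objects:

```abap
REPORT z_bpc_run_package.

* Thin wrapper around the standard debugging aid UJD_TEST_PACKAGE,
* so that a future change to the standard program does not silently
* break our custom process chain step.
* ZDM_LOAD is the variant saved in the previous step (an assumption).
SUBMIT ujd_test_package
  USING SELECTION-SET 'ZDM_LOAD'
  AND RETURN.
```

The ABAP process type in the custom process chain would then execute this wrapper (or the standard program directly) with the saved variant.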

Now, in the final step, we are ready to create our custom process chain that executes the BPC process chain. As shown below, create a user-defined process chain in transaction ‘rspc’ and include a process type to execute an ABAP program. Include the ujd_test_package program (or the user-defined program created based on it) with the saved variant.

image

image

Activate and execute the process chain.

image

Thus we can run the BPC process chain from within non-BPC process chains. These steps work not only for the process chain that loads a flat file into a BPC infocube with open hub, but also for loading data from another Infoprovider into a BPC infocube (using the BPC process chain to load data from an Infoprovider).


8 Comments


  1. Jay Roble
Great Blog.. a few questions on auto loading flat files to the BPC File Service, specifically for loading MASTER data from BI to BPC.

    1. Is there a way to load direct to the BPC File Service from Open Hub vs. the UJFS manual “Upload Document to DM”?

    2. You said open hub can place the file at any specified location on the BPC application server.
    – Is that the BI ABAP application server (not the .NET server)?
    – Do you still have to do the UJFS manual “Upload Document to DM”?
     
    3. Can a custom ABAP that creates a flat file  deposit it direct to the BPC File Service?
    – Are there FM or Methods that could be called to upload to the BPC File Service?

    4. Use Generic Data Source for BPC master Data load.
    – Is there a way we could create a Generic Data Source in the BW (Tran RS02) against the BI InfoObejct/master data & then link a BPC Load Master data Process Chain to it?
    – Or even have the Generic Data Source called to deposit the file on the BPC File Service?

    Thanks
    Jay
PS. Your images don’t appear, might not be linked in correctly, but the URL works.

  2. Pravin Datar Post author
    Thanks for the comments.
1. If you want to use open hub then it is easier to load the file onto the BPC application server instead of the file service. Otherwise you will unnecessarily have to write a program to load it to the file service.
    2. BPC application server is the server that you access in the field ‘BPC Server name’ when you start your BPC client. If you put your flat file on the BPC application server, there is no need to upload the document to the file service.
    3. A custom ABAP can deposit the file to the file service if you want it to go that route. There are function modules that can help you with that. Look in the uj* function modules in SE37 for a list of all BPC related APIs
4. This discussion in the blog pertains more to transactional data than master data, because for master data you can safely use the ETL process if you want to load data using the Netweaver BW toolset. In your question, you are referring to the BPC load master data chain. I believe that master data chain works with a flat file load of master data.
    Hopefully the images will start working in a day or two. There seems to be some technical issue with images not showing up in spite of being uploaded successfully.
    Thanks and best regards
    Pravin 
  3. Jay Roble
    You said

    “BPC application server is the server that you access in the field ‘BPC Server name’ when you start your BPC client. If you put your flat file on the BPC application server, there is no need to upload the document to the file service.”

However, I don’t think this is correct, since we are running BPC 7.0 version for Netweaver on Unix. When we start the BPC client, we are choosing the Windows .NET server, NOT the Unix BW NW application server!

    When we run a Data Manager Load file for a flat file, seems it can’t see the Unix server! Which is our application server.

    Is it true the DM is running on the WINDOWS server & trying to log onto the UNIX NW app server, but it can’t see it.

    Instead it should log into the BW application & then find the files?

    Any light you can shed on this would be helpful.

    thanks
    Jay

    1. Pravin Datar Post author
      Hi Jay,
The app server mentioned here is the BPC application server and not the Netweaver application server that you get in transaction al11. You can ftp your file from the Netweaver app server to the BPC app server and it will work fine as described in the blog. Sorry for the confusion.
      Regards
      Pravin
  4. Jay Roble
    Thanks for the quick response.

Since the file must be on the BPC app server vs. the NW BW app server, this white paper is a little misleading.

    How To… Automate BPC 7.0, version for the NetWeaver Platform Master Data Loads from BW to BPC
    https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/00380440-010b-2c10-70a1-e0b431255827

    It indicates you use open hub to create the flat file, which places the file on the NW App server (AL11 DIR_HOME).

    Then on page 24 item 10.1 says
    “The Import File which is what we export via the Open Hub Destination”

    And the screen print shows:
    Import file:
    vpal205sapmntDEVBMGS00work OHUB_CC.csv.

    This looks like the same file path in AL11 on our Unix BW NW app server.
    Perhaps the white paper needs to be updated to reflect that this should be the BPC app server not the NW appserver. And the options to move the file there.

    In the long run, BPC DM should be enhanced to:
– Be able to read, via BPC, the NW app server DIR_HOME directory files, using the BPC_SYSADMIN or BPC_USER system ids to log into the NW app server.

    Until the load is automated, would be great if you could provide the code for a custom ABAP or custom Process chain to deposit the file to the UJFS file service.

    Thanks
    jay

  5. Marc COSTE
    Hello,

    I’d like to find out how to run automatically the /CPMB/FULL_OPTIMIZE and LIGHT_OPTIMIZE.
Is there a solution to do that?

    Best regards.

    Marc

  6. Dear Pravin,

    I would like to ask you if there is a limit in the number of dimensions in a application or if there is a performance issue. I also would like to ask you if there is a limit in the number of properties in a dimension.

    Thanks in advance.

