
BW Process Chain Design

Queries on BW process chain design (especially about meta-chains, which process types to use, etc.) regularly appear in the BI forums. This blog attempts to give some know-how on this topic. (The information is applicable up to BW version 3.5.)

  • For Data Load from Source System to Infocube
  • Steps..

    Start Variant -> Delete Indexes on the Cube -> Infopackage -> Generate Indexes on Cube -> ...

    The index creation step can be followed by steps for constructing DB statistics, deletion of overlapping requests, roll-up of filled aggregates, compression, etc. (in that order). Links: 1, 2.

    If the complete data target contents are to be deleted before each data load (used in case of full updates to the data target), use the process 'Complete Deletion of Data Target Contents'.

    Start Variant -> Complete Deletion of Data Target Contents -> Delete Indexes on the Cube -> Infopackage -> Generate Indexes on Cube -> ...


  • For Data load from Source System to ODS and then to further Data Targets
  • Steps..

    Start Variant -> Infopackage to Load to ODS -> ODS Activation -> Delete Indexes on Target Cubes ->
    Further Update from ODS to Targets -> Generate Indexes on Target Cubes -> ...

    If multiple infopackages load to the same ODS, you do not need to place multiple ODS activation steps. All unactivated requests loaded to the ODS will get activated in a single activation step. Link: 3.

    Important note: The checkboxes 'Activate ODS object data automatically' and 'Update data targets from ODS object automatically'
    in the ODS -> Change screen hold no significance when the ODS is loaded via process chains. Irrespective of whether these checkboxes are ticked, steps for both of these activities have to be placed explicitly in the chain (as shown in the steps above).


    A similar concept applies to infopackages that use the setting 'Only PSA, update subsequently to data target' in the Processing tab. Placing only this infopackage in the chain will bring data only up to the PSA.
    Data will go to the data target only when you use the process type 'Read PSA and Update Data Target' (place it as the immediate step after the infopackage). Links: 4, 5.


  • For Master Data Loads
  • Steps..

    Start Variant -> Infopackages to Load Master Data ... -> AND Process -> Hier/Attr Change Run

    In the variant for the Hier/Attr change run, specify all the master data InfoObjects that are loaded in that chain.


    The AND process is a 'collector' process. It triggers the subsequent process(es) only when ALL of the steps that link to it (i.e. precede it) are successful.
    To explain, in the above example: the Hier/Attr change run should run only after all master data infopackages in the chain have run successfully, so an AND process is used.

    The other collector processes are the OR and EXOR (exclusive-OR) processes.
    The OR process triggers the next process each time any of its preceding steps runs successfully.
    The EXOR process triggers the next process only once (as opposed to the OR process), namely when the first of its preceding steps runs successfully. A small conceptual sketch of these three trigger rules follows below.

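    To make the three trigger rules concrete, here is a small conceptual sketch in plain ABAP. It is only an illustration of the logic described above, not SAP's actual collector implementation, and the predecessor names in it are made up.

      REPORT z_collector_demo.

      " 'G' = predecessor finished successfully (green), 'R' = failed (red)
      TYPES: BEGIN OF ty_pred,
               name(30)  TYPE c,
               status(1) TYPE c,
             END OF ty_pred.

      DATA: lt_preds        TYPE STANDARD TABLE OF ty_pred,
            ls_pred         TYPE ty_pred,
            lv_all_green(1) TYPE c VALUE 'X',
            lv_any_green(1) TYPE c,
            lv_fired(1)     TYPE c.   "would be remembered between evaluations (EXOR)

      " Hypothetical predecessors, e.g. the master data infopackages above
      ls_pred-name = 'IPAK_MATERIAL_ATTR'. ls_pred-status = 'G'. APPEND ls_pred TO lt_preds.
      ls_pred-name = 'IPAK_CUSTOMER_ATTR'. ls_pred-status = 'G'. APPEND ls_pred TO lt_preds.
      ls_pred-name = 'IPAK_MATERIAL_TEXT'. ls_pred-status = 'R'. APPEND ls_pred TO lt_preds.

      LOOP AT lt_preds INTO ls_pred.
        IF ls_pred-status = 'G'.
          lv_any_green = 'X'.
        ELSE.
          CLEAR lv_all_green.
        ENDIF.
      ENDLOOP.

      " AND: fires only when ALL predecessors are green
      IF lv_all_green = 'X'.
        WRITE: / 'AND would trigger the Hier/Attr change run now'.
      ENDIF.

      " OR: fires every time any predecessor turns green
      IF lv_any_green = 'X'.
        WRITE: / 'OR would trigger its successor (again)'.
      ENDIF.

      " EXOR: fires only for the FIRST predecessor that turns green
      IF lv_any_green = 'X' AND lv_fired IS INITIAL.
        lv_fired = 'X'.
        WRITE: / 'EXOR would trigger its successor exactly once'.
      ENDIF.

    In the real chain this evaluation happens each time one of the preceding steps finishes, which is why OR can fire repeatedly while EXOR fires at most once per chain run.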

  • Running Chains within Chains (Meta-Chain concept)
  • When a process chain contains other process chains among its steps, it is known as a 'meta' chain, and the chains contained in it are called 'local' chains.

    We use this concept when we want the execution of a chain to depend on the successful execution of one or more other chains. While creating a meta-chain, use the process type 'Local Process Chain' to insert a local chain as one of its steps.


    Note that for 'local' chains, the start variant should have the setting 'Start Using Meta Chain or API' and not 'Direct Scheduling'. (This same setting is also what allows a chain to be started via the API, as in the sketch below.)

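    Below is a minimal ABAP sketch of starting a chain programmatically, assuming the standard RSPC API function modules RSPC_API_CHAIN_START and RSPC_API_CHAIN_GET_STATUS exist in your release with the parameters shown (verify the exact interface in SE37); the chain name ZLOCAL_MD_LOAD is purely a made-up example.

      REPORT z_start_local_chain.

      DATA: l_chain  TYPE rspc_chain VALUE 'ZLOCAL_MD_LOAD',  "hypothetical local chain
            l_logid  TYPE rspc_logid,
            l_status TYPE rspc_state.

      " Start the chain; this works only if its start variant is set to
      " 'Start Using Meta Chain or API'
      CALL FUNCTION 'RSPC_API_CHAIN_START'
        EXPORTING
          i_chain = l_chain
        IMPORTING
          e_logid = l_logid
        EXCEPTIONS
          OTHERS  = 1.

      IF sy-subrc <> 0.
        WRITE: / 'Chain could not be started'.
        EXIT.
      ENDIF.

      " Check the status of this particular run (log ID),
      " e.g. A = active, G = successful, R = failed
      CALL FUNCTION 'RSPC_API_CHAIN_GET_STATUS'
        EXPORTING
          i_chain  = l_chain
          i_logid  = l_logid
        IMPORTING
          e_status = l_status.

      WRITE: / 'Chain', l_chain, 'run', l_logid, 'status', l_status.

    This is essentially what a meta chain does for you via the 'Local Process Chain' process type, so the API route is normally needed only when triggering chains from external schedulers or custom programs.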

    For more information, please see this link (help.sap, BW 3.5, 'Process Chains' section).


      9 Comments
      Kenneth Murray
      This blog has no introduction and no context. Is it meant to be a how-to? Does it add value beyond the SAP Help links provided in the blog?
      Former Member
      Perhaps the bit from the description was missing. It sometimes happens that people forget the description shows only for the topic/most recent page and the RSS/e-mail subscription and does not appear when reading the page. I added it to the body as well - does that help, or do you still feel something is missing? In that case the author needs to jump in here and make some corrections.
      Former Member
      Hi Kenneth,

      Many Thanks for posting your valuable feedback.

      And apologies, because I missed pasting the introduction from the blog description into the body of the blog text.

      I believe I have given additional information beyond what is mentioned in help.sap, and provided the links for more detailed reference.

      Craig, thank you for adding the description to the blog text.

      regards,
      Vishvesh

      Former Member
      1. Let me introduce myself: this is Raghavendra Kolli, working for LGS Ltd (India).

      2. Many thanks for the valuable post. I feel that what is missing here is how it functions.

      3. I have a small suggestion: details on load errors in process chains and their solutions would be great. For example, if the AND, OR, or EXOR processes fail at execution, what are the basic steps we need to check?

      Regards,
      kolli

      Former Member
      Thanks for the feedback Kolli.

      I will try to include the requested information in further blogs.

      cheers,
      Vishvesh

      Former Member
      Hi Vishvesh,

      I activated a simple process chain on our local server using your method (For Data Load from Source System to Infocube), and when checking the cube (Manage > Requests) I find that there is no request there, although by right the job should have been executed after the process chain was activated.

      Curious, I tried executing the job scheduling and this error was produced:

      Job BI_PROCESS_DROPINDEX could not be scheduled. Termination with returncode 8
      Message no. RSPC065

      Diagnosis
      Program RSPROCESS is to be scheduled as job BI_PROCESS_DROPINDEX under user ALEREMOTE.

      System Response
      Scheduling terminated with return code 8. The meanings of the return codes are as follows:

      SY-SUBRC = 4:
      Scheduling terminated by user
      SY-SUBRC = 8:
      Error when scheduling job (JOB_SUBMIT)
      SY-SUBRC = 12:
      Error in internal number assignment
      Procedure
      Check the system log for more detailed information.

      Execute Function

      I'd appreciate it if you could shed some light on the issue that I'm currently facing.

      Thanks!

      Cheers!
      Eric Tang
      Consultant
      Firefly Sdn Bhd

      Former Member
      Sorry, I was unable to help you with this issue.

      If the problem gets resolved, could you please post the solution in a reply?

      Thanks,
      Vishvesh

      Former Member
      Hi
      Your blog helped me get started with process chains, but after referencing it a couple of times I realize a little more detail would be of great help.

      For example, I have an ODS that feeds 2 other ODS objects depending on update rules, and from these two ODS objects everything converges into an InfoCube. How can I implement this?

      Thanks

      Former Member
      Hi,

      Thanks for the feedback. I didn't want to make the blog too long; that's why I supplied limited information and gave links to the help.sap pages.

      help.sap has all the required help on process chains. There are also 2-3 good PDFs on process chains in SDN. Search for 'process chains' in the left-hand search box and change the drop-down to 'Library' or 'Downloads'.

      There are also many threads that discuss this data loading scenario.

      OK, now for this issue:

      I understand that you are first loading ODS1, which in turn loads 2 ODS objects (let's call them ODS2 and ODS3); then ODS2 and ODS3 load to a single InfoCube, say Cube1.

      So now we sort of merge sections 1 and 2 from the blog:

      Start Variant -> Infopackage to Load to ODS1 -> ODS1 Activation -> Further Update from ODS1 to Targets (ODS2 and ODS3) -> Activation of ODS2 and ODS3 -> Drop Indexes on Cube1 -> Updates from ODS2 and ODS3 to Cube1 -> Generate Indexes on Cube1

      You can also split this long chain into multiple local chains and then use them in a meta chain.

      Hope this helps!

      cheers,
      Vishvesh