BW Process Chain Design
- For Data Load from Source System to Infocube
Steps:
Start Variant --> Delete Indexes on the Cube --> Infopackage --> Generate Indexes on Cube --> ...
The index creation step can be followed by steps for constructing DB statistics, deletion of overlapping requests, roll-up of filled aggregates, compression etc. (in that order). Links: 1, 2.
If the complete data target contents are to be deleted before each data load (used in the case of full updates to the data target), use the process 'Complete Deletion of Data Target Contents':
Start Variant --> Complete Deletion of Data Target Contents --> Delete Indexes on the Cube --> Infopackage --> Generate Indexes on Cube --> ...
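The flow above can be sketched as follows (illustrative Python, not SAP code; the step names are placeholders): a process chain is just an ordered list of steps, and each step runs only after every step before it has succeeded.

```python
# Hypothetical sketch of process chain semantics -- NOT SAP code.
# Each step runs only if all earlier steps succeeded; the chain
# stops at the first failure.
def run_chain(steps):
    """Run (name, step) pairs in order; stop at the first failing step."""
    completed = []
    for name, step in steps:
        if not step():
            return completed, f"chain stopped at: {name}"
        completed.append(name)
    return completed, "chain finished"

# Dummy step functions standing in for the real process types.
chain = [
    ("Start Variant",            lambda: True),
    ("Delete Indexes on Cube",   lambda: True),
    ("Infopackage",              lambda: True),
    ("Generate Indexes on Cube", lambda: True),
]
done, status = run_chain(chain)
print(status)  # chain finished
```

If the infopackage step failed, the index-generation step would never run, which is exactly why the drop/rebuild index steps bracket the load in the chain above.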
- For Data Load from Source System to ODS and then to Further Data Targets
Steps:
Start Variant --> Infopackage to Load to ODS --> ODS Activation --> Delete Indexes on Target Cubes -->
Further Update from ODS to Targets --> Generate Indexes on Target Cubes --> ...
If multiple infopackages load to the same ODS, you do not need to place multiple ODS activation steps. All unactivated requests loaded to the ODS will be activated in a single activation step. Link: 3.
Important note: The checkboxes 'Activate ODS object data automatically' and 'Update data targets from ODS object automatically' in the ODS -> Change screen hold no significance when the ODS is loaded via process chains. Irrespective of whether these checkboxes are ticked, steps for both of these activities have to be placed explicitly in the chain (as shown in the steps above). A similar concept applies to infopackages that use the setting 'only PSA, update subsequently to data target' on the Processing tab: placing only this infopackage in the chain will bring data only as far as the PSA.
Data will reach the data target only when you use the process type 'Read PSA and Update Data Target' (place it as the immediate step after the infopackage). Links: 4, 5.
- For Master Data Loads
Steps:
Start Variant --> Infopackages to Load Master Data ... --> AND Process --> Hier/Attr Change Run
In the variant for the Hier/Attr change run, specify all the master data infoobjects that are loaded in that chain.
The AND process is a 'collector' process. It triggers the subsequent process(es) when ALL of the steps that link to it (i.e. precede it) are successful.
To explain, in the above example the hier/attr change run should run only after all master data infopackages in the chain have run successfully, so the AND process is used. The other collector processes are the OR and EXOR (exclusive-OR) processes.
The OR process triggers the next process whenever any of its preceding steps runs successfully.
The EXOR process triggers the next process only once (as opposed to the OR process), when any of its preceding steps first runs successfully.
- Running Chains within Chains (Meta-Chain Concept)
When a process chain contains other process chains among its steps, it is known as a 'meta' chain, and the chains contained in it are called 'local' chains.
We use this concept when we want the execution of a chain to depend on the successful execution of one or more other chains. While creating a meta chain, use the process type 'Local Process Chain' to insert a local chain as one of its steps.
Note that for 'local' chains, the start variant should have the setting 'start using meta chain or API' and not 'direct scheduling'.
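To illustrate the idea, a local chain behaves like a single step of the meta chain: the meta chain proceeds only if the whole local chain succeeds. A minimal sketch (illustrative Python, not SAP code; the chains and steps are made-up placeholders):

```python
# Hypothetical sketch of the meta/local chain relationship -- NOT SAP code.
def run_chain(steps):
    """Run boolean step callables in order; True only if all succeed."""
    return all(step() for step in steps)

load_chain      = [lambda: True, lambda: True]   # e.g. load + activate
reporting_chain = [lambda: True]                 # e.g. roll-up / compress

# In the meta chain, each local chain is wrapped as a single step
# (its start variant would be set to 'start using meta chain or API').
meta_chain = [
    lambda: run_chain(load_chain),
    lambda: run_chain(reporting_chain),
]
print(run_chain(meta_chain))  # True
```

Because `all()` short-circuits, a failure anywhere inside `load_chain` stops the meta chain before `reporting_chain` ever starts, mirroring the dependency the meta chain is meant to enforce.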
For more information, please see this link (the 'Process Chains' section of help.sap for BW 3.5).
Many thanks for posting your valuable feedback.
And apologies: I missed pasting the introduction from the blog description into the body of the blog text.
I believe I have given additional information beyond what is mentioned in help.sap, and gave links for more detailed reference.
Craig ~ thank you for adding the description to the blog text.
regards,
Vishvesh
2. Many thanks for posting your valuable feedback. I feel what is missing here is how it functions.
3. I have a small suggestion: covering load errors in process chains, and their solutions, would be great. For example: if the AND, OR or EXOR processes fail at execution, what are the basic steps we need to check?
Regards,
kolli
I will try to include the requested information in further blogs.
cheers,
Vishvesh
I activated a simple process chain on our local server using your method (For Data Load from Source System to Infocube), and on checking the cube (Manage > Requests) I find that there is no request there, although by rights the job should have been executed after the process chain was activated.
Curious, I tried executing the job scheduling, and this error was produced:
Job BI_PROCESS_DROPINDEX could not be scheduled. Termination with returncode 8
Message no. RSPC065
Diagnosis
Program RSPROCESS is to be scheduled as job BI_PROCESS_DROPINDEX under user ALEREMOTE.
System Response
Scheduling terminated with return code 8. The meanings of the return codes are as follows:
SY-SUBRC = 4:
Scheduling terminated by user
SY-SUBRC = 8:
Error when scheduling job (JOB_SUBMIT)
SY-SUBRC = 12:
Error in internal number assignment
Procedure
Check the system log for more detailed information.
Execute Function
I would appreciate it if you could shed some light on the issue I'm currently facing.
Thanks!
Cheers!
Eric Tang
Consultant
Firefly Sdn Bhd
If the problem is resolved, could you please post the solution in a reply?
Thanks,
Vishvesh
Your blog helped me get started with PCs, but after referencing it a couple of times I realize a little more detail would be of great help.
For example, I have an ODS that feeds 2 other ODSs depending on update rules, and from these two ODSs everything converges into an InfoCube. How can I implement this?
Thanks
Thanks for the feedback. I didn't want to make the blog too long; that's why I supplied limited information and gave links to the help.sap pages.
help.sap has all the required help on process chains. There are also 2-3 good PDFs on process chains in SDN. Search for 'process chains' in the left-hand search box and change the drop-down to 'Library' or 'Downloads'.
There are also many threads that discuss this data loading scenario.
OK, now for this issue:
I understand that you are first loading ODS1, which in turn loads 2 ODSs (let's call them ODS2 and ODS3). ODS2 and ODS3 then load a single infocube, say Cube1.
So now we sort of merge sections 1 and 2 from the blog:
Start Variant --> Infopackage to Load to ODS1 --> ODS1 Activation --> Further Update from ODS to Targets (ODS2 and ODS3) --> Activation of ODS2 and ODS3 --> Drop Indexes on Cube1 --> Updates from ODS2 and ODS3 to Cube1 --> Generate Indexes on Cube1
You can also split this long chain into multiple local chains and then use them in a meta chain.
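To illustrate the dependency structure of this flow, here is a sketch of it as a small dependency graph (illustrative Python, not SAP code; the step names are placeholders). The step that drops the Cube1 indexes must wait for both ODS activations, which is exactly what an AND collector process enforces in a real chain:

```python
# Hypothetical sketch of the ODS1 -> (ODS2, ODS3) -> Cube1 flow -- NOT SAP code.
# A step runs only once all steps it depends on have completed, the same
# rule the AND collector process enforces.
def run_graph(deps):
    """deps maps step -> list of prerequisite steps; returns execution order."""
    done, order = set(), []
    while len(done) < len(deps):
        ready = [s for s in deps
                 if s not in done and all(p in done for p in deps[s])]
        if not ready:
            raise RuntimeError("cycle or unsatisfiable dependency")
        for s in sorted(ready):          # sorted() just makes the order stable
            done.add(s)
            order.append(s)
    return order

flow = {
    "Load ODS1": [],
    "Activate ODS1": ["Load ODS1"],
    "Update ODS2": ["Activate ODS1"],
    "Update ODS3": ["Activate ODS1"],
    "Activate ODS2": ["Update ODS2"],
    "Activate ODS3": ["Update ODS3"],
    "Drop Indexes Cube1": ["Activate ODS2", "Activate ODS3"],  # AND collector
    "Load Cube1": ["Drop Indexes Cube1"],
    "Generate Indexes Cube1": ["Load Cube1"],
}
order = run_graph(flow)
# The Cube1 index drop comes only after BOTH ODS activations:
print(order.index("Drop Indexes Cube1") > order.index("Activate ODS3"))  # True
```

Splitting this into local chains inside a meta chain, as suggested above, does not change the dependency graph; it only groups the steps into reusable units.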
Hope this helps!
cheers,
Vishvesh