Here are some ideas on how to create workflow templates more efficiently, reduce the load on the DB, avoid errors, and a useful trick or two.

Use a responsibility rule instead of fixed positions or users.

Even if the approver is only one user or the holder of a single position, it is better to use a responsibility rule with a code representing the approver. You can easily link a position or a user to the code, and this way you don’t need to change the template every time the user changes. You can also maintain the approver values and create a search help for them by using a domain with fixed values, for example approver code 01 = manager, 02 = finance, 03 = CEO. In order for the search help to be displayed, create a data element based on the domain and include it in a structure. If you create the rule element as a structure/field, the fixed values will be displayed as the search help values.
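As an illustration, such a responsibility rule can be implemented as a rule function module that reads the approver code from the rule container and returns the matching agents from a mapping table. This is only a sketch; all Z-names (Z_RULE_GET_APPROVER, ZWF_APPROVERS, APPROVER_CODE) are hypothetical.

```abap
FUNCTION z_rule_get_approver.
*"  TABLES:     ac_container, actor_tab (rule FM interface)
*"  EXCEPTIONS: nobody_found
* Requires INCLUDE <cntn01> in the function group's TOP include.
  DATA lv_code TYPE char2.

* Read the approver code (e.g. 01/02/03) from the rule container
  swc_get_element ac_container 'APPROVER_CODE' lv_code.

* Map the code to agents via a customizing table:
* OTYPE 'S' = position, 'US' = user
  SELECT otype objid FROM zwf_approvers
    INTO TABLE actor_tab
    WHERE approver_code = lv_code.

  IF actor_tab[] IS INITIAL.
*   Let the workflow react (e.g. go to the administrator)
    RAISE nobody_found.
  ENDIF.
ENDFUNCTION.
```

Because the mapping lives in a table, changing the approver is a data change, not a template change.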

Rejection of an asynchronous workitem

I like using asynchronous tasks, but rejecting them can be a problem. One way this can be done is by selecting the “Processing can be rejected” flag in the second tab of the task definition. This adds an option for the user to reject the workitem and a new branch in the Workflow Builder display. This option has a small problem: the workitem’s exporting bindings will not be filled (the actual agent, for example). This can be overcome by using the _PREDECESSOR_WI element of the workflow container (see also note 903692 – ‘using the _PREDECESSOR_WI workitem’).
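For example, in the binding of the rejection branch you can read the actual agent from the predecessor workitem instead of from the (empty) exporting binding. A sketch, where &LAST_AGENT& is a hypothetical workflow container element:

```abap
* Workflow binding expression (rejection branch):
* &_PREDECESSOR_WI.EXECUTEDBYUSER&   ->   &LAST_AGENT&
```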

Limiting the use of sub workflows

I recommend using sub-workflows, especially when the same sub-process is used more than once in the main workflow template; they also make the template more readable. But there is a drawback: every time you use a sub-workflow, a lot of log records are created (container, history, workitem header, etc.). This means that for a high-volume workflow, the DB tables will have a lot more data written to them in comparison to a loop containing all the logic.

For example, at one customer with more than 100K dialog workitems per month, we had an even greater number of sub-workflow workitems per month. This caused about 40% of the DB to be taken up by the workflow tables, which forced us to remove almost all the sub-workflows in order to reduce the DB load.

Always add a check before a send mail step for the recipients

People like receiving informational mails during a workflow process, but these are not usually process stoppers: if an informational mail was not sent, this should not normally stop the next approver from receiving the approval task. So it’s a good idea to add a check that the recipient list is not empty before the send mail step, since sometimes users’ mail addresses are not maintained. (Small tip, in case you didn’t know: send mail steps can receive multiple recipient lists.)
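A minimal sketch of such a check (the element name EV_RECIPIENTS_FOUND is hypothetical): a background step collects the recipients and exports a flag, and a condition step evaluates the flag to branch around the send mail step.

```abap
* Background method: collect recipients, report whether any exist.
DATA lt_recipients TYPE STANDARD TABLE OF swhactor.

* ... fill lt_recipients, e.g. from the org. assignment
*     or the users' address data ...

IF lt_recipients IS INITIAL.
* The condition step skips the send mail step, so a missing
* mail address does not stop the process.
  ev_recipients_found = abap_false.
ELSE.
  ev_recipients_found = abap_true.
ENDIF.
```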

Use more attributes

This has been discussed many times, but I think it will not hurt to say it again: use attributes instead of methods to get data as much as possible. This causes less use of the DB and makes the workflow template less complicated. For example, the _WORKITEM container element is a very useful tool; it holds a lot of data about the workitem. The approver name, for example, can be found via its ‘ExecutedByUser’ attribute and that object’s ‘Name’ attribute (for more details see _WORKITEM Container element in SAP Workflow). It can also be enhanced to add a few more options, for example the decision description or the text of the user’s notes.
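For instance, the approver name mentioned above can be pulled straight into a binding or a mail text as a container expression, with no extra method step:

```abap
* Binding / text expression using the _WORKITEM element:
* &_WORKITEM.EXECUTEDBYUSER.NAME&
* -> full name of the user who executed the workitem
```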

Database field calculations

Have you noticed that calculations based on database object attributes sometimes keep old values even after the field was changed? One option is to call a ‘refresh self’ method before the next task, but there is another. If you look at the generated code of the DB selection, pay attention to the line that checks whether the key fields of the buffered table structure are initial: only while they are still initial is the selection from the DB made; otherwise the data is read from memory, which may be out of date. So for fields of high importance, convert them to virtual attributes.

For Z-objects, you could also remove this ‘if’ before the selection, so that the system reads the DB attributes from the DB every time. But pay attention: this may slow things down, since the select will then be called many times.
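For reference, the generated selection logic looks roughly like this (a simplified sketch with hypothetical names ZTAB/KEYFIELD; the real generated code also checks the client field):

```abap
FORM select_table_ztab USING subrc LIKE sy-subrc.
  subrc = 0.
* The buffered structure OBJECT-_ZTAB is only re-read from the DB
* while its key fields are still initial - afterwards the (possibly
* stale) copy in memory is reused.
  IF object-_ztab-keyfield IS INITIAL.
    SELECT SINGLE * FROM ztab
      INTO CORRESPONDING FIELDS OF object-_ztab
      WHERE keyfield = object-key-keyfield.
    subrc = sy-subrc.
  ENDIF.
* Removing the IF (for Z-objects) forces a fresh SELECT on every
* attribute access: always current, but many more DB calls.
ENDFORM.
```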


  1. Mike Pokraka

    Interesting ideas about subworkflows. However, I would suggest that it is often more important to structure your workflows efficiently rather than eliminate subworkflows just for performance reasons. And I’m speaking from experience on a system generating millions of work items per day.

    A good maximum number of container elements is 10. A big workflow with 50 container elements is IMHO worse than 3-4 well-structured subworkflows with 5-10 elements each.

    Of course there are also other factors such as having a good archiving strategy and what type of container persistence to use.

    1. Ronen Weisz Post author

      We had a general rule of using sub-workflows whenever we could, so the workflow would be more readable. At the end of the process we still had sub-workflows, but we limited their use, for example not within loops. Or we changed a main workflow that called many sub-workflows into calling the sub-workflows directly from the event, using more start conditions.

      Container persistence will help with the size of the container tables (although it makes them unreadable directly), and the default setting for new workflows, if I remember correctly, is the efficient one, which was the case for most of our workflows. Our biggest tables were actually the log history tables and the node log tables.

      You have millions per day! Wow! And the DB team doesn’t complain all the time? We thought about archiving, but the customer requested that we hold the data for 7 years, and the archived display of the workflow is the technical display, which the users could not read. Is there a way to change this setting? Or have you developed something?

      1. Mike Pokraka

        Yes those would be the biggest tables. You can set up archiving to filter by workflow and create an archive stream of automation workflows that you can just delete at OS level and leave the audit-relevant things such as approvals in the system. You will need to do some enhancements as this granularity is not part of archiving, but it’s not hard.

        The thing that’s key to WF performance is the number of active work items. You can have 100 million WIs in the system, but if all but 10k are in status completed/cancelled, it should perform well.

        Also I see far too many workflows with technical steps that don’t need to be there (calculations etc.). That’s usually a good place to look at reducing WI volume.

