Each year one vicious habit discarded, in time might make the worst of us good. – Benjamin Franklin

[Image courtesy of Grant Cochrane / FreeDigitalPhotos.net]

[Latest Update: Carsten Ziegler and James Taylor have kindly pointed out some further blogs to help with BRFplus and Decision Service Modelling, so I’ve included them in the main blog to make them easier to find (although I recommend reading the comments by James and Carsten… some really good suggestions in there).]

Recently I completed two projects that involved using BRFplus for some very complex rules. Both involved public sector organizations so, as you can perhaps imagine, when I say complex I mean *REALLY* complex. For rules this complex, having the business build the rules all by themselves was not an option; we needed people with a strong background in logic design to design and build the business rules, and that meant the rules needed to be created in collaboration with IT people.

For starters, the business rules covered not just policy and operational matters but also the direct interpretation of legislation into the ever-contentious who-gets-paid-what-and-by-whom. Secondly, these rules are created not just by business people but by politicians, who have no interest whatsoever in making allowances for what systems can or cannot do easily. Thirdly, many of these rules were purpose-built to enforce government legislation – there was absolutely no flexibility for changing legislation to make it easier for the person creating the rule, and it took many discussions with policy teams to clarify exactly how the relevant legislation needed to be interpreted and applied.

So these were right doozies of projects! On top of which, both projects had significant deadline pressures: in one project we were brought in late during implementation to (re-)build the rules, and in the other we came in early to do some high-profile architecting and a first-pass design of the solution. So we were seeing complex business rules from two very different perspectives.

Needless to say this was a big stretch over our previous DSM and BRFplus work and led to a lot of hard, fast (don’t you just love deadlines?!) thinking about how best to work with such complex rules in BRFplus.  So I thought I’d share my thoughts on our experiences and also on a couple of bad IT habits that really need to be discarded like smelly fish guts if business rules are to be effective.

To answer the first question first – did you use SAP NetWeaver DSM?

We really wanted to use SAP NetWeaver Decision Service Management on the build project… we were just held back by some bad timing.

Why did we want DSM? We were building complex inter-related rules across 2 SAP solutions – ECC and CRM – and DEV, QA and PROD instances, so being able to deploy to any system from one central DSM would have made life a lot easier.  Being tied down to the usual change request cycle for transports was a distinct nuisance, especially when we were trying to simply update the rates/factors with more realistic values for testing purposes, or just change the values to test out some retrospective calculations.  Not to mention when we needed to change milestone dates (which we had set up as constants – more on that later) to simulate retrospective, current and future calculations in several rounds of testing, and of course those dates had to be changed simultaneously in both systems for this testing to make sense.

Lack of DSM was also a nuisance when it came to working out authorizations and procedures for our business experts who would be maintaining rules – as business experts needed to access multiple systems just to view the rules, and have an understanding of which rule appeared where and why.  If we had DSM they could all have been authored, viewed and maintained centrally.

 

[BTW if you want to find out more about SAP NetWeaver DSM features then Carsten Ziegler, who literally wrote the book on BRFplus, gives a quick summary here: SAP NetWeaver Decision Service Management – Let’s Talk Features. And if you are lucky enough to be going to TechEd 2013 like me 😀 then you might want to drop into one of the Business Rules sessions listed by Carsten here: TechEd 2013 Sessions on SAP NetWeaver Decision Service Management and BRFplus]

In fact the build project customer was already fully convinced of the value of DSM: on an earlier project with business rules that changed rapidly, they had custom-built several DSM-like rapid-update features using the BRFplus application exits. Consequently, they were already looking at SAP NetWeaver DSM, licensing was finalized during the project, and they were just getting around to installing it. Just not in time for this phase of the project. It will be used for the next phase thankfully, and the plan is for existing rules to be imported into DSM to provide a one-stop rules environment for our business experts.

For the second, design project, part of what we were architecting was whether or not DSM should be used, and for similar reasons (and partly based on experiences in the build project) we gave a resounding YES! to using DSM for ongoing maintenance of the rules by business champions and deployment of rules by business people and/or IT to the relevant systems.

Is Decision Modelling Notation helpful?

I had only just read up on James Taylor’s Decision Modelling Notation blogs, such as Using Decision Requirements Models to scope BRFplus and NW DSM, shortly before the project. I didn’t have a specific decision modelling tool available, but I thought: why not give Decision Modelling Notation a go and see if it’s useful? In the end I used first paper, and then a mind-mapping tool I had to hand, to draw up my model, though I could just as easily have done it in Visio or any drawing tool. A little more work without explicit tool support, but not insurmountable.

 

For the build project, we came in late so we had to very rapidly understand the decisions that we were building.  Decision Modelling Notation was an easy fit for roughing out an understanding of how the major decisions and sub-decisions fit together from a business viewpoint, and identifying the primary data entities and rates/factors that fed into those decisions. So far so good. 

Not so good was using it to structure our rules application in BRFplus. This was mainly because, although we controlled the rule design, the calling application came with a predefined function context. [To be fair, it could also have been my inexperience with this particular modelling approach.]

It made sense for us to use the predefined context, partly to save on the development costs of recoding the call in a BAdI, and partly because it matched the business requirement to evaluate many items of data simultaneously in the same decision to reach a total payment amount.

As the evaluation involved not only many items of data, but also evaluating daily rates in specific sequences and using rates applied to one type of item as a baseline for rates applied to other items in the same context, we ended up having to use a number of Rulesets, Rules and Loop expressions just to meet the requirement and handle such a large and complex context efficiently. We did actually use the sub-decisions that we identified in the modelling, but they necessarily had to be called from within loops.

That proviso aside, what Decision Modelling Notation did make very clear was what sub-decisions the business people needed to be able to simulate more or less independently to be confident that the rule was working correctly, so we were able to make sure those sub-decisions were clearly contained in explicit independent Rules or Expressions so they could be simulated separately.  Decision Modelling Notation also highlighted what rates/factors/constants were important to the business and needed to be in our BRFplus Catalogs.   

These aspects – clear logical breakdown of the business decision and sub-decisions, clear identification of data sources and critical rates/factors/constants – were extremely helpful in discussing the rules design with the business in the design project, and confirming that what we were proposing was correct.

Generally we liked the idea of graphical modelling of rules so much that we also ended up using a variation of it to explain the hierarchy of our functions, rulesets, rules and expressions in the technical documentation for the build project.

If that makes you at least curious about Decision Modelling, but you were hoping for a tool to make it easier, James Taylor and Carsten Ziegler have recorded a webinar showing how James’ decision modelling tool DecisionsFirst Modeler can be used to blueprint rules to be implemented in SAP NetWeaver Decision Service Management. More on that in Blueprinting BRFplus and NW DSM projects with Decision Requirements.

How do you deliver on the promise of business owned and maintained rules?

For rules this complex, having the business build the rules by themselves was not an option – from the start it was clear it needed to be a collaborative effort between business and IT. Even getting a functional consultant to build the rules was problematic: they were complex enough that you needed a strong background in logic design just to put them together in a coherent fashion. That said, our aim was still to build robustly, but plan to hand over the day-to-day changes – such as periodic changes in rates and factors – to the business experts themselves.

Given the complexity of the rules it was also crucial that the business experts be able to make some sense of any rules they were viewing, be able to easily find and maintain the rates/factors/milestone dates that frequently changed, and have some mechanism for simulating rules – retrospective scenarios were a particular concern where queries, complaints and formal legal objection processes meant we needed to be able to trigger recalculations not just months but potentially years after the original calculation was made.

The big bonus here was the customer’s previous experience with BRFplus. Added to our own experience, delivering on the promise that business experts can view/simulate/maintain their own rules came down to thinking about the practicalities of setting up the rules so that the business would be comfortable with owning them. So here are some recommendations based on what we found worked for us.

Catalogs cut out most of the confusion

Having catalogs that reduced the 130-plus rules objects (not including data objects) created in just one BRFplus application down to the 12 or so that the business experts needed to maintain or simulate was essential to avoid information overload for the business.

Long text not short text

  • Good business terminology matters when business people view rules. [BTW IMHO good grammar and *SPELLING* matter too!] We had either a business expert or at least a functional consultant read through the rule with personalization mode “Hide Technical Names” turned on, to make sure it made sense.
  • For the most part we ignored short text – you don’t need it, and it’s fiendishly difficult to come up with 20-character abbreviations for most data objects and rules objects – we just used long text, long text, long text. And long text really is quite long… you have plenty of space to name your object properly.
  • Use abbreviations sparingly – there is nothing worse than a data object cryptically named “Pl St Dt” when you could just as easily have written “Planned Start Date”.    
  • When basing data objects on data dictionary elements/structures/tables, after you’ve created them go back and at least rename properly every attribute you are actually using in your rules. Some of the default texts in the dictionary are [putting it *very* mildly] unhelpful.

Limit your expression types

OK, if you expect the business to maintain an expression, then you need to use an expression type that is not too intimidating for them to understand and maintain. In other words, stick to decision tables, formulas, and constants. If you insist on using exotic expression types like XSLT, don’t be surprised if your business people baulk at maintaining their own rules. [P.S. Carsten Ziegler mentions in the comments that XSLT is being retired soon anyway for much the same reasons, but with roughly 20 standard expression types, plus application-specific expression types, plus the option of creating your own custom expression types… there’s plenty of scope for overcomplicating your design. Choose wisely!]

Building effective decision tables

Not too many columns. Clear, simple and direct is the key to good tables. Consider splitting one complex table into several related decision tables if it’s too complex to understand – you can always use your rules to navigate between the tables.
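To make the split concrete, here’s a rough conceptual sketch in Python (not BRFplus syntax; the client types, categories and rates are all invented for illustration): one unwieldy table becomes two small related lookups, with a simple rule navigating between them.

```python
# Conceptual sketch only (Python, not BRFplus); client types, categories
# and rates are hypothetical.
CLIENT_CATEGORY = {            # decision table 1: client type -> category
    "PENSIONER": "CONCESSION",
    "STUDENT":   "CONCESSION",
    "EMPLOYED":  "STANDARD",
}
CATEGORY_RATE = {              # decision table 2: category -> daily rate
    "CONCESSION": 21.75,
    "STANDARD":   42.50,
}

def daily_rate(client_type: str) -> float:
    # The "rule" that navigates from one small table to the next,
    # instead of one wide table crossing every client type with every rate.
    return CATEGORY_RATE[CLIENT_CATEGORY[client_type]]
```

Each small table stays readable on its own, and adding a new client type touches only the first lookup.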

Always include a validity date in decision tables

Funnily enough, things always change… which is, after all, the main point of using adaptable business rules. It’s all too easy for business and IT alike to assume the current status quo will always work this way… right up until you start testing your retrospective/future legislation scenarios… D’oh! No one is going to want to create a whole new decision table, and a rule to call it, just because a rate/factor has been increased or decreased. While there is versioning, a simple validity date column is often easier to understand and use in most situations.

*NEVER EVER* hardcode future milestone dates (or past milestone dates)

  • Even in the few months we were on the build project, the imminent new legislation introduction date changed at least 3 times… and then we had to change it artificially for retrospective testing as well. A well-placed constant, where you can adjust the date in one place, is so much better than going through rule after rule after rule changing “IF date GT new legislation date” logic. Plus we could reuse the constant in all our decision tables (e.g. Valid Date cell set to “IF date GT new legislation date constant”).
  • In any case, if you create it as a constant and include it in your Catalog (we put it in a subfolder “milestone dates”), it’s easy for the business to confirm/adjust the milestone dates as necessary without calling IT again.
  • Constants of course are great for all sorts of simple reuse scenarios, not just for dates… we used them a lot whenever we had to check the same value in multiple rules, and for threshold values that were referenced in multiple places.
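A minimal sketch of the idea (again in Python rather than BRFplus, with an invented date): the milestone lives in exactly one constant, and every comparison references it.

```python
from datetime import date

# Hypothetical milestone: the ONE place to change when the legislation
# introduction date slips (again), or when simulating retrospective scenarios.
NEW_LEGISLATION_DATE = date(2014, 1, 1)

def applies_new_scheme(calculation_date: date) -> bool:
    # Mirrors "IF date GT new legislation date constant" in each rule.
    return calculation_date > NEW_LEGISLATION_DATE
```

Shifting the milestone for a round of retrospective testing means editing one value, not hunting through every rule and decision table.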

Socialize, socialize, socialize

Show what you are building to the business experts you expect to maintain the rules early – get their input into the catalog folder hierarchy, business terminology, and clarity of expressions, and use that to guide your ongoing development. It’s never too early to start socializing… especially if you can use your decision modelling notation diagram to give some added context. Early socialization also helps build a sense of business ownership. We had 2-3 business experts who were our go-to people for asking “Does this look about right? Does this make sense to you? Do you feel comfortable you could maintain this?”

Notes on Simulation

Simulation of a complex rule is just not easy. Trying to fill in many-attribute structures and multiple rows of tables to set up a single simulation was not fun even for the IT folk. Even though you can now save sets of complex parameters as test data and regression test against them, if you have several structures and tables as parameters there’s a fair bit of work involved in setting up the initial tests. As our best-of-both-worlds compromise, we ended up creating some simplified programs that would take in a spreadsheet of input, call the rule with a trace, and return the results.
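In outline, those driver programs worked something like this (sketched here in Python with a hypothetical trivial rule standing in for the real BRFplus function call; ours were ABAP programs reading spreadsheets):

```python
import csv

# Hypothetical stand-in for invoking the deployed BRFplus function;
# on the project this was a generated function call with the trace switched on.
def payment_rule(row: dict) -> dict:
    days = int(row["days"])
    rate = float(row["daily_rate"])
    return {"case_id": row["case_id"], "amount": days * rate}

def simulate_from_csv(path: str) -> list[dict]:
    """One simulation scenario per CSV row; collect all the results."""
    with open(path, newline="") as f:
        return [payment_rule(row) for row in csv.DictReader(f)]
```

The point is the shape of the harness: business experts fill in one spreadsheet row per scenario, and the program runs the lot and hands back the results for comparison.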

Another approach we took was to break some really big rules up into 3-4 large sub-functions that could be simulated separately, and use a main function with expressions of type Function Call to call the sub-functions. This had the advantage that we could create extra programs to simulate these major sub-decisions via spreadsheet input. [Granted, with a simpler scenario, direct simulation of expressions would have been enough.]

 

Some *BAD* IT habits to discard when working with business rules

Actually it turned out that the biggest problem in building our rules on the build project was the bad but not uncommon habits of our IT colleagues (ok, and sometimes of us too ;-( ). When you sit down to build functions, rulesets, rules and many of the expressions, there’s a lot of pseudo-code look and feel that makes you feel right at home as a developer… sometimes a little too much at home. The mantra to keep repeating is:

Business Experts Are Looking At What I Create 

– not just the end result – all the underlying workings are visible too. I’ll admit there are a couple here I’m guilty of too, and only picked up on when Wolfgang Schaper, our internal go-to guy on DSM at SAP, who graciously took the time to review our DSM/BRFplus usage on the build project, pointed them out. So here are some bad habits that we had to work hard to get everyone to discard in the pursuit of better business rules.

Death by abbreviation

It’s like someone lost all the vowels. Pl St Dt? What is that????!

Hardcoding

Especially when it comes to milestone dates. You were expecting everything to happen as per the original schedule? And you’ve been working in IT how long?!

Modularizing for no good reason

No, it’s not programming now. Avoid creating independent rules if you aren’t reusing them and don’t need to separate them out for simulation. It just adds complexity without benefit.

Unclear reuse of working areas

BRFplus is *very* strict on formal versus actual parameters – don’t fight it, get used to it. When using multiple tables with the same row format, for instance, you need to create a structure/working area *per table*. This is easy enough to do, especially if they are built using DDIC structures, as each only needs a different technical ID – the attributes etc. can be the same and have the same or similar names.

 

Defaulting to ABAP 

  • At some point you are going to find yourself saying… “it would be quicker to do this in ABAP”. At that point, stop and give yourself a sharp rap over the knuckles, because guess what? It’s not about you… and it’s not about short-term TCD (Total Cost of Development). This is about long-term Total Cost of Ownership.
  • Also there is a bit of a performance hit when you call custom ABAP code from BRFplus rules … so you don’t want to do this too often.
  • We found someone calling ABAP methods within a BRFplus rule just to do a date calculation which could have been done just as easily using Formula expressions. 🙁
  • When should you use ABAP vs. Rules? First ask “Is this something the business will want/need to see?”  Warning – they probably want/need to see  more than you think they do. 
  • [My personal suggestion would be to perhaps use ABAP for structural transformations (e.g. pivoting a table), or complex multi-table lookups where there is no benefit in exposing this to the business, especially if those are already wrapped in BAPIs or similar delivered routines]

Defaulting to custom Z tables

  • At some point you are also going to say “couldn’t I just use a custom Z table like I’ve been doing for the last 20 years?”… Don’t do it! Frankly, these days I consider custom Z tables nearly as bad as writing your rules directly in code, because once you have set them up they just aren’t flexible enough to meet any significant changes in business requirements, and it means the business has to go back on the IT backlog to make any significant change.
  • Still not convinced? Then may I suggest Carsten Ziegler’s blog How to Kill Custom Code and Z-Tables

Using “clever” options without thinking through the consequences…

  • Just because an expression type exists does NOT mean you have to find a reason to use it. Seriously! You can build amazing rules using just constants, formulas, decision tables, loops, and table operations. Actually I was impressed to find I *never* had to call an ABAP method or function for my most complex rule application of all.
  • Go easy on playing around with Access level – there are performance considerations to be wary of here.   [My personal suggestion… use the lowest access level you can get away with]
  • We tried using Global Access Level in a separate Global Application for a few central-ish decision tables and had to back away from it because the performance costs were too great. In the end, when we looked at it hard, there was only one decision table that needed to be shared, so we brought the decision tables into the local application and just adjusted the access level on the shared table – an immediate 25% saving on overall performance costs with that one move.

  

Assuming the status quo will never change…

I repeat … and you’ve been working in IT how long?!  Include validity dates in decision tables and don’t hardcode those milestone dates.

Assuming the business LOVE working with IT…

Especially waiting for that IT backlog to get around to doing their critical updates. The key draw of business rules is that the business doesn’t have to call you again… absence makes the heart grow fonder. Besides which, do you really want to spend your career updating decision tables? Make sure you have taken every opportunity to let the business maintain their rules themselves as much as possible.

Some Good IT habits and Tips that worked well with business rules

Work top down from the BRFplus function

The major difficulty with BRFplus rule objects that tripped us up initially was losing sight of the context, which tended to happen if you were entering rule objects directly. Once I started working top-down from the function each time, those problems mostly went away.

 

Perform actions and discard results

This was my favourite get-out-of-jail-free card, especially when I needed to update multiple tables in the same Loop expression. It lets you update the higher-level context objects without worrying about defining an explicit result for the expression (technically I used an unnamed Boolean result – just to keep the overhead as small as possible).

Looking for more tips? Carsten Ziegler’s blog Best Practices for Decision Modeling in SAP NetWeaver Decision Service Management gives tips and plenty of helpful screenshots.

If I had my time over again what would I do differently?

Use Decision Modelling Notation even more… I was happy with what we did, but I’d like to use it to explore its possibilities further.

Clarify all business terminology up front with the business … just to save the rework in renaming objects later.

Collaborate and communicate with the business even more… I really want them to feel like they own the business rules well before I hand them over. Because, well, they aren’t my rules, they aren’t the IT people’s rules – the rules belong to the business people. And for business rules to really be *business* rules, the business needs to own them from the day the project goes live, and preferably well before then.

Discarding a few bad habits for a more joyful end result?  Happy to do it!


23 Comments


  1. James Taylor

    Jocelyn

    Fabulous post – excellent advice all round, especially on the importance of collaboration.

    With respect to decision requirements model a couple of points:

    There is a modeling tool available for Decision Modeling that works with NetWeaver Decision Service Management or BRFplus – DecisionsFirst Modeler. Check out http://decisionmanagementsolutions.com/decisionsfirst-modeler – you can sign up for a free 60 day trial. If you or anyone reading has questions about it, drop me a line.

    You can, as you point out, read more about the approach on SCN with my posts Using Decision Requirements Models to scope BRFplus and NW DSM and Blueprinting BRFplus and NW DSM projects with Decision Requirements. These talk specifically about using Decision Requirements Modeling in an SAP project and are based on a great series by Lee Chisholm.

    You might also enjoy this white paper on using the approach more generally with business rules.

    Carsten and I recorded a webinar to show how to use DecisionsFirst Modeler with NW DSM and you can check that out too – Managing and Evolving Decisions with DecisionsFirst Modeler and SAP NetWeaver Decision Service Management.

    1. Jocelyn Dart Post author

      Thanks James. Appreciate the links and info and will certainly check out the modeller and the session with Carsten on the crossover with DSM… Sounds promising!

  2. Colleen Hebbert

    Hi Jocelyn

    Great read and love the lessons

    When should you use ABAP vs. Rules? First ask “Is this something the business will want/need to see?”  Warning – they probably want/need to see more than you think they do.

    Do you have any other lessons learned or advice on how to make this decision? I ask from a GRC component perspective for MSMP workflow rules. On the GRC community there are posts for assistance in configuring the decision table. However, some of the more “complex” rules for GRC seem to need DB Table lookup, etc. Admittedly, what I consider “complex” may be quite simple in the grand scheme of BRF+.

    I too started to wonder if a function module would be better/easier. Coming from a non-development and WF background do you have any suggestions on how to assess choosing between BRF+ and ABAP function module?

    Regards

    Colleen

    1. Jocelyn Dart Post author

      Hi Colleen, Yes it can be a tricky decision & actually as I do have a development background the temptation is there for me too. You can use the DB lookup quite happily and we used it for a couple of scenarios ourselves.  As to calling function modules if there was a standard routine like a BAPI and you were only calling it once for each rule call that would be ok perhaps because you would be using it like a black box.  If the rule is calling several tables in sequence then you could use several DB lookups or a function module but I think I would look at perhaps creating a data dictionary view & using that instead if possible as that would probably be both clearer & perform better.

      My suggestion would be to consider firstly clarity – is the approach clear and transparent to the business experts? – & then in high volume scenarios to reassess based on performance impacts.

      You can actually view the generated ABAP code underlying the BRF+ function and run it through the usual ABAP analyses to check performance.

      Does that help?

      Jocelyn

      1. Colleen Hebbert

        Hi Jocelyn

        Thanks for the feedback. Yes it does help

        Appreciate the tip on running through ABAP to compare. I think the reality for me is if I’m having to configure a complex BRF+ rule to determine agents, etc, it might be better to sit with a developer and get their thoughts.

        Regards

        Colleen

        1. Jocelyn Dart Post author

          Hi Colleen, That’s not a bad idea anyway – what a developer will bring is the logic design and knowledge of logic patterns on how to approach certain things… that can save you a lot of hard thinking, especially with complex rules. What we found helps is if the functional/business people bring clear use cases to the discussion, to help the developer identify which pattern applies and to simulate against to verify the rule is correct.

          You might also want to rough out your thoughts using the Decision Modelling Notation – even just breaking up the main decision into logical sub-decisions for a simple scenario helps a lot. 

          Good luck!… and keep on the Business Rules forum – really active and very helpful.

          Jocelyn

          1. Colleen Hebbert

            Hi Jocelyn

            You pretty much nailed my approach – I always create data flow diagrams, high level business processes with decision points as part of my requirements. Coming from security to GRC, I’ve had to start wearing the Business Analysis/Functional Expert hat instead.

            I’m finding the GRC300 course taught us BRF+ at a high level, and the GRC component has an ABAP program that creates the application and function and populates the structure of the decision table. GRC Access Controls encourages the use of BRF+ over ABAP function modules, but there isn’t much more advice beyond that (fair enough!).

            In the GRC community the questions asked around this topic are most likely out of our depth (complex rules). I can come up with a solution (some times), however, just because it works doesn’t mean it’s the most efficient approach to creating the rules.

            I’m now following this community. It wasn’t until your article that I realised it existed and then realised how active it has become.

            Thanks again

            Regards

            Colleen

            1. James Taylor

              Colleen

              One observation. We (and the rules industry more generally) find that a declarative/dependency-based approach works better than a flow based approach initially. While you will likely implement a flow for complex decisions, beginning with dependencies clarifies the difference between a decision that MUST be made before another decision for business reasons and those being sequenced for technical reasons. This can be really important….

              James

            2. Jocelyn Dart Post author

              Ahhh – now that’s interesting… please recommend this community to your GRC colleagues asking forum questions as well. There are a LOT of standard SAP solutions, or applications within solutions, that are now using BRFplus, so it makes sense to handle those queries centrally – much as we do for Workflow, which is used in a lot of different applications as well. And all the best with your next steps!

    2. James Taylor

      Colleen

      I use a couple of criteria for selecting a rules-based approach to automating a decision rather than a code-based one:

      • are there multiple policies or regulations driving the logic?
      • are those policies or regulations complex?
      • must the expertise of multiple people be applied?
      • is an assessment of risk or potential required in the decision?
      • does the logic change regularly or must it change at short notice?

      All of these are good reasons to use a business rules management system like NW DSM. Combinations work too – something with reasonably complex policies that change reasonably often, for instance. All of these are clues that the business content of the logic is going to be high, and so will repay an investment in business rules.

      It is worth remembering that a business rules management system like NW DSM gives you four main capabilities:

      • Design transparency
        So you know how the decision will be made in the future (engaging the business)
      • Execution transparency
        So you know EXACTLY how each decision was made in the past (allowing compliance and improvement)
      • Impact analysis and simulation
        So you can see what impact a change will have before you make it
      • Collaboration
        Making it possible for business and IT to work together (regardless of who writes the rules)

      HTH

      James

      1. Colleen Hebbert

        Thanks James – good tips. I think I have to develop my own check list as I face complex rule situations. Design Transparency is a big one 🙂

        Regards

        Colleen

    1. Jocelyn Dart Post author

      Thanks William – appreciate the feedback.  It can be very daunting when you are looking at complex business rules!  We actually found business/functional vs. IT people see different rules as easy vs. complex so with a collaborative approach we actually solved a lot of each other’s problems quite quickly.

  3. Carsten Ziegler

    This is a truly great blog! You may include a reference to my blog where some of the dos and don’ts are explained with screenshots: Best Practices for Decision Modeling in SAP NetWeaver Decision Service Management

    I would like to add the following information:

    1. Expression Type XSLT has been retired recently. You can still use existing expressions, but you will not find the type in the menu anymore. It is just too technical, and we recommend migrating existing XSLT expressions to Call Procedure expressions.
    2. NW DSM adds a test tool and test variants to BRFplus for automated and easy testing of rules and decision services. You can easily copy test cases, and also rows in tables within test cases, etc.

    I did not understand the comment about performance and access level. Probably you mean design-time performance, such as field selection, popups, etc. If possible, please provide more information so that I can understand this better and improve the tool. Thanks.

    1. Jocelyn Dart Post author

      Hi Carsten – thanks so much for the positive feedback – that means a lot coming from you!

      I’ll add the reference in the main blog certainly.

      Appreciate the news on the XSLT expression.  I’m hoping to play more with the new testing tools on my next rules project.

      The Access Level I was referring to is that setting on the Application or rule object level that marks a rule object as accessible by the Application, Application Component, Superordinate Component, Top Component or Global. 

      We had some cases where we wanted to share a few decision tables across multiple BRFplus applications, so before we even arrived, a central Access Level “Global” BRFplus application had been set up for all the Decision Tables that were to be shared.

      Because we had to access those tables typically 10s, and potentially 100s, of times within a single call of our rule – which was in a separate BRFplus application – there was a definite performance cost in using the shared tables. We realized early that this might be the case, based on some indicative warnings about unwanted side effects in the SAP Library help here: http://help.sap.com/saphelp_nw73ehp1/helpdata/EN/32/6aba9c49fd41a5a14f710e121220f1/content.htm?frameset=/EN/36/fee147fb0a48a0a8dd9d461ac5d57c/frameset.htm .

      I’m assuming at least part of the performance cost was simply the loading of two separate underlying generated programs for the two BRFplus applications, but I’d be curious to know what other side effects we might expect, as there are always going to be cases where you do want to share decision tables and constants across applications.
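      The cost pattern described above – the same shared decision table consulted tens or hundreds of times within one rule invocation – is one that caching within a call can amortize. A minimal sketch of the idea in Python (the table contents, function names, and costs below are invented stand-ins, not BRFplus APIs):

```python
from functools import lru_cache

# Invented stand-in for a decision table living in a separate,
# "Global" access-level BRFplus application. In the real system the
# per-access cost would include the second application's generated program.
SHARED_DECISION_TABLE = {
    ("NSW", "PENSION"): 125.50,
    ("VIC", "PENSION"): 130.00,
}

CALL_COUNT = 0  # counts the expensive cross-application accesses

def lookup_shared_table(region: str, benefit: str) -> float:
    """Simulates the expensive cross-application table access."""
    global CALL_COUNT
    CALL_COUNT += 1  # each call pays the cross-application overhead
    return SHARED_DECISION_TABLE[(region, benefit)]

@lru_cache(maxsize=None)
def cached_lookup(region: str, benefit: str) -> float:
    """Same lookup, but each distinct key is fetched only once per run."""
    return lookup_shared_table(region, benefit)

# A rule that consults the shared table 100 times in a single invocation:
total = sum(cached_lookup("NSW", "PENSION") for _ in range(100))
print(CALL_COUNT)  # 1 -- the other 99 accesses hit the cache
```

      Whether BRFplus already performs comparable caching internally for cross-application accesses is exactly the kind of implementation detail the question above is asking about.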

      Rgds,

      Jocelyn

      1. Carsten Ziegler

        Jocelyn, I am having trouble understanding your observation concerning the performance. It does not fit with what I know about the implementation. I will call you to understand the case better.

  4. Kashif Bashir

    Thanks Jocelyn,

    That is one wonderful article. I have reviewed my objects keeping these suggestions in mind, and am glad I had already adopted a few of them.

    Wish I had read this article earlier 🙂 .

    Regards,

    Kashif

  5. Prasad Chougule

    Hi Jocelyn,

    Currently we are using the Custom storage type, and the client wants to use the Custom storage type for all the applications we have created.

    After importing the transport from the development system to the test system, the decision table data in the test client gets overwritten with the data from the source client.

    Going further, we will be using different environments (quality, integration, and production servers) which contain different data, and business users will change the data in their respective environments.

    This is a big concern I currently have on my project. Can we avoid this problem by deploying DSM? I have gone through all the DSM videos provided by Carsten.

    Any help on this subject will be appreciated. Please send any links related to this problem.

    Many thanks,

    Prasad.

    1. Jocelyn Dart Post author

      Hi Prasad, please post your question as a discussion in the Business Rules Management forum.  Then lots of us who are currently working in BRFplus can help you… and yes, I think DSM is what you are looking for… but if you want more detail, please post a discussion.

