At the recent SapphireNow conference, there was heated discussion about In-Memory technology for BI analytics (for example, the SAP Business Analytic Engine) and some discussion about the importance of using such innovations for transactions.
At a SAP Mentor/Blogger meeting at Sapphire with Ingo Brenckmann (Senior Director of the SAP Solution Management team) about In-Memory Computing Business Benefits and the Road Ahead, I was intrigued by what has been accomplished in this area – especially regarding Business ByDesign and the role of TREX in this environment. I admit I was envious of the impact of this technology in that product line – however, I didn’t see anything similar regarding processes. I kept looking for – perhaps expecting – more description from SAP regarding the possible uses of this technology in BPM environments, but I didn’t find anything that really met my needs, so I decided to explore the various options on my own.
How InMemory technology will impact Process Environments: Indirect vs. Direct
If you look at how InMemory technology might influence processes, the first split is between a direct and an indirect influence. This distinction refers to whether the InMemory technology is integrated tightly with process environments (for example, changes in BPM runtimes) or whether it is used in ways that complement existing BPM environments. Since the rest of this blog examines the direct influence of InMemory technology on processes, I’d like to first examine the indirect influence.
Indirect Impact of InMemory Technology
If you look at the usual relationship between analytics and process environments, BI data is often used to assist users when they are working on a particular task. Charts and diagrams help illustrate certain patterns and assist in making a decision. Thus, one indirect benefit would be the use of InMemory-enhanced analytics in existing BPM User Interfaces (UIs).
Of course, such graphic representations of process-related data are present in existing environments where InMemory technology isn’t available, but the advantage of InMemory technology is the amazing speed at which users can access / manipulate this data. Often, such graphs and charts are static rather than dynamic. Ideally, InMemory technology would enable users to drill down into greater amounts of data in a more interactive / dynamic fashion.
Note: Based on the current status of SAP’s InMemory portfolio, this might be implementable today given the availability of certain BI-related technologies (for example, the BI Accelerator and perhaps the BusinessObjects Explorer).
Besides embedded process support, there are other interesting uses of InMemory BI analysis. Another possibility would be the use of this enhanced BI analysis on process-related metrics. A recent blog, Custom KPI measurement solution for BPM, described a NetWeaver BPM dashboard, and a few years back I wrote Guided Procedures Explorations: Process Runtime Dashboard about a similar dashboard based on Guided Procedures. If you are interested in continuous process improvement, such dashboards are critical. InMemory technology could be used to create more detailed process metrics dashboards that would allow managers to discover process-related problems more rapidly.
Another indirect influence would arise if the business objects that underlie the majority of enterprise processes were based on InMemory technology.
Such a change probably wouldn’t have that great an impact on processes, because end users have only indirect access to such business objects via standard APIs, etc. Thus, users might notice improved response times when accessing such BO-based data, but fundamental changes would not be expected.
Direct Impact of InMemory Technology
Now, let’s take a look at some use cases where this revolutionary technology could have a direct influence on process environments.
Before we continue, I have to make a distinction between process runtime and designtime environments. “Runtime” refers to those environments where process instances are created, administered and monitored. “Designtime” refers to those environments in which process design takes place.
The Impact on Runtime Environments
Abstract process designs / structures and the data of process instances must be stored in some form. Another opportunity would arise if the BPM environment itself were based on InMemory storage – that is, if the underlying data storage of the BPM environment were moved to InMemory.
Besides the performance boost that might occur, more intriguing are the new possibilities of interaction between the process-related objects (the structure of the process and data from the process instances) and the data stored in the business objects. At the current time, this interaction takes place via “standardized” interfaces (usually web services) – the interaction is indirect. What would happen if process instances had direct interaction with business objects at a far deeper level?
The resulting potential is evident in a process metrics dashboard (as mentioned above) based on a common InMemory storage of processes and business objects, which would enable analysis of how a particular process directly impacts the underlying business objects. For example, you might explore how process KPIs (for example, the time to complete a particular process task) impact business-object-related KPIs (manufacturing delays, etc.). Another possibility might allow conditions in the underlying business objects to dynamically affect the basic structure of the related processes.
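To make this kind of cross-analysis concrete, here is a minimal sketch (all figures, KPI names and order IDs are invented for illustration) of correlating a process KPI with a business-object KPI – the kind of computation a common InMemory store would let you run interactively over millions of instances rather than a handful:

```python
import statistics

# Hypothetical extracts from a shared in-memory store: per-order completion
# time of an approval task (hours) and the resulting manufacturing delay
# (days) for the same order.
task_hours = {"O1": 2.0, "O2": 8.5, "O3": 1.5, "O4": 12.0, "O5": 4.0}
delay_days = {"O1": 0.0, "O2": 3.0, "O3": 0.5, "O4": 5.0, "O5": 1.0}

def pearson(xs, ys):
    """Pearson correlation between two equal-length sequences."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

orders = sorted(task_hours)
r = pearson([task_hours[o] for o in orders], [delay_days[o] for o in orders])
print(f"correlation between task duration and delay: {r:.2f}")
```

A dashboard built on such a computation could flag, in real time, which process tasks are actually driving the business-side delays.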
Imagine if social network-based data associated with a particular brand were stored in an InMemory-enhanced business object. Forrester analyst Clay Richardson points to this possibility regarding the influence of social networks on processes – the direct influence of such buzz on the process instance itself.
Many speculate that social BPM will have the greatest impact at runtime. I refer to this as “runtime process guidance,” and we are starting to see really good examples of this emerge for customer service/customer experience processes – where processes use social analysis to determine “the next best action.”
Note: Some might say that such possibilities also exist without InMemory technology. The problem is that without such technology, the processing speed needed to analyze such masses of data is so slow that interesting scenarios aren’t really practical for end users.
The importance of context
Users are becoming increasingly critical of process environments and the associated expectations regarding such systems are also growing. Users want processes that respond to their individual context.
A recent blog from Greg Chase, Why Business Rules Are Important in Real-World BPM, stresses the importance of adding user-specific intelligence to process environments to improve user adoption.
If you make applications smarter so they pre-fill related data and tailor themselves to the specific context of the user and the process instance, you’ll make it much easier for casual business users to engage with a process. This is instead of creating a process that requires more power users to handle overly complicated data entry tasks.
As seen in The Next Step in SAP Business Process Optimization – Mobility from Kevin Benedict, user expectations regarding mobility are also changing: processes must take a user’s location into account.
Decision makers are not stationary. They are decision makers because of their experience and value to the company. They are mobile. If all of these optimized business processes assume a stationary decision maker, they fail to recognize reality. All business processes and IT solutions today must assume that the key human players in a business process are mobile. Decisions must be able to be made in mobile environments.
Thus, a user’s context (his location, what projects he is currently involved in, what customers he supports, etc.) is critical for user acceptance of process environments.
Such personalized processes, however, require a fundamental change in how such runtime environments function. InMemory technology would be ideal to deal with the immense data storage and fast processing speeds necessary to implement such developments.
This information could be stored in business objects that are based in InMemory storage.
I’ll discuss the implications of the inclusion of context on process design later in this blog.
I’ve always thought exceptions were one of the most intriguing parts of BPM. Their very existence represents a threat to the necessity of structure and order that is fundamental to the concept of the process as it exists in the modern enterprise. This diversity reflects the complex context in which a particular process instance exists. In the previous discussion, we examined the importance of personal context. Context, however, can also refer to the business objects involved (a customer, an order, etc.). A distinct context, based on the complexity of a momentary snapshot of a situation, wasn’t expected when the process was designed, and thus an exception occurs in the runtime environment.
Perhaps it is the difficulty of dealing with such complexity that leads most companies to deal with such problems manually (as Greg Chase described recently in SapphireNow Day 1: BPM Communities of Pod People Straying Off the Happy Path in an Agile, Sporty way).
She provided a unique explanation about how BPM is handy for handling exceptions to core processes. As Suja puts it, “The ‘Happy Path’ is the well tested path.” – such as the core process provided by SAP BusinessSuite.
Extending on Suja’s comment above, you have to consider how well your company handles cases where a request or task falls out of the “happy path” and into manual exception handling. Dealing with these kinds of exceptions, sources of inconsistent interaction with customers and suppliers, are very costly in terms of manual labor, and can seriously damage customer relationships.
Exceptions are often viewed with a malice that borders on pure teeth-gritting hatred. The job of the process designer is to exterminate these pests that blemish the purity of the process. However, as Peter Evans-Greenwood comments in a blog from James Taylor, these exceptions often represent the differentiator for an enterprise.
Much more interesting is the exception rich processes which we can’t nail down. We spend our time mapping our decision trees and analysing business processes to try and find a way to stabilise and optimise the business process. We might even apply the bag of tricks we’ve learn’t from Six Sigma and LEAN.
It’s the wrong solution to the right problem. Our highly valued tools for process optimisation work by minimising and managing the variation in business processes. Reducing variation enables us to increase velocity, automate via BPM, and thereby minimise cost. But it is this processes variation, the business exceptions, which can have a disproportionate effect on creating value. In a world where we’re all good, it’s the ability to be original that enables you to stand out.
Rather than fighting against exceptions, the idea is to take advantage of them.
This requirement, however, necessitates a fundamental change in how processes are designed.
The Impact on Designtime Environments
It is difficult to imagine the InMemory-related changes being restricted to process runtimes. Without corresponding changes in designtime environments, the full potential of this technology cannot be exploited. Once the underlying data from process instances is stored in a columnar format, how will process design evolve?
As I described above, the particular path a process instance follows will be based on a wide variety of factors / the distinct context of those involved. By itself, this change leads to an amazing and heart-stopping increase in complexity. How can you design a process to reflect all possible paths? If you depicted all these possibilities, you would have a process design so complex that process maintenance would be impossible and performance would in all likelihood be horrible.
Currently, a certain degree of process flexibility is provided via business rules. The idea would be to simplify processes and externalize the context information. The use of InMemory technology would allow for more complex rules and faster application of such rules to process steps.
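As a toy illustration of simplifying the process and externalizing context into rules (the rule conditions and step names below are purely hypothetical), the process itself stays small while context-dependent routing lives in a rule table that can be evaluated quickly against each instance:

```python
# Each rule: (condition on the instance context, resulting next step).
# Rules are data, so they can be changed without redesigning the process.
RULES = [
    (lambda ctx: ctx["amount"] > 10_000, "manager_approval"),
    (lambda ctx: ctx["customer_tier"] == "gold", "fast_track"),
    (lambda ctx: True, "standard_processing"),  # default path
]

def next_step(ctx):
    """Return the step named by the first rule whose condition matches."""
    for condition, step in RULES:
        if condition(ctx):
            return step

print(next_step({"amount": 15_000, "customer_tier": "silver"}))  # manager_approval
print(next_step({"amount": 500, "customer_tier": "gold"}))       # fast_track
```

The process model needs only one “determine next step” activity; everything context-specific is in the rule table, which is exactly the kind of lookup an InMemory engine could evaluate over far richer context data.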
As James Taylor describes in a blog, business rules are especially useful for certain types of processes.
Core processes, however, are much more stable. Everyone knows the paths that work through the process, the activities involved are well defined. Changes to these processes are a big deal, disruptive to the company regardless of how they are implemented. In these processes what changes are the decisions and the business rules behind those decisions – what makes a customer eligible for this level in the loyalty program, what price is this policy for this customer, what’s the best retention offer to make. These decision changes can be mistaken for a process change if the decision has not been broken out but they are not process changes – the activities, their sequence and their purpose all remain the same. The decision-making behavior of a specific activity is what changes.
However, as Peter Evans-Greenwood comments on the same blog, another approach may be necessary to deal with those Edge processes – remember the ones mentioned above with all those pesky exceptions.
An alternative approach is to embrace this variation. Simplify the processes until it is stable, reducing it to its essential core. Treat exceptions as alternate scenarios, compiling the set of commonplaces required to support the vast majority of exceptions. We can then use a backwards chaining rule to bind process instances to the appropriate commonplace in a variation of Jim Sinur‘s “simple processes, complex rules” approach.
This approach reduces the complexity of an ever changing process by transforming change into the evolution of an appropriate suite of commonplaces, and the goal-directed rules used to bind them to process instances.
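A much-simplified sketch of this binding idea (a real implementation would use a goal-directed, backward-chaining rules engine; the commonplace names and condition facts below are invented): each commonplace declares the conditions under which it applies, and a deviating instance is bound to the first commonplace whose conditions its facts satisfy.

```python
# Hypothetical "commonplaces": reusable exception-handling scenarios,
# each with the conditions under which it applies.
COMMONPLACES = {
    "late_payment":   {"invoice_overdue": True},
    "damaged_goods":  {"return_requested": True, "damage_reported": True},
    "address_change": {"shipping_failed": True},
}

def bind_commonplace(instance):
    """Bind a deviating process instance to the first commonplace whose
    conditions are all satisfied by the instance's facts."""
    for name, conditions in COMMONPLACES.items():
        if all(instance.get(k) == v for k, v in conditions.items()):
            return name
    return None  # genuinely novel exception: escalate to a human

inst = {"return_requested": True, "damage_reported": True, "order_id": "O42"}
print(bind_commonplace(inst))  # damaged_goods
```

Evolving the process then means evolving the commonplace catalog and its binding rules, not redrawing an ever-larger process diagram.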
Some of you may now be thinking: “STOP. Dude, this blog started out talking about InMemory technology and now we are talking about business rules and commonplaces. I don’t see the connection – you’ve lost me.” For me, InMemory technology represents the ability to analyze huge chunks of data at a speed that was not previously possible. I’m throwing the InMemory stone into the BPM pond and following the waves as they expand / grow: you look at the impact of InMemory technology on one part of the BPM pond and see the impact that a change in one area has on its counterparts. I agree it is impossible to design a process that takes into account every possible location of a user – regardless of whether you use business rules or not – my intention is to propose methods that would enable designers to start to take advantage of this technology. I found Peter’s comment fascinating, and I looked for a technical foundation with which to implement it. InMemory technology isn’t a panacea, but it is a foundation on which solutions may be built.
If business rules and other standard tools in existing process environments are inadequate to deal with the potential of InMemory technology, then perhaps even more radical / fundamental changes are necessary. In a recent blog, Dennis Moore describes one such potential shift – towards a focus on events in process environments.
If HassoDB understands that an object is being stored, updated, or accessed, HassoDB could publish an event – and that event could be consumed by new applications that speed up integration between business processes, allow the insertion of new business processes, or that simply generate alerts for users.
How could this capability be deployed? Well, imagine that a sales person gets an alert every time their customer makes a payment, is late with a payment, submits a complaint or service request, or places an order on-line. Or that a salesperson sets up an “auto-responder” for those events, thanking the customer or asking her for feedback as appropriate. Event-based capabilities would greatly speed up and improve service.
Another example could be in integrating business processes. Rather than hard-coding the “on-boarding” process for a new employee, there could be an event-driven integration. The hiring process could generate an event when an employee’s starting date is set; other processes could subscribe to that event, and do the appropriate processing, including reserving an office, preparing the HR orientation, ordering a company credit card, requesting an entry badge, or assigning and configuring a computer. Whenever the on-boarding process changes, rather than editing the process definition, taking the application down in the process, and restarting it, instead an administrator would just load a new action and subscribe it to the appropriate event.
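Dennis Moore’s on-boarding example can be sketched as a tiny publish/subscribe mechanism (the event name, payload fields and actions here are invented for illustration):

```python
from collections import defaultdict

# Minimal in-process publish/subscribe: handlers can be added or removed at
# runtime, so new on-boarding actions need no change to the hiring process.
subscribers = defaultdict(list)
log = []  # collects the actions triggered, for illustration

def subscribe(event, handler):
    """Register a handler for an event name."""
    subscribers[event].append(handler)

def publish(event, payload):
    """Deliver the payload to every handler subscribed to the event."""
    for handler in subscribers[event]:
        handler(payload)

subscribe("employee.start_date_set", lambda e: log.append(f"reserve office for {e['name']}"))
subscribe("employee.start_date_set", lambda e: log.append(f"order entry badge for {e['name']}"))

# The hiring process only publishes the event; it knows nothing about
# which on-boarding actions are currently subscribed.
publish("employee.start_date_set", {"name": "A. Example", "date": "2011-01-01"})
print(log)
```

Adding a new on-boarding step is then just another `subscribe` call; the hiring process itself is never edited or restarted.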
But there are also other potential design-time-related changes that aren’t as revolutionary:
- One immediate opportunity would be an analysis of the various patterns in process instances that have already finished – which parts of a particular process design are used most frequently, which paths are used infrequently, etc. This information could be used to enhance the design environment so that the presentation of the design elements reflects their actual usage: for example, a particular process path could be drawn in a different color or line thickness based on its degree of utilization. This could occur in real time, based on actual usage.
- Process simulation based on InMemory technology. In a podcast with Hasso Plattner, the Chairman of the Supervisory Board at SAP describes simulation as one of the new advantages of using InMemory technology for enterprise resource planning. Similar functionality might also be possible in BPM design environments, where designers could simulate various possible process paths before moving to a runtime environment.
- I liked the ability to discover relevant participants for projects that is demonstrated in SAP’s project Elements, where an analysis of existing social networks, mail and other sources helps users discover others who might be able to provide useful information or who are ideal candidates for collaboration.
It would be great to have something similar in process design / Social BPM environments, where data from process instances as well as other sources (corporate social networks) is stored in InMemory and used to select the individuals who are the best candidates to collaborate on process design.
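The pattern analysis in the first item above can be sketched in a few lines (the instance logs are invented): count how often each transition occurs across finished instances, then feed the counts into the diagram rendering as color or line thickness.

```python
from collections import Counter

# Hypothetical execution logs: each finished instance's sequence of steps.
finished_instances = [
    ["start", "check", "approve", "end"],
    ["start", "check", "approve", "end"],
    ["start", "check", "reject", "end"],
    ["start", "check", "approve", "end"],
]

# Count each transition (edge) across all instances; a design tool could
# map these counts to line thickness or color in the process diagram.
edge_counts = Counter(
    (a, b)
    for steps in finished_instances
    for a, b in zip(steps, steps[1:])
)

for (a, b), n in edge_counts.most_common():
    print(f"{a} -> {b}: used {n} times")
```

On InMemory storage, the same aggregation could run continuously over live instance data, so the designer’s view of “hot” and “cold” paths would always be current.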
The usage of InMemory technology in other environments (BI, etc.) is usually focused on speed. In both runtime and designtime process environments, however, the main benefit of InMemory technology involves increased flexibility as well as the ability to better respond to the particular context in which a process takes place. The result is a fundamental change in the nature of the process – a challenge to the assumption that its structure is fixed across all process participants and situations.