Business has a voracious appetite for data change, and SAP systems are for the most part very accommodating when it comes to addressing this appetite. IT comes up with imaginative ways to help the business engage in mass change of data, sometimes with an intimate understanding of the business requirement and sometimes simply with a fix that explicitly addresses the requirements as spelled out by the business.

As any tenured employee who uses SAP will tell you, at least once in their working lives they have encountered bad data in their SAP system, either as a result of badly written mass create or change procedures or as a result of an omission or mistake they or someone else made. Sometimes the bad data comes about as a result of some upstream or downstream event, but more often than not, errors creep in because of a lack of understanding or diligence on the part of the data steward(s) or the person responsible for writing the script or report that will create or change the data.

In a highly nervous environment it is relatively easy to take away mass change transactions, but the reality is that simply removing these doesn’t make the data safer or its quality any better. Equally, you can remove the ability to execute things like GUI scripting against your SAP system, but this doesn’t address the potential for someone to create a BDC session or LSMW script that doesn’t require that method. You could argue that those are technical tools and that inexperienced people shouldn’t be using such things, but it could equally be argued that the most experienced people with the most intimate relationship with the data are often the people who own that data, and they often don’t have access to the tools to create BDCs and LSMW scripts anyway. So who is left? For the most part it comes back down to the technologists, and as is often the case, they are so thinly spread functionally that all they can do is respond directly to the requests of the business.

Effective mass create and change strategies, whether they are applied to transactional or master data, are typically bound up not just with technology but also with responsibility, accountability and process. This means that people engaged in these activities should first understand why they are performing an action, then understand the underlying business and technical dependencies of what they are doing, and finally understand the strengths and limitations of the technologies they are going to use to perform these actions. I can’t stress enough the importance of having a solid execution, backup and recovery, and risk mitigation plan, all of it dovetailed with a comprehensive testing plan. Unfortunately, there still seems to be a largely "fire and forget" or wing-and-a-prayer mode of thinking when it comes to the execution of mass change and create activities, and these are being applied in productive environments!

Winshuttle’s products, like those of other ECOHUB vendors, of course offer immensely powerful data change and create capabilities for the SAP ERP space, and thousands of users use this and other technologies every day to perform mass actions on their SAP environments. In some cases mistakes are made which are difficult to recover from. Optimistically, you could say that accelerated mass change and create lets you arrive at the conclusion that you made a mistake faster, which is an advantage over slow and laborious manual entry. But it is also a problem, because the accelerated way in which you perform these changes means that the bad data you create can become much more pervasive in the business process than if you had performed the action manually and checked the outcome at each step. Manual entry simply takes far too long to be a tenable approach, and so mass change tools seem an intuitively logical choice.

In my conversations with business and IT managers alike, I like to explore how they manage mass change and create and how they mitigate risk. Here are some of the ideas and thoughts that these folks have shared with me.

Functional Testing Rigor

Any well designed mass change or create approach will have been adequately tested in unit and QA. If you’re going to syndicate the data change method to business users, additionally engage them in user acceptance testing. Developing scripts for data creation and change should follow the same rigor and discipline as any software development activity that IT would apply. Be sure also to test system throughput by performance testing your planned changes; as described under Performance Assessment below, performance testing may reveal some interesting discoveries about your SAP environment.
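
One inexpensive piece of that discipline is validating the input file before a single record is touched. The sketch below is a minimal, hypothetical Python example: the file name, column names and rules are placeholders for whatever structure your own script or upload template actually uses.

```python
# Minimal sketch of a pre-load validation check for a mass-change input file.
# The file name, column names and rules are hypothetical examples only.
import csv

REQUIRED_COLUMNS = {"MATERIAL", "PLANT", "NEW_MRP_CONTROLLER"}

def validate_change_file(path):
    """Return a list of row-level problems; an empty list means the file passed."""
    problems = []
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.DictReader(f)
        missing = REQUIRED_COLUMNS - set(reader.fieldnames or [])
        if missing:
            return [f"missing columns: {sorted(missing)}"]
        for lineno, row in enumerate(reader, start=2):
            material = (row.get("MATERIAL") or "").strip()
            plant = (row.get("PLANT") or "").strip()
            mrp = (row.get("NEW_MRP_CONTROLLER") or "").strip()
            if not material or not plant:
                problems.append(f"line {lineno}: blank key field")
            if len(mrp) > 3:
                problems.append(f"line {lineno}: MRP controller longer than 3 characters")
    return problems

if __name__ == "__main__":
    for issue in validate_change_file("mrp_controller_changes.csv"):
        print(issue)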

Backups

It might seem obvious, but the reality is that you can often repurpose your change scripts to back your data up before you perform the change. This is extra work (double execution: one read, one change) but there isn’t really an alternative unless you are prepared to shut down all other data processing on your SAP system while you perform the change and verify the results. Some mass change techniques do make ‘copies’ or take snapshots of the data before the change; explore this as a mitigation strategy, but you shouldn’t be complacent about it.
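
If your change method already reads its selection from the system, the same read can produce the backup. The sketch below assumes the pyrfc library and the standard RFC_READ_TABLE function module are available; the connection details, the table (MARC) and the selection are illustrative placeholders, not a recommendation of a particular approach.

```python
# Hedged sketch: snapshot the "before" image of the rows a mass change will touch,
# using the standard RFC_READ_TABLE function module via the pyrfc library.
# Connection details, table and selection below are placeholders for illustration.
from datetime import datetime
from pyrfc import Connection

def snapshot_table(conn, table, where_clause, fields, out_prefix="backup"):
    """Write the current contents of the selected rows to a timestamped pipe-delimited file."""
    result = conn.call(
        "RFC_READ_TABLE",
        QUERY_TABLE=table,
        DELIMITER="|",
        FIELDS=[{"FIELDNAME": f} for f in fields],
        OPTIONS=[{"TEXT": where_clause}],
    )
    stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    path = f"{out_prefix}_{table}_{stamp}.txt"
    with open(path, "w", encoding="utf-8") as f:
        f.write("|".join(fields) + "\n")
        for row in result["DATA"]:
            f.write(row["WA"] + "\n")
    return path

if __name__ == "__main__":
    conn = Connection(ashost="sap-host", sysnr="00", client="100",
                      user="RFC_USER", passwd="secret")
    # Hypothetical example: back up MRP controller fields for plant 1000
    # before a mass MM17 change is executed against the same selection.
    print(snapshot_table(conn, "MARC", "WERKS EQ '1000'",
                         ["MATNR", "WERKS", "DISPO"]))
```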

Audit

External auditors often climb all over mass change transactions like MASS, MM17, CS20, CA85 and the like, and as a result these are often taken away from ordinary users. It is possible to keep track of who executed these and when, but this information is often buried deep in the bowels of your SAP system and accessible only to BASIS admins or users with access to certain SM transactions. This also assumes that you keep verbose enough logs for long enough to study these executions when you need to do postmortems. If mass changes to specific data are important enough that you need the history, then consider a mechanism that allows you to maintain a repository of change history over and above the change log attached to the data object itself. For LSMW and BDC sessions this means archiving off the run logs, the data file sources and copies of the scripts. For third-party products like those from Winshuttle that do SAP integration, this probably means considering enforced retention of completed run logs.
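
For home-grown runs, a retention step can be as simple as copying the run log, the source data file and the script into a dated archive folder together with a small manifest. The sketch below is a hypothetical illustration; the file names, fields and archive location are assumptions to adapt to your own landscape.

```python
# Hedged sketch of a simple retention step for mass-change run artefacts:
# copy the run log, the source data file and the script into a dated archive
# folder together with a small metadata record. Paths and fields are examples.
import hashlib
import json
import shutil
from datetime import datetime
from pathlib import Path

def archive_run(run_log, data_file, script_file, executed_by, archive_root="change_archive"):
    stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    target = Path(archive_root) / stamp
    target.mkdir(parents=True, exist_ok=True)
    manifest = {"executed_by": executed_by, "archived_at": stamp, "files": {}}
    for src in (run_log, data_file, script_file):
        dest = target / Path(src).name
        shutil.copy2(src, dest)
        # Record a checksum so the archived artefact can be shown to be unaltered later.
        manifest["files"][dest.name] = hashlib.sha256(dest.read_bytes()).hexdigest()
    (target / "manifest.json").write_text(json.dumps(manifest, indent=2))
    return target

if __name__ == "__main__":
    # Hypothetical file names; substitute the artefacts your own run produces.
    print(archive_run("run_20xx.log", "price_changes.csv", "price_change_recording.txt", "jdoe"))
```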

Performance Assessment

As any large scale SAP IT administrator will tell you, performing mass changes on your SAP systems can result in unexpected outcomes at the database level and in the application stack itself. Your SAP environment is likely sized for standard operations, and unfortunately the mass change of data probably didn’t come up as part of your SAP sizing exercise.

Performance testing provides some, but not necessarily the best, information; there are so many variables in how a given SAP installation is run that the best way to minimize the impact on a given system is through an appropriate and routine sizing process. Using the Quick Sizer tool, you can input expected transaction volumes and rates for most processes. The tool provides you with the “SAPS” value for whatever application you implement. You can then discuss this with your hardware partner to determine whether you will need additional CPU, memory and so on, or whether you have sufficient resources. Your assessment should be aggressive but realistic, and you should carefully evaluate whether your system should be sized for peaks or for steady state.

This assessment should also speak to the frequency with which you expect to perform these actions. The ability to process 25,000 incoming sales orders per day, for example, is not the same as the ability to reschedule 200,000 sales order lines in ten hours every six months. Your optimism that these numbers compare equally in your system may prove to be unfounded, but you will only know when you take the trouble to do a measured assessment. Other variables will creep in, such as whether you run these jobs during peak hours or off hours, against the central instance, against a particular application server or pool of servers, and whether you want to introduce some sort of delay or sleep function between record changes. Consider shorter but parallel bursts of mass actions, but not if you plan to use concurrent ranges of data, as you may encounter system lock conflicts. In one particular accounting mass change initiative that I was involved in, it was determined, for instance, that some additional database indices would massively improve system throughput, and so these were applied before the change was effected; additionally, database auditing was enabled, which resulted in massive volumes of undo logs that were difficult to manage. Then of course you may also want to consider whether there are sorely needed maintenance or patching activities that are long overdue or that should be accelerated.
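
A back-of-the-envelope calculation makes the point. Using the figures quoted above, plus an assumed ten-hour online working day (my assumption, purely for illustration), the rescheduling burst asks the system to sustain roughly eight times its normal posting rate, and even a modest pause between records eats a large slice of the window:

```python
# Rough throughput comparison using the figures quoted in the text.
# The ten-hour working window and the pause/parallelism numbers are assumptions.
daily_orders = 25_000
entry_hours_per_day = 10                            # assumed online working window
steady_rate = daily_orders / entry_hours_per_day    # about 2,500 postings per hour

burst_lines = 200_000
burst_window_hours = 10
burst_rate = burst_lines / burst_window_hours       # 20,000 postings per hour

print(f"steady state: {steady_rate:,.0f}/h, burst: {burst_rate:,.0f}/h "
      f"({burst_rate / steady_rate:.0f}x the normal load)")

# If the burst is split across 4 parallel jobs with a 0.5 s pause between records,
# each job still handles 50,000 records, and the pauses alone consume hours:
per_job = burst_lines / 4
pause_hours = per_job * 0.5 / 3600
print(f"pauses alone cost each job about {pause_hours:.1f} hours of the window")
```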

Activity Approval

It has to be assumed that if you have gone to the trouble of giving a user access to your SAP system, you trust them to some extent with the use of the system. An additional layer of governance is, however, always something you could consider, especially when dealing with operators who are perhaps less experienced, or when trying to control data that is particularly sensitive, volatile or pivotal to operational effectiveness. The classic control mechanism is for users to make requests to IT and produce the data files based on some IT-provided structure or template. IT or super users then typically execute scripts against that data. While this approach may work well in some environments, it is often unsustainable when you try to scale.

I have seen customers use helpdesk ticketing systems to initiate these types of requests, with IT assigning the resource; often the task goes to the same person every time, an expensive ABAP resource who perhaps created the original recording or script. When that individual is on vacation or out of office the task is deferred, and eventually a backlog of requests results in a concession being made, and someone else (perhaps less technical) becomes the gatekeeper of the process. The approval of the action, however, is not so easily attached to the task, and over time responsibility and accountability are obscured by the need for ease of execution. In the absence of such delegation, the business side perhaps has some awareness of the action to be executed, but for the most part the contract is between the requesting individual and the IT worker who will run the script.

Unless approval of the data is intrinsically embedded, as workflow, in the act of consuming the IT data template, this lack of organizational approval is a very difficult obstacle to remedy if you want control and transparency of purpose going forward. Some of the ways this has been dealt with, operationally at least, are SharePoint workflows wrapped around data documents, or email attachments that must be forwarded by specific individuals, but all of these are poor cousins compared with approaches that rely on a data approval workflow embedded in the end-to-end process. These other approaches tend to be cumbersome, and users often try to short-circuit them. There has to be a better way, and often this better way is to choose a product that has workflow as part of the overall submit, approve and execute process for the data document.