This is Part 8 of the Global Bicycle Inc. (GBI 2.0) story, as related by an intern. The full series begins with Global Bicycle Inc.: An Intern Adventure.

Expansion of GBI Processes

It was at this point that we began to expand the business processes GBI would encompass. We began research and configuration on processes such as Warehouse Management, Enterprise Asset Management, Customer Service Management, Asset Accounting, and Advanced Controlling. Some of these were already in the research phase, as they were going to be used in GVSU’s training environment; we could leverage what had already been accomplished and merely had to update the information to conform to GBI. One such process was Warehouse Management. The configuration documentation already existed, and it took me just one eight-hour day to update the documentation, configure the process, and test it.


Well, mostly. The first time I configured Warehouse Management, I misunderstood the scope: GBI was to have Warehouse Management in only one plant, but for some reason, which I’m still not really sure of, I was under the impression that it was to be in all three. So I configured it in all three plants. Big mistake. Having Warehouse Management in all three plants would fundamentally change almost all of the processes we had built to date. To fix it, I tried to un-configure Warehouse Management in two of the plants, working backwards through the configuration document and undoing everything I had done for them. I was successful with all but two of the steps. It was at this point that I re-learned something I had forgotten: once master data is entered into the system, there is no easy way to get rid of it. After configuring Warehouse Management in all three plants, I had gone to test it and created some master data to do so. When I realized my mistake and went to un-configure the extra plants, that master data was still out there. The master data was overlaying the configuration, so to speak: I could not remove the configuration without first removing the master data, and I could not figure out how to remove the master data. Thus the dilemma.


We knew that Warehouse Management would work, and we knew how to configure it; we just had to wait for a new client in which to configure it, since I had messed up the configuration in the current one. The lesson: make sure you know what you’re configuring before you configure it!


Other processes, like Enterprise Asset Management (sole credit for research and configuration on this process goes to Kevin Coolman), had not been researched at all. We went to the office, took down the SAP Press book on EAM, opened up Google, and got to work. Documents were created, scratch paper was utilized, we explained the process to each other and mapped it out, and finally we went into the system, configured it, and worked through the entire error-message-resolution process. The end result, with a bit of skill and a dose of luck, was a working process. These efforts all resulted in configuration documents and master data spreadsheets that would be sent to Germany and included in future client rollouts.


It should be noted here that our method of research and configuration was vastly different from industry practice. Instead of conducting workshops, mapping things out, and having a solid plan, we generally went at it in an ad hoc manner. We found the major steps of the process (usually relating to specific transaction codes), configured it how we thought best, then tried to work through the successive transaction codes. When an error was reached, we researched how to fix it, performed the necessary configuration, documented it, and repeated the process until we could successfully navigate through all of the transaction codes.
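That configure-test-fix loop can be sketched in miniature. This is a toy Python model, not SAP code: the transaction codes and configuration keys below are illustrative stand-ins, and the "research" step is reduced to adding whatever entry the error names.

```python
# Toy model of the ad hoc loop: run a transaction, hit an error,
# "research" and add the missing configuration, document the fix, retry.

REQUIRED = {  # hypothetical mapping: transaction code -> config entries it needs
    "LT01": {"warehouse_number"},
    "LT06": {"warehouse_number", "storage_type"},
    "LT10": {"warehouse_number", "storage_type", "movement_type"},
}

def run_transaction(tcode, config):
    """Return the first missing configuration entry, or None on success."""
    missing = sorted(REQUIRED[tcode] - config.keys())
    return missing[0] if missing else None

def configure_until_clean(tcodes):
    config, documentation = {}, []
    for tcode in tcodes:
        # Keep fixing and retrying until this transaction runs clean.
        while (error := run_transaction(tcode, config)) is not None:
            config[error] = f"value-for-{error}"          # perform the configuration
            documentation.append(f"{tcode}: added {error}")  # document the fix
    return config, documentation

config, docs = configure_until_clean(["LT01", "LT06", "LT10"])
```

The point of the sketch is the shape of the work: the documentation grows one fix at a time, and a later transaction only succeeds because earlier errors forced the configuration into place.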


While we were researching new processes, we would intermittently receive a new client from Germany. At the beginning these clients had no master data in them, as Germany had not yet created the LSMW files to mass-upload this data, so we had to manually enter the data before we could begin testing. Additionally, with each new client we received access to, there were more processes we had to test. We began to learn the business processes very well. With each client, we would test and check for errors; when we found any, we would find the solutions and compile them all in a new configuration document that would be sent to Germany. Sometimes it was something simple they had missed, so we could go back to our old documentation and copy-and-paste it into the new documentation; at other times it was a new problem we had to solve. Through it all, the documentation continued to grow.


While it was irritating at the time to see the same item continually missed in the final configuration, looking back it is actually quite remarkable how few items were missed, and it is a testament to the hard work of those in Germany. If the roles of Germany and GVSU had been reversed, we would have missed configuring some items as well. Not because we lacked attention to detail or any such thing, but simply because of how much documentation was being sent. You might think, “Oh, it’s not too bad, it’s all right there.” Now look at it from the perspective of someone who has never seen it before, and suddenly you have several hundred pages to go through. Add to that the fact that English might be your second language, and a real challenge is presented. Hats off to the wonderful job that was accomplished.

4 Comments

  1. Tammy Powlas
    “once master data is entered into the system, there is no easy way to get rid of it. “

    So true…and this has been a great series.

    1. Siva Vara Prasad Maranani
The simplicity with which the complexity behind the process is presented is amazing. You made it look as though it was a day’s job.

      This has been a great blog series indeed..!!

      Siva Maranani
