In the last installment of this blog, The Diachronic Metadata Repository as a Tool for Risk Reduction via Conflict-Prevention During Legacy-to-ERP Conversions (Part 8), I presented Figures 14 and 15 to show diagrammatically how and why a Diachronic Metadata Repository (DMDR) facilitates data mapping in a Legacy -> ERP conversion (LERCON) by auto-computing accurate solution sets from which final data mappings will be selected for initial data migration, permanent external interfaces, and temporary internal interfaces.

Figure 14

Figure 15

Although Figure 14 does present an over-simplified case in which one AsIs Legacy Business Function has been mapped to one ToBe ERP Business Function, it nonetheless makes the basic point that needs to be made. If:

a) AsIs Legacy Business Functions have been horizontally mapped to some ToBe ERP Business Functions TBF1;

b) AsIs Legacy Business Functions have been vertically mapped to AsIs Legacy IT Processes;

c) ToBe ERP Business Functions have also been vertically mapped to ToBe IT Processes (hopefully in the ERP software vendor’s own metadata repository);

then a DMDR can use the horizontal mapping in (a) and the two vertical mappings in (b-c) to compute the “solution sets” from which analysts can select the final data mappings needed to support the mapping of each AsIs Legacy Business Function to its corresponding ToBe ERP Business Functions.

As shown in Figure 15, moreover, this iterative (function-by-function) approach to data mapping differs radically from traditional non-iterative approaches, which simply try to map AsIs Legacy data elements onto ToBe ERP data elements using nothing more than the logical definitions of these elements found in the AsIs and ToBe data dictionaries. And for this reason, most traditional business analysts, business process re-engineers, and business process experts will scoff at the proposed iterative approach on the grounds that it is a waste of time.

They will scoff because this approach insists that each AsIs data element and each ToBe data element must be revisited multiple times: each AsIs data element must be revisited in the context of each AsIs Business Function which it supports, and each ToBe data element must be revisited in the context of each ToBe Business Function which it supports. To most traditionalists, this iterative revisitation of data elements will seem to be a waste of time.

But in fact, nothing could be further from the truth. The iterative function-by-function approach to data mapping is actually the only way to guarantee that nothing important has been overlooked during the data mapping process of a LERCON.
Or, to put this point another way: if you don’t know how any given AsIs data element is used in each AsIs Business Function which it supports, and how any given ToBe data element is used in each ToBe Business Function which it supports, how can you possibly pretend to know how to map AsIs to ToBe data elements effectively and accurately?

But that’s OK – just keep on doing it the way you’ve always done it. You know the drill: i) find an AsIs domain expert and try to get a day of his or her time; ii) find the corresponding ToBe domain expert and try to get the same day of his or her time; iii) have them cross-walk just “their” portions of the AsIs and ToBe data dictionaries together, e.g. if they’re vendor experts, have them walk just the vendor portions of the data dictionaries; if they’re parts experts, have them walk just the parts portions, etc. Yeah – we all know how very effective (i-iii) can be as the basic protocol for LERCON data mappings.
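
To make the “solution set” computation described above a bit more concrete, here is a minimal sketch in Python. Everything in it is hypothetical: the mapping structures, the function and process names, and the sample legacy and ERP data elements are invented for illustration only, not taken from any real DMDR or vendor repository. The point is simply to show how the horizontal mapping (a) and the two vertical mappings (b) and (c) can be joined mechanically to produce, for each AsIs Business Function, the candidate AsIs and ToBe data elements from which analysts would then select the final mappings.

# Illustrative sketch of how a DMDR might compute data-mapping "solution sets"
# from its stored mappings. All names and sample entries below are hypothetical.

# (a) Horizontal mapping: AsIs Legacy Business Function -> ToBe ERP Business Functions
horizontal = {"ABF1": ["TBF1"]}

# (b) Vertical mappings on the AsIs side:
#     Business Function -> IT Processes, and IT Process -> data elements it uses
asis_processes = {"ABF1": ["AP1", "AP2"]}
asis_elements = {"AP1": ["CUST_NO", "CUST_NAME"], "AP2": ["ORDER_NO", "ORDER_DATE"]}

# (c) The same two vertical mappings on the ToBe side
tobe_processes = {"TBF1": ["TP1"]}
tobe_elements = {"TP1": ["KUNNR", "NAME1", "VBELN", "AUDAT"]}

def solution_sets():
    """For each AsIs Business Function, collect the AsIs data elements it uses
    (via its IT Processes) and the ToBe data elements used by the ToBe Business
    Functions it maps to. Each AsIs/ToBe pair of candidate sets is the
    'solution set' an analyst narrows down to the final data mappings."""
    result = {}
    for abf, tbfs in horizontal.items():
        asis = {e for p in asis_processes.get(abf, []) for e in asis_elements.get(p, [])}
        tobe = {e for t in tbfs for p in tobe_processes.get(t, []) for e in tobe_elements.get(p, [])}
        result[abf] = {"asis_candidates": sorted(asis), "tobe_candidates": sorted(tobe)}
    return result

print(solution_sets())
# {'ABF1': {'asis_candidates': ['CUST_NAME', 'CUST_NO', 'ORDER_DATE', 'ORDER_NO'],
#           'tobe_candidates': ['AUDAT', 'KUNNR', 'NAME1', 'VBELN']}}

In a real LERCON the solution sets would of course be far larger and noisier, which is exactly why the iterative, function-by-function review of each candidate pair matters.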


6 Comments


    1. David Halitsky
      Hi Anton –

      Thanks for taking the time to respond to the question. 

      If you’re correct, then more’s the pity. 

      Because one sure way to spur development of useful and meaningful ES(O)A services (as opposed to what I call “bells and whistles” services) is for an organization to examine carefully how and why its various logical and physical data elements are being used by its various Business Functions. 

      And although this examination should of course take place well in advance of any attempt to do a Legacy->ERP conversion, one sure way to ensure that it actually happens is for an organization to engage in the process I call “iterative” or “function-by-function” data mapping during a LERCON itself.

      Better late than never.

      More to say on this in Part 10, out in a moment.

      1. Anton Wenzelhuemer
        David,

        I have to admit that I can’t get the essence of your lengthy excerpt (though I think I am one of the few still trying to follow it).

        What I don’t understand at all is why you seem to use ‘ES(O)A’ as a synonym for ERP. I can’t see what LERCON is supposed to have to do with SOAs in the first place. Sounds like a marketing manoeuvre to use a buzz-word to spice up a rather dry contemplation.
        I could imagine thinking about LESACON (LEgacy-to-Service-oriented-Architecture CONversion), or, as a slogan, “Smashing monolithic vendor/technology/process-lock-in solutions in favor of powerful, sustainable, distributed, service-oriented architectures”; or something like that.

        Second thing I don’t understand at all is the fact that I strongly believe that you don’t have the necessary degrees of freedom to map data and functions optimally, considering differing encapsulations, concepts, semantics, … and limited sets of (standard) source and target functions. Or, more simply, aren’t we already happy if we manage to successfully do ANY LERCON (variant) at all?

        Well, it’s yours to design your blog series. Maybe you could give short intermediate summaries in order to make it easier for us poor readers to follow.

        anton

        1. David Halitsky
          Furthermore, I suspect that if you really didn’t get the “essence”, you would not have come up with this very insightful comment:

          “Second thing I don’t understand at all is the fact that I strongly believe that you don’t have the necessary degrees of freedom to map data and functions optimally, considering differing encapsulations, concepts, semantics, … and limited sets of (standard) source and target functions.”

          As for ES(O)A being synonymous with ERP, I’m not sure I see where I’ve suggested this. What I have suggested (as I say explicitly in Part 10) is that SAP really has to try and resolve a certain corporate schizophrenia which results from the fact that its “blueprinting” process is essentially transactional and pre-ES(O)A. And from this observation, it should be clear that I don’t equate ES(O)A and ERP at all; in fact, I would consider “ERP” itself to be a pre-ES(O)A, transactional concept.

        2. David Halitsky
          In the hands of a real Perl expert (or a real expert in any scripting language supporting regular expressions), everything about a system S1 can be successfully mapped to everything about a system S2, assuming the ontologies of S1 and S2 are sufficiently well-defined.  My only real beef with SAP is that it defines its ontology extremely well in its own internal metadata repositories, but makes it overly difficult for “outsiders” to discover its ontology.  If SAP is serious about ES(O)A, it should try to rectify this problem.
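
          (To illustrate the kind of mapping meant here, a minimal hypothetical sketch in Python rather than Perl: the field names, patterns, and the two toy “ontologies” below are invented for illustration only. The point is simply that once both sides’ definitions are well-defined, a script can propose candidate mappings mechanically.)

import re

# Hypothetical, simplified "ontologies": field name -> logical meaning.
s1 = {"CUST_NO": "customer number", "CUST_NAME": "customer name", "ORDER_NO": "order number"}
s2 = {"KUNNR": "customer number", "NAME1": "customer name", "VBELN": "sales order number"}

# Normalize the logical definitions (here: strip qualifiers like "sales")
# and propose S1 -> S2 candidates whose normalized meanings match.
def normalize(meaning):
    return set(re.sub(r"\b(sales|legacy)\b", "", meaning).split())

candidates = {
    f1: [f2 for f2, m2 in s2.items() if normalize(m2) >= normalize(m1)]
    for f1, m1 in s1.items()
}
print(candidates)
# {'CUST_NO': ['KUNNR'], 'CUST_NAME': ['NAME1'], 'ORDER_NO': ['VBELN']}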
          (0) 
          1. Anton Wenzelhuemer
            If both systems had the same boundaries and the ontologies were more or less complete, this would probably be true. But since the boundaries differ in reality, you’ll find near-complete ontologies only in sub-domains of each system, so you’d not be facing a 1:1 ontology mediation but actually a kind of n:m one. Even the best regex’er will have a hard time with that situation and will always end up with compromises.
            anton
