This blog is the continuation of Stumbling blocks in purchasing info record migration by LSMW IDOC method.

The project is outlined in the other blog, so I can start directly with the difficulties I had migrating the price conditions.


When you migrate info records by the IDoc method, the price conditions are not included; they have to be loaded in an extra step.


So again the question of method: batch input vs. IDoc.

In my last blog I explained that the IDoc method had the advantage that I could work with preassigned numbers and create a mapping directly from the conversion step in LSMW. This advantage does not count for the conditions of an info record, because there is no extra mapping that has to be given to the business; the mapping was already there from the info record migration.

Still I decided to go with the IDoc method, as it allows me to reload the data again and again if I do something wrong. With the batch input method I would need 2 LSMW objects in the worst case: one for the initial creation and one for a change if something goes wrong. Trust me, even I make mistakes 😉 . Maybe even already with the decision to use the IDoc method here; you can leave a comment.


Now I knew how I was going to load the data, but I still had to solve the question of how to get the data out of the legacy system. If you search SCN with keywords like "how send COND_A", you get 70 hits but rarely an answer. I actually found just one correct answer in relation to info record conditions and will spread it here again: it is possible from transaction MEK3 via condition info. However, at the time of migration I did not know about that. And I still think it wouldn't have been suitable for me, as MEK3 expects the condition type before you can continue to the condition info, and I had 28 different condition types in the source system. Further, I cannot even get into the condition info if there is no access defined for the condition type.

In the meantime I have got the idea to send the conditions together with the info record using serialization; I will try this in my next info record migration.

But in this migration I decided to download the conditions using SE16 and a simple SQVI QuickView.


The goal was to migrate only valid conditions, no historic conditions.

I started with SE16 and downloaded all valid conditions from tables A017 and A018, excluding the deleted conditions and those with a validity end date earlier than today. (We did not load conditions at material group level, hence no download from table A025.)
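
For cross-checking, this is roughly the equivalent selection in ABAP, assuming table A017 with its standard validity-end field DATBI (a sketch only; the actual download was done interactively in SE16):

* Sketch: select only currently valid condition headers from A017,
* i.e. records whose validity end date is today or later.
data: lt_a017 type standard table of a017.

select * from a017
  into table lt_a017
  where datbi >= sy-datum.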

[Image: A017.png]


As you can see, the condition record number is listed at the end.

Next step was the development of the QuickView.

It is just a join of the tables KONH and KONP, since we decided to enter the few scales (table KONM) manually.

The chosen fields for our migration were:

List fields:

KONH-KNUMH
KONH-KOTABNR
KONH-KAPPL
KONH-KSCHL
KONH-VAKEY
KONH-DATAB
KONH-DATBI
KONP-KOPOS
KONP-KAPPL
KONP-KSCHL
KONP-STFKZ
KONP-KZBZG
KONP-KSTBM
KONP-KONMS
KONP-KSTBW
KONP-KONWS
KONP-KRECH
KONP-KBETR
KONP-KONWA
KONP-KPEIN
KONP-KMEIN
KONP-KUMZA
KONP-KUMNE
KONP-MEINS

Selection fields:

KONH-KNUMH
KONH-KOTABNR
KONH-KAPPL
KONP-LOEVM_KO
KONH-VAKEY


As the number of records exceeded the maximum number of entries that can be added to the multiple selection, I used the condition table number as selection criterion and excluded the records with deletion indicator. Nothing else as selection criteria.
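
In ABAP terms the QuickView corresponds roughly to the following join (a sketch with a shortened field list, not my actual coding; the QuickView itself was built in SQVI):

* Sketch: join KONH and KONP on the condition record number,
* restricted to one condition table and to records without
* deletion indicator.
types: begin of ty_cond,
         knumh type konh-knumh,
         kschl type konh-kschl,
         vakey type konh-vakey,
         datab type konh-datab,
         datbi type konh-datbi,
         kbetr type konp-kbetr,
         konwa type konp-konwa,
       end of ty_cond.
data: lt_cond type standard table of ty_cond.

select h~knumh h~kschl h~vakey h~datab h~datbi
       p~kbetr p~konwa
  from konh as h
  inner join konp as p on h~knumh = p~knumh
  into table lt_cond
  where h~kotabnr  = '017'
    and p~loevm_ko = space.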

116,000 condition records resulted in a 44 MB file, too big to output the result of the QuickView directly to Excel. So it had to be downloaded as text and then imported into Excel.


Here you have to be careful: if you have about 60,000 lines or more, you can find 2 empty lines at positions 59994 and 59995.

If you continue with sorting and filtering, you will not get all records, hence you need to remove those empty lines.
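
If you want a safety net on the LSMW side as well, a small guard in the conversion coding can skip such empty lines (a sketch; skip_record is the standard LSMW macro, and ZKONP is the name of my source structure for the detail records):

* skip source records that are completely empty, e.g. stray
* blank lines from the Excel download
if ZKONP is initial.
  skip_record.
endif.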


[Image: A017_Excel.png]


As a next step the fields have to be formatted. All value fields have to be formatted as numbers with 2 decimals and without a thousands separator.


After this I entered a VLOOKUP formula to identify the condition records whose number matches a condition record number from the A017 table download. Only those were kept for the next processing step.
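
Just for illustration, such a lookup could look like the following formula, assuming (hypothetically) that the condition record number of the KONH/KONP download is in column A and the A017 download sits in a sheet named A017 with its condition record number in column H:

=IF(ISNA(VLOOKUP(A2,A017!$H:$H,1,FALSE)),"drop","keep")

Rows marked "drop" are the historic conditions that get filtered out before the load.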

If you wonder why this is needed, have a closer look at the validity end date from the KONH table: it is almost always 31.12.9999. If you enter a new price in the info record, you usually care only about the valid-from date, and the condition is valid until a new condition is entered. But when a new condition is entered, it does not change the valid-to date of the old condition; SAP selects the valid condition only based on the valid-from date.

And here I was lucky that I only had to load currently valid conditions. I tried it a few times with all conditions, but this caused a bigger chaos, because I was not able to process the IDocs in the needed sequence: the oldest condition first, the newest condition at the end. Even though I had the correct sequence in the CONV file in LSMW, they became a mess when the IDocs were generated and processed.


As said, I kept only the latest conditions, identified with VLOOKUP, in the file. Be aware that this VLOOKUP over that many records can take hours, depending on your PC and the number of processors. With 24 processors it was a job of a little more than 5 minutes.

Still I had to do a bit more cosmetics to get the file into the shape needed for the load.


In this picture from LSMW you can see the IDoc structure: a header, the validity period, and the conditions.

[Image: COND_A struc.png]


The source file has to be prepared to deliver data for each of these segments in the structure.


The starting basis is my download file, which now contains only the valid records; this will be used for the detail, the KONP segment of the structure.

[Image: A17KONP.PNG]


For the header segment I just copy columns A to E into a new Excel tab.

For the validity period segment I copy columns A to G into a new Excel tab.

Given the field order of the download, this means the header tab carries KNUMH, KOTABNR, KAPPL, KSCHL and VAKEY, and the validity tab additionally DATAB and DATBI.


From both new tabs I remove the duplicate records.

[Image: a_duplicate_350577.png]


Now I save each tab as a CSV file and assign the CSV files in LSMW to my source structures.

[Image: CONDALSMW.png]


When you execute the step READ DATA, SAP joins those 3 files based on the common fields at the beginning of each structure.


The rest is just the standard process of executing LSMW.


Important in the field mapping is the VAKEY field. SAP uses a variable key (VAKEY) to access the conditions. It is a concatenated field with the vendor number at the beginning, followed by the material number, the purchasing organisation, the plant (only in the case of A017 conditions) and the info record type. As usual in a migration, you have to replace the old values with new values to enable SAP to find those conditions again.

This means you have to create a data declaration to split this VAKEY field into its parts, like this:

data: begin of ZZVARKEYA,
        ZLIFNR  like LFM1-LIFNR,   " vendor number
        ZMATNR  like MARC-MATNR,   " material number
        ZEKORG  like LFM1-EKORG,   " purchasing organisation
        ZWERKS  like MARC-WERKS,   " plant
        ZINFTYP like EINE-ESOKZ,   " info record type
      end of ZZVARKEYA.

You may want to do this for both the source field and the target field.


And then you need a little ABAP coding to replace the individual values. I have this old-to-new mapping in a Z-table with 3 fields: the identifier (e.g. EKORG for purchasing organisation), the old value and the new value. I use a select statement to retrieve the new value from this Z-table.
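
A minimal sketch of how this could look, assuming a hypothetical mapping table ZMAP with the fields IDENT, OLDVAL and NEWVAL, a source header structure named ZKONH, and a second work area ZZVARKEYN declared exactly like ZZVARKEYA (my real coding differs in the details):

data: ZZVARKEYN like ZZVARKEYA.

* break the old variable key into its parts
ZZVARKEYA = ZKONH-VAKEY.
ZZVARKEYN = ZZVARKEYA.

* replace each part with the new value from the mapping Z-table;
* if no mapping entry exists, the old value is kept
select single NEWVAL from ZMAP into ZZVARKEYN-ZLIFNR
  where IDENT = 'LIFNR' and OLDVAL = ZZVARKEYA-ZLIFNR.
select single NEWVAL from ZMAP into ZZVARKEYN-ZMATNR
  where IDENT = 'MATNR' and OLDVAL = ZZVARKEYA-ZMATNR.
select single NEWVAL from ZMAP into ZZVARKEYN-ZEKORG
  where IDENT = 'EKORG' and OLDVAL = ZZVARKEYA-ZEKORG.

* reassemble the variable key for the target key segment
E1KOMG-VAKEY = ZZVARKEYN.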


The value field KBETR can cause some headaches as well, especially if you have conditions in currencies without decimals, like JPY, or where the value is a percentage. In both cases SAP has a strange way of storing the values in the table.

The value field is defined in the data dictionary with 2 decimals. A currency like JPY does not get decimals when it is stored; the value is just moved as-is into this value field. A price of 123 JPY is then stored as 1.23 JPY.

A percentage is allowed to have 3 decimals, but stored in a field with 2 decimals it looks extraordinary too.

And last but not least, you may have negative condition values in the source file because of discount conditions.

All these circumstances have to be taken care of in the conversion routine, which in my case looks like this:


* the CSV file format delivers the value multiplied by 100,
* so divide to get the normal condition value
g_amount = ZKONP-KBETR / 100.

* a percentage has 3 decimals, hence divide by 1000
if ZKONP-KONWA = '%'.
  g_amount = ZKONP-KBETR / 1000.
endif.

* currencies without decimals (e.g. JPY) are taken as-is
if ZKONP-KONWA = 'JPY'.
  g_amount = ZKONP-KBETR.
endif.

* discount condition types have negative values in the source,
* but this IDoc can only carry positive values
if    ZKONP-KSCHL1 = 'RA00'
   or ZKONP-KSCHL1 = 'RA01'
   or ZKONP-KSCHL1 = 'RB01'
   or ZKONP-KSCHL1 = 'RC00'.
  g_amount = g_amount * -1.
endif.

* WRITE formats the value according to the user defaults (German:
* '1.234,56'), so convert it to the technical format '1234.56'
write g_amount
   to E1KONP-KBETR decimals 2.
REPLACE ALL OCCURRENCES OF '.' in E1KONP-KBETR with ''.
REPLACE ALL OCCURRENCES OF ',' in E1KONP-KBETR with '.'.


Because of the CSV file format I have to divide the value by 100 to get the normal condition value.

In the case of a percentage, the source value has to be divided by 1000.

In the case of the zero-decimal currencies, I can take the value as-is from the source.

If the condition value is negative, I need to multiply it by -1 to get a positive value, as I can only carry positive values in this IDoc.

And the last lines take care of dots and commas as decimal separators. In Germany we use the comma as the decimal separator, while the Americans use the dot. And as I often get Excel files from all over the world for migrations, I always have to take care of the decimal separators.
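
A small self-contained sketch of that separator round trip, assuming a logon with German number format, where WRITE renders 1234.56 as '1.234,56':

data: g_amount type p decimals 2 value '1234.56',
      g_char(15) type c.

write g_amount to g_char.                           " -> '1.234,56'
replace all occurrences of '.' in g_char with ''.   " -> '1234,56'
replace all occurrences of ',' in g_char with '.'.  " -> '1234.56'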



And finally I want to share an issue that came up during our tests because I had not yet completed the condition migration.

The info records themselves got loaded as explained in the previous blog, including the information about the price and the effective price, even when these were zero.

[Image: inforecordpreis.PNG]

You can see that both fields are protected, just as they are when conditions exist. However, the conditions had not been loaded at that time.

When the user clicked the condition button, nothing happened. An inconsistent situation: the info record itself carries the information that conditions exist while they are not there, hence the condition button does not work.

The only workaround is to change a value in another field and save the info record. Then go in again; now you can click the condition button as usual. (Don't forget to correct the previously changed field again.)


16 Comments


  1. Tammy Powlas

Sounds like you've had a lot of experience with LSMWs and data migration.

    We are finished with the first part of a couple of big data migrations using SAP's Data Services solution with SAP Utilities.

    If I find the energy to write about it like you have, I will try 🙂

    We are also using LSMWs to update certain custom settings not covered by the standard SAP RDS solution.
    1. Jürgen L Post author

Data migration with LSMW is a big portion of my job. Maybe "was": next year is mainly planned as project manager, LSMW then goes to a younger colleague, and I will only do some consulting and training, plus some smaller migrations, starting with Singapore in 2 weeks after a 6-week break. A closer look into RDS is on my wish list for 2014 as well.
  2. Ole Geismar

About the last problem you mention, that the info record appears inconsistent...
    Would it help to run RM06INP0? This would update EINE-NETPR with the most current condition values and maybe also fix the "inconsistency".
    1. Jürgen L Post author

RM06INP0 will not help, as there are no conditions in that particular case; with conditions I do not have this issue.
  3. Vivek S

Congrats, Sir!! Very good clarity in the way you presented this to the SCN members.

    Migrating price conditions for info records with the LSMW/IDoc method is a really hard challenge, but what you have presented here is delivered very well. You are the king of data migration, Sir.

    Thank you always for contributing and sharing knowledge with SCN members like me and others.

    Regards,

    Vivek
  4. Tamas Gal

Hi Jürgen,
    many thanks for this blog, this is extremely useful, and great work!!

    It was really interesting that you mentioned that in MEK3, with the condition info button, we can list the conditions of a condition type and then send them to the target system in IDoc format.
    My question is: what is your opinion? Do you know of any constraints, e.g. any data that this standard IDoc would not migrate to the target system? I have only a few condition types, so I think this scenario would work much better than creating the many LSMWs with recordings that I am planning to do (as many LSMWs as condition tables we have, multiplied by the number of condition types for each condition table).
    For example, do you have any idea which basic type we should use? We have the following basic types:
    COND_A01
    COND_A02
    COND_A03
    COND_A04
    Which one would you use?

    Cheers,
    Tamas
    1. Jürgen L Post author

I am using COND_A02, since I neither need a UPS link nor do we use long material numbers.
      I haven't yet faced any constraint with the info record price conditions that I had to load using this IDoc. Sure, there are specific challenges with each system that we merge into ours, but I have not yet had one where I had to give up. I just evaluate and decide on an economical basis; you would never see me trying to fix an error case via coding if that takes me 2 hours and carries the danger of screwing up other things, while I could just do a manual fix in 1 hour.
  5. Tamas Gal

Hello Jürgen,
    thank you for answering; COND_A02 seemed to be the simple method, thanks.
    With the second part of your comment, are you saying that in case I face bigger difficulties with it, I should drop this idea and go with the manual LSMW instead? Or are you saying that we should not set up any rules for choosing data migration methods, and just always find the solution that suits our needs best?

    By the way, it is really surprising to me. I am quite new in this area, and it is really surprising that big companies with expensive implementation/merger projects, dealing with valuable master data, use methods like MS Excel for data transfer between 2 systems. I know that it's fine and works well; it's just surprising to me as a relatively new starter that companies trust "manual Excel work" when it comes to their valuable master data.

    Please keep up the great work that you are doing!
  6. Jürgen L Post author

I don't trust any Excel work if it was not done by myself.
    We try to stay away from collecting data in Excel whenever we can; we prefer to have the data maintained, corrected and enriched in the source system. Here we usually use classification.
    Excel is then mainly a data carrier used on the day of data load, filled and possibly edited as described in the blog.
    Big companies do not just take over other big companies who run SAP; they sometimes buy small companies who use other systems, and in this case we rarely have any option other than Excel. But even when both companies run SAP, some concepts may be so different that a part of the data needed for the future processes is just not available in the legacy system and has to be prepared in Excel, with all the negative consequences (countless "original" copies, carriage returns in an Excel cell, twisted numbers, etc.). And for some data there is just no option to use ALE, and sometimes you are not even allowed to do any customizing for ALE or RFC programs in the legacy system (e.g. when you buy just a part of a company); then you also need a way to get the data, which may then be SE16 table downloads to Excel.
    You have to know where the data is, you have to know several ways to extract the data and also several ways to load the data. And based on that you make your choice per project (if you have this freedom). And my goal is time and cost efficiency.
    I currently have 11 different projects, 3 really big and the rest smaller. I did 213 data loads in the last 4 months; there is no time to think for an hour about how I could save 10 minutes of runtime, as this means I finally lost 50 minutes.
    If I have to load 50,000 info records and 20 fail in a test load, then I do not change the program; I just maintain those 20 manually, as this is faster than changing the LSMW coding and doing new tests.
  7. Tamas Gal

Hello Jürgen,
    thank you for your answer, but may I ask a concrete question, please?

    Have you met the situation where SAP expects you to use ISO codes, or predefined number formats, in IDocs?

    I got the error message "ISO Betrag 1.060,00 ist nicht in SAP Betrag umsetzbar bei KOMG" (I'm using the system in German; roughly: "ISO amount 1.060,00 cannot be converted to an SAP amount for KOMG"). So I had to change 1.060,00 EUR to 1060.00 EUR and then it was OK.
    I also had to change PC to PCE, because that's the ISO version (OK, I know I can set PC in CUNI as the ISO code, but I don't know what other UoMs I will encounter during the migration, so I cannot set it for all the UoMs).

    I have worked with IDocs before, but never really met this problem. Is this issue special to the COND_A IDoc types, or is it general SAP behaviour? Do you have any way to solve it other than changing the number formats/UoMs manually in my Excel?

    Thank you again in advance!
    Tamas
    1. Jürgen L Post author

Certainly I have had such error messages and situations. These are technical restrictions, and they are not consistent across all IDoc types. This is annoying but unchangeable for me; I am just a passenger on the SAP train.
      IDocs are a technical medium; working with IDocs means you have to leave the functional area that uses conversion exits to make data look nice, and you have to dig deeper into the technical attributes of the data.
      I know no IDoc which accepts this version of field content for a value field: 1.060,00 EUR
      I know the info record IDoc as the most restrictive IDoc in regard to units of measure (ISO codes).

      I have ABAP coding in my LSMW to turn the good-looking Excel value into the needed technical format: REPLACE ALL OCCURRENCES OF '.' in E1KONP-KBETR with ''.
      REPLACE ALL OCCURRENCES OF ',' in E1KONP-KBETR with '.'.

      I have been fighting for many years for each and every unit that we use to have a unique ISO code, to avoid uncontrolled conversions into unwanted units. Unit of measure mapping is one task that consumes a lot of time in each project.
  8. Tamas Gal

Hello Jürgen,
    we have just finished a migration project. I used (partially) the methods that you described in your blog, and I would like to thank you for your help and extremely valuable tips. I thought it could be interesting to write up my experiences with the techniques you have mentioned above. Since we were migrating from one 6.0 SAP system to another 6.0 SAP system, we used the IDoc technique (without LSMW) to transfer PIRs (ME13, sending with ME18), purchasing conditions (MEK3) and SD conditions (VK13). In the case of the conditions we used that "send IDoc" button that you wrote about in your blog above, and it worked pretty well! So my findings were the following:

    • IDocs as a migration technique can only be used safely between SAP systems on the same release level, so people should not use IDocs (I think) for a 4.7 -> 6.0 migration.
    • ALWAYS CHECK the T006 table! In CUNI you can set ISO codes for each unit of measure. One has to make sure that the UoM to ISO code mapping is a 1-to-1 relation both on the sending and on the receiving system. (E.g. if you have multiple UoMs, like H and STD, that are both attached to the same ISO code, HUR, on the sending side, but the HUR ISO code is mapped to the STD UoM as the primary UoM on the receiving side, you can end up in the situation that all your conditions that had the H and STD UoMs in the source system end up with the STD UoM in the receiving system. It can be really, really dangerous!)
    • Don't send migration IDocs in online mode, always in background, for serious performance reasons. You can schedule ME18 as a background program and send ~80k PIRs in an hour, but you cannot schedule VK13 or MEK3 as background programs, so sending ~100k conditions like this in online mode can last for days! So what you have to do is configure WE20 on the sending side so that you do not select the "transfer IDoc immediately" option but the other one (collect IDocs). You still use VK13 and MEK3, but it will be much, much quicker, because the system will not send the IDocs immediately, only create and save them in an interim (yellow light) status. Of course, you need to schedule the RSEOUT00 program in background mode; it will send those waiting IDocs in no time, so migrating a ~100k volume of conditions will last only a few hours!
    • VAKEY – you mentioned that you can 'manually' concatenate the VAKEY field in the condition header. (VAKEY is the field responsible for the navigation from ME13: by clicking on the "Conditions" button the system directs you to the relevant conditions. If the VAKEY is migrated incorrectly, I think your pricing will still work, but users won't find the relevant MM conditions navigating from the ME13 transaction.) That is fine if you don't have many condition tables, but we had quite a few, and for each condition table you have to concatenate the VAKEY differently (since VAKEY is the concatenation of your condition table's key fields). It can be dangerous to work out how to concatenate the VAKEY by yourself, so I think it's better to use the 'SD_CONDITION_VAKEY_FILL' function module for both MM and SD conditions. As input parameters you give it your condition table name and your data (KOMG structure/segment), and the FM gives you back the VAKEY – easier, with less chance of manual errors.
    • You can find how-to guides on SCN on how to create a Z process code and a Z function module and assign them to your ALE interface in WE20. Of course you can use user exits etc. to manipulate data on the receiving side (e.g. we had to change material numbers and vendor numbers during the migration, so we had to manipulate the data), but I think it isn't worth struggling with exits, because new requirements will always come from the client, and it can happen that sooner or later you will receive (or find) a requirement that you cannot handle with exits. So it's much better to copy the standard IDoc processing FM into a Z one and configure your Z FM in WE20; then you have complete freedom, can manipulate the data as you wish, and once the migration is over you simply delete your Z FM and process code from the customizing (and you didn't "ruin" the customer's standard code with exits that stay "forever" for the sake of one single migration project).
    • IDocs are tricky – PIR and condition IDocs can both create data or (if the data already exists) change the data!! If you are migrating into a system that is already a working SAP system, you have to make sure not to overwrite the client's existing data with your migrated data! Fortunately, IDocs are quite smart. E.g. in the case of PIR IDocs, the system creates the new PIR (on the receiving side) with exactly the same PIR number you had in the IDoc (like an external number range, because of direct table writes, I guess). But if a PIR already exists for the vendor and material in your inbound IDoc, and the existing PIR just has a different PIR number than the one in the IDoc, the system makes no problems: it will overwrite the PIR with the IDoc data because of the material and vendor number match (despite the different PIR numbers in the IDoc and in the destination system). Of course, if the PIR numbers match but the material and/or vendor numbers are different, the IDoc will run into an error, which is good.

    This is all I had in mind, I hope it was worth sharing.
    Wish you all the best,
    Tamas
  9. Prince Lim

Hi Jurgen,

    How about conditions in contract BAPI create? Is it the same method?

    Because when I load with conditions, the account assignment is multiplied with a lot of numbers and blank assignments.

    I still can't figure out why the account assignment is multiplied as many times as conditions are entered.

    Thank you
    1. Jürgen L Post author

Sorry, I can't understand what you are trying to explain; nowhere in my entire blog did I mention anything about accounts, and I don't see the link between creating conditions in an info record and account assignments.

      Please post a question instead of adding a comment to a blog, and provide much more detailed information.
