
This blog is a continuation of LSMW Material master by BAPI method – Part 1

I would like to show how the BAPI method is used to create and change material masters.

As an example I use a real business case:

existing material masters need to be updated with new material descriptions and sales text

not all materials have a sales text view

text needs to be maintained in German and English; the English text is identical to the German text

the text is provided in an Excel spreadsheet

Note: our material numbers are set as lexicographical, i.e. without leading zeros

We have to use a certain path to store the .read and .conv files

The homepage of LSMW can be found here in SCN at

In that document you can also find the links to the general documentation:

Step 1 – Maintain object attributes

Here you define whether your LSMW object is for a one-time data transfer or whether you want to use it permanently as a periodic transfer.

Migrations are usually one-time data transfers, even if you run the object many times until you are satisfied with the result.

The periodic transfer gives you an option to create a program ready to be used by end-users.


Please make use of F4 search help to get the parameters for the import objects.

In this example we use standard material (industry) as the BAPI import method: business object BUS1001006, method SAVEREPLICA and basic type MATMAS_MASS_BAPI03.

Step 2 – Maintain source structure

I just have one Excel file with the source data, and each line has the same layout, so there is no hierarchy and no different structures per line.


Hence the source structure to be defined is as well a simple flat structure.

Click the create icon, then enter a name for the source structure and a description.

Don’t choose a name that is too long or complex, as you may need to type it repeatedly if you have to add some ABAP coding in the mapping rules later.


Step 3 – Maintain source fields

In this step you have to enter all the fields that are contained in your source file, per structure. I recommend using the same sequence as in the source file.

There are various options to maintain the source fields, as you can see in the pop-up after clicking the copy button:

You can upload the fields from a text file, copy them from another LSMW object, copy them from the data dictionary (I will explain this in more detail in another blog), or copy them from the first line of your data file (but as the first line does not contain the field lengths, you need to complete them manually anyway).


For small files like in this business case I usually do it manually. Move the cursor onto the structure line, then click the table icon (to the right of the copy icon). This is more convenient than defining field by field via the create button.

I usually take the field names, types and lengths from the SAP data dictionary, but if you want to identify the values in your source based on field names (instead of position in the field sequence), then you need to make sure that the field names are identical to those in your source file.


Step 4 – Maintain structure relations

In this step you can see that a BAPI or IDOC structure is much more complex than a structure that you get from a recording.

But don’t panic: you do not need to care about every part, just about the parts that are needed for your migration case.

You move the cursor onto the structure that you need and click the create relationship button. As your source has just one structure, it is assigned automatically. In case of a multi-structure source you would need to select the source structure that needs to be assigned to the corresponding target structure.

Our single source structure contains fields that are used in many target structures, so we assign this source structure to all needed target structures.

In our example it is the header segment (it always needs to be assigned), the header segment with control information, the material description and the long text.


Step 5 – Maintain field mapping and conversion rules

After you assigned your source structure to the target structures in step 4, in this step you assign your source fields to the target fields, and you define how the values are derived: whether they are just moved from source field to target field, translated via rules, or assigned as fixed values for mandatory (or simply necessary) fields that are not in your source file.

My first choice is auto field mapping. You get it from the menu Extras > Auto-Field Mapping.


You will then get a screen to define the rules for this auto field mapping.

Here you control whether you do this for all fields or just for empty fields.

I usually choose the 100% match, as there are too many fields with similar names that could get a wrong assignment if you use a lower percentage.

If you have not yet defined reusable rules, then you can only apply the MOVE rule to the fields. You can change it later anyway.

And as I want to see what SAP does, I choose “with confirmation”; SAP will then show me each assignment and I just click OK to continue.

With my example the auto field mapping does not work, as my source field names are different from the BAPI field names.

This happens often with BAPIs, as they do not use the table field names. With the IDOC import method this auto field mapping is a big success if your source field names are defined like the SAP table fields.


In my example we need to do the field mapping manually.

You have to tell the BAPI which views you want to maintain; this is done in the header structure, where you also find the material number.

Move the cursor onto the material field, then click the create source field icon and select the material number from your source field definition.

Keep in mind that my material number is lexicographic, meaning it does not have leading zeros. If your material numbers have leading zeros, then you either need those leading zeros in the source file too, or you need coding to add them; otherwise the BAPI is not able to find your material for the update.
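If you do need such coding, a minimal sketch could look like the following. The source field name MYSRCFILE-MATNR and the target field name are placeholders for the names in your own object (take the exact ones from your field mapping); CONVERSION_EXIT_ALPHA_INPUT is the standard conversion exit that pads purely numeric material numbers with leading zeros.

```abap
* Sketch: pad a numeric material number with leading zeros so the
* BAPI can find the material. MYSRCFILE-MATNR and the target field
* E1BPE1MATHEAD-MATERIAL are assumptions - use your own names.
data: lv_matnr type matnr.

call function 'CONVERSION_EXIT_ALPHA_INPUT'
  exporting
    input  = MYSRCFILE-MATNR
  importing
    output = lv_matnr.

E1BPE1MATHEAD-MATERIAL = lv_matnr.
```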

For my example we need Basic data view for the material description and Sales view for the sales text.

As those fields are not among the fields in the source file we have to assign value as a constant.

Move the cursor onto the field, then click the constant button. You get a pop-up to enter the value, which is just an X.

Use F4 to assign the value; this helps to avoid typing errors. SAP sometimes behaves strangely if characters expected in upper case are entered in lower case.


The sales view is only needed if there is sales text in the source file. Hence we cannot assign a pure constant; we need some coding.

Nevertheless, assign the constant first so you have to code less yourself, then double-click the coding to get to the code editor.

Now you only need a small piece of ABAP coding to check whether the sales organisation field in the source is empty.
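Such a check could look like this sketch. The source structure MYSRCFILE and the target field for the sales view flag are assumptions based on this example; take the exact names from your own field mapping.

```abap
* Sketch: only request the sales view when the source line actually
* carries a sales organisation, i.e. when there is sales text.
* MYSRCFILE and E1BPE1MATHEAD-SALES_VIEW are placeholder names.
if MYSRCFILE-VKORG is initial.
  clear E1BPE1MATHEAD-SALES_VIEW.
else.
  E1BPE1MATHEAD-SALES_VIEW = 'X'.
endif.
```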


After this is done we move forward to the Segment E1BPE1MAKT for the material description.

Here you also find the material number field, so you assign it to your source field (think about leading zeros!).

Language is not among the fields in the source file, so you assign the language as a constant. Remember we need the text in German and English.

However, we can only assign one language here. So you could think about creating a second LSMW object to load the other language, or you make it flexible with some coding. At this point we assign just the first language. We do it for both language fields, LANGU and LANGU_ISO.

LANGU is the 1 character long field, LANGU_ISO is the 2 character long field for language (see table T002 for reference).

Just assign the language via F4 and SAP will put in the right code for you (even if you enter e.g. EN in the pop-up, SAP will only enter E in the LANGU field).

And the last field in this section is the material description, which needs to be assigned to your source field.

Now we need to take care of the second language, whose text is identical to the first, as our product names are the same in all languages.

We do this with a small coding in the section __END_OF_RECORD__

By default you see the code transfer_record. This code submits the entries made in this section.

We need a second record for the material description in English. We can force this at __END_OF_RECORD__ processing time.

Double-click the coding line here to get to the code editor and enter the code after the transfer_record statement.

You only need to move the new values into the 2 language fields; all other field values are still in memory.

So, after the existing transfer_record statement, you add:

E1BPE1MAKT-LANGU = 'E'.
E1BPE1MAKT-LANGU_ISO = 'EN'.
transfer_record.

This way you transfer 2 records for the MAKT table, each with a different language.


The same now has to be done for the sales text, but only if we have a sales text, which means a few more lines of coding.

But let us start from the beginning in the sales text structure:

All long text is stored in tables STXH and STXL. The key of these tables consists of the text object (for material master sales text: MVKE), the text ID (0001), the language, and the text name, which here is a combination of material number, sales organisation and distribution channel.

You can find this key if you maintain one material manually and then click the editor icon in the sales text view. In the editor choose Goto > Header from the menu

and you will get all necessary information.

In this example you assign MVKE as a constant to the object field, 0001 as a constant to the text ID field, and again the language to the 2 language fields.

For the combination of the text name field we need to do it with coding.

The coding starts with: If MYSRCFILE-VKORG = ' '

Which means: if the sales organisation field is empty, then skip this structure. If it is not empty, then concatenate the material number (think about the leading zeros) with the sales organisation and distribution channel, and move this value into the target field.
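A hedged sketch of that rule follows. The names MYSRCFILE-MATNR/VKORG/VTWEG and the long text segment field E1BPE1MLTX-TEXT_NAME are assumptions, as is the use of the LSMW macro skip_record to drop the current target record; verify all of them against your own object.

```abap
* Sketch: skip the long text segment when there is no sales text,
* otherwise build the text name from material number, sales
* organisation and distribution channel. The material number part
* of the STXH text name is 18 characters wide.
if MYSRCFILE-VKORG = ' '.
  skip_record.
else.
  E1BPE1MLTX-TEXT_NAME       = MYSRCFILE-MATNR.
  E1BPE1MLTX-TEXT_NAME+18(4) = MYSRCFILE-VKORG.
  E1BPE1MLTX-TEXT_NAME+22(2) = MYSRCFILE-VTWEG.
endif.
```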


Until here we only have the first of 2 sales text lines. So we need to take care of the second line, and as we need this sales text in the second language as well, we solve this with coding in the __END_OF_RECORD__ section, similar to what we did for the material description.


The transfer_record statement in line 1 creates a record with the first sales text line in German.

In line 3 we check if there is a second sales text line, and only if there is one do we move the content from the source field to the target field and transfer this record. The language, object, text ID and text name are still in memory.

And then we repeat the same for English: we first move the English language into the 2 language fields, then we move the 1st line of sales text and transfer the record. And then we do it for the second sales text line, if present.
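Put together, the __END_OF_RECORD__ coding could look like this sketch. The source fields MYSRCFILE-TEXT1/TEXT2 and the target field names are assumptions (use your own definitions); object, text ID and text name stay in memory between the transfer_record calls.

```abap
* line 1: default statement - 1st sales text line in German
transfer_record.
* 2nd German line, only if present in the source
if not MYSRCFILE-TEXT2 is initial.
  E1BPE1MLTX-TEXT_LINE = MYSRCFILE-TEXT2.
  transfer_record.
endif.
* repeat both lines in English; only language and text line change
E1BPE1MLTX-LANGU     = 'E'.
E1BPE1MLTX-LANGU_ISO = 'EN'.
E1BPE1MLTX-TEXT_LINE = MYSRCFILE-TEXT1.
transfer_record.
if not MYSRCFILE-TEXT2 is initial.
  E1BPE1MLTX-TEXT_LINE = MYSRCFILE-TEXT2.
  transfer_record.
endif.
```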

Step 6 – Maintain fixed values, Translations , user defined routines

Not necessary for this business case

Step 7 – Specify files

In this step you have to define at least 3 files: your source file, the read file and the conv file.

When SAP uploads and reads your source file, it writes the content into the READ file. The name is automatically created by SAP using the project, subproject and object name.

When you run the conversion, SAP reads this READ file and writes the converted data into the CONV file. This file name is created automatically by SAP as well. Here we have a restriction of a maximum length of 45 characters. If you have to write this data into a certain directory, then you may have to shorten the proposed file name.

Your source file is completely unknown to SAP, so you have to describe it. This means you have to tell SAP where it is located: on your PC drive or in SAP’s file system. In this case the source file is on the PC’s local drive.

As SAP cannot read directly from Excel, you need to save the Excel file as a comma-separated file (CSV) or as a Unicode text file (TXT).

Keep in mind that your file has to be closed when you execute the reading step.


Please use F4 to find the file on your PC; SAP then fills in the File field itself.

Enter a name of your choice

Indicate whether this file has one source structure or multiple source structures. In this case there is only one source structure (remember the definition in step 2)

In case you saved your Excel file as Unicode text, you have to set the radio button to Tabulator.

As we have field names in the first line of our Excel source, we need to select the box “Field names at Start of File”. This ensures that SAP starts processing the source file at the second line.

If the sequence of fields in the source is identical to the definition made in step 3, then you have to mark the box “Field Order Matches Source Structure Definition”. Without this indicator SAP would try to find the fields based on their names.

Further set the radio button to “Record End Marker” and to ASCII as we are using a text file as source.

Step 8 – Assign files

Not much to do here: as there is only one source file, SAP proposes it automatically; you only need to click Save.


Step 9 – Read Data

In this step you read the data from your source file. SAP automatically generates the Read-Program based on the settings made in the earlier steps.

You can process it for all your data or only for a part of it.

In case you defined amount and date fields in your source (step 3), then you have the option of automatic conversion here. Without amount and date fields you can ignore the defaults.

After execution you get the numbers of read records and transactions. As we have only a flat structure, the numbers for records and transactions are equal.

However, you should verify with your source file if this number is correct. It is the first indication when something may be wrong.

I have often had cases where people deleted content from the Excel file (cell content, that is; they had not deleted the rows). For Excel those rows with erased content are still active, and they get saved as empty lines at the end of your text file. So SAP may read many more lines than actually contain data. If you have a good sense for your data, the number of read records will already tell you something.


Step 10 – Display Read Data

A very important step is to display the read data, because you can check whether the values were moved into the correct fields, that they fit into the fields, and that they do not overlap with other fields (in which case you have to redefine your field lengths in step 3).

You cannot see this from the overview; you have to click the line to get into the detail.


However, the overview screen is important too. Here you can see at a glance whether the records look alike, or whether you see a kind of wave, which indicates that something is wrong with the field lengths. And if you use the icons to go to the end, you can also check whether your records have data down to the last record.

SAP will process even empty records in the conversion step, which certainly leads to errors. Such empty records need to be removed from your source file, then you need to read it again.


Step 11 – Convert Data

Similar to the Read Data step you can convert only a selection or all data.


Important here is to set the radio button to “Create File”. This way SAP writes the CONV file with the converted data. This corresponds to the setting made for the port in the activation of IDOC inbound processing.

Further, it gives you the last chance, in step 12, to check a few individual records to see whether the data was converted correctly.

Don’t forget that you also get the numbers of converted records and transactions. Those numbers will hopefully match your expected numbers:

I usually store this output as text file for audit purposes.


Step 12 – Display converted  Data

Like in step 10 you can call the display for all or just for a selection. If you have more than 5000 records then you should restrict the selection.

You get first the overview screen, similar to the Read data, and can as well go into the detail by clicking a line.

The converted data is displayed according to the target structure; only the segments that were assigned in step 4 are shown.


The green lines are technical lines for the IDOC, the yellow line is the header segment for the transaction, and the blue lines are the individual records.

You can already see that there are 2 records with …MAKT, which hold the material description in German and English,

and another 2 records with …MLTX for the sales text (if we had sales text with 2 lines, we would see 4 records here).

Go into the detail of at least one record of one transaction to verify that each target field has the right content.

Step 13 – Start Idoc Generation

Not much to worry about here; you have just 2 options: execute or leave.

SAP has already proposed the name of the CONV file. Click execute and watch the count in the status line go up.

SAP tells you when it has finished. Then leave this step and continue with the next one.

Step 14 – Start Idoc Processing

You will see a bigger selection screen, but everything needed is defaulted and you can just execute it.

If you have many thousand IDOCs, then better execute it in background (via menu Program > run in background).

Even though IDoc posting is much quicker than batch input sessions, I often have migrations with 10 or more hours of processing for one object.

Changing material text and sales text for about 40000 materials needed only 36 minutes.


When processing has finished, SAP presents an overview like you probably know from the MM17 mass maintenance.


You can sort it by status to get immediately an overview if you have failed IDOCs.

However, this analysis is better made from the next step.

Step 15 – Create Idoc Overview

If you ran the processing step in foreground, then the selection screen is already fully defaulted and you can execute it like it is.

If it is not defaulted, then enter the “Created at” time and the “Created on” date, enter the logical message, here “MATMAS_MASS_BAPI”, and execute.

The overview looks slightly different now.

You get a frame on the left from which you can select the IDOCs based on their status.

As we have done everything correctly, only one status is shown.


By double-clicking the IDOC number you can display the IDOC details, and if necessary correct the content of a failed IDOC and then repost it in the last step of your LSMW. I plan to describe this in more detail in another blog later (but first I have to make something wrong, otherwise I can’t get the screenshots – this will be tough 😆 )



    1. Jürgen L Post author

      This is certainly possible, e.g. for purchase orders, contracts and for bill of materials and many more

      the F4 search help lists all objects in the BAPI field lists.

      I plan to write a few more blogs regarding data migration, e.g. about advantages and disadvantages with certain import methods, common errors, challenges in huge migration projects, SAP to SAP migration….

      Hope I find time to do that before my next migration project goes into the busy phase.

        1. Jürgen L Post author

          No, I saw it already and had checked my objects, but neither I nor any colleague had used LSMW for projects and WBS elements.

          In general this BUS2054 or project IDOC is used for ALE distribution

          I would set up a small ALE scenario and send an existing project which is similar to one that you want to load, with transaction CJAL (see my blog: LSMW migration with IDOC method and using IDOC as source, Part 1: Extract by ALE).

          Then you can check the details of the IDOC in BD87 to evaluate whether it contains everything you need. If it does, then I would try to get the same with LSMW.

  1. yawar khan

    Hi Jurgen,

    I was not in favour of using LSMW for material master; this document will surely help.

    But I am not able to view the screenshots from step 4 onwards.

    Please help.


    Yawar Khan

  2. Vamsi Krishna C V

    Hi Jurgen,

    I’ve been facing a peculiar Issue with MATMAS_MASS_BAPI03.

    While populating the MARAX structure I passed the MATERIALNUMBER field as ‘X’.

    (I do know that we have to pass the material number here as well; I passed X by mistake. When I checked the status, it was 53.

    My concern is that this could lead to misinformation and some data might be missed to be loaded

    I have checked the BAPI return messages by debugging at the BAPI’s Call statement

    and they provide no info whatever.

    The reason the IDOC gets status 53 is that the BAPI return table returns message type S (E expected), and the IDOC logic interprets it as successful processing.)

    I am creating a material with an external number range. so it is required that i pass the Material Number.

    But the weird thing is the IDOC goes into status 53. But no data is uploaded into the database.

    Functionality-wise, the IDOC should not be in status 53, because no data was transferred into the database.

    Have you ever faced a similar issue?

    i may be paranoid but just checking.



    1. Jürgen L Post author

      I remember that I have seen a similar reaction, but I don’t remember the exact root cause; it must have been my fault, otherwise I would remember.

  3. Abhilash Joseph

    Hi Jurgen

    Example very informative.

    In this example does that mean 12 physical documents were created in SAP Application before being moved into the respective tables ?

    1. Jürgen L Post author

      Not sure if I understand the question correctly.

      LSMW step 13 converts the data from the conv file into 12 individual Idocs, and they were posted one after the other

  4. Abhilash Joseph

    Hi Jurgen,

    What I was referring to: will it result in exactly 12 files (one for each IDoc) being created on the server?

    I was just trying to understand whether it would be a performance bottleneck if this is needed for a large volume data set, due to the heavy I/O (writing a file and then reading this file for the DB).

    Is there a way to bypass creation of physical files by choosing “create IDocs directly” in step 11?

    1. Jürgen L Post author

      It creates 1 file only.

      It is possible to directly create IDocs and omit this file.

      But this removes an extra chance to check yourself the quality before posting.

      Of course it will save time if you do not go this extra step (maybe 15 minutes for 50000 Idocs)

      If you tested your LSMW very well in a QA system, then there is no need for this extra step in production system.

      1. Abhilash Joseph

        Thanks Jurgen, it’s well explained. I thought it should be logical to create only one file, due to the huge I/O overhead of creating one file per IDOC, but some people who worked in SAP claimed that there is a single physical file per IDoc, and hence I asked the question.

  5. Mohammed Fath Elbab

    Hi Jurgen,

    It is a very helpful topic. I am trying to create a project and WBS. I have followed all your steps and the IDoc is green; however, when I checked the project, it was not created.

    Can you help me understand why it is not created?

    1. Jürgen L Post author

      Not sure if I can help; projects and WBS creation is usually not my duty, so there might be something specific to projects that you are missing.

      You should post this as a question in the forum instead of as a comment to a blog. There you can also post screenshots of your IDOC structure and content.

  6. Ali Dai

    Hi Jurgen,

    This is once again an excellent post. I have a challenge that I would like to address and ask if you have a way of resolving it. I need to upload master data (materials and functional locations). For Functional Locations I use 0440 Standard Object and I can upload one class and one characteristic. However, I want to upload two classes and more than one characteristic per class.

    Do you have any idea?


    1. Jürgen L Post author

      I suggest opening a discussion in the EAM space and providing some details from your source file for class and classification, as well as from the overview of the Display Read Data step in LSMW. From the target structure itself I do not see any issue that would make it take only one class with one characteristic.

      However I usually load classes and classification separately with IDoc method

  7. Joris Bayer

    Hi Jurgen,

    Great document. I did a quick test and all went well, it seemed… except that the material master value I wanted to modify in my test wasn’t modified by the BAPI. IDOC green, no errors. Any clue what the culprit could be?



      1. Joris Bayer

        I indeed missed the ‘X’ value. But still the material master field is not modified as defined in my data. The converted data looks OK, and the IDOC doesn’t give an error. Any other suggestions?

          1. Joris Bayer

            Hi Jurgen, I missed the ‘X’ value at first but have filled it now. But as mentioned, no luck yet with updating the material master (BISMT, GEWEI) in my little test.



            1. Jürgen L Post author

              We’re coming closer to the root cause. MARA-GEWEI is a relic from the old days, at least the last millennium; this field is redundant with MARM-GEWEI. Online in MM01/MM02 you can enter a value in the weight field (MARA-GEWEI) and it gets copied automatically to MARM-GEWEI, and vice versa.

              This does not happen with IDOC and BAPI; here you have to add the same value in the MARM structure to get a successful update.

              1. Joris Bayer

                OK, that is an important difference between LSMW batch input and BAPI… Tried with MARA-GROES (size/dimensions) but the material wasn’t updated.

                  1. Joris Bayer

                    The IDOC posts ‘green’ even if I provide incorrect data on purpose in the source TXT file (e.g. material X128 instead of material 128). I guess something is wrong with the ALE part although all settings in WE20/21 seem to be correct. Master data are not updated (MTPOS_MARA)

                    1. Jürgen L Post author

                      I can’t answer that without seeing screenshots from the IDOC data and from text file. Please open a discussion in MM space and provide the details

                      1. Anand Babu

                        Hi Juergen,


                        I need your inputs on MATMAS_MASS_BAPI03.

                        I am trying to update the gross weight data for a material, but the segment E1BPE1MARA doesn’t have a BRGEW field to map to.

                        But we do have the gross weight field BRGEW in E1BPE1MARM, which loads the alternate UoM data.

                        When I try to map the gross weight field to E1BPE1MARM, it fails with the error “Enter the unit of weight for the net weight”. Below is a screenshot of the E1BPE1MARM segment.


                        the corresponding X fields are set correctly.

                        Could you please let me know how the segments need to be mapped for a gross weight update.

                        Kind regards,


                        1. Jürgen L Post author

                          You cannot have a gross weight without a net weight.

                          So you have to fill E1BPE1MARA-NET_WEIGHT as well, along with its units.

                            1. Jürgen L Post author

                              Please post a question in the forum and provide more details e.g. from your mapping in LSMW and from the IDoc segments

  8. Mike White

    Very nice instruction, Jurgen. It helped me with error message MG046 “Field & has been transferred inconsistently or is blank”. I do have one issue, though, when trying to populate more than 132 characters in the basic data text: in my LSMW the text is cut off, but if I maintain it manually in MM02, I can populate the entire string.

  9. Ghulam Murtaza

    Hi Jurgen,

    I created an LSMW following your steps; here are my source fields.

    I did not include any coding in the program.

    It ran successfully without any errors, but the material master is still not updated. Can you point out what the reason could be?


    1. Jürgen L Post author

      a) it is not known what exactly you want to perform, e.g. creation of a new material or updating of text

      b) with the shown fields from your structure you can’t achieve any of the 2 purposes from a)

      c) troubleshooting starts from the logs; each IDoc for material master creates an application log. Display an IDoc, open the section of status records, click the icon in front of the status, and in the next screen click the button for the application log.


        1. Jürgen L Post author

          You can try to maintain the needed values in SE37 for function module MATERIAL_MAINTAIN_DARK and then see what messages you get in the return parameter.

          As I fear that this requires much more detail, it would be better to create a separate question in the community than to continue here with comments.

          When you create the question, please provide screenshots of your structure relations in step 4. Currently I guess you just missed linking, or maintaining values for, the X-structure for your changes.


  10. khushbu dave

    Hi Jürgen L,

    Nice detailed document. I followed all the above steps, but the technical line EDI_DC40 is not generated at the correct level.

    I tried it for 2 materials and it creates segments with 2 records in the same IDOC.

    Please can you share your inputs in case of this issue.




    Best Regards,


    1. Jürgen L Post author

      Can you show a screenshot of your source file? I guess you have redundant or no old numbers.

      Can you show a screenshot like it is in my step 12 ?



    1. Jürgen L Post author

      I tested an old project which is set up exactly like the settings explained in this blog, and it worked as shown.

      However, after changing step 1 (activating the box “allow structure assignment to EDI_DC40”) it became confusing. With the initial change nothing changed in the behavior, and I had the EDI_DC40 segment as before, like shown in the blog above. So I removed the flag again, which resulted in the EDI_DC40 structure disappearing from the field mapping. So I had to activate it again, and this time I also got it into step 4 to assign my source structure to it. This way I got it back in the field mapping, and now the project works again, but it looks slightly different from the blog above.

      It probably has to do with the system version, but I could not find any OSS note explaining such a change.


