
SAP BW Business Content: valued, but not a wishing well…

As we can read in SAP's official documentation, SAP Business Information Warehouse provides pre-configured objects under the collective term “Business Content” (BC). These objects accelerate the implementation of SAP BW, since they deliver ready-made solutions that meet the requirements for business information.
As far as our area of analysis (BW extraction tools) is concerned, it's evident, starting from SAP's official definition and simply from working for some time on the system, that BC (and its datasources) means: ready-to-run, built-in extractors; good (and growing) business coverage within SAP environments (the BW Service API is available as a plug-in for all R/3 systems, for BW itself and therefore also in APO, mySAP ERP components and industry solutions); both transactional and master data; lower implementation effort and cost; and sophisticated delta handling.
And you will say: “This is a dreamland!”

I'm sorry, my dear friend, I hate to be a killjoy, but wake up and welcome to reality…

Although BC and the related extraction technology have reached significant coverage of every business area, there are still a lot of reasons (or, better, a lot of situations in which you are compelled) to enhance existing extractors or even develop entirely custom ones: the need to extract customer-specific data, to build customer extensions for BC objects that don't support specific information required by the customer, and so on.
Besides, we can't forget about SAP systems modified in some areas with specific customizing settings, or simply with custom fields added to standard tables.

Now the spontaneous question is: what happens if a standard logistics datasource, as provided in its standard (ready-to-use) configuration, doesn't completely meet our data model requirements?

In other words: we go to transaction RSA5 (Installation of DataSources from Business Content) and, surfing through the application component hierarchy, we find our candidate datasource 2LIS_11_VASCL (since, for example, the functional analysis requires a set of information belonging to the sales order schedule line level); afterwards, a double-click on it, we inspect the field list and, dash it!, we don't see a specific field we need (e.g. AUDAT, the document date)!
What do I have to do?

Extraction cockpit technique: let’s go back a little…

Fig.1: LC Delta Process for Sales Order Schedule Lines

With the LC, several data structures are delivered and, for each level of detail, there is an extract structure as well as a datasource (which already represents a BW extract view).

When you create and save a sales order (as with other transactional tasks), the document is processed in memory and then stored into the application (and database) tables.
In the LC extraction technique (see Fig.1) we have at our disposal different LIS communication structures (like MCVBAK, MCVBAP, MCVBEP and so on for sales orders) that we can decide to use for our reporting purposes while the application is running, during memory processing (in a separate memory partition; for details refer to LOGISTIC COCKPIT DELTA MECHANISM – Episode three: the new update methods).
To be more precise, every extract structure is related to one or more communication structures (and for every communication structure involved, an include is provided by the standard; see Fig.2): for the sales order schedule line extract structure we have MCVBAK, MCVBAP, MCVBEP, MCVBKD, MCVBUK and MCVBUP, whose components you can see in SE11.

Fig.2: 2LIS_11_VASCL Include mapping

Keep in mind that no LIS knowledge is needed here, because these LIS structures are involved only from a memory-processing point of view and no subsequent update into LIS tables is performed.

In search of the lost field (as Proust would say…)

Now let's come back to our little example and try to understand what procedure has to be followed when, as in our situation, we need a field not provided in the ready-to-run configuration.

2LIS_11_VASCL is the standard LC datasource for extracting sales order schedule line information; MC11VA0SCL is its associated extract structure.
Remember that it's possible to enhance it, but you can't create new extract structures (for the same standard datasource).

In the LC, events are also mentioned (shown below the extract structure), but they don't have any customizing options. They are there just to give you some transparency about when our structure will be updated; they are not relevant from a customizing point of view.

  • Using existing fields from the available communication structures

Within the LC (see Fig.3) a tool is provided that enables you to add fields from the LIS communication structures to the extract structure without having to make any modifications.

Fig.3: Logistic Cockpit Customizing screen

In the maintenance screen (see Fig.4), on the left side you see what has already been selected in the standard extract structure; on the right side you see all the available fields of the communication structures, from which you can select fields for the update.

Fig.4: Maintenance screen

And what do my eyes see at a glance? My AUDAT field!
OK, now it's enough to highlight the row and click the left arrow: every selected field is automatically included in a generated append structure for the corresponding include structure of the extract structure (for example, append ZZMC11VA1SCL for include MC11VA1SCL, which holds the additional fields of LIS communication structure MCVBAK in the order schedule lines extractor).

When you successfully complete this step, the traffic light icon turns red. This indicates that you changed the structure.

At this point, you have to generate the datasource (see Fig.5). Here you can, among other things, choose which fields are offered for selection (for various reasons it is not possible to offer all the fields contained in the LIS communication structures for selection in the extract structure; some fields are hidden on purpose, because a specific extract structure may be a combination of different processes; for details see OSS Note 351214 'BW extraction SD: Restricted field selection') and whether a key figure is inverted or not (refer to OSS Note 382779 'Cancellation field in the datasource maintenance' for details).
After maintenance in this step, the traffic light turns yellow.

Fig.5: Datasource generation

Once you activate the update, data is written to the extract structure and the traffic light turns green. Our enhancement process is complete, and now you can schedule (if required by your delta method) the delta job control process.

If, during a subsequent import of a new plug-in, this same field is included in the standard extract structure, it will be removed from the customer enhancement (in order to avoid a double occurrence of the same field) thanks to an XPRA program automatically executed within the upgrade procedure.

When you extend extract structures in the LC, you may notice that not all existing fields of an assigned LIS communication structure are available for selection.
This is not an oversight: this behavior is intended.
Since not all fields of the communication structures can be used in a practical way, some are hidden because, for example, the field is not filled for the relevant events, or is only used internally, or for other design reasons (e.g. you should select key figures only from the most detailed communication structure, but characteristics from all communication structures!).

Enhance it, but mind the queue !

If you change an extract structure in the Logistic Cockpit through transaction LBWE (or one of the LIS communication structures on which the extract structure is based or, in individual cases, even an application table, e.g. MSEG; it has happened to me!) by importing a transport request or a support package, or by carrying out an upgrade, you have to proceed with a lot of caution.
Many problems in this area result from the fact that, although everything is well organized in the development system, the transport into the production system is not controlled.
In fact, careless behaviour (when your datasource has already been activated for update, even for a very short period) can lead to various errors: delta requests terminate, the update from the extraction queue does not finish or the V3 update is no longer processed, the initialization on data generated before the change no longer works, the update log terminates… in short, a real tragedy!

Without venturing onto too technical ground, the situation can briefly be described this way: when you change a structure, the data stored in the old format can no longer be interpreted correctly by the new version of the same extract structure.
For the same reason, you can no longer use the statistical data already contained in the setup tables, and you have to delete it via transaction LBWG.

Therefore, you should carry out the following steps before you change the extract structure (also for an R/3 release upgrade or plug-in/support package import):

  • close your system and make sure that no updates are performed (either by users or by batch/background processes);
  • start the update collective run directly from the LC (this concerns both the V3 update and the queued delta update);
  • make sure the relevant queues are empty: with the “direct delta” method, the delta queue (RSA7) must be empty; with the “queued delta” method, the extraction queue (LBWQ) and the delta queue (RSA7) must be empty; with the “unserialized V3 update” method, the update queue (SM13) and the delta queue (RSA7) must be empty;
  • load all data of the respective datasources into your BW system.

    To completely empty the delta queue, request a delta twice, one after the other: the second upload will transfer 0 data records, but only with the second upload is the delta-repeat data held in the delta queue deleted.
    Anyway, with plug-in PI 2000.2 (or PI-A 2000.2) specific checks were implemented in LBWE, so that structures can be changed only if (in this order) there are no entries in the setup tables of the affected application and there are no entries in the V3 update or in the queue for the affected application.

Now that all data containers of the relevant data flow are empty, you can make (import) the change.

IMPORTANT: the fields that are available by default in LBWE are automatically filled during the extraction and are delta-relevant (see later in this weblog for more details about 'delta-relevant' changes).

  • Using the LIS enhancement on available communication structures

If your field is not available in the LC (that is, it is not in the available communication structures), you have to follow a different path.

One method of adding user-defined fields is the following: add the required fields to the communication structures (MCVBAK, MCVBAP and so on) using an append (via SE11) and then use the LIS customer exits to fill the fields.
For information on enhancing the communication structures, you can see the documentation for the enhancements MCS10001, MCS50001 and MCS60001 provided in transaction SMOD.

After you enhance the communication structures you can then enhance the extract structure with the relevant field in the customizing cockpit (transaction LBWE), provided that the communication structure is available in the selection. Then you can proceed with the steps described before in the previous bullet.
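As a rough sketch of the two steps just described (all object names below are illustrative assumptions, not objects delivered by SAP; check the documentation of the enhancements named above for the real exit names and parameters):

```abap
* Step 1 (SE11): create an append structure (e.g. ZAMCVBAK) on
* communication structure MCVBAK containing a customer field,
* say ZZSOMEDATE (hypothetical).
*
* Step 2 (CMOD): inside the LIS customer exit for sales documents
* (see enhancement MCS10001 in SMOD for the exact function exit and
* its parameters), fill the field from whatever source you need.
* Assuming the header communication records are available in an
* internal table XMCVBAK with header line (old-style ABAP):
LOOP AT xmcvbak.
  SELECT SINGLE zzsomedate FROM zcustom_dates   " hypothetical table
    INTO xmcvbak-zzsomedate
    WHERE vbeln = xmcvbak-vbeln.                " sales document key
  MODIFY xmcvbak.
ENDLOOP.
```

Because the field then lives in the communication structure, it participates in the before/after image mechanism the delta process relies on.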

This procedure, too, allows you to manage delta records, but you must make sure that you can determine the status in the user exit before and after the document change (this varies with each particular situation and with the table from which the field is filled, e.g. internal document tables such as XVBAP and YVBAP).
Why is this so important?

A document change in the delta extraction process (relating to our LC datasources) consists of the transfer of two data records to BW: one of these records represents the status of the document before the change (the before image) and the other one represents the status after the change (the after image).
During the extraction process, these two data records are compared by the system and checked for changes: only if there is some difference between the before and after images are these records involved in the extraction process and processed further.
Please refer to OSS Notes 216448 ‘BW/SIS: Incorrect update / SD user exit‘ and 757361 ‘Additional data records in BW when document changed’ for more information on correctly populating the before and after image (even if related to only SD Applications).

  • Using custom append on the extract structure

If you don't find your field already available within LBWE and if, for any reason, you don't want (or are not able) to enhance the LIS communication structures, you have another option: enhance your extract structure by creating an append with your ZZ* fields and then filling these fields in a specific user exit.

To do this, go to RSA6, choose your datasource, double-click on it and then on the extract structure: you will see the structure in an SE11 screen; create an append, insert your ZZ* fields and save.
Then you have to fill those fields with custom ABAP code, which can be anything from simple calculations or table lookups to complex business logic requiring access to multiple database tables. You can do that via CMOD, by creating a project and using the enhancement RSAP0001.
The function modules provided in this enhancement serve for the derivation or modification of the data extracted and transferred by the extraction engine of the Business Information Warehouse: EXIT_SAPLRSAP_001 for transactional data, EXIT_SAPLRSAP_002 for master data attributes and EXIT_SAPLRSAP_004 for hierarchies.
However, consider that the customer enhancement (CMOD) functionality is being converted to BAdIs (RSU5_SAPI_BADI in this case), even if not all BW CMOD enhancements have been converted to BAdIs; as long as an exit has not been replaced by SAP, there is no need to convert a CMOD exit to a BAdI (via transaction SPAU).
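For our running example, a minimal sketch of the transactional-data exit could look like this (ZZAUDAT is a hypothetical append field; the customer code goes into include ZXRSAU01, which is called from EXIT_SAPLRSAP_001):

```abap
* Include ZXRSAU01 (enhancement RSAP0001, attached via a CMOD project).
DATA: ls_vascl TYPE mc11va0scl,   " extract structure incl. our append
      lv_tabix TYPE sy-tabix.

CASE i_datasource.
  WHEN '2LIS_11_VASCL'.
    LOOP AT c_t_data INTO ls_vascl.
      lv_tabix = sy-tabix.
*     Look up the (hypothetical) ZZAUDAT field from the order header
      SELECT SINGLE audat FROM vbak
        INTO ls_vascl-zzaudat
        WHERE vbeln = ls_vascl-vbeln.
      MODIFY c_t_data FROM ls_vascl INDEX lv_tabix.
    ENDLOOP.
ENDCASE.
```

Keep such lookups cheap (single reads on indexed keys, or buffer the results): this code runs for every extracted record, so a careless SELECT here can dominate extraction runtime.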

In general, SAP doesn't recommend using this latter 'direct' method to enhance extract structures in the LC.
In fact, with this procedure, changes to the added fields are not extracted to BW if no field contained in the standard extract structure was changed as well (only those fields are delta-relevant): our ZZ* field is empty at the time of the check, in both the before and the after image, and, since the system sees no change, no delta records are extracted.
The same problem occurs in case of document deletion, because the document has already been deleted by the time the custom user exit is executed.

  • Getting to the root: enhance your application tables

The last resort for adding user-defined fields to extract structures is to enhance the document tables directly: in our example, VBAK, VBAP, VBEP (…).
The fields can then be filled in the general (sales) user exits of the applications and are also available in LBWE for enhancing the extract structures.

With this procedure, bear in mind that the fields added to the document tables are saved at database level together with the information they contain (with the related space requirement); on the other hand, there is no need for data retrieval in the LIS user exit.
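For sales documents, such an appended table field is typically filled in the standard sales user exits; a sketch (the field ZZREGION and its lookup are illustrative assumptions, and the include/form names should be verified for your release):

```abap
* Include MV45AFZZ, form USEREXIT_MOVE_FIELD_TO_VBAK: move the value
* into the appended field so that it is saved on VBAK at database
* level and later offered in LBWE through the communication structures.
FORM userexit_move_field_to_vbak.
  " ZZREGION: hypothetical append field on VBAK
  SELECT SINGLE regio FROM kna1
    INTO vbak-zzregion
    WHERE kunnr = vbak-kunnr.   " region of the sold-to party
ENDFORM.
```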

In the end: some useful technical background info

There are four control tables involved in the customizing process in LC:

  • TMCEXCFS: Field status of communication structures.
    Here the content is supplied by SAP: each field has a status per extract structure and communication structure: initial (inactive), A (active) or F (forbidden).
  • TMCEXCFZ: Field status of customer communication structures.
    In this table you can find all fields selected by the customer, per extract structure and communication structure.
  • TMCEXEVE: Events and extract structures.
    Supplied by SAP: which event supplies which extract structure with which communication structure.
  • TMCEXACT: datasource activation and update status.
    This one is also supplied by SAP, but can be changed by the customer.



      1. Former Member Post author
        I would like to know about these extraction types like LO, LBWE, LBWQ etc., and what these 2LIS_* datasources are. Can I have a step-by-step extraction through R/3 (RSA6, RSA3, RSA5 and the other steps)? As I am new to BW, I would also like to know the purpose of the R/3 side when extracting into BW. Please help me.
  1. This is an excellent presentation. Thank you very much!
    My question is more philosophical – why use the LO cockpit and delta queue at all?
    With a generic extractor, a change-date and creation-date selection in the InfoPackage, and an ODS in overwrite mode, one can extract LO data with fewer records across the network. Let the ODS-to-InfoCube data load manage the delta.
    Many companies will not allow the data warehouse to stop production for a change to the LO cockpit structures and a V3 reinitialization.
    What makes the LO cockpit worth the interruption to R/3 production and the effort in the data warehouse?
    1. Hi John !
      I'm very grateful to you for your interesting question… but you are asking for a preview of my next weblog!
      If you can be a bit patient, I will try to give some answers to your (not so philosophical) questions as soon as possible…
      thanks again


    2. Former Member
      Hi, at some of the customers I have worked with, we have been able to use the “Statistical Update Date”, which could, for example, give you open orders as of a given day, at any point in time. I do not think that is possible with a generic extractor.

      Also, what if I need additive updates in the ODS rather than overwrite for certain key figures? A generic extractor will not help there.

      The LO cockpit guarantees serialization, by updating the queue in a single V1 task under direct and queued delta. This is probably impossible under generic delta if the application servers' clocks differ. It is built for performance… and many other features which a generic delta can never achieve.

  2. Former Member Post author
    As usual your coverage of the topic is excellent. Thank you for spending the time to share the knowledge that you have acquired; it has most certainly led to a better understanding of the LO cockpit. John made an interesting comment and I am looking forward to your next weblog. Also looking forward to the weblog on the necessary steps for activating and carrying out successful data extraction, as mentioned in your first weblog.


    1. Hi Demetrius,
      a lot of work to do…
      anyway, I will try not to disappoint your expectations!
      Thanks to you…
  3. Former Member
    Great information. However, when writing your weblog about the generic delta, it would be great to hear how you can create a generic delta from a table without a timestamp, e.g. VBPA. We have joined VBPA with VBAP to get ERDAT and AEDAT, but we couldn't find a way to use the generic delta, since in a delta you want to capture both documents created and documents changed since the last delta.


  4. Former Member Post author
    Hi Roberto,

    This blog is also informative, but you did not give additional focus to the ABAP exit (the most critical part of this customization process, with a direct impact on overall performance and data correctness).

    As we all know, most of the time a field we require is not readily filled automatically by SAP, and we must do our own SELECTs on various tables inside the exit. If not given attention, these SELECTs will seriously affect overall extraction performance.

    I wish you could write a new blog about this, focusing on the proper techniques for implementing custom ABAP in the exit.


    1. Mark Finnern
      Hi Jkyle,

      Better than pointing out shortcomings in this excellent post, you should create your own weblog post:

      Additional aspects to consider that Roberto could not fit into his Weblog post …

      At the bottom of the left-side navigation you can find the link “Become a Blogger”; do it and I will approve.

      Looking forward to your Weblog posts, Mark.

      1. Former Member Post author
        Hi Mark,

        Roberto Negro has been my BW idol ever since I joined SDN. I always give praise to his posts and he knows it. I just wanted to somehow request that he tackle user exits more… I did not mean to focus on his shortcomings…

        My sincere apologies if my request/comment was somehow not well written…

        Again, my apologies…


  5. Former Member
    Hi Roberto,

    Great blog, and thanks for taking the time to bring the “miracle” of LIS a bit closer (I know, for many people including me there are still questions popping up). However, as John mentioned, the most urgent issue with LIS used in a production environment is its time-critical initialization. It is more than tough to explain to your boss that you have to shut down the system for users because we have to do an initialization (whatever reason might cause this situation). And when you explain the time window, which of course always depends on the data volume, most IT managers just turn pale ;-). Even if you parallelize the initial data load process on the production server, it needs time. Time that is critical for your business and, mostly, costs money. If today were Christmas ;-), I'd wish for more tips on how to speed up the process and make it a success on the first run; particularly all inventory-related initial processes.

    Again, thanks a lot for your time and your valuable tips in your blog. I am looking forward to read more on LIS.

    Have a nice day,

  6. Former Member Post author
    Hi Roberto

      Can you explain how the enhanced datasource can handle the delta? You said that we have to delete the statistical data using LBWG. After deleting the statistical data, how can it recognize the delta?

    bhanu prakash

  7. Former Member Post author
    I wanted to provide more information on this weblog as far as enhancing goes.
    Regarding “Using the LIS enhancement on available communication structures”:

    Roberto says: “One method of adding user-defined fields is the following: add the required fields to the communication structures (MCVBAK, MCVBAP and so on) using the append method (via SE11) and then use the LIS customer exits to fill the field.”

    You don't need to use the LIS customer exit to fill the field when the field you are seeking to use comes from the same tables from which the extractor already pulls. The secret is to use the SAP data element name directly when you enhance the communication structures (MCVBAK…). The LO function module MCEX_BW_LO_API fills the fields automatically for you. And yes, before delta. I have got this to work in many of my LO datasources.

    I needed to pull LIFNR from AFVC in PM Operations.
    DataSource 2LIS_17_I3OPER uses the Logistics cockpit to extract data from the tables AUFK, AFRU, AFVC, AFIH and AFKO. The routing number is used to join table AFKO to AFVC. Extract structure = MC17I30OPR.
    Vendor (LIFNR) is not provided by the business content communication structures MCVGIH or MCAFIH, but since I know the extractor pulls from those tables, I took a chance. I appended one of the communication structures, MCVGIH, with my field using the SAP namespace. It didn't matter which one I added it to, I just picked one.
    LIFNR   LIFNR   CHAR  10  etc.
    I saved, activated and went back to my Logistics cockpit, found the field I had just added on the right and moved it across to the left. I unhid the item in the datasource selections and ran the setup tables. I conducted an extraction and the LIFNR field was populated automatically – NO LIS CUSTOMER EXIT REQUIRED.

    This goes for any datasource modification, Logistics cockpit or not. If it uses a function module, the ABAP code performs a MOVE on the entire table. Luck may have it that your enhanced field will be populated as well, AS LONG AS YOU NAME YOUR FIELD EXACTLY AS IT IS NAMED IN THE SAP R/3 TABLE. Therefore, think twice before jumping to the append and adding your custom ZZ data element. You will get a WARNING on your append saying that the field is not within the customer namespace, but that is just a warning and can be ignored.

    Haven’t found any downside to this yet but if anyone does, add to this weblog.

    1. Hi dear,
      really, thanks for adding your precious contribution to this weblog (right now I really need to update its content)!
      Anyway, you forget that I wrote “One method of adding user-defined fields”… I'm talking about USER-DEFINED FIELDS, not the standard ones already available from the source application tables covered by the LIS communication structures…
      What you suggest simply makes optimum use of the MOVE-CORRESPONDING statement in the standard mechanism, and there is already an OSS Note that suggests this kind of approach (look at OSS Note 410799 “Enhancement of line item extraction FI-GL, FI-CIS” to get an idea…).
      Anyway, the real issue in your approach is the usual one: what you add in this way continues not to be delta-relevant… in other words, as you surely know, if in the source system only the added field changes and nothing more… well, “Houston, we have a problem!”, because no delta record will be generated…
      Or better, this is not always a problem, since the field may not be relevant (as a business requirement) from a delta point of view.
      Besides that, the approach is not valid for fields that don't belong to that specific set of application tables… and, moreover, it is not valid for other REAL custom fields…
      Adding a field to a communication structure and filling it via the LIS customer exits, instead, assures delta event coverage!
    2. Former Member

      Can you please let me know how to enhance the communication structure for 2LIS_02_ACC? I would like to have the details of invoice and goods quantities in my purchasing datasource.

      Looking forward to your reply; thanks in advance.


  8. Hi Roberto,
    Thank you very much for sharing your knowledge and experience in this area. I deeply appreciate it after reviewing all your documents. However, I currently have an issue with the DataSource 2LIS_02_SCL. I enhanced it with Scheduled Qty (MCEKET-MENGE) and Received Qty (MCEKET-WEMNG). I used RSA3 to validate these 2 fields against the EKET table, and the Received Qty is not correct. I am unable to figure out why this field is not populated correctly. Would you please help? Thank you.
    Jennifer H. Trieu
    OKIDATA Americas, Inc
    Systems Analyst
    1. Former Member
      Hi Roberto,
      I had some confusion about the PP datasource 2LIS_04_P_COMP and the LIS structure S026.

      Is there any relation between these two? How is S026 used in LO extraction for PP data in application component 04?

      The 2LIS_04_P* datasources do not extract any data unless the PPIS setting for plant + order type is configured (Tcode OPL5). Does this setting update the old LIS structure S026?
      Please help.

      Thank you for all the knowledge you have been sharing.


  10. Former Member Post author
    Great article. I have a question from an R3 developer.

    MCEKKN is not used in the cockpit for purchasing data. Is it possible to add a whole communication structure to the extractor, to add its list of fields within the cockpit?



    1. Hi Caroline,
      I think you can do it (as you can with other additional fields), but, as usual, you have to manage the delta mechanism issue…
      Anyway, reading OSS Note 718887, it seems normal for SAP to build a separate infostructure on MCEKKN (and then you would integrate that data with the other purchasing flows)…
      Let me know !
      1. Former Member Post author

        Do you know how one would go about this? I am more the data model person on the BW side, and we have an R3 developer doing the extractor enhancements. Since I have not worked much on the R3 side, I don't know how to assist with this.

        If you could give us some direction that would be great!


  11. Former Member Post author
    Hi Roberto, I've read the blog and it's fantastic!! I'm really interested in enhancing the LIS communication structures, but now I have a doubt: I want to enhance the DS 2LIS_02_SCL with EKKN-KOSTL. But I could have 2 (or more) KOSTL values for one schedule line (or purchasing document item), so if I add this field to the LIS communication structure, will I get both records with the correct quantity? I hope the question is clear…
    Alex
    1. Hi Alex,
      you cannot have more records if you don't create them in an enhancement exit… there is no standard mechanism to obtain this (if not already provided by the business content…).
      I think you have to decide, with a rule, what the extractor has to do in this case…
  12. Former Member Post author
    Hi Roberto

    Great work.

    Could you tell me whether all datasources are affected if you change one of the LIS extractors in a certain area? For example, if you change one of the 2LIS_13* extractors, can it then influence the other 2LIS_13* extractors?

    Kind regards

  13. Former Member Post author
    Hi Roberto,

    Thanks for your weblog, it was very informative.

    We are facing a situation where we want to do inventory analysis based on the local currency as well as on the second local currency/group currency. The SAP standard extractors only get the value in the local currency.

    I would like to know if you have faced this scenario and what would be the best way to get the value in the second local currency/group currency (BSEG-DMBE2) out to BW.



  14. Former Member Post author
    Hi Roberto,

    Thank you for your blog; it's very helpful.

    I face a problem: after enhancing the purchasing structure, I added 2 more fields from the communication structure.

    On the R/3 development system, all is working fine.

    On staging and PRD, the init load (setup tables) is working fine, but when I schedule the delta load it displays an error: the structure has changed. But from DEV we already transported all the changes.

    Can you give me an idea how to solve the problem?

    thanks in advance,


  15. Former Member Post author
    Hi Roberto,

    I have an issue regarding the Activation of the Extract Structure in LBWE.

    I deleted a field (MCEKET-BANFN, Purchase Req.) from MCEKET (SE11),
    the reason being that we wanted to replace MCEKET-BANFN with EKPO-BANFN.

    The idea was to eliminate the EKET field and use the EKPO field in the structure list.

    Now I am unable to activate my structure in LBWE for 2LIS_02_SCL… during activation I get this error:
    End phase 002 ***********************************************************
    Message no. D0322

    Can anyone please help?


  16. Former Member
    Roberto, I think there is a typo in one of the lines, which says “extraction queue (SM13)”; it should be “update queue (SM13)”. Correct me if I am wrong. Good going; we expect to see more blogs from you ASAP 🙂

    Thanks in Advance,

  17. Hi Mr. Negro.

    I have a BIIIIG question… I have worked with BI for 2 years, but I'm still learning, as this is my first BI implementation for SD.

    Does the LO Cockpit work with LIS structures like S546 (S5nn)? Or does it only work with standard structures (like S260, S261, S262)?

    I need to extract some information to BI and it is in the LIS structures S546, S547 and S549. As they are 3 different structures, I thought I'd better use the LO Cockpit, but reading the manuals I gather that it doesn't work with customer LIS structures…

    Please, please, please help me… I need to solve this ASAP so as not to let my project fail…

    If I have to extract the LIS data the old way, how can I join info from LIS S546 and S547? What is the best practice for this?

    Thank you so much.

    Karim Reyes

  18. Former Member Post author
    Hi Roberto,
    If you add a field directly to the tables (e.g. VBAK, VBAP), how does it become available in LBWE? Does it automatically get inserted into the communication structure, or does it happen some other way?
  19. Applicon Konsulenter

    We are using the BW DataSource 2LIS_02_SCL from the LO Cockpit and have added these two fields from the Customizing cockpit (LBWE):

    Apparently these two fields are NOT updated in BW by a delta load. In the ECC table EKET these two fields have changed for a given purchase order, but the change is not added to the delta queue. The change is triggered by goods issue or goods receipt.

    Please advise – we have tried OSS but to no avail.


    1. Former Member

      I don't think this is a delta issue. It looks like the schedule line extractor works properly only under quite specific conditions. We experienced the same issue (with WEMNG). First, it's necessary to load the data into BW using only addition for WEMNG. Then, it's important to know that, in doing so, the quantities will not be correct at schedule line level, but only at item level. Just have a look at the following note: 617240.

      This is because of the program running to fill the delta queue.

      Best regards


  20. Former Member
    Hi Roberto,

    Thank you for your informative blog, enjoyed reading it.

    Question :
    If a scenario demands a cross-application enhancement,
    e.g. enhancing a shipment-relevant field in a delivery datasource/extractor, or a delivery-relevant field in an invoice-relevant datasource:

    Is the delta relevance of these enhanced fields retained by enhancing the datasource at communication structure level? I guess, the delta of these enhanced fields would still be lost, since they are not included in the events that are used to create the delta.

    (ie a shipment document (our field) may change without change in delivery document).

    In such cases, is the only way to merge the enhanced field from its corresponding (shipment) datasource in the DSO?

    Can you please let us know if there is an alternative or better way to cover this scenario.

    Best regards,

  21. Former Member
    Dear Roberto,

    All your blogs are great and I learnt a lot.

    I am a little confused about the LIS and LO extractors. My confusion is that, in your previous article, you mentioned that LIS is the old technique and LO is the new one. Does that mean that if we use LO we are not using LIS?

    Now, what is 2LIS_11_*? Is this not the same LIS?


  22. Andreas Förner

    Dear Roberto,

    the Blog is a great help.

    I have no problem enhancing datasource 2LIS_04_MATNR with several fields.

    But when adding field VFMNG to structure MCAFPO with include ZZAFPO, I always get the error “MCEX 032”. In table TMCEXCFS the field has STATE = ' '.

    Can I change the STATE manually from blank to 'A'?

    Or is there another way to activate field VFMNG in the datasource?



