
1.  There are three SAP-delivered transactional data sources for stock management.

2LIS_03_BX: Always needed to carry out the initialization of the stocks.

 

2LIS_03_BF: Initialize this in the source system if you need historical data; otherwise an empty init can be done so that only future delta records are loaded. If the source system is new, there is no need for an initialization.

2LIS_03_UM: Only needed if revaluations are carried out in the source system. This data source is useful if material prices are adjusted from time to time; otherwise it won’t extract any data.

 

2.  Checklist for the source system. This needs to be carried out before the initialization of the above three data sources.

 

A.   Table TBE11: Maintain the entry ‘NDI’ with the text ‘New Dimension Integration’ and activate the flag (Note 315880).

B.   Table TPS01: The entry should be as below (Note 315880):

PROCS – 01010001

INTERFACE  – SAMPLE_PROCESS_01010001

TEXT1 – NDI Exits Active

C.   Table TPS31: The entry should be as below (Note 315880):

PROCS – 01010001

APPLK  – NDI

FUNCT – NDI_SET_EXISTS_ACTIVE

D.   Tcode – MCB_

In most cases you need to set the industry sector to ‘Standard’. For more information, see Note 353042.

E.   Tcode – BF11

Set the indicator to active for the Business Warehouse application entry. This entry may need to be transported to the production system (Note 315880).
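The source-system checklist above can be captured as plain data for quick verification. Below is an illustrative Python sketch (not ABAP, and not an SAP-delivered check): the TPS01/TPS31 field names are taken from the entries listed above, while the TBE11 keys are descriptive placeholders rather than real table columns.

```python
# Expected source-system entries from the checklist (per Note 315880).
# TPS01/TPS31 field names come from the post; the TBE11 keys below are
# descriptive placeholders, not actual table columns.
EXPECTED_ENTRIES = {
    "TBE11": {"entry": "NDI", "text": "New Dimension Integration", "active": True},
    "TPS01": {"PROCS": "01010001",
              "INTERFACE": "SAMPLE_PROCESS_01010001",
              "TEXT1": "NDI Exits Active"},
    "TPS31": {"PROCS": "01010001",
              "APPLK": "NDI",
              "FUNCT": "NDI_SET_EXISTS_ACTIVE"},
}

def missing_entries(actual):
    """Return (table, field) pairs that deviate from the checklist."""
    problems = []
    for table, expected in EXPECTED_ENTRIES.items():
        row = actual.get(table, {})
        for field, value in expected.items():
            if row.get(field) != value:
                problems.append((table, field))
    return problems
```

If `missing_entries` returns anything, the corresponding setting from steps A–C still needs to be maintained in the source system.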

 

3.  After filling the setup tables, check the data in the fields BWVORG, BWAPPLNM and MENGE. If no data is available in these fields, some of the settings mentioned in the above checklist are missing in R/3. Correct the issue and rerun the setup.

 

4.  Data staging through a DSO is not allowed for the BX extractor. The data should be loaded directly from the extractor to the cube, and only once. Choose the extraction mode ‘Initial Non-cumulative for Non-cumulative values’ in the DTP.

 

5.  A DSO is possible for BF. If you are creating a standard DSO, choose the fields MJAHR, BWCOUNTER, MBLNR and ZEILE as key fields. Some of these fields won’t be available in the standard data source, but the data source can be enhanced using the LO Cockpit (transaction LBWE) to add them. In addition to these, other fields are possible depending on the DSO structure.

Note 417703 gives more information on this.
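To illustrate why these fields make a good DSO key, here is a hedged Python sketch (the document numbers and quantities are made up): records that share the same material-document key overwrite each other, so a reloaded or corrected document line does not create a duplicate.

```python
# Illustrative sketch: a standard DSO keyed on material-document fields
# overwrites records that share the same key instead of duplicating them.
KEY_FIELDS = ("MJAHR", "MBLNR", "ZEILE", "BWCOUNTER")

def activate(dso, records):
    """Merge new records into the DSO image, overwriting on the key."""
    for rec in records:
        key = tuple(rec[f] for f in KEY_FIELDS)
        dso[key] = rec          # same document line -> overwrite, not append
    return dso

dso = {}
activate(dso, [{"MJAHR": "2011", "MBLNR": "4900000001", "ZEILE": "0001",
                "BWCOUNTER": "000", "MENGE": 10.0}])
# A reloaded (corrected) record for the same line replaces the old one:
activate(dso, [{"MJAHR": "2011", "MBLNR": "4900000001", "ZEILE": "0001",
                "BWCOUNTER": "000", "MENGE": 12.0}])
```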

 

6.  Point 5 is valid for UM as well. The key fields could be a combination of the MJAHR, MBLNR, ZEILE and BUKRS fields (Note 581778).

 

7.  The data load to the cube should follow the process below.

 

A.      Load the BX data. Compress the request with the stock marker update (leave the ‘No Marker Update’ option unchecked).

B.      Load the BF and UM init data. Compress these loads without the stock marker update (check the ‘No Marker Update’ option).

C.      The future delta loads from BF and UM should be compressed with the stock marker update (uncheck the ‘No Marker Update’ option).
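The compression rules in steps A–C can be summarized in a small decision sketch. This is illustrative Python only, assuming that the “marker option” in the post refers to the ‘No Marker Update’ checkbox of the compression dialog.

```python
# Illustrative sketch of the compression rules in steps A-C above.
# Returns True when the request must be compressed WITH the stock marker
# update ("No Marker Update" left unchecked), and False when it must be
# compressed without it ("No Marker Update" checked).
def compress_with_marker(datasource, load_type):
    if datasource == "2LIS_03_BX":
        return True                      # A: opening stock sets the marker
    if datasource in ("2LIS_03_BF", "2LIS_03_UM"):
        if load_type == "init":
            return False                 # B: history must not move the marker
        if load_type == "delta":
            return True                  # C: deltas advance the marker
    raise ValueError("unknown datasource/load type combination")
```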

 

8.  If in the future the cube needs to be deleted due to some issue, the load process should again be carried out as above (the init requests of BF and UM should be loaded first, and then the deltas should be processed).

 

9.  To check the data consistency of a non-cumulative cube, the standard program SAP_REFPOINT_COMPLETE can be used. To check the compression status of the cube, the table RSDCUBE can be referred to: before the compression of the BX request the ‘REFUPDATE’ field should be blank, and after the compression the value should become ‘X’. Check Note 643687 for more information.
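The RSDCUBE check from point 9 amounts to inspecting a single flag. A minimal Python sketch of that rule (illustrative only; in practice you would view the table contents in the system itself):

```python
# Compression status of the BX request, per the RSDCUBE/REFUPDATE rule:
# blank = BX request not yet compressed, 'X' = reference points updated.
def bx_request_compressed(rsdcube_row):
    return rsdcube_row.get("REFUPDATE", "") == "X"
```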

 

10. After the BX data load to the cube, the data won’t be visible via LISTCUBE. Only after compression can the data be seen by running a query on the non-cumulative cube.


24 Comments


  1. Joe Anonymous
    Hi Ranjit,

7/B: “Load the BF and UM init data.” -> Init with data or without data? Or does it not matter for the compression method (marker option checked or unchecked…)?

    Thx.

      1. Joe Anonymous
Thank you. And if you are not interested in historical changes in the inventory, should you choose init without data, with the marker option not checked? Thanks.
        1. Ranjit Rout
For an init without data transfer, you don’t need to take that request to the cube. Only load the subsequent deltas with the marker.

          Regards
          -Ranjit

  2. Sonal Patel
    Hi,

It’s a nice blog… the inventory topic has always been a hot one. I have a question about point number 4:
why can the BX data not be loaded to a DSO? Can you please explain this?

    1. Ranjit Rout Post author
      Hi

While loading the BX data, the cube must be notified that we are loading the initial status data. This can only be done with the extraction mode option “Initial Non-cumulative for Non-cumulative values” in the DTP. This option only appears in the DTP if the target is a non-cumulative cube. So if you load this data to a DSO first, you won’t get this option in the DTP, and the BX data couldn’t be marked as the initial status for the inventory data.

      Regards
      -Ranjit

  3. Ravinder Reddy
Hello friend,

I have done the same procedure, but when I checked the number of records added to the InfoCube it is showing almost double in BX, and the next deltas with BF are also adding almost the same number of records.

While loading BX to BW I used Full update as the update type.

Could you please let me know whether I need to do it again with the non-cumulative update type, or is there no need?

I am using the snapshot method.

    Thanks in advance

    Ravinder Reddy karra

    1. Ranjit Rout Post author
      Hi

This blog is only for the non-cumulative method; it won’t help you with the snapshot method.

Yes, you can try the non-cumulative method. For BX, choose “Generate Initial Status” in the InfoPackage as the update method to load it to the PSA, and the “Initial Non-cumulative for Non-cumulative values” update method in the DTP to load it to the non-cumulative cube.

      Regards
      -Ranjit

  4. Zeeshan haider
After we delete the data from the cube for any reason, the cube is now empty and there is also no request available in the cube. Do we now need to repeat the loading process as you described, in the same steps, i.e. BX with the marker update (unchecked) and then BF and UM without the marker update (checked)?
    1. Ranjit Rout Post author
Yes, the loading should follow the same steps. Remember that while reloading you have to first load the init request of BX (compress with the stock marker), then the init requests of BF and UM (compress without the stock marker), and then load the deltas of BF and UM with the stock marker.
  5. Swaroop Chandra
    Hi Ranjit,

         Thanks for this very helpful weblog, really good info.
I implemented this process in my company, following the SAP document on how to handle inventory scenarios. I did the BX load, then pulled some historical movements, and then the deltas have been running. I did all of the compression steps in the same way that you have described.

I am now seeing some mismatch between R/3 and BW: not for all materials, but around 200+ materials out of 10,000 or so don’t match.

    Is there anything that can be done to fix this or do I have to reinit the whole thing?

    Please let me know, thanks.
    Swaroop.

    1. Ranjit Rout Post author
      Hi,

      Have you loaded the UM data to cube?

Check if material revaluation happens in your company. If yes, then load the UM data; that will fix your issue.

      Thanks

  6. kannan natarajan
    Hi Ranjit,

Your blog is easily readable and checklist-like. Good one. Can you please clarify this:

    What is the significance of with stock marker and why it varies with each datasource?

    A.      Load the BX data. Compress the request with stock marker(uncheck the marker option).

B.      Load the BF and UM init data. Compress the loads without the stock marker(Check the marker option).

    C.      The future delta loads from BF and UM should be compressed with Stock marker(uncheck the marker option).

    1. Ranjit Rout Post author
      Hi

The stock marker is a must in non-cumulative cubes. The marker links the movements (issues, receipts and the initial balance) with reference to the validity field (in most cases it’s the calendar day).

It varies because of the nature of the data we get from the different data sources. Since BX gives the initial balance, I want to mark it as my starting point. History from BF and UM only contributes to the initial balance, so I don’t want the marker set for those loads (but you still need to compress them). And the future deltas from BF and UM should also be marked, as I need to move my marker position forward to the current state.

  7. Nuno Pires
Has anyone faced performance problems loading from the movements (BF) DSO to the InfoCube? Although the load is delta, it is taking a lot of time (using BW 3.5). Do you know how I can improve this?

    Thanks,
    Nuno

  8. Mark Dodgshun
    Hi Ranjit,
We want to pass inventory stock on hand to QlikView. Where do I find these initial stock-on-hand records, so that we can start QlikView with them and then load the material movements?
  9. Alok Kashyap
    Hello Ranjit,
In case we have duplicate data and we have to remove it and then repair the load, should we afterwards compress with or without the marker?

For example: we detected a duplicate entry for some material docs in the PSA and cube. Now we want to do a selective deletion from the cube, based on posting date, for the requests containing all such material docs (as we can’t delete a compressed inventory request from the cube). The respective PSA request has also been deleted. Afterwards, we filled up the setup table and the data has been repaired. Now we have to compress that newly loaded request. Should I compress with the marker or without the marker?

    1. Ranjit Rout Post author
      If the posting dates of the material docs are before your BX initialization date then compress without marker else compress with marker.
      1. Alok Kashyap
Cool… your one-liner was more than enough to satisfy me… thanks a lot 🙂 Anyway, we have raised an OSS message (0000505291) with SAP, not about the marker, but to find out the reason for the duplicate entries in the PSA. There is even an SAP composite note which explains that behaviour. Can you explore some possibilities why we have duplicates?
