
How to Retrieve Lost Logistics Queues


This document describes the procedure for reconstructing lost or missing queues (LBWQ) for the Logistics applications.

In day-to-day operations, queues can go missing for several reasons: a database error, problems with RFC, accidental deletion of a queue, or other causes.

This document explains how to retrieve the missing data from the queues without a re-initialization and without downtime.

SAP delivers transaction LBWR and table MCEX_DELTA_BACK to support this functionality; see OSS Note 1008250.

Procedure Steps:

Start transaction LBWR.


i] Enter the name of the application whose backup is to be considered, in the form MCEX<nn>, where nn is the application number: MCEX02 for Purchasing, MCEX11 for Sales, and so on.
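The MCEX<nn> naming convention can be illustrated with a short sketch. The helper below is hypothetical (not an SAP API), and the application-number mappings listed are only the ones mentioned in this document; check transaction LBWE for the full list in your system.

```python
# Hypothetical illustration of the MCEX<nn> queue-naming convention.
# Only the application numbers mentioned in this document are listed.
LOGISTICS_APPLICATIONS = {
    2: "Purchasing",
    11: "SD Sales",
    12: "LE Shipping",
    46: "Global Trade Management",
}

def queue_name(app_number: int) -> str:
    """Build the extraction queue name for a logistics application number."""
    return f"MCEX{app_number:02d}"

print(queue_name(2))   # MCEX02 (Purchasing)
print(queue_name(11))  # MCEX11 (Sales)
```

The two-digit zero-padded number is the whole convention; the reconstructed queue later appears with a `_BACK` suffix (e.g., MCEX12_BACK), as shown further below.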

ii] Select “Customizing Changes Only” in the Reconstruction Extraction Queue section. This customizing must be done first so that the system begins filling the backup tables.


iii] Under the Customizing section, fill in the entries below –

–          No. of Collective Processing

–          No. of Days with Backup Data


  1. No. of Coll. Processing – Determines how many collective runs are extracted to the backup table. Base the number on the V3 setup in the system: if a V3 job runs hourly for application 12, there are 24 collective runs per day, so 48 runs cover 2 days of backup.
  2. No. of Days with Backup Data – Specifies how many days of data to keep in the backup tables. The larger the number, the more space is occupied, so 2 or 3 days is usually enough. The system cleans up the backup data automatically according to these customizing entries.
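The arithmetic behind the first setting can be sketched as follows. This is a minimal illustration with assumed names (`collective_runs_for_backup`, `period_hours`); these are not SAP parameters.

```python
def collective_runs_for_backup(period_hours: float, backup_days: int) -> int:
    """Number of collective runs to keep, given the V3 job periodicity.

    With an hourly V3 job (period_hours = 1) there are 24 collective
    runs per day, so 2 days of backup require 48 runs.
    """
    runs_per_day = 24 / period_hours
    return int(runs_per_day * backup_days)

print(collective_runs_for_backup(1, 2))  # 48 runs for 2 days of hourly runs
```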

For Example:

If you enter –

  1. No. of Coll. Runs with Backup as 96 (representing 4 days of hourly runs), and
  2. No. of Days with Backup as 2 days,

then the delta data of the last four days is retained in backup table MCEX_DELTA_BACK.


If the periodicity of the collective runs changes, this retention period also changes. However, because of the BACKUP_DAYS setting, it always amounts to at least two days.

The settings can also be viewed using report RMBWV3RE.

Note: If a value for BACKUP_DAYS (the number of days for which backup data is to be retained) is also defined, the larger number of entries is retained in the backup table.
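The rule can be sketched as follows (illustrative only; the actual housekeeping logic runs inside SAP, and the function name is assumed):

```python
def effective_retention_days(coll_runs_with_backup: int,
                             backup_days: int,
                             period_hours: float = 1.0) -> float:
    """Effective retention in days: whichever setting keeps more data wins."""
    runs_per_day = 24 / period_hours
    days_from_runs = coll_runs_with_backup / runs_per_day
    return max(days_from_runs, backup_days)

# The example above: 96 hourly runs (4 days) vs. BACKUP_DAYS = 2 -> 4 days kept.
print(effective_retention_days(96, 2))  # 4.0
```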

Reconstructing the data:

Execute report RMBWV3RE or transaction LBWR.

Set the processing mode to ‘Reconstruct the Queue from the Backup Table’.


Enter the timestamp or the number of collective runs to be extracted from the backup tables into the queue, then press F8. Base these values on your initial analysis of which data is missing in the target system, in order to avoid duplicates in BW.




Confirm the subsequent prompt with Yes.


This brings back the data of the last 4 days, given the number of collective runs (assuming the collective run has a 1-hour periodicity).

Once done, check the update in LBWQ.


The queue name MCEX12_BACK indicates that the data has come from the backup (reconstruction) tables.

After execution, the entries from the backup table appear in LBWQ and can be pushed on to the delta queue once V3 runs.

Entries from MCEX12_BACK:


These will be moved to RSA7 for further extraction to BW.

Display data from the backup tables:

You can view the data in transaction LBWR with the following settings –


Select the kind of data to view – header, item, or schedule line – choose the number of records for the output, and execute to see the results:


Entry from the backup table:



  • Make sure that authorization object M_QU_RE is assigned only to a restricted team who can judge which data to pull into BI without duplicating it.
  • Analyze the data thoroughly in BW to avoid duplicates. If data already exists for a particular day and the queue is reconstructed using the timestamp, the relevant data in BW must be deleted before reconstructing!
  • If the document posted is a later version of the missing data, it likewise has to be deleted in BW before planning the reconstruction.
  • To perform a reconstruction, the customizing must be in place for the respective application.
  • The settings can be transported to quality and production after proper testing of the parameters.

Applications in Scope:

1.     All logistics applications. Initially the feature was not available for Global Trade Management (application 46), but this was added with OSS Note 1570406.

Reference Notes:

1.       Note 1570406 – GTM: Backup table for queue MCEX_46

2.       Note 1008250 – Backup table for the queues of logistics extraction into BI

Comment (Viren Pravinchandra Devi):

Thanks. So there would be two queues for the same DataSource/target in the extraction queue?

Reply (Blog Post Author):

Backup tables are not mandatory, but they are helpful in any case to retrieve the lost queues.

Comment (akshara akshara):

W.r.t. No. of Coll. Runs with Backup vs. No. of Days with Backup Data – the one that covers the larger amount of content is stored in the backup tables, right? Can you please give the process steps?

Reply (Blog Post Author):

The higher value of the two – collective runs or backup days – is considered for the backup tables. The steps listed above can be used to perform this.
