
The following blog documents how to debug the outline agreement or purchase order creation process in an SAP ERP system when it is triggered by a publish event from SAP Sourcing.

Prerequisites: In the ECC system, activate logging of the publish events originating from SAP Sourcing. To do so:

  1. In transaction SPRO, navigate to “Integration with other mySAP.com Components – SAP Sourcing – Make settings for Integration with SAP Sourcing”.

     [Screenshot: SIMG.png]
  2. Check the ‘Activate Log’ box in the Technical Settings section (as shown below).

     [Screenshot: SIMG2.png]

Once the above configuration settings are enabled, whenever a publish event is triggered in SAP Sourcing, the payload (the data sent from SAP Sourcing) is captured and saved before an outline agreement or purchase order actually gets created in ERP. One can then reuse this data to check for the reasons of failure or to debug the process.
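For completeness, the logged entries can also be inspected programmatically. The report below is only a minimal sketch: the blog does not name the underlying storage, so the table ZES_PUBLISH_LOG and its fields are hypothetical stand-ins for wherever the payloads are actually persisted; in practice the entries are looked up through BBP_ES_ANALYZE itself.

REPORT z_es_payload_overview.

* HYPOTHETICAL table and structure: the real storage behind BBP_ES_ANALYZE
* is not named in this blog.
TYPES: BEGIN OF ty_log,
         object_id  TYPE c LENGTH 35, " e.g. OA* = outline agreement, PO* = purchase order
         ext_ref    TYPE c LENGTH 35, " external reference = document ID from SAP Sourcing
         created_on TYPE d,
         created_at TYPE t,
       END OF ty_log.

DATA: lt_log TYPE STANDARD TABLE OF ty_log,
      ls_log TYPE ty_log.

* Select only the outline agreement payloads logged today, mirroring the
* OA* filter and the date restriction used on the BBP_ES_ANALYZE screen.
SELECT object_id ext_ref created_on created_at
  FROM zes_publish_log                        " hypothetical table name
  INTO TABLE lt_log
  WHERE object_id  LIKE 'OA%'
    AND created_on =  sy-datum.

LOOP AT lt_log INTO ls_log.
  WRITE: / ls_log-object_id, ls_log-ext_ref, ls_log-created_on, ls_log-created_at.
ENDLOOP.

Replacing ‘OA%’ with ‘PO%’ gives the equivalent list for purchase order payloads.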

Debugging the Publish of a Master Agreement to ERP

In the ERP system, start transaction BBP_ES_ANALYZE and, on the selection screen, use the value help with the filter criterion “OA*”, which represents an outline agreement.

[Screenshot: ANA1.png]

Determine the payload by filtering on the date and time at which the Publish action occurred in SAP Sourcing. You can also search by the master agreement number, i.e. the document ID from Sourcing, entered as the ‘External Reference Number’.

[Screenshot: ANA2.png]

After selecting the appropriate entry, the payload for that specific publish event is displayed. The data can be verified, and the payload can then be tested by choosing the ‘Execute’ or ‘Debug’ option.

[Screenshot: ANA3.png]

To start debugging the process, click the ‘Debug’ icon on the application toolbar; this allows for a detailed analysis of the payload as it is executed.

[Screenshot: ANA4.png]

The outcome of both the Execute and Debug actions is an outline agreement being either created or updated. This can be verified using transaction ME33K.
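As a quick programmatic cross-check (simply an alternative to opening ME33K manually), the result can also be looked up in the standard purchasing document header table EKKO, for example:

DATA lt_ekko TYPE STANDARD TABLE OF ekko.

" EKKO is the purchasing document header table; document category (BSTYP)
" 'K' marks contracts/outline agreements, AEDAT is the creation date.
SELECT * FROM ekko
  INTO TABLE lt_ekko
  WHERE bstyp = 'K'
    AND aedat = sy-datum.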

The same procedure can be repeated for purchase orders, except that on the initial screen of BBP_ES_ANALYZE we start by looking for the purchase-order-related payloads published from SAP Sourcing (as shown below).

[Screenshot: ANA5.png]

If data-related errors are noticed at the end of the ‘Execute’ or ‘Debug’ step, they need to be fixed in SAP Sourcing and the document can then be republished.

PS: Not all errors can necessarily be fixed from SAP Sourcing; depending on the particular case, some need to be addressed in ECC and some in the PI mappings.

Cleanup of the Payloads:

   If the ‘Activate Log’ setting is left turned on, a considerable amount of payload data will accumulate over time. It is good practice to

  1. Enable the log during implementation or testing, and when there are unexplained errors,
  2. Deactivate the logging when it is not needed, and
  3. Clean up payloads periodically.

The clean-up of logs can be triggered with transaction ‘BBP_ES_RFC_DELETE’, which allows for selective deletion of the saved payloads.
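If the log has to stay active for a longer stretch (for example during a test cycle), the deletion can also be scheduled as a weekly background job. The snippet below is only a sketch under assumptions: Z_PLACEHOLDER_REPORT stands for whatever report sits behind transaction BBP_ES_RFC_DELETE in your release, and the selection variant Z_WEEKLY would have to be created on its selection screen first.

DATA: lv_jobname  TYPE btcjob   VALUE 'ES_PAYLOAD_CLEANUP',
      lv_jobcount TYPE btcjobcnt,
      lv_weeks    TYPE tbtcjob-prdweeks VALUE '001',
      lv_report   TYPE sy-repid VALUE 'Z_PLACEHOLDER_REPORT'. " hypothetical name

" Create the background job definition
CALL FUNCTION 'JOB_OPEN'
  EXPORTING
    jobname  = lv_jobname
  IMPORTING
    jobcount = lv_jobcount
  EXCEPTIONS
    OTHERS   = 1.

IF sy-subrc = 0.
  " Add the deletion report as a job step, using a pre-created variant
  SUBMIT (lv_report) USING SELECTION-SET 'Z_WEEKLY'
         VIA JOB lv_jobname NUMBER lv_jobcount AND RETURN.

  " Release the job and let it repeat weekly, starting at 02:00 system time
  CALL FUNCTION 'JOB_CLOSE'
    EXPORTING
      jobcount  = lv_jobcount
      jobname   = lv_jobname
      sdlstrtdt = sy-datum
      sdlstrttm = '020000'
      prdweeks  = lv_weeks
    EXCEPTIONS
      OTHERS    = 1.
ENDIF.

Whether such a periodic job is worth setting up depends on the publish volume; as discussed in the comments below, for low volumes an on-demand clean-up (or simply deactivating the log) is usually enough.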

Summary:

     BBP_ES_ANALYZE is a useful transaction and can be especially helpful in tricky situations where publish events fail without much information.


2 Comments


  1. Iftekhar Alam

    Thanks for this informative blog. Do you suggest activating the log in the production environment and then setting up a batch job to clean the log files, say scheduling a clean-up job every week?

    Where are the log files saved in the system? How can we do the sizing of the log file content, so that we can plan how much space should be allocated for storing a week’s worth of logs in production for analyzing any incident?

    Thanks

    Iftekhar Alam

    1. Venkateshwara Prasad Kothapalli Post author

      Hello Iftekhar Alam,
         All of the above occurs in the ERP system. The test data (for analysis) is stored in cluster tables, and there is no need for sizing of this content (note that it is just the payload XML coming from SAP Sourcing that is written to the database).

      The number of entries written to the cluster depends directly on the number of business documents you are publishing from SAP Sourcing.

      If you expect a considerable number of business documents (in the hundreds) per week, then it would make sense to set up a batch job to clear it up. If not, you could either plan to do this on demand or, better, deactivate this log in the production environment unless it is really needed to test a problem that cannot be checked any other way. In other words, you could keep it activated on your sandbox at all times, but in production keep it switched off, activate it on demand, and be sure to deactivate it as soon as the problem is resolved, perhaps cleaning up the logs at that time as well.

      Hope this answers your questions..

      Prasad

