
CIF is the standard interface between APO and ECC used to transfer master data (uni-directional) and transaction data (bi-directional). Very often, however, a master data issue prevents transaction data from flowing into the target system, causing what is commonly known as a stuck CIF queue. This blocks subsequent data flow, but SAP provides an option for post-processing. Think of it as the tow truck that picks up your broken-down car on the highway and removes it, both for safety and so that traffic can move on. Post-processing does the same thing: it sets the failed records aside for subsequent analysis and corrective action, i.e. reprocessing the queues once the underlying cause is taken care of. However, every order for a given product-location combination creates a post-processing error record for the same reason, very similar to what is described in this blog on automating CIF Delta Reconciliation records.

Depending on the criticality and importance given by the business, production support team members end up reviewing post-processing errors, assessing them and preparing a report of causal factors (mostly master data issues) at least once a day; at one of my clients this was at least twice a day, with four reviews over the weekend during batch runs. This analysis of post-processing records can be very tedious and repetitive. Most of the time the causal factor is known, but you still need to go through each queue to assess it and then aggregate everything into a report.

So started the ideation phase: how to automate this regular CIF Post Processing records analysis? The high-level design is a custom report program that reads CPP error records between APO and ECC (both ways), looks up the relevant qRFC-generated Application Log, extracts the error class and matches it to a root cause from a mapping table to generate a summary report. The summary report output is by Product - Location - Transaction Data Object Type - CIF Error Type combination, with the individual CPP records per order aggregated by Product-Location and transaction data type. As an improvement, the summary report can also be sent directly from the APO system as an Excel attachment to an email.


The solution design was simple: execute a custom program with a selection screen by Product and Location that reads CPP error records for the corresponding ECC logical system, aggregated by Product-Location-Object Type (transaction data type). Next, for each unique combination, read the appropriate Application Log details, matching the qRFC Transaction / LUW ID against the External ID of the log. From the error message in the log details, pick up the error Application Area and Message Number, which are then used to read a mapping Z-table to determine the Explanation Text. Finally, present the data in a clean ALV output that can be downloaded / generated as an Excel file or emailed as an attachment to a particular user or distribution list.
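The actual implementation is an ABAP report program; as a rough illustration only, the end-to-end flow can be sketched in Python. All record structures, field names and the two lookup callbacks below are hypothetical stand-ins for the CPP table read, the Application Log read and the Z-table lookup.

```python
# Illustrative sketch of the report pipeline (hypothetical data model):
# 1. aggregate CPP error records by Product-Location-Object Type,
# 2. read the Application Log once per group via the qRFC LUW ID,
# 3. resolve the explanation text from the mapping table.
from collections import Counter

def build_summary(cpp_records, app_log_lookup, explanation_lookup):
    """cpp_records: list of dicts with product, location, object_type, luw_id."""
    # Step 1: count errors per Product-Location-Object Type combination
    counts = Counter(
        (r["product"], r["location"], r["object_type"]) for r in cpp_records
    )
    rows = []
    for (product, location, obj_type), n in sorted(counts.items()):
        # One representative LUW ID per group is enough, since all orders
        # in the group fail for the same underlying reason
        luw_id = next(
            r["luw_id"] for r in cpp_records
            if (r["product"], r["location"], r["object_type"])
               == (product, location, obj_type)
        )
        # Step 2: Application Area + Message Number from the log details
        area, msgno = app_log_lookup(luw_id)
        rows.append({
            "product": product,
            "location": location,
            "object_type": obj_type,
            "error_count": n,
            # Step 3: explanation text from the mapping Z-table
            "explanation": explanation_lookup(area, msgno),
        })
    return rows
```

In the real program the summary rows would then feed the ALV grid or the Excel/email output.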

The key to the whole process is the custom mapping table, which will be managed by the application support team, who analyse new error records as they appear and add them with suitable explanation text. Here is an example structure of the custom mapping table:

| Transaction Type | Location Type | Location | Application Area | Msg No | Explanation Text |
| --- | --- | --- | --- | --- | --- |
| * or specific R/3 Object Type (1 = Pur Req, 2 = Pur Order, 5 = Planned Order) | 1001 or 1002 | Location code or * | e.g. /SAPAPO/RRP | e.g. 502 | Purchasing Inforecord Validity mismatch with Transportation Lane Validity |

Next, let me walk through some of the key standard technical objects that are used for building the custom program.

Whether post-processing is switched on is determined from view /SAPAPO/V_BSGSET, where the assignment of Logical System to Business System Group is maintained. The field CIF_ERRHDL_MODE can be blank or S (Strict: termination on error), meaning a CIF queue error generates a record in the SCM Queue Manager / SMQ2 with status SYSFAIL (so no post-processing records are generated), or 1 (Postprocessing of error), which is what is required here: a CIF queue error during processing does not leave the queue hanging, and the records are available in CIF post-processing for analysis and reprocessing once corrective action is taken.
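The mode check itself is trivial; as a hedged, non-ABAP sketch, the interpretation of the CIF_ERRHDL_MODE value described above could look like this (the function name is mine, the value semantics are as stated in the text):

```python
# Hypothetical interpretation of CIF_ERRHDL_MODE from /SAPAPO/V_BSGSET.
# '' or 'S' = strict: errors stop the queue (SYSFAIL in SMQ2),
# '1'       = postprocessing: CPP records are written instead.

def postprocessing_active(cif_errhdl_mode: str) -> bool:
    if cif_errhdl_mode in ("", "S"):
        return False   # errors remain as SYSFAIL queues in SMQ2
    if cif_errhdl_mode == "1":
        return True    # errors land in CIF postprocessing (/SAPAPO/CPP1)
    raise ValueError(f"Unexpected CIF_ERRHDL_MODE value: {cif_errhdl_mode!r}")
```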

Based on this, the determination of CIF errors from SMQ2 or /SAPAPO/CPP1 is done. Note that if there are multiple entries due to multiple connected logical systems, the entry with SAP ind. marked as "X" is the one to consider.

In the next step, sort the post-processing errors by product-location combination for each object type (like Planned Order, Stock Transfer Requisition). Multiple errors for the same Product-Location-Object Type are possible due to different order elements, but there is no need to analyse each order element as long as the error text is the same. The qRFC Transaction ID of the error record is important, as it is passed to the Application Log selection screen as the External ID field to read the log details. The Application Log can be read using the CIF display program CIFQEVO2 in ECC and /SAPAPO/CIF_QUEUE_EVENT2 on the APO side. If two sets of Application Logs are available, skip the one with Sub-object Text = Error Handling (technical name EH); the other Application Log matching the LUW ID needs to be picked up for further processing.
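The log selection rule above (match the LUW ID as External ID, skip the technical EH log) can be sketched as follows; the log records are modelled as plain dicts and the field names are illustrative, not the real Application Log structure:

```python
# Hypothetical selection of the relevant Application Log for one error group.
# Real logs carry External ID and sub-object fields; names here are made up.

def select_relevant_log(logs, luw_id):
    """Pick the log whose External ID matches the qRFC LUW ID,
    skipping the technical Error Handling (sub-object 'EH') log."""
    candidates = [
        log for log in logs
        if log["external_id"] == luw_id and log["subobject"] != "EH"
    ]
    return candidates[0] if candidates else None
```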

Next, search for the CIF error text from the Application Log details in the mapping Z-table by matching Application Area + Message Number while considering the Transaction Type + Location Type + Location combination. Wildcard values are possible in Transaction Type (mapped to a single-character value based on the R/3 Object Type, as given above) and Location. Location Type, however, cannot be wildcarded and has to be maintained with valid values like 1001 (Plant), 1002 (DC), 1007 (MRP Area), 1010 (Customer), 1011 (Vendor). The custom table is read with wildcarded values first, followed by more specific values. This allows different explanation texts for the same post-processing error message and same Transaction Type but different locations / location types. For generic messages like XC023 (No Active Integration Model), the explanation text can be set differently for different Transaction Types (like Sales Orders vs. Purchase Reqs / POs vs. Planned Orders).
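The "wildcard first, specific wins" lookup can be illustrated with a small scoring sketch, again in Python with hypothetical row fields (the real lookup is a Z-table read in ABAP); a row matching the concrete Transaction Type or Location outranks a wildcard row:

```python
# Hypothetical lookup in the Z mapping table. '*' entries act as fallbacks;
# rows with specific Transaction Type / Location beat wildcard rows.
# Location Type is never wildcarded, per the design above.

def lookup_explanation(mapping_rows, txn_type, loc_type, location, area, msgno):
    best, best_score = None, -1
    for row in mapping_rows:
        if row["area"] != area or row["msgno"] != msgno:
            continue
        if row["loc_type"] != loc_type:              # must match exactly
            continue
        if row["txn_type"] not in ("*", txn_type):
            continue
        if row["location"] not in ("*", location):
            continue
        # Count how many fields matched specifically rather than via '*'
        score = (row["txn_type"] != "*") + (row["location"] != "*")
        if score > best_score:
            best, best_score = row["explanation"], score
    return best
```

With a generic `*` row and a Transaction-Type-specific row for the same message, the specific row's explanation is returned for that transaction type and the generic one otherwise.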

Finally, build the ALV report output with the header given below. If the email option is chosen, create an Excel file and attach it to the outgoing email.


System Flow : APO to ECC or ECC to APO

Total Error Count : nnnnn (sum of all Error Counts)

| Product | Location | Object Type | Error Count | System Message | Message No. | Explanation Text | Time of error |
| --- | --- | --- | --- | --- | --- | --- | --- |
| MATNR | WERKS | TEXT | NUMC | TEXT | TEXT | TEXT | DATS |
| &lt;Material&gt; | &lt;Plant&gt; | Purch Req | xx | Vendor can supply till dd.mm.yy | aannn | Purchasing Inforecord Validity mismatch with Transportation Lane Validity | |
| &lt;Material&gt; | &lt;Plant&gt; | Purch Order | yy | Net price must be greater than 0 | bbnnn | Pricing condition not maintained in Purchasing Inforecord for PO | |

Now let us see a few example screenshots of the report program. Here is the selection screen, where the user needs to select whether APO-to-ECC or ECC-to-APO post-processing records are to be analysed. The target system is always the connected ECC system and can be defaulted. Location and Product can be provided to filter the post-processing error records for analysis. The email and download options are self-explanatory.

CPP_analysis_selectionscreen.png

 
Here is the custom mapping table maintained with explanation text per post-processing error application message type and message number. Note that this can be set up to hold different explanation texts or corrective actions by Location or Location Type (e.g. one entry common for all DCs: location type 1002).

/wp-content/uploads/2015/05/cpp_analysis_mappingtable_696718.png

And finally, the report output in dialog mode, showing how 8 (Error Count) Purchase Requisitions (Object Type) for a given Product-Plant combination are stuck going from APO into ECC (System Flow) due to a Purchasing Inforecord validity date problem, as displayed in the Explanation Text field, along with the date / time when the error first occurred.

/wp-content/uploads/2015/05/cpp_analysis_reportoutput_696731.png

The goal of a significant time and effort reduction for the production support team was achieved, along with the following business benefits:

  • Significantly reduced time spent on daily CIF queue monitoring and analysis (by almost 85-90%).
  • Summary report of CIF post-processing errors generated for planners / master data responsible persons.
  • Centralised mapping (with explanation documentation) of CIF queue errors to root cause reasons.

NOTE: Further design enhancement of this custom report is blogged here.
