Easy and efficient way of uploading pricing conditions into the SAP system using a single, purpose-built program
Introduction
To upload pricing conditions into an SAP system, we need a conversion program that caters to all condition tables and uploads the respective data. Since the condition tables differ in structure and key fields, the BDC approach cannot solve the problem unless we create recordings for every condition type. Moreover, whenever a new condition type is added, a new recording has to be added to the code, which increases the development and maintenance hours.
Each time pricing conditions need to be uploaded into the SAP system, a technical resource is required to create a conversion program which uploads the data. This tool helps in uploading the pricing conditions for the SD and MM modules, thereby eliminating repeated manual intervention.
Solution Details
As the requirement is to upload data for any condition table, we have to design a solution that caters to all condition type uploads.
To make the program generic for all condition types, we expect the condition table name to be part of the upload fields. Using this table name, we fetch the schema of the corresponding condition table and map the fields dynamically. We also apply the conversion exits to the individual field values, based on the field information returned for each table.
For example, say the condition table name coming in the upload file is A652. We use FM “CATSXT_GET_DDIC_FIELDINFO” to get the details of the table fields (a sketch of this lookup follows the list below). The FM returns all the fields with their attributes, such as:
- Key Flag (denotes whether the field is a part of Primary Key or not)
- Domain Name
- Data Element Name
- Check Table Name
- Length
- Field Labels
- Conversion Exit, etc.
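A minimal sketch of this lookup. It is shown here with the standard FM DDIF_FIELDINFO_GET, which returns the same DFIES attributes that CATSXT_GET_DDIC_FIELDINFO wraps; the variable names are illustrative:

```abap
DATA: lv_tabname TYPE ddobjname VALUE 'A652',   " table name taken from the upload file
      lt_dfies   TYPE STANDARD TABLE OF dfies.  " one row per field of the condition table

CALL FUNCTION 'DDIF_FIELDINFO_GET'
  EXPORTING
    tabname        = lv_tabname
  TABLES
    dfies_tab      = lt_dfies
  EXCEPTIONS
    not_found      = 1
    internal_error = 2
    OTHERS         = 3.

IF sy-subrc <> 0.
  " the table name supplied in the upload file is invalid -> collect an error
ENDIF.

" LT_DFIES now holds KEYFLAG, DOMNAME, ROLLNAME (data element), CHECKTABLE,
" LENG, CONVEXIT and the field labels for every field of the condition table.
```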
The basic common fields in the upload structure are:
- KAPPL (Application)
- KSCHL (Condition Type)
- TABLE (Condition Table Name)
The question now is how we map the data from the file to the different condition tables, as each table has a different structure and a varying number of fields.
A database table can have at most 16 fields in its primary key, and 5 of these key fields are common to all A* tables:
- MANDT (Client)
- KAPPL (Application)
- KSCHL (Condition type)
- KFRST (Release status)
- DATBI (Validity end date of the condition record)
That leaves at most 11 table-specific key fields (16 – 5), so the upload structure of the file has 11 generic fields. We therefore keep FLD1-FLD11 of type FIELDNAME (CHAR 30).
The other fields in the upload file structure (common to all condition types) are:-
- DATAB (Start Date of Condition Record)
- DATBI (End Date of Condition Record)
- KBETR (Condition Value)
- KPEIN (Condition Price Unit)
- MEINS (Unit of Measurement)
- KRECH (Calculation Type for Condition)
So the final upload structure is as follows (a possible ABAP declaration of this structure is sketched after the table):
| Field Name | Data Type | Description |
| --- | --- | --- |
| KAPPL | KAPPL | Application |
| KSCHL | KSCHL | Condition Type |
| TABLE | TABNAME | Table Name |
| FLD1 | FIELDNAME | Field Name |
| FLD2 | FIELDNAME | Field Name |
| FLD3 | FIELDNAME | Field Name |
| FLD4 | FIELDNAME | Field Name |
| FLD5 | FIELDNAME | Field Name |
| FLD6 | FIELDNAME | Field Name |
| FLD7 | FIELDNAME | Field Name |
| FLD8 | FIELDNAME | Field Name |
| FLD9 | FIELDNAME | Field Name |
| FLD10 | FIELDNAME | Field Name |
| FLD11 | FIELDNAME | Field Name |
| DATAB | KODATAB | Validity start date of the condition record |
| DATBI | KODATBI | Validity end date of the condition record |
| KBETR | KBETR_KOND | Rate (condition amount or percentage) where no scale exists |
| KPEIN | KPEIN | Condition pricing unit |
| MEINS | MEINS | Base Unit of Measure |
| KRECH | KRECH | Calculation type for condition |
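A possible ABAP declaration of this upload structure, using the data elements listed in the table above (the type name and variable names are illustrative):

```abap
TYPES: BEGIN OF ty_upload,
         kappl TYPE kappl,       " Application
         kschl TYPE kschl,       " Condition type
         table TYPE tabname,     " Condition table name, e.g. A652
         fld1  TYPE fieldname,   " Variable key values, in key-field order
         fld2  TYPE fieldname,
         fld3  TYPE fieldname,
         fld4  TYPE fieldname,
         fld5  TYPE fieldname,
         fld6  TYPE fieldname,
         fld7  TYPE fieldname,
         fld8  TYPE fieldname,
         fld9  TYPE fieldname,
         fld10 TYPE fieldname,
         fld11 TYPE fieldname,
         datab TYPE kodatab,     " Validity start date
         datbi TYPE kodatbi,     " Validity end date
         kbetr TYPE kbetr_kond,  " Rate (condition amount or percentage)
         kpein TYPE kpein,       " Condition pricing unit
         meins TYPE meins,       " Base unit of measure
         krech TYPE krech,       " Calculation type
       END OF ty_upload.

DATA: gt_upload TYPE STANDARD TABLE OF ty_upload,   " all lines of the tab-delimited file
      ls_upload TYPE ty_upload.                     " the line currently being processed
```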
Now, in every A* table the first 3 key fields are:
- MANDT
- KAPPL
- KSCHL
(NOTE: the other 2 common fields, KFRST and DATBI, are the last 2 key fields.)
And the first 3 fields in the upload file are:
- KAPPL
- KSCHL
- TABLE
So the remaining key fields of the A* table are mapped to the upload file fields FLD1-FLD11. We start mapping from the 4th key field of the condition table, which goes to FLD1, the 5th to FLD2, and so on up to FLD11, depending on the number of key fields.
If the table has, say, 3 more key fields (excluding the 5 common key fields), then the upload file must contain values in FLD1, FLD2 and FLD3. If any other FLDn field carries a value, the data is erroneous; likewise, none of these 3 fields may be blank, as they are part of the primary key.
For instance, let us consider table A652 (refer to the snapshot in the attachments).
The mapping of upload file to Condition Table would be like:-
1. FLD1 –> VBELN
2. FLD2 –> MATNR
3. FLD3 –> VRKME
4. FLD4 to FLD11 remain blank.
The data in these fields must also form a continuous chain.
For instance, if FLD1, FLD2 and FLD4 have values but FLD3 is initial, the record is likewise erroneous (see the sketch below).
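A minimal sketch of the key-field mapping and continuity check, assuming LS_UPLOAD and LT_DFIES from the sketches above; all names are illustrative:

```abap
DATA: lv_keys TYPE i,        " number of table-specific key fields
      lv_comp TYPE string.
FIELD-SYMBOLS <lv_value> TYPE any.

" Count the key fields of the condition table and subtract the 5 common ones
" (MANDT, KAPPL, KSCHL, KFRST, DATBI).
LOOP AT lt_dfies TRANSPORTING NO FIELDS WHERE keyflag = 'X'.
  lv_keys = lv_keys + 1.
ENDLOOP.
lv_keys = lv_keys - 5.

" FLD1..FLD<lv_keys> must be filled, FLD<lv_keys+1>..FLD11 must be empty.
DO 11 TIMES.
  lv_comp = |FLD{ sy-index }|.
  ASSIGN COMPONENT lv_comp OF STRUCTURE ls_upload TO <lv_value>.
  IF sy-index <= lv_keys AND <lv_value> IS INITIAL.
    " gap in the variable key -> 'Discontinuity in Variable Key Fields'
  ELSEIF sy-index > lv_keys AND <lv_value> IS NOT INITIAL.
    " value beyond the last key field -> erroneous record
  ENDIF.
ENDDO.
```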
1. In case of the above erroneous situation, an error message “Discontinuity in Variable Key Fields” is appended.
2. Validate the processing status against table T686E. If no valid record is found, append the error message “Invalid Processing status for conditions”.
3. Using the field information returned from FM “CATSXT_GET_DDIC_FIELDINFO”, check whether a conversion exit applies to the field. If it does, apply it to the value coming from the upload file to convert it before passing it to the IDoc structures (sketched after this list). If the conversion exit raises an error, append its message.
4. Check whether the field is present in one of the segments
- E1KOMG,
- E1KONH, and
- E1KONP
If the field is found in a segment, pass the value into that segment.
5. Also concatenate the key field values into a string called the Variable Key.
6. After all key fields have been processed with the steps above, check the length of the variable key. If it exceeds 100 characters, append the message “Variable Key too big”.
7. Get the logical system name from table T000 where the client equals SY-MANDT. If no record is found, append the error message “No Partner Function Found”.
8. Concatenate ‘SAP’ and SY-SYSID to form the port.
9. If no error has been found so far and a test run is not requested, populate the IDoc segments (a consolidated sketch of steps 3 to 9 follows this list).
- a. Pass Control Records
- Pass the Sender and Receiver Information.
- IDoc Type as COND_A04
- Message Type as COND_A
- Basic Type as COND_A04
- Direction as Inbound
- b. Pass Data Records
- Now pass the above prepared data into segments
- E1KOMG
- Application
- Condition Type
- Variable Key
- Region
- E1KONH
- Start Date
- End Date
- E1KONP
- Condition Type
- Condition Value
- Condition Unit
- Condition Price Unit
- Calculation Type for Condition
- c. DIRECT POST – Post the data using FM “IDOC_INPUT_COND_A”
- Pass all the above prepared data into the FM.
- If some error is returned, then append the same.
- If no error is found and the data is posted successfully, check for status 53. If found, append the success message “Changes done successfully”.
- d. IDOC POST – Post the data using FM “IDOC_INBOUND_WRITE_TO_DB”
- Pass the data records into the FM.
- If an error is returned from the FM, append it to the log to be displayed to the user. If no error is found, commit work and append the message “IDoc successfully posted:” together with the IDoc number.
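A condensed sketch of steps 3 to 9 above (conversion exits, variable key, segment population, direct posting). LS_UPLOAD and LT_DFIES are assumed from the earlier sketches; the segment fields and the interface of IDOC_INPUT_COND_A are quoted from memory and should be verified against IDoc type COND_A04 (WE60) and the function module signature in your system:

```abap
DATA: lt_edidc  TYPE STANDARD TABLE OF edidc,        " control record
      ls_edidc  TYPE edidc,
      lt_edidd  TYPE STANDARD TABLE OF edidd,        " data records
      ls_edidd  TYPE edidd,
      lt_status TYPE STANDARD TABLE OF bdidocstat,
      lt_retvar TYPE STANDARD TABLE OF bdwfretvar,
      lt_serial TYPE STANDARD TABLE OF bdi_ser,
      ls_e1komg TYPE e1komg,
      ls_e1konh TYPE e1konh,
      ls_e1konp TYPE e1konp,
      lv_vakey  TYPE char100,
      lv_value  TYPE char132,
      lv_exit   TYPE funcname,
      lv_comp   TYPE string,
      lv_idx    TYPE i,
      lv_len    TYPE i,
      lv_off    TYPE i.
FIELD-SYMBOLS: <ls_dfies> TYPE dfies,
               <lv_fld>   TYPE any,
               <lv_seg>   TYPE any.

* Steps 3-6: convert each variable key value and build the variable key (VAKEY)
LOOP AT lt_dfies ASSIGNING <ls_dfies> WHERE keyflag = 'X'.
  CHECK <ls_dfies>-fieldname <> 'MANDT' AND <ls_dfies>-fieldname <> 'KAPPL'
    AND <ls_dfies>-fieldname <> 'KSCHL' AND <ls_dfies>-fieldname <> 'KFRST'
    AND <ls_dfies>-fieldname <> 'DATBI'.
  lv_idx  = lv_idx + 1.
  lv_comp = |FLD{ lv_idx }|.
  ASSIGN COMPONENT lv_comp OF STRUCTURE ls_upload TO <lv_fld>.
  lv_value = <lv_fld>.
  IF <ls_dfies>-convexit IS NOT INITIAL.             " step 3: conversion exit
    lv_exit = |CONVERSION_EXIT_{ <ls_dfies>-convexit }_INPUT|.
    CALL FUNCTION lv_exit
      EXPORTING  input         = lv_value
      IMPORTING  output        = lv_value
      EXCEPTIONS error_message = 1 OTHERS = 2.
    IF sy-subrc <> 0.
      " append the message raised by the conversion exit
    ENDIF.
  ENDIF.
  ASSIGN COMPONENT <ls_dfies>-fieldname OF STRUCTURE ls_e1komg TO <lv_seg>.
  IF sy-subrc = 0.                                   " step 4: field exists in E1KOMG
    <lv_seg> = lv_value.
  ENDIF.
  lv_len = <ls_dfies>-leng.                          " step 5: fixed-width concatenation
  IF lv_off + lv_len > 100.
    " step 6: 'Variable Key too big'
  ELSE.
    lv_vakey+lv_off(lv_len) = lv_value.
  ENDIF.
  lv_off = lv_off + lv_len.
ENDLOOP.

* Steps 7-9a: control record with partner, port, message type and IDoc type
SELECT SINGLE logsys FROM t000 INTO ls_edidc-sndprn WHERE mandt = sy-mandt.
CONCATENATE 'SAP' sy-sysid INTO ls_edidc-sndpor.
ls_edidc-sndprt = 'LS'.
ls_edidc-rcvprn = ls_edidc-sndprn.
ls_edidc-rcvprt = 'LS'.
ls_edidc-mestyp = 'COND_A'.
ls_edidc-idoctp = 'COND_A04'.
ls_edidc-direct = '2'.                               " inbound
APPEND ls_edidc TO lt_edidc.

* Step 9b: data records
ls_e1komg-kvewe   = 'A'.                             " condition usage A = pricing
ls_e1komg-kotabnr = ls_upload-table+1(3).            " 'A652' -> '652'
ls_e1komg-kappl   = ls_upload-kappl.
ls_e1komg-kschl   = ls_upload-kschl.
ls_e1komg-vakey   = lv_vakey.
ls_edidd-segnam = 'E1KOMG'. ls_edidd-sdata = ls_e1komg. APPEND ls_edidd TO lt_edidd.
ls_e1konh-datab = ls_upload-datab.
ls_e1konh-datbi = ls_upload-datbi.
ls_edidd-segnam = 'E1KONH'. ls_edidd-sdata = ls_e1konh. APPEND ls_edidd TO lt_edidd.
ls_e1konp-kschl = ls_upload-kschl.
ls_e1konp-kbetr = ls_upload-kbetr.
ls_e1konp-kpein = ls_upload-kpein.
ls_e1konp-kmein = ls_upload-meins.
ls_e1konp-krech = ls_upload-krech.
ls_edidd-segnam = 'E1KONP'. ls_edidd-sdata = ls_e1konp. APPEND ls_edidd TO lt_edidd.

* Step 9c: DIRECT POST via the inbound function module
CALL FUNCTION 'IDOC_INPUT_COND_A'
  EXPORTING input_method       = space
            mass_processing    = space
  TABLES    idoc_contrl        = lt_edidc
            idoc_data          = lt_edidd
            idoc_status        = lt_status
            return_variables   = lt_retvar
            serialization_info = lt_serial.
READ TABLE lt_status TRANSPORTING NO FIELDS WITH KEY status = '53'.
IF sy-subrc = 0.
  " 'Changes done successfully'
ENDIF.
```

For the IDOC POST variant (step d), the same control and data records are handed to IDOC_INBOUND_WRITE_TO_DB instead, followed by a COMMIT WORK.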
Business Benefits
The approach explained above uploads all the relevant condition records into the SAP system for the SD and MM modules using IDocs (which is faster than using BDCs or LSMWs).
The only crucial prerequisite for using this tool is to understand the mapping between the condition table and the upload file format. Once the mapping is done and a tab-delimited text file is provided to the program, it uploads the data into the desired tables, saving around 80% of the estimated time: roughly 40 hours of effort for developing a dedicated conversion program versus 8 hours for using this tool.
In addition, no maintenance is required when further change requests need to be catered for.
Thus this solution minimizes:
- The functional effort of manually entering condition records one by one.
- The technical effort of developing conversion programs using BDCs for the different condition tables; the number of such programs can vary depending on the conditions to be uploaded.
- The maintenance effort required as and when new condition types are added.
That's a very helpful piece of information, and the code will help a lot of developers in implementation projects as well as AMS.
Thanks for sharing.
Regards.
Implementing the same in Microsoft Dynamics AX takes around 1 hour, or even no code at all. Why is it so complicated with SAP? Is there no tool to automatically import a CSV file with the price data?
Hello Pedro,
Thanks for the information. I wasn't aware of this tool. I designed this solution for one of my customers who wanted to upload pricing through a custom transaction.
Best Regards,
Tarun
Hi Tarun. Yes, with Microsoft Dynamics you can upload the price conditions using AIF, doing only some mapping between the file and the data. You can also do it in code with no more than 100 lines.
Anyway, thanks a lot for the example. It is great.
It's a useful document.
Thanks,
Vivek
I saw a code attachment here a long time back, but I cannot find it now. Where can I find the code attached to this particular blog? Thanks in anticipation.
Hi Tarun,
Greetings for the day ahead.
I have a similar requirement.
Please attach the code.
I would be grateful for your help.
Thanks,
Devendra Singh
Hi Tarun,
Could you please attach the code?
Thank you,
Narasaiah T
Hi Tarun,
I have a similar requirement.
I would appreciate it if you could attach the code.
Thank you for your help.
Regards,
Willie T
Thanks. It seems this does not manage quantity scales. Could you please share an updated version of the code that also manages the scales? Thanks a lot.
Will the code handle TK11/TK12 uploads?
Thanks
Hi Tarun,
I tried the same IDoc approach and also tried RV_CONDITION_COPY, RV_CONDITION_SAVE and RV_CONDITION_RESET. I can see that the condition records are created and visible in table KONH, but the problem is that the same records are not visible in transaction VK13. I am passing exactly the same parameters as you mentioned. Could you please let me know what the issue might be?
Likely caused by a missing conversion from external to internal format (leading zeroes) in one of the key fields before the variable key is built.
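For example, a customer number in one of the variable key fields would need something like this first (illustrative):

```abap
DATA lv_kunnr TYPE kunnr.

CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
  EXPORTING
    input  = '4711'
  IMPORTING
    output = lv_kunnr.   " '0000004711' - only this internal format matches the Axxx/KONH key
```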
Hi Tarun,
This is a very, very useful blog... Awesome work!!
Regards,
Bhushan
Thank you Tarun, it is very helpful.
Do you know how to use this function module to change the condition validity?
If we want to restrict an existing condition from 01.04.2023 - 31.12.9999 to 01.04.2023 - 01.05.2023, FM IDOC_INPUT_COND_A will change and add two records:
01.04.2023 - 01.05.2023 (created)
02.05.2023 - 31.12.9999 (changed)
This is the wrong result, because we just need to set "stop using" for the existing condition.
Can you help?
Kind regards,
Przemek
Not possible; the program by default creates a new condition record. However, if you extend the input file structure with the field LOEVM_KO and map it to E1KONP, you should be able to create a new record that is flagged for deletion. That is not necessarily what you want to achieve, but it works.
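Roughly, in the segment population (assuming a new LOEVM_KO column in the upload structure; the E1KONP field name is as mentioned above):

```abap
ls_e1konp-loevm_ko = ls_upload-loevm_ko.   " 'X' flags the new condition record for deletion
ls_edidd-segnam = 'E1KONP'. ls_edidd-sdata = ls_e1konp. APPEND ls_edidd TO lt_edidd.
```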
First - excellent post and code - thank you.
Add-on, because someone else asked and I needed the same ... the scale is passed in structure KONM. I added a (partially correct) code snippet below, but consider this issue: if you work with a flat file and want to add multiple records, you will need to adjust the program accordingly, as it currently operates on a line-by-line basis. If you want to add multiple scale values, you need to change the logic in method PROCESS_DATA so that you create the header record first, then the condition value, then the scale, and loop accordingly (change the LOOP AT GT_MAIN and change the method POPULATE_IDOC_DATA to follow that logic).
Remember that you need to populate field KRECH with the correct value for the scale; in my case, for quantities, I need to use 'C'.
Enhance the constants, work areas, objects and field symbols:
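Roughly, reconstructed here as a sketch since the original attachment is no longer available; LS_SCALE stands for one line of the separate scale file, and the E1KONM fields should be checked against COND_A04 in WE60:

```abap
CONSTANTS gc_segnam_e1konm TYPE edidd-segnam VALUE 'E1KONM'.  " quantity scale segment

DATA: gs_e1konm TYPE e1konm,   " KSTBM = scale quantity, KBETR = rate for this scale line
      gs_edidd  TYPE edidd.

" ... inside POPULATE_IDOC_DATA, after the E1KONP segment of the same record:
gs_e1konm-kstbm = ls_scale-kstbm.
gs_e1konm-kbetr = ls_scale-kbetr.
gs_edidd-segnam = gc_segnam_e1konm.
gs_edidd-sdata  = gs_e1konm.
APPEND gs_edidd TO gt_edidd.
```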
I will update the above code in a few days, using a 2nd file for scaling records based on the KRECH value. The file will have to use the same input structure as the main file (DATBI to create a unique key).
One more thing to consider: the input file requires dates in YYYYMMDD format regardless of your user settings; otherwise it will not work.
Hi Nicolas,
I have a similar requirement for ICM Condition Pricing.
I would appreciate it if you could attach the complete updated code.
Thank you for your help!