Dear SCN Users,
We had a requirement to handle a huge table of around 60k records in a migration project: a large table from another middleware needed to be replicated in SAP PO, i.e. 60k values needed to be stored in SAP PO. These values are then looked up in SAP PO mapping.
For this we came up with a solution using SAP PO BRM, where an EJB is used to fetch the values in the mapping. However, as per SAP's recommendation, a decision table should hold no more than 20k records for optimal system performance.
The table had two inputs and one output: Group Value and Input Value as the inputs, and Output Value as the output. There was no condition on which to split the table, since a single group value alone had around 45k inputs with corresponding outputs, while the rest of the records belonged to various other group values. So it was not possible to split the table logically in any way.
Below are the generic problems:
1. There was no condition on which to split the table values; one particular group value alone had 45k input and output values.
2. The table values are updated on a daily basis, so BRM Rules Manager was the option for updating the table. But the table size was so high that the data was not getting displayed, and the Rules Manager was throwing a 500 timeout error.
To achieve this scenario, we split the 60k records as below:
|Table1|First 15k records of Group Value1|
|Table2|Second 15k records of Group Value1|
|Table3|Remaining 15k records of Group Value1|
|Table4|Other Group Value details|
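The partitioning itself can be done offline before loading the data into the decision tables. A minimal Java sketch of the idea (the `Row` type, class name, and chunk size are illustrative assumptions, not SAP APIs):

```java
import java.util.*;

public class TableSplitter {
    // One row of the decision table: two inputs, one output.
    record Row(String groupValue, String inputValue, String outputValue) {}

    static final int CHUNK_SIZE = 15_000; // stays under SAP's ~20k recommendation

    // Put the dominant group value's rows into fixed-size chunks (Table1..Table3);
    // all remaining group values go into one final table (Table4).
    static List<List<Row>> split(List<Row> rows, String dominantGroup) {
        List<Row> dominant = new ArrayList<>();
        List<Row> others = new ArrayList<>();
        for (Row r : rows) {
            (r.groupValue().equals(dominantGroup) ? dominant : others).add(r);
        }
        List<List<Row>> tables = new ArrayList<>();
        for (int i = 0; i < dominant.size(); i += CHUNK_SIZE) {
            tables.add(dominant.subList(i, Math.min(i + CHUNK_SIZE, dominant.size())));
        }
        tables.add(others);
        return tables;
    }
}
```

With 45k records for Group Value1 and 15k others, this yields exactly the four tables shown above.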
We then created four rules, one to call each table.
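Conceptually, the four rules amount to trying each split table in turn until a match is found. A sketch of that lookup logic, assuming a simple map-backed table (class and method names are hypothetical, not the actual BRM API):

```java
import java.util.*;

public class SplitTableLookup {
    // Each split table maps (groupValue, inputValue) -> outputValue.
    private final List<Map<String, String>> tables = new ArrayList<>();

    // Composite key over the two inputs; the separator is arbitrary.
    static String key(String group, String input) {
        return group + "\u0001" + input;
    }

    void addTable(Map<String, String> table) {
        tables.add(table);
    }

    // Mirrors the four rules: query each table in order until a hit.
    String lookup(String group, String input) {
        String k = key(group, input);
        for (Map<String, String> t : tables) {
            String out = t.get(k);
            if (out != null) return out;
        }
        return null; // no match in any table
    }
}
```

Since the key (group value + input value) is unique across the split tables, at most one table produces a result, so the order of the rules does not change the outcome.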
In this way, even without any splitting logic or conditions, BRM decision tables can be split into any number of smaller tables, and performance is not affected.