
In BPS, a flat file load with a huge data volume fails with a memory error stating that the records cannot be loaded due to the memory limitation.

The usual workaround is to split the flat file into several sub-files and load them one by one, but end users rarely welcome this, since it multiplies their effort.

The BPS partitioning functions, which prepare the system for huge data volumes, were created to cater to these scenarios.

Why it happens:

Whenever data is loaded using a planning function (here, the flat file load function), BPS loads all of the data into the planning buffer. This works well for low and medium data volumes, but as the volume grows it leads to a memory overflow error.
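The BPS planning buffer itself cannot be reproduced outside SAP, but the underlying problem can be sketched with a minimal Python analogy (the file contents and record layout below are invented for illustration):

```python
import io

# Invented flat file: 10,000 records, one per line (illustration only;
# BPS buffers parsed planning records, not raw text lines).
flat_file = io.StringIO("\n".join(f"REGION;{i};100" for i in range(10000)))

# Buffer-everything approach, analogous to the BPS planning buffer:
# every record is resident in memory at once, so peak memory grows
# linearly with the file size and eventually overflows.
all_records = flat_file.readlines()
print(len(all_records))  # all records held in memory simultaneously
```

Peak memory here is proportional to the full file size, which is exactly what the partitioning approach below avoids.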

Solution:

SAP recommends creating independent ad-hoc packages based on a partitioning characteristic and executing these packages independently. Note that you do not need to create multiple packages yourself; the ad-hoc package is adjusted at runtime. The memory is released after each package finishes, so the memory problem is avoided.

How is it done?

  1. Identify a characteristic that splits the data into logical data sets (basically, a partitioning characteristic).
  2. In the planning level, set the flag "Selection in package" for that characteristic.
  3. Do not create a Z planning package for this case; the standard ad-hoc packages will be used.
  4. This works only for global planning sequences executed in the background, so create a global planning sequence.
  5. Use the standard program UPC_BUNDLE_EXECUTE_STEP to execute the global planning sequence with the partitioning characteristic and the data range.

The program creates an independent ad-hoc package for each value of the partitioning characteristic; because memory is released after each package execution, the memory limitation is no longer hit.
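Conceptually, the program's behavior is a group-by-characteristic loop with per-package cleanup. A minimal Python sketch of that idea (the records, the "region" characteristic, and the `execute_sequence` stand-in are all invented; the real work is done inside SAP by the planning sequence):

```python
from itertools import groupby

# Invented records: (partitioning characteristic value, key figure).
records = [("NORTH", 10), ("NORTH", 20), ("SOUTH", 5), ("SOUTH", 15), ("WEST", 7)]

def execute_sequence(package):
    """Stand-in for one global-planning-sequence run on an ad-hoc package."""
    return sum(value for _, value in package)

totals = {}
# groupby needs input sorted by the grouping key; each group then plays
# the role of one ad-hoc package, so only one package's records are live
# at a time and memory is released between packages.
for region, group in groupby(sorted(records), key=lambda r: r[0]):
    package = list(group)           # build one ad-hoc package
    totals[region] = execute_sequence(package)
    del package                     # memory released after each package

print(totals)  # {'NORTH': 30, 'SOUTH': 20, 'WEST': 7}
```

Peak memory is bounded by the largest single partition rather than the whole file, which is why choosing a characteristic that yields reasonably even partitions matters.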

This is one way to avoid memory errors while loading huge data volumes in BPS.
