Technology Blogs by Members
Explore a vibrant mix of technical expertise, industry insights, and tech buzz in member blogs covering SAP products, technology, and events. Get in the mix!

Many BW consultants have raised queries on SCN about performance issues they face while running DTPs. To eliminate slow or long-running DTPs, it is essential to understand the concept of package size and its impact on the DTP. Once we are clear on it, we should be able to eliminate the issue of long-running DTPs.

To start with, it is important to understand what a Package and the Package Size mean in a DTP. Let's do that first.


What is a Package?

A Package is a bundle of data: it holds a group of records that are processed together.

What do you mean by Package Size?

It is the number of records a Package can hold. By default, the package size of a DTP is 50000. You can increase or decrease the package size in the DTP depending on the processing type, transformations, routines, and data volume.
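As a language-agnostic illustration (this is not ABAP or SAP code; the record count and the helper function are hypothetical), splitting a load into packages of a given size amounts to simple chunking:

```python
def split_into_packages(records, package_size=50000):
    """Split a flat list of records into packages of at most package_size each."""
    return [records[i:i + package_size]
            for i in range(0, len(records), package_size)]

# A hypothetical load of 120,000 records with the default DTP package size:
records = list(range(120000))
packages = split_into_packages(records)
print(len(packages))                # 3 packages
print([len(p) for p in packages])   # [50000, 50000, 20000]
```

The last package simply holds whatever remains, which is why reducing the package size increases the number of packages for the same load.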

Note that there are two types of loading in SAP BI:

1. InfoPackage loading:

    Default Package Size is 20000


2. DTP loading:

    Default Package Size is 50000


In this blog I will specifically talk about DTP packages. The DTP package size plays an important role in loading data into InfoProviders. A good consultant considers package size as a major factor while designing a project skeleton, for the following reasons.


Impacts of Package Size in DTP:


  • When to keep Package Size less than 50000:


1. If we are dealing with many look-ups in transformations, keeping the package size smaller helps the routines execute faster, because the cost of a look-up in a transformation is directly proportional to the number of records.

    Since routines run at package level, the bigger the package size, the longer it takes to complete the look-ups. Simply put: MORE SIZE = MORE TIME.

2. If we have a large volume of daily loads, reducing the package size is a good option. Data with high volume obviously requires more time for the transfer process. If the size is reduced, the processing time of each package falls considerably, eventually speeding up the DTP.


3. If the number of parallel processes allocated in the DTP is small and the data volume is huge, it is again better to reduce the package size: since fewer parallel processes are available, a smaller package size keeps the processing time of each package short.
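The reasoning in points 1–3 can be sketched with a rough back-of-the-envelope model (the per-record cost and overhead below are hypothetical, illustrative numbers, not SAP measurements): routines run at package level, so the look-up cost of a package grows with the records it holds, and with limited parallel processes the packages execute in waves.

```python
import math

def per_package_seconds(package_size, lookup_seconds_per_record=0.001,
                        fixed_overhead=2.0):
    """Time to process one package: a fixed overhead plus a look-up cost
    proportional to the number of records (MORE SIZE = MORE TIME)."""
    return fixed_overhead + package_size * lookup_seconds_per_record

def number_of_waves(total_records, package_size, parallel_processes):
    """With limited parallel processes, packages execute in waves:
    each wave runs parallel_processes packages at once."""
    packages = math.ceil(total_records / package_size)
    return math.ceil(packages / parallel_processes)

print(per_package_seconds(50000))           # 52.0 s per package at the default size
print(per_package_seconds(10000))           # 12.0 s per package at a reduced size
print(number_of_waves(1000000, 50000, 3))   # 20 packages -> 7 waves
```

Under these toy numbers, cutting the package size from 50000 to 10000 cuts each package's runtime from 52 s to 12 s, which is exactly the "processing time of each package falls" effect described above; the trade-off is that the same load now produces more packages and therefore more waves.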


  • When to keep Package Size greater than 50000:


1. Obviously, if the data volume is small, keeping the package size larger won't affect the performance of the DTP.


2. Sometimes there are no look-ups or routines in a transformation and the mapping is direct. In this situation, increasing the data package size will not hamper the DTP.




I hope this blog helps you understand the concept of package size in DTP along with its impact.


Author: Shubham Vyas  

Company: Mindtree Ltd. (Bangalore/India)

Created on: 19 June 2015
