DTP (Data Transfer Process):
A data transfer process (DTP) loads data within BI from one persistent object to another, applying transformations and filters. In short, a DTP determines how data is transferred between two persistent objects. It is used to load data from the PSA to a data target (cube, DSO, Info Object, or external system), and thus replaces the data mart interface and the Info Package for this purpose. In BI 7.0, the Info Package pulls data from the source system only as far as the data source/PSA; from there we need to create a transformation and a DTP to move the data further into the BI system and out to external systems.
IP (Info Package):
The Info Package is an entry point for SAP BI to request data from a source system. Info Packages are tools for organizing data requests that are extracted from the source system and are loaded into the BW system. In short, loading data into the BW is accomplished using Info Packages.
Where DTP comes in Data Flow:
DTP can be used to load the data in the following situations:
1) Loading data from DSO to DSO/CUBE, or CUBE to CUBE
2) Real-time data acquisition
3) Direct data access
4) Loading data from BI to external systems (for example, Open Hub Destinations)
Advantages of DTP:
1) Delta Management
2) Handling duplicate records
3) Filtering
4) Parallel processing
5) Error Handling
1) Delta Management:
i) DTP follows a one-to-one mechanism, i.e. we have to create one DTP for each data target, whereas an IP loads to all data targets at once.
Let’s take an example where we are loading delta data using an IP to 5 data targets. The data was successfully loaded into 4 targets but failed in 1 target.
In this case we have to delete the request from all 5 targets and re-run the IP to load the data again.
If we use DTPs (one-to-one mechanism), we just need to delete the request from the failed target and re-load it there, which is far easier than loading to all 5 targets once again.
ii) One DTP can be used for both full and delta loads, whereas with an IP we have to create separate Info Packages for full and delta loads.
iii) There is no Full Repair/Repair Full request concept in DTP, as it has a one-to-one delta mechanism.
Note: Data mart and delta status for a DTP is maintained in table RSMDATASTATE.
In table RSMDATASTATE:
i) Field DMALL is used to keep track of the delta. It increases when a delta load completes.
ii) Field DMEXIST is used to prevent deletion of the source request.
Deleting a data request deletes the corresponding entry in table RSSTATMANREQMAP, which keeps track of data marts.
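The recovery advantage of the one-to-one mechanism can be sketched in a few lines of Python. This is an illustrative model only, not SAP code; the target names and the success flags are made up for the example.

```python
# Sketch (not SAP code): why per-target DTP requests make recovery cheaper
# than a single shared Info Package request across all targets.

targets = ["DSO1", "DSO2", "CUBE1", "CUBE2", "CUBE3"]
# Hypothetical load outcome: 4 targets succeeded, 1 failed.
load_ok = {"DSO1": True, "DSO2": True, "CUBE1": True, "CUBE2": True, "CUBE3": False}

# Info Package model: one shared delta request for all targets.
# A single failure invalidates the request everywhere, so all 5 must be reloaded.
ip_reloads = targets if not all(load_ok.values()) else []

# DTP model: one request per target, so only the failed target is redone.
dtp_reloads = [t for t in targets if not load_ok[t]]

print(ip_reloads)   # all 5 targets must be deleted and reloaded
print(dtp_reloads)  # only the failed target needs a reload
```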
2) Handling duplicate records:
When we load attributes or texts, we can get records with the same key fields. Sometimes, based on the properties of the fields, we have to load these records to the target (Info Object). This can be easily achieved with a DTP.
We have to set the “Handle Duplicate Record Keys” indicator on the Update tab of the DTP to enable this feature.
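The effect of this indicator can be sketched as follows: for each semantic key, the last record in the package wins. This is a conceptual Python model, not SAP code; the field names MATERIAL, PLANT, and PRICE are hypothetical.

```python
# Sketch (not SAP code): "Handle Duplicate Record Keys" — for records sharing
# the same key fields, the most recent record overwrites the earlier ones.

records = [
    {"MATERIAL": "M1", "PLANT": "P1", "PRICE": 10},
    {"MATERIAL": "M2", "PLANT": "P1", "PRICE": 20},
    {"MATERIAL": "M1", "PLANT": "P1", "PRICE": 15},  # duplicate key M1/P1
]

deduped = {}
for rec in records:
    key = (rec["MATERIAL"], rec["PLANT"])  # key fields of the Info Object
    deduped[key] = rec                     # later record overwrites earlier one

print(list(deduped.values()))
# M1/P1 survives with PRICE 15, the last value seen for that key
```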
3) Filtering:
A DTP can filter the data, for example based on the semantic key, which is not possible with an IP. We can also use filters when loading between a DSO and a cube.
4) Parallel processing:
The request of a standard DTP should always be processed in as many parallel processes as possible. There are 3 processing modes for background processing of standard DTPs:
i) Parallel extraction and processing (transformation and update)
The data packages are extracted and processed in parallel, i.e. a parallel process is derived from the main process for each data package.
ii) Serial extraction, immediate parallel processing
The data packages are extracted sequentially and processed in parallel, i.e. the main process extracts the data packages sequentially and, for each data package, derives a process that processes its data.
iii) Serial extraction and processing of the source packages
The data packages are extracted and processed sequentially.
We can change this setting on the Execution tab of the DTP.
The maximum number of parallel background processes we can set for a DTP is 99.
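The difference between serial and parallel processing of data packages can be sketched with Python's standard thread pool. This is only a conceptual model of the processing modes above, not SAP code; the packages and the transformation are invented for the example.

```python
# Sketch (not SAP code): serial vs. parallel processing of data packages.
from concurrent.futures import ThreadPoolExecutor

packages = [[1, 2], [3, 4], [5, 6]]  # pretend these are extracted data packages

def process(pkg):
    # Stand-in for the transformation + update step of a DTP.
    return [x * 10 for x in pkg]

# Mode (iii): serial extraction and processing — one package at a time.
serial = [process(p) for p in packages]

# Modes (i)/(ii): packages are handed to parallel worker processes.
with ThreadPoolExecutor(max_workers=3) as pool:
    parallel = list(pool.map(process, packages))

print(serial == parallel)  # same result; only the degree of parallelism differs
```

The choice of mode trades throughput against resource usage; the result of the load is the same either way.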
5) Error Handling:
The error DTP concept was introduced to handle erroneous records in a DTP load. While loading data using a DTP, error records are moved to the error stack and the correct records are loaded to the target. We can edit these error records in the error stack and load them to the target using an error DTP, which is not possible with an IP.
Simulating the data update and debugging help us analyze an incorrect DTP request. In this way we can also simulate a transformation prior to the actual data transfer, if we would like to check whether it provides the desired results.
We can define breakpoints for debugging by choosing Change Breakpoints, which was not available in the Info Package.
Debugging can be done in 2 ways
i) Simple Simulation
ii) Expert Mode
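The error-stack idea can be sketched as follows: bad records are parked aside, corrected, and reloaded without touching the records that already arrived. This is a conceptual Python model, not SAP code; the field names and the validation rule are hypothetical.

```python
# Sketch (not SAP code): the error stack — invalid records are parked,
# corrected, then reloaded via an "error DTP" while good records stay put.

records = [
    {"ID": 1, "AMOUNT": 100},
    {"ID": 2, "AMOUNT": None},  # invalid record (hypothetical rule: AMOUNT required)
    {"ID": 3, "AMOUNT": 50},
]

target, error_stack = [], []
for rec in records:
    # Valid records go to the target; invalid ones go to the error stack.
    (target if rec["AMOUNT"] is not None else error_stack).append(rec)

# Edit the bad records in the error stack, then run the "error DTP".
for rec in error_stack:
    rec["AMOUNT"] = 0  # corrected value supplied by the user
target.extend(error_stack)

print(sorted(r["ID"] for r in target))  # all records end up loaded
```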
For more information about DTP, go through the link below.