Technology Blogs by Members
Former Member
1)      Deletion and creation of indices on the data target before and after the data load, respectively

               Loading data into an InfoProvider with indices in place consumes a considerable amount of time, so for better load performance we delete the indices before the data load and re-create them afterwards for better reporting performance. For large InfoCube loads, drop the secondary indexes first and then load the data.

               Pros: Deleting the indexes improves load performance; re-creating them improves reporting performance.
               Cons: Index creation is time-consuming.
               Refer: http://scn.sap.com/message/6937951
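The same drop-load-recreate pattern can be illustrated outside of BW with any relational table. The sketch below uses SQLite purely as a stand-in; the table and index names are invented for the example.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (doc_id INTEGER, material TEXT, amount REAL)")
conn.execute("CREATE INDEX idx_sales_material ON sales (material)")

rows = [(i, f"MAT{i % 100}", i * 1.5) for i in range(50_000)]

# Step 1: drop the secondary index before the mass load,
# so the database does not have to maintain it row by row.
conn.execute("DROP INDEX idx_sales_material")

# Step 2: bulk-load the data into the unindexed table.
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)

# Step 3: re-create the index once, after the load,
# restoring the fast access path for reporting.
conn.execute("CREATE INDEX idx_sales_material ON sales (material)")
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM sales").fetchone()[0]
print(count)  # 50000
```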
2)      Always load master data before transaction data (DSO and InfoCube)

               All master data SIDs are created before the transaction load, so they do not have to be determined during the transactional data load, which would cause a large overhead. This accelerates transaction data load performance.

               Pros: Maintains data integrity and avoids load failures.
               Cons: Introduces a dependency in the data load sequence.
          
                SAP Note 130253   
                Refer: http://scn.sap.com/thread/1555210
3)      Parallel data loads in process chains

               Multiple processes can be started in parallel, for example InfoPackages loading data from several DataSources, or from the same DataSource with different selection criteria, simultaneously.

               Pros: Distributes the data load across different application servers and improves load performance.
               Cons: If too many processes run in parallel, individual loads can take longer and may fail with timeout errors.
               Refer: http://scn.sap.com/message/10534840
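As a generic illustration of the idea (not SAP code), the sketch below starts several loads in parallel, each restricted by its own selection criterion, the way multiple InfoPackages would; the function and date ranges are invented for the example.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical selection criteria, one per parallel "InfoPackage".
SELECTIONS = [("2023-01", "2023-06"), ("2023-07", "2023-12"), ("2024-01", "2024-06")]

def load_range(selection):
    """Stand-in for one InfoPackage load restricted by selection criteria."""
    low, high = selection
    # A real load would extract only rows between low and high.
    return f"loaded {low}..{high}"

# Run the three "InfoPackages" simultaneously; too many workers would
# contend for resources, mirroring the timeout risk noted above.
with ThreadPoolExecutor(max_workers=3) as pool:
    results = list(pool.map(load_range, SELECTIONS))

print(results)
```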
4)      The source file should reside on the application server, not on the client PC

               Pros: Avoids the large network overhead of reading the file from the client system/desktop.
5)      Selection criteria in the InfoPackage (indices on source tables)

               If you reduce the volume of data to be extracted through selection criteria in the InfoPackage, consider building indices on the source tables to avoid full table scans. Every DataSource selection field should be supported by a database index.

               Pros: Faster identification of the data set in the source, and fewer data packages per load.
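The effect of indexing a selection field can be seen in any database's query plan. The sketch below uses SQLite as a stand-in for the source system; table, column, and index names are made up for the example.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src (docno INTEGER, plant TEXT)")
conn.executemany("INSERT INTO src VALUES (?, ?)",
                 [(i, f"P{i % 10}") for i in range(1000)])

def plan(sql):
    # The last column of each EXPLAIN QUERY PLAN row describes the access path.
    return " ".join(row[-1] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT * FROM src WHERE plant = 'P3'"
before = plan(query)  # full table scan: no index supports the selection field

conn.execute("CREATE INDEX idx_src_plant ON src (plant)")
after = plan(query)   # index search once the selection field is indexed

print(before)  # e.g. "SCAN src"
print(after)   # e.g. "SEARCH src USING INDEX idx_src_plant (plant=?)"
```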
6)      Increase load performance by buffering number ranges

               Buffering starts to become useful when the data volume exceeds 100,000 records, and is especially worthwhile if you have performance problems loading the data. Number range buffering helps when loading large volumes of data, but adds little if any improvement for smaller master and transactional data loads. Typically, you should switch it on for the initial large data loads and turn it off once the load is complete. You can leave buffering on indefinitely if high-volume loading occurs for all master or transactional data loads. By buffering the number ranges, the system reduces the number of database reads on the NRIV table, thus speeding up large data loads.

               Never set buffering on the InfoObject 0REQUEST. This is a special InfoObject used for data loads, and the system could lose data packets if it is buffered.

               Master data: determine the number range with function module RSD_IOBJ_GET
               Transaction data: determine the number range with function module RSD_CUBE_GET
               Maintain the number range buffering in transaction SNRO.

               Pros: Speeds up large-volume data loads.

               SAP Notes 857998 and 130253
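The saving can be sketched generically: instead of one database round trip per number, a buffered allocator reserves a whole interval at once. The class below is an illustration of that idea only, not the actual NRIV mechanism.

```python
class NumberRangeAllocator:
    """Hands out sequential numbers; db_reads counts round trips to the
    number range table (NRIV in SAP terms)."""

    def __init__(self, buffer_size=1):
        self.buffer_size = buffer_size
        self.db_reads = 0
        self._next = 0
        self._ceiling = 0

    def next_number(self):
        if self._next >= self._ceiling:
            # One "database read" reserves buffer_size numbers at once.
            self.db_reads += 1
            self._ceiling = self._next + self.buffer_size
        self._next += 1
        return self._next

unbuffered = NumberRangeAllocator(buffer_size=1)
buffered = NumberRangeAllocator(buffer_size=500)

for _ in range(100_000):
    unbuffered.next_number()
    buffered.next_number()

print(unbuffered.db_reads)  # 100000 reads: one per record
print(buffered.db_reads)    # 200 reads: one per 500-number block
```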
7)      Recommended DSO settings (transaction RSODSO_SETTINGS)

               The optimal package size in this transaction is around 10,000.
               Select a higher runtime than is usually selected.
               The parameter rdisp/max_wprun_time should be 3600 (this is maintained by the Basis team).
               SAP's general recommendations:
               Batch processes = (all processes / 2) + 1
               Wait time in seconds = 3 * rdisp/max_wprun_time

               Pros: Controls the data packet size used during parallel update/activation and the number of work processes allocated.
               SAP Note 25528
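The two rules of thumb above are simple arithmetic; the snippet below just evaluates them for a hypothetical system with 8 work processes and the recommended rdisp/max_wprun_time of 3600 (both values are illustrative).

```python
all_processes = 8        # total work processes on the example system (assumed)
max_wprun_time = 3600    # recommended rdisp/max_wprun_time in seconds

batch_processes = (all_processes // 2) + 1  # SAP rule of thumb: (all / 2) + 1
wait_time = 3 * max_wprun_time              # wait time in seconds

print(batch_processes)  # 5
print(wait_time)        # 10800
```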
8)      DTP performance

               Parallel processing: set the number of processes to 3 in the DTP.
               RSA1 -> Display DTP -> Goto menu -> Settings for Batch Manager

               SAP Note 892513
9)      Only the fields required by the project should be activated in the DataSource extract structure
10)   The generic delta characteristic should be indexed
11)   Use a projection view instead of the physical table in generic extractors

               Pros: Only the required fields are selected into the extract structure.
12)   PSA performance

               Set the PSA partition size according to the expected package sizes. If you expect many small packages, choose a rather small partition size so that these partitions can be deleted quickly. The size of each PSA partition is defined in transaction RSCUSTV6.

               Recommended values in the RSCUSTV6 settings:
               Partition Size: 1,000,000
               Package Size: 30,000
               Frequency Status IDoc: 10
13)   DB partitioning on the SAP source system

               Physically partitioning the tables on the source system helps the extraction process pick up the data faster, for example by partitioning the tables based on calendar year/calendar month.
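The benefit of partitioning by calendar month can be sketched with a toy model: when rows are stored per partition key, an extraction for one month touches only that partition instead of scanning the whole table. The row layout and field name `calmonth` are invented for the example.

```python
from collections import defaultdict

# Hypothetical source rows, three per calendar month of 2023.
rows = [{"calmonth": f"2023{m:02d}", "amount": m}
        for m in range(1, 13) for _ in range(3)]

# "Physical partitioning": rows stored per partition key (calendar month).
partitions = defaultdict(list)
for r in rows:
    partitions[r["calmonth"]].append(r)

# Extracting one month reads only its partition,
# instead of scanning all 36 rows.
extracted = partitions["202303"]
print(len(extracted))  # 3
```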