Technology Blogs by Members
Explore a vibrant mix of technical expertise, industry insights, and tech buzz in member blogs covering SAP products, technology, and events. Get in the mix!
jeroenvandera
Contributor

Raymond Busher gave a very comprehensive presentation with a lot of tips on where to look and what to do to improve the performance of your ETL. In his presentation he discussed monitoring, data models, ABAP techniques, Shared Object Memory (SOM) and finally process chains.

It was a very comprehensive session. But let me start with the points he wanted us to take home:

  • Performance analysis must be ongoing
  • Ensure cubes are compressed regularly
  • Ensure indices are active and statistics are collected regularly
  • Key to improving overall performance is to minimize database access. Using Shared Object Memory can help you to achieve this.
  • Ensure efficient use of your available BATCH processes
  • Develop standards and guidelines for your development team and monitor your system by checking metadata guidelines are being adhered to

I highlighted some words because it is evident that fine-tuning is not a one-off project but a continuing process. From the takeaways it becomes clear that fine-tuning means constantly looking at the BW system and staying on guard. You may have new developments and new people in your team, but you need to be able to rely on BW as a solid system.

In the session we started by looking at monitoring options. First, transactions that can be used for SQL monitoring (DB02, ST04, ST05, SE30), then process chain monitoring via transaction ST13.

Then we looked at all the components of extraction. During extraction we monitor the DB time needed to select data and the time spent in user exits.

Look at delta vs. full updates, and at start and end routines. Check your DTPs: semantic grouping, for example, does introduce a large overhead.

Data Model

Model the cubes in a performance-oriented way; the layout in the multiprovider can be user-oriented. This means using all the dimensions so as to keep each dimension as small as possible. With DSOs that are not used for reporting, turn off SID generation. If you summarize data into new key figures, use a 7.x infosource instead of a staging DSO.

There are a lot more tips, but the main one is this:

Define clear and accessible guidelines for your development team

Use the multiprovider hint table (RRKMULTIPROVHINT). Until 7.3, if you do not, the multiprovider will fire a query to all the underlying cubes while perhaps only one cube is relevant. From 7.3 this is covered by Semantic Partitioned Objects.

You can check your designs with SM50 for processes accessing the NRIV table, with ST13 by drilling down to the DTP, and with the ABAP program SAP_INFOCUBE_DESIGNS. Raymond told us that he and his team use a slightly different version where you can select a single infocube, instead of the standard version where all the cubes are shown. Mind though that the program relies on statistics: when they are not up to date, the result will not be reliable.

Settings must be monitored. This is an ongoing process especially after transport, modification or redesign.

ABAP techniques

There were a lot of detailed ABAP tips. The main one was that it is best to use generated / dynamic calls. If there is bad programming in new projects, this ensures it does not drag the entire system down with it. For example, if a query variable is faulty it will only affect itself, not the other query variables.
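A rough sketch of that isolation, with a made-up function name and signature: the exit is called dynamically and its errors are caught, so a faulty variable exit only affects its own variable instead of dumping the whole run.

```abap
" Hypothetical sketch: call each customer exit dynamically and catch
" failures, so one faulty variable exit cannot abort the others.
DATA: lv_funcname TYPE funcname VALUE 'Z_VAR_EXIT_EXAMPLE',  " made-up name
      lv_varname  TYPE c LENGTH 30,
      lt_range    TYPE TABLE OF string.                      " placeholder type

CALL FUNCTION lv_funcname
  EXPORTING
    i_vnam  = lv_varname
  IMPORTING
    e_range = lt_range
  EXCEPTIONS
    OTHERS  = 1.
IF sy-subrc <> 0.
  " Log the failing exit and continue with the remaining variables.
ENDIF.
```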

Golden rules :

  • Keep result sets small (filter)
  • Keep data transfers small (no SELECT *, it is especially bad for HANA)
  • Array Fetch (Into table)
  • Keep DB overhead small (proper indexing and optimizer)
  • Avoid database access where possible
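As an illustration of those rules, here is a filtered, field-list SELECT with an array fetch; the table and field names are made up for the example.

```abap
" Made-up table ZSALES and structure, purely for illustration:
" the golden rules applied in a single statement.
TYPES: BEGIN OF ty_sales,
         doc_id   TYPE i,
         amount   TYPE p LENGTH 15 DECIMALS 2,
         currency TYPE c LENGTH 5,
       END OF ty_sales.
DATA lt_result TYPE STANDARD TABLE OF ty_sales.

SELECT doc_id amount currency        " explicit field list, no SELECT *
  FROM zsales
  INTO TABLE lt_result               " array fetch into an internal table
  WHERE calyear = '2024'             " filter early: keep the result set small
    AND region  = 'EMEA'.
```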

Set up a programming standard using best practices and make them available to developers. Check before go-live that these standards are met.

Shared Object Memory (SOM)

From release 6.40 you can store data in SAP memory. This memory is shared between user sessions, so it can freely be used by other processes. This makes it more flexible than session memory.

You can use SOM by loading the masterdata into memory after the masterdata reorganization, as it won't change anymore after that. Every transaction data process that needs to look up this masterdata can then skip the database read.

You can use this technique in the user exits in OLTP, and in the START routines you can read from SOM instead of from the database.
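A minimal sketch of such a lookup in a START routine, assuming an area class ZCL_MD_AREA was generated with transaction SHMA and filled after the masterdata reorganization (the class name, root method, and key are all assumptions for this example):

```abap
" Hypothetical area class ZCL_MD_AREA (generated via SHMA); its root
" object is assumed to offer a READ_ATTRIBUTE method.
DATA lv_material TYPE c LENGTH 18 VALUE 'TESTMAT'.

DATA(lo_handle) = zcl_md_area=>attach_for_read( ).        " shared, read-only
DATA(lv_attr)   = lo_handle->root->read_attribute( lv_material ).
lo_handle->detach( ).                                     " release the lock
" lv_attr now holds the masterdata attribute - no database access needed.
```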

You can set up the process chain so that the SOM load follows the masterdata reorganization step.

Building better process chains

Keep an eye on performance, use ST13 for this. You can look at one process chain for many days. Use prefixes in the naming of process chains as this makes filtering easy in the tool.

Do process chain maintenance in production. This reduces the turnaround and correction time. Also, as development and quality assurance are separate systems, production-specific settings do not make sense there.

Find bottlenecks with ST13: go to the DTPs, where all details are available. There you can check the DB logs, catch the SQL statement and analyze it. If you make changes, watch out for side effects (occupied processes, memory consumption, etc.).

Always include the aggregate rollup step. Use the switch "end process successfully if no aggregate exists". If this is set in all chains, then maintenance of aggregates becomes very easy, as you know you don't have to take additional steps in the chains to maintain an aggregate.

Clean up PSAs and clean up changelogs. Again, well-named infoareas make selecting changelogs for deletion easy.

Finally, do a regular check of the SAP Service Marketplace for performance improvements and notes for your SAP release.

Raymond Busher gave a lot more in-depth tips and examples in the session than I have described here. The most important point is that monitoring and fine-tuning is an ongoing process.
