
How to transport changes to very large ODSes in SAP BW

In almost all BW systems that have been live for three to four years, the following is a standard sight:

1. ODSes like Delivery Items, Material Movements, etc. having millions of entries (this of course depends on how heavily these functions are used in SAP)

Very often a change request comes along that requires an enhancement to these ODSes, and because of the time this would take, a lot of sister ODSes spring up all around these mega ODSes – leading to a proliferation of InfoSets, cubes, and MultiProviders, which in turn leads us into very difficult data modeling choices.


The question that needs to be answered first – what does it take to change these ODSes?

When you add data fields to an ODS that contains data – the transport tries to adjust the table with the new fields. This causes the database to COPY the existing data of the table into the undo tablespace before adjusting the table, so that it can roll back if there is an error. Some of these terms are Oracle-specific, but the concept is the same on other databases.

What this means is that when you move a transport that changes a table with 20 million entries – this table is effectively backed up first and then adjusted. So make sure you have enough free space in your undo tablespace (free space greater than or equal to the size of the active table) before attempting this.
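On Oracle, a rough space check might look like the sketch below. The table name, schema, and undo tablespace name are placeholders for illustration – BW active tables are typically named /BIC/A&lt;ODS&gt;00, and your undo tablespace may be named differently:

```sql
-- Size of the ODS active table (names are placeholders)
SELECT ROUND(bytes / 1024 / 1024) AS table_mb
  FROM dba_segments
 WHERE segment_name = '/BIC/AZSALES00'
   AND segment_type = 'TABLE';

-- Free space in the undo tablespace (name assumed here)
SELECT ROUND(SUM(bytes) / 1024 / 1024) AS free_mb
  FROM dba_free_space
 WHERE tablespace_name = 'PSAPUNDO';
```

If free_mb is smaller than table_mb, the transport is likely to fail partway through the table adjustment.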


Another option for doing this:
Before moving the transport – back up the table manually in the database. This involves copying the data from the active table of this ODS into a separate table.
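On Oracle, the manual backup copy can be sketched as below. The schema (SAPR3), ODS name, and backup table name are placeholders; NOLOGGING and PARALLEL speed up the copy but skip redo generation, so factor that into your backup strategy:

```sql
-- One-off backup copy of the active table (names are placeholders)
CREATE TABLE sapr3.zbackup_zsales NOLOGGING PARALLEL 8
AS SELECT * FROM sapr3."/BIC/AZSALES00";
```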

Delete the data in the active table, change log, and new data table of the ODS – AFTER making sure that all upward data flows are complete

Move the transport in

Fire INSERT statements from the backup table into the changed ODS table – here you can control the degree of parallelism etc. to make this process faster.
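A hedged Oracle sketch of that reload, with placeholder table and column names – the new fields added by the transport simply stay at their initial values, since they do not exist in the backup:

```sql
-- Allow parallel DML in this session
ALTER SESSION ENABLE PARALLEL DML;

-- APPEND does a direct-path insert; PARALLEL controls the degree
-- of parallelism (column names here are placeholders)
INSERT /*+ APPEND PARALLEL(t, 8) */ INTO sapr3."/BIC/AZSALES00" t
       (doc_number, doc_item, quantity)
SELECT doc_number, doc_item, quantity
  FROM sapr3.zbackup_zsales;

COMMIT;
```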

If your ODS is partitioned – take this into account as well. Also, if you have partitioned the ODS directly in the database, those changes are overwritten when you transport the ODS – you will have to put them back afterwards.

Once the insert finishes – recreate or rebuild the indexes
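The rebuild itself is a one-liner per index; the index name below is a placeholder (SAP secondary indexes on the active table usually follow a "table~nnn" naming pattern, which you can confirm via DBA_INDEXES):

```sql
-- Rebuild a secondary index after the mass insert (name is a placeholder)
ALTER INDEX sapr3."/BIC/AZSALES00~010" REBUILD PARALLEL 8 NOLOGGING;
```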

If possible, do a compare between the two tables to make sure that the data matches – then drop the backup table.
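Two quick sanity checks, again with placeholder names – note that the comparison must be restricted to the original columns, because the new fields do not exist in the backup table:

```sql
-- Row counts should match
SELECT (SELECT COUNT(*) FROM sapr3."/BIC/AZSALES00") AS ods_rows,
       (SELECT COUNT(*) FROM sapr3.zbackup_zsales)   AS backup_rows
  FROM dual;

-- Rows in the backup that did not make it across (original columns only)
SELECT doc_number, doc_item, quantity FROM sapr3.zbackup_zsales
MINUS
SELECT doc_number, doc_item, quantity FROM sapr3."/BIC/AZSALES00";
```

If both counts match and the MINUS query returns no rows, the backup table can be dropped.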


You are ready to go. This process might take about as long as simply moving the transport – but it gives you much greater control.

We tried this for an ODS with about 45 million entries and it worked fine.

Of course – you could argue that the change could be done directly in the database rather than going through transports – but then transports are SAP’s way of keeping things together..

And of course you would need a brave DBA…

  • Instead of backing up in a DB table, would it be viable to create a copy/clone of the ODS and perform the backup in that? The data can be loaded from the Original ODS to the clone by using Transformations/DTPs, wipe out all data from the Original DSO, perform transport and then load the Original DSO from the clone. Would there be any issues with this approach?
• You could do that – there are two driving factors:
      1. Transports – for the backup DSO you need a transport – for the database copy you do not need one
      2. You work directly at the DB level instead of going through the application level

      The only downside of my approach is that you end up wiping out the monitor entries in the ODS and you cannot get them back.

Also, you will have the trouble of dealing with the deltas – if the system does not let you delete the data outright, you will have to use selective deletion before you can move the transport.

• Combining both of your suggestions… we can go ahead with one more option.

Create a table in the BI system and code an ABAP program to load this table from the DSO.
Once your changes move in, we can easily load the data back either through ABAP or through the normal transformation route to this DSO. Though the latter would be a performance dud if we are talking about multi-million records.

    What Say ?

    • This would be doing what I have indicated above through the application layer.

      Advantages :
      SID Generation , Monitor entries etc all exist
      Disadvantages :
      Additional application layer introduced
Additional development, transports, and testing required 🙁

      Instead of doing a custom table – you could do another DSO and load into that…