
We often end up in a situation within SAP BI where the master data code of an object (for example, a material master code) changes over time in SAP ECC, and BI is expected to report all historical as well as new data under the new code.

Example: A material in the SAP R/3 system had the number M100345. The IT team decided to upgrade R/3 to an ECC box; at the same time the data standards were revisited and the naming convention for material codes changed. In the new ECC box this material is now coded as I300456. However, in BW the historical transactional data was already loaded with the old code (M100345).


Requirement: The SAP BW report is expected to aggregate the historical transactional data (i.e. the data with the old material code M100345) and the new transactional data (i.e. the data with the new code I300456) into a single line item, displaying the new material code in the report output.
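To make the requirement concrete, the expected aggregation can be sketched in Python; the records and the old-to-new mapping below are hypothetical illustrations, not actual BW data:

```python
# Hypothetical transactional records: some carry the old material code,
# some already carry the new ECC code.
records = [
    {"material": "M100345", "quantity": 10},  # historical (old R/3 code)
    {"material": "I300456", "quantity": 5},   # new (ECC code)
]

# Mapping of old codes to new codes (the "mapping file").
old_to_new = {"M100345": "I300456"}

# Aggregate everything under the new code, as the report should display it.
totals = {}
for rec in records:
    new_code = old_to_new.get(rec["material"], rec["material"])
    totals[new_code] = totals.get(new_code, 0) + rec["quantity"]

print(totals)  # {'I300456': 15}
```

Both rows roll up into one line item keyed by the new code I300456, which is exactly what the report output should show.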


Solution Options:

Traditional approach:

Such situations have traditionally been addressed by a complete reload of both the transactional data and the master data in BW. The transactional data is converted to the new codes through a mapping file during the reload process. This approach usually requires significant effort, and any change in the mapping file leads to a complete reload of all transactional data in the BW system. Here is the sequence of steps for this approach:

  1. Get the mapping file with all old codes (example old material numbers) and their corresponding new codes (new material numbers).
  2. Delete all transactional data from the BW system and refresh the master data from the source system.
  3. Reload all transactional data and change the old codes to new codes by referring to the mapping file generated in step 1.
  4. Perform the above steps for each object whose codes are changing (material, plant, etc.) and for each transactional data store (DSO, cube, etc.) where those objects are used.
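Step 3 above (converting old codes during the reload) can be sketched as follows. This is a minimal illustration, assuming the mapping file is a simple CSV of old/new pairs; the file layout, column names, and field names are all assumptions, not the actual BW transformation logic:

```python
import csv

def load_mapping(path):
    """Read the old-code -> new-code mapping file (CSV with columns old_code,new_code)."""
    with open(path, newline="") as f:
        return {row["old_code"]: row["new_code"] for row in csv.DictReader(f)}

def convert_records(records, mapping, field="material"):
    """Rewrite each record's code to the new one while reloading."""
    for rec in records:
        # Codes not present in the mapping file pass through unchanged.
        rec[field] = mapping.get(rec[field], rec[field])
    return records

# Inline sample standing in for load_mapping("mapping.csv"):
mapping = {"M100345": "I300456"}
records = [
    {"material": "M100345", "amount": 100},
    {"material": "X999", "amount": 7},  # code unchanged, no mapping entry
]
convert_records(records, mapping)
print(records[0]["material"])  # I300456
```

Note that every record of every affected data store passes through this conversion, which is why a change in the mapping file forces a full reload under this approach.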

Utilizing the master data as a mapping file:

In this approach, the new codes (new material numbers) are loaded into the master data as a navigational attribute of the object (e.g. new material as a navigational attribute of the material object). The navigational attribute can then be used in reports. This approach handles dynamic changes in the mapping file and does not require any reload of transactional data in the BW system. Here are the high-level steps for this approach:

  1. Get the mapping file with all old codes (example old material numbers) and their corresponding new codes (new material numbers).
  2. Add a new navigational attribute to the object for which the codes are changing. This should be done for all objects where codes are changing, irrespective of whether the object is a master data object or not.
  3. Load the navigation attribute with the new codes (new material code) with reference to the mapping file. No changes in the transactional data of BW.
  4. Identify the transactional data objects where the object (in step 3) is used through ‘where used list’ functionality of BW system. Activate the newly added navigational attribute (made in step 2) in the transactional data object (example DSO or cube).
  5. Change the reports to use the newly added navigational attribute for displaying the data. The original object can still be used in the reports to display the old code of the same object.
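The reporting behaviour of this approach can be sketched as a lookup join: the transactional data keeps the old codes untouched, and the report resolves each code to the "new material" navigational attribute at query time. All table and attribute names below are illustrative assumptions:

```python
# Master data: each material carries its new code as a navigational attribute.
# Materials whose codes did not change simply map to themselves.
master_data = {
    "M100345": {"new_material": "I300456"},
    "I300456": {"new_material": "I300456"},
}

# Transactional data is left untouched -- it still stores the old codes.
transactions = [
    {"material": "M100345", "quantity": 10},
    {"material": "I300456", "quantity": 5},
]

# At report time, group by the navigational attribute instead of the stored key.
report = {}
for row in transactions:
    nav = master_data[row["material"]]["new_material"]
    report[nav] = report.get(nav, 0) + row["quantity"]

print(report)  # {'I300456': 15}
```

Because the join happens at query time, a change in the mapping only requires refreshing the master data; the stored transactional records never change, and the old codes remain available for traceability.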


Traditional vs. master data approach:

| Criterion | Traditional approach | Master data approach |
|---|---|---|
| Complexity | Low | Medium |
| Reporting performance | High | Comparatively low (should not be an issue with HANA as the database), since reporting uses navigational attributes rather than data stored in the cube/DSO |
| Data reload effort | High | No reload required |
| Flexibility | Cannot accommodate mapping file changes without a full data reload | Accommodates mapping file changes dynamically with a master data refresh only |
| Traceability to old codes | None (old codes are lost) | Yes; data can be reported on both old and new codes |
| Risk | High (a complete reload of transactional data puts data quality at risk) | Low (transactional data is not reloaded) |
| Downtime | Transactional and master data reload incurs system downtime, depending on data volume | None; the changes can be moved through transports, and master data is usually a full load |

Recommendation / Conclusion:

Prefer the traditional approach where data volumes are low, the data modeling logic is simple (keeping the regression-testing risk of reloaded data low), and reporting performance is a major concern.

Prefer the master data approach when the data model is complex enough that a complete data reload carries significant risk, and the system's reporting performance is already adequate.
