We often end up in situations in SAP BI where the master data code of an object (for example, material master codes) changes over time in SAP ECC, and BI is expected to report all historical as well as new data under the new code.
Example: A material in the SAP R/3 system had the number M100345. The IT team decided to upgrade the R/3 system to an ECC box; at the same time, the data standards were revisited and the naming convention for material codes changed. In the new ECC box this material is now coded as I300456. However, in BW the historical transactional data was already loaded with the old code (M100345).
Requirement: The SAP BW report is expected to aggregate the historical transactional data (i.e. the data with the old material code M100345) and the new transactional data (i.e. the data with the new code I300456) into a single line item, displaying the new material code in the report output.
Solution Options:
Traditional approach:
Such situations have traditionally been addressed by a complete reload of both the transactional data and the master data in BW. The transactional data is converted to the new code through a mapping file during the reload process. This approach usually involves significant effort, and any change in the mapping file leads to a complete reload of all transactional data in the BW system.
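The conversion step of the traditional approach can be sketched as follows. This is a minimal illustration, not actual BW transformation code: the mapping entries, record layout, and field names are assumptions chosen for the example.

```python
# Traditional approach (sketch): during the reload, each transactional
# record's material code is converted to the new code via a mapping
# file (old code -> new code). All names and values are illustrative.

# Assumed mapping file content: old material code -> new material code
mapping = {"M100345": "I300456"}

# Assumed historical transactional records carrying the old codes
transactions = [
    {"material": "M100345", "quantity": 10},
    {"material": "M200777", "quantity": 5},  # not in the mapping: kept as-is
]

def convert(record, mapping):
    """Replace the old material code with the new one, if mapped."""
    new_record = dict(record)
    new_record["material"] = mapping.get(record["material"], record["material"])
    return new_record

# The full transactional data set is rewritten and reloaded
reloaded = [convert(r, mapping) for r in transactions]
```

Note that if the mapping file changes later, this conversion (and hence the full reload) has to be repeated, which is exactly the weakness of this approach.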
Utilizing the master data as the mapping file: In this approach, the new codes (new material numbers) are loaded into the master data as a navigational attribute of the object (the new material number as a navigational attribute of the material object). The navigational attribute can then be used in reports. This approach handles dynamic changes in the mapping file and does not require any reload of transactional data in the BW system.
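The idea behind the master data approach can be sketched as follows. Again a simplified illustration, not BW query logic: the master data table, attribute name, and record layout are assumptions made for the example.

```python
# Master data approach (sketch): the new material code is stored as a
# (navigational) attribute of the material master data, and the report
# aggregates transactional data by that attribute instead of by the
# code stored in the cube/DSO. Names are illustrative only.
from collections import defaultdict

# Master data: material code -> attributes, including the new code
master_data = {
    "M100345": {"new_material": "I300456"},
    "I300456": {"new_material": "I300456"},  # new codes map to themselves
}

# Transactional data is left untouched: old and new codes coexist
transactions = [
    {"material": "M100345", "quantity": 10},  # historical record
    {"material": "I300456", "quantity": 7},   # new record
]

def report_by_new_code(transactions, master_data):
    """Aggregate quantities by the navigational attribute (new code)."""
    totals = defaultdict(int)
    for rec in transactions:
        new_code = master_data[rec["material"]]["new_material"]
        totals[new_code] += rec["quantity"]
    return dict(totals)

# Old and new records aggregate into a single line item under the new code
```

Because the mapping lives only in the master data, a change to the mapping requires just a master data refresh; the transactional records above are never rewritten.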
Traditional vs. Master Data Approach:
| Criterion | Traditional Approach | Master Data Approach |
|---|---|---|
| Complexity | Low | Medium |
| Reporting performance | High | Comparatively low (should not be an issue with HANA as the database), since reporting uses navigational attributes rather than data stored in the cube/DSO |
| Data reload effort | High | No reload required |
| Flexibility | Cannot accommodate mapping file changes without a data reload | Accommodates mapping file changes dynamically with a master data refresh only |
| Traceability to old codes | None (old codes are lost) | Yes; data can be reported on both old and new codes |
| Risk | High (a complete reload of transactional data puts data quality at risk) | Low (transactional data is not reloaded) |
| Downtime | Transactional and master data reloads incur system downtime, depending on data volume | None; the changes can be moved through transports, and master data is usually a full load |
Recommendation / Conclusion:
Prefer the traditional approach when data volumes are low, the data modeling logic is simple (limiting the regression-testing risk of reloaded data), and reporting performance is a major concern.
Prefer the master data approach when the complexity of the data model makes a complete data reload risky and the system's reporting performance is adequate.