4. Conceptual Architecture and System Landscape
4.1 Business Intelligence Data Architecture
The existing landscape uses a BW system to store data coming primarily from SAP ECC and some flat files. As a result, much information from external systems is never integrated into the reporting environment, leading to "silos" of data for each business area.
Reporting tools are limited to BEx Analyzer (Excel-based) and some web applications viewed through the SAP Enterprise Portal. End users' ability to develop their own reports is almost non-existent.
The diagram above gives a generic Business Intelligence architecture that will serve as the standard guideline for designing future SAP BI based reporting applications at YOUR COMPANY. A single BI instance will suffice for the reporting requirements across the enterprise, regardless of the configuration of the SAP ERP landscape or source systems. The BI landscape is designed so that data can be loaded not only from SAP sources but from non-SAP sources as well. The principles are based on a combination of standard Data Warehousing and SAP BI best practices.
4.1.1 Enterprise Data Warehouse (EDW)
An Enterprise Data Warehouse (EDW) is an organization-wide data warehouse built on a layered architecture. An EDW helps organizations solve the challenges of a heterogeneous source system landscape, which include the following:
· Redundant extractions
· Isolated data stores
· Multiple warehouse and reporting solutions
· Different data models
4.1.2 Design Guidelines
Some of the concepts which form the basis of the design are:
– Common Master Data, Texts and Hierarchies: This is part of the SAP NetWeaver BI solution offering and is achieved via the extended star schema design at the core of the SAP BI architecture.
– Transactional Data: Data from any source is extracted and transformed into a common structure for aggregation, reporting, and analysis.
– Data Flow: a layered approach from Source System → Staging → Integration (Operational/Atomic) → Data Mart → Analytical
– Reporting Layer: All reporting will be based only on MultiProviders, leveraging BEx Analyzer as the MDX conduit. The exception is legacy structures whose data records will not be integrated into the NetWeaver landscape; these will be reported through the common semantic layer.
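The layered data flow above (Source System → Staging → Integration → Data Mart) can be sketched as a simple pipeline. This is a minimal illustration only: the layer functions, field names, and sample records below are invented and do not correspond to actual SAP BI objects.

```python
# Sketch of the layered data flow: Source -> Staging -> Integration -> Data Mart.
# All names and transformations are illustrative, not actual SAP BI objects.

def stage(raw_rows):
    """Staging layer: land source records unchanged (in the spirit of a PSA)."""
    return [dict(r) for r in raw_rows]

def integrate(staged_rows):
    """Integration layer: harmonize records into a common atomic structure."""
    return [
        {"doc": r["doc_no"], "account": r["acct"].upper(), "amount": float(r["amt"])}
        for r in staged_rows
    ]

def data_mart(integrated_rows):
    """Data-mart layer: aggregate by account for reporting."""
    totals = {}
    for r in integrated_rows:
        totals[r["account"]] = totals.get(r["account"], 0.0) + r["amount"]
    return totals

source = [
    {"doc_no": "100", "acct": "4000", "amt": "250.00"},
    {"doc_no": "101", "acct": "4000", "amt": "100.00"},
    {"doc_no": "102", "acct": "5100", "amt": "75.50"},
]
print(data_mart(integrate(stage(source))))  # aggregated analytical view
```

Each layer only consumes the output of the layer below it, which is the property the layered architecture is meant to guarantee.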
4.1.3 Interim Architecture
The interim landscape will consist of two separate BW systems: the existing SAP BW 3.5 system and a new SAP BW 7.x system for divisions converted to the new SAP ECC system. This means that all current data will remain in the old BW 3.5 system.
Since the ultimate goal is to create an Enterprise Data Warehouse, a methodical approach will be made to bring in data from non-SAP systems in the new system. A unifying layer called a semantic layer will be created through BusinessObjects universes to join data from the new system with data from the old BW 3.5 system as well as information residing outside of the data warehouse.
Master data is important in any BW environment, but it will be even more critical for YOUR COMPANY because there will be a significant period in which two systems must serve consolidated and cross-divisional reporting requirements. Synchronizing these two separate systems depends on cross-reference master data files.
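The cross-system consolidation described above can be sketched as a join through a cross-reference table, in the way a semantic layer (e.g. a BusinessObjects universe) would combine the two systems at a common granularity. The keys, cost centers, and figures below are invented for illustration.

```python
# Sketch: joining "old world" (BW 3.5) and "new world" (BW 7.x) data through a
# cross-reference mapping, as a semantic layer would. All data is invented.

xref = {"LEGACY-CC-10": "1000", "LEGACY-CC-20": "2000"}  # old key -> new key

old_world = [{"cost_center": "LEGACY-CC-10", "actuals": 500.0},
             {"cost_center": "LEGACY-CC-20", "actuals": 300.0}]
new_world = [{"cost_center": "1000", "actuals": 120.0},
             {"cost_center": "2000", "actuals": 80.0}]

def consolidated(old_rows, new_rows, mapping):
    """Merge both systems at the common cost-center granularity."""
    totals = {}
    for r in old_rows:
        key = mapping[r["cost_center"]]  # translate the legacy key first
        totals[key] = totals.get(key, 0.0) + r["actuals"]
    for r in new_rows:
        totals[r["cost_center"]] = totals.get(r["cost_center"], 0.0) + r["actuals"]
    return totals

print(consolidated(old_world, new_world, xref))
```

Note that the join is only as good as the cross-reference file, which is exactly the dependency called out as a risk below.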
4.1.3.1 Interim Solution Risks
• Master data models differ significantly between the old and new worlds
• Consolidation requires cross-reference conversion files that are yet to be determined
• Pre-deployment data will remain in the old BW 3.5 system, which could lead to confusion and misinterpretation of data when users require detailed analysis
• The support team will have to maintain two systems, adding complexity to meeting reporting demands from converted and non-converted divisions
• The level of granularity for cross-divisional and historical reporting has yet to be determined
• The semantic layer (BusinessObjects universe) will be used to join similar data across systems/platforms at a fixed level of granularity for cross-divisional and historical reporting
Decommissioning the Old World System
• A strategy for retiring/sunsetting the dual reporting environments is required to reduce change management issues and ensure adoption of the new reporting tools.
• When a system is decommissioned, its definition must be updated in the semantic layer.
4.2 Design Template and Global Data Structure
This will be the central design philosophy for achieving a common Global BI platform at YOUR COMPANY.
Every BI Application design can be split into two aspects, Application Design and Data Architecture Design.
4.2.1 Application Design Template
This covers the development of a standard design template for every application under consideration. The template will be designed and developed centrally by the BI Center of Excellence. It will be used globally with regional reach to promote consistent, normalized information as an enabler for business improvement, leveraging the standard Business Content delivered by SAP NetWeaver as industry leading practice. Master data will be shared across areas. The template will take into consideration all the objects required for a specific application:
– Master Data Objects and their extractors
– Transaction DataSources
– Staging Objects (PSAs or tables) and their structures
– InfoProviders: DSOs and Cubes
– Transformations and Data Transfer Processes in BI
– Reporting components: queries/reports, variables, BusinessObjects universe design
– Loading Process Chains and jobs
– Any programs, function modules, Data Dictionary or other ABAP-related objects
Differentiation between the locations an object belongs to will be done via naming conventions; refer to the Naming Conventions and Standards section in Appendix E of this document for more detail. The example below illustrates the proposed application template design:
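As a minimal sketch of location-aware naming conventions, a small validator can parse an object name into its region and area components. The pattern Z&lt;REGION&gt;&lt;AREA&gt;_&lt;NAME&gt; and the region/area codes below are hypothetical; the authoritative conventions are those in Appendix E.

```python
import re

# Sketch of a naming-convention check that differentiates template objects by
# location/region prefix. The pattern Z<REGION><AREA>_<NAME> and the code lists
# are assumptions for illustration; the real rules live in Appendix E.

NAME_RE = re.compile(r"^Z(GL|NA|EU|AP)([A-Z]{2})_[A-Z0-9_]{1,20}$")

def parse_object_name(name):
    """Return the region and area encoded in a template object name."""
    m = NAME_RE.match(name)
    if not m:
        raise ValueError(f"{name!r} violates the naming convention")
    region, area = m.groups()
    return {"region": region, "area": area, "name": name}

print(parse_object_name("ZNAFI_SALES_DSO"))  # North America, Finance
```

A check like this could run in a transport pre-approval step so non-conforming objects are caught before template release.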
Template release will be controlled via the BI Center of Excellence and can be deployed to multiple locations using the above concept.
4.2.2 Master Data Considerations
4.2.2.1 The Issue
Poor master data affects both YOUR COMPANY's top and bottom lines. The following issues have been identified:
– Inaccurate material data leads to high supply chain costs
– Duplicates in the vendor master and inconsistent payment terms and cash discounts (pricing conditions) impair the efficiency of accounts payable processes
– Conflicting definitions of employee master data elements make global reporting difficult
– High costs are incurred due to sub-optimal supplier selection and rationalization
– Costing is not done properly because:
  – There is no control around cost elements
  – Work centers are not aligned with similar work processes
  – Activity types do not correctly represent labor, machine and potential overhead
  – Full absorption costing is done with a single activity type today
  – No definitive worldwide cost "owner/driver" has been determined
– There is no solid control around the master data objects and elements that impact profitability reporting (such as the product hierarchy)
– 30-40% of materials in V&M are duplicate or obsolete. There is no clear differentiation between OEM and generic parts and products, which has led to duplicate part numbers for the same type of part
– Duplicates exist; credit limits are not uniformly checked or applied across the organization; and cash discounts, billing terms, and payment terms on cash advances lack governance and standardization
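The duplicate-material problem above can be illustrated with a simple normalization-based match: descriptions that differ only in case, separators, or spacing collapse to the same key. This is only a sketch; real duplicate detection would use a dedicated data-quality tool, and the materials below are invented.

```python
import re
from collections import defaultdict

# Sketch: flagging likely duplicate material masters by normalizing the
# description (case, separators, whitespace). Sample data is invented.

def normalize(description):
    """Uppercase and collapse all non-alphanumeric runs to single spaces."""
    return re.sub(r"[^A-Z0-9]+", " ", description.upper()).strip()

def find_duplicates(materials):
    """Group material numbers whose normalized descriptions collide."""
    groups = defaultdict(list)
    for mat_no, desc in materials:
        groups[normalize(desc)].append(mat_no)
    return {key: nums for key, nums in groups.items() if len(nums) > 1}

materials = [
    ("M-1001", "Bearing, 6204-2RS"),
    ("M-2047", "BEARING 6204 2RS"),   # same part, different formatting
    ("M-3000", "Gasket 50mm"),
]
print(find_duplicates(materials))
```

Even this naive rule surfaces the OEM-vs-generic duplicate pattern described above; production-grade matching would add fuzzy comparison and attribute-level rules.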
4.2.2.2 The Challenge
– There are no global definitions of material, vendor and customer data
– There is no standardized way to consistently interpret the semantics of material data
– The data source or system of record is not clear or well defined
– Data quality is poor
– There is no master data strategy and no enterprise perspective
– There is no common data architecture by data domain
– There is no master data infrastructure to manage YOUR COMPANY's enterprise data
4.2.2.3 The BI Design
This aspect of the design involves careful analysis of the data required for reporting and of the resulting model. The task will mainly involve Subject Matter Experts, Data Architects, and Power Users.
A key aspect of data model design is the definition of a Global Data Structure (GDS) for a specific subject matter. This gives a consistent definition of each data element regardless of the source of the data.
For example, to carry out financial reporting it will be necessary to identify the key fields of the GDS within the model in a flexible manner, such that local elements can be appended while keeping the global structure intact. The following example shows how the GDS would be designed:
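As a minimal sketch of that idea, the GDS can be modeled as a record with a fixed set of global key fields plus an extensible bag of local fields, so regions append attributes without touching the global core. All field names and values here are illustrative assumptions, not the actual YOUR COMPANY structure.

```python
from dataclasses import dataclass, field

# Sketch of a Global Data Structure (GDS) for financial reporting: immutable
# global key fields plus an extensible dict of local, region-specific fields.
# Field names are illustrative, not the actual GDS definition.

@dataclass(frozen=True)
class GlobalFinanceRecord:
    company_code: str
    gl_account: str
    fiscal_period: str
    amount: float
    currency: str
    local_fields: dict = field(default_factory=dict)  # region-specific add-ons

rec = GlobalFinanceRecord(
    company_code="1000",
    gl_account="0040000010",
    fiscal_period="2024-03",
    amount=1250.0,
    currency="USD",
    local_fields={"us_state_tax_code": "TX"},  # local append, global core intact
)
print(rec.gl_account, rec.local_fields["us_state_tax_code"])
```

Freezing the global fields while leaving `local_fields` open mirrors the requirement that local elements can be appended without altering the global structure.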
The SAP BI 7.0 platform allows extraction of data from any source, both SAP and non-SAP data sources. SAP BI also provides an add-in to the SAP MDM platform for seamless reporting of master data.
The SAP BI platform will be used for both transaction and master data.
Shared Master Data Considerations
Having a global environment means sharing not only resources but also data objects; e.g. master data InfoObjects will be shared across regions (i.e. the Operational Chart of Accounts will be stored in the Global Business Content InfoObject 0GL_ACCOUNT).
The main problem with loading shared objects is:
– Locking: table locks can occur if the same object is refreshed by loads triggered from different regions. This must be avoided and controlled by centrally monitoring the load process on a pre-defined schedule.
Master data InfoObjects will be divided into three types:
– Global Critical Objects: shared objects that must be loaded at agreed times before the transaction data
– Regional Critical Objects: region-specific master data loaded independently for each region; it must be loaded before that region's transaction data
– Non-Critical Objects: master data that is important but has no significant impact on loading processes and can be refreshed in parallel with other objects (e.g. account descriptions)
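The ordering rule above (global critical first, then each region's master data before that region's transactions) can be sketched as a schedule builder. The object names are invented; non-critical objects are omitted since they may run in parallel at any time.

```python
# Sketch of the load-ordering rule: global critical master data first, then,
# per region, regional master data before that region's transaction data.
# All object names below are invented for illustration.

GLOBAL_CRITICAL = ["0GL_ACCOUNT", "0COMP_CODE"]
REGIONAL_CRITICAL = {"NA": ["NA_COSTCENTER"], "EU": ["EU_COSTCENTER"]}
TRANSACTION = {"NA": ["NA_FI_DELTA"], "EU": ["EU_FI_DELTA"]}

def build_schedule(regions):
    """Return (step_type, object) pairs in a dependency-safe order."""
    schedule = [("global_master", obj) for obj in GLOBAL_CRITICAL]
    for region in regions:
        schedule += [("regional_master", obj) for obj in REGIONAL_CRITICAL[region]]
        schedule += [("transaction", obj) for obj in TRANSACTION[region]]
    return schedule

for step, obj in build_schedule(["NA", "EU"]):
    print(step, obj)
```

In practice each region's block could run independently in its own time window, which is what makes the per-region split useful across time zones.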
4.2.3 Near-Real-Time Reporting / Zero Latency
Finance and some other functional areas require near-real-time data access, although for most analytical uses of business intelligence a prior-day or even less current data set is sufficient. Options for meeting the near-real-time requirement include:
– Frequent push of incremental data to a near-real time Reporting Staging area to achieve near real-time performance
– SAP BI 7.0 provides a Real-Time Data Acquisition (RDA) process, in which an RDA daemon runs at fixed intervals to pull data from the source system.
– Virtual InfoProviders (SAP BI 7.0): a Virtual InfoProvider does not store any physical data; it is purely a logical representation of the multi-dimensional cube structure. Data is retrieved from the source at query runtime.
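The RDA-style option can be illustrated with a minimal daemon cycle: at each interval, only the records that appeared since the last pull are moved into a near-real-time staging area. The "source queue" below is a plain list standing in for a delta queue; everything here is a simplification of the actual RDA mechanism.

```python
# Sketch of a daemon cycle in the spirit of Real-Time Data Acquisition: each
# run pulls only the delta that appeared in the source since the last run and
# pushes it into a near-real-time staging area. The source is simulated.

source_queue = [{"id": 1}, {"id": 2}, {"id": 3}]
staging = []
pointer = 0  # high-water mark: index of the last record already pulled

def pull_delta():
    """One daemon cycle: move new source records into staging."""
    global pointer
    new_records = source_queue[pointer:]
    staging.extend(new_records)
    pointer = len(source_queue)
    return len(new_records)

pulled_first = pull_delta()          # first cycle: everything is new
source_queue.append({"id": 4})       # a new record arrives in the source
pulled_second = pull_delta()         # second cycle: only the delta
print(pulled_first, pulled_second, len(staging))  # 3 1 4
```

A real daemon would run this cycle on a timer; the key point is that latency is bounded by the interval, not by a nightly batch window.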
4.2.4 Load Strategy
A service level agreement (SLA) needs to be developed and communicated in order to set expectations on when data should be available for reporting to users in YOUR COMPANY’s defined regions around the world.
By leveraging an enterprise data warehouse and multi-layer concept, a loading strategy will be implemented in order to meet the demands for information to be available to users across multiple time-zones while not impacting reporting for other regions. An initial staging, or Data Warehouse, layer will contain data for the entire enterprise (all regions) loaded from the source system. These loads will be delta-enabled to prevent source system overload. The analytical layer will be divided into regions when possible to load and contain data by predefined regions. It is critical that master data be loaded and activated before any transactional data is loaded into the Data Warehouse layer.
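The two-layer strategy described above can be sketched as a delta-enabled load into an enterprise-wide Data Warehouse layer followed by a split into region-specific analytical targets. The sequence numbers and region codes are invented; a real delta mechanism would use the extractor's delta queue rather than this high-water mark.

```python
# Sketch of the multi-layer load strategy: a delta-enabled load into the
# enterprise-wide DW layer, then a split by region into analytical targets so
# each region's reporting data loads independently. Data is invented.

dw_layer = []
analytical = {"NA": [], "EU": []}
last_delta = 0  # high-water mark standing in for a real delta pointer

def load_to_dw(source_rows):
    """Delta load: take only records newer than the high-water mark."""
    global last_delta
    delta = [r for r in source_rows if r["seq"] > last_delta]
    dw_layer.extend(delta)
    if delta:
        last_delta = max(r["seq"] for r in delta)
    return delta

def split_to_analytical(rows):
    """Analytical layer: partition the delta by predefined region."""
    for r in rows:
        analytical[r["region"]].append(r)

src = [{"seq": 1, "region": "NA"}, {"seq": 2, "region": "EU"}]
split_to_analytical(load_to_dw(src))
src.append({"seq": 3, "region": "NA"})
split_to_analytical(load_to_dw(src))  # second run picks up only seq 3
print(len(dw_layer), len(analytical["NA"]), len(analytical["EU"]))  # 3 2 1
```

Because only the delta crosses the interface, a second run never reloads the full history, which is what protects the source system from overload.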
4.2.4.1 Load Process
Some of the basic loading principles have been outlined below:
– All loading will be carried out using Process Chains.
– Email notifications can be set up at the beginning and/or end of each process chain to notify support team of process chain status
– Process chains will be triggered by events rather than by time dependencies. These events can be raised internally within the SAP BI environment or by external tools such as Tivoli.
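The event-driven principle above can be sketched as a tiny publish/subscribe loop: each chain subscribes to an event instead of a clock time and raises its own completion event, so an external scheduler only needs to fire the first event. Chain names and events are invented; the print/log lines stand in for the e-mail notification step.

```python
# Sketch of event-triggered process chaining: chains subscribe to events and
# raise a completion event, so downstream chains start as soon as upstream
# work finishes. Chain and event names are invented for illustration.

subscribers = {}

def on_event(event, chain):
    """Register a chain to start when the given event is raised."""
    subscribers.setdefault(event, []).append(chain)

def raise_event(event):
    """Fire an event, starting every chain subscribed to it."""
    for chain in subscribers.get(event, []):
        chain()

log = []

def master_data_chain():
    log.append("master_data start")
    log.append("master_data done")   # e-mail notification would go here
    raise_event("MASTER_DATA_DONE")  # unblock dependent chains

def transaction_chain():
    log.append("transaction start")
    log.append("transaction done")

on_event("SOURCE_READY", master_data_chain)
on_event("MASTER_DATA_DONE", transaction_chain)
raise_event("SOURCE_READY")          # e.g. fired by an external tool like Tivoli
print(log)
```

Compared with fixed start times, this removes idle gaps between chains and ensures the master-data-before-transactions ordering holds even when load durations vary.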
YOUR Process here:
4.3 SAP BI Accelerator / HANA
YOUR Process here
– With SAP BI 7.0 onwards, SAP introduced the Business Intelligence Accelerator (BIA), a transparent approach based on TREX to boost BI query performance. Speedup factors between 10 and 100 can be achieved without changing the BI user experience (i.e. it is transparent to users).
– For any medium to large implementation, SAP BIA is mandatory to fully benefit from the SAP BI platform. At YOUR COMPANY, BIA will be part of the SAP BI installation (late 20XX), and its sizing will be done along with the SAP BI server sizing. SAP BIA will be required for only three environments: Production, UAT, and Disaster Recovery.
Add your BWonHANA plans here:
Next in the series will be