Large, Global SAP BI4.x new deployment
“Humans and data quality errors are inseparable” MIT Symposium
Big data and Business Intelligence
Business Intelligence software and systems brought a level of automation to handle the big data revolution. In principle, that automation should remove the cognitive and confirmation biases humans have evolved in dealing with data.
Yet complex BI systems are designed and implemented by humans, and they inherit the same failings and cognitive flaws. A BI system can suffer the same performance problems. The larger the BI program, the greater the data flow, and the greater the chance of performance issues caused by errors in design and implementation. A speaker at an MIT Chief Data Officer Symposium in 2013 put it simply: “if humans are involved in the production of data, you should expect it to be imperfect.”
“Bad data is first and foremost a human phenomenon” MIT Symposium.
Build in the quality
The philosophy of quality control is not new. The aviation industry became a champion of it in the 1920s, when aircraft production boomed and the high human error rate was a sobering statistic. Automation in production eliminated many quality problems. The aim was to build in the quality, not inspect it in.
In 2015 Xoomworks BI produced a presentation for SAPinsider at Nice titled “A Step-by-Step Process to Design and Manage Successful SAP BI Implementations and Upgrades for Large Deployments.”(The long title itself could contain human errors.) The aim was to provide a framework to reduce the error rate in design and implementation. Build in the quality, not inspect it in. Following a multi-tiered architecture design also makes the task of troubleshooting easier.
Gartner has been hailing, analysing, and deriding BI software for several years, and its analysts often report on the poor performance and success rates of BI implementations. Their Magic Quadrant takes a software-centric approach, looking at the strengths and weaknesses of the leaders and followers. It is a useful guide, but it ignores the human element in design and implementation that can mean the success or failure of any BI system. In the final analysis, even the best software will fail when deployed if the system is not well designed.
The Correct Approach Leads To The Right Solution:
Leave questions open, and don’t worry about not knowing the answers. Gather information, then test and prove things yourself. Don’t make decisions based on bad facts that lead to bad conclusions.
Record and share your knowledge in clear technical documents for all processes. Add who does what, where, and how, and keep it updated.
Always put yourself in the customer’s shoes. One single report may have more value than any other. Test and explore all project variables and dependencies.
Be curious when performing root cause analysis (RCA) on any issue, from the OS layer up to the application layer.
Just because things look right for the user, it doesn’t mean they are right at the back end.
Five Tier Architecture
An SAP BI4.x implementation depends on everything: network components, OS settings, authentication, access, and security. It depends on all the connected data sources, and it depends on human behaviour and human failings. We have talked previously about human behaviour in the field of Change Management for users; here we look at design and implementation.
The goal is an unbreakable, dynamic configuration: one that can change and still remain error free as a BI solution.
“You purposefully need to create standardised systems, processes and methods. Otherwise, you will always have high failure rates and poor performance.” Lifetime Reliability Solutions
The design architecture for a new deployment follows five tiers:
Sandbox. This environment is used to test and explore issues seen in other environments. Sandbox can test new SAP BI4.x functionality, software upgrades or patches, new integrations with other software, and private-fix regression testing.
Development. Developers use this environment to create BI content. Development data sources will be connected. The information doesn’t have to be valid; it just has to be structured like the Production data source.
Acceptance. This environment is used to validate the configuration of a build against the connected Acceptance databases before the content moves to the next tier, Quality. Data sources contain an old copy of Production data, allowing validation of formatting, calculation results, and single-report performance.
Quality. In this environment, an internal BI Centre of Excellence (COE) validates the quality of the BI content generated. Performance tests need to be carried out in a real-life scenario: validating production data volumes and testing concurrent usage, scheduling, and publication under a worst-case scenario.
Production. This environment is used for the final reporting, consumption, and job processing for the connected Production data sources.
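The five tiers above form a simple promotion pipeline: content should only reach Production by passing through every tier in order. The sketch below illustrates that ordering; the tier names follow the article, but the gate descriptions and function are illustrative assumptions, not SAP-defined settings.

```python
# Illustrative sketch of the five-tier promotion path described above.
# Tier names follow the article; the purpose notes are assumptions.
TIERS = [
    ("Sandbox",     "explore new functionality, patches, regression tests"),
    ("Development", "create BI content against Development data sources"),
    ("Acceptance",  "validate build configuration and report formatting"),
    ("Quality",     "COE review and worst-case performance testing"),
    ("Production",  "final reporting, consumption, and job processing"),
]

def promotion_path(start: str) -> list[str]:
    """Return the ordered tiers content must still pass through
    from `start` up to and including Production."""
    names = [name for name, _ in TIERS]
    return names[names.index(start):]

print(promotion_path("Development"))
# ['Development', 'Acceptance', 'Quality', 'Production']
```

The point of encoding the order is that no tier can be skipped silently: any promotion tool built on such a list would have to walk every remaining gate.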
Understanding the full capabilities and features of SAP BI4.x is a necessity. With that knowledge, the best solution can be found for the requirements of the business. Building relationships with the people who will be involved in design and implementation is another necessity, as is understanding their positions and clearly defining their roles and responsibilities.
The strategy is to define and run a proof of concept to prove the business requirements are met. This phase will bring out technical issues; troubleshoot, fix, and document all of them using the same rigorous, defined approach as for the design. There will be configuration after the installation, and for success it is important not to take shortcuts. Security configuration is vital, particularly how you assign groups and custom access levels.
A business intelligence system is an existing deployment once it has started running, and it can be monitored as such. SAP BI4.x growth and usage are dynamic. The situation can change daily depending on the type and level of controls which are implemented. There will be ongoing maintenance and software upgrades to undertake.
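Because growth and usage change daily, monitoring needs a concrete rule for when change becomes a problem. A minimal sketch of such a daily growth check follows; the metric names, snapshot structure, and 10% threshold are illustrative assumptions, not part of any SAP tooling.

```python
# Hypothetical daily check: flag a deployment whose document count or
# repository size grows faster than an agreed threshold between snapshots.
from dataclasses import dataclass

@dataclass
class DailySnapshot:
    documents: int        # number of stored BI documents (assumed metric)
    repo_size_gb: float   # repository size in GB (assumed metric)

def growth_alerts(yesterday: DailySnapshot, today: DailySnapshot,
                  max_growth_pct: float = 10.0) -> list[str]:
    """Return a human-readable alert for each metric that grew
    more than `max_growth_pct` percent since the last snapshot."""
    alerts = []
    for field in ("documents", "repo_size_gb"):
        before, after = getattr(yesterday, field), getattr(today, field)
        if before and (after - before) / before * 100 > max_growth_pct:
            alerts.append(f"{field} grew more than {max_growth_pct:.0f}% in a day")
    return alerts

print(growth_alerts(DailySnapshot(1000, 50.0), DailySnapshot(1200, 51.0)))
# ['documents grew more than 10% in a day']
```

A check like this turns “growth and usage are dynamic” into something actionable: an alert fires before capacity or maintenance work is overtaken by events.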
One real-world example: a production system performed worse than the test system even though its CPU had a higher clock speed. The problem wasn’t the processor speed but the channel throughput specification for that CPU in production. The higher-frequency CPU had a 100MHz Front-Side Bus (FSB), whereas the CPU model used on the test system had a 400MHz FSB.
Intel and AMD noticed the speed problem many years ago for simple calculations on a large data set. The FSB couldn’t keep up with the CPU. Whereas with complex calculations, the greater time spent in the CPU meant the FSB could keep up. The human error in this example was a failure to have detailed information on the chosen hardware.
Errors do not have to end in failure
It is important not to design an over-complex BI system architecture. An architecture proposal may look good on paper, but if a system has installed components surplus to the business requirement, issues with those redundant components may surface during the proof-of-concept phase. That alone may be enough to jeopardise a new project.
Human design and involvement will produce errors under any approach to design and deployment. If the correct procedures are followed, fixing them becomes part of the procedure. Errors do not have to end in failure.