Managing the economics and ownership costs of the data center has become a key issue for data center operators and IT managers, largely due to the explosion in data volumes and the rising cost of owning and managing increasingly complex hardware landscapes. The pressure of running a real-time business also demands faster system provisioning and greater uptime within the data center.
These forces have driven the push to the cloud we have seen over the past decade, as companies look to offload their IT costs and efforts to third-party providers, as well as the growth of in-memory databases to process ever larger workloads. Still, despite the cloud shift, at least 40-50% of workloads will remain on premises in the near future, for availability or legal reasons.
As such, CIOs today face five strategic challenges:
1. How to enable on-going innovation while ensuring business continuity
2. How to reduce planned or unplanned downtime
3. How to accommodate growing data volumes and landscape complexity
4. How to provision systems quickly, scale up or down according to demand
5. How to reduce the costs of running a data center – improve data center economics
As most data centers today are hardware defined, scalability and virtualization have been major challenges in provisioning multiple workloads. From a cost perspective, the practice of simply buying and adding new hardware for provisioning has proven cost-ineffective in the long run. The manual effort of installation, provisioning and maintenance in a heterogeneous data center landscape leads to operational complexity and an increased total cost of ownership (TCO). For reference, 72% of IT budgets today go simply toward maintenance, leaving a slim 28% for innovation.
Cost Challenges in Detail – Hardware Limitations, Redundant Data Footprints, Unscalable Labor Efforts
From an architectural perspective, most enterprise data centers are limited on the hardware side. Dedicated system setups mean multiple systems requiring dedicated hardware, which leads to a heterogeneous data center full of different equipment types.
Next, multiple systems contribute to redundant data footprints, where data is constantly replicated, copied and stored for different use cases. Analytical data, for example, is extracted, transformed and loaded, and remains separated from the original data. Redundant data footprints simply mean higher costs and a loss of agility.
To operate in an increasingly complex environment, additional database administrators are needed simply to extract, transform, load and manipulate data sets. This adds further cost pressure on the IT department, as shown in the graphic below put together by my colleague, Saiprashanth Reddy Venumbaka.
Where CIOs should start focusing – Reducing Data Footprints, Simplifying Hardware landscapes, Data Processes and Operations
With the Internet of Things, unstructured data volumes are set to grow exponentially, and tackling these growing volumes is the first step in managing one of the largest cost drivers CIOs face.
Take the example of Medtronic, a US medical devices company, whose data warehouse doubled in size in the span of three years, with the vast majority of the data arriving as unstructured text. Medtronic's data warehouse began experiencing performance issues, and new requirements to merge even larger amounts of data from multiple sources compounded these problems. The company has since leveraged in-memory technology to manage its large data volumes, becoming one of the early adopters of SAP HANA.
Aside from leveraging column-store databases for efficient data compression, CIOs should also look to reduce the special caching, indexing, and data duplication processes that are typically needed to ensure smooth operational reporting and analytic performance on traditional databases.
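To see why column stores compress so well, consider that each column's values are stored contiguously, so repeated values can be dictionary-encoded: each distinct value is stored once, and rows keep only a small integer index. The Python sketch below is purely illustrative of that idea, not SAP HANA's actual compression implementation:

```python
def dictionary_encode(column):
    """Dictionary-encode a column: store each distinct value once,
    and keep a small integer index per row instead of the full value."""
    dictionary = sorted(set(column))          # distinct values, stored once
    index = {value: i for i, value in enumerate(dictionary)}
    return dictionary, [index[v] for v in column]

# A low-cardinality column (e.g. country codes) compresses well:
column = ["DE", "US", "DE", "JP", "US", "DE", "DE", "JP"]
dictionary, encoded = dictionary_encode(column)
print(dictionary)  # ['DE', 'JP', 'US']
print(encoded)     # [0, 2, 0, 1, 2, 0, 0, 1]
```

The fewer distinct values a column has relative to its row count, the greater the savings, which is why typical enterprise columns (status codes, regions, product IDs) compress dramatically in a column store.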
With in-memory technology, however, SAP HANA allows users to process large data sets (as in the example above) at in-memory speeds, with OLTP and OLAP running concurrently. Processes and analytical queries that used to take hours can now be completed in seconds or minutes, with little or no need for the preparation steps above.
Take Nissha Printing, a Japanese impression technology company: by running its reporting and analysis processes on the SAP HANA platform, the company reduced the number of data cubes needed by 45%, from 103 to 56, and cut the number of pre-written queries by 65%, from 700 to 250. In another example, the University of Kentucky reduced its extract, transform, and load process time from 8 hours to less than 1 hour, and was able to redeploy some IT staff to more strategic activities.
The functional scope of SAP HANA and the SAP platform not only provides real-time analytics from the HANA-based system, it also combines functionality that previously could only be provided by specialty servers. Search, geospatial, graph and predictive libraries are now embedded within SAP HANA, removing the need for separate, specialized engines. By collapsing multiple systems into one, IT managers can solve data redundancy and quality challenges, and remove the need for data duplication.
With less data and hardware to manage, data processes and operations can be further simplified. There are fewer steps to extract, transform and load data, as SAP HANA operates on a single copy of the data. This further reduces the manual preparation effort required of DBAs.
Simplifying IT operations: Provisioning quicker, prioritizing data volumes
CIOs can also leverage new containerization and virtualization technologies to reduce total cost of ownership by provisioning instances more quickly.
- SAP HANA, in its latest release SPS9, comes with multi-tenant database containers, which allow workloads to be provisioned more quickly and resource allocation for applications to be optimized. More workloads can now run on less hardware, potentially reducing the hardware footprint.
- In the context of efficiency, the SAP Cloud Appliance Library provides the flexibility to deploy SAP solutions to a cloud provider of choice, serving as an extension of an on-premise data center. This helps accelerate consumption of SAP software on demand, within minutes.
- Provisioning SAP systems is another complex task that can be simplified. SAP Landscape Virtualization Management allows IT managers to automate processes, lifecycle management tasks and system provisioning activities, reducing the time and effort needed for provisioning.
- Growing data volumes pose another challenge for holding all data in memory. As only around 15% to 25% of all data is accessed frequently, it can be expensive to keep everything in memory. With Dynamic Tiering, data can now be managed better by prioritizing it according to its importance and value. With freed-up database resources, deployments can scale to the petabyte range, since they are no longer limited by memory size as before.
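The prioritization idea behind tiering can be sketched in a few lines: data that is accessed frequently stays in memory ("hot"), while the rest moves to cheaper disk-based storage ("warm"). The Python below is a simplified illustration with a hypothetical access-count threshold, not the actual Dynamic Tiering mechanism:

```python
def assign_tier(accesses_last_30_days, hot_threshold=100):
    """Keep frequently accessed data in memory (hot) and move the
    rest to disk (warm). The threshold is a hypothetical policy knob."""
    return "hot" if accesses_last_30_days >= hot_threshold else "warm"

# Hypothetical access counts per table partition: recent data runs hot,
# while older, rarely touched partitions can live on disk.
partitions = {"orders_2015": 5400, "orders_2014": 310, "orders_2010": 12}
tiers = {name: assign_tier(count) for name, count in partitions.items()}
print(tiers)  # {'orders_2015': 'hot', 'orders_2014': 'hot', 'orders_2010': 'warm'}
```

Since only roughly 15-25% of data is hot in practice, a policy like this keeps the expensive in-memory tier small while the bulk of the data sits on far cheaper storage.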
Leverage existing hardware infrastructure
Existing data center assets and processes can also be leveraged further, as SAP HANA now offers tailored data center integration. The SAP HANA Tailored Data Center Integration model allows customers, especially those with large IT infrastructure landscapes, to maximize their existing infrastructure hardware. This reduces hardware and operating costs, as they no longer need to purchase additional hardware or change certain processes in their data centers.
In-memory technologies like SAP HANA can help enterprises tackle the growing challenges of data volume growth and landscape complexity. With new SAP HANA features such as multi-tenant database containers and Dynamic Tiering, IT now has a further opportunity to reduce its hardware footprint and data holding costs. Tools such as SAP Landscape Virtualization Management and the SAP Cloud Appliance Library further ease today's configuration and provisioning efforts, while services such as Tailored Data Center Integration ensure that changes to processes or hardware within the data center are kept to a minimum.
To give further insight into the cost economics of a data center, Forrester Research, Inc. has produced a study, "Projected Cost Analysis of SAP HANA, April 2014", in which an organization was projected to save up to 37% across hardware, software and labor costs alone by shifting to SAP HANA. While these figures are merely projections that will vary between enterprises, they underscore the opportunity to simplify data landscapes, volumes and processes by moving onto the SAP HANA in-memory platform.
Additional Resources (SAP employees only)
For more information, feel free to view the available resources:
Our presence at Sapphire 2015
We will be at this year's Sapphire 2015, so feel free to reach out to us at Booth PT425: "Improve Your Cloud and On-Premise Data Center Economics and Security". See you at Sapphire 2015 in Orlando!