Volker Haentjes

Join the Hybrid Data Management Services Beta Program

Driven by IT trends like digitalization, Big Data and in-memory computing, the amount of enterprise data continues to grow. According to the IDC Digital Universe study, worldwide data volume will double in size every two years until 2020, reaching an astonishing 44 zettabytes (ZB), and enterprises will presumably have liability or responsibility for 85% of it. At the same time, 82% of organizations interviewed by SAP Performance Benchmarking believe that the ability to manage and analyze Big Data is critical to meeting strategic objectives. As a consequence, system upgrades are becoming increasingly challenging.


The Challenge of Leveraging Technology Innovation without Disrupting Production Systems

Every system admin knows how important it is to safeguard the stability of their system landscapes during and after updates and to ensure high availability throughout the platform lifecycle. System admins are required to run thorough simulations and tests prior to every upgrade of a productive IT system so as not to jeopardize live IT operations. And there is a multitude of cases where such tests are mandatory, be it hardware, software or configuration changes, or changes in table distribution, disk partitioning or indexing. Sometimes, stability and performance checks are even required during normal operations to find the root cause of system irregularities. To stay compliant with SLAs and IT budgets, system admins also need to predict the performance and cost impact of IT updates.


A common practice is to run load tests in the productive system, using simulation data. However, this entails several challenges: running load tests during productive use typically biases test results and slows down live business processes. Furthermore, using simulation rather than productive data biases test results even more. A common remedy is to copy the system into a dedicated testing environment. However, system copies of huge data sets require significant manual effort and place high additional demands on IT infrastructure, especially for large-scale in-memory environments. There are third-party tools available which offer these capabilities, but they can be quite pricey as well. As a result, customers are asking for an integrated suite of performance management tools, especially for SAP HANA.


Simplifying IT with SAP HANA Capture and Replay

The good news is that with SAP HANA SPS12, released on 11 May 2016, there is a solution in sight: SAP HANA SPS12 comes with the first version of SAP HANA capture and replay, a capability of the new SAP HANA performance management tool kit. With SAP HANA capture and replay, system admins can capture real system workload rather than generating test data. SAP HANA capture and replay is fully integrated into SAP HANA Cockpit and comes with an intuitive SAP Fiori-based user interface, which makes managing and monitoring the capture and replay progress an easy task. Features like selective capturing with filters and runtime comparison between capture and replay make life significantly easier for system admins when they need to run performance tests. SAP HANA capture and replay is a built-in capability of SAP HANA and requires no extra license.
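Conceptually, selective capturing means only the subset of incoming statements that matches the configured filters is recorded. The following is a minimal, purely illustrative sketch of that idea — the class names, fields and filter criteria here are hypothetical, not the actual SAP HANA interface, which is configured through SAP HANA Cockpit:

```python
# Illustrative sketch of selective workload capturing: only statements
# matching the configured filters (e.g. application name, database user)
# are recorded. The data model is hypothetical.

from dataclasses import dataclass, field

@dataclass
class Statement:
    sql: str
    app: str
    user: str
    runtime_ms: float

@dataclass
class CaptureFilter:
    apps: set = field(default_factory=set)   # empty set = no restriction
    users: set = field(default_factory=set)

    def matches(self, stmt: Statement) -> bool:
        if self.apps and stmt.app not in self.apps:
            return False
        if self.users and stmt.user not in self.users:
            return False
        return True

def capture(workload, flt: CaptureFilter):
    """Record only the statements that pass the filter."""
    return [s for s in workload if flt.matches(s)]

workload = [
    Statement("SELECT * FROM sales", "ERP", "ALICE", 12.5),
    Statement("UPDATE stock SET qty = 0", "WMS", "BOB", 3.1),
]
erp_only = capture(workload, CaptureFilter(apps={"ERP"}))
print([s.sql for s in erp_only])  # only the ERP statement is captured
```

Filtering at capture time keeps capture files small and lets admins focus a test run on exactly the workload slice they care about.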



Leveraging Capture and Replay in the Cloud for Greater Agility & Lower TCO

And now the revolutionary part: for those who can’t afford, or don’t want to maintain, a separate on-premise testing environment, there is the SAP HANA capture and replay hybrid cloud service. Rather than making a full system copy to a separate system, you can copy a snapshot of your productive SAP HANA system into a secure cloud managed by SAP.


With SAP HANA capture and replay, you can simulate and analyze a complete system upgrade cycle using your own productive data, which ensures high precision when it comes to performance and cost KPIs. A typical cycle starts with capturing the workload in the productive system for a defined time slot, including all incoming SQL statements. After copying this data into the test system, the required changes, such as software upgrades, hardware changes or configuration modifications, are applied. After pre-processing the capture files, the workload is replayed in the testing environment. Finally, in the analysis phase, capture and replay runtimes can be compared to identify issues and hot spots and to plan optimization and tuning measures. These phases can be iterated until the desired performance KPIs are reached.
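At its core, the analysis phase comes down to comparing per-statement runtimes between the captured and replayed runs and flagging the statements that got slower. A minimal sketch of that comparison — illustrative only, with a hypothetical threshold; the real comparison report is generated inside SAP HANA Cockpit — could look like this:

```python
# Illustrative capture-vs-replay runtime comparison. Statements whose
# replay runtime exceeds the capture runtime by more than a chosen
# factor are flagged as hot spots for tuning.

def find_regressions(capture_ms, replay_ms, threshold=1.2):
    """Return statements whose replay/capture runtime ratio exceeds threshold.

    capture_ms, replay_ms: dicts mapping statement id -> runtime in ms.
    """
    regressions = {}
    for stmt_id, before in capture_ms.items():
        after = replay_ms.get(stmt_id)
        if after is None or before <= 0:
            continue  # statement missing in replay, or degenerate runtime
        ratio = after / before
        if ratio > threshold:
            regressions[stmt_id] = ratio
    return regressions

capture_ms = {"q1": 100.0, "q2": 40.0, "q3": 5.0}
replay_ms  = {"q1": 95.0,  "q2": 90.0, "q3": 5.5}
print(find_regressions(capture_ms, replay_ms))  # q2 slowed down to 2.25x
```

Iterating the cycle then means tuning the flagged statements (or adjusting the planned change) and replaying again until the regression list is empty.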


The benefits for the system admin are manifold:

  • Capturing real system workload ensures more exact workload simulation and analysis.
  • Users benefit from fully integrated replay and analysis capabilities for the SAP HANA database and enjoy the SAP Fiori-based Web UI integration into SAP HANA Cockpit.
  • Performance and cost impact of software and hardware updates and modifications become predictable, and manual effort for testing of changes in custom deployments is significantly reduced.
  • And all this is possible without the use of third-party tools.

To find out more about the Beta program for hybrid data management services in the cloud, including SAP HANA capture and replay, read this FAQ. Finally, to become an early adopter of capture and replay in the cloud, register for the Beta program now.

