
Introduction

In Part I of this series, Why Size matters, and why it really matters for SoH (Part I), I discussed why size matters: performance, cost, meeting SLAs, and so on.

In Part II, Why Size Matters and why it really Matters for HANA (Part II), I discussed SAP data archiving and the data you should never really keep in your OLTP database.

In this part I'll talk about test data and landscape strategies.

A Typical SAP Landscape

Let's consider a typical SAP landscape.

Due to the ease of system copies these days and cheap storage, most customers have never really thought about test data management, unless there were legal reasons why they could not use production data (e.g. HR data).

Of course there are inherent risks with using copies of production, such as data leakage, but these can be mitigated.

With a traditional database copy you can size the compute component to suit the use case: your training system may only need to support 20 concurrent users rather than the 1000s of users on the production system, so you can get away with a very small host for the database (either physical or virtual), regardless of the database size.

With SAP HANA this is no longer the case: a copy needs the same memory footprint as the production instance. That may not be a problem if your HANA instance is only 256 GB on a 2-socket machine, but a fully loaded 4-socket box with 3 TB, or an 8-socket box with 6 TB, is not cheap to buy or run.

In the case above you have six instances (e.g. sandbox, development, QA, training, pre-production and production), and that is with only one application. Now consider the same landscape with ERP, CRM, SRM, BW and SCM: your infrastructure requirements go up considerably.

So now is a good time to weigh the cost of implementing a test data strategy and/or a new landscape strategy against the cost of purchasing and maintaining all that infrastructure.
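To make that comparison concrete, here is a back-of-the-envelope sketch in Python. All the sizes, system names and tier lists are made-up assumptions for illustration, not SAP sizing guidance:

```python
# Back-of-the-envelope landscape memory estimate.
# All numbers are illustrative assumptions, not SAP sizing guidance.

PROD_MEMORY_GB = {          # assumed production HANA sizes per application
    "ERP": 3072, "CRM": 1024, "SRM": 512, "BW": 2048, "SCM": 1024,
}
NON_PROD_TIERS = ["SBX", "DEV", "QA", "TRN", "PREPROD"]  # assumed tiers

# Scenario 1: every non-production tier is a full copy of production.
full_copy = sum(size * (1 + len(NON_PROD_TIERS))
                for size in PROD_MEMORY_GB.values())

# Scenario 2: only pre-production stays full size; the other tiers keep
# a 10% slice of the data (e.g. one month extracted with a TDMS-like tool).
SLICE = 0.10
reduced = sum(
    size                                         # production itself
    + size                                       # full-size pre-production
    + size * SLICE * (len(NON_PROD_TIERS) - 1)   # sliced SBX/DEV/QA/TRN
    for size in PROD_MEMORY_GB.values()
)

print(f"All full copies  : {full_copy / 1024:.1f} TB of RAM")
print(f"With sliced tiers: {reduced / 1024:.1f} TB of RAM")
```

Even with these toy numbers the gap is tens of terabytes of RAM, which is exactly the kind of figure the business case conversation needs.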

Test Data Creation

SAP has for a long time offered TDMS (Test Data Migration Server) to create test data. Not only can it take a snapshot of your production data, it can also transform/desensitize the data.

Using tools such as TDMS you can radically reduce the size of your non-production systems, e.g. by only extracting one month of data rather than taking a full copy.
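TDMS does this with its own transfer engine and configuration; purely to illustrate the time-slice principle, here is a minimal sketch. The record layout and field name are hypothetical:

```python
from datetime import date, timedelta

CUTOFF = date.today() - timedelta(days=30)  # keep roughly one month

def time_slice(records, date_field="posting_date"):
    """Keep only records young enough for the target test system.

    `records` is any iterable of dicts carrying a posting date; the
    field name and layout are hypothetical, purely for illustration.
    """
    return [r for r in records if r[date_field] >= CUTOFF]

# Toy data standing in for a transactional table.
documents = [
    {"doc_no": "4711", "posting_date": date.today() - timedelta(days=5)},
    {"doc_no": "4712", "posting_date": date.today() - timedelta(days=400)},
]
print(time_slice(documents))  # only doc 4711 survives the slice
```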

Here are a few questions you should ask yourself:

  • What data do I actually need, and in which system?
    • Do you really need data older than a year for training purposes?
    • What kind of testing are you doing in each environment? A unit test, for example, does not require more than a few records.
  • How sensitive is the data I am copying from production?
    • Are developers/consultants allowed to see production data?
    • Are your internal staff allowed to see data from other parts of the business?
    • Do you have any sensitive IP that could be exposed?
  • How quickly do I need to create non-production systems?
    • How long does it take to do all of your system copy post-processing? E.g.:
      • Changing authorizations/users
      • Desensitizing data, potentially a lot of it (see the sketch after this list)
      • Do you need fast backup/restore functionality?
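Desensitizing deserves a sketch of its own. A common approach is deterministic pseudonymization: the same input always maps to the same token, so joins across tables and systems still line up after masking. This is a minimal illustration of the idea, not how TDMS (or any SAP tool) implements it:

```python
import hashlib
import hmac

# Assumption: a per-copy secret that never leaves the copy process,
# so tokens cannot be reversed from inside the non-production system.
SECRET = b"rotate-me-per-copy"

def pseudonymize(value: str, prefix: str = "USER") -> str:
    """Deterministically replace a sensitive value with a stable token.

    The same input always yields the same token, which preserves
    referential integrity across masked tables and systems.
    """
    digest = hmac.new(SECRET, value.encode(), hashlib.sha256).hexdigest()
    return f"{prefix}_{digest[:8].upper()}"

print(pseudonymize("jane.doe@example.com"))  # same token on every run
```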

A system copy is often the easiest technical solution for creating test systems, but not always the right one.

The Landscape and change management

When I started with SAP 20+ years ago, most companies had a very simple landscape: Development, QA and Production, with potentially a Disaster Recovery site.

All changes, be they business as usual, patching or new functionality, were developed in DEV, transported to QA for testing and then on to Production.

Yes, it was a spreadsheet nightmare at times, but it worked. Then again, we only had R/3 to deal with.

Now every environment has to include every application, as a business process is no longer confined to one system: it could easily span SAP PI, ERP, CRM and BW, and all of these need to be consistent with each other.

Landscapes have also evolved massively, with parallel development landscapes, multiple tiers of test landscapes, training systems, pre-production systems, and so on.

Every project/developer/business unit requests their own test system, and landscapes become huge and unmanageable.

SAP and other vendors have been creating change management products for years; alas, adoption has not been that great. But with in-memory databases and the agility required by modern businesses, the need for them is greater than ever.

A good change management strategy and the right tools can greatly reduce the number of systems required.

Final thoughts

I realize there has been little depth to this series of blogs, but there are far more detailed blogs/documents/help out there on each of the topics
covered.

In a few years I'm likely to come back to this blog and think: why all the fuss? Memory will be cheap, and operating systems will handle data tiering as part of memory management, with persistent memory as tier one, PCIe flash as tier two, SSD as tier three and spinning rust as tier four, all stretched out over 100s of compute nodes that look like a single OS image.

Inter node communication will be via quantum entanglement thus eliminating latency for good.

Big data systems will be able to access data in any database, be it structured or unstructured, and the idea of data aging/archiving will be gone for good.

But that is a long way away. We are already approaching the end of Moore's Law (the number of transistors in an integrated circuit doubling every two years), with two, and at a push three, more node shrinks possible.

In the meantime SAP needs to keep optimizing SAP HANA. There is still some data that I personally believe does not belong in-memory, and this is where the concept of data aging is key (see the sketch after this list), e.g.:

  • SAP ABAP code, which should be loaded straight into the application servers.
  • Temporary data, such as IDocs, ARFC data, application logs, etc.
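If you want to see where your own memory is going, HANA exposes column-store table sizes through its monitoring views. Here is a sketch using SAP's hdbcli Python driver; the connection details are placeholders, and you should verify the M_CS_TABLES columns against your HANA revision:

```python
from hdbcli import dbapi  # SAP's Python driver for HANA

# Placeholder connection details -- substitute your own system.
conn = dbapi.connect(address="hana-host", port=30015,
                     user="MONITORING_USER", password="***")

# Top 20 column-store tables by total memory footprint: classic
# candidates for archiving or data aging tend to float to the top.
cursor = conn.cursor()
cursor.execute("""
    SELECT schema_name, table_name,
           ROUND(memory_size_in_total / 1024 / 1024 / 1024, 2) AS size_gb
    FROM   m_cs_tables
    ORDER  BY memory_size_in_total DESC
    LIMIT  20
""")
for schema, table, size_gb in cursor.fetchall():
    print(f"{schema}.{table}: {size_gb} GB")
cursor.close()
conn.close()
```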

I'd love to see a lot more effort/training/partnerships put into the idea of scale-out for Suite on HANA, especially with the emergence of hyper-converged infrastructure, but costs for back-end networks need to come down, and speeds need to go up, dramatically (both TCP/IP and InfiniBand).

In the meantime we will have to dig out all those arguments/documents we used in the late 90s/2000s regarding data management, and focus on our change management strategies and landscape designs.

Unfortunately, in all of the above cases technology is not the problem; it is political: persuading the business of the value of their data, persuading your test teams that they don't need all of the production data, and persuading your development team/implementation partner that a free-for-all is no longer possible and that they actually need to talk to each other and coordinate their developments.
