When does HANA become mainstream?
We’ve now all heard the numbers and seen the customer references, so why isn’t everyone and their mother buying HANA? Over the past several months I’ve had discussions with clients, partners and SAP to better understand this. There have been numerous blogs, articles, tweets, etc. about the value of in-memory computing and how HANA is a game-changer. If there is all of this sizzle, where is the steak? More importantly, as a service provider, the inevitable question is “When will people be knocking down my door to implement it?”. So, here we go…
Is it proven?
If you’ve ever worked in software sales, one of the most powerful statements you can make is about referenceability. SAP software is used by thousands of companies and millions of end-users every day. The software is proven time and time again. For all of the horror stories you hear about IT, it is undeniable that the largest companies in the world rely on SAP software day in and day out to help run their businesses. Easy sell, right? However, HANA itself is brand new. While it utilizes technologies that are proven (i.e. TREX, BWA, etc.), it is still very much a new product. Sales people love their sales pipelines, but pipelines can only be built between customers and products where the ground has already been dug up. So trying to predict the size of the market for HANA is hogwash at this point. This is the scary part of drastically innovative technologies for both vendor and customer: it’s hard to align strategies.
Drastic Business Change
Software tools are only as good as the businesses who utilize them. HANA is an unbelievable enabler for pure business change. While the evangelists of the world (including myself) are shouting “Hooray!”, CIOs are silently grumbling to themselves. Why? Because drastic business change is very hard to implement, it’s costly, and the risks are high. What if I implement HANA and it doesn’t change my business? What other companies has it changed? …and how? These are all important questions running through CIOs’ heads right now. The early evangelist CIOs will be the ones to take on the risk (and reap the subsequent rewards). Business transformation is disruptive; that’s undeniable.
Data Lifecycle Management
I wish I had started a company whose sole purpose was to analyze a customer’s data lifecycle. Why? Because almost every company I’ve ever seen gets this wrong, and the benefits of getting it right have a huge effect on the bottom line. Now extrapolate this to HANA. For HANA, SAP is charging you licensing fees per terabyte, plus the cost of hardware, so it’s absolutely critical that you nail this down, because it directly affects the bottom line of the client purchase. This leads to a major technological question: how do you archive off old data in HANA? As far as I know this is not possible. This is already an “issue” for BWA that is largely solved by removing old data structures (e.g. deleting the indexes for a year-2005 COPA InfoCube). I’m not sure how this will be possible in HANA without date-dependent partitioned data structures.
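To make the date-dependent partitioning idea concrete, here is a toy Python sketch. Everything in it is hypothetical (`PartitionedStore` is not a real HANA API); it just illustrates how dropping whole year-based partitions could reclaim in-memory space, the same way deleting a year’s BWA indexes does today:

```python
from collections import defaultdict

class PartitionedStore:
    """Toy in-memory store partitioned by year. Purely illustrative of
    archive-by-dropping-partitions; not modeled on any real HANA feature."""

    def __init__(self):
        # year -> list of records held in memory
        self.partitions = defaultdict(list)

    def insert(self, year, record):
        self.partitions[year].append(record)

    def drop_before(self, cutoff_year):
        """Reclaim memory by deleting every partition older than the cutoff,
        rather than scanning and deleting individual rows."""
        for year in [y for y in self.partitions if y < cutoff_year]:
            del self.partitions[year]

store = PartitionedStore()
store.insert(2005, {"doc": 1})
store.insert(2010, {"doc": 2})
store.drop_before(2008)       # the whole 2005 partition disappears
print(sorted(store.partitions))  # → [2010]
```

The point of the sketch: if data structures are cut along date boundaries, "archiving" can be a cheap metadata operation (drop a partition) instead of a row-by-row purge, which is why the lack of date-dependent partitioning matters so much for licensing costs per terabyte.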
Costs, Scalability and ROI
I’ve just seen the prices and my first reaction is WOW. I have no idea how “public” they are yet, so I’ll spare you the specifics and let you take that up with your SAP AE. I have no doubt that the big boys (aka companies with deep pockets) will be able to afford HANA, but for anyone making less than a couple of billion a year, this will be a hard pill to swallow. In general your costs are going to look like this: (1) software licenses are sold in 64GB units, meaning for every 64GB of data you store in memory you pay SAP a licensing fee; the more units you buy, the cheaper the per-unit price gets. (2) Hardware costs will vary from vendor to vendor. For example, IBM-UK charges roughly ~£80,000 for a 128GB server. Now tack on shipment costs (which could be negligible or discounted away), installation costs, and hardware support, and you can see how the total cost begins to accumulate. On top of that you need an SI to implement it, and then someone to support it. The other main issue this presents is scalability. There is still no clear solution for infinitely scaling a HANA box. In my conversations with IBM, they are working on a hardware (non-network) solution to daisy-chain multiple HANA boxes together. But when it comes to handling future growth across all BI applications, this will be a major sticking point.
So, put this all together and you have a very expensive solution whose cost depends heavily on your initial sizing and your future scalability estimates. At the end of the day, what does this mean for ROI (Return on Investment)? For example, the CO-PA accelerator is said to speed up month-end COPA activities. If a company can close the books quicker, what does this mean in terms of money saved for the business?
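To illustrate the sizing arithmetic, here is a back-of-the-envelope cost sketch in Python. The £80,000-per-128GB hardware figure comes from the IBM-UK example above; the per-unit license price is a made-up placeholder (the real SAP prices aren’t public), and the model deliberately ignores the volume discount, shipment, installation, support and SI costs mentioned above:

```python
import math

def hana_cost_estimate(data_gb, price_per_64gb_unit, hw_cost_per_128gb):
    """Rough HANA cost model. The license price is a hypothetical
    placeholder, not an actual SAP list price; discounts are ignored."""
    # SAP licenses are sold per 64GB of in-memory data
    license_units = math.ceil(data_gb / 64)
    license_cost = license_units * price_per_64gb_unit
    # Hardware sized in 128GB servers (~£80,000 each per the IBM-UK example)
    hw_boxes = math.ceil(data_gb / 128)
    hardware_cost = hw_boxes * hw_cost_per_128gb
    return license_cost + hardware_cost

# 256GB of data, with an invented £50,000-per-unit license price
total = hana_cost_estimate(256, price_per_64gb_unit=50_000, hw_cost_per_128gb=80_000)
print(total)  # → 360000 (4 license units + 2 servers)
```

Even with invented license numbers, the shape of the math is the takeaway: cost scales in coarse 64GB/128GB steps, so the data-lifecycle question above (how much history actually needs to live in memory) directly drives the purchase price.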
What happens if…?
Probably the most important sticking point, which I’ve saved for last, is back-up, disaster recovery and high availability. What if I have a power outage? What if an SSD (Solid State Drive) fails? What if the facility that houses my box burns down? These are all valid enterprise concerns. Enterprises have to adhere to strict governance, which in any well-run organization covers these three areas. As of right now, there is no concrete solution for back-up, disaster recovery and high availability in HANA; it is largely left up to the hardware partners. For many organizations that immediately makes going productive with HANA a no-go decision. I saw this in BWA with the early adopters. In those cases you were able to fail over to the database (with terrible performance), so an outage still did not mean “production down”. The hardware vendors (and SAP) are actively trying to find solutions, but for now it is the elephant in the sales-engagement room.