
SAP HANA — the X Factor

Flying back to Palo Alto the long way gave me time to realize that I had spoken about SAP HANA in a video, in my SAP TechEd keynote, and then with the press, analysts, customers, etc, but I had neglected to talk to you, the SDN community!

One of the things I like most about SDN is that I can talk to a technology-savvy group without holding back. So, no flowery phrases; here are the SAP HANA numbers:
•    460B Rows Aggregated 20x Faster
•    200x Price/Performance
•    Scan 2B, Aggregate 10M Records per Second per Core
•    Dunning – 1200x Faster
•    Settlement – 50x Faster
•    Ageing – No Aggregates
•    Linear Scaling

These numbers were so phenomenal that SAP issued a separate press release with further technical details about SAP HANA, so please take a look.

All of the numbers prove one thing: SAP HANA provides a foundation on which a new generation of applications can be built, enabling customers to analyze large quantities of data from virtually any source in real time. (That is a line from the press release, but it is not a flowery phrase; it is a fact.)

In-memory computing reduces complexity on many fronts – layering, orchestration, reading, writing, parsing, intermediary formatting, and so on – so we can unwind the complex layers that were put in place to take application logic processing out of the database. The simplicity of this design, and the lack of layers, makes it possible to develop applications extremely quickly while writing only the minimal amount of code necessary.
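A toy sketch of the idea (plain Python and SQLite, not HANA code): when the detail records already live in memory, an aggregate is just a query computed on demand, rather than a separate materialized table that every write must keep in sync.

```python
# Sketch only: an in-memory SQLite database stands in for an in-memory
# column store. The point is architectural -- no pre-built aggregate
# tables, summaries are derived from the line items at query time.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE line_items (customer TEXT, amount REAL)")
db.executemany(
    "INSERT INTO line_items VALUES (?, ?)",
    [("ACME", 120.0), ("ACME", 80.0), ("Globex", 50.0)],
)

# The "ageing" style view: aggregate straight off the detail records.
totals = dict(
    db.execute("SELECT customer, SUM(amount) FROM line_items GROUP BY customer")
)
print(totals)  # {'ACME': 200.0, 'Globex': 50.0}
```

Because nothing is materialized, there is no aggregate-maintenance layer to invalidate or reload; a new insert is visible to the next query immediately.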

With SAP BusinessObjects Strategic Workforce Planning, which was built in only 70 days, we are delivering the first application built on SAP HANA. This is a portent of great things to come, and I cannot wait to see what the SDN members do with SAP HANA!

The best place to get started learning about SAP HANA is right here, on SDN.

      John Appleby
      Hi Vishal,

      Good to see you yesterday and thanks for blogging on SCN. It's important and we welcome you :=)

My question on HANA would be whether the devil in the NewDB roadmap is in the detail.

      There are several reasons why we build Data Warehouses today - but a few of them are:

      1) Harmonised Master Data - single view of the truth
      2) Business Transformations - the ability to abstract database tables into Business Objects
      3) Consolidation and Aggregation - the ability to consolidate data from various data sources, and aggregate it into a high-level view
      4) Timing - the ability to extract data so we have an overnight view of the data

My concern with the end-point of the HANA journey is that I don't understand how you will do 1), 2), 3), and 4) in real time. Do we build some sort of semantic layer that allows HANA to calculate this?



      Witalij Rudnicki
Hi John. The question is to Vishal, but I hope you do not mind me sharing here as well. Not that I know all the answers, but supposedly ICE should help build better data warehouses - no matter whether BW- or BO-driven: (a) both column and row stores to boost the performance, (b) temporal tables for time-series analysis, (c) built-in OLAP functions and MDX support, (d) the new ICE SQLScript language to build non-materialized calc views with caching, etc. But I am quite sceptical too when it comes to your points 1) and 3) - like where in real-time data scenarios the data cleansing is supposed to happen, or how to harmonize data from different sources.
      My one more just-off-the-press post on HANA:
      Take care. Time to sleep. -Vitaliy
      John Appleby
      Of course I don't mind, SCN is a public forum :=)

      Really looking forward to SAP being able to share with us some of the real meat in the HANA data warehousing strategy!

      Harshit Kumar
      Hi Vishal,

      Welcome to the SCN!! I was one of the participants at TechEd Bangalore where you introduced HANA. I must say that today HANA is THE most talked about product in SAP community.


      Martin Lauer
      Hi Vishal,

      will it be possible to run a standard benchmark, e.g. TPC-H, on HANA?

I'm wondering about the Aging/Aggregates approach. Does it make sense in a multi-user environment to perform the very same aggregate calculations again and again?
      I understand that one can scale HANA to perform my query at unprecedented speed but it also keeps the CPUs hot.
I think it's more interesting to know how many users can run their own (or some standard) analyses/queries in parallel with (almost) unnoticeable processing times (e.g. less than 1 second) with a fixed total number of CPUs.

There must also be some caching to take advantage of the multi-user setup? Reuse of a different user's aggregates (until the cache is invalidated) could speed up my query, even if I perform different follow-up calculations with these aggregates. Just because one can calculate aggregates on the fly doesn't mean that this is an efficient approach, as every calculation eats CPU time.

Mixed OLTP and OLAP processing is a big deal, but I don't know if there are applications that share OLTP and OLAP loads equally; otherwise you probably miss potential optimizations for either the OLTP or the OLAP side. Maybe it's possible to build an intelligent optimizer 🙂

I'm looking forward to more HANA news.


      Former Member
      Hi Vishal,
I think Martin is right to ask for an official benchmark with transparent rules. No disrespect for the numbers you give, but, for example, what is the cost in the "price/performance" ratio that you provide? Is it TPC-H-like? Why are there only HANA numbers on analytical queries, while H. Plattner claims equal performance for OLTP and OLAP? What about writes, updates, deletes? How does HANA behave during concurrent querying and transactions, in multi-user scenarios? What is the cost involved to keep a HANA environment up and running? A TPC-? benchmark would help to understand HANA better.
      Sandeep Kumar
      Hello Vishal,
      I "loved" the demo shown at Teched'10 Bangalore:)
      The way data fetching is speed up is unbelievable .
      I hope we can change the way we do complex computing in future using "HANA".


      Former Member
      Hi Vishal,

Have you tried to evaluate the performance benefits of loading the entire repository part of the SAP database, coupled with some application tables, into HANA/in-memory? I mean, if the ABAP runtime and application code do not have to hit the DB, I feel the performance gain would be huge!

Also, HANA could be used for batch programs that perform a lot of disk I/O, so that the data can be read from memory instead!

      All the best with HANA. I'd love to see it go more mainstream and deliver benefits to existing SAP ERP/CRM applications.