irfan_khan

This is Part 2 of my (re)introduction to SAP HANA. Part 1 reviewed how HANA came to be and laid out the principal differences between it and other offerings. This time, we home in on the one fundamental difference that makes all the difference.


In the wake of the successful roll-out of SAP HANA and its adoption by more than 3,300 customers (to date), there is a growing consensus that the real-time technology that SAP has pioneered represents nothing less than the future of enterprise computing. Gartner has declared in-memory computing a top strategic initiative and has predicted that 35% of large and mid-sized companies will have implemented in-memory solutions by the end of 2015. A growing body of analysis shows that full in-memory operations would enable the typical data center to store 40 times as much data in the same space while cutting energy consumption by 80%, all while boosting performance many thousands of times over.

The question is no longer whether your business will adopt in-memory technology, but when. It is the inevitability of this paradigm shift that has driven Oracle, IBM, Microsoft, and Teradata to announce their own "in-memory" and "columnar" technologies. I ended last time by suggesting that it is perfectly legitimate to ask whether any of these offerings can compete with SAP HANA.

The simple answer is no.

To understand why they can't, let's consider an analogy from another industry. There is a growing belief - perhaps not yet a consensus - that in the future, cars will run exclusively on electricity. The conventional auto makers seem to have embraced that possibility by introducing hybrid vehicles. They will tell you that such vehicles represent an evolutionary step toward the future of automobiles. Maybe. But is it the right step?

Building hybrids is all about adding. You take a conventional car and add an entire extra engine so that it can run on electricity (sometimes). Next you add an enormous, expensive battery that accounts for a large percentage of the vehicle's total interior space, weight, and cost. Then you add an array of complex infrastructure to make the two engines and the battery work together. Finally, you add systems that monitor driving efficiency, shifting much of the burden of saving fuel and reducing environmental impact from the car to the driver.

And what does all that addition add up to? Hybrids often cost significantly more than conventional automobiles. They can also be more expensive to maintain and repair, and they struggle to provide comparable performance and handling. While they do offer better fuel economy than gas-powered cars, the gap is not always as great as you would expect.

Compare the hybrid approach with the approach Tesla Motors took in developing first the Roadster and then the Model S. When Tesla set out to build an electric car, they didn't take a conventional car and start bolting additional stuff onto it. They started from scratch. As a result, they produced a fully electric, zero-emissions vehicle that can blow the doors off conventional high-performance cars (much less hybrids). A Tesla doesn't even have a transmission; it doesn't need one.

Tesla didn't add. They subtracted.

When the conventional database vendors set out to incorporate in-memory computing, they took a page from the conventional auto makers' playbook. They came up with complicated schemes for moving data in and out of memory. They bolted column-store technology on top of their traditional row-based databases. They added indexing, optimizations, and additional copies of the data throughout the organization. And all of this still sits atop the old disk-based architecture, which hasn't gone anywhere. (Remember the internal combustion engine in a hybrid car?) It's just gotten bigger, and even more demanding of resources and maintenance.
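
To make the row-versus-column distinction concrete, here is a minimal, illustrative sketch in Python. It is not any vendor's internals, just the general idea: the same table laid out both ways, and what a single-column aggregate actually has to read in each case.

```python
# Illustrative only: one small table in a row layout and a column
# layout, and what summing a single column touches in each.

# Row store: each record is stored (and scanned) as a whole.
rows = [
    {"id": 1, "region": "EMEA", "revenue": 120.0},
    {"id": 2, "region": "APJ",  "revenue":  75.5},
    {"id": 3, "region": "EMEA", "revenue": 210.0},
]
# Summing one column still drags every full row through memory.
total_row_store = sum(r["revenue"] for r in rows)

# Column store: each column is a contiguous array of its own.
columns = {
    "id":      [1, 2, 3],
    "region":  ["EMEA", "APJ", "EMEA"],
    "revenue": [120.0, 75.5, 210.0],
}
# The same aggregate scans only the one column it needs, which is
# also why columnar data compresses and scans so well.
total_column_store = sum(columns["revenue"])

print(total_row_store, total_column_store)  # 405.5 405.5
```

Maintaining both layouts at once, as the bolt-on approach described above requires, means keeping two representations of the same data in sync, plus all the machinery to shuttle between them.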

The result? These vendors report that their hybrid solutions perform 10 times faster than their traditional disk-based systems. That sounds pretty good, until you realize that real-time enterprise computing with SAP HANA delivers performance thousands to tens of thousands of times faster than conventional solutions. With that in mind, you can't help but wonder whether the standard database providers are going to an awful lot of trouble for a pretty meager result.

When SAP built HANA, we started from scratch. Like Tesla, we didn't add. We subtracted:

  • SAP HANA removes all barriers between applications, transaction processing, and analytics, uniting all three in a single platform.
  • HANA removes unneeded copies of the data from the enterprise environment, achieving everything with just one copy.
  • HANA eliminates the need to move data between the database tier and the application tier; all processing occurs where the data is (see the sketch after this list).
  • HANA removes all delays between transactions and analytics: no more batch jobs, no more ETL, etc.
  • HANA eliminates the need to deploy separate environments for specialized analytics processing, bringing geospatial, text processing, statistical analysis, and everything else together in the same unified platform.
  • HANA removes layers of complexity between the people who need to use the data (business users) and the people who need to manage the data (data architects).
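
To make the "all processing occurs where the data is" point concrete, here is a minimal sketch of the principle. It uses Python's built-in sqlite3 module purely as a stand-in database; HANA itself exposes this idea through SQL, SQLScript, and its embedded engines, not through sqlite.

```python
import sqlite3

# A stand-in database: the point is where the work happens, not the engine.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, revenue REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("EMEA", 120.0), ("APJ", 75.5), ("EMEA", 210.0)],
)

# Application-tier processing: haul every row up to the application
# layer and aggregate there. The data moves; the code stays put.
totals = {}
for region, revenue in conn.execute("SELECT region, revenue FROM sales"):
    totals[region] = totals.get(region, 0.0) + revenue

# In-database processing: ship the computation to the data and move
# only the small result set back. The code moves; the data stays put.
totals = dict(conn.execute(
    "SELECT region, SUM(revenue) FROM sales GROUP BY region"
))
print(totals)  # per-region totals: EMEA 330.0, APJ 75.5 (group order unspecified)
```

On three rows the difference is invisible; on billions of rows the first pattern is dominated by data movement between tiers, which is exactly the overhead the list above says HANA removes.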

The result? Author William Gibson is quoted as saying, "The future is already here - it's just not evenly distributed." If you want to see the future of real-time computing in action, look to the organizations that have adopted SAP HANA. (I will be sharing stories from some of these companies in a subsequent post.) These forward-looking organizations are realizing the benefits of real-time computing right now. They are discovering a level of simplicity and performance, plus new efficiencies and capabilities, that were never before possible.

Sadly, those benefits will remain out of reach for any organization stuck with a technology that is best described as an evolutionary dead-end. The future is waiting. And it belongs to those who embrace the power of subtraction.
