My last article on SAP Lumira, Edge edition generated quite a lot of buzz, both within our user community and internally at SAP. I think that’s great: my previous manager (who still manages the Lumira Product Management team) used to advise me that if I wasn’t generating enough controversy, I wasn’t pushing hard enough. Let me assure you, we are pushing extremely hard 🙂 .
Controversy also breeds speculation and speculation is the seed of misinformation: Will SAP Lumira, Edge edition replace HANA? What about my BI 4 deployment? Are we shifting away from HANA? What will the new architecture look like and will it co-exist with HANA? You can imagine from these questions, some of the controversy has even been from inside SAP!
In this space I’m going to shine a big spotlight on some of these fuzzy areas. Before I do, though, I want to tell you a story, because I’m a firm believer that understanding strategy and motivation goes a long way toward understanding future intention. Some of the feedback I’ve received recently runs along the lines of “I see you are changing course for Lumira on SAP HANA with the introduction of SAP Lumira, Edge edition…”. The short answer is, “No, that’s absolutely not true”. For the longer answer, read on!
Due to this topic’s length (and apologies in advance), I’ve divided this post into multiple parts. This first one will focus on the story and strategy, and starting with the next one we’ll get back to some of the more technical details for Lumira Server, SAP HANA, and Edge edition.
Please note that while I am an employee of SAP, the opinions here are my own and the official SAP Lumira Roadmap is the authority on the future. All disclaimers about “the future can, and will change” apply here too.
“You can paint it any color, as long as it is black.”
Henry Ford is famous for this quote about the (lack of) customization options for the Ford Model-T. It is a misconception that he said it in 1908 when the Model-T was first introduced. In fact, the Model-T wasn’t available in black at all at first; it came in green, grey, blue, and red from day one. By 1912 all cars were painted midnight blue, and only in 1914 was the “any color so long as it is black” policy implemented, mainly for cost and durability reasons.
The iconic vehicle opened up travel to the common middle-class American and forced would-be competitors like Chevrolet to up their game against a superior product that could be delivered more cheaply than anything else available at the time. By the time the Model-T was retired 19 years later in 1927, it had taken many forms, including trucks, tractors, and snow vehicles.
“You can choose any platform, as long as it is…”?
In 2014, many customers viewed SAP Lumira Server on SAP HANA as the Model-T for data discovery because they thought SAP’s stance was, “you can choose any platform, as long as it is SAP HANA”. However, just like the Model-T was available in many different colors and styles, SAP Lumira doesn’t come in just one configuration. It is truly a misconception that Lumira was created to force SAP HANA as a platform.
However, there are many compelling reasons to choose SAP HANA as a platform today. As you hopefully read in my previous articles How SAP Lumira Server Runs on SAP HANA and How Lumira Storyboards Work on SAP Lumira Server, SAP Lumira on HANA works in a completely different way from pretty much any other BI product out there, and it can do things that no other BI product, made by SAP or anybody else, can do. When we started building our “new BI” technologies, our options were to try to make the “traditional way” faster (which is what everyone in the industry is still trying to do today) or to take advantage of the massive SAP technology arsenal at our disposal and change the entire game. We obviously chose the path less traveled, but much of the magic required to deliver on it in 2014 was only available as part of SAP HANA.
Just like the Ford Model-T opened up travel for the average consumer and made it affordable, SAP HANA has changed how databases are architected, and has even opened up new applications that weren’t possible before. Every major database player, including Microsoft, IBM, and Oracle, has introduced “new” in-memory technologies for its databases since the introduction of SAP HANA in 2010 (for the record, HANA’s contributing in-memory technologies like TREX and P*TIME have existed for far longer). SAP HANA arguably started the commoditization of in-memory technologies, which has ultimately made them available to customers at a more affordable price – regardless of vendor. For Lumira Server, leveraging these in-memory technologies natively means opening up a new horizon of data discovery possibilities previously out of reach of the average business user.
Hmm… Making a previously unattainable technology like in-memory computing affordable to a larger audience while simultaneously provoking competitors to step up and innovate more to compete? Pretty similar to what Mr. Ford accomplished over a hundred years ago for automobile customers, don’t you think?
When “One Size Fits All” Doesn’t Work Anymore
The Model-T had 19 years of continuous improvement from 1908 to 1927, but sales eventually started to drop. Competitors had many years to catch up – and catch up they did. General Motors’ Chevrolet automobile division gained market share by offering more powerful engines, additional features, more customization options, and even payment plans to appeal to a growing customer base. Customers became accustomed to many “nice-to-have” features coming as standard equipment, and the aging Model-T no longer met the market’s needs.
In 1927 Henry Ford announced the “Model-A” as a competitive response to a growing array of opponents. Ford believed the Model-A was so advanced that it “reinvented the automobile”, and he therefore restarted the model naming at “A”. The technical innovations of the Model-A made it another blockbuster success, but they did not stop the company from sliding from market leader with a 50% share to playing leapfrog with the likes of GM/Chevrolet and Plymouth/Dodge for the next 80 years.
This is where the crucial difference in my analogy lies: SAP Lumira Server has never had “any platform as long as it is SAP HANA” as a philosophy; in automobile speak, it was simply the first “model coming off the line”. We don’t expect to take 19 years before we introduce the next innovation as Ford did, and we certainly don’t plan on playing leapfrog with would-be competitors in the future due to such complacency. In the highly competitive analytics space, we simply cannot afford to rest on our laurels, because even 19 months is an eternity. We ship new releases of SAP Lumira more often than every 19 weeks – in fact, we just had our 20th release in less than three years.
Something for Everybody, Not Everything for Somebody
SAP HANA is great – there, I said it. However, it is the right fit only for some: HANA shines when you have large amounts of data and need to do some wicked-cool things with it on the fly. Many of our customers are not ready to consider SAP HANA because it would be overkill for their fairly small datasets or their limited number of users. But if we can offer “reasonable” performance and features for these smaller customers by using other SAP technologies in our arsenal, we can make Lumira accessible to a much larger audience. Keep in mind SAP is not a “one trick pony”, and our in-memory assets are not limited to just SAP HANA. Just consider: if we have five commercially available databases tailored to specific needs (MaxDB, SQL Anywhere, IQ, ASE, and HANA), how many “in-memory” technologies do you think we have? 😳
We cannot afford the mistake Ford made by resting on our laurels because we are ahead – and we ARE ahead: The way SAP does “in-memory” is not easily replicable because our technology isn’t really about keeping things in RAM (which anyone can do easily), it’s about creating new computational capabilities because the data is in RAM. We consider leaving a wide open space and giving competitors ample time to catch up to be a Really Stupid Thing To Do. However, just like a driver of a car doesn’t care how an engine works, business users don’t care about the technologies powering their analysis – so the real challenge is delivering a compelling experience to the user in the easiest, fastest, and most affordable way possible.
The risk lies in assuming we can accomplish that simply by making incremental improvements to how things have always been done instead of truly innovating. To some extent, you could argue that all the “traditional BI” vendors, such as Business Objects, Cognos, and Hyperion, fell for that – and where are they now? They (we) are part of SAP, IBM, and Oracle respectively, and each one is helping its parent company compete against the smaller, more agile, but less holistic BI companies that are coming at them hard with niche “Self Service BI” and “Data Discovery” products – products that effectively created a new category for business intelligence in the wake of such complacency.
SAP Lumira’s Strategy With SAP HANA
The SAP Lumira team is very committed to SAP HANA. Introducing SAP Lumira, Edge edition as a non-HANA option for cases where the datasets, the number of users, or both are limited is not a departure from our SAP HANA strategy. SAP Lumira Server for SAP HANA is still our native-on-HANA solution that leverages the full power and scalability of HANA. It still enables SAP Lumira to keep going long after competing solutions hit the wall, and it is still the only solution on the market that can truly work on “live” data without caching or other artificial mechanisms to gain speed.
“SAP Lumira, Edge edition” is simply the next phase of our Lumira strategy, one that aims to meet the more modest needs of customers for whom our more scalable solutions aren’t required yet. Rather than enabling competitors to take advantage of the fact that our greatest asset (SAP HANA) is also our greatest requirement (you need to have SAP HANA), we are innovating to ensure that all of our customers can benefit from Lumira functionality today and have the option to scale up to SAP HANA when their needs grow.
It is also important to understand that the two offerings do not overlap: once we do a better job of describing the differences between the “HANA” and “non-HANA” scenarios, there should be little doubt in your mind as to which to choose. I see very few cases where a customer would face a real dilemma over which way to go. The problem is that there is scant information right now, and that’s something I plan on fixing over the next few blog posts.
That said, the value proposition for SAP Lumira Server on SAP HANA continues to grow: hardware costs are dropping, more licensing options are becoming available, core density is increasing, and RAM’s cost per gigabyte is nose-diving. Today, Lumira Server has a high cost floor because HANA does not “scale down” to smaller deployments in a cost-effective manner. With the recent announcement that Intel E5 CPUs will be supported, it will soon be pretty easy to get a HANA-certified configuration for around $10,000 US. The price of memory is also dropping rapidly: want a terabyte of RAM in that $10K machine? Just add another $10K. What this means is that a few years from now, SAP HANA will have effectively destroyed the premium that in-memory databases command today, and those still pushing traditionally architected databases (even when run “in-memory”) will look pretty expensive compared to the value HANA will bring. It’s only a matter of time.
Hmm… Maybe it isn’t SAP that has the “Model T” but some other companies with 19 year-old technologies they are trying to make “faster” instead of “better”? 😉
Stay tuned for my next blog in which I will cover Edge edition in more detail, and specifically some of the design decisions and trade-offs that we are contemplating to retain a “HANA-like experience” without actually running on a HANA appliance.