Hasso used his now-familiar blackboard style to educate the audience on the revolutionary promise of in-memory analytics. As an interesting contrast, he pointed out that the much-hyped advances in cloud computing and virtualization are valuable, but fundamentally just deliver existing solutions in a new way.

Hasso laid out a simple vision, even if the math makes it a bit complicated. He believes that an individual computing blade will soon have 8 CPUs with 16 cores each. Each of these blades will be able to hold up to 500 GB of memory. With 10 blades, you would have 5 TB of memory and 1,280 computing elements.

Using column-store compression such as in BWA, raw transactions can be compressed by a factor of 10. This means that this 10-blade system can hold the equivalent of 50 TB of data. To put this in context, this is approximately the size of the printed collection of the U.S. Library of Congress. Or, if I followed Hasso’s math, large enough to run the combined financials of 70 companies.
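Spelling out that math (the blade counts and the 10x compression factor come from the talk; the little script below is just my own illustration):

```python
# Back-of-the-envelope version of Hasso's capacity math.
cpus_per_blade = 8
cores_per_cpu = 16
ram_per_blade_gb = 500
blades = 10
compression_factor = 10  # column-store compression, roughly 10x

cores = cpus_per_blade * cores_per_cpu * blades   # 1,280 computing elements
ram_tb = ram_per_blade_gb * blades / 1000         # 5 TB of memory
effective_tb = ram_tb * compression_factor        # equivalent of ~50 TB of raw data

print(f"{cores} cores, {ram_tb:.0f} TB of RAM, ~{effective_tb:.0f} TB of uncompressed data")
```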

The column-store compression makes querying the data fast as well. Hasso re-demonstrated SAP BusinessObjects Explorer, reinforcing that it can access 1 billion records in less than a second. In a twist, he showed the analogous demo from within Microsoft Excel, pointing out the limitations of the current, non-accelerated version of Excel.
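Hasso didn’t walk through the engine internals, but a minimal sketch of dictionary encoding, the standard column-store technique (the general idea, not necessarily SAP’s exact implementation), shows why the same trick yields both the compression and the scan speed:

```python
# Minimal sketch of dictionary encoding, the column-store technique behind
# both the ~10x compression and the fast scans. Illustrative only.
from array import array

region = ["EMEA", "APJ", "EMEA", "AMER", "EMEA", "APJ"]  # raw column values

# Replace each distinct value with a small integer code: a column of
# repeated strings becomes a compact array of integers.
dictionary = sorted(set(region))
code_of = {value: code for code, value in enumerate(dictionary)}
encoded = array("b", (code_of[v] for v in region))

# A filter such as region = 'EMEA' becomes a tight integer scan, which is
# cache-friendly and easy to parallelize across the cores counted above.
target = code_of["EMEA"]
matches = [i for i, code in enumerate(encoded) if code == target]
print(matches)  # row positions [0, 2, 4]
```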

Will everything change? Only time will tell.


4 Comments


  1. Vijay Vijayasankar
    At a minimum, I expect our approach to data modelling to change. At a simplistic level, I would expect a lot more BI data to move over into an account model from a key figure model. At a macro level, as I mentioned in my own blog, I expect OLTP, OLAP, and accelerator appliances all to merge someday soon.

    But this keeps me thinking: how much is too much? The human brain can only assimilate so much information at a time. And with the amount of data that we generate now, I wonder if a lot of analytics power will simply be wasted because of our limited capacity to use it.

    1. Jonathan Becher Post author
      To your question on how much information is too much, I think we need to distinguish between exploring data, consuming data, and presenting data. Perhaps machines can explore unlimited amounts of data, experts consume modest amounts of data, and we present summaries.
      1. Muthu Ranganathan
        Often there is confusion between data, information out of data, insight out of information, and wisdom out of insight. If in-memory analytics helps transform data into wisdom, businesses will be excited.
  2. Bernd Boecker
    You need massive redundancy with these in-memory databases. It takes a long time to start up a 100 GB in-memory database: it has to be read from somewhere, loaded, and indexed. It is almost like building the database from scratch every time you start up, so the start times are long.

    I would not say it is “X versus Y.” It is more like “X can be used to solve a problem you might have with Y in some very special cases, at a pretty large cost; it does not replace Y, it works with it.”

