
SAP TechEd, the aftermath

Sitting here with a big bowl of soup and a sore throat, I ponder the past week at SAP TechEd. It’s been, as always, an interesting experience which leaves me with a lot of food for thought. There was a definite focus on three main topics:

  • In-Memory Computing (HANA)
  • Mobility
  • Cloud

This is absolutely not unexpected, because these three topics have been dominating the entire SDN community for the past year. Let’s take a look at each of them and highlight the key takeaways from the event.


In-Memory Computing (HANA)

HANA (in-memory computing) has actually been a hot topic for two years already. Back in 2009, the keynote at TechEd Vienna was all about the speed bottleneck which the hard disk proved to be. Now, two years later, I’m happy to report that this bottleneck has finally been tackled, and we can now read data directly from memory. This means a huge performance improvement on data-heavy operations: up to 100,000 times faster!

This sounds incredible, but let’s not forget what lies at the base of this performance increase. In order to boost performance, you need to start from a situation where performance is lacking. In all cases, the reason for poor performance was the sheer amount of data to be processed and the ineffective way in which the programs tried to process it.

I’ve seen examples in my consulting days of programs that ran for multiple days to process a rather large dataset. After tweaking these reports with proper design principles, however, we managed to reduce the runtime to less than two hours. This also sounds magical, and it was achieved without using HANA, simply by applying proper design principles.

I’m not saying that HANA is not a huge leap, because it is. The thing that bothers me, however, is the fact that HANA promises to bring a performance increase out-of-the-box which is far greater than can be achieved by refactoring. That’s a statement I don’t like. It means that systems run the danger of getting cluttered with poor code, and that future developers, no longer faced with performance issues, will implement rubbish in systems.

HANA might eventually lead to lazy developers. This is me being paranoid here, because it is truly an amazing technology which I can’t wait to put into practice, using proper design principles and still using my good old binary read.
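The binary read I keep referring to is the classic ABAP pattern of buffering data once in a sorted internal table and reading it with BINARY SEARCH inside the loop. A minimal sketch with hypothetical table and field names (lt_items and its material number field are assumed to exist in the surrounding report):

```abap
* Hypothetical example of the "good old binary read": buffer the data
* once, sort it, then look it up in O(log n) per loop pass instead of
* scanning the whole table each time.
TYPES: BEGIN OF ty_price,
         matnr TYPE c LENGTH 18,
         price TYPE p DECIMALS 2,
       END OF ty_price.

DATA: lt_prices TYPE STANDARD TABLE OF ty_price,
      ls_price  TYPE ty_price.

* ... fill lt_prices from the database once, outside the loop ...

SORT lt_prices BY matnr.

LOOP AT lt_items INTO ls_item.
  " Binary search requires the table to be sorted by the key used here
  READ TABLE lt_prices INTO ls_price
       WITH KEY matnr = ls_item-matnr
       BINARY SEARCH.
  IF sy-subrc = 0.
    " ... use ls_price-price ...
  ENDIF.
ENDLOOP.
```

A SORTED TABLE with a defined key achieves the same effect and protects you from forgetting the SORT; the point is the design principle, not the keyword.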


Mobility

If you have read any of my previous blogs, checked my LinkedIn profile or read some of my tweets, you should know by now that I’m a mobile technology adept. This was caused by a rather random encounter at TechEd Vienna back in 2009, and it was certainly not the case before that encounter.

It doesn’t seem that long ago, but boy, has this technology matured in those two years! Out of every 10 people on the floor, 8 had a tablet device (apart from my own Galaxy Tab, they were all iPads). “Those without a tablet device felt inadequate.” (a quote from an ex-colleague of mine)

From a technological perspective, I saw real maturity in mobility. Nearly everybody knows what a MEAP (Mobile Enterprise Application Platform) is and how it works. They all understand the principles of synchronization and offline storage, and they all understand that one size does not fit all.

Across the border at SAPPHIRE NOW, I heard a very different tune… There, they were still informing business-oriented attendees on what mobility is, and what it is not. Apparently, many still fear (or hope) that it’s some sort of mobile SAP GUI. The single greatest fear there is that mobile technology is not mature enough for enterprises.

This, of course, is not true; rather the opposite. Enterprises are not ready for mobility, because it demands a certain openness and trust in your IT and your staff. It demands confidence in innovation and new technology.

A lot of this fear is actually being fed by a vicious circle of misinformation. Business presents to business. How can they ever bring across a correct message if they misunderstood the message themselves? I actually heard one presenter describe his mobile project as a “Ford Model T”. To him, this may have seemed like a compliment, because the Model T meant a huge leap for the automotive industry. He couldn’t have chosen a worse metaphor. Half of the audience now thought that mobility was not a mature technology, just as the Ford was merely the start of a new era. The other half understood that his project resembled an antiquated, slow and uncomfortable ride.

The key message here is: mobility must be an IT-driven project, not a business-driven project. I’m sorry, but we have the knowledge; leave it up to us to solve your functional needs with the proper technology.

Cloud Technologies

This last hot topic suffers from very much the same problems as mobility. Cloud technologies are actually quite mature, but business still seems very much uninformed about the advantages and possibilities. The most-heard objections are: “Our data is somewhere out there!”, “What about security?”, “What about my personnel?”

Do you, as a CEO, know where your data is stored now? No, but surely your IT staff will be able to tell you, if you find the right person to talk to. Cloud is no different: the provider enters into a service contract with you. You want to know where your data is? Ask them; they’ll be able to answer you.

All your internal IT systems are locked away in your highly secured private network. It’s no different with cloud: the systems are still in your secured private network; only physically, they are somewhere else. But does it really matter whether they are in a bunker in the USA or in a bunker in your backyard?

Your personnel manages your systems now, mostly without actually touching the hardware. This won’t change in the cloud. Your own personnel will still maintain the systems; only the hardware is no longer your concern. It’s a matter of letting experts focus on what they’re good at, so you don’t have to find new experts on the job market.


There were many more topics which weren’t quite as hot. One in particular was NetWeaver Gateway. To me, it’s closely related to mobility, but I did see some things which I want to highlight here. SAP promotes Gateway as the module which allows access to your backend data via REST and OData. SOAP was too complex and heavy for some lightweight applications, so they implemented a REST/OData provider instead.

Being SAP, they obviously needed to add some more information to the OData syntax in order to get some metadata across, much like WSDLs do for SOAP… Take a lightweight protocol, add extra data to it, and put it in a separate module which adds extra overhead. I’m not sure it’s actually going to be faster and easier than SOAP.
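To make the comparison concrete: under Gateway, a consumer reads backend data with plain HTTP requests against OData URLs. A minimal sketch with a hypothetical service and entity set name (the /sap/opu/odata/ prefix is Gateway’s standard path):

```
GET /sap/opu/odata/sap/Z_ORDERS_SRV/OrderSet?$filter=Status eq 'OPEN'&$top=10 HTTP/1.1
Host: gateway.example.com
Accept: application/json
```

The extra metadata I mention lives in the service’s $metadata document, which describes the entity types much as a WSDL describes a SOAP service.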

Oh, and as an extra, Gateway allows you to generate REST-based services from screen scraping. Yep, that’s right: we’re going back to “batch input”. Oh dear…

So if you, as a developer, ever think of using a lightweight protocol instead of SOAP, please read this: RESTful webservices in ABAP using a generic controller.

On a final note, I would like to say that this blog would probably not have existed without the relentless air conditioning in the airplane, the hotel and IFEMA giving me my sore throat and keeping me indoors. But I’m not complaining, because I most definitely needed some rest anyway 🙂

I’ve seen a lot of progress in technology and education and met a lot of enthusiasts who share my vision and passion.

I can’t wait to see what next year brings!


PS: find the original version of my blog here.

  • Hello,
    Thanks for this pragmatic blog.
    One question: You opine that even with HANA you would suggest performance optimizing measures!
    Is that really true? Also, I read somewhere that ABAP for HANA would not be the same and it would not be backward compatible! Am I wrong?
    To be candid, I was struggling with whether I should read the book on 'Optimizing techniques' anymore!


    • Hi Kumud,

      With HANA, you indeed need to use a different approach to your data selection. Basically, you first need to create a new model into which you load your data so that you can query it afterwards. Compare it a bit to having a BI system inside your business suite, which you can query directly and which keeps its data in-memory.

      So you still need to be careful in creating your model.

      The need for optimization will be lower if you get your model right, but there will still be situations in which you'll require the good old binary read. If you try to push everything into the model, then you'll just move the heavy operations to the offloading in HANA, which isn't visible in the program itself, but it does have an impact on your servers.

      Other than that, HANA exists next to the classical database, so you'll always have the choice between a classic select, or a HANA operation.

      Conclusion: That book on optimizing techniques is still very useful.


      • Hello Again,
        You would be surprised, but your answer ended one of my long-standing doubts!
        Did you also attend a hands-on workshop on HANA?
        Can you please link me to the appropriate blog you wrote on it?


        • I haven't attended a hands-on session for HANA specifically, but the subject was touched on in nearly every session, so I managed to connect the dots.

          If I manage to dig a little deeper in the subject, I'll definitely write a blog on it.


  • Tom,
    You ask: does it really matter if (data) are in a bunker in the USA or in a bunker in your backyard?
    Yes, actually, it can, depending on the data and on the data protection laws governing the organization's operations. The SAP user group in Germany DSAG has written an excellent and comprehensive document on data protection for ECC 6.0, and the location of personal data is a consideration in Germany and other EU countries, and potentially elsewhere. So before dismissing such concerns, I encourage readers to review their recommendations. I have posted a copy of their document on, and it is also available on the DSAG website.


    • Hi Gretchen,

      Thank you for pointing this out. There are indeed certain data protection laws that influence cloud offerings. To tackle these issues, many vendors have started offering private clouds. Basically, it's a cloud infrastructure where you know the physical location and where the contract with the provider includes a whole lot of legal addenda to adhere to legislation. (I don't know the details, but apparently they cover the legal problems.)

      Accenture, for example, already offers such a service, and I think that SAP River includes such considerations as well (though I'm not absolutely certain; I might have misunderstood).

      So rather than dismissing the idea of cloud services, enterprises must (and they actually do) think of solutions to their problems.

      The problem is not so much IT, but legislation. Information can't and must not be limited to a geographic region. Even in the old days, people would send information across borders using pigeons.

      But since IT is more flexible than legislation, I'm very happy to see the creative solutions such as private clouds.

      Thank you for bringing this point up. It's definitely an interesting development which I neglected to tackle in the blog.