
Object Orientation and Performance – Mission Impossible?

Why is OO perceived as incompatible with high performance? I asked myself this question after reading one of the comments to the blog ABAP Trilemma, where the author discusses the weighted relevance of Security, Performance and Design. The comment related to the well-known “truth” (some would call it Conventional Wisdom) that Object Orientation and Performance do not go well together.

I posted a brief reply to this comment, but realized this is too important a subject to be left alone. Hence this (hopefully) provocative rant.

The two tongues of ABAP

The ABAP world is unique in the sense that it spans the cleft between the two main design paradigms in computer programming: procedural and object oriented. To my knowledge, no other language has such a schizophrenic group of followers as ABAP. On one end, there are the “Object Evangelists”, a mix of converted old-timers who have seen the light and the younger generation of developers who never bothered with any of the stone-age languages in the first place. On the other side, we have the “Procedural Protestants”, firmly rooted in the lore of Cobol, PL/1 and other legacies of the (largely IBM mainframe) way of doing things from which SAP originates.

The comment focused on the well-known “fact” that Object Orientation does not mix well with high performance. Good design (and, strangely, everyone – regardless of camp affiliation – seems to agree that this means object-oriented programming) implicitly means we’ll have to sacrifice something on the performance side. If it looks nice, it won’t run well.

Should we believe this? Honestly, I don’t think so. Not anymore. Let’s look at what the above “truth” implies. I will do this by postulating a few theses of my own.

1. Applications dealing with high data volumes can be made less complex with OO.

Most of the applications dealing with high data volumes tend to be fairly straightforward, conceptually. They read data, process it, then provide some form of output or data update. This should actually be a good reason for dealing with them in an OO-centric way. The data access itself usually comes down to relatively simple SELECT statements (I’ll deal with that later). What complicates matters is the actual manipulation of the data, and it’s my humble opinion that this is far better handled by architecting a proper OO design. Defining and using business objects and their methods will simplify the application, as opposed to writing a (perceived) streamlined classical ABAP program, complete with subroutines and calls to function modules. The OO-based application will be easier to read and maintain, and should not noticeably bog down your system. Even if it does, there are other techniques you can use, such as Shared Memory Objects, parallel processing (qRFCs) and so on. None of these techniques is tied to classical, procedural ABAP.
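
To make this concrete, here is a minimal, hypothetical sketch of that read-process-output cycle wrapped in a local class. All names (lcl_order_report, its methods, the selection on VBAK) are purely illustrative, not prescribed:

CLASS lcl_order_report DEFINITION.
  PUBLIC SECTION.
    METHODS: load_orders     IMPORTING iv_vkorg TYPE vkorg,
             process_orders,
             display_results.
  PRIVATE SECTION.
    DATA mt_orders TYPE STANDARD TABLE OF vbak WITH DEFAULT KEY.
ENDCLASS.

CLASS lcl_order_report IMPLEMENTATION.
  METHOD load_orders.
    " Read only the fields the report actually needs
    SELECT vbeln erdat vkorg netwr waerk
      FROM vbak
      INTO CORRESPONDING FIELDS OF TABLE mt_orders
      WHERE vkorg = iv_vkorg.
  ENDMETHOD.

  METHOD process_orders.
    " Business rules on mt_orders live here, instead of in scattered
    " subroutines and function module calls
  ENDMETHOD.

  METHOD display_results.
    " ALV output, spool, or an update - whatever the task requires
  ENDMETHOD.
ENDCLASS.

" The report itself then boils down to:
"   CREATE OBJECT lo_report.
"   lo_report->load_orders( p_vkorg ).
"   lo_report->process_orders( ).
"   lo_report->display_results( ).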

2. Performance issues are largely related to (persistent) data access.

Most of the performance-related issues with large data volumes relate to the DB access itself. This is largely independent of the application design. Considerations such as intelligent table access via proper indexes, reading a limited set of fields, proper use of indexed internal tables and so on can just as easily be applied within an OO context. If the report or application is going to be run repeatedly with varying selection criteria or processing parameters, one should consider building a shared memory object to hold the desired data, as opposed to repeatedly reading large data volumes from database tables. Again, splitting the processing using tRFCs or the newer qRFC technique should be investigated. Either way, the DB access issues are not related to how the application itself is designed.
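
As a rough illustration, with invented class, type and selection names, the usual DB-access hygiene (limited field list, index-friendly WHERE clause, keyed internal table) carries over unchanged into a class method:

CLASS lcl_delivery_reader DEFINITION.
  PUBLIC SECTION.
    TYPES: BEGIN OF ty_item,
             vbeln TYPE lips-vbeln,
             posnr TYPE lips-posnr,
             matnr TYPE lips-matnr,
             lfimg TYPE lips-lfimg,
           END OF ty_item,
           " Sorted table, keyed for fast reads later in the processing
           tt_item  TYPE SORTED TABLE OF ty_item
                      WITH UNIQUE KEY vbeln posnr,
           tt_vbeln TYPE STANDARD TABLE OF lips-vbeln WITH DEFAULT KEY.
    METHODS read_items IMPORTING it_vbeln        TYPE tt_vbeln
                       RETURNING VALUE(rt_items) TYPE tt_item.
ENDCLASS.

CLASS lcl_delivery_reader IMPLEMENTATION.
  METHOD read_items.
    " Guard FOR ALL ENTRIES against an empty driver table
    CHECK it_vbeln IS NOT INITIAL.
    " Limited field list; the WHERE clause follows the primary key of LIPS
    SELECT vbeln posnr matnr lfimg
      FROM lips
      INTO TABLE rt_items
      FOR ALL ENTRIES IN it_vbeln
      WHERE vbeln = it_vbeln-table_line.
  ENDMETHOD.
ENDCLASS.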

3. What about the other aspects of “performance”?

There are different flavours of “performance”. What about reuse? Ease of maintenance? Total cost of ownership for procedural vs OO? I’d much rather maintain and enhance a (well-written and properly structured) OO application than a 5000-line procedural monster. Anytime. Also, I’ve never seen an ABAP application that has not been modified at a later point in time. In more than 95% of the cases, the modifications have been done by a developer other than the creator of the original application. Usually from a different service provider, as well. I believe this speaks for itself.

4. The “OO is not performance-friendly” statement is used as an excuse to avoid having to learn OO.

Lastly, and maybe most importantly, I believe the “OO is not performance-friendly” mantra is – to a large extent – being put forth by those developers, managers and devcos who are not too comfortable with the OO world themselves, in order to avoid having to deal with it. It’s convenient to chuck the whole OO discussion down the drain by using a cheap argument. Also, I believe that most of the ABAPers dealing with performance-related developments are still very firmly rooted in the procedural world. No offense intended.

Conclusion

Sure, there are cases where a short, 50-line procedural ABAP program does the trick and can be written in 5 minutes – as opposed to taking a full OO design approach. But these are exceptions. Normally, you don’t get assignments like this. Your standard development project spans weeks, if not months, and involves whole teams of programmers, not just you.

This last statement also shows the OO-induced shift towards a more community-centric way of working (think Scrum, Agile, XP). Speaking in OO is, to some extent, a social experience. Your team is developing a community of entities (objects) that work together to resolve a common goal. Creating a behemoth of a procedural beast is more often than not a fairly lonely task, undertaken in splendid isolation in a cubicle or office at the end of the corridor. As we did in the past.


9 Comments


  1. Former Member
    Yes, definitely: OO makes code much more readable and a lot easier from a support and development perspective.

    Thanks for sharing this.

    Best Regards
    Saujanya

  2. Former Member
    Trond,

    thanks for writing this. ABAP Objects can, of course, slow down each and every project when used inappropriately, but that applies to about every programming technique that’s out there. If you just try to write procedural programs using ABAP Objects statements and things get ridiculously slow, hey, then it must be ABAP Objects that’s slowing things down, right? (That’s a variant of your argument 4).

    It should also be noted that it’s not necessarily a good idea to use ABAP Objects like other object-oriented languages. If you have learned OO in Smalltalk where everything down to the most simple primitive value is an object and then try to recreate this in ABAP Objects, things will get really slow – naturally so, because ABAP Objects classes were never designed to support this scenario. Sometimes it’s just a better idea to pass along a data structure with some parameters instead of creating an object just because you think that everything has to be an object.
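
    A tiny, purely illustrative sketch of that last point, passing one flat parameter structure instead of wrapping every value in an object of its own (all names are invented):

    TYPES: BEGIN OF ty_pricing_params,
             waers TYPE waers,   "currency
             kschl TYPE kschl,   "condition type
             datab TYPE datab,   "valid-from date
           END OF ty_pricing_params.

    CLASS lcl_pricing DEFINITION.
      PUBLIC SECTION.
        METHODS calculate IMPORTING is_params TYPE ty_pricing_params.
    ENDCLASS.

    CLASS lcl_pricing IMPLEMENTATION.
      METHOD calculate.
        " Work directly with is_params-waers, is_params-kschl, ...
        " No wrapper objects needed for simple values
      ENDMETHOD.
    ENDCLASS.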

    The bottom line: Whenever and wherever blind belief in a certain paradigm or blind opposition to it overrides common sense, expect heated discussions, but little else.

      Volker

    1. Former Member
      ABAP OO is definitely more time-consuming, as the design has to be well thought out before the build, which is not the case with procedural programming. Debugging OO is a bit harder than procedural, and the readability improvement is questionable: code can be made very readable in conventional ABAP too. OO should mostly be used in component developments, and I find it futile to use it in reports. Using OO just for the heck of it does not make sense.
  3. Matt Harding
    Firstly – Apologies if this appears twice – Commenting was acting up on my first try…

    Hi Trond,

    The core issue with this argument, which you hear often (at least IMO), is OO purists ignoring the context of the perceived performance problems and sticking to basic OO patterns because they don’t want to “dirty” their code. (I use “purists” not to describe real OO purists, who have way more experience and knowledge than me, but those who have read something and stick to it without really understanding reality.)

    But if we could take this argument to our procedural counterparts and put an example to it, we could then pull in an OO pattern to tackle the design, which would help explain these approaches to tackling high-performance requirements. But also, as you say, sometimes a 50-line procedural piece of code does cut the mustard – especially if you’re working in Data Migration 🙂

    Another great example to discuss more recently is HANA, where we are being encouraged to remove “layers” to potentially increase performance a… well, a lot. HANA is inherently asking us to break our OO constructs and head back to more of a stored-procedure style approach, and my concern here is that the procedural programmer may see this as a way to start ignoring OO.

    Well, I say this, but it isn’t really asking us to do this if we’re good OO designers. It is asking us to lose some of the higher-level OO wrappers that can protect us from needing to understand everything in order to do something (which is what procedural programming typically requires of us); but it is also asking us to build a performant layer that integrates with our design, still leveraging OO practices. E.g. just because you have a stored procedure or complex SQL that converts financial transactions in various currencies to a single currency doesn’t mean you don’t wrap it with the right OO logic, integrated into the OO design, ensuring it does not break all the object logic in between.
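
    A rough, invented sketch of that wrapping idea: the heavy lifting (here a plain aggregating SELECT, which on HANA could just as well delegate to a procedure) stays behind one method, and the rest of the OO design only ever calls that method:

    CLASS lcl_revenue DEFINITION.
      PUBLIC SECTION.
        TYPES: BEGIN OF ty_total,
                 vkorg TYPE vkorg,
                 netwr TYPE netwr,
               END OF ty_total,
               tt_total TYPE STANDARD TABLE OF ty_total WITH DEFAULT KEY.
        METHODS get_totals RETURNING VALUE(rt_totals) TYPE tt_total.
    ENDCLASS.

    CLASS lcl_revenue IMPLEMENTATION.
      METHOD get_totals.
        " Aggregation is pushed down to the database layer; callers never
        " see how the totals are produced, only the clean OO interface
        SELECT vkorg SUM( netwr )
          FROM vbak
          INTO TABLE rt_totals
          GROUP BY vkorg.
      ENDMETHOD.
    ENDCLASS.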

    Anyway, I admit there are even extreme cases where I may not use OO at all for performance reasons – but these are isolated, and don’t involve too much logic to fit into my head. What worries me most is people buying HANA just to avoid good, performant, secure, maintainable design.
    Cheers,
    Matt
    ps. I used to work with real-time systems, and there was very little OO in that space, because in those days there was very little memory, few CPU cycles and little throughput – but those days are pretty much non-existent in the SAP world, even on mobile devices (again – IMO).

    1. Trond Stroemme Post author
      Hi Matt,

      I agree with all who say that extremist views are counterproductive and should be avoided… that was what inspired me to write this in the first place. A badly written OO app can be a lot worse than an excellent procedural solution. But I still believe that good OO can and will be at least as good as any procedural solution.

      As for HANA, it’s an interesting concept. At this year’s TechEd, there was one comment (I can’t remember from whom) stating that lots of BI implementations were sub-optimal – badly designed models/cubes and so on – and that porting these to HANA, while improving performance, would still not solve the underlying architectural problems. I guess bad design can be found everywhere, which just reinforces my points… and yours.

      My impression is that SAP, both with HANA and the recent emphasis on the BOL (recently ported from CRM to the ERP world), is focusing on “black-boxing” much of the application logic into the (persistent) objects themselves. I’m looking forward to seeing where this takes us, and to the impact on the whole ABAP community. I’m not sure it will drive more people away from OO; hopefully it will have the contrary effect.

      Regards,
      Trond

  4. Naimesh Patel
    Hello Trond,

    First of all, thanks for the article. It covers both aspects – OO ABAP and performance – which I try to respect while developing. I write a lot about ABAP Objects, along with other ABAP topics, on my ABAP Help Blog.

    We should try to use the persistence services when we are dealing with data updates. But if we are dealing with large volumes, using the persistence services for updates would hamper performance: selecting the data using query services and instantiating an object for each row. Then we should focus on getting the best out of both – OO and the hardware – by using parallel processing. If we are worrying about the sequence of fields in the SELECT query, or using DB hints to speed up performance, we should refrain from using the persistence services.
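
    As a compact, invented sketch of the parallel-processing idea (the RFC-enabled function module Z_PROCESS_CHUNK and every other name here are hypothetical): the work list is split into chunks, each chunk is handed to an asynchronous RFC task, and an OO callback collects the results.

    CLASS lcl_dispatcher DEFINITION.
      PUBLIC SECTION.
        TYPES tt_vbeln TYPE STANDARD TABLE OF vbak-vbeln WITH DEFAULT KEY.
        METHODS: dispatch       IMPORTING it_chunk TYPE tt_vbeln,
                 on_end_of_task IMPORTING p_task   TYPE clike.
      PRIVATE SECTION.
        DATA mv_open_tasks TYPE i.
    ENDCLASS.

    CLASS lcl_dispatcher IMPLEMENTATION.
      METHOD dispatch.
        DATA lv_task TYPE char32.
        ADD 1 TO mv_open_tasks.
        lv_task = mv_open_tasks.   "simple unique task name
        " Hypothetical RFC-enabled function module doing the heavy work
        CALL FUNCTION 'Z_PROCESS_CHUNK'
          STARTING NEW TASK lv_task
          DESTINATION IN GROUP DEFAULT
          CALLING me->on_end_of_task ON END OF TASK
          EXPORTING
            it_vbeln              = it_chunk
          EXCEPTIONS
            communication_failure = 1
            system_failure        = 2
            resource_failure      = 3.
      ENDMETHOD.

      METHOD on_end_of_task.
        " Collect the results of task p_task here, e.g. via
        " RECEIVE RESULTS FROM FUNCTION 'Z_PROCESS_CHUNK', then
        " decrement the open-task counter
        SUBTRACT 1 FROM mv_open_tasks.
      ENDMETHOD.
    ENDCLASS.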

    Performance in OO also largely depends on coding practice. Copying internal tables, even in “normal” programming, will hamper performance. Instantiating an object within a LOOP takes more resources, but this can be eliminated by implementing the singleton design pattern. Similarly, we should try implementing other OO design patterns where they fit.
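
    A small, hypothetical sketch of the singleton idea (all names invented): the instance is created lazily, once, and reused instead of issuing CREATE OBJECT on every loop pass.

    CLASS lcl_converter DEFINITION CREATE PRIVATE.
      PUBLIC SECTION.
        CLASS-METHODS get_instance
          RETURNING VALUE(ro_instance) TYPE REF TO lcl_converter.
        METHODS convert IMPORTING iv_amount TYPE netwr.
      PRIVATE SECTION.
        CLASS-DATA go_instance TYPE REF TO lcl_converter.
    ENDCLASS.

    CLASS lcl_converter IMPLEMENTATION.
      METHOD get_instance.
        IF go_instance IS NOT BOUND.
          CREATE OBJECT go_instance.   "created once, on first use
        ENDIF.
        ro_instance = go_instance.
      ENDMETHOD.

      METHOD convert.
        " Conversion logic working on iv_amount
      ENDMETHOD.
    ENDCLASS.

    " Inside the loop, the same instance is reused on every pass:
    "   LOOP AT lt_items INTO ls_item.
    "     lcl_converter=>get_instance( )->convert( ls_item-netwr ).
    "   ENDLOOP.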

    Another argument against using OO is strict timelines. If developers don’t have a proper command of OO, they end up writing bad OO, e.g. copying data from one class to another instead of passing it by reference. Sometimes they want to use OO, but they just don’t know how to do it efficiently, so they end up creating STATIC methods. This in turn increases the TCO, as statics have their own set of problems – they can’t be redefined, and there is no object reference to work with because there is no object.

    Thanks,
    Naimesh Patel

    1. Trond Stroemme Post author
      Hi Naimesh,

      thanks for your comments. I believe the transition to using BOL would largely mean using persistent services, although I see BOL as an abstraction layer on top of the persistent objects. This is a good thing, in my opinion, as the persistent services concept in itself can be both confusing and complex. As for performance issues, I believe they need to be investigated before making the transition. Again, HANA looms on the horizon as a possible life-saver here. Lots of unknowns!

      As you say, using best practices (design patterns) is a must, whatever you do.

      I don’t agree with the argument related to strict timelines. IMHO, project managers should insist on using OO. This includes instilling (or inspiring) the necessary level of competence in all project members. Again, IMO, the only way of ensuring this is to enforce OO as the only acceptable way of coding, with the aforementioned exceptions (ad-hoc programs performing very limited tasks). I believe anyone using the “timeline” argument will have to pay for it later, when yet another load of unmanageable developments is added to the code base. Or, maybe more to the point, their successors will have to pay 🙂

      It’s a little like saving on the cost of building a house by buying sub-standard materials and skimping on construction rules. Then, when the earthquake hits, we all know what happens…

      Regards,
      Trond

      1. Naimesh Patel
        Hello Trond,

        The argument related to strict timelines is an excuse I have heard a lot. Sometimes it’s just hype, as people don’t know how to actually work that way. But most of the time, the governing body that controls the budget only looks for the end product within the budget.

        If we try to introduce agile with proper TDD, it will ultimately result in very robust software – but if they don’t realize that, they will be in trouble when the earthquake comes, just after go-live!

        Thanks,
        Naimesh Patel

  5. Suhas Saha
    Hello Trond,

    I could not have asked for a better start to 2012. So many blogs on OO; it goes to show there are people who want to champion the OO ABAP cause.

    Take care,
    Suhas

