Fellow Odaters,


I previously published the blog Improved Inside-Out Modelling, mainly to point out the problems with mapping complex FMs outwards and, hopefully, to discourage the practice. It was well received, and this follow-up is well overdue – sorry!

This current blog is not the blog I intended to write.

What I wanted to do was publish a blog comparing implementations of the same scenario using the inside-out and outside-in modelling paradigms. Why would I wish to do this? Well, the truth is that an overwhelming number of help requests on SCN relate to problems with RFC mapping. This might not have been so bad, except that it is a bad habit to get into and not a good way of developing services.

An even worse trend is that developers are now writing custom function modules to act as service patterns – sorry guys, but you’ve got to stop doing this.

I’ve already pointed out elsewhere on SCN that SAP do not use RFC mapping generators to build Fiori services (although the lack of standardisation in Fiori services is another topic for debate). Why don’t they do it? Have a guess…

To cut a long story short, I had to abandon the approach for that blog because the average service (with several entities, navigations, filters, etc.) is incredibly difficult to build using the RFC generator.

My colleagues and I have written many services – we do use function modules at times, but we never use the RFC generator.

Here is one line from my original draft that I do wish to retain:

Unless the service is extremely simple, such as one read or query operation on one entity, using the RFC/BOR generator is not advisable.

While there are pros and cons to function mapping, the cons outweigh the pros. Overall, the tool is just there to assist developers and does nothing towards producing an OData-compliant model. This is especially true when you consider the new functionality available with OData 4.0.

Pros

  • Parameter names and types are extracted into the model.
  • Bound to the Dictionary – useful for documentation and automated data conversion.

Cons

  • Extracted names often need editing for OData conformity.
  • Bound to the Dictionary, exposing the backend.
  • One function rarely performs all CRUDQ operations…
  • …and the separate functions needed to cover CRUDQ may not use the same data typing for the entity sources.

Furthermore, the RFC generator does nothing that you cannot do by writing the ABAP in the first place. I am not against using tools when they are useful; prior to the release of Service Builder, all data and model providers had to be written from scratch, with the model provider code being especially long-winded. Service Builder pretty much removed the need to do any work within the MPC; the extension class only needs to be used for features that Service Builder does not support.

With the DPC, the generated artifacts are built on top of a solid class foundation which does most of the heavy lifting; all you need to do is provide some code in the right places.
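To make “provide some code in the right places” concrete, here is a minimal sketch of a redefined GET_ENTITYSET method in a generated DPC extension class. The class and entity names are hypothetical; the framework interfaces shown are the standard Gateway ones:

```abap
METHOD salesorderset_get_entityset.
  " Redefinition (in e.g. ZCL_ZDEMO_DPC_EXT) of the method that
  " Service Builder generates in the DPC. The framework has already
  " parsed the request: $filter, $top/$skip and so on arrive ready-made.
  DATA(lt_filters) = io_tech_request_context->get_filter(
                       )->get_filter_select_options( ).
  " ...apply lt_filters to the selection as required...

  " Read the data from wherever suits the service - a plain SELECT here,
  " but it could just as easily be a function module call.
  SELECT vbeln, erdat, netwr
    FROM vbak
    INTO CORRESPONDING FIELDS OF TABLE @et_entityset
    UP TO @is_paging-top ROWS.
ENDMETHOD.
```

The point is that the foundation classes deliver the parsed request and serialise the response; the tailored part is only the data retrieval in the middle.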

Which leads us to “code-based implementation”. I don’t like this description because it’s misleading – implementations coming out of the RFC generator are also code-based!

I think I’m going to start calling them “Tailored implementations” instead. In fact the tailoring aspect is a good analogy.

Most FMs are too big or too small for the client’s body, or may not even be suitable for the intended body part (underpants for a hat?). If the client is particularly oddly shaped, you may spend weeks looking for a suitable “fit” in a limited wardrobe. In this ill-fitting scenario it’s more efficient to go to a tailor and tell them what you want. The main difference here is that you can be the tailor if you can sew a few bits of ABAP together – no high costs, no waiting for delivery.

I’ll leave things at that for now. I do intend to follow this up with the practical example in a further blog but I’m really keen to get this topic aired properly. What I am hoping to do later is show a design that is outside-in but uses RFC functions without using the generator tools (to keep the RFC zealots happy; I’d use OO in preference any day).

That day has come: Outside-in modelling – a practical guide (RFC Rehab 1)


Regards

Ron.


5 Comments


  1. Krishna Kishor Kammaje

    Great start Ron.

    Looking forward to the next blog. I am sure it would take great effort to come up with a comparison of both approaches, but it is definitely much needed by the community. Let me know if you need any help with that.

  2. Prabaharan Asokan

    Hi Ron,

    Good blog once again. Since Gateway SP09 we have a feature called “soft state” to store data in the gateway hub cache. Can this be incorporated for such performance-heavy operations?

    Regards

    Prabaharan

    1. Ron Sargeant Post author

      Hi Prabaharan,

      I’m aware of Soft State and need to take a look at it in more detail on my SP09 system.

      Caching is something that might be considered as part of the runtime design but really has nothing to do with the differences in modelling paradigms. Either method can provide unwieldy amounts of data that caches may or may not solve. It’s not really meant to convert OData to a stateful platform.

      Regards

      Ron.

  3. Paul J. Modderman

    If I could like this entry more than once, I would. I’ve seen you fighting this battle in the forums, and know this: you have my sword.

    When I was learning service creation and modeling, I started with a teaching blog that gave step-by-step instructions to import RFC functions. What I should have realized then was that the focus of those examples was to get readers used to the concepts at work in OData modeling and how things like $filter and $expand work. Instead, I wound up creating my first couple of for-real services with that in mind: I would create a custom FM and then try to cram it into the RFC generation tools.

    Later, upon seeing more *_EXT examples and needing to do things more than just wrap an RFC function, I saw the true power of tailoring my ABAP for whatever need was in front of me.  Nowadays, even presented with a near-perfect RFC fit for a service need, I would still choose a ‘code-based implementation’.

    I wrote a blog post a while back that gave an example of using the RFC import, and I’ve been playing with the idea of re-writing it for the cleaner code-based paradigm.  I am now putting it on my calendar to carve out the time to do the re-write. 

    Thank you for your voice on this topic, and others.

    Paul

