By: Former Member

Developing high-quality OData services (1) Hints on common problems

I would like to share some of my experiences in developing OData services with SAP Gateway in this article, and I would be glad if a discussion evolved out of it. Let's start with an overview of common problems and some hints on how to avoid them. I see these hints as an addition to the Best Practices published by SAP.

Problem 1: Requirements and domain knowledge – just start coding!?

This problem is not specific to OData service development; it is an issue for almost every software developer. In contrast to a traditional ABAP developer, a backend service developer may have to work separately from the front-end developers building the client that the service interacts with. That can lead to unnecessary errors and iterations as you adapt your code to requirements revealed to you step by step, and too late.


  • Start working without precise requirements if necessary, but make it clear to everyone from the beginning that the risk of errors and the development effort rise without a precise specification.
  • If you are not yet an expert in the respective domain and SAP module, ask for someone who can answer questions on domain terms, transaction codes and tables.
  • Precise requirements do not contradict an agile working mode in which more and more user stories are added over time, as long as the expectations for each user story are clear and precise.

Problem 2: Service Modelling – a picture is worth a thousand words

The Gateway Service Builder is an effective tool for modelling OData services with SAP Gateway – but only if you already have a picture of the entity types and their relationships in mind. If not, you will likely get lost as soon as you have to define associations, referential constraints and navigation properties, and even more so when you start implementing navigation in service operations.


Gateway Service Builder (Transaction Code SEGW)

The SAP-internal data model can be very complex, comprising dozens of business object types with hundreds of attributes and allowing for special cases which may not be needed in your application. If you do not take the chance to simplify the data model exposed by your service, then your service will be harder to understand and to work with for client developers. Moreover, it will show suboptimal performance, as more data will have to be transmitted.


  • Visualize the service model. Start with pen and paper. If it gets too complex or “neat” documentation is needed, use a tool such as the OData Modeler.
  • Expose as few properties as possible for each entity set; get rid of everything that is not required for sure – it will be rather easy to add it later.
  • Before exposing the SAP-internal data model “as is”, try to find out whether several business objects in SAP can safely be combined into one for your application.


SAP OData Modeler for Eclipse
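To make the terms from above concrete, here is what a small, simplified model could look like in the service's $metadata document (SAP Gateway exposes OData V2). All entity, property and namespace names ("Z", WorkOrder, Operation) are made up for illustration:

```xml
<EntityType Name="WorkOrder">
  <Key>
    <PropertyRef Name="OrderId"/>
  </Key>
  <Property Name="OrderId" Type="Edm.String" MaxLength="12" Nullable="false"/>
  <Property Name="Description" Type="Edm.String" MaxLength="40"/>
  <!-- Navigation property: lets clients read WorkOrders('4711')/Operations -->
  <NavigationProperty Name="Operations" Relationship="Z.WorkOrderOperations"
                      FromRole="WorkOrder" ToRole="Operation"/>
</EntityType>

<Association Name="WorkOrderOperations">
  <End Type="Z.WorkOrder" Role="WorkOrder" Multiplicity="1"/>
  <End Type="Z.Operation" Role="Operation" Multiplicity="*"/>
  <!-- Referential constraint: operation entries carry the key of their order -->
  <ReferentialConstraint>
    <Principal Role="WorkOrder"><PropertyRef Name="OrderId"/></Principal>
    <Dependent Role="Operation"><PropertyRef Name="OrderId"/></Dependent>
  </ReferentialConstraint>
</Association>
```

If you can sketch this picture – which entity types exist, which side of each association is the "1" and which the "many" – before opening SEGW, filling in the Service Builder screens becomes largely mechanical.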

Problem 3: Implementation – to RFC or not to RFC

There is some serious criticism of it on SCN, and it is still true: one of the biggest misunderstandings in Gateway service development is “There are standard BAPIs for the operations, and you just have to wrap them”. Such statements are mostly made by consultants who never looked at the details, or by developers who have never worked on complex services.

From my experience, typical reasons that prevent simple wrapping of existing RFC modules are:

  • Missing fields
  • Missing filter / select options
  • Unspecific error messages in case of business errors – e.g. “Message E … can not be processed in plugin mode HTTP”
  • Unexpected errors during processing by the Gateway framework, e.g. due to empty date fields or an unexpected COMMIT WORK.

An unnecessarily long runtime is another drawback of RFC module calls, as far more data is read from the database than required, but for the first version of the service that may be acceptable. There are also specific weaknesses of the RFC/BOR generator when it is used without custom coding, some of which are:

  • Occasional errors in data types of properties (when used for modelling entity types)
  • No way to handle more than one line of table parameters.

However, the approach of using existing RFC modules, and possibly the generator, as a starting point also has some strengths: it gives you a very quick start with data access using proven business logic, some error handling, readiness for hub deployment, and paging ($skip/$top). Apart from the implementation, I do endorse modelling entity types using the RFC/BOR generator. I believe that's fine if you plan to use RFC modules as a basis for implementing one or more service operations, and if you thoroughly select and rename some properties to have consistent and clear names throughout your model.

Generation of service operations by mapping RFC modules with the RFC/BOR Generator


  • Never believe that everything will be quick and easy when wrapping existing RFC modules, but expect some manual effort to make them work as required.
  • If a remote-enabled function module for a service operation is available (standard or customer), then a flexible option is to quick-start with the RFC/BOR generator, copy the generated code from the DPC class, re-implement the method in the DPC_EXT class, paste the code there and then adapt and amend it according to your requirements.
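As a sketch of that last point: after pasting the generated code into the DPC_EXT method, one typical amendment is to map the BAPI return messages to a proper OData error instead of silently returning HTTP 200 with an empty body. The function module Z_GET_ORDERS, the entity set name and the table parameters below are assumptions for illustration only:

```abap
METHOD orderset_get_entityset.
  " Sketch only: Z_GET_ORDERS and the OrderSet entity set are hypothetical.
  DATA: lt_return TYPE STANDARD TABLE OF bapiret2.

  " Call the wrapped RFC module
  CALL FUNCTION 'Z_GET_ORDERS'
    TABLES
      et_orders = et_entityset
      et_return = lt_return.

  " Map business errors to an OData error response - without this,
  " the client receives HTTP 200 and never learns that something went wrong
  READ TABLE lt_return WITH KEY type = 'E' ASSIGNING FIELD-SYMBOL(<ls_error>).
  IF sy-subrc = 0.
    RAISE EXCEPTION TYPE /iwbep/cx_mgw_busi_exception
      EXPORTING
        textid  = /iwbep/cx_mgw_busi_exception=>business_error
        message = <ls_error>-message.
  ENDIF.
ENDMETHOD.
```

The exception is caught by the Gateway framework and turned into an error document with an appropriate HTTP status code, so the client sees the original BAPI message instead of an empty success response.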

Problem 4: Testing – is green really green?

The Gateway Client is included in SAP Gateway as a service testing tool, and you should use it – but it will only test HTTP response codes. Particularly with reading operations, it can easily happen that you return a response with missing attribute values due to some mistake in your service implementation. As long as there is no runtime exception, the HTTP status code will be 200 and the Gateway Client test will be green.


Gateway Client /IWFND/GW_CLIENT – will test HTTP response codes, but not body content


  • Create a test case in the Gateway Client for every relevant operation in your service in the development system, and run all of them before transporting changes to the test system. Do the same in the test system before transporting to the production system.
  • Use the export / import functionality for test cases (in the Gateway Client menu) to transfer them from one system to another. In the test system, select only the read operation test cases and export them for import in the production system.
  • Make sure that someone checks the response content after every relevant change to your code – either you yourself, your colleagues, your customer or an automated test tool.

Problem 5: Performance – oh my…

Let's face the truth: performance is often a weakness of SAP systems – and there are some good reasons for it, such as extensive checks of user authorization and input data consistency. However, as a service developer you have to do your best to keep response times in an acceptable magnitude. Especially the Query operation of a service (Get entity set) can cause problems. If the implementation is bad in terms of performance, even a few hundred or thousand records in the result set may lead to response times of several minutes.


  • If performance is not satisfactory, first check your service model for any properties which are not used by the clients, and delete them after copying the project to a backup.
  • Review and refactor the implementation of the operations with long response times; most likely you will find room for improvement. If that's not sufficient, runtime analysis with transaction SAT may help (that's a story for a forthcoming article).
  • Instead of having clients poll for updated data every 5 minutes, opt for a push service (only if the clients have fixed IP addresses / host names, or if you have SAP Mobile Platform).
  • Ask client developers to request JSON format instead of XML whenever possible. As a rule of thumb you can expect shorter response times for all requests returning about 100 entities or more.
  • Check SAP Note 1801618 for recommended values of system profile parameters. That note is intended for Gateway Hub systems. In case of embedded deployment, you can still use it to find out whether some of the parameter values in your system are far too low for optimal performance.
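One frequent cause of long response times is reading the full result set from the database and discarding most of it afterwards. A sketch of pushing $top and $skip down into the SELECT (the table ZTORDERS, its fields and the entity set name are assumptions; the syntax requires a newer ABAP release):

```abap
METHOD orderset_get_entityset.
  " Sketch: limit the database read to the page the client actually requested
  DATA(lv_skip) = CONV i( io_tech_request_context->get_skip( ) ).
  DATA(lv_top)  = CONV i( io_tech_request_context->get_top( ) ).

  " Rows needed at most; UP TO 0 ROWS means "no restriction" in ABAP SQL
  DATA(lv_max) = COND i( WHEN lv_top > 0 THEN lv_skip + lv_top ELSE 0 ).

  SELECT order_id, order_type, plant
    FROM ztorders
    INTO CORRESPONDING FIELDS OF TABLE @et_entityset
    UP TO @lv_max ROWS.

  IF lv_skip > 0.
    DELETE et_entityset TO lv_skip.   " drop the skipped rows from the buffer
  ENDIF.
ENDMETHOD.
```

Compared to reading the whole table and paging in ABAP afterwards, this keeps both the database time and the memory footprint proportional to the page size rather than to the table size.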

You will find more hints in Gateway Performance Best Practices – How to consume Gateway from performance point of view by David Freidlin.

Problem 6: Cry for re-use

A number of tasks in service implementation occur over and over again, e.g. reading filter criteria, getting object key values, writing log entries, throwing business exceptions with messages, or implementing paging with $skip and $top. Coding that stuff again and again would be a waste of time and mental capacity. Copy & paste from existing projects is a possible solution, but it's not particularly convenient, and it's one of the top root causes for errors in code.


  • For tasks which cannot efficiently be extracted to separate methods, extract code templates including the required DATA declarations. If you have no template yet, you may take code from the RFC/BOR generator as a starting point, e.g. for:
    • Reading filter values
    • Extracting key values
    • Determining lv_skip and lv_top values for paging
    • Throwing business exceptions and collecting messages
    • Creating timestamps from separate date / time fields and vice versa.
  • However, do not use templates to duplicate sections of identical code, as it will be laborious and error-prone to make changes later on! Rather extract it to separate classes, e.g. for:
    • Reading and writing long texts (DB access and conversion to/from table are always the same; key composition is object-type specific and can be done in sub-classes)
    • If you have to read from several related DB tables repeatedly, create a database view in SE11 (this gives you an inner join; it is not applicable for left outer joins).
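A template for reading filter values could look like the sketch below, which converts the filter on one property into a classic RANGES table for use in a SELECT. The property name 'Plant' is an assumption; IT_FILTER_SELECT_OPTIONS is the standard importing parameter of generated GET_ENTITYSET methods:

```abap
" Template sketch: extract the filter on property 'Plant' as a ranges table
DATA: lr_plant TYPE RANGE OF werks_d.

READ TABLE it_filter_select_options
     WITH KEY property = 'Plant'
     ASSIGNING FIELD-SYMBOL(<ls_filter>).
IF sy-subrc = 0.
  LOOP AT <ls_filter>-select_options ASSIGNING FIELD-SYMBOL(<ls_option>).
    " Each entry already carries SIGN / OPTION / LOW / HIGH
    APPEND VALUE #( sign   = <ls_option>-sign
                    option = <ls_option>-option
                    low    = <ls_option>-low
                    high   = <ls_option>-high ) TO lr_plant.
  ENDLOOP.
ENDIF.

" lr_plant can now be used in a WHERE clause: ... WHERE werks IN @lr_plant
```

Since only the property name and the target range type change from case to case, this is exactly the kind of code that pays off as a template with placeholders.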


Editing a simple code template for timestamp creation
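A minimal version of such a timestamp template, covering both directions mentioned in the list above (the variable names and values are illustrative):

```abap
" Template sketch: build a timestamp from separate date / time fields and back
DATA: lv_date TYPE d VALUE '20240115',
      lv_time TYPE t VALUE '093000',
      lv_ts   TYPE timestamp.

" Date + time + time zone -> UTC timestamp
CONVERT DATE lv_date TIME lv_time
        INTO TIME STAMP lv_ts TIME ZONE sy-zonlo.

" ...and the reverse direction
CONVERT TIME STAMP lv_ts TIME ZONE sy-zonlo
        INTO DATE lv_date TIME lv_time.
```

Using sy-zonlo (the user's time zone) versus a fixed zone like 'UTC' is a deliberate choice you have to make per service, since OData Edm.DateTime values are usually expected in UTC.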

Problem 7: Documentation

Service documentation should at least comprise information on the contained entities and relationships, on the meaning of entity properties, and on the relation of entity properties to fields visible to SAP dialog users. If you do not provide that information, you or somebody else will likely spend more time later on understanding the service before being able to work with it than you would have spent documenting it. The OData standard provides some degree of service self-description; however, compiling complete documentation still requires quite some manual effort, which you should take into account when planning service development projects.


  • Get the OData modeler for Eclipse and import the model from the Service Catalog of your SAP Gateway system to produce a nice diagram showing entity types and their relations as shown above for problem 2.
  • In addition to the property names of entity types, do enter a short but easily comprehensible description in the “label” column for each entity and for each property in the Gateway Service Builder (cf. Figure 1). This will be included in the response when clients request the metadata document of your service.
  • For documenting the relation of entity properties to fields visible to SAP dialog users I personally work with screenshots from the Gateway Service Builder and SAP dynpros – see screenshot below for an example.


Portion of a screenshot from Gateway Service Builder as part of documentation of entity type properties


Screenshot from SAP transaction with references to entity type properties as part of the service documentation

What's your experience?

Which problems and solutions have you encountered in modelling, implementing, testing and documenting OData services with SAP Gateway? Let's discuss them.

Comments
      Ekansh Saxena:

      CC:  SAP Gateway

      Ekansh Saxena:

      Hi Ringo,

      Eagerly waiting for next story on SAT 🙂 .

      Former Member:

      Many thanks for sharing the knowledge.



      Former Member (Blog Post Author):

      I'm glad that you found the article helpful. There are tons of challenges and possible solutions to be investigated - I will try to cover some of them over the next months.

      Florian Henninger:

      Well written, but I think all these problems are part of our job. Also, there are tons of solutions out there for how to prepare.

      You just need to decide which way is the one you want to go. So let's see what your way looks like 😉


      Former Member:

      Thank you for sharing your experience!

      • If you have to read from several related DB tables repeatedly, create a table View in SE11 (gives you an inner join, not applicable for left outer join).

      Additionally, you can use ABAP CDS views (from ABAP 7.40 SP5). Outer joins are possible there, and it is even possible to generate a Gateway service directly out of a CDS view.

      Former Member:

      Ringo, thank you for collating this information, very useful resource for developers.

      One area where I am keen to learn from others is how to reuse existing SAP OData models. In our developments we frequently have to do master data look-ups, e.g. plants, business partners, sales orders etc. Looking at the SAP standard Fiori apps / fact sheets, the data providers use deep expands or custom BAdI classes to fetch data - so it is not easy to get re-usability from the existing models.

      Building a new OData model from scratch for each Gateway project just to enable F4 value helps does not seem right, and it slows development down?

      - Warren

      Pavan Golesar:


      Very Helpful.


      Pavan G

      Anubhav Pandey:

      Very nice blog!

      Former Member:

      Thank you for your post!

      It is of great value and help for people who want to bring their OData Services to a higher level of quality.

      Thanks again!