
During the 2011 Mastering SAP Technologies conference, Matt Harding and I presented an entry titled “Semantic SAP”. The idea for the entry came from my experience using a Firefox plugin called “Operator”. In a nutshell, Operator reads the HTML of the current web page and parses out any Semantic mark-up encoded within it, e.g. Microformats such as vCard, or RDF embedded as RDFa. It then uses this information to offer Actions to the user, such as mapping an address or creating a contact in your address book.

SAP is full of data that can be used in this way. Consider the Customer Entity. Each Customer has a Name, an Address, and a number of telephone numbers and email addresses to use for communication. Customers also have linked Contacts. So the premise is that if we can apply Semantic meaning to this data, SAP can make intelligent decisions about the data it is displaying and, in our case, offer the user Actions, providing different ways to consume the data.

For example, SAP could offer the following Actions to a user viewing a Customer:

  • Create an Outlook Contact from the Customer or the Contacts
  • Call the Customer or Contact via integrated Telephony (e.g. Skype)
  • Show the Customer’s location in Google Maps
  • Send an Email to the Customer or Contact

In this blog I will outline the approach we took to build a prototype which offered this functionality and discuss some of the possibilities that it may open up for future development.

Before I go any further I’d like to acknowledge SAP Mentor Tobias Trapp who has blogged extensively on Semantics. I’d recommend reading his blogs to get a better understanding of the work that is going on in this area.


It all starts with a Model.

Before we can attach any Semantic meaning to our SAP data we need a model, or more precisely an Ontology, which will formally define the domain we are working in. For this we use OWL, the Web Ontology Language. Our tool of choice for modelling the Ontology is Protégé, which is a free, open source Ontology editor developed at Stanford University.

Our Ontology is fairly simple and can be seen below. We have modelled Classes to represent Customer, Contact, Name, Address and Telephone. Customer and Contact are our own creations, hence the “demojam” namespace. Address, Name and Telephone are based on the existing OWL ontology for the vCard microformat.


It’s important to notice the relationships between the Classes. In OWL, these are called Object Properties. We can see that Customers have Contacts. We can also see that both Customers and Contacts have Addresses, Telephone Numbers and Names. This modelling allows us to define the relationships between our data. These relationships are not based on the SAP data model – the Ontology needs to accurately reflect the Domain you are working in.

What I haven’t shown is that when we create Classes in Protégé, we also add Data Properties (similar to attributes in ABAP Classes) to each Class. For example, the Telephone Class has Data Properties to represent “work” and “mobile” telephone numbers.


The Big Picture.

Although the goal of the prototype was to offer Actions to the user based on the data they are viewing, the underlying requirement was to have SAP describe its Business Objects using the Resource Description Framework (RDF). By doing this, SAP applies semantic meaning to its data, which allows the relevant Actions to be determined. The Ontology provides SAP with the instructions on how to do this.

RDF is a language that allows us to make statements about resources and their relationships. These statements take the form Subject-Predicate-Object, and are often referred to as Triples.

For example, “The Customer ABC has a mobile phone number 123”.

  • The Subject is “Customer ABC”
  • The Predicate is “has a mobile phone number”
  • The Object is “123”.
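Stepping outside ABAP for a moment, the Triple idea can be sketched in a few lines of Python (the names and values here are purely illustrative, not from the prototype):

```python
from typing import NamedTuple

class Triple(NamedTuple):
    subject: str
    predicate: str
    object: str

# "The Customer ABC has a mobile phone number 123"
statements = [
    Triple("Customer_ABC", "hasMobileNumber", "123"),
    Triple("Customer_ABC", "hasName", "Name_1"),
]

# Every statement made about Customer ABC
about_abc = [t for t in statements if t.subject == "Customer_ABC"]
```

Because every statement has the same three-part shape, a consumer can query the data without knowing anything about the SAP tables it came from.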

So we need to write some code in SAP that will take our Ontology and a Business Object as input, and generate an RDF representation as output.

Simple 🙂

To demonstrate this idea further, the following is an RDF representation generated for a Customer in my ERP System.



Some of the statements we can make about this data are:

  • “Customer_0 has a name Name_1” – Name_1 is an object that holds the components of a Name.
  • “Name_1 is of type Name” – We defined the Class “Name” in the Ontology.
  • “Name_1 has a Family Name of MyCustomer” – We defined “Family Name” as a Data Property of the class “Name”.
  • “Customer_0 has an Address Address_2” – Address_2 is an object that holds the components of an Address.
  • “Address_2 is of type Address” – We defined the Class “Address” in the Ontology.
  • “Address_2 has postal-code 3004” – We defined “postal-code” as a Data Property of the class “Address” (part of the vCard microformat).

With our Customer now rendered as RDF, we are able to apply intelligent reasoning to the data and offer Actions to our users. The screen prints below show the RDF rendered as a directed graph (a common way of visualising RDF), highlighting the Name and Address relationships described above.




Show me some Actions!

At the beginning of the blog we discussed the type of Actions we could offer to our Users once we applied some Semantic meaning to our data. The vision we had for Semantic SAP can be seen in the following screen print.



We have used the Generic Object Services (GOS) menu to offer the Semantic SAP Actions. From an SAP GUI perspective, this allows us to apply our Semantic SAP Actions to almost any SAP Object.

The following Sequence Diagram gives an overview of the process for building the Semantic SAP GOS menu.



It’s worth discussing a couple of parts of the implementation, such as how the Data Provider and the Semantic Engine/Parser work, and how we get the Ontology into SAP.


Getting the Ontology into SAP.

Protégé allows us to export our Ontology as an RDF/XML file. We then import this into SAP, which stores the information in Custom Tables representing the Classes, Object Properties (Class Relationships) and Data Properties (Attributes).

Once the Ontology is imported we need to do a couple of configuration steps to help SAP map from its view of the world to our Model.

Firstly, we link SAP Business Objects to some of the Classes we imported. The GOS framework will provide the Object Type and Object Key. In the Customer case we need to know that KNA1 is a Customer.

Secondly, we link the Data Properties we defined to SAP Data Elements. This effectively provides a mapping between the SAP Data Dictionary and the Class Data Properties.

Once we have this information, we can dynamically generate a structure for each of the Classes that are mapped to a Business Object. This provides a container for our Data Providers to populate, and our Semantic Engine to parse.
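As a rough illustration of what these two configuration steps amount to (the prototype uses custom tables and ABAP; the specific names below are hypothetical), the mappings behave like simple lookup tables:

```python
# Mapping 1: GOS Object Type -> Ontology Class
object_to_class = {
    "KNA1": "demojam:Customer",       # the Customer Business Object
}

# Mapping 2: SAP Data Element -> Ontology Data Property
data_element_to_property = {
    "NAME1_GP": "vcard:family-name",  # hypothetical data-element choices
    "PSTLZ": "vcard:postal-code",
    "TELF1": "vcard:workTel",
}

def resolve(object_type: str) -> str:
    """Find the Ontology Class behind a GOS Object Type."""
    return object_to_class[object_type]
```

Everything downstream (structure generation, parsing) is driven from these lookups rather than from hard-coded knowledge of individual Business Objects.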

The convention we follow for the generation of the structure is:

  1. Each Class in the Ontology becomes a Structure.
  2. The Data Properties of each Class are created as fields in the structure, using the Data Elements that we mapped earlier.
  3. The Object Properties (the relationships between our Classes) are represented by nesting structures within structures based on the modelled relationship. These may also be nested tables if the relationship is 1:n. The Object Property name is used as the name of the field.
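Translated out of ABAP, the convention produces a nested container along these lines. This is a Python sketch with hypothetical property names; in the prototype the structure is generated dynamically from the imported Ontology:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Name:                    # Ontology Class -> structure
    family_name: str = ""      # Data Property -> field

@dataclass
class Telephone:
    work: str = ""
    mobile: str = ""

@dataclass
class Contact:
    hasName: Name = field(default_factory=Name)
    hasTelephone: Telephone = field(default_factory=Telephone)

@dataclass
class Customer:
    # Object Property -> nested structure; a 1:n relationship -> nested table
    hasName: Name = field(default_factory=Name)
    hasContact: List[Contact] = field(default_factory=list)
```

The field names carry the Object Property names, which is what later lets a generic parser recover the relationships without any Customer-specific code.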


Data Providers

Each SAP Business Object needs a Data Provider. A Data Provider Interface defines the contract that these classes must conform to.

The responsibility of the Data Provider is to populate the structure generated from the Ontology with the relevant data for the requested Business Object. The advantage of this approach is that it encapsulates the Business Object specific logic needed to get from the relevant database tables to a well-defined structure based on a known convention. This means that although each structure is different for each Business Object/Ontology, it can be parsed in the same way by the Semantic Engine.
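In the prototype this contract is an ABAP interface with one implementing class per Business Object. A minimal Python sketch of the same idea, with hypothetical names and hard-coded data standing in for the database reads:

```python
from abc import ABC, abstractmethod

class DataProvider(ABC):
    """The contract every Business Object specific provider must fulfil."""

    @abstractmethod
    def populate(self, object_key: str) -> dict:
        """Fill the Ontology-generated structure for the given object key."""

class CustomerDataProvider(DataProvider):
    def populate(self, object_key: str) -> dict:
        # The real provider reads the Customer tables; here the data is faked.
        return {
            "hasName": {"family-name": "MyCustomer"},
            "hasAddress": {"postal-code": "3004"},
        }
```

The engine only ever talks to the interface, so adding a new Business Object means writing one new provider class and nothing else.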


The Semantic Engine

The job of the Semantic Engine is to take the output of the Data Provider and turn it into RDF Triples. This involved developing a recursive parser and using the RTTI (Run-Time Type Information) libraries. The Parser relies on understanding the conventions used to build the structure it is parsing. It can then use the information stored when the Ontology was imported to generate the RDF representation of our Business Object.
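Stripped of the ABAP RTTI details, the heart of such a parser is a short recursive walk. A Python sketch, assuming the generated structure arrives as nested dicts/lists in which nested values correspond to Object Properties and leaf values to Data Properties:

```python
import itertools

_node_ids = itertools.count(1)   # generates Name_1, Address_2 style node numbers

def to_triples(subject: str, data: dict) -> list:
    """Recursively turn the nested structure into Subject-Predicate-Object triples."""
    triples = []
    for predicate, value in data.items():
        values = value if isinstance(value, list) else [value]  # 1:n handling
        for v in values:
            if isinstance(v, dict):    # Object Property: link the Subject to a new node
                node = f"Node_{next(_node_ids)}"
                triples.append((subject, predicate, node))
                triples.extend(to_triples(node, v))
            else:                      # Data Property: a literal value
                triples.append((subject, predicate, v))
    return triples

customer = {"hasName": {"family-name": "MyCustomer"},
            "hasAddress": {"postal-code": "3004"}}
triples = to_triples("Customer_0", customer)
```

Because the walk only depends on the structure-generation convention, the same few lines handle Customers, Contacts, or any other Business Object mapped to the Ontology.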

Now that we have an RDF representation we can determine the appropriate Actions!


The Semantic Action Factory.

The Semantic Action Factory returns a list of Actions to be shown in the GOS Menu. It does this by asking all the registered Semantic Action Classes if they can provide an Action based on the RDF representation of our Business Object.

This framework makes it easy to add new Actions: simply create a new ABAP Class that implements the Semantic Action Interface, and look for the Semantics you understand in the RDF that is provided.
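The pattern is a simple registry: the factory asks each registered Action class to inspect the triples and decide whether it applies. A Python sketch with hypothetical class and action names:

```python
class SemanticAction:
    """Interface: return a menu entry if the RDF is understood, else None."""
    def offer(self, triples):
        raise NotImplementedError

class MapAction(SemanticAction):
    """We understand Addresses: offer a map when a postal-code is present."""
    def offer(self, triples):
        if any(pred == "postal-code" for _, pred, _ in triples):
            return "Show location in Google Maps"
        return None

REGISTRY = [MapAction()]  # all registered Semantic Action Classes

def build_gos_menu(triples):
    """Ask every registered Action whether it applies to this object."""
    return [entry for action in REGISTRY if (entry := action.offer(triples))]
```

Note that the Actions key off the Semantics (the predicates), not off the Business Object type, which is why one Action can serve any object whose RDF contains the right statements.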


Some Examples of Semantic SAP in action.

So what can it do? The screenshots below demonstrate some of the Actions that were implemented for the Semantic SAP prototype.


Add the Customer to your Outlook address book.


Show the Customer’s location on a map.


Call the Customer.


Email a Customer.


Where to next?

As an example of how extensible this approach was, during the development we modelled Work Orders and offered Calendar integration based on the different “Events” that happen in a Work Order. And because we had also modelled Customer, we just needed to create a relationship between our Work Order and Customer Classes to make all the Actions we had for Customers available for Work Orders as well. This is a great example of the power of Linked Data.

We like the idea of using the Web Dynpro CHIPs technology to offer Semantic SAP to any Web Dynpro Application.

We also exported our entire Customer database as RDF and used a triplestore and SPARQL to query the exported data. This opens up amazing opportunities for discovering relationships in your data that a relational database will never show.

And then you can load it all in HANA and have the whole thing in Memory…

Finally, if after all that you’re interested in the DemoJam presentation, Graham Robinson has presented it at SAP Inside Track Sydney and other Aussie get-togethers. Thanks Robbo!

Applying Semantic Technologies at SAP (PS, you need to listen right to the end 😉 )




  1. Former Member
    Well, this makes it easier to understand why I couldn’t (and shouldn’t have tried to) explain  the Semantic Engine in under a minute, but maybe we can enlist Kaj at TechEd as a ring-in because if anyone could do it, I imagine he could.
    And finally, I wonder if whiteboards on stage at DemoJam would be a first!

    Nice work Al!


  2. Former Member
    hi Alisdair

    thanks for the blog – great stuff.

    I was at the demojam and kinda missed the huge practicality of this one. It would have been great for those of us whose synapses don’t fire quite as quickly as yours to have had this blog before the event

    ta john

