This blog post is part of a series of posts, describing an integration project that connects the oee.ai manufacturing intelligence platform with the production planning capabilities of an on-premise SAP S/4HANA system.

The project was realized by students from the FH Aachen, University of Applied Sciences, with great support from project partners at oee.ai and SAP. You can check out the main article describing the project here.

Transferring Production Orders and sending back Confirmations


In this blog post, I will give an overview of how to:

  • Transfer production orders from an on-premise SAP S/4HANA system to an external manufacturing intelligence platform

  • Send a confirmation back from the manufacturing intelligence platform to SAP S/4HANA once the production order is finished


Note: We are using an S/4HANA 1809 system. The APIs used in the steps below might not work for all other versions of S/4HANA. More information on which APIs are available for your particular version can be found in the SAP API Business Hub.

Cloud Connector


Since our S/4HANA system is (most likely) running behind a firewall, we need a Cloud Connector in order to connect to the S/4HANA APIs from inside our Integration Flows running on the BTP.

The Cloud Connector can be installed directly on an operating system or inside a Docker container; a detailed description of the setup process can be found here.

Note: When setting up the “Destination Configuration” (see step 2 in the tutorial linked above), remember the name you choose for the destination. This is used to identify the specific Cloud Connector we want to access from inside the Integration Flow later on. In our case, we chose “s4hana”, as it’s easy to remember.

Fetching Production Orders from S/4HANA


[Image: the complete Integration Flow for transferring production orders from S/4HANA to oee.ai]

Shown above is the complete Integration Flow used to transfer production orders from our local S/4HANA system to the external manufacturing intelligence platform “oee.ai”. As you can see, the Flow is quite complex, so in this blog post I will focus on the actual pulling of production orders from the S/4HANA system (see the steps inside the red box above). For context, here’s a quick rundown of what happens in the Integration Flow before the production orders are pulled from the S/4HANA system:

  1. We add authorization and request a list of all production orders previously transferred to the oee.ai system

  2. As the response from the oee.ai system is in JSON format, we need to convert it into XML to use it in the next step

  • We create an order ID filter based on the aforementioned response, to make sure that we only pull production orders that haven’t been transferred already (see the sketch after this list)


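To illustrate step 3: inside the Integration Flow this is handled by a dedicated step, but the underlying idea is simply to turn the list of already-transferred order IDs into an OData filter that excludes them. Here is a minimal Python sketch of that idea; the field name ManufacturingOrder and the example IDs are assumptions for illustration, not the actual oee.ai response format:

# Illustration only: build an OData filter that skips orders already known to oee.ai.
already_transferred = ["1000021", "1000022"]  # order IDs parsed from the oee.ai response

def build_exclusion_filter(known_ids):
    # Exclude every order ID that was already transferred.
    clauses = [f"ManufacturingOrder ne '{oid}'" for oid in known_ids]
    return "$filter=(" + " and ".join(clauses) + ")" if clauses else ""

print(build_exclusion_filter(already_transferred))
# -> $filter=(ManufacturingOrder ne '1000021' and ManufacturingOrder ne '1000022')
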
For more detail on the Integration Flows and the integration in general, check out our main post about the project.

Before we start actually fetching production orders from our S/4HANA system, it also makes sense to think about other use-case-specific filters. Maybe you only want orders from a certain production plant or machine to be transferred. In our case, we are using an OData filter to single out production orders containing a specified material.

As there are multiple user-dependent variables (like the S/4HANA client number) that we need to set in the process, it makes sense to set up “Exchange Properties” for individual configuration first. These properties act like global variables and can be used throughout the Integration Flow.

Exchange Properties can be accessed like this:

${property.Name}, replacing Name with a specific property name.


[Image: creating a Content Modifier via the “Message Transformers” menu]

To set up Exchange Properties, first create a “Content Modifier” (found under “Message Transformers” in the main navigation bar, see the yellow highlight above). In the “Exchange Property” tab, you can then “Add” (blue button on the right) properties.

In our case, we need the following:

  • An S/4HANA client ID (sap_client), used to identify which client is making the request. Choose your specific S/4HANA client ID as the “Source Value”

  • The address of your local S/4HANA system (sap_address), used as the base URL to which we append the specific API path later on. Choose your S/4HANA URL as the “Source Value”. (The URL can be found by opening transaction “/iwfnd/maint_service” in your S/4HANA system, choosing a service, and clicking the “Call Browser” button in the “ICF Nodes” menu at the bottom of the screen. A pop-up window will then show you the URL of your specific S/4HANA system.)

  • Properties to use as an OData filter; in this case we filter by material (sap_material). Any other valid OData filter would be possible here, for example filtering by production order ID or production plant. In the case of material, choose your specific S/4HANA material ID as the “Source Value”. We are using “OBCP1000”, as that’s the material we have set up in our S/4HANA system


Note: The S/4HANA system we used for this project wasn’t protected by a firewall. If your local S/4HANA system is running behind a firewall, you might need to set up an RFC connection through your Cloud Connector instead of accessing the API directly via the URL. For more information on how to set up an RFC connection, see the second part of this blog post.

Next, we create another “Content Modifier” for adding the filter. This time, we add a “Message Header” containing our property sap_material.

[Image: Content Modifier with the “odataQuery” message header]
We need to set a name (in our case we chose “odataQuery”) and select “Expression” as the “Source Type”. As you can see highlighted in yellow above, our message header has the “Source Value” $filter=(Material eq, followed by the material property.

If you are filtering for a different property, change the Material eq part accordingly. It’s also possible to filter by multiple properties at once. For more information on OData filters, check out this tutorial. In our case, the full “Source Value” looks like this:

$filter=(Material eq '${property.sap_material}')
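For example, a combined filter on material and production plant could look like this; sap_plant would be an additional Exchange Property you set up yourself, and ProductionPlant is an example field name (check the API documentation for your version):

$filter=(Material eq '${property.sap_material}' and ProductionPlant eq '${property.sap_plant}')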

Lastly, we will set up our actual query by creating a “Request Reply” (found under “Call” -> “External Call”) and a “Receiver” (found under “Participants”; we named it S/4HANA to signal which system we’re accessing). We then link the two with a “Connector” and choose “OData” as the “Adapter Type”.


[Image: OData adapter “Connection” settings]

As you can see above, the following values are needed for the “Connection”:

  • Address: “${property.sap_address}/API_PRODUCTION_ORDERS” (using the address property we set up in the beginning)

  • Proxy Type: “Internet”

  • Authentication: “Basic” (meaning that we use a username and password combination to authenticate. More on Basic Auth in an SAP context here)

  • Credential Name: “s4hana_basic” (the name of our Basic Auth credentials stored in the BTP Credential Store. This is out of scope for this blog post, so check out the documentation for the Credential Store here)


For the “Processing”, we also need the following (a sketch of the equivalent raw request follows this list):

  • Operation Details: “Query (GET)” (using the REST method “GET”, in order to pull / fetch data)

  • Resource Path: “A_ProductionOrder” (this specifies the path to which our filter is applied, here we apply it to the production orders)

  • Query Options: “${header.odataQuery}” (the filter we configured in the last part)

  • Custom Query Options: “sap-client=${property.sap_client}” (our preconfigured client ID property)

  • Timeout (in min): “1”

  • Custom Query Parameters: “sap-client=${property.sap_client}” (again using our client ID property)
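To sanity-check the query outside of the Integration Flow, you can reproduce roughly the same request with any HTTP client. Here is a minimal Python sketch using the example values from above; the host name and credentials are placeholders, and the printed field names come from the A_ProductionOrder entity (verify them against your API version):

import requests

# Placeholder values: replace with your own S/4HANA host, client, material and credentials.
base_url = "https://my-s4hana.example.com:44300/sap/opu/odata/sap/API_PRODUCTION_ORDERS"
params = {
    "$filter": "Material eq 'OBCP1000'",   # same filter as the odataQuery header
    "sap-client": "100",                   # same idea as the sap_client property
    "$format": "json",
}
response = requests.get(
    base_url + "/A_ProductionOrder",       # the Resource Path from the adapter settings
    params=params,
    auth=("USERNAME", "PASSWORD"),         # Basic Auth, like the s4hana_basic credentials
    timeout=60,
)
response.raise_for_status()
for order in response.json()["d"]["results"]:   # OData v2 JSON envelope
    print(order["ManufacturingOrder"], order["Material"])
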


The query is now fully set up and ready to fetch all production orders corresponding to the filter we set. To see the next steps in our integration, you can once again check out the whole Integration Flow above, as well as our main article. Here’s a quick overview of what we do after obtaining the S/4HANA production orders:

  1. Apply a mapping to the production orders and convert them from XML to JSON, so they match the format we need for the oee.ai system. This step has to be repeated for every production order

  2. Once again add authorization before creating the production order, the product (in case it doesn’t exist yet), and the corresponding “line item” (an oee.ai-specific data item) using POST APIs. Further mappings and conversions are required to format the data correctly


These steps may vary, depending on your external manufacturing intelligence platform’s data model and specific use case; a small sketch of the first step follows below.
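Purely to illustrate step 1, here is a minimal Python sketch of such a mapping. The input is a simplified A_ProductionOrder entry; the target JSON structure is invented for this example and does not reflect the actual oee.ai data model:

import json
import xml.etree.ElementTree as ET

# Simplified example of a single production order as returned by the OData API.
order_xml = """
<A_ProductionOrder>
  <ManufacturingOrder>0000001234</ManufacturingOrder>
  <Material>OBCP1000</Material>
  <TotalQuantity>500</TotalQuantity>
</A_ProductionOrder>
"""

root = ET.fromstring(order_xml)
# Hypothetical target structure; the real oee.ai payload will look different.
payload = {
    "order_id": root.findtext("ManufacturingOrder"),
    "material": root.findtext("Material"),
    "target_quantity": float(root.findtext("TotalQuantity")),
}
print(json.dumps(payload, indent=2))
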

 

Sending Production Order Confirmations to S/4HANA


[Image: the Integration Flow for sending production order confirmations to S/4HANA]


Moving on to sending production order confirmations: we once again set up a whole Integration Flow, using multiple BTP features as well as data from a custom MQTT listener. To give you a general idea of this Flow, here are the most important steps:

  1. Whenever a production order changeover happens inside the production plant, an MQTT message is sent to the MQTT listener and the Integration Flow is triggered. For more information on this part of the project, check out this post (a minimal listener sketch follows this list).

  2. We modify the MQTT message to extract the data we need from it. The message includes the new production order that has just been started

  3. The new production order is saved in a database

  4. Once the new order is marked as running, the previously running order is marked as finished and a production order confirmation for the old order is sent to the S/4HANA system

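The listener itself is a separate custom component (see the post linked in step 1). Purely to illustrate the idea behind step 1, here is a minimal Python sketch using the paho-mqtt client; the broker address, topic name and payload structure are assumptions, not the actual project setup:

import json
import paho.mqtt.client as mqtt

BROKER = "mqtt.example.com"            # placeholder broker address
TOPIC = "plant/line1/changeover"       # placeholder topic for changeover events

def on_message(client, userdata, msg):
    # Assumed payload: JSON describing the production order that has just been started.
    event = json.loads(msg.payload)
    print("New order started:", event.get("order_id"))
    # ...here the event would be forwarded to the Integration Flow / database...

client = mqtt.Client()                 # paho-mqtt 1.x style; 2.x needs a CallbackAPIVersion argument
client.on_message = on_message
client.connect(BROKER, 1883)
client.subscribe(TOPIC)
client.loop_forever()
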

While reading from S/4HANA using an OData API is pretty straightforward, writing is a little more complicated, as we need to use a BAPI (basically an SAP-specific API).

The setup in our Integration Flow is similar to an OData API call: first we use a “Content Modifier” to add all “Exchange Properties” we need for the request, before configuring the connection.


As you can see, we need a production order ID (sap_PO_ID), as well as the quantity that was produced (sap_PO_Quantity), for the BAPI call. We obtain this information by setting up an XPath expression pointing to the corresponding part of the production order (which we get from our database, as mentioned in the overview above).

We can use the same “Content Modifier” to add the actual Confirmation in XML form. To do that, navigate to the “Message Body”, choose Type: “Expression” and add the following XML as a body:
<?xml version="1.0" encoding="UTF-8"?>
<ns1:BAPI_PRODORDCONF_CREATE_HDR xmlns:ns1="urn:sap-com:document:sap:rfc:functions">
  <ATHDRLEVELS>
    <ITEM>
      <ORDERID>00000${property.sap_PO_ID}</ORDERID>
      <YIELD>${property.sap_PO_Quantity}</YIELD>
      <FIN_CONF>X</FIN_CONF>
    </ITEM>
  </ATHDRLEVELS>
</ns1:BAPI_PRODORDCONF_CREATE_HDR>

 

NOTE: The <ORDERID> must consist of 12 characters in total, in order to be accepted by the BAPI. In our case, the production order IDs in our S/4HANA system showed up as 7-digit numbers, as the 5 leading zeros were filtered out by the UI. This is the reason the <ORDERID> value above has 5 zeros added before the ID property. You might need a different workaround for this field, depending on the format your specific order IDs are in.
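If your order IDs vary in length, a more general approach than hard-coding the zeros is to pad the ID to 12 characters in a script step. A minimal sketch of the idea, shown in Python purely for illustration (in Cloud Integration this would typically be done in a script step or mapping):

def pad_order_id(order_id, width=12):
    # BAPI_PRODORDCONF_CREATE_HDR expects the order ID with leading zeros, 12 characters in total.
    return order_id.strip().rjust(width, "0")

print(pad_order_id("1000021"))  # -> "000001000021"
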

For <YIELD>, we use our Quantity property.

The field <FIN_CONF> has three possible values:

  • “X”: final confirmation

  • “ ” (empty field): partial confirmation

  • “1”: auto


In this case we chose “X” to mark the production order as done.

To set up the BAPI call, we again create a “Request Reply”, as well as a “Receiver” (S/4HANA) and link the two, this time choosing “RFC” as the “Adapter Type”.


As the destination name, we need to enter the name we chose when setting up the Cloud Connector, in our case “s4hana”.

And that’s it! The confirmation is sent to your local S/4HANA system and the production order should show up as finished.

Conclusion


The S/4HANA APIs and BAPIs, when combined with features from the BTP, are a really powerful tool for integrating production planning with an external manufacturing intelligence platform. Once running, the link between the two systems works flawlessly and can save a lot of time, as up-to-date production orders are automatically available on both platforms.

The setup, however, can be quite challenging. In particular, adding older components like BAPIs to the integration can cause a multitude of problems, and the existing documentation isn’t always perfect.

I hope this post was helpful to those of you working on a similar project. If you have any questions, feel free to leave a comment!