
The following steps will explain how to create the very first Java project to call OData services using the SAP S/4HANA Cloud SDK.

Note: This post is part of a series. For a complete overview visit the SAP S/4HANA Cloud SDK Overview.

Goal of this blog post

In this tutorial, we will do the following:

  1. Enhance the HelloWorld project stub to call an existing OData service.
  2. Deploy the project on
    1. SAP Cloud Platform Neo
    2. SAP Cloud Platform based on Cloud Foundry
  3. Write an Integration Test

If you want to follow this tutorial, we highly recommend checking out Step 1 (Setup) and then Step 2 (HelloWorld on SCP Neo) or Step 3 (HelloWorld on SCP CloudFoundry), depending on your choice of platform. You will not need any additional software besides the setup explained in the first part of the series, as the server will run on your local machine. If you would like to know more about communication management and identity & access management artifacts in S/4HANA, please follow the Deep Dive on this topic. We also highly recommend reading about the OData Virtual Data Model, which provides much easier access to OData endpoints from the SAP API Business Hub.

Note: This tutorial requires access to an SAP ERP system or, as a fallback, any other OData V2 service.


In order to execute this tutorial successfully, we assume a working and reachable SAP S/4HANA on-premise or S/4HANA Cloud system. You may substitute the presented cost center service with any other API published on the SAP API Business Hub.

If you do not have an S/4HANA system at hand, you may use a public service such as the Northwind OData Service as a fallback solution.

Please note that depending on the platform (Neo or Cloud Foundry) you are using, the configuration for the respective S/4HANA system may differ. In the following, we list the methods by which you can access your system.

  • S/4HANA on-premise
    • SAP Cloud Platform, Neo: SAP Cloud Connector required, with an HTTP Destination
    • SAP Cloud Platform, Cloud Foundry: SAP Cloud Platform Connectivity and Cloud Connector
  • S/4HANA Cloud
    • SAP Cloud Platform, Neo: Direct connection with BASIC Auth (technical user), or direct connection with SAMLOAuthBearer (principal propagation with a business user)
    • SAP Cloud Platform, Cloud Foundry: Direct connection with BASIC Auth (technical user, see below)

Note that your application code is not dependent on this. Using the S/4HANA Cloud SDK, you can write your code once and it is capable of dealing with all different authentication and connectivity options.


Write the CostCenterServlet

The SAP S/4HANA Cloud SDK provides simple and convenient ways to access your ERP systems out of the box. In this example we will implement an endpoint that performs an OData query to SAP S/4HANA in order to retrieve a list of cost centers from our ERP system.

To get started, open your previously created Hello World project (in our case, it is called firstapp) and create a new file called CostCenterServlet.java in the following location:

./application/src/main/java/com/sap/cloud/sdk/tutorial/CostCenterServlet.java

package com.sap.cloud.sdk.tutorial;

import com.google.gson.Gson;
import com.sap.cloud.sdk.cloudplatform.logging.CloudLoggerFactory;
import com.sap.cloud.sdk.odatav2.connectivity.ODataException;
import com.sap.cloud.sdk.odatav2.connectivity.ODataQueryBuilder;
import com.sap.cloud.sdk.s4hana.connectivity.ErpEndpoint;
import org.slf4j.Logger;

import javax.servlet.ServletException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import java.io.IOException;
import java.util.List;

@WebServlet("/costcenters")
public class CostCenterServlet extends HttpServlet {

    private static final long serialVersionUID = 1L;
    private static final Logger logger = CloudLoggerFactory.getLogger(CostCenterServlet.class);

    @Override
    protected void doGet(final HttpServletRequest request, final HttpServletResponse response)
            throws ServletException, IOException
    {
        try {
            final ErpEndpoint endpoint = new ErpEndpoint();
            final List<CostCenterDetails> costCenters = ODataQueryBuilder
                    .withEntity("/sap/opu/odata/sap/FCO_PI_COST_CENTER", "CostCenterCollection")
                    .select("CostCenterID", "Status", "CompanyCode", "Category", "CostCenterDescription")
                    .build()
                    .execute(endpoint)
                    .asList(CostCenterDetails.class);

            response.setContentType("application/json");
            response.getWriter().write(new Gson().toJson(costCenters));

        } catch (final ODataException e) {
            logger.error(e.getMessage(), e);
            response.setStatus(HttpServletResponse.SC_INTERNAL_SERVER_ERROR);
            response.getWriter().write(e.getMessage());
        }
    }
}
The code is fairly simple. In the servlet's GET method, an ErpEndpoint is initialized for the default destination, which we will define later during deployment. With the help of the SDK's ODataQueryBuilder, a query is prepared, built, and executed against the endpoint. The query result is wrapped into a navigable List of CostCenterDetails. Finally, the servlet response is declared as JSON content and the result is serialized as such.

In addition, we require a new class called CostCenterDetails, which is used to read the OData query response in a type-safe manner. Create this new class in the following location:

./application/src/main/java/com/sap/cloud/sdk/tutorial/CostCenterDetails.java

package com.sap.cloud.sdk.tutorial;

import com.sap.cloud.sdk.result.ElementName;
import lombok.Data;

@Data
public class CostCenterDetails
{
    @ElementName( "CostCenterID" )
    private String costCenterID;
    @ElementName( "CompanyCode" )
    private String companyCode;
    @ElementName( "Status" )
    private String status;
    @ElementName( "Category" )
    private String category;
    @ElementName( "CostCenterDescription" )
    private String costCenterDescription;
}
  • The SDK @ElementName annotation maps OData values to their corresponding object fields.
  • The Lombok @Data annotation automatically generates the boilerplate code for us:
    • getter and setter methods
    • constructor (for @NonNull fields)
    • hashCode(), equals(...) and toString()
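To make concrete what @Data saves you, here is a hand-written equivalent for a single field. This is plain Java for illustration only; it is not part of the project, and the class name is hypothetical:

```java
import java.util.Objects;

// Hand-written equivalent of what Lombok's @Data generates for one field.
class CostCenterDetailsPlain {
    private String costCenterID;

    // getter and setter, as generated by @Data
    public String getCostCenterID() { return costCenterID; }
    public void setCostCenterID(String costCenterID) { this.costCenterID = costCenterID; }

    // equals(...) comparing all fields
    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof CostCenterDetailsPlain)) return false;
        return Objects.equals(costCenterID, ((CostCenterDetailsPlain) o).costCenterID);
    }

    // hashCode() over all fields
    @Override
    public int hashCode() { return Objects.hash(costCenterID); }

    // toString() listing all fields
    @Override
    public String toString() { return "CostCenterDetailsPlain(costCenterID=" + costCenterID + ")"; }
}
```

In the actual CostCenterDetails class, Lombok generates all of this at compile time.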

You need to add Lombok to your provided dependencies. Add the following inside the dependencies section of the application/pom.xml file as an additional dependency:
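The dependency snippet itself was lost from this copy of the post; a minimal sketch of the Lombok dependency in provided scope (version omitted; pick one matching your setup) looks like:

```xml
<dependency>
    <groupId>org.projectlombok</groupId>
    <artifactId>lombok</artifactId>
    <scope>provided</scope>
</dependency>
```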


In case you are working with OData endpoints from the SAP API Business Hub, we recommend trying out the OData Virtual Data Model. It already has all required classes and features implemented and ready for you to use.


Deploying the project

Depending on your chosen archetype and SCP setup, you can deploy the project on either SCP Neo or SCP CloudFoundry. If you face any problems connecting to the OData service, please see the troubleshooting section at the end.


On SAP Cloud Platform Neo

Now you can deploy your application to your local SCP using the following Maven goals:

cd /path/to/firstapp
mvn clean install
mvn scp:clean scp:push -pl application -Derp.url=https://URL

Replace URL with the URL of your SAP ERP system (host and, if necessary, port).
Note: the -pl argument defines the Maven module in which the goals will be executed.

Maven will then prompt you for your username and password that is going to be used to connect to SAP S/4HANA. Alternatively, you can also set these values as command parameters: -Derp.username=USER -Derp.password=PASSWORD

If you now deploy the project with the Maven command and visit http://localhost:8080/costcenters, you should see a list of cost centers retrieved from the ERP system. Note: Please log in with test / test.


On SAP Cloud Platform Cloud Foundry

Before you can deploy the new version to Cloud Foundry, you need to supply the destination of your SAP S/4HANA system.

Connecting to SAP S/4HANA from SAP Cloud Platform CloudFoundry

In order to perform queries against your ERP system, you have to inform Cloud Foundry about the location of your ERP endpoint. To do this, you need to provide an environment variable with the destination configuration. Currently, there are two ways of accomplishing this.

Setting destination as environment variable using CF CLI

 cf set-env firstapp destinations '[{name: "ErpQueryEndpoint", url: "https://URL", username: "USER", password: "PASSWORD"}]'

Please change the values URL, USER and PASSWORD accordingly. Depending on your command line interface (for example, on Windows), you may need to use double quotes instead of single quotes and escape the double quotes:

cf set-env firstapp destinations "[{name: \"ErpQueryEndpoint\", url: \"https://URL\", username: \"USER\", password: \"PASSWORD\"}]"

Note: You can also add more ERP endpoints to this JSON representation, following the same schema. However, please note that “ErpQueryEndpoint” corresponds to the default destination used to create our ErpEndpoint.
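As an illustration, a destinations value with a second endpoint could look like the following; the second entry's name, URL, and credentials are purely hypothetical placeholders:

```
[{name: "ErpQueryEndpoint", url: "https://URL", username: "USER", password: "PASSWORD"},
 {name: "SecondErpEndpoint", url: "https://URL2", username: "USER2", password: "PASSWORD2"}]
```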

Setting destination as user-provided variables using the Cockpit

Alternatively, you can set the destination variable directly using the cockpit. To do so, find your application in the SCP Cockpit and provide the variable as a user-provided variable:

Deploy to Cloud Foundry

In order to have sufficient memory available for the application, verify that you have adapted the memory and metaspace configuration in your manifest.yml file as explained in Step 3.

Now you can deploy your application to Cloud Foundry using the Cloud Foundry CLI (command line interface):

cd /path/to/firstapp
mvn clean install
cf push

If you change the destinations afterwards with one of the two methods outlined above, you need to at least restart (or restage) your application so that the environment setting becomes effective:

cf restart firstapp

Run on a Local Server

As mentioned in Step 3 of this tutorial series, you can run the project also on a local TomEE server. Here, you need to supply the destinations as an environment variable on your local machine (replace set with the corresponding commands to define environment variables on your command shell).

set destinations='[{name: "ErpQueryEndpoint", url: "https://URL", username: "USER", password: "PASSWORD"}]'

Again, supply the correct values for your S/4HANA system.

Afterwards, you can again use mvn tomee:run (within the application folder) to start the server. Visit http://localhost:8080/costcenters to see your new feature in action.


Integration test to check CostCenterServlet

To construct an extensible integration test for the newly created CostCenterServlet, the following items will be prepared:

  • Adjustment: Maven pom file
  • New: test class
  • New: JSON Schema for servlet response validation

Adjustment: Maven pom file

First, let’s adjust the Maven pom file of the integration-tests sub-module by adding a dependency for JSON schema validation:
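The original dependency snippet is not preserved in this copy; assuming the REST-assured JSON schema validator module (which matches the io.restassured.module.jsv import used in the test class), the dependency would look roughly like this (version omitted):

```xml
<dependency>
    <groupId>io.rest-assured</groupId>
    <artifactId>json-schema-validator</artifactId>
    <scope>test</scope>
</dependency>
```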



New: test class

Navigate to the integration-tests project and create a new class called CostCenterServiceTest:



package com.sap.cloud.sdk.tutorial;

import com.jayway.restassured.RestAssured;
import com.jayway.restassured.http.ContentType;
import io.restassured.module.jsv.JsonSchemaValidator;
import org.jboss.arquillian.container.test.api.Deployment;
import org.jboss.arquillian.junit.Arquillian;
import org.jboss.arquillian.test.api.ArquillianResource;
import org.jboss.shrinkwrap.api.spec.WebArchive;
import org.junit.Before;
import org.junit.BeforeClass;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.slf4j.Logger;

import java.net.URI;
import java.net.URISyntaxException;
import java.net.URL;

import com.sap.cloud.sdk.cloudplatform.logging.CloudLoggerFactory;
import com.sap.cloud.sdk.testutil.MockDestination;
import com.sap.cloud.sdk.testutil.MockUtil;

import static com.jayway.restassured.RestAssured.given;

@RunWith( Arquillian.class )
public class CostCenterServiceTest
{
    private static final MockUtil mockUtil = new MockUtil();
    private static final Logger logger = CloudLoggerFactory.getLogger(CostCenterServiceTest.class);

    @ArquillianResource
    private URL baseUrl;

    @Deployment
    public static WebArchive createDeployment()
    {
        // TestUtil is the deployment helper created in the earlier tutorial steps.
        return TestUtil.createDeployment(CostCenterServlet.class);
    }

    @BeforeClass
    public static void beforeClass() throws URISyntaxException
    {
        // Register the test ERP destination. Note: the exact mocking call depends on
        // your SDK version; this reconstruction assumes the testutil MockDestination
        // builder for the default "ErpQueryEndpoint" destination.
        mockUtil.mockDefaults();
        mockUtil.mockDestination(
                MockDestination.builder("ErpQueryEndpoint", new URI("https://URL")).build());
    }

    @Before
    public void before()
    {
        RestAssured.baseURI = baseUrl.toExternalForm();
    }

    @Test
    public void testService()
    {
        // JSON schema validation from resource definition
        final JsonSchemaValidator jsonValidator =
                JsonSchemaValidator.matchesJsonSchemaInClasspath("costcenters-schema.json");

        // HTTP GET response OK, JSON header and valid schema
        given()
                .get("/costcenters")
                .then()
                .assertThat()
                .statusCode(200)
                .contentType(ContentType.JSON)
                .body(jsonValidator);
    }
}
Please change the value URL accordingly.

What you see here is the usage of RestAssured on a JSON service backend. The HTTP GET request is run against the local route /costcenters, and the result is validated with multiple assertions:

  • HTTP response status code: 200 (OK)
  • HTTP ContentType: application/json
  • HTTP body is valid JSON code, checked with a costcenters-schema.json definition

New: JSON Schema for servlet response validation

Inside the integration-tests project, create a new resource file called costcenters-schema.json:

{
  "$schema": "",
  "title": "Simple CostCenter List",
  "type": "array",
  "items": {
    "title": "CostCenter Item",
    "type": "object",
    "javaType": "",
    "required": ["id", "companyCode"]
  }
}
As you can see, the properties id and companyCode are marked as required for every entry of the expected cost center list. The JSON validator would fail the test if any item was missing a required value.

That’s it! You can now start all tests with the default Maven command:

mvn test -Derp.username=USER -Derp.password=PASSWORD

Please change the values USER and PASSWORD accordingly.

If you want to run the tests without Maven, please remember to also include these parameters.





Hint: Remember ERP username and password

If you do not want to pass the ERP username and password every time you execute tests, or you want to execute tests on a CI server where others could see the password in log output, you can also provide credentials in a credentials.yml file that the SDK understands.

To do this, create the following credentials.yml file in a safe location (e.g., where you store your SSH keys in ~/.ssh), i.e., not in the source code repository.


- alias: "ERP_TEST_SYSTEM"
  username: "user"
  password: "pass"

Afterwards you may pass the credentials file using its absolute path:

mvn test -Dtest.credentials=/secure/local/path/credentials.yml



Troubleshooting

In case you are trying to connect to an OData service endpoint on a server without a verifiable SSL certificate, you might see the following error message due to an untrustworthy signature:

Failed to execute GET https://<URL>/$metadata
  • To manually override the chain of trust, you can set a special flag on the destination configuration. To avoid further issues with untrusted certificates in your local Neo deployment environment, set the TrustAll flag to TRUE in your destinations configuration file ./config_master/service.destinations/destinations/ErpQueryEndpoint
  • If you are running into the same problem in a CloudFoundry deployment environment, please adapt the destinations environment variable to additionally include the properties map:
    [{name: "ErpQueryEndpoint", url: "https://URL", username: "USER", password: "PASSWORD", properties: [{key: "TrustAll", value: "true"}]}]​


If you are still facing problems when connecting to the OData service, try the following to get more insights into what is happening and what can be logged:

  • Add a logger implementation to the test artifact’s dependencies in order to get more detailed log output during tests: expand the dependencies section of integration-tests/pom.xml with:
  • Supply a custom error handler to the OData query execution. For example, use the following class ./application/src/main/java/com/sap/cloud/sdk/tutorial/ODataV2SimpleErrorHandler.java:

    package com.sap.cloud.sdk.tutorial;

    import com.sap.cloud.sdk.cloudplatform.logging.CloudLoggerFactory;
    import com.sap.cloud.sdk.odatav2.connectivity.ErrorResultHandler;
    import com.sap.cloud.sdk.odatav2.connectivity.ODataException;
    import org.slf4j.Logger;

    public class ODataV2SimpleErrorHandler implements ErrorResultHandler<ODataException> {
        private static final Logger logger = CloudLoggerFactory.getLogger(ODataV2SimpleErrorHandler.class);

        public ODataException createError(String content, Object origin, int httpStatusCode) {
            String msg = String.format(
                    "OData V2 Simple Error Handler received backend OData V2 service response with status %s, full response was %s",
                    httpStatusCode, content);
            logger.error(msg);
            ODataException e = new ODataException();
            return e;
        }
    }
    and in CostCenterServlet, add the following to the ODataQueryBuilder chain:

        .errorHandler(new ODataV2SimpleErrorHandler())
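The logger dependency mentioned in the first bullet above was lost from this copy; one common choice (an assumption here, as any SLF4J binding works) is slf4j-simple in test scope:

```xml
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-simple</artifactId>
    <scope>test</scope>
</dependency>
```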

  1. Julian Frank

    Hey Alexander,

    first of all: great example on how to use the SDK.

    One question from my side: is there anything more to do to establish the ERP connection when deploying to CloudFoundry? I always receive an UnknownHostException that the name or service is not known.

    Thanks and best regards


    1. Philipp Herzig

      Hi Julian,

      this depends a little bit on the setup you intend to use. We have added a small prerequisite section that explains which configurations are currently available. Which system are you trying to talk to? Is it an S/4HANA Cloud system, or an S/4HANA on-premise system? It also matters whether you connect from Neo or CloudFoundry (but see for yourself in the updated prerequisite section).

      Don’t hesitate to drop us additional questions.

      Best regards



  2. Yangyang Chen

    Hey Alexander,


    Very good tutorial, it gives us a lot of valuable information. We have a similar scenario in which we need to call an OData service provided by S/4HANA Cloud from our application deployed on SCP CF. I have two questions related to this scenario,


    1. How can we adapt this tutorial to support multi-tenancy?

    2. If we want to query the OData service directly from the UI, are we able to do that? We are thinking about directly consuming the Destination service provided by SCP CF, so that users can configure tenant-level destinations through the Cockpit, but it looks like the Destination service is still in Beta on SCP CF.


    Could you kindly share us some ideas? Thank you in advance.





    1. Sander Wozniak

      Hi Yangyang,

      regarding the secure setup of a multi-tenant application on CF, please have a look at our tutorial on this topic:

      You are right, at the moment, the destination service on CF is still in Beta. As long as this is the case, you could rely on the Neo environment to consume a multi-tenant destination. In addition, please consider the option to also expose an OData V4 service using the SAP Cloud Platform SDK for service development. A tutorial on this topic can be found here:

      Best regards


  3. Bastian Schiele

    Hi Alexander,

    thank you for the great blog.

    We had some problems with calling the OData Service. We wanted to call the cost centers service from our local S/4 HANA and got the following error:

    Failed to execute GET https://<URL>/$metadata

    As the service-side certificate is not signed by any root CA, it is not trusted (if you call the service in a browser, you get a warning).

    To avoid any issues with untrusted certificates, you may set the property TrustAll=TRUE on the created destination in the following location of your project:


    With that configuration it works fine for us in Neo.

    Best regards,




    1. Alexander Duemont Post author

      Hey Bastian,

      thank you very much for your feedback. Finding out about the TrustAll flag was a good catch! I changed the blog post to also include your solution, see Troubleshooting.

      Best regards,


  4. Sankeerth Narsina

    Hi Alexander,

    Thanks for such a great blog.  I tried the steps mentioned here – deploying the app locally and accessing the data at localhost:8080. I am able to see the expected JSON data. But when I am trying to access the endpoint to display the JSON data after deploying the app to Neo environment, it gives me following error.

    HTTP Status 500 - Destination "ErpQueryEndpoint" not found.

    Could you please help me fix this issue?




    1. Philipp Herzig

      Hi Sankeerth,

      you need to configure the ErpQueryEndpoint destination in Neo using the destination tab of your application. It has to look similar to what Tri has posted below.

      Best regards


      1. Sankeerth Narsina

        Thanks for the response. I see the following result on the webpage:

        Failed to read metadata for "TimeEntryCollection".

        I also added the destination along with the property TrustAll as TRUE. But it didn’t help.

        I am facing this issue only in neo environment but not in cloud foundry. Can you help me with this?

        1. Alexander Duemont Post author

          Hi Sankeerth,

          In case a metadata file is either too big (thus taking long to load) or non-existent, please consider disabling the lookup altogether:

            ODataQueryBuilder
                .withEntity("/some/path/to/endpoint", "SomeEntity")
                .withoutMetadata()
                .select("Some", "Fields")

          By using the “withoutMetadata()” modifier, the meta information about data types will not be loaded. As a consequence, some special (de)serialization of unusual attribute types may not work as expected.

          Best regards


    1. Philipp Herzig

      Hi Tri,

      as written in the blog, please try setting the TrustAll property to TRUE. In the destination service this can be done via additional properties you see in your screenshot.

      Best regards


          1. Tri Minh Le

            Hi Philipp,

            Thanks for your effort.

            This is error when i check connection.

            Even though I can save it successfully, the result is

            Failed to execute GET$

            I’ve checked the log:

            Caused by: connect timed out (local port 34229 to address (, remote port 443 to address (

            Have you tried this successfully on Neo, Philipp?

            Thanks in advance.


