A new version of the SAP S/4HANA Cloud SDK Java libraries is available as of today. You can update your dependencies to version 2.5.0 and consume the new version from Maven Central. We have also released version v11 of our out-of-the-box continuous delivery offering, which consists of a ready-made Jenkins server and a complete delivery toolkit.

In this blog post, we will walk you through the highlights of these releases. For a complete overview, visit our release notes for the Java libraries and for the continuous delivery toolkit. The release notes also include the change log of all our releases so far.
At the end of the article, you will find a set of instructions on how to update to the new versions.



Register for the upcoming openSAP course on the SAP S/4HANA Cloud SDK. For more details, read the announcement blog post.

SAP TechEd 2018 Las Vegas took place last week, and Barcelona is approaching soon! As in Las Vegas before and in Bangalore afterwards, the SAP S/4HANA Cloud SDK is well represented with numerous sessions, including hands-on workshops and CodeJams.


Check out the agenda for SDK-related sessions in Barcelona. In particular, make sure to register for hands-on sessions now, as capacity is limited.

Beta of SAP S/4HANA Cloud SDK for JavaScript


In case you missed it, we launched the SAP S/4HANA Cloud SDK for JavaScript (beta) at TechEd Las Vegas. It brings the capabilities known from the Java Virtual Data Model (VDM) for easy consumption of OData services to projects written in JavaScript or TypeScript. See it in action in our tutorial.

Java Libraries: Release Highlights


Eased consumption of SAP Leonardo services


SAP Cloud Platform offers powerful services that give access to advanced capabilities in the areas of machine learning and blockchain.

Version 2.1.0 already introduced the experimental class ScpCfService, which gives easy access to such services on Cloud Foundry and allows you to quickly add the required authentication headers.

Version 2.5.0 improves on this by introducing dedicated (also experimental) consumption classes for specific services. Specifically, you can now easily access SAP Leonardo Machine Learning Foundation APIs and SAP Cloud Platform Blockchain APIs.

To conveniently access machine learning services, include the new module com.sap.cloud.s4hana.services:scp-machine-learning as a dependency in your pom.xml file. Afterwards, you can leverage the new class LeonardoMlService from the package com.sap.cloud.sdk.services.scp.machinelearning as shown below. To run your application on Cloud Foundry, bind the corresponding service for machine learning from the service marketplace to your application. Take note of the service type that you have selected.
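
As a sketch, the corresponding dependency declaration in your pom.xml could look as follows (shown without a version element on the assumption that the version is managed via the sdk-bom described at the end of this post; add an explicit version otherwise):
<dependency>
    <groupId>com.sap.cloud.s4hana.services</groupId>
    <artifactId>scp-machine-learning</artifactId>
</dependency>
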
LeonardoMlService mlService = LeonardoMlFoundation.create(
        CloudFoundryLeonardoMlServiceType.TRIAL_BETA, // select the service type from the options provided by the enum
        LeonardoMlServiceType.TRANSLATION); // select the API that you want to access from the options provided by the enum

HttpPost request = new HttpPost();
// prepare request headers and entity in body (not shown)
mlService.invoke(request, response -> {
    // process response to create some result (not shown)
    return result;
});

To access Blockchain services, include the module com.sap.cloud.s4hana.services:scp-blockchain as a dependency and use the new classes FabricService (package com.sap.cloud.sdk.services.scp.blockchain.hyperledgerfabric) or MultichainService (package com.sap.cloud.sdk.services.scp.blockchain.multichain), depending on the blockchain service you are using.
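
Analogously, the Blockchain module could be declared like this (again assuming the version is managed via the sdk-bom):
<dependency>
    <groupId>com.sap.cloud.s4hana.services</groupId>
    <artifactId>scp-blockchain</artifactId>
</dependency>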

Java VDM: OData media streams


OData entities can be linked to media streams. For example, in SAP S/4HANA Cloud, the attachment service gives access to the content of attachments.

The Java VDM now supports retrieving OData media streams that are accessible via the /$value path of an entity. Entities such as AttachmentContent that expose media streams have a new method called fetchAsStream().
The method is used similarly to a fetch method for a navigation property:
List<AttachmentContent> attachmentContents = new DefaultAttachmentService()
        .getAllOriginals("...", "...", "...")
        .execute();
// find the right attachment (not shown) and access the attachment
AttachmentContent attachmentContent = attachmentContents.get(...);
InputStream inputStream = attachmentContent.fetchAsStream();

The method returns a java.io.InputStream that represents the media content. Note that the entity must first have been retrieved from the service, and input streams are not cached.
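
As a minimal sketch of how the returned stream could be consumed, using only JDK classes (how you process the bytes is up to your application):
try (InputStream inputStream = attachmentContent.fetchAsStream();
        ByteArrayOutputStream buffer = new ByteArrayOutputStream()) {
    // copy the media stream into memory chunk by chunk
    byte[] chunk = new byte[8192];
    int bytesRead;
    while ((bytesRead = inputStream.read(chunk)) != -1) {
        buffer.write(chunk, 0, bytesRead);
    }
    byte[] content = buffer.toByteArray();
    // use the content, for example write it to a file or return it in an HTTP response
}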

Further improvements


Roles and scopes have always been represented as Authorizations. Authorization is now a proper Java class, instead of an interface. This simplifies the transition of code related to authorization checks across the Neo and Cloud Foundry environments of SAP Cloud Platform.

Version 2.5.0 fixes an issue where queries could not be executed on Java VDM entities that were loaded through function imports or by fetching navigation properties.

Projects previously generated from our archetypes did not correctly apply surefire.forkCount when running tests. This has now been fixed in the archetypes, so newly generated projects correctly pass forkCount instead of the misnamed forkNumber to the Surefire plugin. Existing projects need to apply this change manually in the pom.xml of the unit and integration test modules.
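
As a sketch, the relevant part of the Surefire plugin configuration in those modules could look like this after the change (the surrounding configuration and the property name surefire.forkCount reflect typical generated projects and may differ in yours):
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <configuration>
        <!-- previously passed as the misnamed forkNumber, which Surefire does not evaluate -->
        <forkCount>${surefire.forkCount}</forkCount>
    </configuration>
</plugin>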

The exception classes TokenRequestDeniedException and TokenRequestFailedException are now public, as they are exposed as part of the API. You can now catch these exceptions explicitly.

Several further improvements are listed in the full release notes.

Continuous Delivery Toolkit: Release Highlights


Usage analytics


Improving any product is easier with good data about its usage. When deciding which features to develop, where to improve, and where to focus our efforts, it helps to know how users actually work with our continuous delivery toolkit. That is why, in addition to our established feedback channels such as Stack Overflow and GitHub, we have started to analyze the usage of the continuous delivery toolkit by collecting anonymous, non-sensitive telemetry data about pipeline runs. We want to be as transparent as possible about what is transmitted and analyzed; you can always find the up-to-date explanation with all details in this document.
Collecting this data helps us better understand how we can improve our offering. For example, we will be able to see which features are used most often and which parts of the pipeline usually take longest, allowing us to improve where it is most needed. Of course, the knowledge gained from this data is always combined with the qualitative feedback and requirements we learn of through the other channels. Data collection and transmission is enabled by default; we hope for your support for this endeavour, but also offer an opt-out, as described here.

For each run of the pipeline, we send the versions of the pipeline and related artifacts, the steps that are executed, non-sensitive configuration parameters, the runtime, and the outcome. For the complete overview, visit the documentation. We also collect information to correlate multiple runs of the pipeline for the same project (identified by Maven group and artifact identifier) and for the same Jenkins job (identified by job URL). Such information is always hashed before sending. The hash is computed with a private salt that is not known to SAP, in order to make the hash value irreversible. The pipeline only transmits these irreversible hash values instead of potentially identifiable information. The data is stored on premises of SAP SE only.

Further improvements


We have published our reasoning for the structure of projects created by the archetypes of the SAP S/4HANA Cloud SDK in an architecture decision record. In case you ever wondered why the archetypes generate multi-module Maven projects, this document outlines the design decision behind how SAP S/4HANA Cloud SDK projects are set up.
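
For orientation, a project generated from the archetypes typically contains modules and folders along the following lines (names as generated by the 2.x archetypes; the exact layout may vary by archetype and version):
project-root/
    pom.xml             parent POM that imports the sdk-bom
    application/        application code and the deployable artifact
    unit-tests/         fast, isolated unit tests
    integration-tests/  tests that exercise external services, e.g. OData calls
    cx-server/          scripts to manage the continuous delivery server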

You can find further fixes and improvements in the complete release notes.

How to Update


Java libraries


To update the version of the SAP S/4HANA Cloud SDK Java libraries used in an existing project, proceed as follows:

  • Open the pom.xml file in the root folder of your project.

  • Locate the dependency management section and therein the sdk-bom dependency.

  • Update the version of that dependency to 2.5.0.


With this, you are already done, thanks to the "bill of materials" (BOM) approach. Your dependency should look like this:
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>com.sap.cloud.s4hana</groupId>
            <artifactId>sdk-bom</artifactId>
            <version>2.5.0</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
    <!-- possibly further managed dependencies ... -->
</dependencyManagement>

You can now recompile your project (be aware of the compatibility notes, though) and leverage the new features of the SAP S/4HANA Cloud SDK in version 2.5.0.

Of course, you can also generate a new project that uses version 2.5.0 from the start by running the Maven archetypes for Neo or Cloud Foundry with -DarchetypeVersion=2.5.0 (or RELEASE).
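
For example, a new Cloud Foundry project could be generated with a command along the following lines (the archetype coordinates shown here are the ones commonly used with SDK 2.x and are stated as an assumption; choose the archetype matching your target runtime, e.g. scp-neo-javaee7 for Neo):
mvn archetype:generate \
    -DarchetypeGroupId=com.sap.cloud.s4hana.archetypes \
    -DarchetypeArtifactId=scp-cf-tomee \
    -DarchetypeVersion=2.5.0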

Continuous Delivery Toolkit


If you are using the pipeline with a fixed version (as recommended since v7), update the continuous delivery toolkit with the following command, which you run on the server hosting the cx-server:
./cx-server update image