This blog deals with business agility challenges that can potentially be addressed by Kubernetes clusters, and in particular by the Kyma runtime.

Kubernetes (k8s) is a cloud operating system that schedules containerised workloads. It is open source, but is also delivered as a managed runtime offering by a number of software vendors today, including SAP.

The Kyma runtime is a k8s-based platform for extending applications with serverless functions and/or microservice workloads.

Problem statement




Enterprises have tons of legacy, on-premise software and could greatly benefit from moving that workload to the cloud.

Customers and their business user populations are often frustrated by the time and effort it takes to have their needs met.


Solution proposal


 


In order to address the business problem, we first need to sort out the IT challenges.

For instance, we could use an intelligent and dynamic workload management solution to improve IT agility, making service delivery faster and less costly.

The solution proposal is to use k8s technology and the Kyma runtime.

Good to know:

  • SAP is a major contributor to the open-source Kyma project and offers a commercial, managed Kyma runtime as part of SAP BTP.

  • For any questions regarding the Kyma runtime by SAP, please refer to SAP note 2970655 - SAP Support for Kyma Runtime.




Customer success




As mentioned above, enterprises have tons of legacy, on-premise software and could greatly benefit from moving those workloads to the cloud.

Business users often find it difficult and cumbersome to get access to data sources.

One such challenge is exploring data with data visualisation tools. This cannot happen without agile access to data sources, and it can be very cumbersome to set up in legacy IT landscapes (as one may need to procure dedicated licenses, hardware, compute power, networking resources, etc.).

In this blog I would like to explain how one could use the Kyma runtime to build a microservice-based connectivity extension template for SAP Analytics Cloud Enterprise edition.

The Kyma-based connectivity template will

  • encapsulate and orchestrate the required connectivity middleware agent, resulting in a simplified IT landscape

  • alleviate the maintenance effort, ultimately resulting in a reduced TCO.


The means to achieve this is application dockerisation and workload orchestration.

 

Demo brief


For the purpose of demonstrating the above approach, I shall implement the Live Data Connector on a Kyma cluster.

Disclaimer. This is a demo brief, not meant for productive use.

We shall need the following parts:

  • SAP BTP Kyma runtime (that includes a k8s cluster)

  • SAP Live Data Connector, Linux flavour (available as a download from SMP)

  • SAP Analytics Cloud tenant (via a trial subscription for instance)

  • BOE 4.2 SP7 onwards with either UNX or UNV universes (the stock eFashion universe will do)


 

SAP Live Data (Universe) Connector


 

SAP Live Data Universe Connector (LDC or LUC) is a piece of software that implements SAP InA connectivity between the SAC front end and the BOE backend.

In short, it is a connectivity agent that exposes SAP universes for SAC model consumption, thus allowing ad-hoc data visualisations to be created in SAC.

As such, the LDC would typically be installed in the same network segment as the BOE backend system, adding to the complexity of that landscape.

Instead, I'd like to demonstrate how to leverage a k8s cluster to run the LDC agent there.

Good to know:

  • You need a valid S-user with SMP download rights to get access to the LDC.

  • Furthermore, you will need access to a BOE platform hosting the universes (data sources) and a SAC Enterprise trial or subscription to create the data visualisations.

  • The Kyma runtime is available both commercially and on an SAP BTP trial account for test and demo (T&D) purposes.


 

Hello Kyma


A number of steps are required in order to run the LDC on Kyma instead of in the on-premise IT landscape.

These steps include building a Docker image of the LDC and publishing it to an image library (container registry) accessible to the Kyma cluster (you may look up an example of this here).

Once you have a running and well-functioning LDC Docker container, all you need to do is create one or more so-called deployment descriptor YAML files that will be fed to the k8s cluster APIs for deployment and scheduling.
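To make this more concrete, below is a minimal sketch of what such a deployment descriptor could look like. All names, the namespace, the image reference and the container port are hypothetical placeholders rather than actual LDC defaults, and a real descriptor would also carry resource requests, probes and LDC-specific configuration.

# Hypothetical LDC.yaml - a minimal deployment descriptor sketch (names, image and port are placeholders)
apiVersion: apps/v1
kind: Deployment
metadata:
  name: ldc
  namespace: ldc
  labels:
    app: ldc
spec:
  replicas: 1
  selector:
    matchLabels:
      app: ldc
  template:
    metadata:
      labels:
        app: ldc
    spec:
      imagePullSecrets:
        - name: registry-credentials          # secret granting access to the private image library
      containers:
        - name: ldc
          image: <your-registry>/ldc:latest   # the dockerised LDC image published in the previous step
          ports:
            - containerPort: 8080             # assumed LDC HTTP port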

 

Describe the desired state


The YAML files are a sort of contractual statement in which you declaratively describe the desired state of the application. It is then up to the k8s cluster, with its APIs and cluster services, to deliver on this contract.

This is depicted in the diagram below. However complex it may appear, all we need to care about is its left-hand side, with the LDC.yaml file.
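Alongside the Deployment, the same LDC.yaml (or a companion file) would typically also declare a Service, so that the cluster can route traffic to the LDC pods. Again, this is only a sketch, reusing the hypothetical names and port from above.

apiVersion: v1
kind: Service
metadata:
  name: ldc
  namespace: ldc
spec:
  selector:
    app: ldc            # matches the labels of the LDC pods from the Deployment above
  ports:
    - port: 8080        # assumed LDC HTTP port
      targetPort: 8080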

Good to know:


Per aspera ad astra (through hardships to the stars).


Here is the final result.

There is a running Kyma deployment, and we have exposed the LDC endpoint through an API Rule.

The LDC endpoint is load-balanced and reverse-proxied, and the LDC itself runs on a k8s cluster in strict isolation from the outside world.
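For illustration, such an API Rule could look roughly as follows. This is a sketch assuming the gateway.kyma-project.io/v1alpha1 APIRule API, the hypothetical service name and port used above, and a placeholder cluster domain; field names differ in newer Kyma releases, and a production setup would use a stricter access strategy than noop.

apiVersion: gateway.kyma-project.io/v1alpha1
kind: APIRule
metadata:
  name: ldc
  namespace: ldc
spec:
  gateway: kyma-gateway.kyma-system.svc.cluster.local
  service:
    name: ldc                         # the Service fronting the LDC deployment
    port: 8080
    host: ldc.<your-cluster-domain>   # becomes the FQDN used as the SAC connection host name
  rules:
    - path: /.*
      methods: ["GET", "POST"]
      accessStrategies:
        - handler: noop               # demo only; tighten this for anything beyond a demo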

This is transparent to the SAC tenant, where we use the LDC endpoint's FQDN as the SAC connection host name. The port defaults to 443.

The use of SAML SSO for the SAC connection further strengthens security and makes connectivity more seamless and well controlled.