
First impressions of SAP Data Hub 2.5.0

One year ago, I posted my First impressions of SAP Data Hub 1.4.0, and since then the product has developed significantly.

To start with, if you installed SAP Data Hub on SUSE CaaS Platform, please ensure that you implemented SAP Note 2776522 – SAP Data Hub 2: Specific Configurations for Installation on SUSE CaaS Platform.

Then, if you want to access the SAP Data Hub Launchpad from a machine that is not part of your Kubernetes cluster's network, you must create a NodePort service to expose SAP Data Hub System Management. This service type opens the same port on every node.
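
As a minimal sketch, assuming the System Management service is called vsystem (check your installation for the actual service name and namespace), the NodePort service can be created like this:

```sh
# Create a NodePort service from the existing System Management service.
# "vsystem" and <datahub-namespace> are assumptions; verify them first with:
#   kubectl get services -n <datahub-namespace>
kubectl expose service vsystem --type NodePort \
  --name vsystem-nodeport -n <datahub-namespace>

# Show the node port that Kubernetes assigned (30000-32767 by default):
kubectl get service vsystem-nodeport -n <datahub-namespace>
```

The Launchpad is then reachable via any node's IP address on the assigned port.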

The same applies if you want to expose SAP Vora Transaction Coordinator and SAP HANA Wire externally.

The first improvement over SAP Data Hub 2.4.1 becomes obvious when I recreate my scenario Send your Raspberry Pi sensor data to SAP Vora via Apache Kafka managed by the SAP Data Hub. I no longer have to enter the Kafka connection details into the Kafka Consumer operator, but can instead use a respective Connection (which, by the way, does not yet pass the Test Connection in SAP Data Hub 2.5.0, but still works perfectly fine in both the Kafka Consumer and the Kafka Producer operator).
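
To illustrate, a Kafka Consumer operator that references such a Connection carries roughly this configuration (a hypothetical sketch: the connection ID KAFKA_PI is made up, and the exact property names may vary between releases):

```json
{
  "connection": {
    "configurationType": "Configuration Manager",
    "connectionID": "KAFKA_PI"
  }
}
```

The broker host and port then live only in Connection Management, so changing them does not require touching every graph.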

Similarly, when recreating my scenario Send your Raspberry Pi sensor data to SAP Vora via Eclipse Paho MQTT managed by the SAP Data Hub, I no longer have to enter the defaultAvroSchema by hand, but can instead use the new configuration user interface.
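
For reference, a defaultAvroSchema for simple Raspberry Pi sensor readings could look like the following (the record and field names are assumptions from my sensor scenario, not a schema shipped with the product):

```json
{
  "type": "record",
  "name": "SensorData",
  "fields": [
    {"name": "timestamp", "type": "string"},
    {"name": "temperature", "type": "float"},
    {"name": "humidity", "type": "float"}
  ]
}
```

The new configuration user interface assembles this kind of record definition without the manual typing.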

Finally, when I recreate my scenario Leverage your SAP Data Hub with SAP Agile Data Preparation, I now get similar data preparation functionality within SAP Data Hub 2.5.0 itself (not functionally equivalent, but similar).

There are many more New Features in SAP Data Hub 2.5, but these were my first impressions when starting to work with it.

      5 Comments
      Bartosz Jarkowski

      Thanks Frank!

      I have also tried SAP Data Hub 2.5, but in my case I only struggled with various problems. The operators that I created in DH 2.4 stopped working - probably because the underlying Python libraries have changed. Some of the operators also have additional dependencies that I'm unable to trace.

      Very often I'm unable to run graphs, and they throw errors that are not meaningful. I even had problems with the HANA Wire protocol.

      I thought about re-installing it, but I'll wait for the next release 🙂

      Thorsten Schneider

      Hi Bartosz,

      I am sorry to hear that. For the Write operator – as Frank says – we are checking the problem at the moment.

      For the other problems you mention:

      • Error handling of pipelines / workflows needs to be improved, I agree. We need to more consistently get the root cause through to the UI instead of showing a very “generic” message in the UI and “hiding” the details in the pod logs. Having said that, the pod logs typically include quite a lot of useful information (see the sketch after this list).
      • About the operators / HANA Wire: it is hard to say from here what is or was wrong there. If the problem is reproducible, please provide some more details (here or over mail) and/or open a ticket. We will help to find a solution.
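
      As a rough example (the namespace and pod names below are placeholders and will differ per installation), the details can be pulled from the pod logs like this:

      ```sh
      # List the pods in the Data Hub namespace to find the one running the failed graph:
      kubectl get pods -n <datahub-namespace>

      # Then read that pod's log for the actual root cause:
      kubectl logs <pipeline-pod-name> -n <datahub-namespace>
      ```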

      Cheers

      Thorsten

      Bartosz Jarkowski

      Hi Thorsten Schneider,

      thanks for commenting! Could you please follow me so that we can take this offline?

      Cheers,

      Bartosz

      Henrique Pinto

      Bartosz Jarkowski, are these Python-based operators using a custom Dockerfile? If so, you need to add the tornado 5.0.2 Python package to your custom Dockerfiles (pip/pip3 install tornado==5.0.2) and add the tornado tag with version 5.0.2 to your Dockerfile tags.
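
      A minimal sketch of the relevant Dockerfile lines (the base image is an assumption; keep whatever base image your custom Dockerfile already uses):

      ```dockerfile
      # Base image is a placeholder; keep your existing one.
      FROM python:3.6-slim
      # Pin tornado to the version the Modeler's Python subengine expects.
      RUN pip3 install tornado==5.0.2
      ```

      In addition, the tornado tag with value 5.0.2 needs to be declared on the Dockerfile in the Modeler, so that graphs requiring that tag are scheduled onto this image.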

      Frank Schuler (Blog Post Author)

      Hello Bartosz,

      I understand where you are coming from, and it seems the Write File operator still does not work with a Microsoft Azure Data Lake either. I am following this up with SAP and will let you know.

      Best regards

      Frank