
Build Your First Scenario with SAP Cloud Integration Kafka Adapter

Introduction

SAP has recently released the Kafka Adapter in both the Cloud Foundry and Neo environments. This blog describes, step by step, how to build your first simple demo using Cloud Integration as both producer and consumer for Kafka, and how to run the scenario end to end.

Some important concepts of Kafka and of the Cloud Integration Kafka Adapter configuration are described in Cloud Integration – What You Need to Know About the Kafka Adapter.

The Cloud Integration Kafka Adapter currently has the limitation that Cloud Connector support is not available. Therefore the Kafka instance used in this article is a Confluent Cloud trial.

 

Getting a Confluent Cloud Trial

1. Visit https://confluent.cloud/. Register and start the free trial (Basic version).

You can sign up for Confluent Cloud for free. Confluent Cloud provides a simple, scalable, resilient, and secure event streaming platform for us to easily test the Kafka scenario.

2. Create cluster

A Basic cluster is sufficient for our demo. A USD 200 credit is applied to each of your first three monthly bills, so you won't be charged for the test.

(Screenshots: creating the cluster in Confluent Cloud)

3. Create Topic

The next essential step is to create a topic. It can be created with the default configuration, or you can customize the configuration according to your requirements.

4. Create Kafka API Key

Then you need to create an API key for Cloud Integration access.

Copy the Key and Secret here. We will need them when creating Kafka credential in Cloud Integration.
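(Optional) If you want to verify the key and secret outside of Cloud Integration first, the same values work with a plain Kafka client. The following is a minimal Groovy sketch, assuming the standard Apache kafka-clients library; the bootstrap server, API key, and secret are placeholders for your own values. It simply lists the topics in the cluster, which confirms both the endpoint and the credentials.

// Minimal connectivity check with the Kafka AdminClient (placeholder values)
@Grab('org.apache.kafka:kafka-clients:3.6.0')
import org.apache.kafka.clients.admin.AdminClient

def props = new Properties()
props.put('bootstrap.servers', '<cluster-id>.<region>.confluent.cloud:9092')  // from the Confluent cluster settings
props.put('security.protocol', 'SASL_SSL')
props.put('sasl.mechanism', 'PLAIN')
props.put('sasl.jaas.config',
    'org.apache.kafka.common.security.plain.PlainLoginModule required ' +
    'username="<API_KEY>" password="<API_SECRET>";')

def admin = AdminClient.create(props)
try {
    // Should print the topic created in step 3
    println admin.listTopics().names().get()
} finally {
    admin.close()
}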

 

Design the iFlow in Cloud Integration as Producer for Kafka

1. Set up the Kafka Receiver Adapter

a. Copy the host from the Confluent cluster settings

b. Authentication

SASL with the PLAIN mechanism is the easiest to set up in this scenario. We need to save the Confluent API credential in Cloud Integration for access.

To create the credential, use the API key and secret from the step above as user and password.

 

2. Set up the HTTP Sender Adapter to trigger the process

Configure the iFlow to be a producer for the topic you created in Confluent. Configure the parameters as required.
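For reference, the sketch below shows what the Kafka Receiver Adapter is doing for us conceptually: producing a record to the topic over SASL_SSL with the PLAIN mechanism. It is a plain-client illustration (library version, bootstrap server, credentials, topic name, and payload are placeholders), not the adapter's actual implementation, but it can also be handy for sending test messages from outside Cloud Integration.

// Minimal sketch: a plain Kafka producer sending one record to the topic (placeholder values)
@Grab('org.apache.kafka:kafka-clients:3.6.0')
import org.apache.kafka.clients.producer.KafkaProducer
import org.apache.kafka.clients.producer.ProducerRecord

def props = new Properties()
props.put('bootstrap.servers', '<cluster-id>.<region>.confluent.cloud:9092')
props.put('security.protocol', 'SASL_SSL')
props.put('sasl.mechanism', 'PLAIN')
props.put('sasl.jaas.config',
    'org.apache.kafka.common.security.plain.PlainLoginModule required ' +
    'username="<API_KEY>" password="<API_SECRET>";')
props.put('key.serializer', 'org.apache.kafka.common.serialization.StringSerializer')
props.put('value.serializer', 'org.apache.kafka.common.serialization.StringSerializer')

def producer = new KafkaProducer<String, String>(props)
try {
    // Send one test message to the topic created earlier; get() waits for the broker acknowledgement
    producer.send(new ProducerRecord('first-topic', 'test-key', '{"message":"hello from producer"}')).get()
} finally {
    producer.close()
}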

3. Deploy the iFlow

 

Design the iFlow in Cloud Integration as Consumer (group) for Kafka

1. Set up Kafka Sender Adapter

The host and authentication settings are the same as the receiver settings.

Set the adapter to be a consumer of the topic you created. Configure the other parameters as required.
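Likewise, the sketch below shows the plain-client equivalent of the Kafka Sender Adapter: a consumer that joins a consumer group and polls the topic. The group id, topic name, and connection values are placeholders, and the loop is deliberately short so the script terminates.

// Minimal sketch: a plain Kafka consumer polling the topic as part of a consumer group (placeholder values)
@Grab('org.apache.kafka:kafka-clients:3.6.0')
import org.apache.kafka.clients.consumer.KafkaConsumer
import java.time.Duration

def props = new Properties()
props.put('bootstrap.servers', '<cluster-id>.<region>.confluent.cloud:9092')
props.put('security.protocol', 'SASL_SSL')
props.put('sasl.mechanism', 'PLAIN')
props.put('sasl.jaas.config',
    'org.apache.kafka.common.security.plain.PlainLoginModule required ' +
    'username="<API_KEY>" password="<API_SECRET>";')
props.put('group.id', 'ci-demo-consumer-group')
props.put('auto.offset.reset', 'earliest')
props.put('key.deserializer', 'org.apache.kafka.common.serialization.StringDeserializer')
props.put('value.deserializer', 'org.apache.kafka.common.serialization.StringDeserializer')

def consumer = new KafkaConsumer<String, String>(props)
try {
    consumer.subscribe(['first-topic'])
    // Poll a few times and print whatever arrives
    5.times {
        consumer.poll(Duration.ofSeconds(2)).each { record ->
            println "offset=${record.offset()} key=${record.key()} value=${record.value()}"
        }
    }
} finally {
    consumer.close()
}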

2. Set up a Script step to log the message

import com.sap.gateway.ip.core.customdev.util.Message

def Message processData(Message message) {
    // Read the consumed Kafka record as a string
    def body = message.getBody(java.lang.String) as String

    // Attach the payload to the message processing log so it is visible in the monitor
    def messageLog = messageLogFactory.getMessageLog(message)
    if (messageLog != null) {
        messageLog.setStringProperty("Logging#1", "Printing Payload As Attachment")
        messageLog.addAttachmentAsString("ResponsePayload:", body, "text/plain")
    }
    return message
}

 

3. You can also customize the message and send it to any downstream process; the sketch below shows a small example.
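As a small illustration of point 3, a script step could reshape the consumed record before passing it on. The sketch below assumes the payload is JSON and wraps it with a timestamp; the header name is made up for the example.

import com.sap.gateway.ip.core.customdev.util.Message
import groovy.json.JsonOutput
import groovy.json.JsonSlurper

def Message processData(Message message) {
    // Assumes the consumed Kafka record is JSON; wrap it with some metadata
    def body = message.getBody(java.lang.String) as String
    def payload = new JsonSlurper().parseText(body)

    def enriched = [receivedAt: System.currentTimeMillis(), data: payload]
    message.setBody(JsonOutput.toJson(enriched))

    // Hypothetical header a downstream receiver might evaluate
    message.setHeader('DemoProcessed', 'true')
    return message
}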

4. Deploy the iFlow

 

Test the iFlow

1. Send an HTTP request from Postman (or from a small script; see the sketch after this list)

2. See message in Confluent

3. Check iFlow from Cloud Integration
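If you prefer a script over Postman for step 1, a minimal Groovy sketch like the one below can trigger the iFlow as well; the endpoint URL and the basic-authentication credentials are placeholders for your own iFlow endpoint and user.

// Minimal sketch: trigger the producer iFlow over HTTP (endpoint and credentials are placeholders)
def url = new URL('https://<your-ci-runtime-host>/http/<your-endpoint>')
def conn = url.openConnection()
conn.requestMethod = 'POST'
conn.doOutput = true
conn.setRequestProperty('Content-Type', 'application/json')
conn.setRequestProperty('Authorization',
    'Basic ' + '<user>:<password>'.bytes.encodeBase64().toString())

// Send a small JSON test payload and print the HTTP status
conn.outputStream.withWriter { it << '{"message":"hello from Cloud Integration"}' }
println "Response code: ${conn.responseCode}"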

With the above steps you will be able to set up a very simple end-to-end scenario that uses Kafka on Confluent Cloud and the Kafka Adapter in Cloud Integration. There are many settings available in the adapter that you can explore further. This blog is just to help you get started.

You are welcome to provide feedback on the content in the comment section.

Thanks!

Comments

Souvik Sinha

Thanks for sharing the detailed blog. I'd like to try this.

       

      Regards,

      Souvik

Werner Dähn

      Where do you define the Avro schema registry? Almost all data in Kafka is using Avro as a payload. This is supported, isn't it?

Chris Qi (Blog Post Author)

      Hi Werner,

       

To my understanding, the schema needs to be created in Kafka when you create the topic (I haven't tried that out in Confluent Cloud though). The payload produced or consumed via CPI, which is JSON-like content, will be transferred as plain JSON or as any other data format.

Hope this makes sense to you.

       

      Best regards,

      Chris

Werner Dähn

Thanks for the answer, would love to see that, Chris Qi.

Are you saying the Avro payload is converted to JSON back and forth automatically? How does it even know whether the payload is Avro, binary data, JSON, Protobuf, or text data? In Kafka everything is possible, so you need to configure something in CPI. And without a schema registry known to CPI, it does not even know the schema to use for the Avro deserialization.
