Sridhar Gupta

Meeting of Worlds – Kafka Adapter

The meeting of two worlds – the 3 Vs (volume, variety, and velocity) of big data and the back office – has brought tremendous advantages to the business world, along with technical challenges to be dealt with.

Companies use data collection systems for nearly everything, from business analytics to near-real-time operations to executive reporting. As organizations move toward integrating IoT devices into regular operations, real-time (or near-real-time) data integration with the back office is in greater demand than ever. Apache Kafka is a leader in data streaming and handles the 3 Vs very well.

Streaming data from IoT devices or other data-generating software into CRM or ERP systems with Kafka, along with many out-of-the-norm use cases where customers want something a little different or the business process needs a little more than standard usage, presents a challenge, and that gap is filled by Advantco's adapter.

Internet of Things to backend SAP system

A large advertising company uses Apache Kafka to stream data collected from advertisements in mobile apps. That data then needs to be transmitted to backend processes in the SAP ERP system for further processing and usage.

Data collection software to backend SAP system

A retailer with an online and in-store presence uses Apache Kafka to collect order details from Point of Sale (PoS) software, but then needs to transmit them to the SAP ERP system in real time or near-real time.

Time-bound cleanup process

For a specific regulatory purpose, a manufacturing organization needs to delete customer data from the Kafka cloud system after it has been retained for a specified amount of time.
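Time-bound deletion of this kind maps naturally onto Kafka's own topic-level retention settings. A hedged sketch (the values are illustrative, and the exact configuration surface depends on the Kafka deployment):

```properties
# Illustrative topic-level settings: records older than 30 days
# become eligible for deletion by the broker's log cleaner.
cleanup.policy=delete
retention.ms=2592000000
```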

The Advantco adapter is used for all these use cases and more. It enables the exchange of data between non-SAP systems and the Integration Server or the PCK by means of Kafka APIs.

Benefits of the Advantco Adapter for Kafka

The key advantage of the Advantco adapter is that it is fully integrated with the SAP Adapter Framework, which means Kafka connections can be configured, managed, and monitored using the standard SAP PI/PO and CPI tools. With SAP scalability and stability, the Advantco Kafka Adapter ensures the highest level of availability. It is also fully integrated with the SAP logging/tracing framework.

  • Integration with SAP NetWeaver®: The adapter is built on the SAP NetWeaver® Adapter Framework
  • Easy and quick installation by importing an SWCV transport (.tpz) into the Integration Repository/Enterprise Service Builder and deploying the J2EE software archive (SDA or SCA) via the J2EE SDM/JSPM.
  • Integration with SAP NetWeaver® Runtime Workbench and SAP NetWeaver® J2EE Logging/Tracing. Supporting SAP NetWeaver PI/PO 7.5, PI 7.4, & SCPI.
  • Configuration for the target topics and how to read streams of data
  • Configuration for polling
  • Adapter Specific Message Attributes
  • Content Conversion

Key features of the Advantco Adapter for Kafka

Sender Adapter

The Sender Kafka adapter must be configured as a sender channel in the Integration Builder or the PCK. In a multi-cluster environment, the Sender Kafka adapter implicitly guarantees that only one session is active among all server nodes at a time.
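The one-active-session guarantee can be pictured with a toy coordinator. This is only a sketch of the idea, not the adapter's actual implementation, and all names here are hypothetical:

```python
import threading

class SessionCoordinator:
    """Toy model: at most one server node holds the active
    consumer session at any time; others must wait their turn."""

    def __init__(self):
        self._lock = threading.Lock()
        self.active_node = None

    def try_activate(self, node_id: str) -> bool:
        # Non-blocking attempt: succeeds only if no session is active.
        if self._lock.acquire(blocking=False):
            self.active_node = node_id
            return True
        return False

    def release(self, node_id: str) -> None:
        # Only the node holding the session may release it.
        if self.active_node == node_id:
            self.active_node = None
            self._lock.release()
```

In the real adapter this coordination happens across JVMs, but the invariant is the same: a second node's activation attempt fails until the first releases its session.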

A sender Kafka adapter consumes data from Kafka brokers and then sends that data to the Integration Server or the PCK.

Receiver Adapter

The Receiver Kafka adapter must be configured as a receiver channel in the Integration Builder or the PCK. The receiver Kafka channel sends message payloads received from the Integration Server or the PCK to the Kafka server.

Content Transformation

Content transformation (conversion) parameters convert the main message payload between plain, JSON, and XML formats.
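A minimal sketch of JSON-to-XML conversion using only the Python standard library. The adapter's own conversion is configuration-driven; this only illustrates the idea for flat payloads, and the element names are taken from the example data, not from the adapter:

```python
import json
import xml.etree.ElementTree as ET

def json_to_xml(payload: str, root_tag: str = "row") -> str:
    """Convert a flat JSON object into a simple XML document."""
    data = json.loads(payload)
    root = ET.Element(root_tag)
    for key, value in data.items():
        child = ET.SubElement(root, key)
        child.text = str(value)
    return ET.tostring(root, encoding="unicode")

def xml_to_json(payload: str) -> str:
    """Convert a flat XML document back into a JSON object."""
    root = ET.fromstring(payload)
    return json.dumps({child.tag: child.text for child in root})
```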

Producer API

Producers publish messages to one or more Kafka topics by sending data to Kafka brokers. Every time a producer publishes a message, the broker appends it to a partition, specifically to the end of that partition's last segment file. Producers can also send messages to a partition of their choice.
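The routing and append semantics above can be sketched with a toy in-memory model. This is illustrative only: a real producer uses the Kafka client library, which hashes keys with murmur2 rather than the CRC32 used here:

```python
import zlib

class Topic:
    """Toy model of a topic as a set of append-only partition logs."""

    def __init__(self, num_partitions: int):
        self.partitions = [[] for _ in range(num_partitions)]

    def send(self, value, key=None, partition=None):
        # An explicitly chosen partition wins; otherwise route by key
        # hash; keyless messages fall back to partition 0 in this sketch.
        if partition is None:
            if key is not None:
                partition = zlib.crc32(key.encode()) % len(self.partitions)
            else:
                partition = 0
        self.partitions[partition].append((key, value))  # append-only
        return partition
```

Note that messages with the same key always land in the same partition, which is what gives Kafka its per-key ordering guarantee.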

Variable Substitution

Variable substitution makes elements of the message descriptor, in addition to the payload, available for enhanced processing in a variety of use cases.
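The idea can be sketched as resolving placeholders in a channel parameter (such as a target topic name) from message attributes. The placeholder syntax and attribute names below are assumptions for illustration, not the adapter's actual syntax:

```python
def substitute(pattern: str, attributes: dict) -> str:
    """Resolve {name} placeholders against message attributes."""
    try:
        return pattern.format(**attributes)
    except KeyError as missing:
        raise ValueError(f"no message attribute for placeholder {missing}")
```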

Advanced Kafka Configuration

If the basic parameters provided do not fulfill the requirements, advanced Kafka configuration can be enabled via a checkbox to specify any supported configuration for Kafka producers. The configuration keys are the same as those in the native Kafka documentation.
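As an illustration, such keys are the native Apache Kafka producer settings; a hedged sample (the values are illustrative, and the exact set exposed by the adapter may differ):

```properties
# Native Kafka producer settings, as documented by Apache Kafka
acks=all
compression.type=gzip
linger.ms=20
batch.size=65536
max.request.size=1048576
```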

Logging and Tracing Configuration

The adapter uses standard SAP Logging API to write log and trace messages to standard log and trace files.

  • Log messages help users get information about adapter execution, including processing errors
  • Trace messages allow Advantco's developers to look deeply into adapter execution, which is helpful for debugging purposes.


The Advantco adapter for Kafka enables the exchange of data streams with the Integration Server or the PCK by means of Kafka APIs.

Other benefits include:

  • Supports a variety of security options for encryption and authentication
  • Supports both free-structure payload and Avro-structure payload
  • Fully compatible with 0.11.0 and 1.0.0 brokers
  • Supported by a dedicated team with 24x7 support, including maintenance and upgrades

      chand basha shaik

      Hi Sridhar,

      my requirement is KAFKA---->PO------->S4H

      we are using sender KAFKA Adapter

      1. The KAFKA system has multiple records. Is there any option in the channel parameters to pick multiple records?

      2. We are getting invalid XML from the KAFKA system; the XML has no root node. How can this be resolved using the KAFKA channel parameters?

      XML file:

      <?xml version="1.0" encoding="UTF-8"?>
      <row entity="KAFKA_TEST" dml="INSERT">
      <data key="COMP_CODE">1010</data>
      <data key="DOCUMENT_CURR_ISO">CAD</data>
      <data key="DOC_LINE_NO">102</data>
      <data key="GL_ACCOUNT">33001</data>
      <data key="ITEMNO_ACC">300003</data>
      <data key="UNIQUE_ID">26</data>
      <row entity="KAFKA_TEST" dml="INSERT">
      <data key="COMP_CODE">1011</data>
      <data key="DOCUMENT_CURR_ISO">INR</data>
      <data key="DOC_LINE_NO">103</data>
      <data key="GL_ACCOUNT">33021</data>
      <data key="ITEMNO_ACC">30022</data>
      <data key="UNIQUE_ID">57</data>
      <row entity="KAFKA_TEST" dml="INSERT">
      <data key="COMP_CODE">1011</data>
      <data key="DOCUMENT_CURR_ISO">DUR</data>
      <data key="DOC_LINE_NO">103</data>
      <data key="GL_ACCOUNT">33021</data>
      <data key="ITEMNO_ACC">30022</data>
      <data key="UNIQUE_ID">67</data>


      chand basha

      Rajesh PS

      Nice blog Sridhar. Keep it Up!

      Sanjai Marimadaiah

      With the release of Diffusion 6.6 Preview 1, a beta version of the new Kafka adapter is now available for on-premise Diffusion.

      The Kafka adapter translates data between Diffusion topics and Apache Kafka events. We’ve designed it to make it quick and easy to integrate Kafka with Diffusion.

      Kafka is a widely used event streaming tool with high performance, solid scalability and resilience within the data center. Diffusion is efficient in providing the last mile delivery of data across unreliable networks, using delta streaming and reliable session reconnection. The new Kafka adapter acts as a bridge between these two robust systems and enables you to build a complete real-time data distribution solution.

      Two-way translation between Kafka and Diffusion

      The earlier standalone Kafka adapter for Diffusion used the Kafka Connect framework and was available via GitHub. The new adapter does not use Kafka Connect, and is instead based on Kafka Producers and Consumers.

      A version of this new Kafka adapter has been available on Diffusion Cloud since August.

      For the on-prem preview release, the adapter is more refined with additional features and improved configuration options. The adapter is included within the Diffusion server installation and can be stopped and started independently from the main server.


      Rajesh PS

      Hi Sridhar Gupta

      Nice blog indeed. Cheers to this!

      I would like to know whether the Kafka adapter fully supports SAP CPI and how flexible it is (in comparison with the SAP PO Kafka adapter)?

      Secondly, does SAP CPI support schema registry and serialization as of now, and how about the Avro and JSON conversions?

      Is it a tactical long term reliable solution to use via SAP CPI Cloud?

      Also, I am not sure about the license/subscription cost. Using the Kafka adapter should ideally not end up in capacity or feature constraints, and each option has its pros and cons?

      Looking forward to your valuable thoughts. Thanks in advance!

      Sridhar Gupta
      Blog Post Author



      Thank you for your interest in the Advantco Apache Kafka adapter.  I would like to schedule a 10-15 minute call with you to better understand your organization's needs and how you plan to utilize our Apache Kafka adapter.


      Please schedule a call with me using the link below.


      I've also included a link below to our three-page brochure, which provides an overview of Advantco, highlights the benefits of our integration solutions, and contains a complete list of our adapters.


      Please let me know if you have any questions.





      Steve Fracasso
      Licensing Manager
      Advantco International

      125 Remount Rd, Suite C1 #382

      Charlotte, NC 28203

      Direct: (919) 276-5127