
Produce/consume messages in KAFKA with SAP Netweaver using Java Connector – Part 3/3

This is the final part of my blog; now we are going to join the dots together.

The funny thing about doing a proof of concept on AWS is that it is like a taxi ride… every minute counts… 🙂

Let’s recap.


We now have an EC2 instance running in AWS with:

  • SAP NW backend
  • RFC destination set up
  • KAFKA set up, and we can produce and consume messages from a topic “my-kafka-topic”
  • SAP JCo server set up, with a connection established to the SAP NW backend


Now, back to the fun part.



Produce a message from SAP to KAFKA

I copied the Java code from “” and put it inside the StepByStepServer example provided by SAP.

(I will not take credit for this code.)

Compile and run the code

See code here

You need the following files:

Now this file is a combination of the SAP example “” and some code I copied from here. I didn’t spend much time making the code pretty and neat; I just did what was necessary to make it work, so please don’t judge.

What did I change in StepByStepServer.java?

    static String SERVER_NAME1 = "EXT_SERVER";

In the handleRequest method, I added the following:

            String message = function.getImportParameterList().getString("REQUTEXT");

            // Assign topicName to string variable
            String topicName = "my-kafka-topic";

            // create instance for properties to access producer configs
            Properties props = new Properties();
            // Assign localhost id
            props.put("bootstrap.servers", "localhost:9092");
            // Set acknowledgements for producer requests.
            props.put("acks", "all");
            // retries = 0 means the producer will not automatically
            // retry failed requests
            props.put("retries", 0);
            // Specify batch size (in bytes) in config
            props.put("batch.size", 16384);
            // linger.ms adds a small delay so records can be batched,
            // reducing the number of requests
            props.put("linger.ms", 1);
            // The buffer.memory controls the total amount of memory available to the
            // producer for buffering.
            props.put("buffer.memory", 33554432);
            props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            KafkaProducer<String, String> producer = new KafkaProducer<>(props);
            producer.send(new ProducerRecord<String, String>(topicName, message, message));
            function.getExportParameterList().setValue("RESPTEXT", "Message sent successfully");
            System.out.println("Message sent successfully");
            // producer.close();

What the code does is take the text passed in to the STFC_CONNECTION function module and call the Apache KAFKA producer API with it. It’s that simple.

Compile and run. Notice that I now need to put both the JCo and KAFKA libraries on the classpath.

export KAFKA_HEAP_OPTS="-Xmx512M -Xms256M"
javac -cp ~/sapjco30/sapjco3.jar:/opt/kafka/libs/* *.java


Run the JCo Server

java -cp ~/sapjco30/sapjco3.jar:/opt/kafka/libs/*:. StepByStepServer

Start KAFKA consumer

/opt/kafka/bin/ --bootstrap-server localhost:9092 --topic my-kafka-topic


Call RFC function module


Here’s a link to the YouTube video showing the result


Consume a message using the Java connector client and call RFC

Now let’s try the other way around: someone changes a transaction outside SAP and publishes it to a KAFKA topic, and SAP wants to know about it and do something with it.


The flow is:

  1. A message is produced to the KAFKA topic.
  2. The Java client (with Java Connector) consumes the message.
  3. The Java client calls an SAP RFC.
  4. The SAP RFC does something with the message.
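Steps 1–3 of the flow above boil down to a standard Kafka poll loop that hands each message to the JCo client. A rough sketch (this is my illustration, not the blog’s actual code; the group id "sap-demo" is an arbitrary choice, and it assumes the KAFKA broker from the earlier parts is running on localhost:9092 with kafka-clients on the classpath):

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class ConsumeLoopSketch {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        // Any consumer group id works for this demo
        props.put("group.id", "sap-demo");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("my-kafka-topic"));
            while (true) {
                // Block for up to 1s waiting for new records
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(1000));
                for (ConsumerRecord<String, String> record : records) {
                    // Step 3: hand the message to the JCo client,
                    // which calls the SAP RFC
                    doWork(record.value());
                }
            }
        }
    }

    static void doWork(String message) {
        // Placeholder: this is where the STFC_CONNECTION call goes
        System.out.println("Would send to SAP: " + message);
    }
}
```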


I copied the Java code from “” and put it inside the client example provided by SAP.

(I will not take credit for this code.)

Setup and run the JCo Client

See code here

You need the following files:

Now let’s explain. SapKafkaConsumeDemo is a copy of the consumer code I borrowed from here, as mentioned, combined with code from the SAP client example.

The code can already consume a message from a KAFKA topic “my-kafka-topic”, and I take that message and call function STFC_CONNECTION in SAP with the message.

The function will echo back the text showing it has successfully received it.


In the doWork method, which is called when a message is received, I’ve added the code to call function STFC_CONNECTION. What the code does should be straightforward.
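The STFC_CONNECTION call from doWork can be sketched as follows (again my illustration, not the exact code; the destination name "ABAP_AS" is an assumption and must match a *.jcoDestination properties file describing your SAP NW backend, with sapjco3.jar on the classpath):

```java
import com.sap.conn.jco.JCoDestination;
import com.sap.conn.jco.JCoDestinationManager;
import com.sap.conn.jco.JCoException;
import com.sap.conn.jco.JCoFunction;

public class StfcCallSketch {

    // Assumption: an ABAP_AS.jcoDestination file with the backend's
    // connection data exists in the working directory
    static final String DESTINATION_NAME = "ABAP_AS";

    // Called with the message consumed from "my-kafka-topic"
    public static String doWork(String message) throws JCoException {
        JCoDestination destination = JCoDestinationManager.getDestination(DESTINATION_NAME);
        JCoFunction function = destination.getRepository().getFunction("STFC_CONNECTION");
        if (function == null) {
            throw new RuntimeException("STFC_CONNECTION not found in backend");
        }
        // Pass the consumed KAFKA message as the request text ...
        function.getImportParameterList().setValue("REQUTEXT", message);
        function.execute(destination);
        // ... and return the echo, proving SAP received it
        return function.getExportParameterList().getString("ECHOTEXT");
    }
}
```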

Compile and run the Java client

javac -cp ~/sapjco30/sapjco3.jar:/opt/kafka/libs/* -Xlint:deprecation *.java
java -cp ~/sapjco30/sapjco3.jar:/opt/kafka/libs/*:. SapKafkaConsumeDemo



Produce a message to KAFKA topic

/opt/kafka/bin/ --broker-list localhost:9092 --topic my-kafka-topic





Here’s a youtube video.



In summary, integrating SAP NetWeaver with KAFKA in both directions is possible, and not too difficult either.

What we now need to explore is how to productionize this solution, how to make it HA/DR, etc. I still have some unanswered questions about how this would handle massive volume in an enterprise environment, and whether the JCo server and client should run on a separate instance.



Anyway, thanks for your time. I hope you found this interesting. Leave me some comments below.

      Ulrich Schmidt

Hi Michael,

      your blog gave me an idea on how to “productionize” this solution: SAP already has an enterprise-ready JCo Client/JCoServer: the SAP Business Connector (SAP BC). It can be downloaded for free like JCo and then installed as a permanently running daemon (Unix/Linux) or Windows Service.

      Granted, the installation is a bit more than just a plain JCo, but you get a number of benefits in return:

      • As I said, it can be installed as a Windows Service, so you get HA, failover security, maintenance, monitoring etc. for free
      • You don’t need to write the JCo client and server coding, that’s already done by the BC (and very stable as well). You customize with a few mouse clicks, what should happen upon receiving this or that function call, and in that action just implement the KAFKA coding like in this example.
      • You just customize the connection data to the SAP system (for client login as well as for the registered server) in a UI
      • It comes with a “Scheduler”, where you can set up a periodic job that polls the Kafka system and then pushes any new messages into SAP. (Again you only need to implement the action that checks Kafka. The scheduler coding and the JCo client coding you get for free.)
      • Massive volume is not a problem for the SAP BC… It’s the “Volkswagen” among the middleware tools, simple, robust, high-performance.
      • If you put all your Kafka-related coding (the "actions" I have been talking about, or "Java Services" in SAP BC language) into one package, you can save it as a zip file and other SAP BC users can simply install that zip file on their BC and use it right away.
        Perhaps add a little UI Page to that package, where they can enter the necessary logon data for their Kafka system, and they are ready to go


      It takes a few days to read the documentation and get familiar with it, but once you know it, it’s a very powerful tool.

      Michael Pang
      Blog Post Author

      Wow, thanks for that. Let me take a look!

      Ronald Schertle

      We chose a similar approach during a project in 2018.
      Essentially, we used JCo server for getting data from SAP, published it into Kafka, processed it in Spark Streaming and finally pushed it into a SAP BA system using JCo client, JCo context and the transactional COMMIT/ROLLBACK BAPIs.
The customer requested AVRO as the serialization format for data in Kafka, therefore we implemented an AVRO Serde for JCo objects.

I am not sure whether the Business Connector will reach EOL, as I found some posts discussing this.
      Instead, we implemented an own server using Akka to process requests in parallel and to include a scheduling and recovery process. Working with the JCo API got easier by wrapping it in Scala.

      You might have a look into for setting up a local Kafka instance. There is no need to install anything, because Zookeeper, Kafka brokers and needed topics will be started on the fly. We used it to implement fully automated, local end-to-end integration tests.

      From my point of view writing Kafka connect source connectors is one step further to a tight integration with Kafka. We already implemented Kafka connect type converters and tested the source connector locally using debezium. I would be happy to find a customer testing it in a real world confluent platform installation.