Michael Pang

Produce/consume messages in KAFKA with SAP Netweaver using Java Connector – Part 2/3

This is the second part of a three-part series demonstrating how to get SAP Netweaver to produce/consume messages in KAFKA.

Link to Part 1.


If you are just keen to see the results, go to Part 3.


This part covers the following:

  • Connect to your SAP server using SAPGUI
    • Install license
    • Developer Key
  • Setting up tools
    • Java Development Kit
    • Nano
  • Setting up KAFKA
  • Setting up SAP Java Connector (JCo)


Just to reiterate, I didn’t use the Windows EC2 instance that had all the developer tools. I only used the SUSE Linux instance installed with the SAP Netweaver backend.



Source code

I’ve put my code on GitHub; you might find it useful.



Setup your SAP system

You can use the Windows EC2 instance’s SAPGUI, which is already set up with a system to connect to.

If you have SAPGUI installed on your own machine instead, point the application server to the elastic IP of your EC2 instance; otherwise the address may change after every EC2 restart.


First log in

Client 001


Password: What you entered when you created your CAL instance


Install trial license

See SAP’s instructions, or search for how to get a minisap license.

In a nutshell: get your hardware key from transaction SLICENSE.

Go to the minisap license request page and generate a license for A4H, entering your hardware key.

Upload the license file in SLICENSE and you are done.


Developer key

The user DEVELOPER already has a developer key assigned, so you don’t need to request one. Just don’t create your own user and expect to be able to get a developer key for it (I found out the hard way).





Setting up tools

First you need to make sure you can SSH into your EC2 instance.


Both KAFKA and the SAP Java Connector require Java, so it makes sense to install that first.

From memory, Java 1.6 comes with the EC2 instance, but it is too old for KAFKA, so we need a newer version.

To install

Run command:

sudo zypper --non-interactive install java-1_8_0-openjdk-devel

To test:

Run Command:

java -version

Run Command:

javac -version
If they didn’t return any error, you are good to go.



This step is entirely optional. I need an editor in linux, and I like to use nano. You can use whatever you like such as vi.

To install

Run Command:


sudo rpm -i nano-2.9.6-lp151.2.3.x86_64.rpm

Download the rpm first; if the original link no longer works (I have no control over it), search for a site that hosts a nano rpm distribution for your SUSE version.


To test:

nano <any file>


To exit:

Ctrl + X



Setting up KAFKA


Run Command:

curl -O https://archive.apache.org/dist/kafka/2.4.1/kafka_2.12-2.4.1.tgz

tar -xvf kafka_2.12-2.4.1.tgz

rm -rf kafka_2.12-2.4.1.tgz

sudo ln -s ~/kafka_2.12-2.4.1 /opt/kafka


Let’s explain

The script will first go to Apache and download KAFKA. If the link in the above command does not work, then change it to the latest from Apache.

The EC2 instance downloads the KAFKA tgz file, we untar it, delete the archive, and create a symbolic link /opt/kafka pointing to the decompressed kafka directory.

That’s it… now KAFKA is installed!




We are going with the most basic configuration here: a single ZooKeeper and a single KAFKA server. It’s up to you to explore more complex setups.

Now I’m going to refer to the Apache quick start guide, since I can’t explain it any better and there’s no point in repeating it either.

In short, we are going to set up one ZooKeeper (on port 2181) and one KAFKA server (on port 9092). Remember these ports as you will see code referring to them later.

I run the following script to start the zookeeper and KAFKA server. You need to repeat this after each restart.

export KAFKA_HOME="/opt/kafka"
export KAFKA_HEAP_OPTS="-Xmx512M -Xms256M"
nohup sudo $KAFKA_HOME/bin/zookeeper-server-start.sh $KAFKA_HOME/config/zookeeper.properties > /dev/null 2>&1 &
sleep 2
nohup $KAFKA_HOME/bin/kafka-server-start.sh $KAFKA_HOME/config/server.properties > /dev/null 2>&1 &
sleep 2

Note that I set the KAFKA heap to be smaller as the EC2 instance struggled with the default memory settings.

Remember to make the script executable:

chmod +x <your-script-name>

See the code on my GitHub repository.

Create a topic:

I’m going to create a topic “my-kafka-topic”. You can call it whatever you want. You only need to do this once. If you get an error it means your zookeeper or KAFKA server is not started properly.

/opt/kafka/bin/kafka-topics.sh --create --topic my-kafka-topic --zookeeper localhost:2181 --partitions 1 --replication-factor 1


Test it

Produce a message

/opt/kafka/bin/kafka-console-producer.sh --broker-list localhost:9092 --topic my-kafka-topic


Consume the message

In another SSH session, run this:

/opt/kafka/bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic my-kafka-topic --from-beginning
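The Java code in Part 3 will talk to this same broker, so the connection settings boil down to the host/port and topic used above. As a minimal sketch, using only java.util.Properties (the actual producer class, org.apache.kafka.clients.producer.KafkaProducer, comes from the kafka-clients jar in /opt/kafka/libs and is not used here), the configuration looks like this:

```java
import java.util.Properties;

public class KafkaClientSettings {

    // Topic created in the step above.
    public static final String TOPIC = "my-kafka-topic";

    // Broker settings for the single KAFKA server started earlier.
    // The serializer class names are the standard ones shipped with
    // the kafka-clients jar in /opt/kafka/libs.
    public static Properties producerProps() {
        Properties p = new Properties();
        p.put("bootstrap.servers", "localhost:9092");
        p.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        p.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        return p;
    }

    public static void main(String[] args) {
        // In Part 3 these properties are handed to new KafkaProducer<String, String>(producerProps()).
        System.out.println("Producing to " + producerProps().getProperty("bootstrap.servers")
                + ", topic " + TOPIC);
    }
}
```

If you changed the topic name or the broker port earlier, adjust these values to match.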



Setting up SAP JCo

Make sure you get JCo 3.0 and not 3.1

I paid a hefty price trying to get version 3.1 to work, only to waste hours trying to fix an ARPCRFC fast serialization error. So I’m going to use version 3.0 here.


Where to download

JCo 3.0 is available from the SAP Support Portal (https://support.sap.com/jco). Pick the Linux x86_64 distribution of the 3.0.x release to match the EC2 instance.

Transfer and unzip the file to the EC2 instance

Download the sapjco30P_20-10005328.tgz file to your desktop first, then transfer it to the EC2 instance. You can put it somewhere public on the internet and use “wget” or “curl” in the SSH terminal to download it; I put it in an S3 bucket and used AWS CLI commands to retrieve it. Do whatever works for you.

After the file is on the server, extract it into a folder called sapjco30.


mkdir sapjco30
cd sapjco30
aws s3 cp s3://<my-s3-bucket>/sapjco30P_20-10005328.tgz .
tar -xvf sapjco30P_20-10005328.tgz

If you are not using S3 bucket for transfer, skip the s3 line.

That’s it. SAP Java Connector is ready to be used!



Now let’s get the simple JCo Server to work. The JCo server is basically the receiver of RFC calls from any NW system. You first need to register this “JCo Server” with the SAP system, kind of like a handshake, so that the NW system knows where to call.


Create an RFC Destination

In transaction SM59, create a TCP/IP destination with:

  • Connection Type: T
  • Activation Type: Registered Server Program
  • Program ID: the same program ID you will put in the jcoServer file


Test the RFC Destination

It should return an error for now… don’t worry… it will work later on.

The error means it can’t connect to the JCo server.


Configure the JCo Server

I’m going to use the stock standard JCo Server example provided by SAP. You can make this a lot more complex, scalable and performant yourself.

Create the JCo Destination file

This bit took me hours to figure out, as I couldn’t find any documentation that explains it. Perhaps it is straightforward. The standard HTML documentation that comes with the zip file shows you the code and assumes you know how to run it.

This tells the JCo Server code where our SAP system is. The JCo Server will use this to call SAP system to register a program ID.

  • EXT_SERVER.jcoServer
#A sample configuration for a standard registered server
#Replace the dummy values and put this file in the working directory.
#In an Eclipse environment this is the project root directory.
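For reference, a registered-server file in JCo 3.0 uses the jco.server.* property names shown below. The keys follow JCo’s ServerDataProvider; all values here are placeholders you must replace with your own gateway host, gateway service and program ID:

```properties
# Placeholder values - replace with your own system's details.
jco.server.gwhost=<gateway host, usually the SAP application server>
jco.server.gwserv=<gateway service, e.g. sapgw00>
jco.server.progid=<program ID, must match the RFC destination>
jco.server.repository_destination=ABAP_AS1
jco.server.connection_count=2
```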

Make sure you enter the same program ID as in the RFC destination.

The ABAP_AS1 destination is also required for the registration, as the JCo server makes a call to the SAP system.


  • ABAP_AS1.jcoDestination
#Sample destination configuration when using an application server over CPIC
#Replace the dummy values and put this file in the working directory.
#In an Eclipse environment this is the project root directory.
jco.client.ashost=<sap application server host>
jco.client.sysnr=<instance number, e.g. 00>
jco.client.client=<sap client>
jco.client.user=<sap user>
jco.client.passwd=<sap password>

Change the ashost to your server name. localhost might work; I haven’t tried.


Compile the jco server example:

Edit the StepByStepServer.java example that comes with the JCo zip file.

Set the server name to the name of your .jcoServer file (without the extension). That file in turn carries the program ID of the RFC destination.

    static String SERVER_NAME1 = "EXT_SERVER";

Comment out these lines in main, so only the first example runs:

    public static void main(String[] a)
    {
        ...
//      step2SimpleServer();
//      step3SimpleTRfcServer();
//      step4StaticRepository();
    }



javac -cp ~/sapjco30/sapjco3.jar:/opt/kafka/libs/* StepByStepServer.java

java -cp ~/sapjco30/sapjco3.jar:/opt/kafka/libs/*:. StepByStepServer




Test it

Test the RFC destination

It’s working!

Test the function module

Go to transaction SE37 and call function STFC_CONNECTION with the RFC destination.

The call is successful: it is received by the JCo Server, and the Java code responds with “Hello World”.


Finally, all the bits and bobs are set up. Now, back to why we are doing this: we want ABAP to call a Java program using the Java Connector and produce a message on KAFKA. See Part 3.

Assigned Tags

Markus Tolksdorf

      Hi Michael,

      nice blog and a nice idea. Have you reported an incident about that fast serialization issue? It should be fixed, no matter where it is. JCo 3.0 is about to phase out, so we should make sure that JCo 3.1 is working properly. Further, you can configure the system always to use classic serialization still, even if transfer is then slower.

      Best regards,

Michael Pang (Blog Post Author)

      Thanks Markus.

I got a “fast serialization APCRFC not supported” short dump error when I tried with the 3.1 library. Hours of investigation suggested I needed to update the kernel, which I didn’t want to entertain, and there was no setting in SM59 to turn it off and go back to classic.

      I didn't get it to work in the end so it would be good to know how to solve this for future reference.

Markus Tolksdorf

      Hi Michael,

If you had read the JavaDoc you would quickly have found how to turn off fast serialization, as mentioned by Stefan below, and the release notes list the prerequisites for fast serialization support; that could have spared you hours of investigation. And you can certainly turn off fast serialization in the destination configuration as well (advanced options tab).

      Best regards,

Stefan Gass

      Hi Michael,

you write that you were not able to find any documentation about the *.jcoDestination and *.jcoServer property files. If you are still looking for this, please see the JCo documentation on pages 7 & 8, which is available as a PDF file from the download page. And of course also see the JavaDoc contained in the downloaded JCo SDK archive, especially the relevant package description and class documentation.

      Kind regards,

Michael Pang (Blog Post Author)

      Hi Stefan

      You are right, I missed the PDF documentation on the download page, thanks for pointing that out.

      I was relying on documentation in the zip file, and unfortunately I didn't think of going deep into the javadocs to find something so crucial that I personally expect it should be in either the examples documentation or in the configuration section. None the less, I arrived at the outcome eventually 🙂

      I can see example jcoDestination and jcoServer files are included in the examples of version 3.1 which is a great hint for me.

      I'll read these PDF's to see how I can improve this solution.

      Thanks again for the feedback. Much appreciated.

Steve Howard

Thanks for doing this tutorial.  I have set up the instances as you mentioned, but can't log in to the Windows AMI.  I decrypted the Administrator password in the AWS console by passing in the downloaded pem file from the SAP portal.  It successfully decrypts it, but I can't log in to the Front End with the Administrator user.  I am a systems integrator who simply wants to test connectivity to Kafka for a PoC, so I don't have an S User to download the GUI to my laptop (which would be acceptable to me if I could get that to work).


      1. What user do I use to connect to the Front End VM using the RDP client?
      2. How can I get the GUI without an S User?


      Thanks again.

Michael Pang (Blog Post Author)

      Hi Steve

      1. I've only used the linux AMI, so not sure about Windows. Should be a common question in the AWS forums perhaps.
2. I've always had an S User, so I'm not sure how to get the GUI without one. Perhaps ask your colleagues who might have one?





Wendy Quaas

      Hi Michael,

After following the blog, I can get the RFC connection test to work, but the STFC test fails with this error message in the trace. There must be something else I need to set up within JCo, but I'm not sure what.

      Exception thrown [Tue Mar 02 10:00:00,516]: Exception thrown by application running in JCoServer (105) JCO_ERROR_APPLICATION_EXCEPTION: handler for STFC_CONNECTION was not installed

      Would you be able to help with this?

Michael Pang (Blog Post Author)

      Hi Wendy


      Never seen this error, but googling other people have the same problem.

• Make sure the server name is correct; follow the instructions above.
• Check your RFC setup.
• Check that the function module exists in your system and test that it works in SE37.
• Check the authorizations of the user you are connecting with.
• Communication errors are stored in transaction SM58 (table TRFCQSTATE). Look up the entries by RFC destination name or program name to check the error and failure messages.