
We have delivered two new sets of adapters with this release: the Advanced Message Queuing Protocol (AMQP) adapters and the File/Hadoop Avro adapters. We have also introduced various enhancements to existing adapters.

New AMQP Adapters

Use these input and output adapters to connect streaming analytics to an AMQP message broker so you can send and receive data in different formats:

  • AMQP CSV Input and Output adapter
  • AMQP Event XML Input and Output adapter
  • AMQP JSON Input and Output adapter
  • AMQP String Input and Output adapter

These adapters transport messages through a third-party message broker: producer applications send messages to the broker, where an exchange routes each message to the appropriate queue. These adapters also support guaranteed delivery (GD).
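To make the wire formats concrete, here is a sketch of what a single stream row might look like as a JSON payload (for the AMQP JSON adapters) and as a CSV payload (for the AMQP CSV adapters). The column names are purely illustrative, not taken from the product documentation:

```python
import csv
import io
import json

# Hypothetical stream row; column names are illustrative only.
row = {"symbol": "XYZ", "price": 101.25, "volume": 500}

# JSON wire format (AMQP JSON adapters): one JSON object per message.
json_payload = json.dumps(row)

# CSV wire format (AMQP CSV adapters): one delimited record per message.
buf = io.StringIO()
csv.writer(buf).writerow([row["symbol"], row["price"], row["volume"]])
csv_payload = buf.getvalue().strip()

print(json_payload)  # {"symbol": "XYZ", "price": 101.25, "volume": 500}
print(csv_payload)   # XYZ,101.25,500
```

Whichever format you choose, the broker itself treats the payload as opaque bytes; only the adapter on each end interprets it.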

The following transporters are part of the new AMQP adapters. They act as producers or consumers, sending messages to or receiving them from a message broker. You can also use these transporters to create your own custom adapter:

  • AMQP Input transporter – Sends messages from broker queues to the next module specified in the adapter configuration file.
  • AMQP Output transporter – Obtains data from the previously defined module specified in the configuration file and sends it to the message broker, where consumer applications can then receive it.

For more information about the AMQP adapters, check out Adapter Enhancements (New and Changed), as well as the AMQP pages in the SAP HANA Streaming Analytics: Adapters Guide.


New File/Hadoop Avro Adapters

There are two new toolkit adapters for using Avro files in your streaming projects:

  • File/Hadoop Avro Input adapter – Reads Avro files from a local hard disk or Hadoop Distributed File System (HDFS), transforms the data to streaming analytics format, and publishes it to streaming.
  • File/Hadoop Avro Output adapter – Reads rows from streaming, transforms the data to Avro format, and writes it to a local hard disk or HDFS.

These adapters use the ESP to Avro Record and Avro Record to ESP formatter modules, which have new properties that give you more options for storing and transmitting data, including changes to compression and serialization behavior. The File Output transporter module also has a new property for setting sync intervals in Avro files. For more on these properties, check out the formatter and transporter module property topics in the SAP HANA Streaming Analytics: Adapters Guide.
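Avro schemas are themselves JSON documents, so a streaming row maps naturally onto an Avro record. As a hedged illustration only (the record and field names below are hypothetical, not from the product documentation), a stream with a string, a double, and a long column might correspond to a schema like this:

```json
{
  "type": "record",
  "name": "Trade",
  "fields": [
    {"name": "symbol", "type": "string"},
    {"name": "price",  "type": "double"},
    {"name": "volume", "type": "long"}
  ]
}
```

The Output adapter serializes each row against a schema of this shape, and the Input adapter uses the schema embedded in the Avro file to convert records back into stream rows.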

For details on the new adapters, check out the File/Hadoop Avro Input and Output Adapter (New) section in Adapter Enhancements (New and Changed), as well as the File/Hadoop Avro Input and Output Adapter section of the SAP HANA Streaming Analytics: Adapters Guide.


JSON Enhancements

There are various enhancements to the File/Hadoop, Kafka, and Socket JSON managed and unmanaged input adapters. These include a new property (or attribute, as it's called in the unmanaged adapters), as well as improvements to existing ones. These enhancements all serve the same general purpose: to provide more granular options for retrieving data so you can write more specific expressions for mapping data to the receiving stream. For more details, check out the JSON adapter property topics in the SAP HANA Streaming Analytics: Adapters Guide.
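The kind of mapping these expressions enable can be sketched in a few lines: pull values out of a nested JSON document by path and assign each one to a flat stream column. This is a minimal stdlib-only illustration, not the adapters' actual expression syntax; the message shape, paths, and column names are all hypothetical:

```python
import json

# Hypothetical nested JSON message, as a File/Hadoop or Kafka JSON
# input adapter might receive it.
message = json.loads("""
{
  "order": {
    "id": 42,
    "customer": {"name": "Acme"},
    "items": [{"sku": "A-1", "qty": 3}]
  }
}
""")

def extract(doc, path):
    """Walk a dotted path (with numeric list indexes) into parsed JSON."""
    for part in path.split("."):
        doc = doc[int(part)] if part.isdigit() else doc[part]
    return doc

# Illustrative column-to-path mapping for the receiving stream.
mapping = {
    "order_id": "order.id",
    "customer": "order.customer.name",
    "first_sku": "order.items.0.sku",
}
stream_row = {col: extract(message, path) for col, path in mapping.items()}
print(stream_row)  # {'order_id': 42, 'customer': 'Acme', 'first_sku': 'A-1'}
```

More granular path options mean you can target exactly the fields you need rather than ingesting the whole document.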

There is also a new formatter available in the adapter toolkit: JSON Stream to ESP. This formatter reads a Java InputStream and converts the JSON strings it contains into AepRecord objects. Use this formatter module to build File/Hadoop JSON Input unmanaged adapters. It replaces the previously used JSON Stream to JSON String and JSON String to ESP formatters. For more details, check out JSON Stream to ESP Formatter Module Properties.
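Conceptually, a stream-to-record formatter of this kind folds two steps (split the byte stream into JSON strings, then parse each string into a record object) into one pass. A loose Python analogue of that behavior, assuming newline-delimited JSON input and plain dicts standing in for AepRecord objects:

```python
import io
import json

def json_stream_to_records(stream):
    """Yield one record (dict) per JSON document read line-by-line from
    a stream, loosely mirroring a stream-to-record formatter."""
    for line in stream:
        line = line.strip()
        if line:
            yield json.loads(line)

# Hypothetical input: newline-delimited JSON, as from a file or HDFS stream.
raw = io.StringIO('{"id": 1, "v": "a"}\n{"id": 2, "v": "b"}\n')
records = list(json_stream_to_records(raw))
print(records)  # [{'id': 1, 'v': 'a'}, {'id': 2, 'v': 'b'}]
```

Collapsing the two stages avoids materializing the intermediate JSON strings as a separate pass, which is presumably why the single formatter supersedes the earlier pair.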
