
Highlights of the HANA smart data streaming (SDS) SPS12 release:


Kafka Adapter

  • Apache Kafka is an open-source message broker
    • http://kafka.apache.org/
    • Scalable, supports partitioning
    • Already commonly used in IoT and Big Data scenarios by SAP Customers
    • Does not provide event stream processing capabilities
  • New Kafka Adapters in Smart Data Streaming
    • Provide Input and Output adapters for integration with Kafka
    • Support Avro, CSV, XML, JSON and String message contents
    • Have already been used in customer-facing POCs
  • The Kafka Adapter simplifies integration of the complex streaming analytics capabilities of SDS with the message streaming capabilities of Kafka
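As a conceptual sketch of the input side, the snippet below shows how a JSON message payload pulled from a Kafka topic might be mapped onto the columns of an SDS input stream. This is illustrative only: the function and column names are invented for the example and are not part of the SDS adapter API.

```python
import json

# Hypothetical column layout of an SDS input stream (invented for this example).
STREAM_COLUMNS = ["sensor_id", "reading", "ts"]

def decode_json_message(payload: bytes) -> tuple:
    """Decode a Kafka message payload into an ordered event tuple."""
    record = json.loads(payload.decode("utf-8"))
    # Missing fields become None, mirroring a nullable stream column.
    return tuple(record.get(col) for col in STREAM_COLUMNS)

event = decode_json_message(b'{"sensor_id": "s-42", "reading": 21.5, "ts": 1000}')
print(event)  # ('s-42', 21.5, 1000)
```

The real adapters perform the equivalent mapping for Avro, CSV, XML, JSON, and String payloads without any hand-written decoding code.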


SAP HANA Multitenant Database Container Support

  • In SPS 11 and below, installing Smart Data Streaming was not supported on a HANA system configured to use MDC
  • As of SPS12, SDS can be installed on HANA systems using MDC
    • SDS is installed to the HANA system. Streaming clusters are then defined at a Tenant level.
      • An SDS cluster can consist of one or more SDS nodes
    • If you wish to have separate SDS clusters for separate Tenants, then each SDS cluster requires a dedicated host (physical or virtual machine)
    • Management of the SDS cluster is done through the Tenant
  • High Isolation tenants are not supported in SPS12
  • A single SDS cluster can interact with multiple data sources for both input and output through adapters.
    • This includes interacting with multiple HANA databases using HANA Reference elements, HANA Output adapters and Database Input Adapters

Manage Streaming Permissions from SAP HANA Cockpit

  • New tools to manage permissions on a smart data streaming cluster from the HANA cockpit
  • Grant privileges to users and roles
  • Assign users to roles
  • Eliminates the need to use command-line tools


IQ Output Adapter Enhancements

  • Better support for “Send at Least Once” Architectures
    • “Send at Least Once” architectures implement guaranteed delivery by resending a record whenever there is any doubt that it reached the destination
      • Fairly common architecture in IoT Scenarios
  • Added support for the IGNORE CONSTRAINTS option of the IQ LOAD TABLE statement
    • Allows loading data into IQ which may contain duplicate records
    • When IGNORE CONSTRAINTS is set to on, constraint violations (such as duplicate rows) do not cause the load to fail
  • Performance Enhancement
    • Increased performance by enabling support for server-side loading.
    • This reduces network overhead but requires shared disk access.
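The resend-on-doubt behavior described above can be sketched in a few lines. This is a minimal illustration of the pattern, not the SDS or IQ adapter code; the function names and the lost-ack simulation are invented for the example. It shows why the destination ends up holding duplicates, which is exactly what IGNORE CONSTRAINTS lets an IQ load tolerate.

```python
def send_at_least_once(records, transmit, max_retries=5):
    """Resend each record until an acknowledgement is received."""
    for record in records:
        for _ in range(max_retries):
            if transmit(record):  # True means the ack arrived
                break
        else:
            raise RuntimeError(f"gave up on {record!r}")

destination = []  # stands in for the IQ table being loaded
acked = set()

def transmit_with_lost_first_ack(record):
    destination.append(record)  # the record itself always arrives...
    if record in acked:
        return True
    acked.add(record)
    return False                # ...but the first ack is lost, forcing a resend

send_at_least_once(["r1", "r2"], transmit_with_lost_first_ack)
print(destination)  # ['r1', 'r1', 'r2', 'r2'] -- duplicates from the resends
```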


Database Output Adapter Enhancements

  • Implemented reconnect capability for the Database Output Adapter
  • Two new parameters for the adapter:
    • maxReconnectAttempts – Specifies the number of reconnect attempts the adapter makes. For unlimited attempts, specify a value of -1. The default value is -1.
    • reconnectAttemptDelayMSec – Specifies the number of milliseconds the adapter waits between reconnect attempts. The default value is 1000.
  • One new Performance Tuning parameter:
    • batchDelay – Specifies the number of milliseconds the adapter waits between entries before committing the previous batch. Increasing this value improves throughput at the cost of latency. The default value is 1000.
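The reconnect semantics described above can be sketched as follows. This is illustrative code, not the adapter implementation; the parameter names are carried over from the documentation, but the function and its signature are invented for the example.

```python
import time

def reconnect(connect, max_reconnect_attempts=-1,
              reconnect_attempt_delay_msec=1000):
    """Retry connect() with the documented semantics: a limit of -1
    means retry indefinitely; otherwise stop after the given number of
    attempts, pausing the given delay between attempts."""
    attempts = 0
    while True:
        if connect():
            return True
        attempts += 1
        if 0 <= max_reconnect_attempts <= attempts:
            return False
        time.sleep(reconnect_attempt_delay_msec / 1000.0)

# Simulate a database that only accepts the third connection attempt.
state = {"calls": 0}
def succeed_on_third():
    state["calls"] += 1
    return state["calls"] >= 3

print(reconnect(succeed_on_third, max_reconnect_attempts=5,
                reconnect_attempt_delay_msec=0))  # True
```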


Streaming Web Service Output Adapter – Guaranteed Delivery & Connection Retry

  • Applies only to REST connections, not to WebSocket connections
  • Only guarantees that messages are delivered to SWS
    • SWS does not yet guarantee delivery to the destination SDS project
    • No buffering of messages at the SWS layer
  • Old behavior
    • On connection failure, the message was dropped; the adapter would retry the connection only a limited number of times when sending the next message, with no capability to retry over an extended time period
  • New behavior
    • On connection failure, the same message is retried rather than skipped
    • A position indicator identifies the next message to send
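The difference between the old and new behavior comes down to not advancing the position indicator until delivery succeeds. The sketch below illustrates that idea; it is not the SWS adapter code, and the function names and the simulated transient failure are invented for the example.

```python
def deliver_all(messages, post):
    """Deliver messages in order; on failure, retry the SAME message."""
    position = 0                      # index of the next message to send
    while position < len(messages):
        if post(messages[position]):  # delivered to SWS
            position += 1             # advance only after success
        # on failure the loop retries messages[position] unchanged
    return position

sent = []
first_try_failed = set()

def flaky_post(msg):
    if msg == "b" and msg not in first_try_failed:
        first_try_failed.add(msg)
        return False                  # transient failure on first attempt
    sent.append(msg)
    return True

deliver_all(["a", "b", "c"], flaky_post)
print(sent)  # ['a', 'b', 'c'] -- 'b' was retried, not skipped
```

Under the old behavior, the failed message would have been dropped and delivery would have resumed at the next one.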


Streaming Lite Encrypted Credentials

  • Streaming Lite has always supported the use of encrypted credentials, but it was not well documented.
  • Use the streamingencrypt utility to create a cipher key file and encrypt the password. When starting a streaming lite project with encrypted credentials, make sure the cipher key file is in the location from which you run streamingproject.
  • For full instructions, see the Encrypting Data Service Credentials topic in the SAP HANA Smart Data Streaming: Developer Guide.


