architectSAP
Active Contributor
For more than 3 years, I have been sending my Raspberry Pi sensor data to SAP Vora via Apache Kafka, managed by the SAP Data Hub.

With the availability of the SAP Data Intelligence trial edition 3.1, it was time to upgrade. I chose to deploy on Microsoft Azure:


My Raspberry Pi Zero W now gets its temperature and further readings from a Bosch Sensortec BME680 sensor:


Leveraging Python:



#!/usr/bin/env python
import datetime
import logging

import bme680
from json import dumps
from kafka import KafkaProducer
from kafka.errors import KafkaError

log = logging.getLogger(__name__)

# Initialise the BME680 and take a fresh reading before building the payload
sensor = bme680.BME680()
sensor.get_sensor_data()

json_body = [
    {
        "millis": str(datetime.datetime.now()),
        "temperature": sensor.data.temperature,
        "pressure": sensor.data.pressure,
        "humidity": sensor.data.humidity
    }
]

# Serialise the payload as JSON and send it to the bme680 topic
producer = KafkaProducer(bootstrap_servers=['your.kafka.server:6667'],
                         value_serializer=lambda x: dumps(x).encode('utf-8'))
future = producer.send('bme680', json_body)
try:
    record_metadata = future.get(timeout=10)
except KafkaError:
    log.exception("Sending sensor data to Kafka failed")
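
To check that the measurements actually arrive on the topic, a minimal consumer along these lines can be run on any machine that reaches the broker; the topic name and broker address are the ones from the producer above, everything else is illustrative:

#!/usr/bin/env python
# Minimal verification sketch using kafka-python: read back what the producer
# wrote to the bme680 topic; settings other than topic and broker are assumptions
from json import loads
from kafka import KafkaConsumer

consumer = KafkaConsumer('bme680',
                         bootstrap_servers=['your.kafka.server:6667'],
                         auto_offset_reset='earliest',
                         consumer_timeout_ms=10000,
                         value_deserializer=lambda x: loads(x.decode('utf-8')))

# Print everything produced so far, then stop after 10 seconds without new messages
for message in consumer:
    print(message.value)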

The destination is SAP HANA Cloud:


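As a rough sketch of what the target side could look like, the hdbcli Python driver can create a column table matching the JSON payload fields; host, credentials and the table name BME680 are placeholders, not the actual objects used in my graph:

#!/usr/bin/env python
# Sketch only: create a HANA Cloud column table matching the payload fields
# (address, user, password and table name are placeholder assumptions)
from hdbcli import dbapi

connection = dbapi.connect(address='your.hana.cloud.host', port=443,
                           user='YOUR_USER', password='YOUR_PASSWORD',
                           encrypt=True)
cursor = connection.cursor()
cursor.execute("""
    CREATE COLUMN TABLE BME680 (
        MILLIS      TIMESTAMP,
        TEMPERATURE DOUBLE,
        PRESSURE    DOUBLE,
        HUMIDITY    DOUBLE
    )
""")
cursor.close()
connection.close()
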
The graph remained basically the same as when I restored my NEO Internet of Things Service scenarios with Apache Kafka and the SAP Data Hub:


Running at 1.38M temperature measurements:


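A row count against the same (assumed) table is a quick way to arrive at a figure like this:

# Sketch only: count the rows in the assumed BME680 target table
from hdbcli import dbapi

connection = dbapi.connect(address='your.hana.cloud.host', port=443,
                           user='YOUR_USER', password='YOUR_PASSWORD',
                           encrypt=True)
cursor = connection.cursor()
cursor.execute('SELECT COUNT(*) FROM BME680')
print(cursor.fetchone()[0])
cursor.close()
connection.close()
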
There are also two tutorials included if you would like to get hands-on:


The SAP Cloud Appliance Library makes this so easy nowadays.