Sebastian Simon

AWS S3Eventing with SAP PO

Overview

AWS Simple Storage Service, or "S3" for short, is one of the most widely used cloud storage services and key to most AWS implementations. Objects stored in S3 are usually retrieved via API calls to the AWS S3 service.

However, S3 also has an event notification mechanism that can proactively notify interested applications of state changes in S3, such as newly uploaded, archived, or removed objects.

This blog shows how to use S3 event notifications to trigger processing in SAP Process Orchestration with the KaTe AWS Adapter via SQS or SNS.

No complex configuration such as async/sync bridges or multiple API calls is needed; a simple sender channel configuration suffices, as the adapter natively fits into the AWS ecosystem.

S3 event notifications

If event notifications are enabled for an S3 bucket, state changes of an S3 object are sent to a configured destination: a Lambda function, an SQS queue, or an SNS topic.

External applications can leverage SQS or SNS as inbound mechanisms for event delivery.


The receiving application can then decide, based on the type of notification, whether to take any action on the S3 object.

This opens up a wide range of use cases for connected applications:

  • Applications can be triggered automatically in "realtime" when a new S3 object exists instead of polling the storage (e.g. an external partner uploads orders via SFTP to S3, or another application creates an S3 object).
  • S3 can be used as a durable messaging solution between applications (especially since the AWS messaging services SQS, SNS, and EventBridge have a 256 KB payload restriction, whereas S3 has none).
  • If an S3 object is re-uploaded with the same key, reprocessing can be started or skipped depending on the notification type.
  • Archiving or restoring of objects can be detected at the moment it happens (e.g. restoring from archive storage classes like Glacier often takes some time, so processing can be triggered right when it completes).

We’ll now use SAP PO with the KaTe AWS adapter as the consuming application via an SQS queue.

Taking Action

Setup S3 & SQS

In order to receive S3 event notifications, event notifications need to be enabled on the S3 bucket via CLI/API or the S3 console.

Here we use a bucket named “my-uploads”.

As a first step, we create a simple SQS queue (also named my-uploads) and allow S3 to use it for notifications.

The permissions need to be configured to allow S3 to send notifications to this queue, with sqs:SendMessage as the action.

Here’s the small policy statement that allows this setting:

{
  "Version": "2012-10-17",
  "Id": "arn:aws:sqs:us-east-1:<your-account-id>:my-uploads/SQSDefaultPolicy",
  "Statement": [
    {
      "Sid": "Sid1589613120671",
      "Effect": "Allow",
      "Principal": {
        "Service": "s3.amazonaws.com"
      },
      "Action": "sqs:SendMessage",
      "Resource": "arn:aws:sqs:us-east-1:<your-account-id>:my-uploads"
    }
  ]
}
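As a sketch, the same policy can also be built and applied programmatically. The function below assembles the policy document from placeholders (region, account id, and queue name are assumptions you would substitute with your own values); applying it to the queue would then be a single boto3 call, shown as a comment:

```python
import json

def build_s3_notification_policy(region, account_id, queue_name):
    """Build the SQS access policy that lets S3 deliver event notifications.

    All identifiers are placeholders; substitute your own region,
    account id, and queue name.
    """
    queue_arn = f"arn:aws:sqs:{region}:{account_id}:{queue_name}"
    return {
        "Version": "2012-10-17",
        "Id": f"{queue_arn}/SQSDefaultPolicy",
        "Statement": [
            {
                "Sid": "AllowS3SendMessage",
                "Effect": "Allow",
                "Principal": {"Service": "s3.amazonaws.com"},
                "Action": "sqs:SendMessage",
                "Resource": queue_arn,
            }
        ],
    }

policy = build_s3_notification_policy("us-east-1", "123456789012", "my-uploads")
# The policy would then be attached to the queue, e.g. with boto3:
#   sqs.set_queue_attributes(QueueUrl=queue_url,
#                            Attributes={"Policy": json.dumps(policy)})
print(json.dumps(policy, indent=2))
```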

Next, event notifications need to be enabled on the source bucket my-uploads. This can be done via API/CLI or directly in the S3 console UI (here a screenshot from the S3 console that creates events for PUT/POST or multipart uploads of S3 objects in this bucket).
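The API/CLI equivalent of that console setup is a notification configuration on the bucket. The sketch below builds one that routes ObjectCreated events for PUT, POST, and completed multipart uploads to our queue; the ARN and account id are placeholders, and the boto3 call that would apply it is shown as a comment:

```python
# Notification configuration that sends the ObjectCreated events
# (PUT, POST, completed multipart upload) to the my-uploads queue.
# The queue ARN / account id are placeholders.
notification_config = {
    "QueueConfigurations": [
        {
            "Id": "NewObjects",
            "QueueArn": "arn:aws:sqs:us-east-1:123456789012:my-uploads",
            "Events": [
                "s3:ObjectCreated:Put",
                "s3:ObjectCreated:Post",
                "s3:ObjectCreated:CompleteMultipartUpload",
            ],
        }
    ]
}
# Applied with boto3:
#   s3.put_bucket_notification_configuration(
#       Bucket="my-uploads", NotificationConfiguration=notification_config)
```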

Now, upon upload of a new object, a notification that looks like the event below is sent to the queue.

Note that it contains the type of event (eventName is ObjectCreated:Put) and the key of the object.

{
  "Records": [
    {
      "eventVersion": "2.1",
      "eventSource": "aws:s3",
      "awsRegion": "us-east-1",
      "eventTime": "2020-12-02T15:48:06.047Z",
      "eventName": "ObjectCreated:Put",
      "userIdentity": {
        "principalId": "<....>"
      },
      "s3": {
        "s3SchemaVersion": "1.0",
        "configurationId": "NewObjects",
        "bucket": {
          "name": "my-uploads",
          "ownerIdentity": {
            "principalId": "<....>"
          },
          "arn": "arn:aws:s3:::my-uploads"
        },
        "object": {
          "key": "< Key of your object>",
          "size": 149057,
          "eTag": "0b45ff64f81777078e58b7802d73a896",
          "sequencer": "005FC7B7370F0B4802"
        }
      }
    }
  ]
}
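A consumer only needs a few fields from that structure. As a minimal sketch (the record layout follows the sample above; the payload here is an assumed example), this extracts the event name, bucket, and object key from a queue message body. Note that S3 URL-encodes object keys in notifications, so keys containing spaces arrive with `+` and should be decoded:

```python
import json
from urllib.parse import unquote_plus

def parse_s3_event(body):
    """Extract (eventName, bucket, key) tuples from an S3 event notification.

    S3 URL-encodes object keys in notifications (a space becomes '+'),
    so the key is decoded before use.
    """
    event = json.loads(body)
    results = []
    for record in event.get("Records", []):
        results.append((
            record["eventName"],
            record["s3"]["bucket"]["name"],
            unquote_plus(record["s3"]["object"]["key"]),
        ))
    return results

# Assumed example payload in the shape shown above
sample = json.dumps({"Records": [{
    "eventName": "ObjectCreated:Put",
    "s3": {"bucket": {"name": "my-uploads"},
           "object": {"key": "incoming/order+001.xml"}},
}]})
print(parse_s3_event(sample))
# -> [('ObjectCreated:Put', 'my-uploads', 'incoming/order 001.xml')]
```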

We’re now ready to receive those events in the SQS queue.

Configure S3 sender channel

In order to process the events with SAP PO, we need to configure a sender channel with the KaTe AWS adapter. The channel can then receive those notifications and process the related S3 objects.

Below is the necessary channel configuration. First we choose the AWS adapter with S3.

In the general section we just set the IAM credentials and choose “Listen to S3 events” as the S3 Action. This allows us to set the previously defined queue “my-uploads” as the input trigger.

The checkboxes at the lower end of the screenshot show another filter mechanism in the adapter that defines which events should be processed (if not checked, they are simply ignored).

Here we check all “Create events”. This processes all object creates and updates as configured in our S3 bucket configuration (PUT or multipart upload).

Upon upload (or multipart upload) of an S3 object, an S3 notification event is sent to PO via SQS.

The sender adapter is then triggered automatically, parses the notification to extract the key of the S3 object, fetches the object, and creates a PO message with the contents of the S3 object.

Deduplication

The adapter also handles deduplication.

Upon retrieval the S3 object is inspected and can be deduplicated via s3Key and eTag. The eTag is simply a hash value of the object’s contents that S3 creates upon upload. This allows you to avoid double processing if an SQS message with the event is delivered twice, or if the same unchanged S3 object is uploaded twice under the same S3 key.
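The idea behind this deduplication can be sketched in a few lines. This is an illustration only, not the adapter's actual implementation (which would persist the seen pairs rather than keep them in memory): a message is a duplicate if its (s3Key, eTag) pair has been seen before, while a changed eTag under the same key means new content and is processed again.

```python
def make_deduplicator():
    """Return a filter that skips (s3Key, eTag) pairs already processed.

    Illustration only: a production version would persist the seen
    pairs instead of holding them in an in-memory set.
    """
    seen = set()

    def is_duplicate(s3_key, etag):
        if (s3_key, etag) in seen:
            return True
        seen.add((s3_key, etag))
        return False

    return is_duplicate

is_dup = make_deduplicator()
print(is_dup("orders/1.xml", "0b45ff64"))  # False: first delivery, process it
print(is_dup("orders/1.xml", "0b45ff64"))  # True: redelivered event, skip
print(is_dup("orders/1.xml", "a1b2c3d4"))  # False: same key, changed content
```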

Content transformation

For JSON contents the adapter allows you to transform contents from JSON to XML while the PO message payload is created.
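To illustrate what such a JSON-to-XML step does conceptually (this is a naive sketch; the adapter's actual mapping rules may differ), a recursive conversion could look like:

```python
import json
from xml.sax.saxutils import escape

def json_to_xml(value, tag="root"):
    """Naive JSON-to-XML conversion for illustration only;
    the adapter's actual mapping rules may differ."""
    if isinstance(value, dict):
        inner = "".join(json_to_xml(v, k) for k, v in value.items())
        return f"<{tag}>{inner}</{tag}>"
    if isinstance(value, list):
        # Repeat the enclosing tag for each list element
        return "".join(json_to_xml(v, tag) for v in value)
    return f"<{tag}>{escape(str(value))}</{tag}>"

payload = json.loads('{"order": {"id": 42, "items": [{"sku": "A1"}]}}')
print(json_to_xml(payload))
# -> <root><order><id>42</id><items><sku>A1</sku></items></order></root>
```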

Error handling

If any persistent errors appear during processing of an object notification (SQS has configurable redelivery options), a dead letter queue can be used to “isolate” erroneous events for later inspection or replay.

Deletion of S3 objects

Deletion of S3 objects is also supported. If you change the S3 event notification settings on your bucket, e.g. to detect deletions, the adapter will transfer the information about the deletion into an SAP PO message with adapter-specific message attributes that contain information on the deleted object.

Multiple receivers of a S3 event notification

Now let’s take a look at what happens if, instead of SQS as the single receiver, SNS is used because more than one application wants to listen to S3 event notifications.

E.g. imagine receiving an order from a partner that is processed via SAP PO but should also be processed by another application (e.g. for audit purposes or an order receipt).

Usually the person administering S3 would add SNS as the receiver and “bridge” the events to multiple receivers, including the SQS queue we currently use. Our PO receiver would still receive the same events alongside the other receivers.


Effect on our existing integration via SQS: None! 🙂

The adapter takes care of all changes that come with that setup (SNS-SQS bridging delivers a slightly different payload format, as the original event is wrapped into a Message field).
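The wrapping the adapter handles transparently can be sketched as follows: SNS puts the original S3 event, as a JSON string, into the "Message" field of its own notification envelope, so a generic consumer has to unwrap it before parsing (the sample payloads here are assumed minimal examples):

```python
import json

def unwrap_notification(body):
    """Return the S3 event from a queue message body, whether S3
    delivered it directly to SQS or it was bridged through SNS.

    SNS wraps the original S3 event as a JSON string in the "Message"
    field of its envelope; direct delivery has "Records" at top level.
    """
    outer = json.loads(body)
    if "Message" in outer and "Records" not in outer:
        return json.loads(outer["Message"])  # SNS -> SQS bridge
    return outer                             # direct S3 -> SQS delivery

# Assumed minimal payloads for illustration
s3_event = {"Records": [{"eventName": "ObjectCreated:Put"}]}
direct = json.dumps(s3_event)
bridged = json.dumps({"Type": "Notification",
                      "Message": json.dumps(s3_event)})
print(unwrap_notification(direct) == unwrap_notification(bridged))  # True
```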

Conclusion

This blog shows how to process S3 storage notification events with the KaTe AWS adapter, making SAP PO a “first class citizen” in a typical AWS architecture.

No additional configuration like async/sync bridges is needed, which helps especially in high-volume scenarios, and the setup is also much easier to monitor.

No changes are needed either when switching to a “fan out” setup with SNS to allow multiple receivers of S3 notifications from one S3 bucket.

Configuring S3 event notifications

KaTe AWS adapter for SAP PO

SAP Appcenter KaTe AWS adapter for SAP PO
