Advantco Adapter-Amazon S3 Integration made simple with SAP CPI
Introduction:
In this blog I would like to share how we can integrate Amazon S3 with SAP CPI using the Advantco AWS Adapter.
Scenario:
Let's create a simple scenario to create a file in an Amazon S3 bucket using the Advantco adapter.
Before we start, let's talk about other options to achieve this. We can do it using the SCP Open Connectors service (a very good blog on this was shared by saiprasad), or using SDI or CPI-DS; please check the reference section for the blogs.
However, the Open Connectors service involves additional cost, and I wanted to see whether other options are available just for CPI-AWS integration. I was glad to find the Advantco adapter, which is readily available at no additional cost and offers more advanced options compared with Open Connectors.
Go to Discover from the CPI tenant, search for [SAP OEM] Amazon WS Adapter by Advantco, and then go to the download option in the Service Marketplace. Once you download and extract the archive, you will find all the installation and configuration PDF files; follow the guide to set up the adapter.
Note: the custom adapter can be installed only using Eclipse, as the Web UI does not have the option to deploy integration adapter artifacts.
Download:
Documentation:
SAP Note: additionally, you can install the following other Advantco OEM adapters for no additional cost:
- Salesforce Adapter by Advantco
- SugarCRM Adapter by Advantco
- Microsoft Dynamics CRM Adapter by Advantco
Test IFLOW: Convert XML to CSV and create a file in an AWS S3 bucket
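For illustration only, here is a minimal Python sketch of the same XML-to-CSV conversion the test IFLOW performs (in CPI the conversion itself is done by the out-of-the-box converter step or a script). The element names Orders, Order, ID and Amount are assumed sample fields, not the actual test payload.

```python
# Minimal sketch of the XML-to-CSV conversion done inside the iFlow.
# Element names (Orders, Order, ID, Amount) are assumed sample fields.
import csv
import io
import xml.etree.ElementTree as ET

SAMPLE_XML = """
<Orders>
  <Order><ID>1001</ID><Amount>250.00</Amount></Order>
  <Order><ID>1002</ID><Amount>99.90</Amount></Order>
</Orders>
"""

def xml_to_csv(xml_string: str) -> str:
    root = ET.fromstring(xml_string)
    output = io.StringIO()
    writer = csv.writer(output)
    writer.writerow(["ID", "Amount"])  # header row
    for order in root.findall("Order"):
        writer.writerow([order.findtext("ID"), order.findtext("Amount")])
    return output.getvalue()

if __name__ == "__main__":
    print(xml_to_csv(SAMPLE_XML))
```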
Advantco receiver channel config
To get the access key and secret key, please follow the blog by saiprasad in the reference section and go to the section Configuration in AWS [IAM service configuration].
Note: the secret key must be deployed as a secure parameter.
Advantco receiver channel advanced options configuration
Variable substitution, conversion, and other out-of-the-box options are available.
Deploy the IFLOW and you should be able to see the file created in the AWS S3 bucket.
Log in to https://s3.console.aws.amazon.com. You can create the S3 bucket manually, or the CPI adapter can create one for you if it does not exist.
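If you prefer to prepare the bucket from a script rather than the console, a small boto3 sketch like the one below checks whether the bucket exists and creates it when missing. The bucket name and region are placeholders chosen for illustration, not values from this scenario.

```python
# Sketch: create the S3 bucket up front if it does not exist yet.
# Bucket name and region are illustrative placeholders.
import boto3
from botocore.exceptions import ClientError

BUCKET = "my-cpi-demo-bucket"   # assumed bucket name
REGION = "eu-central-1"         # assumed region

s3 = boto3.client("s3", region_name=REGION)

try:
    s3.head_bucket(Bucket=BUCKET)          # succeeds if the bucket exists
    print("Bucket already exists:", BUCKET)
except ClientError:
    s3.create_bucket(
        Bucket=BUCKET,
        CreateBucketConfiguration={"LocationConstraint": REGION},
    )
    print("Bucket created:", BUCKET)
```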
Check and validate the file created in the S3 bucket.
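You can also validate the result from a script instead of the console. The following boto3 sketch lists the object and prints its content; the bucket name and object key are assumed placeholders (your receiver channel configuration defines the real values).

```python
# Sketch: verify the file that the CPI/Advantco channel wrote to S3.
# Assumes AWS credentials are configured locally (e.g. via `aws configure`).
# Bucket and key names are illustrative placeholders.
import boto3

BUCKET = "my-cpi-demo-bucket"   # assumed bucket name
KEY = "orders.csv"              # assumed object key written by the iFlow

s3 = boto3.client("s3")

# List objects to confirm the file landed in the bucket
response = s3.list_objects_v2(Bucket=BUCKET, Prefix=KEY)
for obj in response.get("Contents", []):
    print(obj["Key"], obj["Size"], obj["LastModified"])

# Download and print the CSV content for a quick check
body = s3.get_object(Bucket=BUCKET, Key=KEY)["Body"].read().decode("utf-8")
print(body)
```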
Create a free-tier AWS account and you can explore other options, like lifecycle policies to archive files.
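As a hedged example of the lifecycle option, the boto3 sketch below would archive objects to Glacier after 30 days and expire them after a year; the bucket name, prefix and durations are assumptions for illustration only.

```python
# Sketch: apply a lifecycle rule that archives objects to Glacier after 30 days
# and expires them after a year. All names and durations are illustrative.
import boto3

BUCKET = "my-cpi-demo-bucket"   # assumed bucket name

s3 = boto3.client("s3")
s3.put_bucket_lifecycle_configuration(
    Bucket=BUCKET,
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-cpi-files",
                "Filter": {"Prefix": "orders/"},   # assumed key prefix
                "Status": "Enabled",
                "Transitions": [
                    {"Days": 30, "StorageClass": "GLACIER"}
                ],
                "Expiration": {"Days": 365},
            }
        ]
    },
)
print("Lifecycle configuration applied to", BUCKET)
```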
Regards
Sudheer Reddy
Reference:
https://blogs.sap.com/2019/05/21/hana-amazon-s3-integration-the-easy-way/
Hello Sudheer. Thanks for the blog. Did you figure out how the adapter works for larger files, and how to send larger files in chunks to CPI and then to S3? I am talking about files with sizes in the GB range. Any insights would be helpful!
Thanks,
Vijay Konam (VJ)
Hello Vijay, Thanks for your question. There is no technical limitation on the file size. Let me know if you have any other questions. We can organize a demo for you, kindly contact me at abahl@advantco.com
Best,
Ashish Bahl
Hello Sudheer,
do you have any suggestions on how to manipulate the object key dynamically?
As far as I’ve seen this can only be set as a fixed value:
In the Advanced configuration you can change the creation mode:
However, this is not flexible, and we'd like to set header or exchange parameters as the file name instead. Is there any option to address them here?
Best regards,
Sebastian
Hi Sebastian,
this is very much possible with variable substitution; set the variable as #variablename#.
You can read a header value in variable substitution using message:HeaderName.
You can read a property value in variable substitution using exchange:PropertyName.
Thanks
Hi Sudheer,
right before your answer, I found it out myself.
Anyway, thanks for your fast reply.
Best regards,
Sebastian
Hello Sudheer Anugu
Great blog, and thanks for the informative post.
I would like to know whether the Kafka adapter is fully supported on SAP CPI and how flexible it is (in comparison with the SAP PO Kafka adapter).
Secondly, does SAP CPI support a schema registry as of now, and also Avro and JSON conversions?
Is it a reliable long-term solution to use via SAP CPI? I am not sure about the license cost, and adopting the Kafka adapter should ideally not end up in capacity or feature constraints.
Sebastian Seiler
Ashish Bahl
Looking forward to your valuable thoughts. Thanks in advance!
Cheers