
SAP Cloud Integration data archiving to BTP SDM

Introduction

As part of a customer engagement, I was asked to use SAP Document Management Service (SDM) for archiving data from SAP Cloud Integration. I used Marty's blog on data archiving to NEO DS as the starting point to understand the steps required for NEO DS. Since there was no step-by-step document on using SDM for archiving, I had to refer to multiple documents and even contact several teams internally to resolve the issues I faced. I wrote this blog to share the differences in configuration and the issues faced and resolutions applied, and, above all, to provide a step-by-step document on this topic.

Prerequisites

Make sure that you have the necessary entitlements for Document Management Service – Integration Option and Document Management Service – Repository Option. The Repository Option is required for both the SDM Integration (PaaS) and Application (SaaS) options.

Steps

The following steps are required to configure SDM for data archiving from SAP Cloud Integration.

  • Create a service instance for SDM Service – Integration option
  • Add repository to SDM using onboarding API
  • Create destination in subaccount (Basic Authentication – for activating archiving)
  • Activate log archiving on SAP Cloud Integration
  • Configure archiving on SAP Cloud Integration
  • Adjust the destination configuration for archiving (using OAuth2ClientCredentials)
  • Verify the replicated data using OpenCMIS workbench

Create a service instance for SDM Service – Integration option

You have to create an instance of SDM Service – Integration Option in the subaccount (the standard plan is recommended, since the free plan has limitations). It is not mandatory to create the SDM instance in the same subaccount as Cloud Integration: since you connect Cloud Integration to the SDM repository via a destination configuration, you can set up the SDM repository in the same subaccount or in a different one. You can use this help document to create an instance of SDM – Integration Option.

(Screenshot: Service Instance)

After creating the service instance, you have to create a service key to access the instance locally or from external clients (we need the service credentials to onboard the repository). You can refer to this help page for service key creation.

(Screenshot: Sample Service Key)

You can open the service key and note down the following details from the JSON credentials for the next steps:

ECM URL            endpoints->ecmservice->url
Client Id          uaa->clientid
Client Secret      uaa->clientsecret
Token Service URL  uaa->url
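
For orientation, the relevant fragment of the service key JSON looks roughly like this (a sketch with placeholder values, following the paths listed above; your actual key contains more fields):

    {
      "endpoints": {
        "ecmservice": {
          "url": "https://api-sdm-di.cfapps.eu20.hana.ondemand.com"
        }
      },
      "uaa": {
        "clientid": "<client id>",
        "clientsecret": "<client secret>",
        "url": "https://<subdomain>.authentication.eu20.hana.ondemand.com"
      }
    }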


Add repository to SDM using onboarding API

We need to onboard an internal repository (Document Management Service's cloud repository) so that the archives from SAP Cloud Integration can be replicated to it as .zip files during archiving. The Document Management Service – Repository Option entitlement is required in the subaccount to onboard an internal repository. To create this internal repository, we have to connect the SDM Integration Option with the SDM Repository Option, which is done by invoking the repository onboarding REST API from a REST client. I used Postman for this. You can check the details of the different SDM Integration Option APIs in this API Business Hub link.

The repository onboarding REST endpoint URL for your tenant is formed by concatenating <ECM URL> (retrieved from the SDM Integration Option service key) and /rest/v2/repositories. The endpoint will look like https://api-sdm-di.cfapps.eu20.hana.ondemand.com/rest/v2/repositories. The ECM URL varies with provider and subaccount region; in my case, it is Microsoft Azure / Europe (Netherlands).

The supported authentication method for all SDM administrative operations is OAuth2 (OAuth2ClientCredentials).

You have to make a POST request from any REST client to the repository onboarding endpoint with Bearer Token authorization. There are two ways to include the Bearer Token as the authorization type in the request:

  1. Make a separate POST call to generate the Bearer Token and set this token under Authorization -> Type (Bearer Token) -> Token in the repository onboarding POST request
  2. For the repository onboarding POST request, set Authorization -> Type (OAuth 2.0) -> Configure New Token (Configuration Options tab)

We need the Token Service URL, Client Id and Client Secret retrieved from the service key to generate the Bearer Token.
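
If you prefer scripting to Postman, the token can also be fetched with a few lines of Python; here is a minimal sketch of the standard OAuth2 client credentials flow (URL and credentials are placeholders from your service key):

    import requests

    # Values from the SDM service key (placeholders)
    token_service_url = "https://<subdomain>.authentication.eu20.hana.ondemand.com/oauth/token"
    client_id = "<uaa->clientid>"
    client_secret = "<uaa->clientsecret>"

    # Client credentials grant: client id / secret go in as HTTP Basic auth
    response = requests.post(
        token_service_url,
        data={"grant_type": "client_credentials"},
        auth=(client_id, client_secret),
    )
    response.raise_for_status()
    bearer_token = response.json()["access_token"]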

I used the second option (Configure New Token) to include the Bearer Token in the repository onboarding POST request. The Postman request details in my case are shown below.

(Screenshot: Authorization Tab)

(Screenshot: Body Tab)

If you want to create a repository with a readable repositoryId (in my case I passed repocpiarchiveb), you should also pass repositoryId in the onboarding POST request. A dynamically generated repositoryId is returned if you do not pass one.
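
For reference, the same onboarding call can be scripted. The following is only a sketch: the body field names (repository, displayName, repositoryType, repositoryId) are assumptions based on the SDM Integration Option API reference, so verify them in the API Business Hub before use:

    import requests

    ecm_url = "https://api-sdm-di.cfapps.eu20.hana.ondemand.com"  # endpoints->ecmservice->url
    bearer_token = "<token generated in the previous step>"

    payload = {
        "repository": {
            "displayName": "CPI Archive Repository",  # illustrative name (assumption)
            "description": "Cloud Integration log archives",
            "repositoryType": "internal",
            "repositoryId": "repocpiarchiveb",  # optional: omit to get a generated id
        }
    }

    response = requests.post(
        ecm_url + "/rest/v2/repositories",
        json=payload,
        headers={"Authorization": "Bearer " + bearer_token},
    )
    response.raise_for_status()
    print(response.json())  # note the repositoryId for the destination configuration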

Create destination in subaccount (Basic Authentication – for activating archiving)

We need a destination at subaccount level to connect SAP Cloud Integration to BTP SDM for archiving. The destination is used for the following purposes:

  1. To activate the archiving functionality on SAP Cloud Integration
  2. To replicate the archive .zip files to SDM during archiving

As the first step, we have to create a destination in the SAP Cloud Integration subaccount following this help document. As mentioned in the document, the destination name must be CloudIntegration_LogArchive.

Both Basic Authentication and OAuth2ClientCredentials are mentioned as supported authentication mechanisms for the destination in this document. However, there is currently a limitation when activating the archiving functionality (to SDM): the destination must initially be configured with Basic Authentication only (the help document will be updated with this restriction). Pass the Client Id and Client Secret as the User and Password respectively in the destination. The repositoryId you received in the onboarding response must be maintained as an additional property RepositoryId in the destination. The URL used here is formed by concatenating the ECM URL from the service key and /browser.

In my case, it is https://api-sdm-di.cfapps.eu20.hana.ondemand.com/browser (to show the complete URL format here, I removed part of it in the screenshot below; don't be confused by this). The destination should look as shown below:

(Screenshot: Destination – Basic Authentication)
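
In text form, the destination for the activation step contains roughly the following (Type and Proxy Type are the usual defaults for an HTTP destination; the other values come from the steps above):

    Name             CloudIntegration_LogArchive
    Type             HTTP
    URL              https://api-sdm-di.cfapps.eu20.hana.ondemand.com/browser
    Proxy Type       Internet
    Authentication   BasicAuthentication
    User             <Client Id from the service key>
    Password         <Client Secret from the service key>
    Additional Properties:
      RepositoryId   <repositoryId from the onboarding response>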

Activate log archiving on SAP Cloud Integration

To enable archiving, send a POST call to the URL https://<path-to-odata-api>/api/v1/activateArchivingConfiguration, where path-to-odata-api is specific to your Cloud Integration host.

I used Postman to make the OData call to activate the archiving functionality.

OData endpoint: https://<Cloud Integration Host>/api/v1/activateArchivingConfiguration

Under the Authorization tab in Postman, use "Basic Auth" as the "Type" and make sure to maintain the username/password of a user who has the necessary rights for activating archiving. The user must have the DataArchiving.Activate (to activate) and DataArchiving.Read (to see the archive configuration) roles assigned.

First, we need to fetch the x-csrf-token. Set a header property "x-csrf-token" to "Fetch" and make a GET request to the URL above to fetch the token.

The retrieved x-csrf-token must then be passed as the value of the "x-csrf-token" header in the POST call, as shown below:

(Screenshot: Activate Archive POST call)

You should see the message "Archiving successfully enabled for tenant <your tenant name>" on a successful POST call.
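
For reference, the token fetch and activation call can also be scripted; a minimal sketch following exactly the Postman steps above (host and credentials are placeholders, and the user needs the roles mentioned earlier):

    import requests

    host = "https://<Cloud Integration Host>"
    auth = ("<user with DataArchiving.Activate>", "<password>")

    # Reuse one session so the CSRF token stays paired with the session cookies
    session = requests.Session()

    # Step 1: GET with header x-csrf-token: Fetch to obtain the token
    get_resp = session.get(
        host + "/api/v1/activateArchivingConfiguration",
        auth=auth,
        headers={"x-csrf-token": "Fetch"},
    )
    csrf_token = get_resp.headers["x-csrf-token"]

    # Step 2: POST with the retrieved token to activate archiving
    post_resp = session.post(
        host + "/api/v1/activateArchivingConfiguration",
        auth=auth,
        headers={"x-csrf-token": csrf_token},
    )
    print(post_resp.status_code, post_resp.text)  # expect "Archiving successfully enabled ..."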

A successful activation call:

  • checks that a destination with the correct name exists,
  • activates the archiving job,
  • and enables configuration of the archiving settings in the user interface.

Additional information: SAP Cloud Integration handles the archiving functionality with the help of the SAP Job Scheduling Service. Upon activation of the SAP Cloud Integration capability, jobs to handle the different functionalities (monitoring, archiving, and so on) are created for the specific tenant. When you activate the archiving functionality for your Cloud Integration tenant, the default schedule created for the "Archive Monitoring Data" job is activated internally (by default it is in a deactivated state). There is no option for the Cloud Integration user to change the schedule configuration. Among the properties in the schedule configuration is archivingTimeThresholdInDays, the threshold time after which archiving starts for an integration flow following its execution. By default this value is set to 7. If you want to change this default archive threshold, you need to create a support ticket on component LOD-HCI-PI-OPS mentioning your Cloud Integration tenant details.

Configure archiving on SAP Cloud Integration

After activating the log archiving functionality at the SAP Cloud Integration tenant level, the next step is to configure the log archive settings at the individual integration flow level. As a prerequisite, make sure that the user has the TraceConfigurationEdit and TraceConfigurationRead roles granted through an assigned role collection. In SAP Cloud Integration, navigate to Monitor -> Manage Integration Content -> <Select the Integration Flow> -> <Open Log Configuration Tab> and click the "Configure" button to modify the current setting.

(Screenshot: Modify Archive Data)

Archiving is available for the following options:

  • Sender Channel Messages: messages received from the sender and responses returned to the sender.
  • Receiver Channel Messages: messages sent to the receiver and responses returned by the receiver.
  • Persisted Messages: messages stored via the Persist flow step.
  • Log Attachments: message processing log attachments.

Adjust the destination configuration for archiving (using OAuth2ClientCredentials)

After archive activation and configuration, the actual archiving starts once the threshold time for your specific tenant has elapsed (the default is 7 days; technically this can be reduced to 1, depending on your business requirements). For all SDM – Integration Option administrative REST APIs, the only supported authentication mechanism is OAuth2ClientCredentials. During archiving, the configured logs are compressed as .zip files and replicated to the onboarded SDM repository. We therefore need to reconfigure the CloudIntegration_LogArchive destination to use OAuth2ClientCredentials authentication. For this, you need the Token Service URL in addition to the URL, Client Id and Client Secret.

The Token Service URL to be used in the destination is formed by concatenating the uaa->url property from the SDM – Integration Option service key and /oauth/token.

The format of this Token Service URL is: https://<subdomain>.authentication.<region based on provider>.hana.ondemand.com/oauth/token
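
In text form, the adjusted destination contains roughly the following (same placeholders as before):

    Name               CloudIntegration_LogArchive
    Type               HTTP
    URL                https://api-sdm-di.cfapps.eu20.hana.ondemand.com/browser
    Proxy Type         Internet
    Authentication     OAuth2ClientCredentials
    Client ID          <Client Id from the service key>
    Client Secret      <Client Secret from the service key>
    Token Service URL  https://<subdomain>.authentication.<region>.hana.ondemand.com/oauth/token
    Additional Properties:
      RepositoryId     <repositoryId from the onboarding response>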

(Screenshot: Destination – OAuth2ClientCredentials)

Verify the replicated data using OpenCMIS workbench

If the configuration is correct, once the threshold time has passed, log archives from the configured integration flows are exported as .zip files (the name of the zip file is <Message ID>.zip) and replicated to the onboarded SDM repository. There is no standard tool provided by SAP to view the contents of this repository, so you need to use the Open CMIS Workbench tool from Apache Chemistry. Please refer to the given link and follow the setup instructions for your OS.

You need the ECM URL and a Bearer Token to log in to the SDM repository using Open CMIS Workbench. Follow the Bearer Token generation steps explained earlier to generate the token.

After opening the CMIS Workbench, click the "Connection" button and enter/select the options as shown below. Only the following values need to be adjusted; all other default settings can be retained.

URL             <ECM URL>
Binding         Browser
Username        <Bearer Token>
Authentication  OAuth 2.0 (Bearer Token)

(Screenshot: Open CMIS Workbench)

Click on "Load Repositories" to see the onboarded SDM repositories. You should be able to see them as shown below; in my case, it was repocpiarchive.

(Screenshot: Loaded Repo – repocpiarchive)

Click on the “Login” button to see the replicated contents.

(Screenshot: Replicated Archive files)

You can see the selected .zip file; its name is <Message ID>.zip. Here the zip file name is AGK32cCH5LENvH_0wnLOut77_JhL.zip (the cmis:name property). If you double-click the selected file, you can see the contents of the .zip file.
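
If you want to check the repository without the workbench UI, the CMIS browser binding can also be queried directly over HTTP. The sketch below uses the standard CMIS 1.1 browser binding query parameters (cmisselector, q, succinct); repository id and token are placeholders:

    import requests

    ecm_url = "https://api-sdm-di.cfapps.eu20.hana.ondemand.com"
    repository_id = "repocpiarchive"
    bearer_token = "<Bearer Token>"

    # CMIS query via the browser binding: list the archived documents by name
    response = requests.get(
        ecm_url + "/browser/" + repository_id,
        params={
            "cmisselector": "query",
            "q": "SELECT cmis:objectId, cmis:name FROM cmis:document",
            "succinct": "true",
        },
        headers={"Authorization": "Bearer " + bearer_token},
    )
    response.raise_for_status()
    for result in response.json().get("results", []):
        print(result["succinctProperties"]["cmis:name"])  # e.g. <Message ID>.zip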

If you search for the above Message ID in Monitor Message Processing, you should be able to locate the message.

(Screenshot: Monitor Message Processing)

 

Conclusion

I have captured all the necessary steps with screenshots here, hoping this document helps you configure data archiving from SAP Cloud Integration to BTP SDM.

Please feel free to leave your feedback and questions as comments on this blog. Let me know if you need more information or clarification on any of the steps. I will be really happy to assist you!

 

Thanks & Regards,

Kishor Gopinathan

Comments
      Ajay Garg

      Excellent and well explained, Kishor. I am also using this for one of my customers. It is a very good use case of the standard archive capability offered by Integration Suite (CI), making use of BTP's Document Management Service.

      Kishor Gopinathan (Blog Post Author)

      Thanks a lot for your valuable comments. Nice to hear that it is of help to you.

      Nick Yang

      Hi Kishor,

       

      Thanks for the detailed steps and information.

      Very helpful and I'll give it a try for sure!

      One question I would like to ask you: will this external archiving setup affect the "Log Level" setting?

      In other words, if I set "Log Level" to Trace after external archiving is enabled, will the trace information also be sent to SDM (in this case)? And how long will it be kept? Still the same as before?

      Cheers.

       

      Regards,

      Nick

       

      Kishor Gopinathan (Blog Post Author)

      Hi Nick,

      Thanks a lot for your valuable feedback and for raising this thought-provoking question!

      I checked with the product development team and got the following information:

      "The archiving setup does not affect 'log level' settings. No data will be archived for log levels NONE and ERROR; for INFO, DEBUG and TRACE the same data (the regular archiving information, no additional information) is archived. These two functionalities are handled separately, and the persistency for traced / archived payloads is different. MPL log levels are largely irrelevant for archiving. The only prerequisite for archiving is that one of the log levels INFO, DEBUG or TRACE is switched on. For archiving, only the MPL skeleton is needed, which is still available after 60 minutes."

      In addition, I tested the archiving functionality in our test environment (I had reduced the archive threshold from the default of 7 days to the minimum of 1 day) with different log levels (especially Trace / Debug).

      For log level Trace --> trace settings revert to the previously used log level after 10 minutes, and the collected trace is removed after 60 minutes (the original restriction for log level Trace is retained)

      For log level Debug --> log level Debug expires after 24 hours (retained)

      For log level Error --> no data is archived.

      With different log level settings at different intervals, I could see that the same data was archived to SDM (irrespective of the log level settings) for my sample test scenario.

      Hope this clarifies your concern. Let me know if you need any more information as part of it.

      Thanks & Regards,
      Kishor Gopinathan

      Nick Yang

      Hi Kishor,

       

      Thank you very much for your explanation and clarification!

       

      Regards,

      Nick

      Abdul Khadar

      Hi Kishor,

      Thanks for the wonderful blog on SDM.

      I have tried passing "repositoryId" and also "cmisRepositoryId" in the onboarding repository POST request, but I am not able to dynamically generate a repositoryId; can you please guide me on this?

      Regards,

      Abdul Khadar

      Kishor Gopinathan (Blog Post Author)

      Hi Abdul Khadar,

      Thanks a lot for your valuable feedback. Nice to see that you have started configuring SDM for archiving purposes.

      Please note that:

      1. Onboarded repositories have both repositoryId and cmisRepositoryId properties.
      2. It is the repositoryId, not the cmisRepositoryId, that you maintain in the destination.
      3. In general, if you want to use a fixed name in the destination, you should also pass "repositoryId" in the onboarding request.
      4. If you do not want to use a fixed name (which can be used across landscapes), then you should not pass "repositoryId" during the onboarding call.

      In your case, you passed repositoryId in the onboarding request, so the repositoryId to be maintained in the destination is "repocpiarchive".

      If your requirement is to use a dynamically generated repositoryId, then you should not pass the repositoryId attribute at all during the onboarding call.

      You can view the onboarded repository details with another GET call to the CMIS browser binding URL (make sure to pass the Bearer Token for authorization).

      For example, a URL like https://api-sdm-di.cfapps.eu20.hana.ondemand.com/browser. Please collect this information from the service key.

      Hope this clarifies your doubts. Let me know if you still have any concerns. I will be really happy to assist you.

      Thanks & Regards,
      Kishor Gopinathan

      Abdul Khadar

      Hi Kishor,

      Thanks for the quick response. Can you kindly answer the queries below?

      1. Apologies for the typo in the above question; I was trying to achieve a readable repository id instead of a dynamic one, which I'm not able to achieve even if I pass repositoryId or cmisRepositoryId; I'm getting dynamic values in the response instead of a readable repositoryId.

       

      2. In my onboarded repositories, we only have the cmisRepositoryId property and no repositoryId property. Can we consider "id" as the repositoryId? I have added an attachment with the onboarded repository details (GET request) that you suggested (Attachment_1).

      (Screenshot: Attachment_1)

      We have maintained the value of cmisRepositoryId in the destination as the additional property RepositoryId, since there is no repository id in the onboarded repository details. Should we maintain the "id" value as the repository id in the destination instead of the cmisRepositoryId (Attachment_2)?

      (Screenshot: Attachment_2)

      3. Can we create multiple destinations for different packages in the tenant, to store the data of each package in a designated repository?

      4. I have referred to the standard document below for the data to be posted to the onboarded repository, where they refer to "id" as the repository id. Is that the same value we need to maintain in the destination?

      https://help.sap.com/docs/DOCUMENT_MANAGEMENT/f6e70dd4bffa4b65965b43feed4c9429/d30200e0993a457888db2786d4bb5cd9.html

      Kishor Gopinathan (Blog Post Author)

      Hi Abdul,

      As mentioned in my previous reply,

      For 1 & 2
      You can view the onboarded repository details with another GET call to the CMIS browser binding URL (make sure to pass Bearer token for authorization purpose).

      For example, URL like - https://api-sdm-di.cfapps.eu20.hana.ondemand.com/browser. Please collect this information from service key.

      Instead of ../rest/v2/repositories, try the browser binding URL given as an example above. The response should contain the repositoryId information. If you passed a dedicated name during onboarding, repositoryId will have that value; otherwise a dynamically generated GUID is returned.

      Please note that it is the repositoryId, not the cmisRepositoryId, that you should use in the destination.

      For 3

      At present, this is not supported. Usually, if we receive the same feature request from multiple customers, then, based on priority and urgency, product development confirms it as a new requirement and includes it in a later version. This has to go through a process before being added to the product.

      For 4

      Yes, the "id" referred to here is the information you should use in the destination. If you get the details of the onboarded repository using ../rest/v2/repositories, you see this information as "id". If you use the browser binding URL to see the details (which is the one used internally for repository operations), you see this information in the repositoryId property. If you use ../browser to see the repository details, check the repositoryUrl, which appears near the end of the response, like below:

      "repositoryUrl": "https://api-sdm-di.cfapps.eu20.hana.ondemand.com/browser/testrepocpiarchive". Here, whatever you see after browser/ (/browser/<repositoryId>) is the repositoryId.

      Hope this clarifies your concerns.

      Thanks & Regards,

      Kishor Gopinathan

      Abdul Khadar

      Hi Kishor,

      Thanks for patiently answering my queries. Trust me, your blog works like a miracle; I am able to archive the payload successfully.

      (Screenshot: Archived Artifact on CPI)

      (Screenshot: Archived Artifact Msg id on CMIS Workbench)

       

      (Screenshot: Payload)

      Any suggestions if I want to query back a specific artifact payload through CPI dynamically, using the object ID or any other parameter of the posted document that we could leverage to implement the scenario?

       

      Thanks & Regards

      Abdul Khadar

      Kishor Gopinathan (Blog Post Author)

      Hi Abdul,

      Nice to hear that it is of help to you!

      Not sure if I got you correctly.

      If you are looking to filter from the CMIS Workbench, you can use the query option available in the workbench. I have discussed this as part of another comment.

      If you are looking for an API on the SAP Cloud Integration side to query the archives, I need to verify this first (to check whether any such APIs exist).

      Thanks & Regards,
      Kishor Gopinathan

      Vinod Patil

      Hello Kishor,

      Thanks for the excellent blog.

       

      1. I assume that after 30 days you won't be able to search for the message in BTP Integration Suite monitoring, even if it is archived. Is that right? If so, how do we know which log belongs to which iFlow? Searching becomes difficult.

      (Screenshot: Replicated Archive files)

      2. Can we use the SAP Document Management Service (SDM) – Application Option (https://discovery-center.cloud.sap/serviceCatalog/document-management-service-application-option?service_plan=standard&region=all&commercialModel=cloud), which provides a UI web application to access the repository?

      This would avoid the usage of the OpenCMIS Workbench to access the repository.

      Regards,

       

      Vinod

      Kishor Gopinathan (Blog Post Author)

      Hi Vinod,

      Nice to hear that it is of help to you.

      1. Monitoring information will be cleared on the Cloud Integration side. Since the logs are already archived, there is an option on the CMIS Workbench side to query the logs using different query parameters. When you click the 'Query' button in the workbench window, a query editor pop-up opens, where you can enter the specific CMIS query.

      In the attached image below, the following are marked.

      Query: opens the query editor, where you can filter the contents based on different properties.

      Type: when you click the 'Type' tab, you should be able to see all the standard types as well as SAP's custom types. There is a Queryable column, with true / false values, that indicates whether a property name can be used in a query.

      For example: SELECT * FROM mpl:message WHERE mpl:iFlowName = 'E2E_HTTP_Receiver'. Here, I have given the iFlowName and can retrieve the logs pertaining to the iFlow 'E2E_HTTP_Receiver'. This query option is one way to list the logs relevant for a specific iFlow.

      2. There is a limitation here. As you know, while activating the archiving functionality using Postman, you are using the client credentials to execute the service. When you create an instance of the "application" plan of SDM, different roles are created, and those have to be assigned to users via role collections. This can be done for a user added at the platform / cockpit level.

      Whatever repository you create via the UI won't be visible to / can't be used by the archiving functionality triggered from Postman using client credentials. Likewise, the repository you create via Postman (using client credentials) will not be visible at the UI level. Due to this limitation, you cannot use the SDM – Application Option to display the archived folder information.

      Hope this clarifies your concerns.

      Thanks & Regards,
      Kishor Gopinathan

      Vinod Patil

      Thank you Kishor for answering both questions. This helps.

      Uppena Nagaraju

      Hi Kishor,

       

      I have created the destination using OAuth2ClientCredentials for the SDM browser URL, but I am getting an unauthorized error. I have even tried with Postman and get the same issue; I also tried Basic Authentication using the Client Id and Client Secret. Can you help with this?