
How to Connect Microsoft SharePoint with SAP Data Intelligence using SCP Open Connector

While data integration and orchestration is a process that many data engineers are familiar with, doing so via SAP Cloud Platform with third-party cloud storage can still present obstacles because the tooling is relatively new. The use of Microsoft SharePoint as a data store has grown dramatically in recent years thanks to its simplicity and broad applicability. In this blog, I provide a succinct overview of how to integrate Microsoft SharePoint with SAP Cloud Platform’s Open Connectors subscription. First and foremost, I would like to thank Divya Mary (@divya.mary), Akitoshi Yoshida, Christian Sengstock (@christian.sengstock), Rene Penkert (@renpen), Bogdan Leustean and Zhouyang Li for their immense support in making everything in this blog possible.

 

For this guide, I assume that the reader has:

 

  • A valid Microsoft SharePoint site
  • Access to SAP Cloud Platform
  • Modified the SharePoint site to grant third-party access to SAP Cloud Platform’s Open Connectors subscription*

*To set this up, see this clear and useful blog by Divya Mary, whose support was greatly appreciated:

https://blogs.sap.com/2019/09/19/connect-to-sharepoint-online-via-sap-cloud-platform-open-connectors/. The first part of this post borrows largely from Divya’s blog, with her kind permission. Make sure you have saved the API key and API secret from that setup. To begin, open the main SAP Cloud Foundry homepage.

 

Now, click on the subaccount that you wish to connect SharePoint with. Once you are in the relevant subaccount, click on Subscriptions in the panel on the left-hand side.

 

On the Subscriptions page, click on the Open Connectors application and subscribe to it. Once you have subscribed, click “Go to Application”.

From there, open the Connectors tab on the left-hand side, locate the SharePoint connector among the many others, and click “Authenticate”.

 

You will be presented with a screen asking for certain information about your SharePoint instance:

 

  • Name: The name that you wish to give to your connector instance.
  • SharePoint Site Address: The base URL of the specific SharePoint site that you wish to connect to.
  • API Key & API Secret: Assuming you have followed Divya’s blog correctly, you should have the API key and API secret to enter here.
  • MS OAuth Scope: Set this to “AllSites.Manage” AND “MyFiles.Write”, the rights you configured the SharePoint site to grant in Divya’s blog.
  • Use Scope: Set this to true to give the API access to files stored within the SharePoint site; otherwise you will find that you are denied authorization when retrieving files later on.
  • Documents Library: The path to the specific folder that you wish to access within SharePoint, up to “Shared Files”.
  • Delegate User: Set this to true.
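The form fields above can be summarized as a simple mapping. The following sketch uses the UI labels as keys, and every value is a placeholder example (an assumption), not a real default:

```python
# Placeholder summary of the Authenticate form; every value below is an
# example only -- fill in your own site details.
sharepoint_instance = {
    "Name": "my-sharepoint-instance",
    "SharePoint Site Address": "https://<tenant>.sharepoint.com/sites/<site>",
    "API Key": "<api_key>",        # saved while following Divya's blog
    "API Secret": "<api_secret>",
    "MS OAuth Scope": "AllSites.Manage MyFiles.Write",
    "Use Scope": True,             # required for file access later on
    "Documents Library": "/Shared Files",
    "Delegate User": True,
}
```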

 

Once you have filled in the necessary information, click Create Instance. A new tab may open asking whether you trust the third-party cloud source. If that happens, click Yes; you will be redirected back to the Open Connectors homepage and receive an on-screen message stating that you have successfully created an Open Connectors instance. Now click on the Instances tab on the left-hand side, and you should see the instance that you just created, with the description you assigned it.

 

Click on API Docs and you will be brought to a screen listing the many API operations available for the SharePoint instance. You may now decide what to do with your instance.

 

For the sake of providing an example, I use the GET /files API call in the files section to retrieve a file stored in my SharePoint instance.

 

Click on the GET /files API call and you will be presented with a Parameters subpage. Click “Try it out” in the top-right corner of the screen. There are three sections to fill out.

 

  • Authorization: This section should already be filled in with a custom-made user token for you. Save it if you wish to perform API calls outside of Open Connectors.
  • Path: As stated, this is just the specific path to the individual file that you wish to retrieve from SharePoint via Open Connectors.
  • Subsite: The individual subsite of the folder you wish to access in Microsoft SharePoint. Although Open Connectors does not explicitly label it as required, my colleagues and I found that we were unable to access files without entering the subsite, so I recommend treating it as implicitly required.
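Any REST client can reproduce this call outside the Open Connectors UI. The sketch below uses only Python’s standard library; the host, token, subsite and file path are placeholders (assumptions) that must be replaced with the values shown on your own API Docs page.

```python
import urllib.parse
import urllib.request

# Placeholder values (assumptions) -- substitute the host shown in your own
# API Docs page, plus your instance token, subsite and file path.
BASE_URL = "https://api.openconnectors.ext.hana.ondemand.com/elements/api-v2"
HEADERS = {
    "Authorization": "User <value>, Organization <value>, Element <value>",
    "Subsite": "<your_subsite>",  # exactly as entered in Open Connectors
}

def build_request(path: str) -> urllib.request.Request:
    # The 'path' query parameter is the exact path to the individual file,
    # e.g. "/Reports/data.xlsx"; a stray trailing space will cause an error.
    query = urllib.parse.urlencode({"path": path})
    return urllib.request.Request(f"{BASE_URL}/files?{query}", headers=HEADERS)

def get_file(path: str) -> bytes:
    # Performs the same GET /files call as the "Try it out" page.
    with urllib.request.urlopen(build_request(path)) as resp:
        return resp.read()
```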

 

Please be very careful when entering this information: any small typo, such as a trailing space, will cause the call to return an error. Assuming you have entered everything correctly, you should see a screen showing the cURL command, request URL and response headers, along with the file you have retrieved.

 

The preceding action is also possible from any RESTful API client, such as Postman. As an example, I will now perform the same call from Postman.

In the Params configuration, state the exact path to the individual file you wish to retrieve. This should be exactly the same as the path used for the API call directly on the Open Connectors page, as shown earlier. Once you do so, the URL in the address bar should populate itself, assuming you have entered the correct path.

 

Now click on the Headers subtab and enter the Authorization and Subsite information. The Authorization value should be exactly the token that was automatically generated for you by the Open Connectors subscription; it can be found by clicking on any API call on the Open Connectors page for your instance. The Subsite must also be written here exactly as it was for the API calls on Open Connectors. Ensure that the request method is set to GET before you send it. Once you press the blue Send button in the top right-hand corner, you should see the document appear on the Postman screen.

Next, I will show how to implement the process of downloading the file from the Open Connectors SharePoint instance within SAP Data Intelligence. For the sake of simplicity, the pipeline will be relatively short, yet flexible enough that a multitude of edits can be made to it.

 

In order to implement the following pipeline, you will need:

 

  • SAP Data Intelligence
  • SAP HANA (this could be any third-party storage, such as an Amazon S3 bucket or an internal data lake)

 

The pipeline follows a simple pathway, which I will describe sequentially. First, a Constant Generator outputs a string containing the parameters to be passed to the subsequent OpenAPI Client operator:

 

{"openapi.header_params.Accept-Encoding": "identity", "openapi.header_params.Subsite": "<value>", "openapi.header_params.Authorization": "User <value>, Organization <value>, Element <value>", "openapi.query_params.path": "<value>"}

 

These parameters pass along the necessary path, authorization keys, subsite and encoding scheme for the OpenAPI operator to use. The path, subsite and authorization keys are the same values used in the Open Connectors API Docs and in Postman. The Accept-Encoding: identity header ensures that the OpenAPI operator returns the downloaded file in its original byte-encoded form, which can then be processed by any other operator downstream (a Python operator, for example). If this header is not set, the OpenAPI operator will instead emit a gzip-compressed file in byte form. Since it would be in this compressed form, SAP Data Intelligence would not be able to convert the file into bytes that can be edited within SAP Data Intelligence or inserted into any external data storage. As such, these parameters are absolutely necessary.

The string is then sent to a ToMessage converter, which sends the newly created message to the OpenAPI operator, where the aforementioned parameters are consumed.
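When authoring this parameter string by hand, smart quotes pasted from a browser will break the JSON. As a sanity check, the body can be generated with the standard-library json module; all values below are placeholders:

```python
import json

# All values below are placeholders; substitute your own subsite,
# Open Connectors token parts and file path.
params = {
    "openapi.header_params.Accept-Encoding": "identity",  # raw bytes, not gzip
    "openapi.header_params.Subsite": "<value>",
    "openapi.header_params.Authorization": "User <value>, Organization <value>, Element <value>",
    "openapi.query_params.path": "<value>",
}

message_body = json.dumps(params)  # paste this string into the Constant Generator
```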

Please note that it is also possible to set the authorization scheme in the OpenAPI operator itself, rather than in the constant message generator via “openapi.header_params.Authorization”. To do so, set the API key in the authorization configuration as:

"authScheme": "apiKey",
"apiKeyValue": "User wunpHPK8AfOc7fRqYp5Lq5MTJyzzKZ8S9iJiYoWxyec=, Organization a6e594005b00d61241c1e5524e7851e5, Element cLzTM8qI6U0V1AmoJkGi0p54zNmC/AbD4PGPs5lTE78=",
"apiKeyType": "header",
"apiKeyName": "Authorization"

After the aforementioned steps, the OpenAPI operator needs only two mandatory fields to be filled in by you, plus one optional field.

 

  • Host: Type in the host URL of the API you have decided to use. This usually consists of the base of the URL rather than the whole string.
  • Base Path: The service’s base path, which identifies the service to be used, e.g. v2.
  • Include Response Headers: Entering “content-disposition, content-encoding” allows you to see the data type and encoding scheme being sent out by the OpenAPI operator (optional, but recommended for debugging).
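For orientation, these operator settings can be sketched as a dict. The host and base path values below are assumptions based on a typical Open Connectors setup; use the values from your own API Docs page.

```python
# A sketch of the two mandatory fields plus the optional one, as a dict.
# The host and basePath values are assumed examples, not defaults.
openapi_operator_config = {
    "host": "https://api.openconnectors.ext.hana.ondemand.com",  # base of the URL only
    "basePath": "/elements/api-v2",                              # service base path
    "responseHeaders": "content-disposition, content-encoding",  # optional, for debugging
}
```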

Once you have completed the steps above, you will have extracted data from your Open Connectors instance. In my case, I use a Python operator to transform the data into a CSV file with the pandas library, to be inserted into an instance of SAP HANA.

 

import io

import pandas as pd  # reading Excel files also requires the xlrd (or openpyxl) package

def on_input(msg):
    # msg.body holds the raw bytes of the Excel file retrieved via Open Connectors
    df = pd.read_excel(io.BytesIO(msg.body))
    csv_file = df.to_csv(index=False)
    api.send('outFile', csv_file)

api.set_port_callback('inFile', on_input)

However, the versatility and flexibility of SAP Data Intelligence mean that you can transform and insert your data in whatever way you wish, into whatever external storage you have, such as an internal data lake or an external S3 bucket. Congratulations: if you followed all the steps exactly, you will have successfully integrated Microsoft SharePoint with SAP Data Intelligence via the Open Connectors subscription.
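To complete the journey into SAP HANA, the CSV emitted by the Python operator can be turned into parameterized inserts for a HANA client such as hdbcli. The sketch below uses only the standard library and assumes the target table already exists; the table name and the hdbcli usage at the end are hypothetical.

```python
import csv
import io

def build_insert(table: str, csv_text: str):
    """Turn the CSV produced by the Python operator into a parameterized
    INSERT statement plus row tuples, ready for cursor.executemany() on a
    HANA client such as hdbcli (not imported here)."""
    reader = csv.reader(io.StringIO(csv_text))
    header = next(reader)
    columns = ", ".join(f'"{col}"' for col in header)
    placeholders = ", ".join("?" for _ in header)
    sql = f"INSERT INTO {table} ({columns}) VALUES ({placeholders})"
    rows = [tuple(row) for row in reader if row]
    return sql, rows

# Hypothetical usage with an hdbcli connection cursor:
# sql, rows = build_insert("MYSCHEMA.MYTABLE", csv_file)
# cursor.executemany(sql, rows)
```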

 

 
