Certificate Authentication to call an OpenAPI Servlow pipeline on SAP Data Intelligence

The OpenAPI Servlow operator in SAP Data Intelligence provides a REST-API that can be called from another program or process. Such pipelines can be used to obtain predictions from a Machine Learning model in real-time, but there are many other use cases.

In this blog you will find the steps to call such a REST-API using a technical user’s certificate instead of your own user’s password.

Thanks to James Giffin, who looked into this with me, and kudos to Mark Waldaukat and Florian Föbel, who helped us with DI’s certificate concept.

Prerequisites

You have already created a pipeline that uses the OpenAPI Servlow operator. The steps in this blog are based on the inference pipeline described in SAP Data Intelligence: Create your first ML Scenario.

My own user is default\ext01 and it can call the REST-API using my own SAP Data Intelligence password.

 

Call the REST-API with a technical user’s password

First, the pipeline must be visible to the technical user. Undeploy any inference pipeline that might still be running. Then, still logged on with your own DI user, create a new version of the ML Scenario. The versioning makes the content visible to other users, including the technical user.

 

After the ML Scenario has been versioned, the technical user can see the notebooks, pipelines and saved models of the scenario. However, the Dockerfile is not yet visible to the technical user and must be shared explicitly.

Still logged on with your own user, go into “System Management” and select the “Files” tab. Use the search box to find the folder of your Dockerfile. Select “Export as solution to solution repository”. These steps are also documented in the help under Sharing Files Using Solution Repository.

 

Now log out of your own user and back in with the technical user. My technical user is called ditech. This technical user must have the role “sap.dh.Developer” assigned, and possibly other roles. For a simple test I assigned all roles, but you may want to be more restrictive.

Still on the “Files” tab of “System Management” (but now logged on as the technical user), click the “+” sign and select “Import solution from solution repository”.

 

Find the solution that you have exported and select “Import Solution”.

 

Now all required content should be accessible to the technical user. Go to the ML Scenario and deploy the existing inference pipeline. Select the trained model that was created by your own user. The first deployment will take longer, as the Dockerfile is being built for the technical user.

 

Once the pipeline is running you get the usual deployment URL.

 

And now you can call the REST-API with the technical user’s password.
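If you prefer the command line over Postman, the same password-based call can also be scripted, for example with curl. The snippet below is only a minimal sketch under assumptions: the host, the deployment path and the JSON body are placeholders, and /v1/uploadjson/ is merely the path used by the inference pipeline from the ML Scenario tutorial referenced above. Copy the exact URL, headers and payload from your own working Postman request.

# Hypothetical example: replace URL, path, user, password and body with your own values.
# Copy any additional headers from your working Postman request.
curl -X POST "https://YOURDISYSTEM.ondemand.com/YOURDEPLOYMENTPATH/v1/uploadjson/" \
  -u 'YOURDITENANT\YOURTECHUSER:YOURPASSWORD' \
  -H "Content-Type: application/json" \
  -d '{"your": "payload"}'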

 

Call the REST-API with a technical user’s certificate

Download and extract the “SAP Data Intelligence System Management Command-Line Client (CLI)”. This CLI will create the certificate that will be used to replace the password.

Log on to SAP Data Intelligence with the technical user through vctl. You will be prompted for the password.

vctl login https://YOURDISYSTEM.ondemand.com/ YOURDITENANT YOURDIUSER

 

Create the certificate files. The “-o .” parameter specifies that the files should be written into the current folder.

vctl user certificate generate -o .

Two files were created:

  • bundle.pem: the certificate
  • key.pem: the private key
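The validity period of these files comes up in the comments below. Assuming bundle.pem is a standard X.509 certificate and openssl is installed, you can check its notBefore/notAfter dates like this:

# Print the validity period of the generated certificate
openssl x509 -in bundle.pem -noout -dates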

 

Now test a logon with these files through vctl. Log out first.

vctl logout

 

Log back in, now with the certificates.

vctl login https://YOURDISYSTEM.ondemand.com/ --user-cert bundle.pem key.pem

 

Now that you know the certificates are working, use them to call the REST-API created by the OpenAPI operator in SAP Data Intelligence. Go back to Postman. First remove any cookies from Postman to be sure that the certificates will be used.

 

Then add the certificates to Postman. Go to “File” -> “Settings” -> “Certificates” and click “Add Certificate”. Use the following details to add the certificate:

  • Host: YOURDISYSTEM.ondemand.com/
  • CRT file: select the bundle.pem file
  • KEY file: select the key.pem file

 

Now in the existing Postman request change the Authorization to “Inherit auth from parent”.

Send the request and you should receive the response.
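The certificate-based call also works outside of Postman, for example with curl, which can present the two vctl-generated files as a TLS client certificate. As before, this is just a sketch: host, deployment path and payload are placeholders to be replaced with the values of your own deployment.

# Hypothetical example: authenticate with the client certificate instead of a password.
# Copy URL, headers and body from your working Postman request.
curl -X POST "https://YOURDISYSTEM.ondemand.com/YOURDEPLOYMENTPATH/v1/uploadjson/" \
  --cert bundle.pem --key key.pem \
  -H "Content-Type: application/json" \
  -d '{"your": "payload"}'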

 

You have called a REST-API provided by an OpenAPI Servlow operator using a technical user’s certificate. No password required.

7 Comments

      DEEPAK RAI

      Hi Andreas,

      We followed the steps you suggested and it was working correctly.

      However, after 15 days we noticed an “Illegal URL” error in Postman.

      We deleted and regenerated the key and certificate, and then it started working again.

      Query: Is there a validity period for these certificates and keys?

       

      Thanks.

      Deepak Rai.


      Andreas Forster
      Blog Post Author

      Hello DEEPAK RAI, I had no luck finding the answer internally. Since each test would take two weeks, I suggest opening a support ticket to clarify.

      DEEPAK RAI

      Thanks Andreas for your reply.

      I have already raised it with SAP - 219623 / 2022.

      My thought:

      What if we could use policies and assign the relevant policy to the communication user, so that the user can access only the API endpoint? This approach would also solve our purpose and let us set up the communication between DI and the API smoothly.

      However, by default the developer.dh policy gets assigned, and it provides a lot of access to the DI tenant, which results in a security issue.

      Thanks.

      Deepak Rai.


      Andreas Forster
      Blog Post Author

      Hi Deepak, for a similar requirement I was recently given the suggestions below. The intention was to have one user trigger pipelines without being able to open them in Modeler. I haven't tried these steps myself, but maybe they are helpful for your case.

      • If you have two users, these users have two separate user spaces. Each user can only start the pipelines in their own user space.
      • If user A creates pipelines that user B needs, user A creates a solution that user B deploys. User B can then use and modify them. There is no option to restrict access to the code once it is in the user space.
      • If a pipeline contains sensitive details, it needs to be deployed to a special user that has no access to the Modeler app. You can then provide an operator that triggers this pipeline via API in the user space of this special user.
      Martin Donadio

      Hi Andreas,

      Very nice blog post, thanks for sharing!

      Every time the pipeline is restarted, the endpoint changes. Do you know if there is a way to always keep the same endpoint, for example for production use cases?

      Thanks!

      Martin

       

      Andreas Forster
      Blog Post Author

      Hi Martin Donadio, the OpenAPI Servlow operator can be given a fixed string for the base path (which would provide a stable URL), but if that is done, the ML Scenario no longer activates the "Deploy" button to run the pipeline.

      You could try running such a pipeline with a fixed base path directly from the Modeler. Alternatively, you could create a custom pipeline in Modeler consisting of an "OpenAPI Servlow" and a "Python 3" operator. I am currently working on such a case; please ping me directly if you want to discuss.

      Rajesh PS

      Andreas Forster

       

      Could you please provide your input here? Thanks much!

      HANA DB TO REST API SYNC CALL