
Using Azure Cognitive Services with SAP Cloud Platform

Introduction:

For quite some time, I have been working on SAP Cloud Platform and SAP Intelligent Technologies. Recently, I have been learning more about cloud platforms such as Azure, AWS, and GCP, and how best we can integrate the services they offer with SAP Cloud Platform.

Blogs from Christian Lechner and Holger Bruchelt have been inspiring, and I have been following them closely.

I would like to share how we can take readily available machine learning services from Azure and integrate them with SAP Cloud Platform to build a streamlined, efficient process for managing data wherever a printed form is used to capture data at any step of a process.

At SAP, we have multiple processes where a printed form is handed to a responsible person (various parties involved in a process, such as a warehouse guard, may not have access to an SAP system). That person updates the printed form after completing the tasks, and someone must then update SAP again with the new values.

We will create a simple proof of concept: a Fiori application that

  • Accepts image as input
  • Extracts text from image
  • Reads information from SAP (We can use SAP Graph for this POC)
  • Updates value back to SAP (Use SAP Graph for POC)

I will not cover the creation of the SAPUI5 application, to keep the blog short, and will assume that a simple UI5 application can easily be created.

Architecture Overview:

 

Step 1: Setting up Cognitive Services:

Azure provides many pre-trained machine learning models for common scenarios. These models can easily be consumed by creating Cognitive Services resources. You can read more about Cognitive Services here. You will need a trial subscription of Azure to create the services.

Go to the Azure Portal and create a Cognitive Services resource by clicking Add. Search for and select Computer Vision:

 

Click on Create and provide:

  • Resource group (you can create a new one)
  • Name – any unique name for your service
  • Pricing tier – you can choose any free pricing tier available

Click on Review + create and then Create.

 

Once the resource is created, you can find the endpoint and keys to use the service in the Keys and Endpoint section. That's it: you now have a running service that can extract text from images.

You can give it a try in Postman. Please refer to the API reference guides.

We will be using the https://{endpoint}/vision/v3.0/read/analyze API for extraction. The API accepts either the URL of an image or a binary blob with content type application/octet-stream. You can try it in Postman with an image URL. In our scenario, we will pass the binary content of the image.

A successful run of this API returns 202 – Accepted. Look at the response headers: you then need to make a GET call to the URL in the Operation-Location header, and that call returns the extracted text.
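Once the GET call to the Operation-Location URL reports success, the recognized text sits nested inside the result JSON. Here is a minimal sketch of pulling the lines out; the helper name and the sample payload are illustrative, but the nesting (analyzeResult.readResults[n].lines[m].text) matches the Read v3.0 response shape used later in this blog:

```javascript
// Pull the recognized text lines out of a Read v3.0 result.
function extractLines(readResult) {
  if (readResult.status !== "succeeded") {
    return []; // still "notStarted"/"running" -- poll the URL again later
  }
  const lines = [];
  for (const page of readResult.analyzeResult.readResults) {
    for (const line of page.lines) {
      lines.push(line.text);
    }
  }
  return lines;
}

// Truncated example of a successful response body (values made up):
const sample = {
  status: "succeeded",
  analyzeResult: {
    readResults: [
      { lines: [{ text: "MZ-FG-C990" }, { text: "Quantity: 12" }] }
    ]
  }
};
console.log(extractLines(sample)); // -> [ 'MZ-FG-C990', 'Quantity: 12' ]
```

Note the status check: the service may still be processing when you first poll, so production code should retry until the status becomes succeeded.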

 

 

Step 2: Create a Serverless Function for business logic

As a next step, we will write our business logic in a serverless manner using Azure Functions.

Why serverless? With the serverless approach, we are responsible only for the small piece of code implementing our business logic, without worrying about application infrastructure. With Azure Functions, the cloud infrastructure provides all the up-to-date servers you need to keep your application running at scale. Read more about Azure Functions.

Go back to the Azure Portal home page and create a Function App. Click Add and provide:

  • Resource group (you can create a new one)
  • Function app name – any unique name for your service
  • Runtime stack – Node.js
  • Region – your closest region

Review and create.

Once the function app is deployed, click Go to resource, then click Advanced Tools in the Development Tools section. Click Go, and the Kudu service will open in a new tab:

 

Go to Debug console → CMD.

Navigate to site → wwwroot by clicking through the folders.

Run the command npm install node-fetch. This installs the module I use to call the Cognitive Services API from the function. You may use other HTTP packages for Node.js instead.
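Equivalently, the dependency can be declared in the app's package.json in wwwroot (the version below is illustrative; note that node-fetch 2.x is the line that supports CommonJS require as used in the function code):

```json
{
  "dependencies": {
    "node-fetch": "^2.6.1"
  }
}
```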

 

Close the window and go back to the function app. Restart the function app for the changes to take effect.

Go to the Functions section and create a function of type HTTP trigger. Go to Code + Test and replace the code of your index.js file with the following:

 

const fetch = require("node-fetch");

module.exports = function (context, req) {

    const subscriptionKey = '<Cognitive Service Key>';
    const endpoint = '<Cognitive Service End Point>';
    const uriBase = endpoint + "/vision/v3.0/read/analyze";
    let parsedData;
    let readUri;

    // The image arrives as a base64 string in the query or the request body
    const base64 = (req.query.image || (req.body && req.body.image));
    const data = Buffer.from(base64, 'base64');

    // STEP 1: submit the image for analysis
    fetch(uriBase + "?language=en",
        {
            method: 'POST',
            headers:
            {
                'Content-Type': 'application/octet-stream',
                'Ocp-Apim-Subscription-Key': subscriptionKey,
            },
            // Post the binary data directly; this applies to all POST methods
            // in the Vision API whose content type is octet-stream
            body: data
        })
        .then((response) => {
            // A 202 Accepted response carries the polling URL in its headers
            readUri = response.headers.get('operation-location');
            context.log(readUri);
            // Give the service a moment to finish; production code should
            // poll until the result status is "succeeded"
            setTimeout(readResult, 1000);
        }).catch((error) => {
            context.log("ERROR OCCURRED");
            context.log(error);
        });

    // STEP 2: read the results of the analysis from the Operation-Location URL
    function readResult() {
        fetch(readUri,
            {
                method: 'GET',
                headers:
                {
                    'Content-Type': 'application/json',
                    'Ocp-Apim-Subscription-Key': subscriptionKey,
                }
            })
            .then((response) => response.json()).then((data) => {
                parsedData = data;
                buildProducts();
            }).catch((error) => {
                context.log("ERROR OCCURRED");
                context.log(error);
            });
    }

    // STEP 3: your business logic to identify products in the extracted text.
    // Here a line counts as a product ID when its first "-" is at index 2 and
    // its last "-" is at index 5; adapt this check to your own ID format.
    function buildProducts() {
        const readResults = parsedData['analyzeResult'].readResults;
        const lines = readResults[0]['lines'];
        const output = [];

        lines.forEach(function (line) {
            const text = line.text;
            if (text.indexOf("-") === 2 && text.lastIndexOf("-") === 5) {
                output.push(text);
            }
        });

        context.log(output);
        context.res = { body: output };
        context.done();
    }
}

At the end of this step, you have successfully created a function that accepts a base64-encoded image in the request body and returns the values required by your business logic (for example, product IDs).

Once saved, you can test it from the Azure console or in Postman using the function URL.

Now you can call the function URL from your Fiori application.
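A minimal sketch of the browser-side call, assuming a hypothetical function URL placeholder and a file obtained from a UI5 upload control; note that FileReader produces a data URL, so its prefix has to be stripped before sending:

```javascript
// Placeholder URL for the deployed Azure Function
const FUNCTION_URL = "https://<your-function-app>.azurewebsites.net/api/<function-name>";

// FileReader.readAsDataURL yields "data:image/png;base64,AAAA..." --
// the function only wants the part after the comma.
function stripDataUrlPrefix(dataUrl) {
  return dataUrl.substring(dataUrl.indexOf(",") + 1);
}

// Hypothetical handler wired to the upload control's change event
function onImageSelected(file) {
  const reader = new FileReader();
  reader.onload = () => {
    fetch(FUNCTION_URL, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ image: stripDataUrlPrefix(reader.result) })
    })
      .then((res) => res.json())
      .then((products) => console.log(products)); // e.g. bind to a JSON model
  };
  reader.readAsDataURL(file);
}
```

For a production setup, the call should go through the SAP Cloud Platform destination/connectivity service rather than directly from the browser, so the function key never reaches the client.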

The sample file can be downloaded from here.

Next Steps and Takeaways:

As next steps, you can create a Fiori application that allows users to upload images. On upload, the application calls this function, which returns the products. For these products, you can retrieve information from SAP Graph for the POC, or call an actual SAP system and perform operations. Various security factors between the two platforms need to be considered for a production environment.

Integrating the two platforms, Azure and SAP Cloud Platform, can bring value to many processes within SAP for customers.

Sample POC Video:

 

 

5 Comments
  • Hi Ishaan,

    Looks interesting to get started. It would be worth considering adding some components or details on how to restrict communication between SCP and Azure. A good practice would be to add API management with JWT, OAuth, and the like in Azure. Of course, access keys are sufficient for demos but not recommended for productive scenarios. In your case, for instance, you need a secure place to store the access key for the Azure Function to avoid putting it in the UI5 application code. Actually, the SCP Connectivity service would be your first line of defense, because you can put credentials there too in a reasonably secure place.

    So consider using the Connectivity service not only for calling other SAP services but also the Azure Function, or better, Azure API Management.

    Another line of thought would be message queuing for resiliency when the UI5 app sends a lot of requests to be processed by Azure Cognitive Services.

    Keep it up! It is a great starting point.

    KR

    Martin

    • Hi Martin,

       

      Agree, better security practices will be required for production use. The intention of this blog is to demonstrate how easy it is to use what is readily available on the Azure platform and how you can bring that benefit into SAP processes. Maybe I will try to cover Azure API Management in the next blog in this series.

      I am using the Connectivity service to connect to the Azure Function from the UI5 app. I missed that in the architecture diagram; it has been updated now.

      Thank you

      Ishaan

       

      • Agreed. In terms of API Management you could also use SAP API Management.

        The video is a good addition to the blog too. It would be even more powerful in my opinion if you could show the picture that has been analysed by Azure Cognitive Services. You extracted text from it, right?

        KR

        Martin