AppGyver UI Extension with SAP AI Core
Building a UI extension has never been easier, thanks to AppGyver, which empowers citizen developers to design, build, and deploy business-critical applications productively and in very little time. This section will not explain the fundamentals of AppGyver; instead, we will see how to build a UI extension that consumes an SAP AI Core service and brings intelligence to the device itself.
What will you learn?
By the end of this tutorial, you will be able to build a mobile application that consumes an SAP AI Core service API to run inference in the cloud using a pre-built image classification model. The approach used to consume the AI Core service API can be extended to other SAP Business Technology Platform APIs that require the OAuth 2.0 authentication mechanism.
- You should have subscribed to the following service to get started with AppGyver.
- SAP Process Automation – Service Details
- Basic understanding of AppGyver solution and tools used for development – Developer Guide
- Access to the AppGyver Previewer App on a smartphone or tablet: iOS / Android
- Train and Deploy ML model using AI Core – Blog
It’s time to see things working in action. Let’s go step by step to build this UI extension.
- Create AppGyver Application
- Define Data source
- Design UI and declare page variables
- Design UI using AppGyver composer tool
- Test the application.
Create AppGyver Application – First, we will create an application using the AppGyver option from the dropdown and give it a name of your choice.
Define Data Source –
To define a data source, which in our case is the AI Core service API, navigate to the Data section from the top navigation bar of the AppGyver web tool. The AI Core service API uses the OAuth 2.0 authentication mechanism, which requires calling an additional API that returns a Bearer token.
Before we move further, let’s get the API details and credentials handy.
- AI Core Service API endpoint
- Authentication URL
- Client ID
- Client Secret
- Convert the above credentials to a base64-encoded string (Client ID:Client Secret)
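For reference, the base64 encoding in the last step can be done in a few lines of Python. The credentials below are placeholders; use the values from your AI Core service key:

```python
import base64

def basic_auth_header(client_id: str, client_secret: str) -> str:
    """Encode "client_id:client_secret" as base64 for the Authorization header."""
    raw = f"{client_id}:{client_secret}".encode("utf-8")
    return "Basic " + base64.b64encode(raw).decode("ascii")

# Placeholder credentials -- substitute your own.
print(basic_auth_header("sb-example-client-id", "example-secret"))
```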
First, we will create a data entity that calls the authentication API and persists the “access_token”, which will later be passed with the actual AI Core service API call.
Select the REST API direct integration option to create the first data entity.
Enter the Authentication URL as the base URL. This is the same URL we copied earlier.
This will be a POST call, so we have to configure headers and a query parameter within the Create Record (POST) section. We will add Authorization in the header section and grant_type in the query parameter section.
Now it’s time to test the authentication API and set the schema from the response. A successful call returns the access_token. Setting the schema from the response creates local variables based on the actual response structure.
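Outside AppGyver, the same token call can be sketched in Python. The authentication URL below is a placeholder (the real one comes from your service key); the grant_type query parameter and Basic Authorization header mirror what the data entity sends in the standard OAuth 2.0 client-credentials flow:

```python
import base64
import json
import urllib.parse
import urllib.request

def build_token_request(auth_url: str, client_id: str, client_secret: str):
    """Assemble URL and headers for the client-credentials token call,
    mirroring the AppGyver data entity configuration."""
    basic = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    url = auth_url + "?" + urllib.parse.urlencode({"grant_type": "client_credentials"})
    headers = {"Authorization": f"Basic {basic}"}
    return url, headers

def fetch_access_token(auth_url: str, client_id: str, client_secret: str) -> str:
    """POST to the auth endpoint and return the access_token field."""
    url, headers = build_token_request(auth_url, client_id, client_secret)
    req = urllib.request.Request(url, method="POST", headers=headers)
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["access_token"]
```

Calling `fetch_access_token` with your real authentication URL and credentials returns the same token that AppGyver persists.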
We are done creating our first data entity, which fetches the access token. It’s time to configure the next data entity, the AI Core service itself. This will again be a POST call, and we will follow the same steps to create the entity.
Scroll down to the header section, where we will add the required headers and set the respective Is Static flag for each.
Note – For Authorization, make sure you disable the Is Static flag, as it will be assigned a value at runtime.
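The headers for this second entity can be sketched as follows. The AI-Resource-Group header and the base64-image payload shape are assumptions (the exact body is model-specific); the Bearer token is the one runtime value, which is why its Is Static flag stays disabled:

```python
import base64
import json
import urllib.request

def build_inference_headers(token: str, resource_group: str = "default") -> dict:
    """Headers for the AI Core deployment call. Authorization changes with
    every fetched token, so it cannot be static in AppGyver."""
    return {
        "Authorization": f"Bearer {token}",
        "AI-Resource-Group": resource_group,  # assumption: your deployment's resource group
        "Content-Type": "application/json",
    }

def classify_image(service_url: str, token: str, image_bytes: bytes) -> dict:
    """POST an image to the deployed model. A base64-encoded image field is a
    common convention and only an assumption here."""
    body = json.dumps({"image": base64.b64encode(image_bytes).decode()}).encode()
    req = urllib.request.Request(
        service_url, data=body, method="POST",
        headers=build_inference_headers(token),
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```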
Design UI using AppGyver Composer tool –
As the next step, it’s time to design the UI using the UI Canvas option within the AppGyver tool. Drag the UI components highlighted in the following screenshot onto the canvas.
We also need to define a few page variables, which will be needed in the next step to bind to the UI components. The page variables hold the information relevant to the page we have created.
Note – The imagePath variable will have the type image URL.
We will assign our logic to the tap of the “Capture Image” button. Select the “Capture Image” button and navigate to the logic console, where we will build the actual logic using different controls.
Please refer to the following video, which shows the logic control flow that has been defined.
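In plain terms, the tap logic chains four steps: capture a photo, fetch a fresh token, call the inference API with it, and bind the result to the page variables. A hypothetical Python outline of that flow, where every callable is a stand-in for one AppGyver logic node and the result field name is an assumption:

```python
def on_capture_image_tapped(capture_photo, fetch_access_token, classify_image, page_vars):
    """Stand-in for the logic flow wired to the button's tap event.
    Each argument is a callable representing one logic node."""
    image_bytes, image_path = capture_photo()           # device camera node
    token = fetch_access_token()                        # authentication data entity
    result = classify_image(token, image_bytes)         # AI Core data entity
    page_vars["imagePath"] = image_path                 # bind the photo preview
    page_vars["prediction"] = result.get("prediction")  # hypothetical response field
    return page_vars
```

Running it with stubbed nodes shows how the page variables end up populated after a tap.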
Finally, we have all the building blocks of our application in place, and it’s time to test it. The following video demonstrates the application. We will use it to capture a photo of a spare part and send it for inference to get the part’s details.
Extending Application with SAP Ariba –
In this tutorial, we have not covered the Ariba integration, but we won’t disappoint you. Please check out the Aribot demo. With the help of this application, the user can capture a photo of a spare part, get the result using AI Core, and then select a quantity, which is sent as a purchase requisition to SAP Ariba.
Let’s not restrict ourselves to this integration and use case; you can follow the same process with other intelligent services provided by BTP and come up with business applications that solve your day-to-day problems without huge development effort. If you liked this blog series, please let us know which BTP service you would like to explore with us next.
Thank you !!
Vriddhi Shetty Thank you for helping with AI Core Integration.
Sanraj Mitra Thank you for building the ML model and realizing this E2E integration.