Integrating SAP Signavio Process Intelligence and SAP Data Intelligence Cloud: a concrete example with the new Ingestion API
Disclaimer: the paragraph “SAP Data Intelligence pipeline” was updated on September 29th, 2022.
Welcome to the third episode of the series: Integrating SAP Signavio Process Intelligence and SAP Data Intelligence.
In case you missed the two previous blog posts:
- In Integrating SAP Signavio Process Intelligence and SAP Data Intelligence Cloud: process mining streamlined across complex data, Silvio Arcangeli explains how leveraging an enterprise-grade data integration and data orchestration solution (SAP Data Intelligence Cloud) in conjunction with a best-in-class process mining engine (SAP Signavio Process Intelligence) yields very powerful synergies.
- In Integrating SAP Signavio Process Intelligence and SAP Data Intelligence Cloud: a concrete example and step-by-step guide, Silvio showed a concrete example of a preliminary approach to combining the two products, with a step-by-step guide to set up an SAP Data Intelligence pipeline that fed data into SAP Signavio Process Intelligence through a staging area implemented in an Amazon S3 bucket.
With the enhancements released in the product in June 2022 and the general availability of the Ingestion API (you can find the link to the documentation in the Appendix), we can now overcome that preliminary approach and move to direct connectivity between the products.
In this third post, I’ll therefore describe a concrete example that shows how you can use an SAP Data Intelligence pipeline to feed data directly into SAP Signavio Process Intelligence via the Ingestion API, with no need for a staging area.
The Ingestion API allows you to create an external data pipeline and ingest data into SAP Signavio Process Intelligence via an API call authenticated with a token. This eases the integration with SAP Data Intelligence Cloud, where a data pipeline can end with an operator that pushes the data into SAP Signavio Process Intelligence.
The example implemented in this blog shows how to read customer survey data from a Qualtrics system and then feed it into SAP Signavio Process Intelligence.
What is interesting is that this approach can be applied to any of the numerous sources supported by SAP Data Intelligence Cloud, leveraging its data processing and transformation capabilities, and indeed to any kind of SAP or non-SAP data pipeline external to SAP Signavio Process Intelligence.
Configuring an SAP Data Intelligence Cloud pipeline to move data from Qualtrics to SAP Signavio Process Intelligence via Ingestion API
Let’s first introduce some context.
Some of you already know about journey to process analytics, an innovative process management practice that connects experience data and business operations data, aiming to understand, improve, and transform your customer, supplier, and employee experience. If you haven’t heard of it yet, I suggest reading this 5-minute blog post by Aida Centelles Ahicart.
Let’s now imagine that we want to analyze an Incident-to-Resolution process. From a traditional process mining perspective, the approach is quite straightforward: we configure a connection to ServiceNow, extract the data, create an event log, and apply process mining techniques on it.
But what if we could collect customer feedback, at the end of their journey, in a Qualtrics survey?
This experience data could be combined with the operational data to bring the customer experience perspective into play, enriching the data model and leading to a broad variety of new and valuable insights.
However, currently there is no native connector to Qualtrics in SAP Signavio Process Intelligence. Hence, we can create a data pipeline in SAP Data Intelligence to extract the survey data from Qualtrics and push it to SAP Signavio Process Intelligence.
Let’s see how we can quickly set this up.
First of all, let’s create a new Ingestion API data source in SAP Signavio Process Intelligence.
You can either create a data source or an integration. Either way, the other one is then automatically created with the same title and the same type (Ingestion API), and the two are linked to each other.
The data source provides two parameters:
- The API endpoint is the target URL to make a request to ingest external data into SAP Signavio Process Intelligence
- The Token is the authentication token required to push data into this specific data source/integration pair
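To get a feel for how these two parameters are used, here is a minimal Python sketch of a push request, assuming Bearer-token authentication. The JSON body with `table`/`rows` fields is purely illustrative and not the official contract; the exact request format (for example a CSV file upload and its field names) is defined in the Ingestion API documentation linked at the end of this post.

```python
import json
import urllib.request

# Placeholders for the two parameters provided by the Ingestion API data source:
API_ENDPOINT = "https://example.invalid/ingestion"  # replace with your API endpoint
TOKEN = "your-ingestion-api-token"                  # replace with your token

def build_headers(token):
    """Token-based authentication header (Bearer scheme assumed)."""
    return {"Authorization": f"Bearer {token}"}

def build_push_request(rows, table_name):
    """Build a POST request carrying a batch of rows.

    The payload shape below is illustrative only -- consult the Ingestion API
    documentation for the exact request format and field names.
    """
    body = json.dumps({"table": table_name, "rows": rows}).encode("utf-8")
    return urllib.request.Request(
        API_ENDPOINT,
        data=body,
        headers={**build_headers(TOKEN), "Content-Type": "application/json"},
        method="POST",
    )

# To actually send the request: urllib.request.urlopen(build_push_request(...))
```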
Second, let’s connect to Qualtrics to retrieve the information we need.
Log in to your Qualtrics system, go to Account Settings, and then to Qualtrics IDs.
Here you can retrieve the following parameters:
- Datacenter ID
- API authentication parameters
- User ID
- Survey ID
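With these parameters, survey responses can be retrieved programmatically. As a minimal sketch of that flow using only Python's standard library, based on the public Qualtrics v3 export-responses API (start an export, poll until it is ready, download the zipped CSV); the endpoint paths and field names should be verified against the Qualtrics API documentation:

```python
import io
import json
import time
import urllib.request
import zipfile

# Placeholders for the parameters from Account Settings > Qualtrics IDs:
DATACENTER_ID = "your-datacenter-id"
API_TOKEN = "your-api-token"
SURVEY_ID = "your-survey-id"

def export_url(datacenter, survey):
    """Base URL of the Qualtrics v3 response-export API."""
    return f"https://{datacenter}.qualtrics.com/API/v3/surveys/{survey}/export-responses"

def _call(url, payload=None):
    """Issue an authenticated GET (payload=None) or POST and decode the JSON answer."""
    data = json.dumps(payload).encode("utf-8") if payload is not None else None
    req = urllib.request.Request(url, data=data, headers={
        "X-API-TOKEN": API_TOKEN, "Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def download_survey_csv():
    """Start a CSV export, poll until complete, and return the CSV bytes."""
    base = export_url(DATACENTER_ID, SURVEY_ID)
    progress_id = _call(base, {"format": "csv"})["result"]["progressId"]
    while True:
        status = _call(f"{base}/{progress_id}")["result"]
        if status["status"] == "complete":
            break
        time.sleep(1)
    file_req = urllib.request.Request(f"{base}/{status['fileId']}/file",
                                      headers={"X-API-TOKEN": API_TOKEN})
    with urllib.request.urlopen(file_req) as resp:
        archive = zipfile.ZipFile(io.BytesIO(resp.read()))
        return archive.read(archive.namelist()[0])  # the CSV inside the zip
```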
Last, let’s create a data pipeline in SAP Data Intelligence Cloud.
In this GitHub repository you can find a ready-to-use SAP Data Intelligence graph, with a connector to Qualtrics and a connector to SAP Signavio Process Intelligence.
Download the repository content and archive the content of each of the three folders into its own zip file, so that they can be imported as solutions into your SAP Data Intelligence tenant. Do not archive the folders themselves: for each folder, select its content and add that content to the archive.
To import a solution in SAP Data Intelligence, go to System Management, open Files, click the “+” symbol, select “Import Solution”, and choose your zip file.
In SAP Data Intelligence, you can create an OPENAPI connection to Qualtrics
- Host: qualtrics.com
- Authentication Type: Basic
- Username: the Qualtrics data center ID (can be found under your Qualtrics User, Account Settings, Qualtrics IDs)
- Password: the Qualtrics token (can be found under your Qualtrics User, Account Settings, Qualtrics IDs)
In SAP Data Intelligence, you can create an OPENAPI connection to SAP Signavio
- Host: spi-etl-ingestion.eu-prod-cloud-os-eu.suite-saas-prod.signav.io (note: the API endpoint url of Ingestion API will change soon)
- Authentication Type: Basic
- Username: not relevant; you can type “signavio”
- Password: the Ingestion API token (can be found in the Ingestion API data source)
In SAP Data Intelligence, you can open the Modeler and add the imported graph (“Qualtrics to SAP Signavio”).
Configure the graph:
- In the Qualtrics operator, add (1) the connection, and (2) the Qualtrics Survey ID
- In the SAP Signavio operator, add (1) the connection, and (2) the name of the target table to be pushed in SAP Signavio Process Intelligence
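The two Python operators in the imported graph already implement the extraction and the push. As an illustration of the kind of logic inside the second operator, namely turning the extracted survey rows into the CSV table that is pushed under the target table name, here is a minimal, self-contained sketch (the operator wiring via the SAP Data Intelligence Python operator API is omitted):

```python
import csv
import io

def rows_to_csv(rows):
    """Serialize a batch of survey rows (a list of dicts) into CSV text,
    the tabular format sent on to SAP Signavio Process Intelligence."""
    if not rows:
        return ""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()
```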
You can now save and run the pipeline.
Once its status is “completed”, you can go to the Ingestion API integration in SAP Signavio Process Intelligence and check the logs.
A new execution log will appear with status “Running”. After a few seconds the status will change to “Done”, and in the Tables tab you can verify that a new table has been uploaded, with the name provided in the second Python operator of the SAP Data Intelligence pipeline.
The data is now ready to be used for process mining!
Additional step: Add Experience Data to your Data Model!
As we stated above, the whole idea started with the business requirement to add Qualtrics data to the Incident-to-Resolution process, to enrich operational data with the customer experience perspective.
To achieve that, you can create a new data model selecting ServiceNow as source system.
Select the available Incident-to-Resolution transformation template, as an accelerator to transform raw data into an event log with a series of preconfigured SQL scripts.
Select New Integration and connect to an existing ServiceNow data source.
With the new Process Data Management (ETL 2.0) interface which was released in June 2022 you can now see the whole data pipeline, spanning connection, integration, and transformation. You can add the target Process Intelligence process.
Click on “+ Data source and integration” and select your Ingestion API data source. It will be added to the data pipeline.
Now you can start changing the existing SQL scripts to combine operational data with experience data coming from Qualtrics. For testing purposes, you can edit the Transformation block by adding a new “Qualtrics” business object.
At this point you can add a new event collector with a simple SQL script to preview the survey data that has been extracted from Qualtrics and pushed to the Ingestion API integration through the SAP Data Intelligence data pipeline.
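Inside SAP Signavio Process Intelligence this combination is done with SQL scripts; conceptually, it is a left join of the survey feedback onto the incident records on a shared key. Here is a small Python sketch of that idea, with hypothetical column names:

```python
def enrich_incidents(incidents, surveys, key="incident_id"):
    """Left-join survey feedback onto incident records on a shared key.

    'incidents' and 'surveys' are lists of dicts; incidents without a
    matching survey are kept unchanged. Column names are hypothetical.
    """
    survey_by_key = {s[key]: s for s in surveys}
    return [{**inc, **survey_by_key.get(inc[key], {})} for inc in incidents]
```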
Qualtrics survey data is now ready to be combined with ServiceNow data to create new valuable insights!
To summarize, in this blog post I have covered:
- How to create and use an Ingestion API data source to push external data into SAP Signavio Process Intelligence
- Which information you need to gather from your Qualtrics system in order to extract data from a survey, and how you can retrieve it
- How to connect SAP Data Intelligence Cloud to Qualtrics and then create a data pipeline with two operators that respectively:
  - Connect to Qualtrics and extract survey data
  - Push the data to SAP Signavio Process Intelligence via the Ingestion API
If you’re interested in learning more about the Ingestion API, here are some additional resources you can check:
- Documentation on the Ingestion API connector:
- How to upload data using the Ingestion API:
- How to set up data ingestion:
- How to manage the ingestion API access token:
- How to use Ingestion API with code examples
In this GitHub repository you can find the data pipeline and the two Python operators to be downloaded and imported into your SAP Data Intelligence tenant.