CRM and CX Blogs by SAP
Stay up-to-date on the latest developments and product news about intelligent customer experience and CRM technologies through blog posts from SAP experts.
On Tuesday 26th of November, we presented the latest Customer Data Cloud Webinar with a focus on how to activate your customer data to build trusted relationships. The presenters of the session were:

  • Ratul Shah – Senior Product Marketing Manager, SAP Customer Data Cloud

  • Stephen Purvis – Product Expert, SAP Customer Data Cloud

  • Julien Goulley – Technical Architect, SAP Customer Data Cloud


In the webinar, we discussed the importance of building a trusted relationship with customers while ensuring compliance with data protection rules, before going on to showcase how to implement synchronous and asynchronous integration in a working demo.

If you were not able to attend the live session, you can find a link to the recording here.

This blog post explains the steps required to build real-time, near real-time and batch integrations on top of the existing demo site used in the previous webinars in September. It covers:

  • How to ensure a high level of data quality by using Extensions

  • How to build a batch integration using Identity Sync

  • How to support near real-time integration processes with Webhooks


Extensions


Like client-side integration, Extensions allow real-time integration with 3rd party APIs. However, unlike client-side integration, Extensions manage the customisation centrally at the endpoint, allowing customers to reduce implementation costs and simplify both maintenance and the rollout of changes.

In the example below, we'll imagine we are working with a customer who is managing food catering for SAP employees. Although the customer website allows self-registration, only SAP employees should be allowed to register and log in to the website.

When building an Extension endpoint, you first need to understand the format of the request and the response.

The request body contains a JWS (JSON Web Signature) token composed of 3 parts separated by dots. The first part is the header, which includes the algorithm and key ID used to sign the token. The second part is the payload and includes:

  • the target API key

  • the extension point (register, login or update)

  • the data submitted

  • the current data stored in Customer Data Cloud (not applicable to registration)

  • the origin IP


The third and final part is the signature.
{
  "apiKey": "...",
  "parentApiKey": "...",
  "callID": "121176d53c3548d3b2da7aa46d7eab19",
  "extensionPoint": "OnBeforeAccountsRegister",
  "data": {
    "params": {
      "email": "xxxxxxxxxx@sap.com",
      "password": "password",
      "profile": {
        "firstName": "Xxxxx",
        "lastName": "Xxxxx"
      },
      "lang": "en",
      "regSource": "https://eatery.cfapps.eu10.hana.ondemand.com/",
      "preferences": {
        "terms": {
          "ToS_eatery": {
            "isConsentGranted": true
          }
        }
      }
    },
    "context": {
      "clientIP": "193.106.224.1"
    }
  }
}
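
For illustration, here is a minimal Python sketch of how the payload part can be extracted from such a compact token. It only decodes; verifying the signature (against the key from accounts.getJWTPublicKey) remains mandatory, as the PHP sample later in this post shows:

```python
import base64
import json

def decode_jws_payload(jws: str) -> dict:
    """Decode the payload (middle part) of a compact JWS token.

    This only decodes; it does NOT verify the signature, which must
    still be checked against the public key of the signing party.
    """
    header_b64, payload_b64, signature_b64 = jws.split(".")
    # base64url encoding omits padding, so pad back to a multiple of 4
    pad = "=" * (-len(payload_b64) % 4)
    return json.loads(base64.urlsafe_b64decode(payload_b64 + pad))
```

Once decoded, fields such as extensionPoint and data.params drive the business logic of the endpoint.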

The response body contains the details of the response:

  • Either a success, so the process can continue, as in the below example:
    {
      "status": "OK"
    }


  • Or an error, which stops the process, as in the below example:
    {
      "status": "FAIL",
      "data": {
        "validationErrors": [{
          "fieldName": "email",
          "message": "@sap.com emails only"
        }]
      }
    }



You can find the official documentation on the Developers site, but below is a sample PHP script which performs the following steps:

  • Verify the JWS token signature

  • Check the payload

  • Fail the request if the email is not a sap.com email


<?php
/** External libraries **/
// PHP SDK -> https://developers.gigya.com/display/GD/PHP
require_once('theme/php/GSSDK.php');
// RSA -> https://github.com/phpseclib/phpseclib/tree/master/phpseclib
include('Crypt/RSA.php');
include('Crypt/Math/BigInteger.php');

/** Extract request parameters **/
$message = file_get_contents('php://input');
$messageJSON = json_decode($message, true);
$jws = $messageJSON['jws'];

/** Read JWS (header.payload.signature) **/
$splitToken = explode(".", $jws);
$header = $splitToken[0];
$payload = $splitToken[1];
$tokenData = $header . '.' . $payload;
$signature = $splitToken[2];

/** Verify signature **/
// Retrieve exponent & modulus from accounts.getJWTPublicKey
$e = 'AQAB';
$n = '....';
$signature = sanitizeAndBase64Decode($signature);
$n = sanitizeAndBase64Decode($n);
$e = sanitizeAndBase64Decode($e);
$rsa = new Crypt_RSA();
$rsa->loadKey([
    "n" => new Math_BigInteger($n, 256),
    "e" => new Math_BigInteger($e, 256)
]);
$rsa->setHash("sha256");
$rsa->setSignatureMode(CRYPT_RSA_SIGNATURE_PKCS1);
$isValid = $rsa->verify($tokenData, $signature);

/** Execute business logic **/
// Default - Success
$response = array("status" => "OK");
if ($isValid) {
    $jwtPayload = sanitizeAndBase64Decode($payload);
    $jwtPayloadJSON = json_decode($jwtPayload, true);
    $extensionPoint = $jwtPayloadJSON['extensionPoint'];
    // Registration failure: reject any email outside the sap.com domain
    if ($extensionPoint === 'OnBeforeAccountsRegister') {
        if (stripos($jwtPayloadJSON['data']['params']['email'], "@sap.com") === false) {
            $registrationErrors = array();
            array_push($registrationErrors, array("fieldName" => "email", "message" => "@sap.com emails only"));
            $response = array("status" => "FAIL", "data" => array("validationErrors" => $registrationErrors));
        }
    }
    echo json_response($response, 200);
} else {
    $response = array("status" => "FAIL");
    echo json_response($response, 400);
}

/////////////////
/** FUNCTIONS **/
/////////////////
// Build HTTP response
function json_response($message = null, $code = 200)
{
    // Clear the old headers
    header_remove();
    // Set the actual status code
    http_response_code($code);
    // Treat this as JSON
    header('Content-Type: application/json; charset=utf-8');
    $status = array(
        200 => '200 OK',
        400 => '400 Bad Request',
        422 => '422 Unprocessable Entity',
        500 => '500 Internal Server Error'
    );
    // OK, validation error, or failure
    header('Status: ' . $status[$code]);
    // Return the encoded JSON
    return json_encode($message);
}
// Convert base64url to standard base64, then decode
function sanitizeAndBase64Decode($str)
{
    $str = str_replace(['-', '_'], ['+', '/'], $str);
    return base64_decode($str);
}
?>

Once the endpoint is up and running, the Extension is configured in Customer Data Cloud from the “Settings” area of the parent site. Within the “Extensions” section, click “Add”.







The Extension configuration form will be displayed and that's where the details of the endpoint are defined:

  • Pick a name

  • Select 'OnBeforeAccountsRegister'

  • Enter a quick description

  • Enter the endpoint URL

  • Define the timeout (max 5 seconds)

  • Define the fallback policy; if FailOnAnyError is selected, any timeout will fail the registration (or whichever API call was selected earlier)




From that point on, anyone trying to register with an email address outside the SAP organisation would receive an error message.



Identity Sync


Identity Sync is an ETL (Extract, Transform and Load) platform which allows orchestration and management of batch imports and exports. In Identity Sync, you can create dataflows, which are composed of blocks or components. The list of components breaks down into 4 categories:

  • Data source: read, write and delete

  • File: format/parse and compress

  • Field: transform

  • Record: custom field transform
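
These component categories come together in a dataflow definition, which Identity Sync stores as a JSON document of connected steps. The sketch below is illustrative only (the component names, parameters and field selections are assumptions based on common examples; check the IdentitySync component reference before use): a reader extracts accounts, a formatter produces a CSV file, and a writer pushes the file to SFTP.

```json
{
  "name": "Export verified accounts to SFTP",
  "steps": [
    {
      "id": "read-accounts",
      "type": "datasource.read.gigya.account",
      "params": {
        "select": "UID, profile.firstName, profile.lastName, profile.email",
        "where": "isRegistered = true and isVerified = true"
      },
      "next": ["format-csv"]
    },
    {
      "id": "format-csv",
      "type": "file.format.dsv",
      "params": { "fileName": "accounts.csv", "columnSeparator": "," },
      "next": ["write-sftp"]
    },
    {
      "id": "write-sftp",
      "type": "datasource.write.sftp",
      "params": {
        "host": "sftp.example.com",
        "username": "exporter",
        "remotePath": "/exports"
      }
    }
  ]
}
```

Each step names its successors in "next", which is how the visual editor described below links components together.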


To build a batch integration, you must first create a dataflow. New dataflows are usually created from predefined templates, but you can also duplicate an existing flow. First, navigate to the Dataflows section under Settings > IdentitySync and click Create Dataflow.





At that point, using a mix of clicks and drag and drop, you can remove components which are not needed (by clicking the red bin icon), add new components and link them to each other.





Once all components are created and connected, you can edit each component's configuration by clicking the edit button (pencil icon). Each client configures Customer Data Cloud differently, with their own consents, options and profile schema. When extracting the data, Identity Sync allows custom user selection. For example, only fully registered and verified users who accepted the application terms of service and privacy policy need to be transferred to 3rd party applications. The screenshot below expands the first component of the above dataflow and shows how easily you can define the list of fields to extract, as well as the selection criteria.



For exports, the final step of the flow is a 'writer', which is used to push the data out of Customer Data Cloud. The data can be transferred to a repository (SFTP, S3 etc.) or pushed directly to the external systems using standard connectors or the generic API component.

Once the dataflow is complete, it can be scheduled to run once or on a regular basis. During scheduling, you have the option to:

  • define the frequency

  • define the execution type (Delta vs. Full)

  • define the log level (Debug, Info, Trace)

  • define the distribution list for the success and failure notifications


Schedules can be added or edited by navigating to the Scheduler menu as follows.





Finally, you have the option to monitor previous runs of your dataflow by navigating to the Status menu. The logs are categorised under 3 sections:

  • The Trace section shows the logged statements

  • The Step metrics section shows the duration and record counts for each dataflow step

  • The Errors section shows details of the records that failed



Webhooks


Some business processes require user profiles to be provisioned in 3rd party systems as quickly as possible after the user is created in Customer Data Cloud. A typical scenario is that a consumer registers online to report an incident with their product and calls the support centre immediately afterwards. At this point, the support engineer should be able to find the consumer in the appropriate support system. Webhooks are a near-real-time, event-based mechanism that sends notifications to an external endpoint. This endpoint can then process the notifications and synchronise the data accordingly.

Supported events can be found on the Developers site.

The first step when setting up Webhooks is to build the endpoint. Any technology can be used to host the endpoint, as long as it can:

  • Support HTTPS requests

  • Verify the HMAC-SHA1 hash

  • Queue events for asynchronous processing

  • Make requests to Customer Data Cloud API (accounts.setAccountInfo)

  • Make requests to 3rd party APIs
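
As a minimal sketch of the hash-verification step above (in Python for illustration; the exact header that carries the signature and the encoding of your secret may vary, so treat the names here as assumptions and check the documentation):

```python
import base64
import hashlib
import hmac

def verify_webhook_signature(raw_body: bytes, signature_b64: str, secret_b64: str) -> bool:
    """Verify a webhook notification body against its HMAC-SHA1 signature.

    Assumes the signing secret is delivered base64-encoded and the
    signature arrives base64-encoded alongside the request.
    """
    key = base64.b64decode(secret_b64)
    expected = base64.b64encode(hmac.new(key, raw_body, hashlib.sha1).digest()).decode()
    # Constant-time comparison to avoid timing attacks
    return hmac.compare_digest(expected, signature_b64)
```

Notifications that fail verification should be rejected before any events are queued for processing.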


For the demo, I hosted my endpoint in SAP Cloud Platform Integration. Below is a screenshot of the iFlow.



Once the endpoint is up and running, the Webhook is configured in Customer Data Cloud from the “Settings” area of the parent site. Within the “Webhooks” section, click the “Create Webhook” button. At that point, you can complete the configuration:

  • Define the endpoint URL

  • Define the signing secret (partner secret or application secret)

  • Select the triggering events

  • Define custom headers (optional)






Once the configuration is saved, notifications start flowing to the endpoint. The below screenshot shows an example of payload and hash.


Conclusion


This concludes our blog post, which discussed how we can use the SAP Customer Data Cloud platform to build a trusted relationship with customers while ensuring compliance with data protection rules. It showcased the tools available to build secure integrations and how easily they can be used.

To learn more about SAP Customer Data Cloud, please sign up for our next webinar.
3 Comments
Hi Julien,

 

thank you for this detailed blog on CDC, Dataflows, Extensions etc. Really interesting.

I have a question about updating some information in Account, like subscriptions, some fields in the Data field etc.

What should the dataflow look like?

Is the gigya.importaccount writer useful for this update?

Thanks
Hi Yoro,

gigya.account and gigya.importaccount can be used to update existing full accounts. We typically decide which component to use based on requirements. The main things to consider would be:

  • What actions are needed?

    • Create -> gigya.importaccount

    • Update -> gigya.account or gigya.importaccount

    • ...



  • Which data are you synchronising?

    • Password hash -> gigya.importaccount

    • Subscription/Preferences: If the date is set in the past then gigya.importaccount otherwise gigya.account

    • ...



  • Do you need to trigger Webhooks notifications?

    • Yes -> gigya.account

    • No -> gigya.importaccount or gigya.account (with muteWebhooks = true)



  • ...


Here is a link to a couple of sample flows.

Thanks,

Julien

Thanks for sharing, nice blog @juliengoulley. Covers most of the details.