Principal Propagation in SAP Integration Suite from external system to an SAP Backend
Note: This post is part of a series. For a complete overview, visit Principal Propagation in SAP Integration Suite.
As explained on the SAP Cloud Integration help page, you can set up Principal Propagation, that is, forwarding the identity of a user across several systems.
With the migration from SAP Cloud Integration in the Neo environment to SAP Integration Suite on hyperscalers, the question arises whether this can still be set up. The answer is yes, but with some caveats.
It has to do with how the sender logs in when making an inbound connection to SAP Cloud Integration. Several methods are available for this (basic authentication, OAuth2, client certificates). The scenario explained above needs the Cloud Connector, which requires a user created in the Identity Provider trusted by the SAP Cloud Integration account. Regardless of the environment, if you use basic authentication in Cloud Integration, you have a user in the Identity Provider, so the scenario will work without any further considerations. See how to set up basic authentication with a user instead of service keys in Cloud Foundry in the blog posts Basic Authentication for Cloud Integration in Cloud Foundry Environment and Enable SAP IAS as Custom IdP for Basic Inbound Authentication in Cloud Foundry Environment. How to propagate the user to the backend is described below in step 5 (Configure the Cloud Connector).
However, if you want to use OAuth2 or client certificates for inbound authentication, there are some differences. When using OAuth2 in Neo, you register a client that is assigned to the Identity Provider, so once logged in you have a user with the client id as username. In the SAP BTP Cloud Foundry environment, however, you use a service key, which does not correspond to a user in the Identity Provider. To solve this, you can use the “password” grant, one of the grant types supported by the OAuth2 specification, during authentication. This way you assign an identity to the service key, which can then be propagated.
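For illustration, here is a minimal sketch of what such a password grant token request could look like. All values (token URL, client id, secret, user) are placeholders, not values from this scenario; in a real setup they come from your Process Integration Runtime service key and your trusted Identity Provider.

```javascript
// Hypothetical XSUAA token endpoint and credentials – replace with the
// values from your own Process Integration Runtime service key.
const tokenUrl = "https://<subaccount>.authentication.<region>.hana.ondemand.com/oauth/token";
const clientId = "<clientid from the service key>";
const clientSecret = "<clientsecret from the service key>";
const user = "iflow.sender@example.com"; // must exist in the trusted Identity Provider
const password = "<user password>";

// The password grant sends the user's credentials in the form body,
// while the client authenticates via HTTP Basic authentication.
const body = new URLSearchParams({
  grant_type: "password",
  username: user,
  password: password,
}).toString();

const authHeader =
  "Basic " + Buffer.from(`${clientId}:${clientSecret}`).toString("base64");

// With Node 18+ the actual call against tokenUrl would be:
// const res = await fetch(tokenUrl, {
//   method: "POST",
//   headers: { Authorization: authHeader, "Content-Type": "application/x-www-form-urlencoded" },
//   body,
// });
// const { access_token } = await res.json();
console.log(body);
```

The returned access token then carries the user's identity and can be used to call the integration flow.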
Using client certificates, the best practice in Neo was to do a certificate-to-user mapping, to avoid scenario downtimes when renewing certificates. With the certificate-to-user mapping you again had an identity. In Cloud Foundry you assign the client certificates to service keys and, as explained in the previous paragraph, service keys are not assigned to users in the Identity Provider. But there is also a workaround for this, which is a bit trickier. It uses SAP API Management to initiate an OAuth2SAMLBearer grant flow: SAP API Management, acting as a trusted issuer, generates a SAML assertion and exchanges it with SAP BTP’s XSUAA for a JWT token that can then be presented to the integration flow. I will try to explain it as simply as possible in this blog post. This approach also works if your sender uses OAuth2 with grant type Client Credentials for authentication; you only have to adjust the SAP API Management policies to extract the user from the client certificate or from the client credentials.
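To make the exchange more tangible, here is a hedged sketch of the token request that SAP API Management performs against XSUAA under the hood: the signed SAML assertion is posted base64-encoded together with the SAML 2.0 bearer grant type. The endpoint URL and the assertion below are placeholders.

```javascript
// Placeholder token endpoint – the real one is the AssertionConsumerService
// Location from your subaccount's SAML metadata.
const xsuaaTokenUrl = "https://<subaccount>.authentication.<region>.hana.ondemand.com/oauth/token/alias/<alias>";
const signedSamlAssertion = "<signed SAML assertion XML>"; // produced and signed by SAP API Management

// The assertion travels base64-encoded with the SAML 2.0 bearer grant type.
const exchangeBody = new URLSearchParams({
  grant_type: "urn:ietf:params:oauth:grant-type:saml2-bearer",
  assertion: Buffer.from(signedSamlAssertion).toString("base64"),
}).toString();

// The client id/secret of the Process Integration Runtime service key go
// into HTTP Basic authentication, as in any other XSUAA token request;
// the JSON response contains the JWT in the access_token field.
console.log(exchangeBody.slice(0, 60));
```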
Let’s see the scenario for Client Certificate Authentication:
The configuration steps are:
- Setup SAP API Management as a trusted Identity Provider
- Create your integration flow in SAP Cloud Integration
- Enable two-way SSL in SAP API Management
- Create the API Proxy in SAP API Management
- Configure the Cloud Connector
- Configure the Backend
1) Setup SAP API Management as a trusted Identity Provider
In this step you will need to trust a new custom Identity Provider in your SAP BTP subaccount, so that SAP API Management has a trust relationship with XSUAA, enabling the SAML2Bearer grant flow.
In this blog post you can see how to generate the certificates for signing the SAML assertions (see the chapters “Generate Certificate for Signing SAML Assertion” and “Generate JAR containing Certificate”) and how to obtain the SAML Metadata for the new Identity Provider (see the chapter “Generate SAML IdP Metadata”). After generating the certificate, you should have the following files (certificate, key and keystore):
As explained in the blog post, you can use a tool like https://www.samltool.com/idp_metadata.php to generate the metadata for your new custom Identity Provider. Provide an EntityId (apim.proxy.idp in the example), an HTTP redirect URL (not relevant here) and, as SP X.509 cert, paste the content of the cert.pem generated in the previous step.
The next step is trusting the new Identity Provider in your subaccount. Go to Security–>Trust Configuration and create a New Trust Configuration.
Select the Metadata XML file generated in the previous step and provide a name for the Identity Provider (apim.proxy.idp in the example). You will probably want to disable the flag Available for User Logon, so that you keep using the default Identity Provider to access the Cloud Integration WebUI. The flag Create Shadow Users on User Logon automatically creates a user in this Identity Provider when a request arrives; otherwise, the user must exist in this Identity Provider before the call is made, or you will receive a 401 Unauthorized error.
The next step is to create or reuse a role collection with the MessagingSend role assigned to it, to allow the generated JWT token to execute the integration flows. If you have configured your integration flows to use a different role, that role has to be assigned to the role collection instead.
In the new custom Identity Provider, map the attribute group given when generating the SAML assertion to the above-mentioned role collection. In this example, the attribute group is called “it-rt-….ESBMessaging.send” and will be configured in the API policy in step 4 – Create the API Proxy in API Management–>samlHelper.js–>AttributeValue.
As said before, if you do not allow shadow users, the needed user must be created manually for the new Identity Provider in the SAP BTP cockpit. The user does not need any role, as the role is assigned while generating the SAML assertion. In this example I will use the common name of the client certificate as username.
2) Create your integration flow in SAP Cloud Integration
Here you create an integration flow whose receiver adapter uses Principal Propagation as authentication method. We do not need to go into detail, as this does not differ from any other scenario.
3) Enable two-way SSL in API Management
As we do not call the integration flow directly, but through SAP API Management, we need to configure SAP API Management accordingly.
To enable inbound client certificate authentication in SAP API Management you must request a two-way SSL certificate. See Enable Client Certificate Authentication on API Proxies to learn how the process works.
By default, the virtual host of your API Management is secured with a one-way SSL certificate, which means the sender does not present a client certificate. If a two-way SSL certificate is used, all API proxies called through this virtual host will require a client certificate; otherwise they will no longer work. A best practice is to request an additional virtual host and use it only for those API proxies requiring client certificates. This way, all the remaining API proxies keep working as before.
4) Create the API Proxy in SAP API Management
Create a keystore in API Management with the jar file created in step 1 (Setup API Management as a trusted Identity Provider–>Generate certificate for signing SAML assertions) to sign the SAML assertion to be exchanged with BTP’s XSUAA.
Create the API proxy pointing to your integration flow. It is important not to use an API Provider with credentials defined; otherwise those credentials are used to access Cloud Integration instead of the generated token. Use a URL as provider instead.
Search for the policy template “Connect to SAP Cloud Foundry services”–>“SAP Cloud Foundry SAML2OAuth Flow” on SAP API Business Hub and apply it to your API proxy.
In the samlHelper.js in the policies enter the following info:
- Variable sapapim.issuer: origin key of the custom identity provider created in step 1 (Setup API Management as a trusted Identity Provider–>New Trust Configuration)
- Variable sapapim.audience: download SAML Metadata of your subaccount from BTP Cockpit–>Trust Configuration and take entityId
- Variable sapapim.recipient: also from SAML Metadata, take the AssertionConsumerService Location (…/oauth/token/alias…)
- Variable sapapim.username: context.getVariable(“client.cn”) to get the common name of the client certificate making the request
- Variable sapapim.storename: store name used to sign the SAML assertions to be exchanged with XSUAA
- Variable sapapim.keyname: name of the key used to sign the SAML assertions
- Variable sapapim.clientId: client id of the Process Integration Runtime (integration-flow plan) service key with the ESBMessaging.send role assigned
- Variable sapapim.secret: client secret of the Process Integration Runtime (integration-flow plan) service key with the ESBMessaging.send role assigned
- Attribute Groups: attribute Groups mapped to the role collection used in step 1 for sending messages to Cloud Integration (ApplicationIdentifier.RoleName)
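Tying the list above together, the assignments in samlHelper.js could conceptually look like the sketch below. This is an illustration, not the actual template code from SAP API Business Hub: the `context` object provided by the API Management runtime is mocked here so the snippet runs standalone, and all values are placeholders.

```javascript
// Mock of the API Management `context` object (the runtime normally
// provides it); holds flow variables in a plain map.
const context = (() => {
  const vars = { "client.cn": "mycert.example.com" }; // mocked client certificate CN
  return {
    getVariable: (name) => vars[name],
    setVariable: (name, value) => { vars[name] = value; },
  };
})();

// Placeholder values mirroring the variable list above.
context.setVariable("sapapim.issuer", "apim.proxy.idp");                    // origin key of the custom IdP
context.setVariable("sapapim.audience", "<entityId from subaccount SAML metadata>");
context.setVariable("sapapim.recipient", "<AssertionConsumerService Location>");
context.setVariable("sapapim.username", context.getVariable("client.cn")); // CN of the client certificate
context.setVariable("sapapim.storename", "<keystore name>");                // keystore created in API Management
context.setVariable("sapapim.keyname", "<key name>");

console.log(context.getVariable("sapapim.username"));
```

If your sender uses Client Credentials instead of a client certificate, the `sapapim.username` line is where you would extract the user from the client id rather than from `client.cn`.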
The Process Integration Runtime instance for the service key used above must have the following configuration:
In the policy getOAuthToken, enter as HTTPTargetConnection the value of the AssertionConsumerService Location found in the file downloaded from BTP Cockpit–>Trust Configuration–>SAML Metadata.
Once you save and deploy the API proxy you can debug it and obtain the generated JWT token.
If you decode the token, you can see the extracted user (the common name of the certificate – e.g. ####.hana.ondemand.com) and the group/role assigned to it, among other information.
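The decoding step can be sketched as follows. The token here is constructed locally for illustration (a real XSUAA token is signed and carries many more claims); the user name and group values are examples.

```javascript
// Example payload resembling what XSUAA returns; values are illustrative.
const payload = {
  user_name: "mycert.example.com", // CN extracted from the client certificate
  "xs.system.attributes": { "xs.saml.groups": ["<attribute group>"] }, // mapped group
};

// JWTs are three base64url-encoded parts joined by dots:
// header.payload.signature (signature left empty in this local example).
const b64url = (obj) => Buffer.from(JSON.stringify(obj)).toString("base64url");
const token = [b64url({ alg: "RS256", typ: "JWT" }), b64url(payload), ""].join(".");

// Decoding works the same way for a real token: take the middle part.
const decoded = JSON.parse(
  Buffer.from(token.split(".")[1], "base64url").toString("utf8")
);
console.log(decoded.user_name);
```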
5) Configure the Cloud Connector
Next, you must configure the Cloud Connector to create temporary certificates for the propagated user to connect to the backend system. You can see the whole configuration in the guide Principal Propagation in an HTTPS Scenario. In a nutshell, these are the steps:
- Configure system certificate and a local CA certificate for signing temporary certificates
- Define how the temporary certificates for principal propagation are created. In this example, the username is used as CN
- Generate sample temporary certificate used to create a rule in the backend system
- Trust your subaccount
- Trust your backend system. The backend system certificate is obtained in step 6 below (Configure the Backend), in transaction STRUST
- Configure Access Control with principal type x.509 certificate
6) Configure the Backend
HTTPS communication must be enabled on the application server. This can be checked in transaction SMICM–>Services.
The required service for the scenario must be activated. If it is a web service, it must be activated (with X.509 Client Certificate authentication) in the simplified service configuration of SOAMANAGER.
The following parameters must be set in the backend profile (transaction RZ10):
- icm/accept_forwarded_cert_via_http: This parameter specifies whether the ICM accepts a client certificate forwarded via HTTP
- icm/trusted_reverse_proxy_<x>: SUBJECT=”*”, ISSUER=”*”. This parameter is available in newer systems and replaces the two previous parameters (icm/HTTPS/trust_client_with_issuer and icm/HTTPS/trust_client_with_subject); the old and the new parameters cannot be used together. It allows you to define several rules to trust different certificates.
- icm/HTTPS/verify_client: 1. This parameter instructs the application server to request a certificate from clients trying to access any of its resources
- login/certificate_mapping_rulebased: 1. This parameter allows the application server to map the identity contained in a certificate received during authentication to an internal user, based on rules defined in transaction CERTRULE.
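Put together, the relevant profile entries could look like the fragment below. The wildcard SUBJECT/ISSUER values mirror the example above; in a productive setup you would restrict them to the Cloud Connector’s system certificate.

```
icm/trusted_reverse_proxy_0 = SUBJECT="*", ISSUER="*"
icm/HTTPS/verify_client = 1
login/certificate_mapping_rulebased = 1
```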
Trust the issuer of the Cloud Connector system certificate in transaction STRUST and restart the ICM (transaction SMICM).
For test purposes you can assign the sample certificate generated in the Cloud Connector (step 5) to the needed users in view VUSREXTID, without activating the rule-based certificate mapping.
Otherwise, if you have activated the rule-based mapping, you must use transaction CERTRULE to define the rules or explicit mappings from certificates to users.
In this blog post you have seen the steps needed to call a backend service from an external client via Cloud Integration without needing to enter the credentials in each involved system and without using technical users – or, put another way, using Principal Propagation.
See also the SAP HANA Academy YouTube channel: