SAP private linky swear with Azure – business as usual for iFlows with SAP Private Link Service
This post is part 2 of a series sharing service implementation experience and possible applications of SAP Private Link Service on Azure.
Find the table of contents and my curated news regarding series updates here.
Looking for part 3?
Find the associated GitHub repos here.
📢Update 22.08.22: SAP published an official guide to connect SAP BTP PaaS/SaaS apps via SAP Private Link. Check out Harut's blog and the SAP samples repos.
Update: SAP enabled the CloudFoundry approuter to act as the waypoint into the SAP PLS. As of release 11.2.1 you can replace my provided proxy app with an officially SAP-supported option. I have already replaced the Java proxy app in fig.2 to reflect that.
You seem to be one of those brave folks, because you made it to part two of this Private Link Service Beta series. Last time I introduced the new service and shared my views on the implementation. Today we will build on top of that and hook it up with SAP Integration Suite.
Quick intro: Integration Suite gets you everything you need for standard integration with SAP's SaaS portfolio: connectors, API Management, etc. Formerly, the SAP Cloud Connector played the role of making private SAP backends available to SAP Business Technology Platform (BTP). Let's bridge that with the new private link service too.
|🛈Note: For HTTP-based integration I would advise reading SAP's official guidance that we co-developed. For non-HTTP protocols like JMS, SFTP, etc. you might still consider my example Java app as described here and in this post.|
“Rock on” as they say.
Fig.1 pinkies “swearing” some more
I extended the architecture overview from the first post with the Integration Suite service and potential "callers" of the deployed iFlows, C/4HANA for instance. Basically, every app that can reach your Integration Suite instance – or, going forward, API Management – will be able to communicate with your SAP backend on Azure through a private connection.
Speaking of API Management: have a look at the post from Niklas Miroll on how to set up Azure AD with S/4HANA, including Principal Propagation with BTP API Management, for further details on that aspect. See here for the setup from the Azure API Management perspective.
Fig.2 architecture overview
Standard integration pattern stays the same
Developers can continue with the practices and technologies they are used to. As with the Cloud Connector setup, you would add the URL of your BTP backend app to your iFlow.
In my first post I showcased two implementation variations: one in Java with the SAP Cloud SDK and one using CAP. In this example I used the CAP implementation as the target for the OData connector.
Fig.3 iFlow OData connection with SAP Private Link for Azure
The path was derived from the routing config in the CAP application. It masks "sap/opu/odata/sap/epm_ref_apps_prod_man_srv" behind "product". Depending on your implementation this might look different. If you did everything correctly, the OData EntitySets will show up in the wizard:
Fig.4 OData Entity discovery via private link
Once finished, the wizard creates the EDMX file in your iFlow. Next, verify on the connection tab of the connector that the URL was taken over properly from the wizard (see fig.3). Find my iFlow package here.
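For reference, such a path mask can be declared in a CAP service definition via the `@path` annotation. The service and entity names below are hypothetical placeholders; check the actual routing config in my linked repo:

```cds
// Hypothetical sketch: expose the backend OData service under the
// short route "product" instead of its full technical path.
@path: 'product'
service ProductService {
  // Projection on an entity imported from the external S/4 service
  // sap/opu/odata/sap/epm_ref_apps_prod_man_srv (names assumed).
  entity Products as projection on external.Products;
}
```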
Great, let’s deploy and test this.
Fig.5 Postman request to iFlow for products on S4 on Azure via private link
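If you prefer scripting over Postman, the same call can be sketched with plain Python. The tenant URL, route, and credentials below are placeholders; substitute the endpoint of your deployed iFlow and the client credentials from your Integration Suite service key:

```python
import base64
import urllib.request

# Placeholder values -- replace with your own iFlow endpoint and the
# credentials from your Integration Suite service key.
IFLOW_URL = "https://my-tenant.it-cpi.cfapps.eu20.hana.ondemand.com/http/products"
USER = "sb-client-id"
PASSWORD = "client-secret"

# Build the Basic-auth header, just like Postman does in fig.5.
token = base64.b64encode(f"{USER}:{PASSWORD}".encode()).decode()
request = urllib.request.Request(
    IFLOW_URL,
    headers={"Authorization": f"Basic {token}", "Accept": "application/json"},
)

# The call travels iFlow -> proxy/approuter -> private link -> S/4 backend.
# Uncomment to actually send it:
# with urllib.request.urlopen(request) as response:
#     print(response.read().decode())
print(request.full_url)
```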
Cool, that concludes the initial groundwork to get started with SAP Private Link for Azure and CPI 😊
For the time being we need to integrate with the private link service from iFlows via the “relay” app or the SAP-supported CloudFoundry approuter. Any upvotes to get it done through the connectivity service directly?
Linky swears are not to be taken lightly. I believe SAP is making good use of the Azure portfolio creating another integration scenario that will become popular going forward. Today we saw the setup process for SAP Integration Suite with this new private link service and verified the usual integration development approach can be applied.
In part three of this series, we will look at the deployment styles in more detail. Any other topics you would like to see discussed in that regard? Just reach out via GitHub or in the comments section below.
Find the mentioned Java, CAP projects etc. on my GitHub repos here. Find your way back to the table of contents of the series here.
As always feel free to ask lots of follow-up questions.
Thank you for a great blog.
Does a private link allow communication from the customer VNet to BTP (for example, S/4HANA -> Integration Suite)?
Can we use the private link with a hub-and-spoke pattern, with the hub VNet hosting the private link and the S/4 systems sitting in the spokes?
The current scope of SAP's beta implementation covers only communication initiated from BTP to a customer VNet. The underlying Azure service in general allows your scenario too.
You can have the private link in your hub but would need a firewall as the target to eventually resolve to your VM targets in the spokes.
Hey Gov TOTAWAR,
I added another post in the series, that talks about deployment styles. Hub-Spoke is discussed as one of them: https://blogs.sap.com/2021/07/27/btp-private-linky-swear-with-azure-how-many-pinkies-do-i-need/
Does that mean, we can avoid cloud connector if we have all our system in Azure cloud?
If you don't require layer-7 functionality like audit logging, yes.
What use cases are you looking for? REST interfaces, SFTP, SSH etc?
Our initial requirement is to connect Kafka and SAP backend systems. Later we will proceed with SFTP, JDBC, etc.
Most of the REST and SOAP services are available on the internet, as we are using Azure APIM to expose the services.
OK, is there any dependency on BTP? Where do you run Kafka? Outside of Azure? Are you fronting Azure APIM with App GW using the application firewall?
I am considering to write a next blog on BTP Private Link Service connectivity with Azure APIM. Would that be of interest?
Kafka runs in an Azure private cloud. Our current requirement is to integrate SAP CPI and Kafka, so that we can use a Kafka-native connection via the CPI Kafka adapter.
But in order to use the CPI Kafka adapter, the Kafka host has to be reachable from the internet. When we asked the Kafka team whether they could expose their server to the internet, they raised the question whether SAP CPI is within PCF (Public Cloud Framework). I have no clue what this means, and when I checked with the Azure team, they directed me to SAP Private Link Service.
Unfortunately, the Kafka adapter is not available with the Cloud Connector option. So we are checking how to expose the Kafka system from the Azure private cloud to SAP CPI.
OK, so we are talking about messaging protocols for Kafka. I haven't tested that scenario yet, but the challenge would be the capability of the proxy app I talked about in this blog post: it anticipates HTTP calls, while the Kafka adapter operates with different protocols.
The SAP Private Link Service (PLS) exposes Azure Private Link. Since it operates at layer 4 of the stack, it would support your scenario. At this point, though, CPI cannot directly reach through the PLS due to the required CF service binding. The Kafka REST API would work with the proxy. For the streaming protocols I need to investigate further whether we can somehow handle this with a proxy app.
Can you move ahead with the Kafka REST API and the proxy behind CPI?
Judging from this post, there might be an opportunity to call via a specialized proxy. You could apply the concept to the Cloud Connector or the PLS in the same way. But I believe you cannot use the Kafka adapter in CPI; the adapter logic moves into the proxy app, so to speak. You then call it via REST from CPI.
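To illustrate the REST route, here is a minimal sketch of what such a produce call against the Confluent Kafka REST Proxy could look like. The host, topic, and message payload are assumptions for illustration; only the `/topics/{topic}` endpoint and the v2 JSON media type come from the Confluent REST Proxy API:

```python
import json
import urllib.request

# Assumed values for illustration only -- not from the blog post.
REST_PROXY_URL = "https://kafka-rest.internal.example.com"
TOPIC = "sap-events"

# Confluent REST Proxy v2 wraps produced messages in a "records" array.
payload = {"records": [{"value": {"orderId": 4711, "status": "created"}}]}

# POST {REST_PROXY_URL}/topics/{TOPIC} with the v2 JSON media type.
request = urllib.request.Request(
    f"{REST_PROXY_URL}/topics/{TOPIC}",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/vnd.kafka.json.v2+json"},
    method="POST",
)
print(request.get_method(), request.full_url)
```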
Let me know what you find.
Thanks, Martin, for your response. That clears up my doubt.
Yes. The Kafka proxy REST API is already exposed via Azure APIM, and we are already consuming these APIs using SAP PO.
We will continue to use SAP PO to connect to Kafka, but once CPI and Kafka are connected via a native connection, we will switch from SAP PO to CPI.
We see value in connecting Kafka via CPI using a native connection, as consuming the REST API complicates the development somewhat and is not cost-efficient.
At the moment we have two options, and they are unlikely to happen any time soon.
To be clear, Muni: you can connect to Azure IaaS (virtual machines) and Azure PaaS (through future releases or port-forwarding VMs) using BTP PLS and the CF proxy approach described in this article. Do you run Kafka on a VM on Azure or as PaaS?
I will be describing the port-forwarding mechanism for Azure APIM and BTP PLS soon. Have a look here for the general idea.
My earlier argumentation and the specialized CF app were about streaming-protocol support, not the REST API. For the REST API, the guidance above applies.
Thanks. Sorry, I did not go into detail on how the CF app works.
We are running Kafka on HDInsight.
I will have to check with the Azure team about port-forwarding VMs.
As you mentioned in the blog, we are interested in accessing the host without an app to proxy. We will wait for it.
If SAP implements a direct “line of sight” between Cloud Integration, the Connectivity service, and PLS, we would no longer need an app to proxy.
As promised, the next post on SAP PLS for Azure APIM is out. I am also working on enabling the BTP approuter to act as the proxy for PLS. That way you get an officially SAP-supported proxy approach, for CPI for instance.
Thanks for this great blog and your efforts.
I see that we can use Private Link when calling from a CPI iFlow.
How about if I wish to call the CPI endpoint using a private link? I understood that the private link service is for calling services available in Azure, but is there a possibility for the opposite scenario?
Hi Venkat Shiva Nag Kondaveeti,
We are working with SAP on a variety of private link scenarios, but private access to CPI iFlows is not part of that; they are internet-facing by design. It is always worth officially approaching SAP with your requirement going forward, though. Note: a private scenario would be doable with SAP PI/PO on Azure, however.
Philipp Becker, Gowrisankar M, Harut any additional thoughts on this ask?