Setting up SAP Cloud Platform Transport Management for SAP Cloud Platform Integration
In this blog I will describe how to set up SAP Cloud Platform Transport Management (TMS) for SAP Cloud Platform Integration (CPI) running in the Neo environment. I will concentrate on the points which seem to be a little more challenging and refer to the documentation for the rest.
- (At least) two SAP Cloud Platform Integration tenants (development/source and production/target) running in two SAP Cloud Platform Neo subaccounts.
- The service ‘Solutions Lifecycle Management’ is enabled in all Neo subaccounts being part of the transport landscape.
- Set up SAP Cloud Platform Transport Management as described in the SAP documentation. This includes:
– Buy TMS
– Entitle a Cloud Foundry Subaccount for TMS
– Subscribe to TMS
– Create role collections and assign them to users
– Create a TMS service instance and a service key
There are three different types of destinations needed to set up the CPI / TMS scenario. Please see the picture below:
- Destination pointing from the Solutions Lifecycle Management Service in the Neo subaccount hosting the CPI source tenant to the service instance of TMS. The name of the source node is provided as a parameter in this destination.
Please follow the instructions in SAP help to configure this destination.
Take note that the name of this destination has to be ‘TransportManagementService‘ (case-sensitive!)
- Destinations pointing to the target Neo subaccounts hosting the CPI tenants: these are configured in the destination service of the Cloud Foundry subaccount hosting TMS as described in the SAP documentation.
The names of these destinations can be freely chosen.
You have to create one destination for every Neo subaccount (that is, every CPI tenant) you would like to deploy to.
These destinations are then used to configure the transport nodes in TMS.
- Destination pointing from the Solutions Lifecycle Management Service to the CPI tenant running in the same Neo subaccount.
The configuration of this destination is described here in the SAP documentation.
It has to have the fixed name ‘CloudIntegration‘ (case-sensitive).
A more generic format of the destination’s URL is
‘https://<CPI tenant name>-tmn.hci.<data center>.hana.ondemand.com/itspaces/’
The destination shown in the documentation would refer to an SAP internal account in Canada (‘int/cn1’).
This destination has to be created in all subaccounts which are part of the TMS landscape (source and targets). The easiest way to retrieve the URL containing the CPI tenant name is to go to the SAP Cloud Platform cockpit of the Neo subaccount hosting the CPI tenant. Open the ‘Applications’ tab and the ‘Subscriptions’ subtab. In the list of ‘Subscribed Java Applications’ click on the one whose name contains ‘tmn’. This opens a list of ‘Application URLs’ from which you can copy the one ending with ‘itspaces’.
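If you want to script parts of the destination setup, the URL pattern above can be assembled with a small helper. This is a minimal sketch in Python; the tenant name ‘p1234’ and data center ‘eu1’ in the example are made-up placeholders for your own values:

```python
def cloud_integration_url(tenant_name: str, data_center: str) -> str:
    """Build the URL for the 'CloudIntegration' destination.

    Follows the pattern described above:
    https://<CPI tenant name>-tmn.hci.<data center>.hana.ondemand.com/itspaces/
    """
    return f"https://{tenant_name}-tmn.hci.{data_center}.hana.ondemand.com/itspaces/"


# Placeholder values -- replace with your own tenant name and data center:
print(cloud_integration_url("p1234", "eu1"))
# https://p1234-tmn.hci.eu1.hana.ondemand.com/itspaces/
```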
Users, Roles and Identity Providers
Destination to Neo subaccount
The technical user(s) used in the destinations pointing to the target Neo subaccounts (type 2 above) has to be a member of the subaccount and needs to have the role Developer (or Administrator) in the corresponding subaccount. This role can be assigned when adding the user as a member to the subaccount. Alternatively you can assign a custom platform role with the scopes Manage Multi-Target Applications and Read Multi-Target Applications.
If you are using SAP Cloud Identity Authentication Service (IAS) as your platform identity provider the user should be a local user in the IAS tenant and cannot be a user integrated from another Identity Provider. The reason for this is the lack of industry standards for propagating basic authentication requests.
Destination to SAP Cloud Platform Integration tenant
The technical user for the destination pointing to the CPI tenants (type 3 above) needs to have the roles AuthGroup.IntegrationDeveloper and IntegrationContent.Transport. How to assign these roles is described here in the SAP documentation.
If you are using SAP Cloud Identity Authentication Service (IAS) as your application identity provider, please note that ‘Basic Authentication’ for this destination by default points to the SAP Identity Service and does not use SAP IAS. Switching to IAS is technically possible, however, and can be requested via a ticket to the security operations team (BC-NEO-SEC-IAM). After the configuration is done, basic authentication will also be performed against the SAP IAS tenant used as application identity provider.
As above the user should be a local user in the IAS tenant and cannot be a user integrated from another Identity Provider.
Administrator role for enabling TMS as transport tool in CPI
The administrators who should be able to change the transport mode used for CPI additionally need the role AuthGroup.Administrator.
Enabling TMS transports for CPI
Once you have configured all the destinations above and the corresponding transport landscape in TMS, you have to switch the transport mode of CPI to actually use TMS. That switch is somewhat hidden…
In the CPI web client you have to select the ‘Settings’ tab, then the ‘Transport’ tab and then press the ‘Edit’ button in the lower right corner (which can be far away on a large screen).
This enables the drop down list where you can select ‘Transport Management Service’. Don’t forget to save…
Using TMS from within CPI
Once you have configured and activated TMS for CPI as described above you can use it to create transports from within the CPI development environment.
For that, switch to the ‘Design’ tab in the CPI development tenant and select the CPI package you would like to transport:
You can now perform your changes to the package and save them. You initiate the TMS transport by pressing the ‘Transport’ button:
Provide some information of the transport and press the ‘Transport’ button:
Now a new transport request is created in TMS, the CPI package is put into a Multitarget Application (MTA) archive file and attached to the transport request. The transport request is then released and put into the queue of the transport node which follows the development node (in this example the test node). The confirmation message tells you into the queue of which node the transport has been placed.
In the Transport Management service UI, you will find the new transport in the queue of the node specified in the success message above. From this queue you can trigger the import into the target CPI tenant.
This concludes this blog about the configuration and usage of SAP Cloud Platform Transport Management for SAP Cloud Platform Integration. Have fun using this scenario!
Another great blog for SCP TMS.
We have SCP TMS working in a technical capacity to transport CPI content between Neo subtenants.
We are hitting an issue in "Solutions Lifecycle Management", which reports an error "Integration Content Missing" after package deployment to CPI. However, in SCP TMS and the CPI system no error is reported, and the CPI package is working as expected (though it did need to be deployed manually).
Is there any advice on how this can be troubleshot / where logs may exist, please?
Hi Harald Stevens,
We have recently subscribed to the SAP CPIS (Cloud Foundry) integration service and need to set up TMS on top of it to transport content between various CPIS tenants (on Cloud Foundry).
Similar to the above blog, can you guide us on the below?
Hi Shobhit Taggar,
the setup of the TMS - CPI integration on CF is somewhat different from Neo. The role of the Solutions Lifecycle Management service (collecting content and packaging it into a multi-target application archive) is taken over by the Content Agent service on CF.
Please see this blog post by my colleague Abhishek Nath who is the Product Manager for Content Agent service: https://blogs.sap.com/2020/08/30/introducing-sap-cloud-platform-content-agent-enhanced-transport-capabilities-for-sap-cloud-platform-integration-suite-content/
Thanks for sharing the blog. I have one more question on the Subaccount for TMS Service.
Scenario: We have 2 CF subaccounts (QA and PRD) in the Singapore Azure region, and as per SAP, the TMS service is available in this region.
Q1: Can we use the existing CF subaccounts to activate the TMS service?
Q2: If the above is yes, then which subaccount should be used to activate this service (QA or PRD)?
Q3: Will the service be activated in one tenant or in all tenants in the landscape?
Q4: If we wish to set up a new subaccount for this service, should it be in Singapore or in US (East) / Frankfurt? I found somewhere in the SAP documentation that TMS subaccounts need to be created only in the US (East) and Frankfurt regions. Kindly advise on this.
let me try to answer your questions:
Q1: as TMS is available in the Azure data center in Singapore, you can use the existing CF subaccounts.
Q2: in the subaccount you use for TMS, you have to configure the role collections for TMS and assign them to the users who will operate TMS. You also configure the destinations to all target subaccounts in this subaccount. Besides that, for full TMS functionality you will need a space inside the subaccount in which you create one service instance of TMS (you can reuse an existing space for that if it has free service instance quota).
Based on this information it might be easier for you to decide which subaccount will be better suited. Technically for TMS there is no difference.
My gut feeling would prefer the QA subaccount, to clearly separate the productive applications from operative tasks like TMS.
Q3: You need to activate (subscribe to) TMS in only one subaccount.
Q4: If I understand your question correctly you are thinking about creating a new subaccount for TMS. You can use any data center (region) in which TMS is available. That would include Singapore. If (most of) your target subaccounts would be in Singapore as well it could create some (slight) performance benefits to have TMS in the same data center due to reduced network latency.
Could you please point me to the place in the documentation which calls for US or Frankfurt only, so that we can correct this outdated information.
Thanks a lot.
Thanks for your response.
Aside, I raised the same questions to the SAP support group BC-CP-LCM-TMS and just got a response from the team. All replies are the same as what you said above, except for Q3.
As per your response above, you mentioned we need to activate (subscribe to) TMS in only one subaccount, whereas SAP support in ticket no. 759320/2020 responded that it should be activated in all subaccounts that are part of the TMS landscape.
Now kindly guide further on this.
thanks a lot for pointing me to the blog post. I have corrected it accordingly.
As for the video in the SAP Help Portal, we are planning to rerecord them because some of the UIs have changed. However, this will take a few more weeks.
Concerning the question of how many subaccounts need a subscription to TMS my information is correct. The complete transport landscape is managed from one subaccount which is the only one with the TMS subscription.
However, there might be a confusion with the Solutions Lifecycle Management service in the Neo environment. This has to be enabled in all subaccounts involved in TMS transports. It is available by default and free of (extra) cost but has to be enabled once.
With reference to Shobhit's query (above), I have one more question on the subaccount for the TMS service.
As above, I have created a new subaccount for TMS. Now I am trying to create destinations using the client service key from the TMS account to QA.
1) Am I taking the right approach? How should I create destinations from (the new TMS subaccount or CPI DEV) to CPI QA?
2) I tried with the “CPI QA service instance and service key” and with “basic authentication, for which I have admin access in QA”. Both authentications are failing (connection is established, but with a 401 authorization error). Please help.
the above blog post describes the setup for the CPI on SCP Neo use case. Are you running CPI on Neo or on Cloud Foundry? In case it is CF, the following blog post by my colleague Abhishek Nath is more relevant: https://blogs.sap.com/2020/08/30/introducing-sap-cloud-platform-content-agent-enhanced-transport-capabilities-for-sap-cloud-platform-integration-suite-content/
If we are talking about Neo, please let me know.
Hi Harald Stevens,
While I am trying to create transport nodes, the "+" button is inactive. It seems an authorization for nodes is missing. May I know which role is required?
We deliver several predefined roles with Transport Management; see their documentation here:
The roles that are allowed to change the transport landscape, and therefore can create transport nodes, are LandscapeOperator and Administrator.
Thanks, Harald Stevens.
Now I am able to create nodes. Quick question: initially I had set up a transport path from DEV to QA. All was working well.
But now I am trying to add a PROD node. Please advise what other configuration has to be done from QA to PROD.
great to hear that the authorizations issue has been solved.
The complete configuration for TMS happens in TMS itself or in the subaccount from which you have subscribed to TMS. For adding a PROD node you would have to:
Please also have a look at the chapter on landscape setup here:
Thank you for your quick reply.
I have done the following steps (as shown above).
Quick help needed,
let's start with one basic question: are you running CPI on Neo or on Cloud Foundry?
The above blog describes the Neo use case.
For CF I would like to point you to https://blogs.sap.com/2020/08/30/introducing-sap-cloud-platform-content-agent-enhanced-transport-capabilities-for-sap-cloud-platform-integration-suite-content/
and my colleague Abhishek Nath, because here the configuration works somewhat differently.
In case of Neo we can follow up here...
I am asking for Cloud Foundry. Thanks for your reply.
Thank you for the introduction! We have everything in place and working fine. I was wondering if there is a setting so that transported artifacts get deployed automatically? It is a little annoying to deploy each iFlow individually after transporting it.
I am glad to hear that you liked the blog post and could use it successfully.
Currently there is no configuration setting to directly start an import/deployment after a new transport request has been added to the queue of a node.
One option would be to use the scheduling function of TMS (upper right corner of the screen when you are looking at the queue). This would automatically trigger an import of the complete queue. The minimum interval that can currently be configured is one hour, so in the worst case you would have to wait one hour until the CPI package is imported. This could be a valid option for your test environment, but I would rather not recommend it for a productive tenant.
Another option would be the usage of our APIs for TMS (see https://api.sap.com/api/TMS_v2/resource). One of them can trigger the import of a single transport, and another the import of a complete queue. Some customers use this approach in their development scenarios in conjunction with a CI/CD pipeline to trigger the import after the transport request has been created by the CI server. However, this approach does not fit the CPI use case so well, because triggering the import would still be a separate (manual) step.
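To illustrate, here is a rough Python sketch of what triggering such an import could look like. The base URL, node ID, and transport request ID are placeholder values, and the request path should be double-checked against the TMS v2 API reference linked above before use:

```python
import json


def build_import_call(base_url: str, node_id: int, transport_request_ids):
    """Assemble URL and JSON body for importing specific transport
    requests into the queue of a transport node.

    The path follows the TMS v2 API (https://api.sap.com/api/TMS_v2);
    verify it against the current API reference.
    """
    url = f"{base_url}/v2/nodes/{node_id}/transportRequests/import"
    body = json.dumps({"transportRequests": transport_request_ids})
    return url, body


# Placeholder values -- taken from your TMS service key and landscape:
url, body = build_import_call(
    "https://transport-service-app-backend.ts.cfapps.eu10.hana.ondemand.com",
    42,      # hypothetical node ID of the target queue
    [5309],  # hypothetical transport request ID(s)
)
# The actual call additionally needs an OAuth bearer token obtained from
# the token endpoint in the TMS service key, e.g.:
#   POST <url> with header 'Authorization: Bearer <token>' and <body>
print(url)
```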
we have the same setup as you described (CPI tenants in the Neo environment + TMS tenant in the CF environment). I am really struggling to understand how we should achieve correct authentication in the destinations to CPI (type 3 or type 2). We don´t have any IAS; we are using the default SAP Identity service. This means that we should use S-users or P-users as technical users in the destinations. P-users are public users, and SAP strongly recommends not to use them in productive scenarios. This leaves us with S-users, which were recently tied to Universal ID and are meant to represent a human person, not to act as technical users (duplicate emails to company departments or distribution lists are also not possible anymore). So what kind of users should we use when only basic authentication is possible? I would gladly use OAuth or maybe even client certificates, but this authentication method is not supported for TMS (I tried to configure OAuth, but the connection check in TMS gave me an error that OAuth is invalid and basic auth needs to be used instead).
What options do we have? We really want to avoid an additional purchase of IAS, especially when we cannot use it with our corporate IdP in the TMS use case.
Thank you for your answer.
unfortunately there is no ‘good’ solution for this requirement. The technical background is that for deploying content within BTP it is not enough to have a clientID / clientSecret of a service instance of the deploy service (on CF) or the Solutions Lifecycle Management service, because for some content types (for example Java apps) further authentication towards other runtimes is necessary. This is why we need a platform user (with the necessary authorizations) for the destination.
Depending on the content it is possible to use OAuth based on a service instance (I have seen it working for Cloud Integration on Neo), but as said, it does not work for all content types. If this is interesting for you, please let me know and I will provide some details.
So currently the only option (without IAS) that I see is the usage of S-Users. Maybe you could open a ticket with SAP describing your need for an S-User NOT pointing to a real person.
Sorry that I don't have better news.
thank you for confirming my thoughts. It is a real pity that the authentication options currently supported for TMS and CPI are so limited. I can understand the limitation with OAuth, but then why can we not use client certificates?
Anyway, since our scope is to transport SAP Cloud Integration content only, I would be really interested in setting up OAuth as we are not planning to transport Java applications at the moment. If you could share the details with me, I would be grateful.
P.S. I tried to explain that to SAP via ticket some time ago, but I was not successful. Back then, it was ok for me, as using S-users as technical users was possible, but now with all the limitations (Universal ID, duplicate emails, correct last name... etc.) it is really too complicated and cumbersome to use them so I opened a new ticket to SAP. Let´s see what happens.
Thanks a lot.
as said, I only know how it would work on Neo:
- In the ‘Security’ tab of the cockpit, click on ‘OAuth’.
- Select the tab ‘Platform API’ and click on ‘Create API Client’.
- Select ‘Solutions Lifecycle Management’ and ‘Lifecycle Management’ (the latter might not be necessary, but I have seen both used).
- Store the Client ID and Client Secret, because the Client Secret cannot be retrieved later.
- In the destination, the URL contains ‘oauth’ instead of ‘basic’.
- The authentication type is ‘OAuth2ClientCredentials’.
- Take the Client ID and Client Secret from the API client above.
- Configure the Token Service URL accordingly.
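Putting these steps together, the destination configuration could look like the following sketch. All host names, paths, and the subaccount value are placeholders based on my assumptions, so please verify them against the current documentation for your region; only the authentication type and the ‘oauth’ path segment are taken from the steps above:

```
Name:               <destination name used in TMS>
URL:                https://<SLM service host>.<data center>.hana.ondemand.com/<service path>/oauth/<subaccount>/slp
Authentication:     OAuth2ClientCredentials
Client ID:          <Client ID of the Platform API client>
Client Secret:      <Client Secret of the Platform API client>
Token Service URL:  https://api.<data center>.hana.ondemand.com/oauth2/apitoken/v1?grant_type=client_credentials
```

The token service URL shown is the usual pattern of the Neo Platform API token endpoint; if your landscape uses a different host, adapt it accordingly.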
I tried to find a similar option for CF but was not successful...
I am running a PoC to set up TMS with the below nodes.
Source node: DEV, this is in CF.
Target node: QA, this is in Neo.
I tried to import the transport order (it contains a package and an iFlow) into the target node, but it shows the below error in the log. Any idea what is missing?
thanks a lot.
Unfortunately it is not possible to do a cross-platform transport between Neo and CF (and vice versa). The technical reason is a slightly different MTA format on Neo and CF, so you need a uniform landscape (all Neo or all CF).
Therefore it is also not possible to use cTMS to migrate Cloud Integration content between Neo and CF.
Sorry and kind regards
Ok, I got it. Thanks Harald!
I have done the setup and it is working fine. 🙂
Is there a feature to compare transport orders in cTMS? Is there a feature to activate a previous transport order, so that we can do a kind of rollback of iFlow changes?
great to hear that you successfully implemented cTMS.
Unfortunately there is no comparison feature available in cTMS. You could look into the content of the transport request (behind the paper clip icon in the import queue). At least for single artifact transports for Cloud Integration it shows the version of the artifact:
Unfortunately this information is not available on package level.
As for the rollback question: after a transport request has been successfully imported, its status can be reset to be imported again. For that mark the transport in question and click on the 'Reset' button:
After that the status of the request changes to 'Repeatable' and you can import it again. If it contains an older version of the content it will overwrite the newer content and thereby perform a 'rollback'.
However, there is one pitfall: to save space in our persistence layer, we delete the content of transport requests (not the logs) 30 days after it is no longer in an importable status in any import queue (i.e. after it has been imported in all systems of the corresponding landscape). After this has happened, the transport request cannot be reset anymore.
Thank you Harald!
I am adding the integration content (package + iFlow) using the API (/v2/files/upload and /v2/nodes/upload).
The content was added successfully, but when I try to import via the UI option "Import Selected", I get the below error.
Transport request 'Uploading a change from CI Pipeline' (id: 5309): Exception during start of deployment for deploy type 'SLP_CTS': Error during deployment initialization: Unauthorized
Any idea what authorization is missing?
Just to make sure: do you by chance use a user whose password you recently had to change (due to 'normal' password rotation)? We had cases where this change had been made in the target tenant destination, so that destination was working fine - but where the password had not been updated in another destination (or an ABAP-based RFC destination). This then led to failed login attempts by that other destination, which in turn locked the user, creating a similar error message.
Background: the destination service can't use the SSO certificate during the logon, so the calls fail during the locking period. If this might apply here as well, the solution would be to check all usages of this user in destinations and to update the password everywhere accordingly.
Thank you for your answer.
Yes, I changed to a new S-user in the destination. The previous one was expired.
Thanks a lot for the clarification - then, please make sure that you have updated the password in all destinations (on SAP BTP + ABAP-based) where the user has been used.
I hope this helps!