SAP Datasphere – Life cycle management and deployment options
Transporting content, also known as lifecycle management, works differently in SAP Datasphere than in traditional on-premises systems because SAP is responsible for software, hardware, and infrastructure updates.
A landscape with a single system is appropriate for very small deployments or for the initial phase of your SAP Datasphere implementation, but it does not support reliable lifecycle management. You can, however, separate test and production content into different spaces.
From a performance point of view, memory consumption and DP Agent configuration are the aspects to monitor closely.
Add at least two SAP Datasphere tenants to your landscape: use one to develop and test content, and the other as the production environment and source of truth.
Here are some of the advantages of such a scenario:
The test content is fully separate from the production data and content.
Improved performance, since there is no need to use production data sources for testing and development tasks. You may want to add more non-production systems so that the SAP Datasphere landscape matches the structure of the source systems.
Lifecycle management is easier: you can use the Export and Import features to promote content, and the production system contains only fully tested content.
Use Case scenarios for content transports
1. Transport within Single Tenant
a. Same Space but with a different connection
Export the content to a CSN file; the exported file references the development connection (for example, ECD_ABAP).
After exporting, edit the JSON file and change the connectionID to ECC_ABAP (the connection pointing to quality).
Import the edited JSON file; the connection name in the data flow is updated accordingly.
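The connectionID edit in the steps above can be scripted instead of done by hand. The sketch below is a minimal, hypothetical helper: since the exact key path of the connection reference can vary between exports, it simply walks the whole JSON tree and replaces every string value that exactly matches the old connection name (ECD_ABAP) with the new one (ECC_ABAP). Verify the result before importing.

```python
import json

def replace_connection(node, old_id, new_id):
    """Recursively replace exact matches of the old connection ID anywhere in the CSN tree."""
    if isinstance(node, dict):
        return {k: replace_connection(v, old_id, new_id) for k, v in node.items()}
    if isinstance(node, list):
        return [replace_connection(v, old_id, new_id) for v in node]
    if node == old_id:  # only exact string matches are replaced
        return new_id
    return node

def retarget_csn(path_in, path_out, old_id, new_id):
    """Load an exported CSN/JSON file, swap the connection ID, and write a new file."""
    with open(path_in, encoding="utf-8") as f:
        csn = json.load(f)
    with open(path_out, "w", encoding="utf-8") as f:
        json.dump(replace_connection(csn, old_id, new_id), f, indent=2)
```

For example, `retarget_csn("df_sales.csn", "df_sales_qa.csn", "ECD_ABAP", "ECC_ABAP")` would produce a copy of the export ready to import against the quality connection.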
b. Different Space with different connection
Follow the steps of scenario (a) and import the CSN file to a different space of the same tenant.
c. Different Space with same connection
It is recommended practice to maintain the same connection name across spaces and tenants, so that content transitions smoothly to downstream landscapes.
Step 1: In the source space, open the Data Builder and select the object to be exported (for example, a data flow), then choose the export option.
Provided there are no issues with the object, the file is exported in CSN/JSON format to the downloads folder of the local machine.
Step 2: Import the file into the target space:
Select the target space and open the Data Builder section where the object needs to be imported.
Choose the option to import a CSN/JSON file and select it from the downloads folder.
Once the file is imported, open the object and deploy it to complete the transport.
Note: Always check that the dependencies of the object are active or deployed in the target space before deploying.
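As a quick pre-import sanity check for the dependency note above, you can list which objects an export actually contains. The sketch below assumes the standard CSN layout with a top-level `definitions` map (the key path may differ in some exports), so you can compare the list against what is already deployed in the target space.

```python
import json

def list_definitions(csn):
    """Return the sorted names of all objects defined in a CSN export.

    Assumes the conventional CSN layout with a top-level 'definitions' map;
    adjust the key path if your Datasphere export is structured differently.
    """
    return sorted(csn.get("definitions", {}).keys())

def load_and_list(path):
    """Convenience wrapper: load an exported CSN/JSON file and list its objects."""
    with open(path, encoding="utf-8") as f:
        return list_definitions(json.load(f))
```

Any object that appears in this list but depends on something not in the list must already exist (active/deployed) in the target space, or the deploy step will fail.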
d. Share the Content between the Spaces/Cross Space Sharing
This allows you to use data from another space (a different functional area or team) without physically replicating the data in both spaces. To demonstrate this functionality, we will use sales data as an example.
Step 1: Select and open the object in the Data Builder or Business Builder and click the share icon.
Step 2: Choose the target space name.
Step 3: Go to the target space -> Data Builder and look for the Shared Objects section.
The shared object can now be consumed in any view or dataset for further modeling.
2. Transport across the Tenants (Multiple)
a. Content with the same connection
When moving content across tenants, use the SAP Datasphere Export and Import options to manage content across the different landscapes.
Create a package and choose the objects that need to be exported to the target tenant. After the objects are selected, a dialog window appears to confirm the export.
Go to the target tenant and import the package.
b. Content with Different Connection
It is recommended to always use the same connection name in the source and the target tenant, to avoid disruptions to the content's functionality and potential corruption.
In the rare cases where this is not possible, editing the connection references in the exported CSN/JSON file can serve as a workaround, but this should be tried and tested further for your specific use case.
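Before attempting that workaround, it helps to know which connection names an export actually references, so you can check they exist in the target tenant. The sketch below is a heuristic, not an official API: it assumes connection references sit under keys whose names contain "connection" (for example connectionID), which may not hold for every export format.

```python
def find_connection_refs(node, refs=None):
    """Collect string values stored under any key whose name contains 'connection'.

    Heuristic scan of a parsed CSN/JSON export; the key-name convention is an
    assumption and should be verified against your actual export files.
    """
    if refs is None:
        refs = set()
    if isinstance(node, dict):
        for key, value in node.items():
            if "connection" in key.lower() and isinstance(value, str):
                refs.add(value)
            find_connection_refs(value, refs)
    elif isinstance(node, list):
        for item in node:
            find_connection_refs(item, refs)
    return refs
```

Running this over the export lets you confirm every referenced connection name has a counterpart in the target tenant before you import.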
** Content based on real-world project experience.
There are multiple ways to transport SAP Datasphere objects and connections; choose the option that best fits your project requirements.
Please share your feedback or thoughts in the comments section and follow my profile for similar content.