Farooq Ahmed

Integration of SAP CI(BTP IS) with Datasphere





SAP Datasphere, the next generation of SAP Data Warehouse Cloud, is a comprehensive data service that enables every data professional to deliver seamless and scalable access to mission-critical business data.

Whenever data must be transferred from various external systems into Datasphere, Datasphere's own connection paradigm can be used. However, if an SAP Cloud Integration tenant is already connected to all the external systems in the enterprise landscape, it is better to connect Cloud Integration with Datasphere rather than creating new Datasphere connections from each of those systems. This avoids duplicate connections and also enhances the overall monitoring capabilities.

What is SAP CI (BTP-IS)?

Cloud Integration (BTP-IS) is a set of services and tools provided by SAP on its cloud-based Business Technology Platform (BTP) to enable integration between different systems, applications, and data sources. The key benefit of CI (BTP IS) is that it enables organizations to quickly and easily integrate their systems, data, and applications without the need for extensive coding or custom development. This helps to streamline business processes, reduce costs, and improve operational efficiency.

How can Datasphere be integrated?

SAP Datasphere can be integrated with a wide range of SAP and non-SAP sources and tools, both on-premise and cloud. The native connection types already available in SAP Datasphere, such as Generic JDBC, Generic SFTP, Generic OData, and various others, can be used to connect with the external world. Some of the common use cases are listed below.

S/4HANA -> Datasphere – Connection with DP Agent (SAP S/4HANA On-Premise)
SAC -> Datasphere – Connection type: Cloud Data Integration
Datasphere -> SAC – Direct Live Data Connection
Datasphere -> IBP – IBP connection

How does CI integrate with Datasphere?

The integration between CI and Datasphere can be done in multiple ways:

  1. CI can load the data directly in Datasphere using JDBC protocol.
  2. CI can consume the exposed OData V4 service from Datasphere.
  3. CI can make use of Open Connectors on BTP IS if the open connector connection is activated in Datasphere.

Note: Among the possible integrations with Datasphere, the OData and Open Connectors methods are already explained elsewhere, but the simplest and most traditional method, JDBC, has not been clearly explained anywhere. That is the focus of this article.

Integration of SAP CI(BTP IS) with Datasphere through JDBC


Prerequisites:

  1. SAP BTP IS tenant access with the required CI roles.
  2. SAP Datasphere access with the required "Data Catalog User, DW Viewer, DW Modeler, Data Catalog Administrator" roles.
  3. SFTP access to read the test file.


Step-1: Create a new space in Datasphere space management.

If you have the required access, you can create the space yourself; otherwise, you can ask the Basis team to create it.


New Space in Space Management


If the Basis team is creating the space, make sure you are also a member of it. Without membership you cannot view the space, obtain the database schema/user/password/hostname/port details, or open the underlying Database Explorer.


Step-2: Create a new Database user in the created space.

After the space is created, the schema and the database user must be created with the required Read, Read (HDI), and Write permissions.

Space Schema: “Space_Schema”

Database Username: “Space_Schema#User_Name”

Open SQL Schema: “Space_Schema#User_Name”

Hostname: “<<Hostname>>”

Password: “Password”

Port: 443


Database User Details


Step-3: Create the table in the SAP HANA Database Explorer. 

Create the table as per your required columns, datatypes and length.


New Table creation in HANA DB
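As a sketch, the DDL for such a table could be assembled as below. The table and column names (EMPLOYEE_DATA, EMP_ID, and so on) are illustrative assumptions, not taken from the article; the generated statement would be run in the SQL console of the Database Explorer against the Open SQL schema from Step-2.

```python
# Sketch: build a CREATE COLUMN TABLE statement for the Open SQL schema.
# Table and column names below are hypothetical examples.

def build_create_table_ddl(schema, table, columns):
    """Build a CREATE COLUMN TABLE DDL string from (name, datatype) pairs."""
    cols = ", ".join(f'"{name}" {dtype}' for name, dtype in columns)
    return f'CREATE COLUMN TABLE "{schema}"."{table}" ({cols})'

ddl = build_create_table_ddl(
    "Space_Schema#User_Name",  # Open SQL schema from Step-2
    "EMPLOYEE_DATA",           # hypothetical table name
    [("EMP_ID", "NVARCHAR(10)"),
     ("EMP_NAME", "NVARCHAR(100)"),
     ("SALARY", "DECIMAL(15,2)")],
)
print(ddl)
```

Quoting the schema name is important here, because Datasphere Open SQL schemas contain the `#` character.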


Step-4: Create the JDBC Data Source in Manage JDBC Material of Integration Suite.

To create the JDBC Data Source, use the parameters from Step-2, such as the user and password.

The Database Type should be selected as "SAP HANA Cloud".

The JDBC URL should follow the standard SAP HANA JDBC format, jdbc:sap://<hostname>:<port>.


Deploy the JDBC Data source successfully.


JDBC Data Source
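As a small sketch, the JDBC URL can be assembled from the Step-2 details following the standard SAP HANA JDBC URL scheme (jdbc:sap://<hostname>:<port>); the encrypt=true property is shown because SAP HANA Cloud only accepts TLS connections on port 443. The hostname placeholder is the one from the database user details.

```python
# Sketch: assemble the JDBC URL for the data source from the Step-2 details.
# The hostname placeholder stands in for the real Datasphere host.

def build_jdbc_url(hostname, port=443, encrypt=True):
    """Build an SAP HANA JDBC URL (jdbc:sap://host:port[?encrypt=true])."""
    url = f"jdbc:sap://{hostname}:{port}"
    if encrypt:
        url += "?encrypt=true"  # SAP HANA Cloud requires TLS
    return url

url = build_jdbc_url("<<Hostname>>")
print(url)  # jdbc:sap://<<Hostname>>:443?encrypt=true
```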


Step-5: Create a new iFlow to send the data from CI to Datasphere.

Here, for demo purposes, the source is an SFTP file, but it can be any system, as per the business requirement.




There can be multiple steps as per the requirement, but the main target structure should use the JDBC XML format, as below:

<root>
    <StatementName>
        <dbTableName action="INSERT">
            <table>TABLE_NAME</table>
            <access>
                <Column1>Value1</Column1>
                <Column2>Value2</Column2>
            </access>
        </dbTableName>
    </StatementName>
</root>
The table name that was created in the HANA Cloud database should be provided in the Message Mapping, as below:


Table Name in Message Mapping
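To illustrate what the mapping produces, the sketch below converts one record from the SFTP file into the JDBC XML insert structure. The table and field names are hypothetical; in the real iFlow this transformation is done by the message mapping step, not by a script.

```python
# Sketch: build the JDBC XML insert payload for one record.
# Table and field names (EMPLOYEE_DATA, EMP_ID, EMP_NAME) are assumptions.
import xml.etree.ElementTree as ET

def record_to_jdbc_xml(table, record):
    """Wrap a dict of column -> value into the JDBC XML INSERT structure."""
    root = ET.Element("root")
    stmt = ET.SubElement(root, "StatementName")
    db_table = ET.SubElement(stmt, "dbTableName", action="INSERT")
    ET.SubElement(db_table, "table").text = table
    access = ET.SubElement(db_table, "access")
    for field, value in record.items():
        ET.SubElement(access, field).text = value
    return ET.tostring(root)

payload = record_to_jdbc_xml("EMPLOYEE_DATA",
                             {"EMP_ID": "1001", "EMP_NAME": "John"})
print(payload.decode())
```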


The JDBC Data Source that was created in Step-4 should be provided in the JDBC receiver channel, as below:


JDBC Channel with Datasource


After configuring the SFTP channel with the required parameters, deploy the iFlow.


Step-6: Validate the Data received from CI in Datasphere.

All the data processed from CI to Datasphere will be available in the HANA DB table, as below:


Data inserted in HANA DB table


To make the data visible in the Datasphere application, create a Graphical View on the table in the Data Builder.


Data Builder Graphical View


Once the local view of the table is deployed, the entire dataset will be available in Datasphere for further reporting/analytics.


Data in Datasphere Graphical View



Once SAP's latest integrator and SAP's latest data warehouse are integrated in the digital landscape of an enterprise, any data present across the systems can be consumed seamlessly.

If data from Datasphere has to be pulled and sent to other systems, similar configurations can be done, treating the Datasphere table as the source for CI through the JDBC protocol.
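For that reverse direction, a sketch of the SELECT statement that CI could execute against the Datasphere Open SQL schema (for example, in a Request-Reply step with the JDBC receiver) is shown below. The schema and table names are the same illustrative assumptions as before.

```python
# Sketch: build a SELECT statement against the Datasphere Open SQL schema.
# Schema and table names are illustrative assumptions.

def build_select(schema, table, columns=None):
    """Build a quoted SELECT statement; all columns if none are given."""
    cols = ", ".join(f'"{c}"' for c in columns) if columns else "*"
    return f'SELECT {cols} FROM "{schema}"."{table}"'

query = build_select("Space_Schema#User_Name", "EMPLOYEE_DATA",
                     ["EMP_ID", "EMP_NAME"])
print(query)  # SELECT "EMP_ID", "EMP_NAME" FROM "Space_Schema#User_Name"."EMPLOYEE_DATA"
```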


This article is for informational purposes only, showing that CI can also be integrated with Datasphere.

Only one integration method has been shown as an example, but there are no limitations in any perspective.


      Abraham MD

      I appreciate the informative blog. In our case, we often encounter scenarios involving large data volumes with Datasphere, which need to be pulled from external sources. Based on the ISAM methodology, this falls under data integration style, and there are various technologies available for handling such cases.

      In my opinion, while CPI (Cloud Platform Integration) allows us to achieve process integration effectively, it might not be the ideal solution for data integration purposes. Data integration typically requires specialized tools and technologies designed specifically for handling data movement, transformation, and synchronization across various systems. Therefore, I believe that CPI might not be the most suitable choice for data integration tasks. What is your opinion?

      Farooq Ahmed
      Blog Post Author

      Yes, that is true. When we have large data volumes, CI would not be a fit-to-standard approach. It should only be used where a transactional set of data is being moved into Datasphere from systems already connected to CI.

      Rajesh PS

      Abraham MD

      Farooq Ahmed

      SAP Data Intelligence Cloud would be better suited to handle large volumes, acting as a data orchestration layer/middleware, and can also work via table views/mirror DB.
      Martin Kreitlein

      Hello Farooq,

      you seem to be a real data integration specialist 🙂

      In my current project, I have the requirement (besides the S/4HANA connection) to upload one MS SharePoint list daily.

      I've been waiting 8 weeks now for my sales resp. to find a suitable SAP contact to discuss the easiest and, in proportion to costs, best way to solve that.

      I read that SAP Datasphere should be able to read an SQL Server live, which in turn should be able to read a SharePoint list live. But I could not get any confirmation about this setup yet.

      What is your recommendation for such small scenarios?

      Thanks and best regards, Martin

      Farooq Ahmed
      Blog Post Author

      Hi Martin,

      Thanks for your valuable words.

      Regarding your requirement to read the MS SharePoint list into Datasphere, you can check the following top-down approaches:

      1. Open Connectors on BTP Integration Suite enabled to connect MS SharePoint, which is then used in the Datasphere connection (disabled by default), like below. This can be one live connection.

      2. If your integration requirement does not have too many files in MS SharePoint, then CI can also be another option, like MS SP => CI => Datasphere without any open connector, referring to my blog.

      3. If there are many files, which can be modelled into an SFTP location, then a direct generic SFTP connection into Datasphere will be the best option.

      Other options, such as an SQL Server or third-party connectors, involve too much configuration on both the source and Datasphere side, and there can be some grey areas with associated risks. So, in my opinion, the easiest and in proportion to cost best way for your requirement of one MS SP list daily would be generic SFTP ingestion into Datasphere, without much configuration.

      Hope this helps you.



      Rajesh PS

      Martin Kreitlein

      Farooq Ahmed

      SAP CPI & BTP IS are not completely mature and full-fledged when it comes to connecting with Apache Kafka event streams, Blobs/Azure services, and SharePoint/OneDrive; it still looks like a broken solution and needs improvement. Not sure about SAP BTP Datasphere.
      Does BTP IS have an open connector for Apache Kafka?