SAP Data Intelligence – What’s New in 3.3
SAP Data Intelligence 3.3, on-premise edition is now available.
In this blog post, you will find updates on the latest enhancements in 3.3, along with descriptions of the new functions and features delivered with the Q3 2022 release of SAP Data Intelligence.
If you would like to review what was made available in the previous release, please have a look at this blog post.
This section gives only a quick preview of the main developments in each topic area. All details are described in the following sections for each individual topic area.
SAP Data Intelligence 3.3
Connectivity & Integration
This topic area focuses mainly on the connection and integration capabilities used across the product – for example, in the Metadata Explorer or at the operator level in the Pipeline Modeler.
Support additional warehouse field in Snowflake connection
As a Data Intelligence administrator, you can now configure a Snowflake connection with an active warehouse in Connection Management.
With this new parameter, the Snowflake connection no longer depends on the connection user having a default warehouse assigned.
The Snowflake connection schema now shows a new configuration property, "Warehouse", in the UI when creating a connection.
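To illustrate the idea, here is a minimal sketch of a connection definition that pins an explicit warehouse. The field names are assumptions for illustration, not the exact Connection Management schema:

```python
# Illustrative sketch: a Snowflake connection definition with an explicit
# warehouse, so the connection no longer relies on the connection user's
# default warehouse. Field names are assumptions, not the exact schema.

def build_snowflake_connection(account, user, warehouse=None):
    """Assemble a connection configuration; 'warehouse' is optional."""
    config = {
        "type": "SNOWFLAKE",
        "account": account,
        "user": user,
    }
    if warehouse:
        # New in 3.3: set an active warehouse directly on the connection.
        config["warehouse"] = warehouse
    return config

conn = build_snowflake_connection("my_account", "di_user", warehouse="ANALYTICS_WH")
print(conn["warehouse"])  # ANALYTICS_WH
```

Without the `warehouse` key, such a connection would only work if the user has a default warehouse assigned in Snowflake.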
Cloud Table Producer Operator for Snowflake & Google BigQuery
A new Generation 2 operator, "Cloud Table Producer", writes data into Snowflake and Google BigQuery targets, using a staging area in the form of files.
The supported intermediate persistence options are:
- Snowflake using Amazon S3 or Azure Blob Storage (WASB)
- Google BigQuery using Google Cloud Storage (GCS)
The data from the source system is first loaded as files into the defined staging area, and in a second step replicated from those files into Snowflake or BigQuery.
Once the data has been successfully written to Snowflake or BigQuery via the Cloud Table Producer, the files are automatically deleted from the staging area.
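The two-step staging flow can be sketched as follows. This is a simplified, in-memory stand-in for the real operator (the dictionaries below imitate S3/WASB/GCS and the target table; none of the names come from the product):

```python
# Illustrative sketch of the Cloud Table Producer's staged load:
# 1) stage source data as files, 2) bulk-load the staged files into the
# cloud warehouse, 3) delete the staged files on success.
# In-memory stand-ins replace S3 / Azure Blob / GCS and the target table.

staging_area = {}   # stand-in for the file-based staging area
target_table = []   # stand-in for a Snowflake / BigQuery table

def stage(batch, path):
    staging_area[path] = batch               # step 1: write file to staging

def load_and_cleanup(path):
    target_table.extend(staging_area[path])  # step 2: bulk-load into target
    del staging_area[path]                   # step 3: delete staged file

stage([{"id": 1}, {"id": 2}], "stage/part-0001.csv")
load_and_cleanup("stage/part-0001.csv")
print(len(target_table), len(staging_area))  # 2 0
```

Staging through files is what allows the warehouse-side bulk loaders to do the heavy lifting, instead of row-by-row inserts.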
Support SNC for ABAP_LEGACY
As a system manager, you can activate SNC (Secure Network Communications) for RFC when using an ABAP_LEGACY connection, so that you can connect to a legacy ABAP on-premise/ECC system securely, with encryption.
The following connection parameters are introduced and need to be set when using SNC authentication:
- SNC Client Name
- SNC Partner Name
- Client Certificate
- ABAP Server Certificate
To use SNC via the SAP Cloud Connector (SCC), SNC needs to be configured on the SCC, and the connection created in SAP Data Intelligence Connection Management must use username and password authentication.
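As a rough illustration of the four parameters listed above, here is a hypothetical connection configuration with a small completeness check. The key names and SNC name format are placeholders, not the exact Connection Management schema:

```python
# Illustrative ABAP_LEGACY connection with SNC enabled. Key names and
# values are placeholders; they mirror the parameters listed above.

abap_legacy_snc = {
    "type": "ABAP_LEGACY",
    "protocol": "RFC",
    "sncEnabled": True,
    "sncClientName": "p:CN=DI_CLIENT, O=Example, C=DE",
    "sncPartnerName": "p:CN=ECC_SERVER, O=Example, C=DE",
    "clientCertificate": "<PEM-encoded client certificate>",
    "abapServerCertificate": "<PEM-encoded ABAP server certificate>",
}

REQUIRED_SNC_KEYS = ("sncClientName", "sncPartnerName",
                     "clientCertificate", "abapServerCertificate")

def snc_parameters_complete(cfg):
    """Return True only if all four SNC parameters are set."""
    return all(cfg.get(key) for key in REQUIRED_SNC_KEYS)

print(snc_parameters_complete(abap_legacy_snc))  # True
```

All four parameters must be supplied together; leaving any of them empty would make the SNC handshake fail.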
Pipeline Modeler
This topic area covers new operators, enhancements of existing operators, and improvements or new functionalities of the Pipeline Modeler and pipeline development.
Support Kafka Generation 2 operators
Apache Kafka producer and consumer operators are now available as Generation 2 operators, including state management and resilience.
New data type API (Generation 2)
- Easy data to message transform
- Streaming support
- Standardized error ports
Operators work seamlessly with other Generation 2 operators
- Consume Kafka data and ingest into selected data targets
- Produce Kafka messages from consumed sources
- No dedicated commit handling needed
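The consume-and-ingest pattern above can be sketched as follows. This is a simulation only: there is no real Kafka client, the names are illustrative, and the state dictionary merely imitates how Generation 2 operator state replaces explicit commit handling:

```python
# Simulated sketch of the Gen2 consume-and-ingest flow: Kafka-style
# messages arrive as a stream, each message is transformed into a
# structured record, and the offset is tracked as operator state
# (enabling recovery) instead of a dedicated commit step.
import json

state = {"last_offset": -1}   # Gen2-style operator state
ingested = []                 # stand-in for a downstream data target

def on_message(offset, value_bytes):
    record = json.loads(value_bytes)   # easy message -> data transform
    ingested.append(record)            # ingest into the selected target
    state["last_offset"] = offset      # resume point; no manual commit

for offset, payload in enumerate([b'{"id": 1}', b'{"id": 2}']):
    on_message(offset, payload)
print(state["last_offset"], len(ingested))  # 1 2
```

On restart, a resilient pipeline would resume from `state["last_offset"] + 1`, which is what makes dedicated commit handling unnecessary.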
Python 3.9 as default version
Python version 3.9 is now the default in the Python subengine. Python version 3.6 is removed since it has officially reached its end of life.
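For custom Python operator scripts that depend on 3.9-only features, a small runtime guard can make the removal of 3.6 explicit. This is a generic sketch, not a product API:

```python
# Sketch of a runtime guard for custom Python operator scripts:
# fail fast if the subengine runs an interpreter older than 3.9.
import sys

def check_python_version(version_info=None):
    """Raise if the interpreter is older than the 3.9 default."""
    version = version_info or sys.version_info
    if version < (3, 9):
        raise RuntimeError("Python >= 3.9 required; 3.6 reached end of life")
    return True

check_python_version((3, 9, 0))  # passes for the new default
```

Scripts that still import 3.6-era workarounds (for example, backports of `dict` ordering guarantees) can simply drop them under the new default.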
These are the new functions, features, and enhancements in the SAP Data Intelligence 3.3, on-premise edition release.
We hope you like them and, from the descriptions above, have already identified some areas you would like to try out.
If you are interested, please refer to What’s New in Data Intelligence – Central Blog Post.
For more updates, follow the tag SAP Data Intelligence.
We recommend visiting our SAP Data Intelligence community topic page for helpful resources, links, and posts from other community members. If you have a question, feel free to check out the Q&A area and ask it there.
Thank you & Best Regards,
Eduardo and the SAP Data Intelligence PM team