Modeling – Data Layer
A number of new features support making changes to data entities that are used as sources by other objects:
Replace Sources in the Graphical View Editor
You can now replace sources in the graphical view editor with a simple drag and drop action. The output structure of the view is preserved as far as possible, and you are guided through mapping the columns from the old source to the new source. Any impacts caused by the replacement in the view are indicated by validation messages.
For more information, see Replacing a Source.
Delete Used Columns
You can now delete a column in a table or view even if that column is used in views built on top of the current entity. You will receive a warning message after deletion, and modelers of dependent views will receive a notification along with validation messages to guide them through repairing their views.
Review and Navigate to Dependent Objects of Tables and Views
You can use the new Dependent Objects section in the properties of tables and views to review and navigate to the objects that depend on the current object, either as a source or via an association.
Object Status Enhancements
The Deployment Status property of tables and views is renamed to Status, and now has new values to indicate errors in the design-time or run-time versions of the object. In addition, the list of objects in the Data Builder start page now includes Status and Deployed On columns, to provide visibility for this information without opening an editor.
You can now edit both the business and technical names of your view columns in a projection node.
For more information, see Reordering, Renaming, and Excluding Columns.
You can now save changes to data layer objects, even if your space is locked.
For more information, see Space Status.
Related article: When the Space is locked
In a data flow, you can define filter conditions for the consumption of datasets from CDI, OData, and ABAP CDS sources in the Source Filters section of the Properties pane. Filter conditions can be defined on the same column or on different columns.
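As an illustration of how such source filters narrow the consumed dataset, here is a minimal Python sketch. The combination rule used here (a set of allowed values per column, combined with AND across columns) and all names are assumptions for illustration only, not SAP-documented behavior.

```python
# Illustrative sketch of source filters narrowing a dataset before consumption.
# Assumption (not from the release note): values listed for one column are
# alternatives, and conditions on different columns are combined with AND.

def apply_source_filters(rows, filters):
    """rows: list of dicts; filters: dict mapping column -> set of allowed values."""
    return [
        row for row in rows
        if all(row.get(col) in allowed for col, allowed in filters.items())
    ]

rows = [
    {"COUNTRY": "DE", "YEAR": 2021},
    {"COUNTRY": "FR", "YEAR": 2021},
    {"COUNTRY": "DE", "YEAR": 2020},
]
# Keep rows where COUNTRY is DE or FR, AND YEAR is 2021.
filters = {"COUNTRY": {"DE", "FR"}, "YEAR": {2021}}
print(apply_source_filters(rows, filters))
# [{'COUNTRY': 'DE', 'YEAR': 2021}, {'COUNTRY': 'FR', 'YEAR': 2021}]
```

Pushing such filters down to the source reduces the volume of data transferred into the data flow, which is the main benefit of defining them on the source rather than in a later operator.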
A new Features section in the connection creation wizard and in the edit dialog improves transparency about the features, such as Remote Tables or Data Flows, that you can use with a connection, and lets you easily enable or disable a feature for the connection. The option to select the Data Provisioning agent used by remote tables for accessing data in on-premise sources has moved to this new section. You can disable a feature for the connection as long as the connection isn't used yet.
For more information, see Create a Connection.
With the new connection type SAP S/4HANA Cloud, you can connect directly to SAP S/4HANA Cloud instances. It makes connection creation easier by providing configuration options tailored to the source, and it removes the need to use the SAP ABAP connection type for connecting to SAP S/4HANA Cloud. You can use the new connection type for data flows as well as for building views and accessing data with remote tables.
For more information, see SAP S/4HANA Cloud.
For legacy ABAP systems that don't have the ABAP Pipeline Engine extension (APE) or the DMIS add-on installed, you can now leverage Operational Data Provisioning (ODP) connectivity when modeling data flows. Connection types SAP ECC and SAP BW support the respective source systems without APE installed; connection type SAP ECC additionally supports SAP S/4HANA versions lower than 1909.
If you use SAP Landscape Transformation Replication Server (SAP LT Replication Server) to facilitate trigger-based replication of data in your system landscape, you can use connection type SAP ABAP to establish a connection to SAP LT Replication Server and use the connection and the sources it provides in data flows.
For more information, see SAP ABAP.
SAP ABAP-based data sources require specific data types and formats for dates.
With the new DATE_SAP data type, we now support the modeling of generated time dimensions with these sources, enabling you to create models and reports at various time granularities such as year, quarter, and month.
To get the update, go to your space, open the space details → Time Data, and click Update.
We've changed the terminology for database users. Instead of speaking about data ingestion and data consumption, we now refer to the two privileges that you can select when creating a database user: read (formerly known as consumption) and write (formerly known as ingestion).
You no longer need to leave SAP Data Warehouse Cloud to analyze Data Provisioning Agent issues. We have integrated the agent adapter framework log (framework.trc file) and the agent adapter framework trace log (framework_alert.trc file) into SAP Data Warehouse Cloud. Administrators can enable log access for an agent on the Data Integration tab of the Administration area and review the logs from the agent's tile.
For more information, see Monitoring Data Provisioning Agent Logs.
Related article: DP Agent Monitoring