Axel Meier

SAP HANA SQL Data Warehouse – Data Warehousing Foundation

The SAP HANA Data Warehousing Foundation option is a series of packaged data management tools to support (large scale) HANA SQL Data Warehouse use cases. With SAP HANA Data Warehousing Foundation, you can achieve smart data distribution across complex landscapes, optimize the memory footprint of data in SAP HANA and streamline administration and development. 

In this post, we will introduce the tools and solutions that are available as a part of the SAP HANA Data Warehousing Foundation (DWF) and provide training resources. The tools we’ll be covering include:

  • Data Distribution Optimizer (DDO)
  • Data Lifecycle Manager (DLM) – XS-Classic and XS-Advanced
  • Native DataStore Object (NDSO)
  • Data Warehousing Scheduler (DWS)
  • Data Warehousing Monitor (DWM)

Data Distribution Optimizer (DDO)  – XS-Classic

The Data Distribution Optimizer (DDO) provides the capability to plan (simulate), adjust, and analyze landscape reorganizations for SAP HANA scale-out systems (BW/4HANA, BW on HANA, HANA SQL DW, data marts, S/4HANA).
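
To make the reorganization planning more concrete, the sketch below shows the kind of plain SQL operations a DDO plan ultimately boils down to. The schema, table, and host names are made up for illustration; DDO simulates, groups, and executes such steps for you rather than requiring hand-written statements.

-- Illustrative only (hypothetical schema/table/host names): placement and
-- partitioning steps of the kind a DDO reorganization plan results in.

-- Repartition a large fact table so its parts can be spread across hosts
ALTER TABLE "DWH"."SALES_FACT" PARTITION BY HASH ("ORDER_ID") PARTITIONS 4;

-- Relocate the table (or one of its partitions) to a dedicated worker node
ALTER TABLE "DWH"."SALES_FACT" MOVE TO 'hanaworker02:30003';

-- Check the resulting distribution, similar to what DDO analyzes and reports
SELECT TABLE_NAME, PART_ID, HOST, RECORD_COUNT
  FROM M_CS_TABLES
 WHERE SCHEMA_NAME = 'DWH'
 ORDER BY TABLE_NAME, PART_ID;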

 

SAP HANA Academy DWF Channel: 

 


Data Lifecycle Manager (DLM) – XS-Classic

The Data Lifecycle Manager (DLM) provides the capability to archive/displace data from an SAP HANA persistence to a multistore table, a HANA extension node, Dynamic Tiering, SAP IQ, Hadoop, or SAP Vora** (**restricted shipment with DWF 2.0 SP02 – XSA).
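
As a rough illustration of what such a displacement amounts to, the sketch below uses hypothetical table and column names with Dynamic Tiering as the cold target; DLM generates and schedules operations along these lines from a lifecycle rule such as "move everything older than two years out of memory".

-- Hypothetical names; a sketch of the copy-then-delete displacement DLM automates.

-- Cold target: a disk-based Dynamic Tiering table in extended storage
CREATE TABLE "DWH"."SALES_FACT_COLD" (
  "ORDER_ID"   INTEGER,
  "ORDER_DATE" DATE,
  "AMOUNT"     DECIMAL(15,2)
) USING EXTENDED STORAGE;

-- Relocate aged records from the hot in-memory table ...
INSERT INTO "DWH"."SALES_FACT_COLD"
  SELECT "ORDER_ID", "ORDER_DATE", "AMOUNT"
    FROM "DWH"."SALES_FACT"
   WHERE "ORDER_DATE" < ADD_YEARS(CURRENT_DATE, -2);

-- ... and remove them from the hot store to free up memory
DELETE FROM "DWH"."SALES_FACT"
 WHERE "ORDER_DATE" < ADD_YEARS(CURRENT_DATE, -2);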

 

SAP HANA Academy DWF Channel:


Data Lifecycle Manager (DLM) – XS-Advanced

The Data Lifecycle Manager (DLM) for XSA provides the capability to archive/displace data from an SAP HANA persistence to Hadoop or SAP Vora** (**restricted shipment with DWF 2.0 SP02 – XSA).
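
For a Hadoop target, the cold store is typically exposed to SAP HANA as a virtual table through an SDI/SDA remote source, and the displacement follows the same copy-then-delete pattern as in the sketch above. A hedged example with a made-up remote source and remote object name:

-- Hypothetical remote source and object names: a Hadoop table made reachable
-- from HANA as a virtual table, which can then serve as the relocation target.
CREATE VIRTUAL TABLE "DWH"."SALES_FACT_ARCHIVE"
  AT "HADOOP_SRC"."<NULL>"."<NULL>"."sales_fact_archive";

INSERT INTO "DWH"."SALES_FACT_ARCHIVE"
  SELECT "ORDER_ID", "ORDER_DATE", "AMOUNT"
    FROM "DWH"."SALES_FACT"
   WHERE "ORDER_DATE" < ADD_YEARS(CURRENT_DATE, -2);

DELETE FROM "DWH"."SALES_FACT"
 WHERE "ORDER_DATE" < ADD_YEARS(CURRENT_DATE, -2);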


Native DataStore Object (NDSO) – XS-Advanced

The Data Warehousing Foundation (DWF) native DataStore Object (NDSO) is established as the central persistence object for modeling HANA SQL Data Warehouse systems.

The NDSO is integrated with:

  • SAP Web IDE for SAP HANA (XSA) for modeling, implementation, and administration
  • Enterprise Architecture Designer (EAD) for modeling
  • HANA SDI flowgraphs for the NDSO data load
  • Data Warehousing Foundation (DWF) task chains (DWS) and the Data Warehouse Monitor (DWM) to execute and monitor the NDSO data load and data activation tasks

The NDSO allows merging of delta and full data loads into its reportable content and also provides delta-data processing capabilities to connected data targets.

The NDSO data merge process is triggered by the Data Activation task.

The NDSO can process records based on the RECORDMODE values specified in the source data (INSERT / UPDATE / DELETE, i.e. change data capture).

Simplified flowgraph design

  • No need to design specific INSERT / UPDATE / DELETE handling within the flowgraph, as the NDSO manages the individual data processing based on the RECORDMODE values

The NDSO can also roll back (delete) previously loaded and activated data sets.
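
Conceptually, the activation step boils down to RECORDMODE-driven merge logic along the following lines. The inbound and active tables here are hypothetical stand-ins for illustration; the actual NDSO persistence objects are generated and managed by DWF.

-- Conceptual sketch, not the generated NDSO objects: how RECORDMODE values
-- from a CDC-style source drive the activation of delta data.

-- New records and after-images are upserted into the active data
UPSERT "DWH"."MYDSO_ACTIVE"
  SELECT "ORDER_ID", "STATUS", "AMOUNT"
    FROM "DWH"."MYDSO_INBOUND"
   WHERE "RECORDMODE" <> 'D'
  WITH PRIMARY KEY;

-- Records flagged for deletion are removed from the active data
DELETE FROM "DWH"."MYDSO_ACTIVE"
 WHERE "ORDER_ID" IN (SELECT "ORDER_ID"
                        FROM "DWH"."MYDSO_INBOUND"
                       WHERE "RECORDMODE" = 'D');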


Data Warehousing Scheduler (DWS) – XS-Advanced

The Data Warehouse Scheduler (DWS) maintains dependencies between single tasks, focusing on the task types used to provision data warehouse models (HANA SDI flowgraphs and NDSO data activation) and on data tiering tasks (HANA procedures).
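
In effect, a task chain encodes a rule such as "run the flowgraph load first, and start the activation only if the load succeeded". Purely as a conceptual illustration of that dependency (the procedure names below are invented; in DWF the chain is modeled as a design-time artifact and executed by the scheduler, not hand-coded like this):

-- Conceptual illustration only; hypothetical procedure names, not DWF artifacts.
CREATE PROCEDURE "DWH"."LOAD_THEN_ACTIVATE" LANGUAGE SQLSCRIPT AS
BEGIN
  -- Step 1: run the SDI flowgraph that fills the NDSO inbound queue
  CALL "DWH"."FG_LOAD_SALES"();
  -- Step 2: reached only if step 1 succeeded; activate (merge) the new request
  CALL "DWH"."ACTIVATE_SALES_NDSO"();
END;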


Data Warehousing Monitor (DWM) – XS-Advanced

The Data Warehouse Monitor (DWM) provides the capability to schedule and monitor the execution of task chains in the data warehouse.


HANA SQL Data Warehouse – XS-Advanced

Additional resources on SAP HANA SQL Data Warehousing:

DWF SAP SDN Page:

http://scn.sap.com/docs/DOC-62482

DWF SAP Help Page:

http://help.sap.com/hana_options_dwf

Explore next-generation Data Warehousing solutions from SAP.

      5 Comments
      Mel Calucin

      Axel,

      Thanks for this blog.

      One of the things that can be done in SAP BW, which I haven't figured out how to do in a native HANA DW, is distributed processing. Transformations exist in BW and are executed by DTPs. Within the DTP, the package size coming from the source data is set, as well as the number of background work processes to be used. The background work processes can span multiple ABAP application servers. If more background work processes are needed, another ABAP application server can be added.

      What is the analogous distributed processing framework for a native HANA DW?

      Thanks.

      Regards,

      Mel Calucin

      Axel Meier (Blog Post Author)

      Mel,
      we're leveraging SAP HANA SDI (Smart Data Integration) within the context of the HANA SQL DW (native HANA DW) to process the data between database persistence objects (tables, NDSOs).
      The SAP HANA SDI flowgraph provides the capability to configure "partitions" / data sets (based on column name and data ranges). The number of partitions to be processed in parallel is configurable.

      SAP HANA SDI provides workload and resource consumption management, which is integrated with SAP HANA workload management (see section 7.10 in the SDI Administration Guide referenced below).

      Please feel free to have a look at the SDI Admin Guide for the details:

      https://help.sap.com/doc/PRODUCTION/dd356918159e4ba1a32927b0b917b78c/2.0_SPS00/en-US/SAP_HANA_EIM_Administration_Guide_en.pdf

      Thanks,
      Axel

      Mel Calucin

      Axel,

      Thanks for the reply. I understand now how to do it in HANA directly.

      I think one advantage of BW is that it is easier and more cost effective to scale. With BW, we can add another app server.

      Is working with HANA distributed computing just as easy to scale and just as cost effective?

      Thanks.

      Regards,

      Mel Calucin

      Axel Meier (Blog Post Author)

      Mel,
      the HANA SDI Data Provisioning Agent can be installed on regular hardware (no HANA hardware required).

      You may have multiple instances of the Data Provisioning Agent installed on different machines. In addition to installing SAP HANA in a multiple-host configuration, you can use agent grouping to provide automatic failover and load balancing for SAP HANA smart data integration and SAP HANA smart data quality functionality in your landscape.

      The Master Guide for SAP HANA Smart Data Integration and SAP HANA Smart Data Quality provides the details:
      https://eaexplorer.hana.ondemand.com/rest/mimeRepositories/12447/file/SAP_HANA_EIM_Master_Guide_en.pdf

      Thanks,
      Axel

      Nathan Crawford

      Axel,

      Is there a product availability matrix available?

      I want to see if our HANA platform can support DLM.

       

      Thanks,

      Nate.