Overview

I thought I would share this experience, with screenshots, of how simple it is to perform some minimalist HANA data footprint control using the SAP tool Data Lifecycle Management (DLM), which comes as part of the SAP HANA Data Warehouse Foundation (DWF).

I do not have SAP IQ or Data Tiering, and my DLM policy is simply to delete/destroy the data when it reaches a certain age, so we will be performing some simple data destruction. It’s not a bad start to get into the whole DLM strategy, and there is more to it than what I will go into here.

The Installation Guide is very good and is what I mostly followed through the process. This blog is intended to support, confirm and assure your install alongside the Installation Guide: http://help.sap.com/hana/SAP_HANA_DWF_Installation_Guide_en.pdf.

There are also some good DLM videos over in the SAP Academy – https://www.youtube.com/user/saphanaacademy

Blog Structure

  • Downloads
  • Installation
  • Configuration
  • Operation

Downloads

The DLM comes as part of the DWF, which at the time of writing is on version 1.0 SP05, for our HANA on SPS12.

The DWF is non-cumulative, so you can install the latest version without having to patch through intermediate revisions. Just make sure you choose the DWF release that matches your HANA SPS.

The DWF has four components; not all are necessary, depending upon your intentions.

  • DATA LIFECYCLE MANAGER 1
  • DATA DISTRIBUTION OPTIMIZER 1
  • HANA DATA MANAGEMENT 1
  • HANA DWF DOCU 1

For the sake of this blog, I will be downloading and installing 3 components, in this order, as the Installation Guide suggests.

  1. HANA DATA MANAGEMENT 1 – HCOHDM05_0-80000034.zip
  2. DATA LIFECYCLE MANAGER 1 – HCOHDMDLM05_0-80001006.zip
  3. HANA DWF DOCU 1 – HDCHDM05_0-80001017.zip

 

Files

For completeness, the HCO_HDM_DDO is for the Data Distribution Optimizer.

Installation

During the install, three schemas are created.

We performed the Delivery Unit install using our usual authorized user.

Configuration

I will not document the configuration process, as the Installation Guide is quite good; however, here are some screenshots you may find useful, under the respective headings from the Installation Guide.

4.2 Configure SAP HANA System Properties

xsengine.ini settings

4.3 Activate SQL Connection Configurations

XS Artifact Admin

http://<myhostname>:8030/sap/hana/xs/admin/

Navigate to the specific area (do not perform a search, as it will return something different); here is where you need to be

and activate

 

4.4.2.2 Custom Privileges at Entity Level for Data Lifecycle Manager

Source Privileges

I created a new user, DLM_ADMIN, to perform the DLM activities.
DLM_ADMIN will perform various actions on tables within schemas, e.g. DELETE. In my case, all my custom tables containing the data I want to delete exist in a single schema, so I applied the relevant three source privileges to the schema rather than to each individual table.
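As a rough sketch, that schema-level grant could look like the following. The schema name is my own, and you should take the exact privilege list from section 4.4.2.2 of the Installation Guide rather than from this example:

```sql
-- Sketch only: grant source privileges once at schema level rather
-- than per table. Substitute the exact privileges the Installation
-- Guide lists for your scenario.
GRANT SELECT, DELETE ON SCHEMA "MONITIQ_TABLES" TO DLM_ADMIN;
```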

Target Privileges

This user will later have more authorizations (default sap.hdm.dlm.role.GNR.Administrator) as and when the target “Storage Destination” is created.

4.6.3 Generate Default Schema for Generated Objects and Roles Needed for Data Lifecycle Manager

At this stage, we assign our DLM_ADMIN user the appropriate privileges for ownership of the default generated schema, SAP_HDM_DLM_GNR.

Give DLM_ADMIN the required prerequisites:

  • System privileges DATA ADMIN and ROLE ADMIN
  • Object privilege EXECUTE on “_SYS_REPO”.”GRANT_ACTIVATED_ROLE”
  • Role sap.hdm.dlm.role::Administrator
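As a sketch, these prerequisites can be granted in SQL by a suitably authorized user. Note that activated repository roles such as sap.hdm.dlm.role::Administrator cannot be assigned with a plain GRANT; they go through the _SYS_REPO procedure:

```sql
-- System privileges
GRANT DATA ADMIN TO DLM_ADMIN;
GRANT ROLE ADMIN TO DLM_ADMIN;

-- Allow DLM_ADMIN to grant activated repository roles itself
GRANT EXECUTE ON "_SYS_REPO"."GRANT_ACTIVATED_ROLE" TO DLM_ADMIN;

-- Assign the activated DLM administrator role
CALL "_SYS_REPO"."GRANT_ACTIVATED_ROLE"('sap.hdm.dlm.role::Administrator', 'DLM_ADMIN');
```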

Readiness

Performing the CALL statement as DLM_ADMIN

DLM_ADMIN Roles

DLM_ADMIN Privileges

Now ready for use

Operation

MANAGE STORAGE DESTINATIONS

As your DLM_ADMIN user, navigate to “MANAGE STORAGE DESTINATIONS”, which is where we set up the “Storage Destination” details to be used during the creation of a “Manage Lifecycle Profile”.

You can see from above, the “Storage Destination Type” selected is “Deletion Bin Destination”, and the default schema “SAP_HDM_DLM_GNR”.

“Save”, “Activate”, and “Test Connection”, and you should end up with something like below.

MANAGE LIFECYCLE PROFILES

“MANAGE LIFECYCLE PROFILES” is used to configure sources and targets for the DLM. There can be more than one profile, depending upon your use.

Source Persistence is configured for a specific “SAP HANA Table”; in my case, the table “hist-linux-cpu” in my “MONITIQ_TABLES” schema.

For the sake of this blog, I will be triggering the DLM using a “Scheduled” job.

My table has a defined key; however, if there is not one, you would have to specify a key in the “Nominal Key” field (to be figured out).

Destination Attribute

The “Destination Attributes” are all self-explanatory, and you can now see the “Storage Destination” created earlier.

Destination Persistence

“Destination Persistence” will appear after activating the profile. There is no interesting Data Flow for a deletion profile.

Rule Editor

The “Rule Editor” only contained the “SQL Based Rule Editor”, which was sufficient for my use. A nice feature shows the number of affected records in real time, based upon your current rule.
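To give a feel for the rule syntax, an age-based rule is essentially a WHERE-clause condition over the source table. The column name “CAPTURE_TS” below is hypothetical; substitute your own timestamp column and retention period:

```sql
-- Hypothetical rule: records older than 365 days are selected
-- for relocation (here, to the Deletion Bin, i.e. destruction)
"CAPTURE_TS" < ADD_DAYS(CURRENT_DATE, -365)
```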

A simple query in Studio can confirm the numbers.
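For example, a count using the same condition as the rule (again with the hypothetical “CAPTURE_TS” column) should match the figure shown in the Rule Editor:

```sql
-- Cross-check the affected record count shown by the Rule Editor
SELECT COUNT(*) AS AFFECTED_RECORDS
FROM "MONITIQ_TABLES"."hist-linux-cpu"
WHERE "CAPTURE_TS" < ADD_DAYS(CURRENT_DATE, -365);
```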

Now to “Save” and “Activate”

Note the “SMO Destruction Bin” “Data Distribution” section never displays, and is greyed out.

Generated Object(s)

“Generated Object(s)” will appear after activating the profile

I can navigate out to the schema and see this database procedure

Simulate

At this stage we can run a couple of simulations

Data Relocation Preview

 

Data Relocation Count

Import/Export

I can Export/Import this configuration as a text file to/from other environments


Run

Now we are ready to actually perform the DLM activity. For the sake of this blog, I configured the DLM Profile to Scheduled earlier, so “Schedule” is the only option I have to run. I set it to run a few minutes in the future, with no recurrence.



After the job run, take a look at the “Logs” (ignore the start time inconsistency, as I had to re-schedule)

Which brings me to the point of what I do not like about the scheduling: there is nowhere to see the intended schedule prior to it being run. For example, how can I confirm I have actually scheduled it?

Click “ID” to get more detailed information



After a successful report from the logs, I went to investigate.

Before

Reminder of above


After

A recount certainly shows records have been removed

The Profile graphic has updated the source number of records to suit, but does not update the “SMO Destruction Bin”. Whether it is meant to, who knows.

Anyway, the records are gone… forever. There is no functionality within the DLM to put the records back.

Miscellaneous

In the top right hand side, there are two tags

“Open XS Job Tool” shells me out to the HANA XS Admin Job Tool, where I can see more information about the job

The Job can be further edited to an extent

“Versions” simply gives me some basic information about the Profile

Summary

So that about wraps up my quick and easy, minimalist experience with DLM using the Deletion Bin functionality. Hope there was something in there for you.
