As with any application that works with big data, archiving plays an important role, and data that is rarely accessed needs to be archived periodically. Memory is more expensive than disk, so hot data (frequently accessed) should be kept in SAP HANA and cold data (rarely accessed) on disk.

The focus of this article is to explain how archiving of data (moving cold data to disk) related to a business scenario in SAP Operational Process Intelligence can be achieved with SAP HANA Dynamic Tiering and SAP HANA Data Lifecycle Manager (which is part of SAP HANA Data Warehousing Foundation).

We also explain how archived data (cold data) can be brought back to SAP HANA.

For more details on:

  • SAP Operational Process Intelligence, visit this space
  • SAP HANA Dynamic Tiering, refer to this link
  • SAP HANA Data Lifecycle Manager, refer to this link

Prerequisites

  • SAP Operational Process Intelligence 1.12 or higher
  • SAP HANA Dynamic Tiering installed and configured
  • Data Lifecycle Manager installed and configured

Data that will be archived           

  • Completed or abruptly ended instances of a business scenario
  • Completed tasks associated with a business scenario
  • Completed checklists associated with a business scenario

Required Authorizations

  • Role sap.opi.pv.roles::OPINTSERVICE: to access data related to the business scenario
  • Object privilege for the TASKMGT schema with Delete, Execute, Insert, Select and Update privileges: to access tasks related to the business scenario


Procedure to perform archiving

The following procedure archives data for a single scenario. To archive data for other scenarios, create separate profiles for each scenario.

Step 1: Launch Data Lifecycle Manager

To start the Data Lifecycle Manager tool, enter one of the following URLs in your browser, depending on whether the HTTP or HTTPS port has been configured:

  • http://<<Host>>:80<<SAP HANA instance number>>/sap/hdm/dlm
  • https://<<Host>>:43<<SAP HANA instance number>>/sap/hdm/dlm

Step 2: Create Storage Destination

Create a storage destination with type as “HANA Dynamic Tiering Local”. After creating and saving the storage destination, activate it.

Refer to the section Managing Storage Destinations in the Data Lifecycle Manager guide for details on creating a storage destination.

Note:

  • This step is to be performed only for the first scenario that you archive, as the same storage destination can be used across scenarios.

Step 3: Create Table Groups

Create two table groups, one for data related to the business scenario and the other for tasks.

Refer to the section Managing Modeled Persistence Objects in the Data Lifecycle Manager guide for details on creating table groups.

Business scenario tables group

  • Tables: event log and context table(s) of the business scenario, present under the “SYS_PROCESS_VISIBILITY” schema
    • Event log of the business scenario: <<packagename>>.gen_<<scenarioname>>::SPVR_<<scenarioname>>_EVT
    • Context of the business scenario: <<packagename>>::SPVR_<<scenarioname>>_<<ParticipantFamilyId>>_CTX
  • Common column: “SCENARIO_INSTANCE_ID”
  • Example:
    • Event log: opint.retail.oci.gen_oci::SPVR_OCI_EVT
    • Context: opint.retail.oci::SPVR_OCI_O2C_CTX
  • Note: there can be more than one context table for the scenario; all of them should be included when creating the table group.

Task tables group

  • Tables:
    • “TASKMGT”.”sap.bc.taskmgt.task::HISTORY”
    • “TASKMGT”.”sap.bc.taskmgt.task::ASSIGNMENT”
    • “TASKMGT”.”sap.bc.taskmgt.task::COMMENT”
    • “TASKMGT”.”sap.bc.taskmgt.task::TASK” (Tip: to add the TASK table, search for “::TASK”)
    • “TASKMGT”.”sap.bc.taskmgt.task::TASK_PARAMS”
    • “SYS_PROCESS_VISIBILITY”.”sap.opi.pv.insight2action::SPVR_TASK_REFERENCE”
  • Common column: “TASK_ID”


Step 4: Create Lifecycle Profiles

Three lifecycle profiles need to be created.

Refer to the section “Managing Lifecycle Profiles” in the Data Lifecycle Manager guide for details on creating lifecycle profiles.

Business Scenario Profile

  1. In the Source Persistence tab:
    1. Source persistence type – “SAP HANA table Group”
    2. Table Group Name – Business scenario tables group
    3. Trigger Type – “Scheduled”
  2. In the Destination Attributes tab ensure the following:
    1. Relocation Direction – “Hot to Cold”
    2. Clash strategy Hot to Cold – “Overwrite”
  3. Go to the Rules Editor tab and add the query “BusinessScenario_query”, which is attached to this article. Replace the placeholder <<<EVT>>> with the fully qualified table name of the scenario EVT table. Adjust the parameter of “ADD_DAYS”; for example, use -90 to archive all business scenario instances that completed more than 90 days ago. Click “Validate Syntax” to verify the query.
  4. Save and Activate the profile.
  5. The profile should be activated without any errors.
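The attached BusinessScenario_query remains the authoritative rule. Purely as an illustration of the general shape such a DLM rule can take, a sketch selecting instances that ended more than 90 days ago might look as follows; the column names “STATUS” and “TIMESTAMP” and the status values are assumptions for this sketch, not taken from the product:

```sql
-- Illustrative sketch only; use the attached BusinessScenario_query.
-- "STATUS", "TIMESTAMP" and the status values are assumed names for this sketch.
"SCENARIO_INSTANCE_ID" IN (
    SELECT "SCENARIO_INSTANCE_ID"
    FROM <<<EVT>>>                                    -- fully qualified scenario EVT table
    WHERE "STATUS" IN ('COMPLETED', 'ABORTED')        -- completed or abruptly ended instances
      AND "TIMESTAMP" < ADD_DAYS(CURRENT_UTCTIMESTAMP, -90)  -- older than 90 days
)
```

The rule is a filter on the table group's common column, “SCENARIO_INSTANCE_ID”, so every table in the group is relocated consistently for the matching instances.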

Task Profile

  1. The Relocation Direction, Clash strategy, Source persistence type and Trigger Type should be the same as for business scenario profile.
  2. Table Group Name should be given as the task tables group.
  3. Go to the Rules Editor tab and add the query “Task_query”, which is attached to this article. Replace the placeholders <<<EVT>>> with the fully qualified table name of the scenario EVT table and <<<scenario_def_id>>> with the scenario definition ID.
    1. Note: The scenario definition ID can be found in the event log table of the scenario; use the value in the “SCENARIO_DEF_ID” column as the value for <<<scenario_def_id>>> in the query.
  4. Change the parameter of “ADD_DAYS” to the same value as in the business scenario profile. Click “Validate Syntax” to verify the query.
  5. Save and Activate the profile.
  6. The profile should be activated without any errors.
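The attached Task_query is authoritative; as a sketch of the shape such a rule might take, tasks can be tied back to archivable scenario instances through the SPVR_TASK_REFERENCE table. The column names “SCENARIO_DEF_ID”, “SCENARIO_INSTANCE_ID” and “TIMESTAMP” are assumptions for this sketch:

```sql
-- Illustrative sketch only; use the attached Task_query.
-- Column names on SPVR_TASK_REFERENCE and the EVT table are assumptions here.
"TASK_ID" IN (
    SELECT TR."TASK_ID"
    FROM "SYS_PROCESS_VISIBILITY"."sap.opi.pv.insight2action::SPVR_TASK_REFERENCE" TR
    WHERE TR."SCENARIO_DEF_ID" = '<<<scenario_def_id>>>'   -- scenario definition ID
      AND TR."SCENARIO_INSTANCE_ID" IN (
          SELECT "SCENARIO_INSTANCE_ID"
          FROM <<<EVT>>>
          WHERE "TIMESTAMP" < ADD_DAYS(CURRENT_UTCTIMESTAMP, -90)  -- same value as the business scenario profile
      )
)
```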

Checklist Profile

  1. In the Source Persistence tab:
    1. Source persistence type – “SAP HANA Table”
    2. Schema – “SYS_PROCESS_VISIBILITY”
    3. Table – “sap.opi.pv.insight2action::CHECKLIST_REFERENCE”
  2. The Relocation Direction, Clash strategy and Trigger Type should be the same as for business scenario profile.
  3. Go to the Rules Editor tab and add the query “Checklist_query”, which is attached to this article. Replace the placeholder <<<EVT>>> with the fully qualified name of the scenario EVT table. Click “Validate Syntax” to verify the query.
  4. Save and Activate the profile.
  5. The profile should be activated without any errors.
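The attached Checklist_query is authoritative; as an illustration only, a rule of this shape would select checklists belonging to the same archivable instances. The column names “SCENARIO_INSTANCE_ID” and “TIMESTAMP” are assumptions for this sketch:

```sql
-- Illustrative sketch only; use the attached Checklist_query.
-- Column names on CHECKLIST_REFERENCE and the EVT table are assumptions here.
"CHECKLIST_ID" IN (
    SELECT CR."CHECKLIST_ID"
    FROM "SYS_PROCESS_VISIBILITY"."sap.opi.pv.insight2action::CHECKLIST_REFERENCE" CR
    WHERE CR."SCENARIO_INSTANCE_ID" IN (
        SELECT "SCENARIO_INSTANCE_ID"
        FROM <<<EVT>>>
        WHERE "TIMESTAMP" < ADD_DAYS(CURRENT_UTCTIMESTAMP, -90)  -- same value as the other profiles
    )
)
```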

Step 5: Run the lifecycle profiles

The lifecycle profiles created above have to be scheduled to run at specific intervals so that cold data is moved to SAP HANA Dynamic Tiering and is no longer visible in space.me.

  1. Click on “Run” -> “Schedule” to run archiving at the required frequency.
  2. Check the logs in the Data Lifecycle Manager tool to check the results.


Procedure to restore archived data

If you want to view the archived data in space.me again, move the archived business scenario data (cold data) back to SAP HANA by following the procedure below.

Step 1: Edit the profiles


Business Scenario Profile

  1. In the Destination Attributes tab ensure the following:
    1. Relocation Direction – “Cold to Hotter”
    2. Clash strategy Cold to Hot – “Skip”
  2. In the Rules Editor tab, change the query to “SCENARIO_INSTANCE_ID” IN (-999).
    1. Note: This is a dummy ID so that all data from dynamic tiering tables is moved back to SAP Operational Process Intelligence tables.
  3. Save and Activate the profile.
  4. The profile should be activated without any errors.


Task Profile

  1. Ensure the Relocation Direction and the Clash strategy Cold to Hot are set as described above.
  2. In the Rules Editor tab, change the query to “TASK_ID” IN (-999).
    1. Note: This is a dummy ID so that all data from dynamic tiering tables is moved back to SAP Operational Process Intelligence tables.
  3. Save and Activate the profile.
  4. The profile should be activated without any errors.


Checklist Profile

  1. Ensure the Relocation Direction and the Clash strategy Cold to Hot are set as described above.
  2. In the Rules Editor tab, change the query to “CHECKLIST_ID” IN (-999).
    1. Note: This is a dummy ID so that all data from dynamic tiering tables is moved back to SAP Operational Process Intelligence tables.
  3. Save and Activate the profile.
  4. The profile should be activated without any errors.

Step 2: Run the lifecycle profiles

Run the lifecycle profiles so that the required business scenario data can be moved to SAP HANA and you will be able to view the data in space.me.

  1. Click on “Run” -> “Schedule” to run the profiles at the required frequency.
  2. Check the logs in the Data Lifecycle Manager tool to check the results.