Tammy Powlas

Case Study: Data Aging in SAP HANA and S/4HANA

Please register for Introduction to SAP Data Volume Management webcast – a key part of this topic.

First, I would like to share a recent message about data aging from SAP:

Source: SAP

Below is an overview of data aging of application logs, object BC_SAL, done in both S/4HANA and Suite on HANA.

Source: SAP/ASUG

Improved resource management is achieved by using a multi-temperature data strategy:

- The older items get, the less they are used
- Items that were cleared in previous years should not be changed
- Items that are still used, such as current-year data, are considered hot
- Other, less-used items are cold
- Data that is considered cold can be moved out of main memory into slower storage and accessed when needed
- Changing the temperature of data from hot to cold according to application rules, and moving it to cold storage, is called data aging
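On the HANA side, this hot/cold split becomes visible in plain SQL: for an aged table, a query can be restricted to the hot partition, or to hot data plus cold data back to a chosen date. A minimal sketch – BALHDR is used here only as an example of an aged table:

```sql
-- Read only the hot (current) partition of an aged table
SELECT COUNT(*) FROM BALHDR WITH RANGE_RESTRICTION ('CURRENT');

-- Read hot data plus cold data aged on or after the given date
SELECT COUNT(*) FROM BALHDR WITH RANGE_RESTRICTION ('2015-01-01');
```

Without the RANGE_RESTRICTION clause, plain SQL reads all partitions, which is why keeping queries restricted matters for the memory savings.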

To activate the data aging business function, go to transaction SFW5, search for DAAG_DATA_AGING, and activate it

Authorizations are needed – SAP provides some standard roles, but I found they are missing some authorizations

The list of data aging objects in Suite on HANA is small; the list is much larger for S/4HANA

Use transaction DB02 to capture the statistics before and after.

Also run transaction TAANA

TAANA (Table Analysis: Administration) is the transaction for performing table analysis

You can use it to analyze tables according to certain fields – in this case, the DATAAGING field
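A similar distribution check can be sketched directly in HANA SQL – a minimal example, assuming the technical aging column is named _DATAAGING and again using BALHDR as an example table (hot rows carry the value '00000000'):

```sql
-- Row count per aging date; '00000000' marks hot (current) rows
SELECT "_DATAAGING", COUNT(*) AS row_count
FROM BALHDR
GROUP BY "_DATAAGING"
ORDER BY "_DATAAGING";
```

Running this before and after an aging run shows how many rows moved out of the hot partition.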

Generally, these are the steps to perform data aging:

Partitioning means dividing tables into smaller pieces, called partitions, whose in-memory status can be managed individually

Data aging means moving data from one partition to another that is not kept in memory

There can only be one hot partition
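Under the hood, the partitions that the transaction generates can be pictured as range partitions on the technical _DATAAGING column. The following is illustrative only – a sketch of what the generated layout might look like, not something to run yourself, since the transaction creates the partitions for you:

```sql
-- Illustrative sketch: one hot partition plus yearly cold ranges
ALTER TABLE BALHDR PARTITION BY RANGE ("_DATAAGING")
(
  PARTITION VALUE = '00000000',                 -- single hot partition
  PARTITION '20190101' <= VALUES < '20200101',  -- cold: year 2019
  PARTITION '20200101' <= VALUES < '20210101',  -- cold: year 2020
  PARTITION OTHERS
);
```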

Transaction DAGPTM

On the left panel, you see a list of partitioning objects and groups, assigned tables, and, if already created, partitions.  Details on partitioning are shown in the table on the right

Initially tables are not partitioned, which is indicated with a red icon in the Is Partitioned column

To create a partition, you need to switch to edit mode and enter time ranges in the intervals section

Some rules:

- Time ranges are consecutive – no gaps, and they do not overlap
- One partition must cover the current date
- The hot partition should not be entered manually; it will be created automatically

You can use the Period button, where a wizard will create the partitions – see above, where you enter the start date, unit, period value (1 year), and number of repetitions

Once the new intervals for the partitioning schema are defined, you can start partitioning by pressing F8 – this triggers a background job

Once the partitions are ready, you need to activate the data aging object, using transaction DAGOBJ

The red status icon indicates the object is inactive

You can use the activate icon or CTRL+F2 to activate

After successful activation, the object’s status is green

During activation, the system performs various consistency checks and ensures the setup is complete

The last step in configuring data aging is to provide object-specific customizing – defining the residence time

Before triggering aging runs in update mode you can analyze potential impact by executing the analysis program

Run transaction DAGADM to launch the display, select the object, and click Start Analysis Run (not available for all objects – the only example I could find is FI_DOCUMENT)

After successful execution, the program log is displayed.  You can switch between the Summary and Details tabs

In the summary view, a distinct list of messages is shown along with the number of objects for each message

Items in green, which have exceeded their lifetimes, will be moved to cold storage.  Items in red will not be aged.

You can also click the details view

Note that detailed logs are never recorded for data aging runs in update mode

Data aging in production mode can only be executed in the background

Run transaction DAGRUN

Aging runs are executed for each data aging group – you create or change data aging groups via the menu Goto > Edit Data Aging Groups

To schedule a run, press F8 or go through the menu.

Data aging job progress

A data aging job that is in progress is shown in yellow

You can stop a data aging run by using the stop button in the toolbar

When the job is finished, the status icon turns green

Transaction DAGLOG displays data aging logs from update and simulation runs

Use Display Data Aging Run logs or Display Analysis Log to jump to those logs for a chosen data aging object.

In the aging run logs, use the dropdown menu to go to an aging job log or spool

Run TAANA again to check the data distribution across the participating tables.

After the completion of the data aging job, SAP recommends executing the UNLOAD command followed by LOAD on the relevant tables (BALDAT, BALHDR) to free up the corresponding space in memory
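In HANA SQL, that UNLOAD/LOAD sequence looks roughly like this (table names as given above; depending on your system you may need to prefix the schema, e.g. the SAP application schema):

```sql
-- Evict the tables from memory entirely
UNLOAD BALDAT;
UNLOAD BALHDR;

-- Reload the column data, so memory reflects the new partition layout
LOAD BALDAT ALL;
LOAD BALHDR ALL;
```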


So what do you think?

Other great blogs on this topic:


      Lars Breddemann

      Seems like a consequential, albeit late decision to accept that "data aging" was not a solution that solved the data volume problems of customers. The SAP HANA platform certainly suffered/suffers from "featuritis" and maybe this is just the first of several functions that will be sunsetted.

      Hopefully, this helps to clear up what the main added value of the platform is over potential alternatives.

      Tammy Powlas
      Blog Post Author

      Lars - great comment and thank you for reading.

      Sri Krishna Chaitanya Kuppili

      Hi Lars,

Could you please shed some light on the recommended approaches to reduce the HANA memory footprint – if data aging might be sunsetted – when we have huge custom tables with LOBs?

As LOBs will not be loaded into memory, is it required to age/archive the tables with LOBs?

      Lars Breddemann

You might want to direct this question to SAP, as they have announced (see above) that they are working on that.


      Kirill Gorin

      Hey Tammy Powlas

Why is it not possible to add more objects to this list? As far as I understand, the hot partition is the default for unqualified SQL statements. Even without rewriting major parts of the system, clients could use this, for example, to make BSEG/FAGL/ACDOC aged tables.

Also, what's the way to go with SAP HANA Native Storage Extension (NSE) here? Can it be used as an alternative to aging?

      Best regards,


      Tammy Powlas
      Blog Post Author

      Kirill - thank you for reading and commenting

I believe there is a data aging SDK where you could add more objects; before doing so, please read the first figure regarding SAP's investment.

      I am not familiar with NSE / other alternatives to data aging, other than archiving.  Maybe ask at

      Kirill Gorin

      Here's some description of NSE

      Where can I find out more about data aging SDK?

      Alessandro Casarico

      Hi Tammy,

I cannot find the official SAP source about no further investment in the data aging solution for S/4.
Could you please help me find it?


      Tammy Powlas
      Blog Post Author

      Hi Alessandro - it was on an ASUG webcast last year that SAP provided - I recapped it here:

      Somraj Roy

Dear Tammy Powlas: Thank you for the blog. Can you kindly refer me to a recent blog/article on DVM? Is archiving the only option, as mentioned (data aging is no longer recommended)? If we migrate a customer from ECC to S/4 (conversion or greenfield), what are the recommendations for data volume management (DVM)?

      Tammy Powlas
      Blog Post Author

I think you are right – archiving is recommended as a DVM tool. See about downloading the latest DVM Guide here, or Google it.

FFCL_TAF _BASIS

Hi Tammy Powlas,

      Thanks for the very useful blog.

We are planning to use data aging for one of the FI objects, and during the partitioning step we are facing an issue for one table, where it says: "Data aging partitions can be created at the second level only if the table is hash partitioned at the first level".

We have not found a way to do partitioning at the second level. Can you please provide your comments on this?