
How to perform Data Aging in S/4HANA


Recently I played around with the Data Aging concept in an S/4HANA system, and today I would like to show you how to perform the initial configuration based on SFLIGHT data. This sample data model, delivered by SAP, lets us understand the data aging process without having to worry about creating sample business documents, as data generation is simple and straightforward.

Once a system has been running for months or years, it needs proper maintenance. We need to ensure it is available during business hours and that performance stays at the required level. The system grows together with your company, and at some point you have to decide what to do with data that is no longer used. So far the best solution has been data archiving, which allows us to store data on separate storage outside the SAP system.

Data aging is a new way to manage outdated information: it moves sets of data within the database based on a data temperature. Hot data resides in the current area of the database, while warm/cold data is moved to the historical area.


System preparation and SFLIGHT data generation

For data aging with SFLIGHT data, SAP prepared a separate set of tables:

DAAG_SPFLI – Flight schedule

DAAG_SFLIGHT – Flight

DAAG_SBOOK – Single flight booking

You can either generate the data using report SAPBC_DATA_GENERATOR and copy the contents of the tables, or copy the whole report and change the target tables directly in the source code (I used the second approach).


You can copy the SFLIGHT model data to the DAAG* tables using report RDAAG_COPY_SFLIGHT_DATA. Thanks to Guenther Hasel for sharing this tip!


We can check the table contents directly in SE16N:


When our data is ready, we can prepare the system to support Data Aging. First, we need to set the new profile parameter abap/data_aging and restart the application server so that the new setting is applied:
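For reference, the corresponding instance profile entry looks like the fragment below (the parameter name is also confirmed in the comments; `on` is the standard switch value):

```
abap/data_aging = on
```

Remember that profile parameter changes only take effect after an application server restart.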


The second preparation step is to activate Business Function DAAG_DATA_AGING in SFW5.


Creating partitions
The first Data Aging transaction we are going to use is DAGPTM (Manage Partitions), where we can create and modify partitions for all partitioning objects available in the system. A partitioning object is a set of tables that are partitioned together. In our case it consists of two tables: DAAG_SBOOK and DAAG_SFLIGHT:


Currently there are no partitions defined in our system for SFLIGHT. There are two ways of creating partitions: we can do it manually, or use an additional tool delivered by SAP, the Partition Proposal.

To enter the Partition Proposal, click on the desired partitioning object and choose “Propose Partition Ranges” from the menu (or press F9).


Depending on the partitioning object settings, the value in the Date Field may already be filled. In our case, let’s choose FLDATE as the date field for DAAG_SFLIGHT and DAAG_SBOOK. After execution, we can see the proposed partition ranges and the data volume in each year/month.


In Proposed Partition Ranges we can easily simulate the creation of new partitions, and the tool will show us the projected data volume.

This time we will create the partitions manually. Go back to Manage Partitions and click on the Period button.

After confirmation of the dialog box we can run partitioning.


The background job ran for only a few seconds to create the partitions.


You can see that one partition is created without a start and end date defined. This partition stores the current (hot) data; the others are for the historical area.
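Under the hood, data aging partitions range over the technical date column _DATAAGING that the framework adds to each aging-enabled table (the column also appears in the comments below); rows that are still hot keep the initial value '00000000'. As an illustration only, since DAGPTM generates this for you and the exact ranges depend on your period settings, the resulting HANA partition specification is conceptually similar to:

```sql
-- Illustrative sketch, not something you would run manually:
-- DAGPTM issues the partitioning on your behalf. The single-value
-- partition '00000000' collects the current (hot) rows; the dated
-- ranges form the historical area.
ALTER TABLE DAAG_SBOOK PARTITION BY RANGE ("_DATAAGING")
  (PARTITION VALUE = '00000000',                 -- current (hot) partition
   PARTITION '20100101' <= VALUES < '20110101',  -- historical: year 2010
   PARTITION '20110101' <= VALUES < '20120101',  -- historical: year 2011
   PARTITION OTHERS);
```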


When we expand the partitioning object and select a table, we can click the ‘eye’ button to check how much data is in the current and historical areas. As we haven’t executed an aging run yet, all data resides in the current area.


Let’s have a look at what happened in the HANA database. Choose the DAAG_SFLIGHT table and display its runtime information:


We can see our table is partitioned at database level according to the requirements defined in the previous steps. At the moment there are no records in any partition other than the current one, which is good. We can also check whether the partition is loaded into memory (last column).
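If you prefer SQL over the Studio UI, the same runtime information can be read from the monitoring view M_CS_TABLES, which returns one row per partition (the schema filter is omitted here for brevity):

```sql
-- One row per partition: row count and memory load status
-- (LOADED is typically NO / PARTIALLY / FULL).
SELECT PART_ID, RECORD_COUNT, LOADED, MEMORY_SIZE_IN_TOTAL
  FROM M_CS_TABLES
 WHERE TABLE_NAME = 'DAAG_SFLIGHT';
```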

Now we also need to decide what the residence time for our documents should be. We can set the value in table DAAG_RT_SFLIGHT. It’s quite flexible: we can set different values based on carrier and/or flight connection. For our test scenario I want all objects older than one day to be moved to the historical area.


Activating Data Aging object

To display all data aging objects, go to transaction DAGOBJ.


When we double-click on the object name, we can see details like the participating tables or the implementation class (which actually drives the data aging run by selecting the objects to be moved to another partition).


To activate the object, select DAAG_SFLIGHT on the previous screen and click the Activate button in the menu.


During activation the system runs various checks, like data object consistency or the existence of partitions at database level.


Data aging run

Once our partitions are created and the data aging object is activated, we can execute the actual data aging. To do that, enter transaction DAGRUN.


Now we need to define a data aging group for our objects. In the menu select Goto -> Edit data aging groups. On the screen, create the following entry:


Once saved, go to Data Aging Objects and select the DAAG_SFLIGHT model:


If you are not able to save your entries and an error message is displayed saying “DAAG_SFLIGHT is not an application data aging object”, one correction is needed: update table DAAG_OBJECTS by setting DAAG_OBJ_TYPE = “A” for data object DAAG_SFLIGHT.
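For completeness, the fix amounts to a one-field update of DAAG_OBJECTS, e.g. via SE16N. Expressed as SQL it would look roughly like the sketch below; note that the name of the key column (here assumed to be DAAG_OBJECT) is a guess for illustration, so check the actual table structure in SE11 first.

```sql
-- Sketch only: the key column name DAAG_OBJECT is an assumption.
-- Verify the real structure of DAAG_OBJECTS before changing anything.
UPDATE DAAG_OBJECTS
   SET DAAG_OBJ_TYPE = 'A'
 WHERE DAAG_OBJECT = 'DAAG_SFLIGHT';
```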


Finally, we have finished the preparation steps and can schedule the data aging run.


After confirming the dialog box, the job will start at the chosen date/time.


And when it’s done, we will find a green light in the status column.


In the statistics for the data aging run we can see that 66,308 rows were processed:


Let’s see what happened to our data. Go to Manage Partitions to display hot and cold data for the partitions of table DAAG_SBOOK.


We can also go to HANA Studio and check the runtime information for the table.


Success! Our data is now aged!

Standard SAP reports have this functionality already implemented, but what happens if we want to access the historical data in our custom reports? In SE16N we can see only the current data:


Based on a blog post, I was able to write a small report that counts rows after setting the data temperature.
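On the database level, SAP HANA lets you control the same behaviour explicitly in native SQL with the WITH RANGE_RESTRICTION clause (the ABAP layer restricts reads to current data by default). A minimal sketch against our demo table:

```sql
-- Reads only the current (hot) partition:
SELECT COUNT(*) FROM DAAG_SBOOK WITH RANGE_RESTRICTION('CURRENT');

-- Additionally reads historical partitions back to the given date:
SELECT COUNT(*) FROM DAAG_SBOOK WITH RANGE_RESTRICTION('2010-01-01');
```

Comparing the two counts is a quick way to verify how many rows the aging run actually moved to the historical area.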



  • This is an awesome concept with a practical testing scenario. Data aging is a MUST for every organisation.
    I tested it in our S/4HANA and it works like a charm.

    In detail: the data aging object for accounting documents (FI_DOCUMENT) replaces the archiving object FI_DOCUMNT, which makes standard archiving for financial documents no longer possible. Message FG025 (“Archiving is obsolete for the FI document; use aging”) is displayed when triggering the archiving program without the SAP Information Lifecycle Management (ILM) business function activated (see SAP Help for SAP ILM). FI_DOCUMENT is an application aging object, which means that it is delivered by the SAP application (SAP S/4HANA Finance) and that it has a runtime class that implements the IF_DAAG_RUNTIME interface.


    Dear Bartosz,

    thank you for this great step-by-step guide on how to activate data aging. Before I come to my question, I have two suggestions: perhaps you could replace the screenshot of your test report with the actual code, and also name the profile parameter abap/data_aging in the text so it can be indexed by search engines (solved by this comment ;-)).

    It seems that data aging is a topic SAP is still improving a lot. E.g. the functionality to Propose Partition Ranges was not available in NetWeaver 7.50 SP0; according to my search results, it was introduced with SP3.

    But now finally my questions:

    1. In the slide you've posted at the beginning there is a mention of SAP Note 1872170 - Business Suite on HANA and S/4HANA sizing report, with the text "Footprint reduction potential". Have you tried this report, and were you able to identify the potential?
    2. Have you gained operational experience with data aging that you are able to share?
    3. As data aging uses SAP HANA table partitioning, I would think the documentation on Designing Partitions is highly applicable. Have you taken it into consideration?

    Looking forward to your thoughts.

    Best regards

    • Hello Gregor,

      I did some investigation around this topic for several customers, but so far everything is in the planning phase. For now I think standard data archiving is more applicable for most customers, but that might change soon.

      Now the answers:

      1. This report is excellent for planning a migration to HANA, but it didn't really help me with data aging. Even now when I run it to check the outcome, it only says that data aging is activated for table ACDOCA.
      2. I think I still need to get more experience prior to sharing 🙂
      3. Currently, data aging in NetWeaver works based on document dates, which means you don't have to get into such details.




    • Also, I'm trying not to give too much information in the form of copy/paste, to encourage others to learn. I believe that if you have to re-type some code it will stay in your head longer (and very often it's just easier to understand). But thanks for raising it; I will try to include more data in the text to be search-engine friendly!

      Best regards


  • Hi Bartosz,

    thank you for that informative blog. Gregor already mentioned 2 of my questions, so I'll just add one below.

    I have been wondering for a while how to easily identify the data aging tables. Do you or someone else have a hint for me?


    • Hi,

      I'm not sure if I understood your question correctly, but if you go to DAGPTM you can find the list of tables for each partitioning object.

      Best regards



      • Hi Forian and Bartosz,

        it seems SAP is developing quite a bit in the data aging area. The functionality described by Bartosz regarding Proposed Partition Ranges was not available in the system that I tried. But you will still see which standard objects SAP enabled, and I think they created them for a reason. You can use transaction TAANA to identify how the data is distributed by date.

        Best regards

  • Hi Bartosz,

    thank you for sharing this. While adapting it, I asked myself: is there some sort of functionality to merge partitions, or even 'repartition' them within SAP GUI back to the original state where there was no split by time selection? I am asking because I'm using an older version where I don't have the proposed partition ranges, for example, and I think there is more to come, like Gregor said.

    Best regards



    • Hello Stefan,

      at the time I was writing the blog I couldn't find any functionality to re-partition the table or delete a partition. I used SE14 to delete the database object and create it again. Obviously this is something I wouldn't recommend on any system other than a sandbox.

      Best regards


    • Hi,

      unfortunately there might be many reasons why you can't perform this activity. Do you have the necessary authorizations? Is the button completely gone, or can you just not click it? It would be helpful if you posted a screenshot.

      Best regards



  • Very helpful blog. I have two questions though.

    1. From a data volume management perspective, data aging helps us reduce the hot data footprint, but from an information lifecycle management perspective, how can data aging help us destroy data that is past its retention period?

    2. Does SAP ILM have a role in S/4HANA?

      • Thank you for replying.

        For applications like Simple Finance, the only option in S/4HANA, I believe, is data aging, and I cannot continue using data archiving for this application. So we cannot enforce a retention policy for Simple Finance in S/4HANA (since it does not have data archiving / ILM in S/4)?

  • Hi,

    For practical purposes, is there any limit to the data which can be "aged"?

    4 TB (in-memory) and 16 TB (on-disk) should be okay?


    BR, Avi


      Dear Avi,

      Based on our understanding, the limit in memory is defined by the HANA product. To my knowledge, the biggest HANA database is a scale-out of 48 TB in memory.

      The size on disk is limited by the maximum size of the HANA database. To my understanding, the limiting factors are the number of partitions a single table in HANA is allowed to have, plus the number of entries in a partition.



  • Hi Bartosz,


    Thank you so much for the detailed explanation on Data Aging concept.

    I am following the steps you mentioned, but I encountered an error while activating the Data Aging business function in transaction SFW5.
    Error - "DAAG_DATA_AGING cannot be activated as profile parameter abap/data_aging is not selected".

    I maintained the data aging profile parameter value as "on"
    (though I don't know exactly what value I have to assign; maybe this is the reason for the error, but I am not sure).

    Can you please help me resolve this error?


    Thanks & Regards,
    Vasu Attaluri

  • Hi Bartosz,

    I didn't understand what "restart application" means here.
    Can you please give me the steps on how to restart the application?

    Vasu Attaluri

    • Sorry for the confusion. I meant the application server (your SAP system).

      After changing a profile parameter, a restart of the application server is required so that the new settings are applied.


  • Hi Bartosz,

    when testing data aging following your steps, I faced 2 issues; can you please help:

    1. When I take the action "Propose Partition Ranges" for the partitioning object with date field FLDATE, I get an information message: "The row count of the selected tables is less than threshold 300.000.000". When I click Continue, no data is updated on the screen?


    2. When I run the partitioning job after configuring the periods, there are some errors in the job log and the partitioning failed. What is a first-level hash partition?

    Table DAAG_SBOOK cannot be partitioned

    Message no. DAAG_PARTITIONING038


    Data aging partitions can be created at the second level only if the table is hash partitioned at the first level



  • Hi Bartosz,

    Thanks for the blog. It is very helpful,


    However, towards the end of your blog, in the screenshot from HANA Studio with all partitions (hot and cold), the last column "Loaded" still says "FULL". Does that mean that in your example the historical data is also residing in HANA memory? That should not be the goal, no?

    Also, what happens to data in 'historical partitions'? Can you still edit it?

    Thank you, with kind regards,



    • When you query data, only the partition containing that data is loaded into memory. You’re correct, maybe the screenshot is not the best; I was probably playing around running different select statements and the entire table got loaded.

      I'm not aware of any problems with editing the partitioned data; however, it may also depend on how the report was written.


    Hello Bartosz,


    we are looking to implement HANA data aging in our landscape. Regarding this I have the two queries below, for which I could not find the answer anywhere. It would be a great help if you could answer them.


    1. Data aging involves automatic table-level changes, like the addition of the column "_DATAAGING" to the table concerned, and other non-manual changes done by data aging in the database. Is there any transport request generated for this whole data aging change process? If no such change request is generated, won't there be a table structure inconsistency between different systems in the landscape if I did the data aging for that table in one system but not in another (like between DEV and QA)?

    2. When I schedule the periodic background job to take care of the future growth of the table, how will that growth be handled?
    For example, I have done data aging with three partitions: one for 2017 data (cold), one for 2018 data (cold), and a third for 2019 data (hot), and my residence rule says anything older than 1 year should be moved to cold storage. In that case, will a 4th partition be created automatically in 2020, will the 2019 data be moved to the 4th partition, and will the 4th partition then be moved to cold storage?




  • Hello Bartosz,


    Is there a way to change the process?

    1. Have all the data back in a single partition, as before
    2. Increase the number of retained days to "refill" the main partition
    3. Change the partition dates in order to split or merge some ranges

    Thanks for your help


    • Hello Thomas!

      At the time of writing the blog there were no such advanced tools.

      Have a look at this thread:

      According to SAP Note 2416490 - FAQ: SAP HANA Data Aging in SAP S/4HANA, there are new features in SAP HANA 2 SP03 that allow re-partitioning tables; however, I haven't looked at them in more detail.



      • Hello

        It seems there is now a way to UNDO the changes with transaction DAG_UNDO, but I have not tried it yet. This may help us in deciding to move on with data aging, since the business was concerned about performance issues.



        • hello

          As specified in SAP Note 2416490 – FAQ: SAP HANA Data Aging in SAP S/4HANA, DAG_UNDO does not work for the objects Change Document and IDoc.

          It is replaced by 2 other reports:

          • RSEIDOC_MOVE_TO_HOT  -> for IDocs
          • RSSCD_MOVE_TO_HOT    -> for change documents



  • Hi Bartosz,

    Thank you a lot for this article. It's really a big help for me.

    I am implementing data aging on SAP BWoH for our corporation, and need some help or documentation on using data aging with the /BIC/ tables that are created automatically when activating an aDSO, for example.

    Can you please help out with this?

    Thank you a lot.


  • Hi,

    just let me add this information:

    to fill the demo tables DAAG_SPFLI, DAAG_SFLIGHT and DAAG_SBOOK with some data records, you can use report RDAAG_COPY_SFLIGHT_DATA. The report copies entries from the source tables SPFLI, SFLIGHT and SBOOK into the corresponding DAAG_* tables.



  • Hello Bartosz,


    I have configured SAP Data Aging for BC_CHDO; the data aging job fails with the error below:

    partition error: cannot determine partition for each row with error: Error allocating rows to parts;Could not allocate value '20110919' for column '_D